Add latest changes from gitlab-org/gitlab@master

Author: GitLab Bot, 2023-01-10 03:07:25 +00:00
parent 070ac34d47
commit 4a6dacc866
69 changed files with 1253 additions and 1016 deletions

View File

@ -2,6 +2,20 @@
documentation](doc/development/changelog.md) for instructions on adding your own
entry.
## 15.7.2 (2023-01-09)
### Security (9 changes)
- [Avoid regex with potential for poorly performing backtracking](gitlab-org/security/gitlab@1cb3b4904b25f1e47a40ddd48f3fdcb16bf02401) ([merge request](gitlab-org/security/gitlab!2987))
- [Protect web-hook url variables after changing URL](gitlab-org/security/gitlab@58015aa49e63456094fcbf06a8fa739ac2a27f21) ([merge request](gitlab-org/security/gitlab!2976))
- [Limit the size of user agent to reduce ReDos attack](gitlab-org/security/gitlab@ac3eb7cbf4a1701a499d0cbbbae568c55914c8c3) ([merge request](gitlab-org/security/gitlab!2985))
- [Protect Sentry auth-token after changing URL](gitlab-org/security/gitlab@eba316d255caaa497e3a137aba5f262fd6272939) ([merge request](gitlab-org/security/gitlab!2983))
- [Delete project specific licenses when license policy is deleted](gitlab-org/security/gitlab@a6bef9aee6175401408a12fe1439e775b84bc8cb) ([merge request](gitlab-org/security/gitlab!2969))
- [Restrict user avatar availability based on visibility restrictions](gitlab-org/security/gitlab@9620a1bcae911c84112cc14da22711a344b89acf) ([merge request](gitlab-org/security/gitlab!2971))
- [Policy change to read and destroy token without license for .com](gitlab-org/security/gitlab@5fcf1350fafe9a30f17fa19a3567620f10df1ccd) ([merge request](gitlab-org/security/gitlab!2968))
- [Restrict Grafana API access on public projects](gitlab-org/security/gitlab@3274a7fbeabc04f9db69ffd052e0e77a6b71a7f8) ([merge request](gitlab-org/security/gitlab!2960))
- [Fix "Race condition enables verified email forgery"](gitlab-org/security/gitlab@c3e6fede4230a3ce0fc1d0e4c82f5f3ede41f663) ([merge request](gitlab-org/security/gitlab!2966))
## 15.7.1 (2023-01-05)
### Fixed (2 changes)
@ -822,6 +836,20 @@ entry.
- [Propagate RemoteIP to Gitaly via Workhorse](gitlab-org/gitlab@71da945c85931bac0263c193902dc1b54e2e62da) ([merge request](gitlab-org/gitlab!103635))
- [Documentation to reflect 100MB upload limit](gitlab-org/gitlab@33063bb26ab7699802ecb2b325cc8619d6fe7b86) ([merge request](gitlab-org/gitlab!103978))
## 15.6.4 (2023-01-09)
### Security (9 changes)
- [Avoid regex with potential for poorly performing backtracking](gitlab-org/security/gitlab@76052c2c1d89b47fe1a39d6a2118ced0d26d4e5f) ([merge request](gitlab-org/security/gitlab!2988))
- [Protect web-hook url variables after changing URL](gitlab-org/security/gitlab@55b7e051e4c6ca50ef1165130c465f1d11bd968f) ([merge request](gitlab-org/security/gitlab!2977))
- [Limit the size of user agent to reduce ReDos attack](gitlab-org/security/gitlab@b9e42f4fe131f4a17d24d69076444d68c6a31b18) ([merge request](gitlab-org/security/gitlab!2990))
- [Protect Sentry auth-token after changing URL](gitlab-org/security/gitlab@3b1d4ae2fbd1845d7659b21c65426275fb0b72d3) ([merge request](gitlab-org/security/gitlab!2984))
- [Delete project specific licenses when license policy is deleted](gitlab-org/security/gitlab@79142b8c727a3d43b3555c4600b0b6cb3e070ebe) ([merge request](gitlab-org/security/gitlab!2943))
- [Restrict user avatar availability based on visibility restrictions](gitlab-org/security/gitlab@15732554472373586769a8ca46c2b5cbf0b40783) ([merge request](gitlab-org/security/gitlab!2972))
- [Policy change to read and destroy token without license for .com](gitlab-org/security/gitlab@9219eab8a5180ae34bb92cbd52c5e7be0602b66d) ([merge request](gitlab-org/security/gitlab!2913))
- [Restrict Grafana API access on public projects](gitlab-org/security/gitlab@7a23bd7fe68a47ac5ae56c212d5ec3695631a4db) ([merge request](gitlab-org/security/gitlab!2958))
- [Fix "Race condition enables verified email forgery"](gitlab-org/security/gitlab@d0c0852118adaeb8e99f443c06769b9564294290) ([merge request](gitlab-org/security/gitlab!2963))
## 15.6.3 (2022-12-21)
No changes.
@ -1465,6 +1493,21 @@ No changes.
- [Update Gitlab Shell to 14.13.0](gitlab-org/gitlab@691262f5c25c17efcfa50307862afa66d07366a4) ([merge request](gitlab-org/gitlab!101372))
- [Migrate card to Pajamas](gitlab-org/gitlab@10577294ed64b13d7668be0c2041ec133e8f7f87) ([merge request](gitlab-org/gitlab!98861)) **GitLab Enterprise Edition**
## 15.5.7 (2023-01-09)
### Security (10 changes)
- [Avoid regex with potential for poorly performing backtracking](gitlab-org/security/gitlab@c3f8d8c93e99ac3f226668086bfbf21739b02a0e) ([merge request](gitlab-org/security/gitlab!2989))
- [Protect web-hook url variables after changing URL](gitlab-org/security/gitlab@8a18fea752a2759938b4c3d28516b6ed9386404f) ([merge request](gitlab-org/security/gitlab!2978))
- [Limit the size of user agent to reduce ReDos attack](gitlab-org/security/gitlab@293db707009b7dd133a9a55b25892506013062fd) ([merge request](gitlab-org/security/gitlab!2991))
- [Only allow safe params for diff helper](gitlab-org/security/gitlab@0c5de464c1d062103d6bc81cca45f7298929ca68) ([merge request](gitlab-org/security/gitlab!2951))
- [Protect Sentry auth-token after changing URL](gitlab-org/security/gitlab@a2c3380748eb3aa36f23c74f1666c741fafec635) ([merge request](gitlab-org/security/gitlab!2986))
- [Delete project specific licenses when license policy is deleted](gitlab-org/security/gitlab@312a28196df206b501861b6528b4b6fcaf7cc686) ([merge request](gitlab-org/security/gitlab!2896))
- [Restrict user avatar availability based on visibility restrictions](gitlab-org/security/gitlab@f7b5c0a57b64c15edb0f555dd53c26b9d6147f0e) ([merge request](gitlab-org/security/gitlab!2973))
- [Policy change to read and destroy token without license for .com](gitlab-org/security/gitlab@b51bc20ba07d8ef3d339aeacd1b0f904521f4158) ([merge request](gitlab-org/security/gitlab!2914))
- [Restrict Grafana API access on public projects](gitlab-org/security/gitlab@d9798aa2d31ddef9ed6fedfc7b32bc8a8bac76bc) ([merge request](gitlab-org/security/gitlab!2959))
- [Fix "Race condition enables verified email forgery"](gitlab-org/security/gitlab@95e65f637ed193b9c8b3c39af58a9bc0d552bad2) ([merge request](gitlab-org/security/gitlab!2962))
## 15.5.6 (2022-12-07)
No changes.

View File

@ -1 +1 @@
c47edc13a5c4ceb1bc0eca0e9361e2d883d1106c
daa91577c5add2dd851719bc79eb6d5272f95005

View File

@ -58,7 +58,7 @@ module PageLimiter
# Record the page limit being hit in Prometheus
def record_page_limit_interception
dd = DeviceDetector.new(request.user_agent)
dd = Gitlab::SafeDeviceDetector.new(request.user_agent)
Gitlab::Metrics.counter(:gitlab_page_out_of_bounds,
controller: params[:controller],
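The swap above replaces `DeviceDetector` with `Gitlab::SafeDeviceDetector`, per the "Limit the size of user agent to reduce ReDoS attack" changelog entry. A minimal sketch of the idea, assuming a truncation-based wrapper (the 2048-byte cap and method name are illustrative, not the real API):

```ruby
# Sketch: cap the user agent length before any regex-heavy parsing runs,
# so an oversized, crafted UA string cannot trigger pathological
# backtracking (ReDoS) in the device-detection regexes.
USER_AGENT_MAX_SIZE = 2048 # assumed limit, not the real constant

def truncated_user_agent(user_agent)
  # nil-safe: coerce to string, then slice to the cap
  user_agent.to_s[0, USER_AGENT_MAX_SIZE]
end
```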

View File

@ -14,7 +14,8 @@ class Groups::ImportsController < Groups::ApplicationController
redirect_to group_path(@group), notice: s_('GroupImport|The group was successfully imported.')
end
elsif @group.import_state.failed?
redirect_to new_group_path(@group), alert: s_('GroupImport|Failed to import group.')
redirect_to new_group_path(@group),
alert: format(s_('GroupImport|Failed to import group: %{error}'), error: @group.import_state.last_error)
else
flash.now[:notice] = continue_params[:notice_now]
end

View File

@ -4,6 +4,8 @@ class Projects::GrafanaApiController < Projects::ApplicationController
include RenderServiceResults
include MetricsDashboard
before_action :authorize_read_grafana!, only: :proxy
feature_category :metrics
urgency :low

View File

@ -52,6 +52,8 @@ class UploadsController < ApplicationController
# access to itself when a secret is given.
# For instance, user avatars are readable by anyone,
# while temporary, user snippet uploads are not.
return false if !current_user && public_visibility_restricted?
!secret? || can?(current_user, :update_user, model)
when Appearance
true

View File

@ -17,40 +17,26 @@ module SubmoduleHelper
url = File.join(Gitlab.config.gitlab.url, repository.project.full_path)
end
if url =~ %r{([^/:]+)/([^/]+(?:\.git)?)\Z}
namespace = Regexp.last_match(1)
project = Regexp.last_match(2)
gitlab_hosts = [Gitlab.config.gitlab.url,
Gitlab.config.gitlab_shell.ssh_path_prefix]
namespace, project = extract_namespace_project(url)
gitlab_hosts.each do |host|
if url.start_with?(host)
namespace, _, project = url.sub(host, '').rpartition('/')
break
end
end
if namespace.blank? || project.blank?
return [sanitize_submodule_url(url), nil, nil]
end
namespace.delete_prefix!('/')
project.rstrip!
project.delete_suffix!('.git')
if self_url?(url, namespace, project)
[
url_helpers.namespace_project_path(namespace, project),
url_helpers.namespace_project_tree_path(namespace, project, submodule_item_id),
(url_helpers.namespace_project_compare_path(namespace, project, to: submodule_item_id, from: old_submodule_item_id) if old_submodule_item_id)
]
elsif relative_self_url?(url)
relative_self_links(url, submodule_item_id, old_submodule_item_id, repository.project)
elsif gist_github_dot_com_url?(url)
gist_github_com_tree_links(namespace, project, submodule_item_id)
elsif github_dot_com_url?(url)
github_com_tree_links(namespace, project, submodule_item_id, old_submodule_item_id)
elsif gitlab_dot_com_url?(url)
gitlab_com_tree_links(namespace, project, submodule_item_id, old_submodule_item_id)
else
[sanitize_submodule_url(url), nil, nil]
end
if self_url?(url, namespace, project)
[
url_helpers.namespace_project_path(namespace, project),
url_helpers.namespace_project_tree_path(namespace, project, submodule_item_id),
(url_helpers.namespace_project_compare_path(namespace, project, to: submodule_item_id, from: old_submodule_item_id) if old_submodule_item_id)
]
elsif relative_self_url?(url)
relative_self_links(url, submodule_item_id, old_submodule_item_id, repository.project)
elsif gist_github_dot_com_url?(url)
gist_github_com_tree_links(namespace, project, submodule_item_id)
elsif github_dot_com_url?(url)
github_com_tree_links(namespace, project, submodule_item_id, old_submodule_item_id)
elsif gitlab_dot_com_url?(url)
gitlab_com_tree_links(namespace, project, submodule_item_id, old_submodule_item_id)
else
[sanitize_submodule_url(url), nil, nil]
end
@ -58,6 +44,30 @@ module SubmoduleHelper
protected
def extract_namespace_project(url)
namespace_fragment, _, project = url.rpartition('/')
namespace = namespace_fragment.rpartition(%r{[:/]}).last
return [nil, nil] unless project.present? && namespace.present?
gitlab_hosts = [Gitlab.config.gitlab.url,
Gitlab.config.gitlab_shell.ssh_path_prefix]
matching_host = gitlab_hosts.find do |host|
url.start_with?(host)
end
if matching_host
namespace, _, project = url.delete_prefix(matching_host).rpartition('/')
end
namespace.delete_prefix!('/')
project.rstrip!
project.delete_suffix!('.git')
[namespace, project]
end
def gist_github_dot_com_url?(url)
url =~ %r{gist\.github\.com[/:][^/]+/[^/]+\Z}
end

View File

@ -67,7 +67,7 @@ class ActiveSession
def self.set(user, request)
Gitlab::Redis::Sessions.with do |redis|
session_private_id = request.session.id.private_id
client = DeviceDetector.new(request.user_agent)
client = Gitlab::SafeDeviceDetector.new(request.user_agent)
timestamp = Time.current
expiry = Settings.gitlab['session_expire_delay'] * 60

View File

@ -98,6 +98,27 @@ class Environment < ApplicationRecord
scope :auto_stoppable, -> (limit) { available.where('auto_stop_at < ?', Time.zone.now).limit(limit) }
scope :auto_deletable, -> (limit) { stopped.where('auto_delete_at < ?', Time.zone.now).limit(limit) }
scope :deployed_and_updated_before, -> (project_id, before) do
# this query joins deployments and filters out any environment that has recent deployments
joins = %{
LEFT JOIN "deployments" on "deployments".environment_id = "environments".id
AND "deployments".project_id = #{project_id}
AND "deployments".updated_at >= #{connection.quote(before)}
}
Environment.joins(joins)
.where(project_id: project_id, updated_at: ...before)
.group('id', 'deployments.id')
.having('deployments.id IS NULL')
end
scope :without_protected, -> (project) {} # no-op when not in EE mode
scope :without_names, -> (names) do
where.not(name: names)
end
scope :without_tiers, -> (tiers) do
where.not(tier: tiers)
end
##
# Search environments which have names like the given query.
# Do not set a large limit unless you've confirmed that it works on gitlab.com scale.

View File

@ -41,6 +41,7 @@ class WebHook < ApplicationRecord
after_initialize :initialize_url_variables
before_validation :reset_token
before_validation :reset_url_variables, unless: ->(hook) { hook.is_a?(ServiceHook) }
before_validation :set_branch_filter_nil, if: :branch_filter_strategy_all_branches?
validates :push_events_branch_filter, untrusted_regexp: true, if: :branch_filter_strategy_regex?
validates :push_events_branch_filter, "web_hooks/wildcard_branch_filter": true, if: :branch_filter_strategy_wildcard?
@ -213,6 +214,10 @@ class WebHook < ApplicationRecord
self.token = nil if url_changed? && !encrypted_token_changed?
end
def reset_url_variables
self.url_variables = {} if url_changed? && !encrypted_url_variables_changed?
end
def next_failure_count
recent_failures.succ.clamp(1, MAX_FAILURES)
end

View File

@ -273,6 +273,9 @@ class GroupPolicy < Namespaces::GroupProjectNamespaceSharedPolicy
rule { can?(:admin_group) & resource_access_token_feature_available }.policy do
enable :read_resource_access_tokens
enable :destroy_resource_access_tokens
end
rule { can?(:admin_group) & resource_access_token_creation_allowed }.policy do
enable :admin_setting_to_allow_project_access_token_creation
end
@ -338,12 +341,16 @@ class GroupPolicy < Namespaces::GroupProjectNamespaceSharedPolicy
true
end
def resource_access_token_create_feature_available?
true
end
def can_read_group_member?
!(@subject.private? && access_level == GroupMember::NO_ACCESS)
end
def resource_access_token_creation_allowed?
resource_access_token_feature_available? && group.root_ancestor.namespace_settings.resource_access_token_creation_allowed?
resource_access_token_create_feature_available? && group.root_ancestor.namespace_settings.resource_access_token_creation_allowed?
end
def valid_dependency_proxy_deploy_token

View File

@ -157,7 +157,9 @@ class ProjectPolicy < BasePolicy
condition(:service_desk_enabled) { @subject.service_desk_enabled? }
with_scope :subject
condition(:resource_access_token_feature_available) { resource_access_token_feature_available? }
condition(:resource_access_token_feature_available) do
resource_access_token_feature_available?
end
condition(:resource_access_token_creation_allowed) { resource_access_token_creation_allowed? }
# We aren't checking `:read_issue` or `:read_merge_request` in this case
@ -308,6 +310,8 @@ class ProjectPolicy < BasePolicy
rule { guest & can?(:download_code) }.enable :build_download_code
rule { guest & can?(:read_container_image) }.enable :build_read_container_image
rule { guest & ~public_project }.enable :read_grafana
rule { can?(:reporter_access) }.policy do
enable :admin_issue_board
enable :download_code
@ -340,6 +344,7 @@ class ProjectPolicy < BasePolicy
enable :read_package
enable :read_product_analytics
enable :read_ci_cd_analytics
enable :read_grafana
end
# We define `:public_user_access` separately because there are cases in gitlab-ee
@ -521,6 +526,7 @@ class ProjectPolicy < BasePolicy
enable :read_upload
enable :destroy_upload
enable :admin_incident_management_timeline_event_tag
enable :stop_environment
end
rule { public_project & metrics_dashboard_allowed }.policy do
@ -919,12 +925,16 @@ class ProjectPolicy < BasePolicy
true
end
def resource_access_token_create_feature_available?
true
end
def resource_access_token_creation_allowed?
group = project.group
return true unless group # always enable for projects in personal namespaces
resource_access_token_feature_available? && group.root_ancestor.namespace_settings.resource_access_token_creation_allowed?
resource_access_token_create_feature_available? && group.root_ancestor.namespace_settings.resource_access_token_creation_allowed?
end
def project

View File

@ -0,0 +1,24 @@
# frozen_string_literal: true
module Environments
class StopStaleService < BaseService
def execute
return ServiceResponse.error(message: 'Before date must be provided') unless params[:before].present?
return ServiceResponse.error(message: 'Unauthorized') unless can?(current_user, :stop_environment, project)
Environment.available
.deployed_and_updated_before(project.id, params[:before])
.without_protected(project)
.in_batches(of: 100) do |env_batch| # rubocop:disable Cop/InBatches
Environments::AutoStopWorker.bulk_perform_async_with_contexts(
env_batch,
arguments_proc: ->(environment) { environment.id },
context_proc: ->(environment) { { project: project } }
)
end
ServiceResponse.success(message: 'Successfully scheduled stale environments to stop')
end
end
end

View File

@ -2,6 +2,8 @@
module ErrorTracking
class ListProjectsService < ErrorTracking::BaseService
MASKED_TOKEN_REGEX = /\A\*+\z/.freeze
private
def perform
@ -20,23 +22,31 @@ module ErrorTracking
def project_error_tracking_setting
(super || project.build_error_tracking_setting).tap do |setting|
url_changed = !setting.api_url&.start_with?(params[:api_host])
setting.api_url = ErrorTracking::ProjectErrorTrackingSetting.build_api_url_from(
api_host: params[:api_host],
organization_slug: 'org',
project_slug: 'proj'
)
setting.token = token(setting)
setting.token = token(setting, url_changed)
setting.enabled = true
end
end
strong_memoize_attr :project_error_tracking_setting
def token(setting)
def token(setting, url_changed)
return if url_changed && masked_token?
# Use param token if not masked, otherwise use database token
return params[:token] unless /\A\*+\z/.match?(params[:token])
return params[:token] unless masked_token?
setting.token
end
def masked_token?
MASKED_TOKEN_REGEX.match?(params[:token])
end
end
end
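The guard above can be condensed into a standalone sketch: a value made entirely of asterisks is treated as a masked placeholder, a masked token is dropped when the API host changed, and otherwise the stored token is kept. `resolve_token` is an illustrative helper, not part of the real service.

```ruby
MASKED_TOKEN_REGEX = /\A\*+\z/

def resolve_token(param_token, stored_token, url_changed)
  masked = MASKED_TOKEN_REGEX.match?(param_token.to_s)
  # A masked token is meaningless for a new host: discard it entirely.
  return nil if url_changed && masked
  # Masked input means "keep what is stored"; anything else is taken as-is.
  masked ? stored_token : param_token
end
```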

View File

@ -71,7 +71,7 @@ module Groups
end
def tree_exporter
tree_exporter_class.new(
Gitlab::ImportExport::Group::TreeSaver.new(
group: group,
current_user: current_user,
shared: shared,
@ -79,18 +79,6 @@ module Groups
)
end
def tree_exporter_class
if ndjson?
Gitlab::ImportExport::Group::TreeSaver
else
Gitlab::ImportExport::Group::LegacyTreeSaver
end
end
def ndjson?
::Feature.enabled?(:group_export_ndjson, group&.parent)
end
def version_saver
Gitlab::ImportExport::VersionSaver.new(shared: shared)
end

View File

@ -29,7 +29,7 @@ module Groups
def execute
Gitlab::Tracking.event(self.class.name, 'create', label: 'import_group_from_file')
if valid_user_permissions? && import_file && restorers.all?(&:restore)
if valid_user_permissions? && import_file && valid_import_file? && restorers.all?(&:restore)
notify_success
Gitlab::Tracking.event(
@ -75,25 +75,11 @@ module Groups
def tree_restorer
@tree_restorer ||=
if ndjson?
Gitlab::ImportExport::Group::TreeRestorer.new(
user: current_user,
shared: shared,
group: group
)
else
Gitlab::ImportExport::Group::LegacyTreeRestorer.new(
user: current_user,
shared: shared,
group: group,
group_hash: nil
)
end
end
def ndjson?
::Feature.enabled?(:group_import_ndjson, group&.parent) &&
File.exist?(File.join(shared.export_path, 'tree/groups/_all.ndjson'))
Gitlab::ImportExport::Group::TreeRestorer.new(
user: current_user,
shared: shared,
group: group
)
end
def remove_import_file
@ -115,6 +101,14 @@ module Groups
end
end
def valid_import_file?
return true if File.exist?(File.join(shared.export_path, 'tree/groups/_all.ndjson'))
shared.error(::Gitlab::ImportExport::Error.incompatible_import_file_error)
false
end
def notify_success
@logger.info(
group_id: group.id,

View File

@ -31,6 +31,7 @@ module Users
assign_identity
build_canonical_email
reset_unconfirmed_email
if @user.save(validate: validate) && update_status
notify_success(user_exists)
@ -64,6 +65,13 @@ module Users
Users::UpdateCanonicalEmailService.new(user: @user).execute
end
def reset_unconfirmed_email
return unless @user.persisted?
return unless @user.email_changed?
@user.update_column(:unconfirmed_email, nil)
end
def update_status
return true unless @status_params

View File

@ -1,7 +1,7 @@
- confirmation_link = confirmation_url(@resource, confirmation_token: @token)
- if @resource.unconfirmed_email.present? || !@resource.created_recently?
#content
= email_default_heading(@resource.unconfirmed_email || @resource.email)
= email_default_heading(@email)
%p= _('Click the link below to confirm your email address.')
#cta
= link_to _('Confirm your email address'), confirmation_link

View File

@ -1,5 +1,5 @@
<% if @resource.unconfirmed_email.present? || !@resource.created_recently? %>
<%= @resource.unconfirmed_email || @resource.email %>,
<%= @email %>,
<%= _('Use the link below to confirm your email address.') %>
<% else %>
<% if Gitlab.com? %>

View File

@ -1,8 +0,0 @@
---
name: group_export_ndjson
introduced_by_url: https://gitlab.com/gitlab-org/gitlab/-/merge_requests/29590
rollout_issue_url:
milestone: '13.0'
type: development
group: group::import
default_enabled: true

View File

@ -1,8 +0,0 @@
---
name: group_import_ndjson
introduced_by_url: https://gitlab.com/gitlab-org/gitlab/-/merge_requests/29716
rollout_issue_url:
milestone: '13.0'
type: development
group: group::import
default_enabled: true

View File

@ -0,0 +1,18 @@
# frozen_string_literal: true
class AddTempIndexOnOverlongVulnerabilityHtmlTitle < Gitlab::Database::Migration[2.0]
INDEX_NAME = 'tmp_index_vulnerability_overlong_title_html'
disable_ddl_transaction!
def up
# Temporary index to speed up the truncation of vulnerabilities with invalid html title length
add_concurrent_index :vulnerabilities, [:id],
name: INDEX_NAME,
where: "LENGTH(title_html) > 800"
end
def down
remove_concurrent_index_by_name :vulnerabilities, INDEX_NAME
end
end

View File

@ -0,0 +1,29 @@
# frozen_string_literal: true
class QueueTruncateOverlongVulnerabilityHtmlTitles < Gitlab::Database::Migration[2.0]
MIGRATION = 'TruncateOverlongVulnerabilityHtmlTitles'
INTERVAL = 2.minutes
BATCH_SIZE = 1_000
MAX_BATCH_SIZE = 10_000
SUB_BATCH_SIZE = 200
disable_ddl_transaction!
restrict_gitlab_migration gitlab_schema: :gitlab_main
def up
queue_batched_background_migration(
MIGRATION,
:vulnerabilities,
:id,
job_interval: INTERVAL,
batch_size: BATCH_SIZE,
max_batch_size: MAX_BATCH_SIZE,
sub_batch_size: SUB_BATCH_SIZE
)
end
def down
delete_batched_background_migration(MIGRATION, :vulnerabilities, :id, [])
end
end

View File

@ -0,0 +1,23 @@
# frozen_string_literal: true
class DeleteQueuedJobsForVulnerabilitiesFeedbackMigration < Gitlab::Database::Migration[2.1]
MIGRATION = 'MigrateVulnerabilitiesFeedbackToVulnerabilitiesStateTransition'
TABLE_NAME = :vulnerability_feedback
BATCH_COLUMN = :id
disable_ddl_transaction!
restrict_gitlab_migration gitlab_schema: :gitlab_main
def up
delete_batched_background_migration(
MIGRATION,
TABLE_NAME,
BATCH_COLUMN,
[]
)
end
def down
# no-op
end
end

View File

@ -0,0 +1 @@
ff748a75deac671ea4ff0ce9df901672afc5dfef794353bec9ab6e0c5d44c981

View File

@ -0,0 +1 @@
a6234578eeaa90365894d345b74cd66d73bd630f2037e07278466cf59ca42210

View File

@ -0,0 +1 @@
ecfd0d17f89aef734239365a79a48b0f8122326030a717a0114db5063bacc58f

View File

@ -31775,6 +31775,8 @@ CREATE INDEX tmp_index_on_vulnerabilities_non_dismissed ON vulnerabilities USING
CREATE INDEX tmp_index_project_statistics_cont_registry_size ON project_statistics USING btree (project_id) WHERE (container_registry_size = 0);
CREATE INDEX tmp_index_vulnerability_overlong_title_html ON vulnerabilities USING btree (id) WHERE (length(title_html) > 800);
CREATE UNIQUE INDEX uniq_pkgs_deb_grp_architectures_on_distribution_id_and_name ON packages_debian_group_architectures USING btree (distribution_id, name);
CREATE UNIQUE INDEX uniq_pkgs_deb_grp_components_on_distribution_id_and_name ON packages_debian_group_components USING btree (distribution_id, name);

View File

@ -0,0 +1,167 @@
---
status: proposed
creation-date: 2022-11-25
authors: [ "@theoretick" ]
coach: "@DylanGriffith"
approvers: [ "@connorgilbert", "@amarpatel" ]
owning-stage: "~devops::secure"
participating-stages: []
---
# Secret Detection as a platform-wide experience
## Summary
Today's secret detection feature is built around containerized scans of repositories
within a pipeline context. That coverage is narrow relative to the many places where
leaked or compromised tokens can appear, and should be expanded to a much wider scope.
Secret detection as a platform-wide experience encompasses detection across
platform features with high risk of secret leakage, including repository contents,
job logs, and project management features such as issues, epics, and MRs.
## Motivation
### Goals
- Support asynchronous secret detection for:
- push events
- issuable creation
- issuable updates
- issuable comments
### Non-Goals
The current proposal is limited to asynchronous detection and alerting only.
**Blocking** secrets on push events is high-risk on a critical path and
would require extensive performance profiling before implementation. See
[a recent example](https://gitlab.com/gitlab-org/gitlab/-/issues/246819#note_1164411983)
of a customer incident where this was attempted.
Secret revocation and rotation are also beyond the scope of this new capability.
Scanned object types beyond the scope of this MVC include:
- Media types (JPEGs, PDFs,...)
- Snippets
- Wikis
## Proposal
To achieve scalable secret detection for a variety of domain objects, a dedicated
scanning service must be created and deployed alongside the GitLab distribution.
This service is referred to as the `SecretScanningService`.
This service must be:
- highly performant
- horizontally scalable
- generic in domain object scanning capability
Platform-wide secret detection should be enabled by default on GitLab SaaS as well
as on self-managed instances.
## Challenges
- Secure authentication to GitLab.com infrastructure
- Performance of scanning against large blobs
- Performance of scanning against a high volume of domain objects (such as push frequency)
## Design and implementation details
The critical paths as outlined under [goals above](#goals) cover two major object
types: Git blobs (corresponding to push events) and arbitrary text blobs.
The detection flow for push events relies on subscribing to the PostReceive hook
and enqueueing Sidekiq requests to the `SecretScanningService`. The `SecretScanningService`
service fetches enqueued refs, queries Gitaly for the ref blob contents, scans
the commit contents, and notifies the Rails application when a secret is detected.
See [Push event detection flow](#push-event-detection-flow) for sequence.
The detection flow for arbitrary text blobs, such as issue comments, relies on
subscribing to `Notes::PostProcessService` (or equivalent service) and enqueueing
Sidekiq requests to the `SecretScanningService` to process the text blob by object type
and primary key of domain object. The `SecretScanningService` service fetches the
relevant text blob, scans the contents, and notifies the Rails application when a secret
is detected.
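The flow above — enqueue only an object type and primary key, then resolve and scan the text later — can be sketched as plain Ruby, under stated assumptions: the queue, fetcher, and scanner here are stand-ins, not GitLab or Sidekiq APIs.

```ruby
# Rails side: enqueue the minimal identifying payload, never the text itself.
def enqueue_scan(queue, object_type, object_id)
  queue << { object_type: object_type, object_id: object_id }
end

# Service side: fetch the relevant text blob per job, scan it, and tag each
# finding with the originating object so the Rails app can be notified.
def drain_scan_queue(queue, fetch_text, scan)
  findings = []
  until queue.empty?
    job = queue.shift
    text = fetch_text.call(job[:object_type], job[:object_id])
    scan.call(text).each { |finding| findings << finding.merge(job) }
  end
  findings
end
```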
The detection flow for job logs requires processing the log during archive to object
storage. See discussion [in this issue](https://gitlab.com/groups/gitlab-org/-/epics/8847#note_1116647883)
around scanning during streaming and the added complexity in buffering lookbacks
for arbitrary trace chunks.
In any case of detection, the Rails application manually creates a vulnerability
using the `Vulnerabilities::ManuallyCreateService` to surface the finding within the
existing Vulnerability Management UI.
See [technical discovery](https://gitlab.com/gitlab-org/gitlab/-/issues/376716)
for further background exploration.
### Token types
The existing Secret Detection configuration covers ~100 rules across a variety
of platforms. To reduce the total cost of execution and the likelihood of false
positives, the dedicated service targets only well-defined tokens: tokens with a
precise structure, most often a fixed substring prefix or suffix and a fixed length.
Token types to identify in order of importance:
1. Well-defined GitLab tokens (including Personal Access Tokens and Pipeline Trigger Tokens)
1. Verified Partner tokens (including AWS)
1. Remainder tokens currently included in Secret Detection CI configuration
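A "well-defined" token rule of this kind reduces to a fixed prefix plus a fixed length. A hedged sketch, modeled on GitLab personal access tokens (`glpat-` prefix); the exact character class and length are assumptions for the example:

```ruby
# One pattern per well-defined token type; each is anchored on a fixed
# prefix and length, which keeps false positives low and matching cheap.
WELL_DEFINED_TOKENS = {
  gitlab_pat: /\bglpat-[0-9A-Za-z_-]{20}\b/
}.freeze

def find_well_defined_tokens(text)
  WELL_DEFINED_TOKENS.flat_map do |type, pattern|
    text.scan(pattern).map { |match| { type: type, token: match } }
  end
end
```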
### Detection engine
Our current secret detection offering uses [Gitleaks](https://github.com/zricethezav/gitleaks/)
for all secret scanning within pipeline contexts. Its `--no-git` configuration
lets us scan arbitrary text blobs outside of a repository context, so we can
continue to use it for non-pipeline scanning.
Given our existing familiarity with the tool and its extensibility, it should
remain our engine of choice. Changes to the detection engine are out of scope
unless benchmarking reveals performance concerns.
### Push event detection flow
```mermaid
sequenceDiagram
autonumber
actor User
User->>+Workhorse: git push
Workhorse->>+Gitaly: tcp
Gitaly->>+Rails: grpc
Sidekiq->>+Rails: poll job
Rails->>-Sidekiq: PostReceive worker
Sidekiq-->>+Sidekiq: enqueue PostReceiveSecretScanWorker
Sidekiq->>+Rails: poll job
loop PostReceiveSecretScanWorker
Rails->>-Sidekiq: PostReceiveSecretScanWorker
Sidekiq->>+SecretScanningSvc: ScanBlob(ref)
SecretScanningSvc->>+Sidekiq: accepted
Note right of SecretScanningSvc: Scanning job enqueued
Sidekiq-->>+Rails: done
SecretScanningSvc->>+Gitaly: retrieve blob
SecretScanningSvc->>+SecretScanningSvc: scan blob
SecretScanningSvc->>+Rails: secret found
end
```
## Iterations
1. Requirements definition for detection coverage and actions
1. PoC of secret scanning service
1. gRPC commit retrieval from Gitaly
1. blob scanning
1. benchmarking of issuables, comments, job logs and blobs to gain confidence that the total costs will be viable
1. Implementation of secret scanning service MVC (targeting individual commits)
1. Security and readiness review
1. Deployment and monitoring
1. Implementation of secret scanning service MVC (targeting arbitrary text blobs)
1. Deployment and monitoring
1. High priority domain object rollout (priority `TBD`)
1. Issuable comments
1. Issuable bodies
1. Job logs

View File

@ -985,8 +985,11 @@ Expressions evaluate as `true` if:
For example:
- `$VARIABLE =~ /^content.*/`
- `$VARIABLE_1 !~ /^content.*/`
- `if: $VARIABLE =~ /^content.*/`
- `if: $VARIABLE !~ /^content.*/`
Single-character regular expressions, like `/./`, are not supported and
produce an `invalid expression syntax` error.
Pattern matching is case-sensitive by default. Use the `i` flag modifier to make a
pattern case-insensitive. For example: `/pattern/i`.
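A sketch of these operators inside `rules:if` clauses (the job name and branch patterns are illustrative):

```yaml
deploy:
  script: echo "Deploying"
  rules:
    - if: $CI_COMMIT_BRANCH =~ /^release-.*/i   # `i` flag: case-insensitive
    - if: $CI_COMMIT_TAG !~ /^rc-.*/
```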

View File

@ -36,6 +36,7 @@ There are two places defined variables can be used. On the:
| [`include`](../yaml/index.md#include) | yes | GitLab | The variable expansion is made by the [internal variable expansion mechanism](#gitlab-internal-variable-expansion-mechanism) in GitLab. <br/><br/>See [Use variables with include](../yaml/includes.md#use-variables-with-include) for more information on supported variables. |
| [`only:variables`](../yaml/index.md#onlyvariables--exceptvariables) | no | Not applicable | The variable must be in the form of `$variable`. Not supported are the following:<br/><br/>- `CI_ENVIRONMENT_*` variables, except `CI_ENVIRONMENT_NAME` which is supported.<br/>- [Persisted variables](#persisted-variables). |
| [`resource_group`](../yaml/index.md#resource_group) | yes | GitLab | Similar to `environment:url`, but the variables expansion doesn't support the following:<br/>- `CI_ENVIRONMENT_URL`<br/>- [Persisted variables](#persisted-variables). |
| [`rules:changes`](../yaml/index.md#ruleschanges) | yes | GitLab | The variable expansion is made by the [internal variable expansion mechanism](#gitlab-internal-variable-expansion-mechanism) in GitLab. |
| [`rules:exists`](../yaml/index.md#rulesexists) | yes | GitLab | The variable expansion is made by the [internal variable expansion mechanism](#gitlab-internal-variable-expansion-mechanism) in GitLab. |
| [`rules:if`](../yaml/index.md#rulesif) | no | Not applicable | The variable must be in the form of `$variable`. Not supported are the following:<br/><br/>- `CI_ENVIRONMENT_*` variables, except `CI_ENVIRONMENT_NAME` which is supported.<br/>- [Persisted variables](#persisted-variables). |
| [`script`](../yaml/index.md#script) | yes | Script execution shell | The variable expansion is made by the [execution shell environment](#execution-shell-environment). |

View File

@ -21,10 +21,20 @@ YAML has a feature called 'anchors' that you can use to duplicate
content across your document.
Use anchors to duplicate or inherit properties. Use anchors with [hidden jobs](../jobs/index.md#hide-jobs)
to provide templates for your jobs. When there are duplicate keys, GitLab
performs a reverse deep merge based on the keys.
to provide templates for your jobs. When there are duplicate keys, the latest included key wins, overriding the other keys.
You can use YAML anchors to merge YAML arrays.
In certain cases (see [YAML anchors for scripts](#yaml-anchors-for-scripts)), you can use YAML anchors to build arrays with multiple components defined elsewhere. For example:
```yaml
.default_scripts: &default_scripts
- ./default-script1.sh
- ./default-script2.sh
job1:
script:
- *default_scripts
- ./job-script.sh
```
You can't use YAML anchors across multiple files when using the [`include`](index.md#include)
keyword. Anchors are only valid in the file they were defined in. To reuse configuration
@ -43,12 +53,12 @@ with their own custom `script` defined:
- redis
test1:
<<: *job_configuration # Merge the contents of the 'job_configuration' alias
<<: *job_configuration # Add the contents of the 'job_configuration' alias
script:
- test1 project
test2:
<<: *job_configuration # Merge the contents of the 'job_configuration' alias
<<: *job_configuration # Add the contents of the 'job_configuration' alias
script:
- test2 project
```
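A minimal sketch of how the `<<` merge key behaves when loaded with a YAML 1.1 parser (Ruby's Psych here; the job names and values are made up, and GitLab's own loader may differ in details): keys defined in the job itself override keys pulled in from the alias.

```ruby
require 'yaml'

config = YAML.safe_load(<<~YML, aliases: true)
  .job_configuration: &job_configuration
    image: ruby:3.0
    stage: test
  test1:
    <<: *job_configuration  # add the contents of the alias
    stage: deploy           # a key defined here wins over the merged one
YML

puts config['test1']['image'] # inherited from the anchor
puts config['test1']['stage'] # overridden locally
```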
@ -307,8 +317,9 @@ to the contents of the `script`:
### Merge details
You can use `extends` to merge hashes but not arrays.
The algorithm used for merge is "closest scope wins," so
keys from the last member always override anything defined on other
The algorithm used for merge is "closest scope wins". When there are
duplicate keys, GitLab performs a reverse deep merge based on the keys.
Keys from the last member always override anything defined on other
levels. For example:
```yaml

View File

@ -55,9 +55,10 @@ To set up the Grafana API in Grafana:
1. Select **Save Changes**.
NOTE:
If the Grafana integration is enabled, any user with read access to the GitLab
project can query metrics from the Prometheus instance. All requests proxied
through GitLab are authenticated with the same Grafana Administrator API token.
If the Grafana integration is enabled, users with the Reporter role on public
projects and the Guest role on non-public projects can query metrics from the
Prometheus instance. All requests proxied through GitLab are authenticated with
the same Grafana Administrator API token.
### Generate a link to a panel

View File

@ -213,6 +213,11 @@ To help avoid abuse, by default, users are rate limited to:
## Version history
### 15.8+
Starting with GitLab 15.8, importing groups from a JSON export is no longer supported. Groups need to be imported
in NDJSON format.
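For illustration only (this is not GitLab's exporter code), NDJSON is simply one JSON object per line, which lets large exports be written and parsed record by record instead of as one document:

```ruby
require 'json'

records = [
  { 'name' => 'group', 'path' => 'group' },
  { 'name' => 'subgroup', 'path' => 'subgroup' }
]

# Serialize: one JSON document per line.
ndjson = records.map { |record| JSON.generate(record) }.join("\n")

# Parse: each line stands alone, so a reader can stream line by line.
parsed = ndjson.each_line.map { |line| JSON.parse(line) }

puts parsed == records # true
```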
### 14.0+
In GitLab 14.0, the JSON format is no longer supported for project and group exports. To allow for a

View File

@ -423,7 +423,7 @@
canonical: |
<p>## foo</p>
static: |-
<p data-sourcepos="1:1-1:27" dir="auto"><span>#</span># foo</p>
<p data-sourcepos="1:1-1:28" dir="auto"><span>#</span># foo</p>
wysiwyg: |-
<p>## foo</p>
04_02_00__leaf_blocks__atx_headings__005:
@ -533,11 +533,11 @@
<h2>foo ###</h2>
<h1>foo #</h1>
static: |-
<h3 data-sourcepos="1:1-1:32" dir="auto">
<h3 data-sourcepos="1:1-1:33" dir="auto">
<a id="user-content-foo-" class="anchor" href="#foo-" aria-hidden="true"></a>foo <span>#</span>##</h3>
<h2 data-sourcepos="2:1-2:31" dir="auto">
<h2 data-sourcepos="2:1-2:32" dir="auto">
<a id="user-content-foo--1" class="anchor" href="#foo--1" aria-hidden="true"></a>foo #<span>#</span>#</h2>
<h1 data-sourcepos="3:1-3:28" dir="auto">
<h1 data-sourcepos="3:1-3:29" dir="auto">
<a id="user-content-foo--2" class="anchor" href="#foo--2" aria-hidden="true"></a>foo <span>#</span>
</h1>
wysiwyg: |-
@ -4785,7 +4785,7 @@
canonical: |
<p>!&quot;#$%&amp;'()*+,-./:;&lt;=&gt;?@[\]^_`{|}~</p>
static: |-
<p data-sourcepos="1:1-1:224" dir="auto"><span>!</span>"<span>#</span><span>$</span><span>%</span><span>&amp;</span>'()*+,-./:;&lt;=&gt;?<span>@</span>[\]<span>^</span>_`{|}<span>~</span></p>
<p data-sourcepos="1:1-1:232" dir="auto"><span>!</span>"<span>#</span><span>$</span><span>%</span><span>&amp;</span>'()*+,-./:;&lt;=&gt;?<span>@</span>[\]<span>^</span>_`{|}<span>~</span></p>
wysiwyg: |-
<p>!"#$%&amp;'()*+,-./:;&lt;=&gt;?@[\]^_`{|}~</p>
06_02_00__inlines__backslash_escapes__002:
@ -4804,7 +4804,7 @@
[foo]: /url &quot;not a reference&quot;
&amp;ouml; not a character entity</p>
static: |-
<p data-sourcepos="1:1-9:50" dir="auto">*not emphasized*
<p data-sourcepos="1:1-9:51" dir="auto">*not emphasized*
&lt;br/&gt; not a tag
<a href="/foo">not a link</a>
`not code`
@ -6639,7 +6639,7 @@
canonical: |
<p>[bar][foo!]</p>
static: |-
<p data-sourcepos="1:1-1:32" dir="auto">[bar][foo<span>!</span>]</p>
<p data-sourcepos="1:1-1:33" dir="auto">[bar][foo<span>!</span>]</p>
wysiwyg: |-
<p>[bar][foo!]</p>
<pre>[foo!]: /url</pre>
@ -7043,7 +7043,7 @@
canonical: |
<p>!<a href="/url" title="title">foo</a></p>
static: |-
<p data-sourcepos="1:1-1:27" dir="auto"><span>!</span><a href="/url" title="title">foo</a></p>
<p data-sourcepos="1:1-1:28" dir="auto"><span>!</span><a href="/url" title="title">foo</a></p>
wysiwyg: |-
<p>!<a target="_blank" rel="noopener noreferrer nofollow" href="/url" title="title">foo</a></p>
<pre>[foo]: /url "title"</pre>

View File

@ -6849,7 +6849,7 @@ not have their usual Markdown meanings:</p>
<span id="LC6" class="line" lang="plaintext">\* not a list</span>
<span id="LC7" class="line" lang="plaintext">\# not a heading</span>
<span id="LC8" class="line" lang="plaintext">\[foo]: /url "not a reference"</span>
<span id="LC9" class="line" lang="plaintext">\&amp;ouml; not a character entity</span></code></pre>
<span id="LC9" class="line" lang="plaintext">\ö not a character entity</span></code></pre>
<copy-code></copy-code>
</div>
<div class="gl-relative markdown-code-block js-markdown-code">

View File

@ -2,33 +2,67 @@
module Banzai
module Filter
# See comments in MarkdownPreEscapeFilter for details on strategy
class MarkdownPostEscapeFilter < HTML::Pipeline::Filter
LITERAL_KEYWORD = MarkdownPreEscapeFilter::LITERAL_KEYWORD
LITERAL_REGEX = %r{#{LITERAL_KEYWORD}-(.*?)-#{LITERAL_KEYWORD}}.freeze
NOT_LITERAL_REGEX = %r{#{LITERAL_KEYWORD}-((%5C|\\).+?)-#{LITERAL_KEYWORD}}.freeze
SPAN_REGEX = %r{<span>(.*?)</span>}.freeze
CSS_A = 'a'
XPATH_A = Gitlab::Utils::Nokogiri.css_to_xpath(CSS_A).freeze
CSS_LANG_TAG = 'pre'
XPATH_LANG_TAG = Gitlab::Utils::Nokogiri.css_to_xpath(CSS_LANG_TAG).freeze
XPATH_A = Gitlab::Utils::Nokogiri.css_to_xpath('a').freeze
XPATH_LANG_TAG = Gitlab::Utils::Nokogiri.css_to_xpath('pre').freeze
def call
return doc unless result[:escaped_literals]
# For any literals that actually didn't get escape processed
# (for example in code blocks), remove the special sequence.
html.gsub!(NOT_LITERAL_REGEX, '\1')
new_html = unescaped_literals(doc.to_html)
new_html = add_spans(new_html)
# Replace any left over literal sequences with `span` so that our
# reference processing is short-circuited
html.gsub!(LITERAL_REGEX, '<span>\1</span>')
@doc = parse_html(new_html)
# Since literals are converted in links, we need to remove any surrounding `span`.
# Note: this could have been done in the renderer,
# Banzai::Renderer::CommonMark::HTML. However, we eventually want to use
# the built-in compiled renderer, rather than the ruby version, for speed.
# So let's do this work here.
remove_spans_in_certain_attributes
doc
end
private
# For any literals that actually didn't get escape processed
# (for example in code blocks), remove the special sequence.
def unescaped_literals(html)
html.gsub!(NOT_LITERAL_REGEX) do |match|
last_match = ::Regexp.last_match(1)
last_match_token = last_match.sub('%5C', '\\')
escaped_item = Banzai::Filter::MarkdownPreEscapeFilter::ESCAPABLE_CHARS.find { |item| item[:token] == last_match_token }
escaped_char = escaped_item ? escaped_item[:escaped] : last_match
escaped_char = escaped_char.sub('\\', '%5C') if last_match.start_with?('%5C')
escaped_char
end
html
end
# Replace any left over literal sequences with `span` so that our
# reference processing is short-circuited
def add_spans(html)
html.gsub!(LITERAL_REGEX) do |match|
last_match = ::Regexp.last_match(1)
last_match_token = "\\#{last_match}"
escaped_item = Banzai::Filter::MarkdownPreEscapeFilter::ESCAPABLE_CHARS.find { |item| item[:token] == last_match_token }
escaped_char = escaped_item ? escaped_item[:char] : ::Regexp.last_match(1)
"<span>#{escaped_char}</span>"
end
html
end
# Since literals are converted in links, we need to remove any surrounding `span`.
def remove_spans_in_certain_attributes
doc.xpath(XPATH_A).each do |node|
node.attributes['href'].value = node.attributes['href'].value.gsub(SPAN_REGEX, '\1') if node.attributes['href']
node.attributes['title'].value = node.attributes['title'].value.gsub(SPAN_REGEX, '\1') if node.attributes['title']
@ -37,8 +71,6 @@ module Banzai
doc.xpath(XPATH_LANG_TAG).each do |node|
node.attributes['lang'].value = node.attributes['lang'].value.gsub(SPAN_REGEX, '\1') if node.attributes['lang']
end
doc
end
end
end
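A standalone sketch of the post-filter's token-to-span step (the constant names mirror the filter above, but the table is trimmed to a single entry and the input is hypothetical): when markdown has consumed the backslash, the bare token maps back to the plain character wrapped in a `span`.

```ruby
LITERAL_KEYWORD = 'cmliteral'
LITERAL_REGEX   = /#{LITERAL_KEYWORD}-(.*?)-#{LITERAL_KEYWORD}/
ESCAPABLE_CHARS = [{ char: '$', escaped: '\$', token: '\+a' }].freeze

def add_spans(html)
  html.gsub(LITERAL_REGEX) do
    # Markdown stripped the backslash, so re-add it to look up the token.
    token = "\\#{Regexp.last_match(1)}"
    item  = ESCAPABLE_CHARS.find { |entry| entry[:token] == token }
    "<span>#{item ? item[:char] : Regexp.last_match(1)}</span>"
  end
end

puts add_spans('Total: cmliteral-+a-cmliteral') # Total: <span>$</span>
```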

View File

@ -10,6 +10,10 @@ module Banzai
# This way CommonMark will properly handle the backslash escaped chars
# but we will maintain knowledge (the sequence) that it was a literal.
#
# This processing is also important for the handling of escaped characters
# in LaTeX math. These will need to be converted back into their escaped
# versions if they are detected in math blocks.
#
# We need to surround the character, not just prefix it. It could
# get converted into an entity by CommonMark and we wouldn't know how many
# characters there are. The entire literal needs to be surrounded with
@ -24,9 +28,33 @@ module Banzai
# This filter does the initial surrounding, and MarkdownPostEscapeFilter
# does the conversion into span tags.
class MarkdownPreEscapeFilter < HTML::Pipeline::TextFilter
# We just need to target those that are special GitLab references
REFERENCE_CHARACTERS = '@#!$&~%^'
ASCII_PUNCTUATION = %r{(\\[#{REFERENCE_CHARACTERS}])}.freeze
# Table of characters that need this special handling. It consists of the
# GitLab special reference characters and special LaTeX characters.
#
# The `token` is used when we do the initial replacement - for example converting
# `\$` into `cmliteral-\+a-cmliteral`. We don't simply replace `\$` with `$`,
# because this can cause difficulties in parsing math blocks that use `$` as a
# delimiter. We also include a character that _can_ be escaped, `\+`. By examining
# the text once it's been passed to markdown, we can determine that `cmliteral-\+a-cmliteral`
# was in a block where markdown did _not_ escape the character, for example an inline
# code block or some other element. In this case, we must convert back to the
# original escaped version, `\$`. However if we detect `cmliteral-+a-cmliteral`,
# then we know markdown considered it an escaped character, and we should replace it
# with the non-escaped version, `$`.
# See the MarkdownPostEscapeFilter for how this is done.
ESCAPABLE_CHARS = [
{ char: '$', escaped: '\$', token: '\+a', reference: true, latex: true },
{ char: '%', escaped: '\%', token: '\+b', reference: true, latex: true },
{ char: '#', escaped: '\#', token: '\+c', reference: true, latex: true },
{ char: '&', escaped: '\&', token: '\+d', reference: true, latex: true },
{ char: '@', escaped: '\@', token: '\+h', reference: true, latex: false },
{ char: '!', escaped: '\!', token: '\+i', reference: true, latex: false },
{ char: '~', escaped: '\~', token: '\+j', reference: true, latex: false },
{ char: '^', escaped: '\^', token: '\+k', reference: true, latex: false }
].freeze
TARGET_CHARS = ESCAPABLE_CHARS.pluck(:char).join.freeze
ASCII_PUNCTUATION = %r{(\\[#{TARGET_CHARS}])}.freeze
LITERAL_KEYWORD = 'cmliteral'
def call
@ -35,7 +63,10 @@ module Banzai
# are found, we can bypass the post filter
result[:escaped_literals] = true
"#{LITERAL_KEYWORD}-#{match}-#{LITERAL_KEYWORD}"
escaped_item = ESCAPABLE_CHARS.find { |item| item[:escaped] == match }
token = escaped_item ? escaped_item[:token] : match
"#{LITERAL_KEYWORD}-#{token}-#{LITERAL_KEYWORD}"
end
end
end
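Correspondingly, a trimmed sketch of the pre-filter's replacement step (one-entry table, hypothetical input; `Regexp.escape` added for safety): each backslash-escaped target character is swapped for its token sequence before markdown runs.

```ruby
LITERAL_KEYWORD   = 'cmliteral'
ESCAPABLE_CHARS   = [{ char: '$', escaped: '\$', token: '\+a' }].freeze
TARGET_CHARS      = ESCAPABLE_CHARS.map { |entry| entry[:char] }.join
ASCII_PUNCTUATION = /(\\[#{Regexp.escape(TARGET_CHARS)}])/

def pre_escape(text)
  text.gsub(ASCII_PUNCTUATION) do |match|
    item  = ESCAPABLE_CHARS.find { |entry| entry[:escaped] == match }
    token = item ? item[:token] : match
    "#{LITERAL_KEYWORD}-#{token}-#{LITERAL_KEYWORD}"
  end
end

puts pre_escape('Total: \$10') # Total: cmliteral-\+a-cmliteral10
```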

View File

@ -0,0 +1,22 @@
# frozen_string_literal: true
module Gitlab
module BackgroundMigration
# Truncate the Vulnerability title_html if it exceeds 800 chars
class TruncateOverlongVulnerabilityHtmlTitles < BatchedMigrationJob
feature_category :vulnerability_management
scope_to ->(relation) { relation.where("LENGTH(title_html) > 800") }
operation_name :truncate_vulnerability_title_htmls
class Vulnerability < ApplicationRecord # rubocop:disable Style/Documentation
self.table_name = "vulnerabilities"
end
def perform
each_sub_batch do |sub_batch|
sub_batch.update_all("title_html = left(title_html, 800)")
end
end
end
end
end
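The SQL `left(title_html, 800)` keeps the first 800 characters of each overlong title. The same truncation rule in plain Ruby, purely for illustration:

```ruby
MAX_TITLE_LENGTH = 800

# Keep at most MAX_TITLE_LENGTH characters; shorter titles pass through unchanged.
def truncate_title(title_html)
  title_html.length > MAX_TITLE_LENGTH ? title_html[0, MAX_TITLE_LENGTH] : title_html
end

puts truncate_title('a' * 900).length # 800
puts truncate_title('a' * 544).length # 544
```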

View File

@ -17,6 +17,10 @@ module Gitlab
def self.file_compression_error
self.new('File compression/decompression failed')
end
def self.incompatible_import_file_error
self.new('The import file is incompatible')
end
end
end
end

View File

@ -1,132 +0,0 @@
# frozen_string_literal: true
module Gitlab
module ImportExport
module Group
class LegacyTreeRestorer
include Gitlab::Utils::StrongMemoize
attr_reader :user
attr_reader :shared
attr_reader :group
def initialize(user:, shared:, group:, group_hash:)
@user = user
@shared = shared
@group = group
@group_hash = group_hash
end
def restore
@group_attributes = relation_reader.consume_attributes(nil)
@group_members = relation_reader.consume_relation(nil, 'members')
.map(&:first)
# We need to remove `name` and `path` as we did consume it in previous pass
@group_attributes.delete('name')
@group_attributes.delete('path')
@children = @group_attributes.delete('children')
if members_mapper.map && restorer.restore
@children&.each do |group_hash|
group = create_group(group_hash: group_hash, parent_group: @group)
shared = Gitlab::ImportExport::Shared.new(group)
self.class.new(
user: @user,
shared: shared,
group: group,
group_hash: group_hash
).restore
end
end
return false if @shared.errors.any?
true
rescue StandardError => e
@shared.error(e)
false
end
private
def relation_reader
strong_memoize(:relation_reader) do
if @group_hash.present?
ImportExport::Json::LegacyReader::Hash.new(
@group_hash,
relation_names: reader.group_relation_names)
else
ImportExport::Json::LegacyReader::File.new(
File.join(shared.export_path, 'group.json'),
relation_names: reader.group_relation_names)
end
end
end
def restorer
@relation_tree_restorer ||= RelationTreeRestorer.new(
user: @user,
shared: @shared,
relation_reader: relation_reader,
members_mapper: members_mapper,
object_builder: object_builder,
relation_factory: relation_factory,
reader: reader,
importable: @group,
importable_attributes: @group_attributes,
importable_path: nil
)
end
def create_group(group_hash:, parent_group:)
group_params = {
name: group_hash['name'],
path: group_hash['path'],
parent_id: parent_group&.id,
visibility_level: sub_group_visibility_level(group_hash, parent_group)
}
::Groups::CreateService.new(@user, group_params).execute
end
def sub_group_visibility_level(group_hash, parent_group)
original_visibility_level = group_hash['visibility_level'] || Gitlab::VisibilityLevel::PRIVATE
if parent_group && parent_group.visibility_level < original_visibility_level
Gitlab::VisibilityLevel.closest_allowed_level(parent_group.visibility_level)
else
original_visibility_level
end
end
def members_mapper
@members_mapper ||= Gitlab::ImportExport::MembersMapper.new(
exported_members: @group_members,
user: @user,
importable: @group
)
end
def relation_factory
Gitlab::ImportExport::Group::RelationFactory
end
def object_builder
Gitlab::ImportExport::Group::ObjectBuilder
end
def reader
@reader ||= Gitlab::ImportExport::Reader.new(
shared: @shared,
config: Gitlab::ImportExport::Config.new(
config: Gitlab::ImportExport.legacy_group_config_file
).to_h
)
end
end
end
end
end

View File

@ -1,57 +0,0 @@
# frozen_string_literal: true
module Gitlab
module ImportExport
module Group
class LegacyTreeSaver
attr_reader :full_path, :shared
def initialize(group:, current_user:, shared:, params: {})
@params = params
@current_user = current_user
@shared = shared
@group = group
@full_path = File.join(@shared.export_path, ImportExport.group_filename)
end
def save
group_tree = serialize(@group, reader.group_tree)
tree_saver.save(group_tree, @shared.export_path, ImportExport.group_filename)
true
rescue StandardError => e
@shared.error(e)
false
end
private
def serialize(group, relations_tree)
group_tree = tree_saver.serialize(group, relations_tree)
group.children.each do |child|
group_tree['children'] ||= []
group_tree['children'] << serialize(child, relations_tree)
end
group_tree
rescue StandardError => e
@shared.error(e)
end
def reader
@reader ||= Gitlab::ImportExport::Reader.new(
shared: @shared,
config: Gitlab::ImportExport::Config.new(
config: Gitlab::ImportExport.legacy_group_config_file
).to_h
)
end
def tree_saver
@tree_saver ||= LegacyRelationTreeSaver.new
end
end
end
end
end

View File

@ -0,0 +1,16 @@
# frozen_string_literal: true
# rubocop:disable Gitlab/NamespacedClass
require 'device_detector'
module Gitlab
class SafeDeviceDetector < ::DeviceDetector
USER_AGENT_MAX_SIZE = 1024
def initialize(user_agent)
super(user_agent)
@user_agent = user_agent && user_agent[0..USER_AGENT_MAX_SIZE]
end
end
end
# rubocop:enable Gitlab/NamespacedClass
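One detail worth noting about the slice above: Ruby's `s[0..n]` range is inclusive, so it keeps `n + 1` characters, while `s[0, n]` keeps exactly `n`. Either form bounds the input for the ReDoS mitigation; the cap here just lands at 1025 rather than 1024 (hypothetical input below):

```ruby
USER_AGENT_MAX_SIZE = 1024
long_agent = 'Mozilla/5.0 ' + 'x' * 5000

puts long_agent[0..USER_AGENT_MAX_SIZE].length # 1025 (inclusive range slice)
puts long_agent[0, USER_AGENT_MAX_SIZE].length # 1024 (length-based slice)
```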

View File

@ -19504,7 +19504,7 @@ msgstr ""
msgid "GroupActivityMetrics|Recent activity"
msgstr ""
msgid "GroupImport|Failed to import group."
msgid "GroupImport|Failed to import group: %{error}"
msgstr ""
msgid "GroupImport|Group '%{group_name}' is being imported."

View File

@ -69,7 +69,7 @@ module RuboCop
message = format(
':warning: `%{job_name}` passed :green: but contained <%{job_url}|silenced offenses>. ' \
'See <%{docs_link}|docs>.',
docs_link: 'https://docs.gitlab.com/ee/development/contributing/style_guides.html#silenced-offenses',
docs_link: 'https://docs.gitlab.com/ee/development/rubocop_development_guide.html#silenced-offenses',
job_name: job_name,
job_url: job_url)

View File

@ -45,7 +45,7 @@ RSpec.describe Groups::ImportsController do
it 'sets a flash error' do
get :show, params: { group_id: group }
expect(flash[:alert]).to eq 'Failed to import group.'
expect(flash[:alert]).to eq 'Failed to import group: '
end
end

View File

@ -2,13 +2,20 @@
require 'spec_helper'
RSpec.describe Projects::GrafanaApiController do
let_it_be(:project) { create(:project) }
let_it_be(:user) { create(:user) }
RSpec.describe Projects::GrafanaApiController, feature_category: :metrics do
let_it_be(:project) { create(:project, :public) }
let_it_be(:reporter) { create(:user) }
let_it_be(:guest) { create(:user) }
let(:anonymous) { nil }
let(:user) { reporter }
before_all do
project.add_reporter(reporter)
project.add_guest(guest)
end
before do
project.add_reporter(user)
sign_in(user)
sign_in(user) if user
end
describe 'GET #proxy' do
@ -41,6 +48,39 @@ RSpec.describe Projects::GrafanaApiController do
end
end
shared_examples_for 'accessible' do
let(:service_result) { nil }
it 'returns a non-erroneous response' do
get :proxy, params: params
# We don't care about the specific code as long as it's not an error.
expect(response).to have_gitlab_http_status(:no_content)
end
end
shared_examples_for 'not accessible' do
let(:service_result) { nil }
it 'returns 404 Not found' do
get :proxy, params: params
expect(response).to have_gitlab_http_status(:not_found)
expect(Grafana::ProxyService).not_to have_received(:new)
end
end
shared_examples_for 'login required' do
let(:service_result) { nil }
it 'redirects to login page' do
get :proxy, params: params
expect(response).to redirect_to(new_user_session_path)
expect(Grafana::ProxyService).not_to have_received(:new)
end
end
context 'with a successful result' do
let(:service_result) { { status: :success, body: '{}' } }
@ -96,6 +136,38 @@ RSpec.describe Projects::GrafanaApiController do
it_behaves_like 'error response', :bad_request
end
end
context 'as guest' do
let(:user) { guest }
it_behaves_like 'not accessible'
end
context 'as anonymous' do
let(:user) { anonymous }
it_behaves_like 'not accessible'
end
context 'on a private project' do
let_it_be(:project) { create(:project, :private) }
before_all do
project.add_guest(guest)
end
context 'as anonymous' do
let(:user) { anonymous }
it_behaves_like 'login required'
end
context 'as guest' do
let(:user) { guest }
it_behaves_like 'accessible'
end
end
end
describe 'GET #metrics_dashboard' do

View File

@ -268,17 +268,35 @@ RSpec.describe UploadsController do
end
context "when not signed in" do
it "responds with status 200" do
get :show, params: { model: "user", mounted_as: "avatar", id: user.id, filename: "dk.png" }
context "when restricted visibility level is not set to public" do
before do
stub_application_setting(restricted_visibility_levels: [])
end
expect(response).to have_gitlab_http_status(:ok)
it "responds with status 200" do
get :show, params: { model: "user", mounted_as: "avatar", id: user.id, filename: "dk.png" }
expect(response).to have_gitlab_http_status(:ok)
end
it_behaves_like 'content publicly cached' do
subject do
get :show, params: { model: 'user', mounted_as: 'avatar', id: user.id, filename: 'dk.png' }
response
end
end
end
it_behaves_like 'content publicly cached' do
subject do
get :show, params: { model: 'user', mounted_as: 'avatar', id: user.id, filename: 'dk.png' }
context "when restricted visibility level is set to public" do
before do
stub_application_setting(restricted_visibility_levels: [Gitlab::VisibilityLevel::PUBLIC])
end
response
it "responds with status 401" do
get :show, params: { model: "user", mounted_as: "avatar", id: user.id, filename: "dk.png" }
expect(response).to have_gitlab_http_status(:unauthorized)
end
end
end

View File

@ -2,7 +2,7 @@
require 'spec_helper'
RSpec.describe SubmoduleHelper do
RSpec.describe SubmoduleHelper, feature_category: :source_code_management do
include RepoHelpers
let(:submodule_item) { double(id: 'hash', path: 'rack') }

View File

@ -2,7 +2,7 @@
require 'spec_helper'
RSpec.describe Banzai::Pipeline::FullPipeline do
RSpec.describe Banzai::Pipeline::FullPipeline, feature_category: :team_planning do
describe 'References' do
let(:project) { create(:project, :public) }
let(:issue) { create(:issue, project: project) }

View File

@ -2,24 +2,25 @@
require 'spec_helper'
RSpec.describe Banzai::Pipeline::PlainMarkdownPipeline do
RSpec.describe Banzai::Pipeline::PlainMarkdownPipeline, feature_category: :team_planning do
using RSpec::Parameterized::TableSyntax
describe 'backslash escapes', :aggregate_failures do
let_it_be(:project) { create(:project, :public) }
let_it_be(:issue) { create(:issue, project: project) }
it 'converts all reference punctuation to literals' do
reference_chars = Banzai::Filter::MarkdownPreEscapeFilter::REFERENCE_CHARACTERS
markdown = reference_chars.split('').map { |char| char.prepend("\\") }.join
punctuation = Banzai::Filter::MarkdownPreEscapeFilter::REFERENCE_CHARACTERS.split('')
punctuation = punctuation.delete_if { |char| char == '&' }
punctuation << '&amp;'
it 'converts all escapable punctuation to literals' do
markdown = Banzai::Filter::MarkdownPreEscapeFilter::ESCAPABLE_CHARS.pluck(:escaped).join
result = described_class.call(markdown, project: project)
output = result[:output].to_html
punctuation.each { |char| expect(output).to include("<span>#{char}</span>") }
Banzai::Filter::MarkdownPreEscapeFilter::ESCAPABLE_CHARS.pluck(:char).each do |char|
char = '&amp;' if char == '&'
expect(output).to include("<span>#{char}</span>")
end
expect(result[:escaped_literals]).to be_truthy
end
@ -33,12 +34,12 @@ RSpec.describe Banzai::Pipeline::PlainMarkdownPipeline do
end.compact
reference_chars.all? do |char|
Banzai::Filter::MarkdownPreEscapeFilter::REFERENCE_CHARACTERS.include?(char)
Banzai::Filter::MarkdownPreEscapeFilter::TARGET_CHARS.include?(char)
end
end
it 'does not convert non-reference punctuation to spans' do
markdown = %q(\"\'\*\+\,\-\.\/\:\;\<\=\>\?\[\]\_\`\{\|\}) + %q[\(\)\\\\]
it 'does not convert non-reference/latex punctuation to spans' do
markdown = %q(\"\'\*\+\,\-\.\/\:\;\<\=\>\?\[\]\`\|) + %q[\(\)\\\\]
result = described_class.call(markdown, project: project)
output = result[:output].to_html
@ -55,7 +56,7 @@ RSpec.describe Banzai::Pipeline::PlainMarkdownPipeline do
expect(result[:escaped_literals]).to be_falsey
end
describe 'backslash escapes do not work in code blocks, code spans, autolinks, or raw HTML' do
describe 'backslash escapes are untouched in code blocks, code spans, autolinks, or raw HTML' do
where(:markdown, :expected) do
%q(`` \@\! ``) | %q(<code>\@\!</code>)
%q( \@\!) | %Q(<code>\\@\\!\n</code>)

View File

@ -0,0 +1,78 @@
# frozen_string_literal: true
require 'spec_helper'
# rubocop:disable Layout/LineLength
RSpec.describe Gitlab::BackgroundMigration::TruncateOverlongVulnerabilityHtmlTitles, schema: 20221110100602, feature_category: :vulnerability_management do
# rubocop:enable Layout/LineLength
let(:namespaces) { table(:namespaces) }
let(:projects) { table(:projects) }
let(:vulnerabilities) { table(:vulnerabilities) }
let(:users) { table(:users) }
let(:namespace) { namespaces.create!(name: 'name', path: 'path') }
let(:project) do
projects
.create!(name: "project", path: "project", namespace_id: namespace.id, project_namespace_id: namespace.id)
end
let!(:user) { create_user! }
let!(:vulnerability_1) { create_vulnerability!(title_html: 'a' * 900, project_id: project.id, author_id: user.id) }
let!(:vulnerability_2) { create_vulnerability!(title_html: 'a' * 801, project_id: project.id, author_id: user.id) }
let!(:vulnerability_3) { create_vulnerability!(title_html: 'a' * 800, project_id: project.id, author_id: user.id) }
let!(:vulnerability_4) { create_vulnerability!(title_html: 'a' * 544, project_id: project.id, author_id: user.id) }
subject do
described_class.new(
start_id: vulnerabilities.minimum(:id),
end_id: vulnerabilities.maximum(:id),
batch_table: :vulnerabilities,
batch_column: :id,
sub_batch_size: 200,
pause_ms: 2.minutes,
connection: ApplicationRecord.connection
)
end
describe '#perform' do
it 'truncates the vulnerability html title when longer than 800 characters' do
subject.perform
expect(vulnerability_1.reload.title_html.length).to eq(800)
expect(vulnerability_2.reload.title_html.length).to eq(800)
expect(vulnerability_3.reload.title_html.length).to eq(800)
expect(vulnerability_4.reload.title_html.length).to eq(544)
end
end
private
# rubocop:disable Metrics/ParameterLists
def create_vulnerability!(
project_id:, author_id:, title: 'test', title_html: 'test', severity: 7, confidence: 7, report_type: 0, state: 1,
dismissed_at: nil
)
vulnerabilities.create!(
project_id: project_id,
author_id: author_id,
title: title,
title_html: title_html,
severity: severity,
confidence: confidence,
report_type: report_type,
state: state,
dismissed_at: dismissed_at
)
end
# rubocop:enable Metrics/ParameterLists
def create_user!(name: "Example User", email: "user@example.com", user_type: nil)
users.create!(
name: name,
email: email,
username: name,
projects_limit: 10
)
end
end

View File

@ -1,153 +0,0 @@
# frozen_string_literal: true
require 'spec_helper'
RSpec.describe Gitlab::ImportExport::Group::LegacyTreeRestorer do
include ImportExport::CommonUtil
let(:shared) { Gitlab::ImportExport::Shared.new(group) }
describe 'restore group tree' do
before_all do
# Using an admin for import, so we can check assignment of existing members
user = create(:admin, email: 'root@gitlabexample.com')
create(:user, email: 'adriene.mcclure@gitlabexample.com')
create(:user, email: 'gwendolyn_robel@gitlabexample.com')
RSpec::Mocks.with_temporary_scope do
@group = create(:group, name: 'group', path: 'group')
@shared = Gitlab::ImportExport::Shared.new(@group)
setup_import_export_config('group_exports/complex')
group_tree_restorer = described_class.new(user: user, shared: @shared, group: @group, group_hash: nil)
@restored_group_json = group_tree_restorer.restore
end
end
context 'JSON' do
it 'restores models based on JSON' do
expect(@restored_group_json).to be_truthy
end
it 'has the group description' do
expect(Group.find_by_path('group').description).to eq('Group Description')
end
it 'has group labels' do
expect(@group.labels.count).to eq(10)
end
context 'issue boards' do
it 'has issue boards' do
expect(@group.boards.count).to eq(1)
end
it 'has board label lists' do
lists = @group.boards.find_by(name: 'first board').lists
expect(lists.count).to eq(3)
expect(lists.first.label.title).to eq('TSL')
expect(lists.second.label.title).to eq('Sosync')
end
end
it 'has badges' do
expect(@group.badges.count).to eq(1)
end
it 'has milestones' do
expect(@group.milestones.count).to eq(5)
end
it 'has group children' do
expect(@group.children.count).to eq(2)
end
it 'has group members' do
expect(@group.members.map(&:user).map(&:email)).to contain_exactly('root@gitlabexample.com', 'adriene.mcclure@gitlabexample.com', 'gwendolyn_robel@gitlabexample.com')
end
end
end
context 'excluded attributes' do
let!(:source_user) { create(:user, id: 123) }
let!(:importer_user) { create(:user) }
let(:group) { create(:group) }
let(:shared) { Gitlab::ImportExport::Shared.new(group) }
let(:group_tree_restorer) { described_class.new(user: importer_user, shared: shared, group: group, group_hash: nil) }
let(:group_json) { Gitlab::Json.parse(File.read(File.join(shared.export_path, 'group.json'))) }
shared_examples 'excluded attributes' do
excluded_attributes = %w[
id
owner_id
parent_id
created_at
updated_at
runners_token
runners_token_encrypted
saml_discovery_token
]
before do
group.add_owner(importer_user)
setup_import_export_config('group_exports/complex')
end
excluded_attributes.each do |excluded_attribute|
it 'does not allow override of excluded attributes' do
expect(group_json[excluded_attribute]).not_to eq(group.public_send(excluded_attribute))
end
end
end
include_examples 'excluded attributes'
end
context 'group.json file access check' do
let(:user) { create(:user) }
let!(:group) { create(:group, name: 'group2', path: 'group2') }
let(:group_tree_restorer) { described_class.new(user: user, shared: shared, group: group, group_hash: nil) }
let(:restored_group_json) { group_tree_restorer.restore }
it 'does not read a symlink' do
Dir.mktmpdir do |tmpdir|
setup_symlink(tmpdir, 'group.json')
allow(shared).to receive(:export_path).and_call_original
expect(group_tree_restorer.restore).to eq(false)
expect(shared.errors).to include('Incorrect JSON format')
end
end
end
context 'group visibility levels' do
let(:user) { create(:user) }
let(:shared) { Gitlab::ImportExport::Shared.new(group) }
let(:group_tree_restorer) { described_class.new(user: user, shared: shared, group: group, group_hash: nil) }
before do
setup_import_export_config(filepath)
group_tree_restorer.restore
end
shared_examples 'with visibility level' do |visibility_level, expected_visibilities|
context "when visibility level is #{visibility_level}" do
let(:group) { create(:group, visibility_level) }
let(:filepath) { "group_exports/visibility_levels/#{visibility_level}" }
it "imports all subgroups as #{visibility_level}" do
expect(group.children.map(&:visibility_level)).to match_array(expected_visibilities)
end
end
end
include_examples 'with visibility level', :public, [20, 10, 0]
include_examples 'with visibility level', :private, [0, 0, 0]
include_examples 'with visibility level', :internal, [10, 10, 0]
end
end

View File

@ -1,159 +0,0 @@
# frozen_string_literal: true
require 'spec_helper'
RSpec.describe Gitlab::ImportExport::Group::LegacyTreeSaver do
describe 'saves the group tree into a json object' do
let(:shared) { Gitlab::ImportExport::Shared.new(group) }
let(:group_tree_saver) { described_class.new(group: group, current_user: user, shared: shared) }
let(:export_path) { "#{Dir.tmpdir}/group_tree_saver_spec" }
let(:user) { create(:user) }
let!(:group) { setup_group }
before do
group.add_maintainer(user)
allow(Gitlab::ImportExport).to receive(:storage_path).and_return(export_path)
end
after do
FileUtils.rm_rf(export_path)
end
it 'saves group successfully' do
expect(group_tree_saver.save).to be true
end
# It is mostly duplicated in
# `spec/lib/gitlab/import_export/fast_hash_serializer_spec.rb`
# except:
# context 'with description override' do
# context 'group members' do
# ^ These are specific for the Group::LegacyTreeSaver
context 'JSON' do
let(:saved_group_json) do
group_tree_saver.save # rubocop:disable Rails/SaveBang
group_json(group_tree_saver.full_path)
end
it 'saves the correct json' do
expect(saved_group_json).to include({ 'description' => 'description' })
end
it 'has milestones' do
expect(saved_group_json['milestones']).not_to be_empty
end
it 'has labels' do
expect(saved_group_json['labels']).not_to be_empty
end
it 'has boards' do
expect(saved_group_json['boards']).not_to be_empty
end
it 'has board label list' do
expect(saved_group_json['boards'].first['lists']).not_to be_empty
end
it 'has group members' do
expect(saved_group_json['members']).not_to be_empty
end
it 'has priorities associated to labels' do
expect(saved_group_json['labels'].first['priorities']).not_to be_empty
end
it 'has badges' do
expect(saved_group_json['badges']).not_to be_empty
end
context 'group children' do
let(:children) { group.children }
it 'exports group children' do
expect(saved_group_json['children'].length).to eq(children.count)
end
it 'exports group children of children' do
expect(saved_group_json['children'].first['children'].length).to eq(children.first.children.count)
end
end
context 'group members' do
let(:user2) { create(:user, email: 'group@member.com') }
let(:member_emails) do
saved_group_json['members'].map do |pm|
pm['user']['public_email']
end
end
before do
user2.update!(public_email: user2.email)
group.add_developer(user2)
end
it 'exports group members as group owner' do
group.add_owner(user)
expect(member_emails).to include('group@member.com')
end
context 'as admin' do
let(:user) { create(:admin) }
it 'exports group members as admin' do
expect(member_emails).to include('group@member.com')
end
it 'exports group members' do
member_types = saved_group_json['members'].map { |pm| pm['source_type'] }
expect(member_types).to all(eq('Namespace'))
end
end
end
context 'group attributes' do
shared_examples 'excluded attributes' do
excluded_attributes = %w[
id
owner_id
parent_id
created_at
updated_at
runners_token
runners_token_encrypted
saml_discovery_token
]
excluded_attributes.each do |excluded_attribute|
it 'does not contain excluded attribute' do
expect(saved_group_json).not_to include(excluded_attribute => group.public_send(excluded_attribute))
end
end
end
include_examples 'excluded attributes'
end
end
end
def setup_group
group = create(:group, description: 'description')
sub_group = create(:group, description: 'description', parent: group)
create(:group, description: 'description', parent: sub_group)
create(:milestone, group: group)
create(:group_badge, group: group)
group_label = create(:group_label, group: group)
create(:label_priority, label: group_label, priority: 1)
board = create(:board, group: group, milestone_id: Milestone::Upcoming.id)
create(:list, board: board, label: group_label)
create(:group_badge, group: group)
group
end
def group_json(filename)
::JSON.parse(File.read(filename))
end
end

View File

@@ -0,0 +1,20 @@
# frozen_string_literal: true
require 'fast_spec_helper'
require 'device_detector'
require_relative '../../../lib/gitlab/safe_device_detector'
RSpec.describe Gitlab::SafeDeviceDetector, feature_category: :authentication_and_authorization do
it 'retains the behavior for normal user agents' do
chrome_user_agent = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 \
(KHTML, like Gecko) Chrome/108.0.0.0 Safari/537.36"
expect(described_class.new(chrome_user_agent).user_agent).to be_eql(chrome_user_agent)
expect(described_class.new(chrome_user_agent).name).to be_eql('Chrome')
end
it 'truncates big user agents' do
big_user_agent = "chrome #{'abc' * 1024}"
expect(described_class.new(big_user_agent).user_agent).not_to be_eql(big_user_agent)
end
end
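The spec above exercises a wrapper that caps user-agent length before any regex-heavy parsing runs. The guard itself can be sketched in plain Ruby (the 2048-byte cap and the method name here are assumptions, not GitLab's actual values):

```ruby
# Hypothetical sketch of the truncation guard the spec exercises: cap the
# user-agent string before handing it to a regex-based parser, bounding
# the input a pathological pattern could backtrack over.
USER_AGENT_MAX_LEN = 2048 # assumed cap; the real limit may differ

def safe_user_agent(user_agent)
  return user_agent if user_agent.nil? || user_agent.size <= USER_AGENT_MAX_LEN

  user_agent[0...USER_AGENT_MAX_LEN]
end

safe_user_agent('Mozilla/5.0')            # short agents pass through unchanged
safe_user_agent("chrome #{'abc' * 1024}") # oversized agents come back capped
```

This matches the spec's two cases: a normal Chrome agent is returned untouched, while the `'abc' * 1024` agent no longer equals its input.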

View File

@@ -11,7 +11,7 @@ RSpec.describe DeviseMailer do
subject { described_class.confirmation_instructions(user, 'faketoken', {}) }
context "when confirming a new account" do
let(:user) { build(:user, created_at: 1.minute.ago, unconfirmed_email: nil) }
let(:user) { create(:user, created_at: 1.minute.ago) }
it "shows the expected text" do
expect(subject.body.encoded).to have_text "Welcome"
@@ -20,7 +20,13 @@ RSpec.describe DeviseMailer do
end
context "when confirming the unconfirmed_email" do
let(:user) { build(:user, unconfirmed_email: 'jdoe@example.com') }
subject { described_class.confirmation_instructions(user, user.confirmation_token, { to: user.unconfirmed_email }) }
let(:user) { create(:user) }
before do
user.update!(email: 'unconfirmed-email@example.com')
end
it "shows the expected text" do
expect(subject.body.encoded).not_to have_text "Welcome"
@@ -30,7 +36,7 @@ RSpec.describe DeviseMailer do
end
context "when re-confirming the primary email after a security issue" do
let(:user) { build(:user, created_at: 10.days.ago, unconfirmed_email: nil) }
let(:user) { create(:user, created_at: Devise.confirm_within.ago) }
it "shows the expected text" do
expect(subject.body.encoded).not_to have_text "Welcome"

View File

@@ -0,0 +1,33 @@
# frozen_string_literal: true
require 'spec_helper'
require_migration!
RSpec.describe DeleteQueuedJobsForVulnerabilitiesFeedbackMigration, feature_category: :vulnerability_management do
let!(:migration) { described_class.new }
let(:batched_background_migrations) { table(:batched_background_migrations) }
before do
batched_background_migrations.create!(
max_value: 10,
batch_size: 250,
sub_batch_size: 50,
interval: 300,
job_class_name: 'MigrateVulnerabilitiesFeedbackToVulnerabilitiesStateTransition',
table_name: 'vulnerability_feedback',
column_name: 'id',
job_arguments: [],
gitlab_schema: "gitlab_main"
)
end
describe "#up" do
it "deletes all batched migration records" do
expect(batched_background_migrations.count).to eq(1)
migration.up
expect(batched_background_migrations.count).to eq(0)
end
end
end

View File

@@ -2,7 +2,7 @@
require 'spec_helper'
RSpec.describe Environment, :use_clean_rails_memory_store_caching do
RSpec.describe Environment, :use_clean_rails_memory_store_caching, feature_category: :continuous_delivery do
include ReactiveCachingHelpers
using RSpec::Parameterized::TableSyntax
include RepoHelpers
@@ -2029,4 +2029,40 @@ RSpec.describe Environment, :use_clean_rails_memory_store_caching do
subject
end
end
describe '#deployed_and_updated_before' do
subject do
described_class.deployed_and_updated_before(project_id, before)
end
let(:project_id) { project.id }
let(:before) { 1.week.ago.to_date.to_s }
let(:environment) { create(:environment, project: project, updated_at: 2.weeks.ago) }
let!(:stale_deployment) { create(:deployment, environment: environment, updated_at: 2.weeks.ago) }
it 'excludes environments with recent deployments' do
create(:deployment, environment: environment, updated_at: Date.current)
is_expected.to match_array([])
end
it 'includes environments with no deployments' do
environment1 = create(:environment, project: project, updated_at: 2.weeks.ago)
is_expected.to match_array([environment, environment1])
end
it 'excludes environments that have been recently updated with no deployments' do
create(:environment, project: project)
is_expected.to match_array([environment])
end
it 'excludes environments that have been recently updated with stale deployments' do
environment1 = create(:environment, project: project)
create(:deployment, environment: environment1, updated_at: 2.weeks.ago)
is_expected.to match_array([environment])
end
end
end
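The new `#deployed_and_updated_before` examples pin down a two-part filter: an environment qualifies when it was last updated before the cutoff and none of its deployments is newer than the cutoff. The selection logic can be modeled in plain Ruby (a hypothetical `Env` struct, not the ActiveRecord scope itself):

```ruby
require 'date'

# Hypothetical in-memory model of the filter the scope applies: keep
# environments last updated before `before` whose deployments (if any)
# were all also updated before `before`.
Env = Struct.new(:name, :updated_at, :deployment_dates)

def deployed_and_updated_before(envs, before)
  envs.select do |env|
    env.updated_at < before && env.deployment_dates.all? { |d| d < before }
  end
end

cutoff = Date.new(2023, 1, 1)
stale  = Env.new('stale',  Date.new(2022, 12, 1), [Date.new(2022, 12, 10)])
fresh  = Env.new('fresh',  Date.new(2022, 12, 1), [Date.new(2023, 1, 5)])
never  = Env.new('never',  Date.new(2022, 11, 1), [])
recent = Env.new('recent', Date.new(2023, 1, 2),  [])

deployed_and_updated_before([stale, fresh, never, recent], cutoff)
# stale and never-deployed environments survive; a recent deployment or a
# recent update excludes an environment, matching the four examples above
```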

View File

@@ -2,7 +2,7 @@
require 'spec_helper'
RSpec.describe WebHook do
RSpec.describe WebHook, feature_category: :integrations do
include AfterNextHelpers
let_it_be(:project) { create(:project) }
@@ -225,6 +225,32 @@ RSpec.describe WebHook do
end
end
describe 'before_validation :reset_url_variables' do
subject(:hook) { build_stubbed(:project_hook, :url_variables, project: project, url: 'http://example.com/{abc}') }
it 'resets url variables if url changed' do
hook.url = 'http://example.com/new-hook'
expect(hook).to be_valid
expect(hook.url_variables).to eq({})
end
it 'resets url variables if url is changed but url variables stayed the same' do
hook.url = 'http://test.example.com/{abc}'
expect(hook).not_to be_valid
expect(hook.url_variables).to eq({})
end
it 'does not reset url variables if both url and url variables are changed' do
hook.url = 'http://example.com/{one}/{two}'
hook.url_variables = { 'one' => 'foo', 'two' => 'bar' }
expect(hook).to be_valid
expect(hook.url_variables).to eq({ 'one' => 'foo', 'two' => 'bar' })
end
end
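The three examples above encode one rule: changing the URL invalidates the stored variables unless fresh variables arrive in the same change. As a plain-Ruby sketch (a hypothetical helper, not the model's actual `before_validation` callback):

```ruby
# Hypothetical sketch of the guard the spec exercises: when the URL
# changes, stored URL variables are dropped unless new variables were
# supplied alongside the change, so secrets cannot leak to a new host.
def resolve_url_variables(url_changed:, variables_changed:, variables:)
  return {} if url_changed && !variables_changed

  variables
end

resolve_url_variables(url_changed: true, variables_changed: false,
                      variables: { 'abc' => 'x' }) # => {}
```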
it "only considers these branch filter strategies valid" do
expected_valid_types = %w[all_branches regex wildcard]
expect(described_class.branch_filter_strategies.keys).to contain_exactly(*expected_valid_types)

View File

@@ -361,6 +361,34 @@ RSpec.describe User do
end
end
end
describe 'confirmation instructions for unconfirmed email' do
let(:unconfirmed_email) { 'first-unconfirmed-email@example.com' }
let(:another_unconfirmed_email) { 'another-unconfirmed-email@example.com' }
context 'when email is changed to another before performing the job that sends confirmation instructions for previous email change request' do
it "mentions the recipient's email in the message body", :aggregate_failures do
same_user = User.find(user.id)
same_user.update!(email: unconfirmed_email)
user.update!(email: another_unconfirmed_email)
perform_enqueued_jobs
confirmation_instructions_for_unconfirmed_email = ActionMailer::Base.deliveries.find do |message|
message.subject == 'Confirmation instructions' && message.to.include?(unconfirmed_email)
end
expect(confirmation_instructions_for_unconfirmed_email.html_part.body.encoded).to match same_user.unconfirmed_email
expect(confirmation_instructions_for_unconfirmed_email.text_part.body.encoded).to match same_user.unconfirmed_email
confirmation_instructions_for_another_unconfirmed_email = ActionMailer::Base.deliveries.find do |message|
message.subject == 'Confirmation instructions' && message.to.include?(another_unconfirmed_email)
end
expect(confirmation_instructions_for_another_unconfirmed_email.html_part.body.encoded).to match user.unconfirmed_email
expect(confirmation_instructions_for_another_unconfirmed_email.text_part.body.encoded).to match user.unconfirmed_email
end
end
end
end
describe 'validations' do

View File

@@ -668,6 +668,35 @@ RSpec.describe ProjectPolicy do
end
end
describe 'read_grafana', feature_category: :metrics do
using RSpec::Parameterized::TableSyntax
let(:policy) { :read_grafana }
where(:project_visibility, :role, :allowed) do
:public | :anonymous | false
:public | :guest | false
:public | :reporter | true
:internal | :anonymous | false
:internal | :guest | true
:internal | :reporter | true
:private | :anonymous | false
:private | :guest | true
:private | :reporter | true
end
with_them do
let(:current_user) { public_send(role) }
let(:project) { public_send("#{project_visibility}_project") }
if params[:allowed]
it { is_expected.to be_allowed(policy) }
else
it { is_expected.not_to be_allowed(policy) }
end
end
end
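The `where` table above is exhaustive, so the policy it verifies can be captured as a small lookup (a hypothetical helper mirroring the spec's truth table, not the actual policy DSL):

```ruby
# Hypothetical lookup mirroring the spec's truth table for :read_grafana.
# Anonymous users are always denied; guests are denied only on public
# projects, where Grafana API access is now restricted.
READ_GRAFANA = {
  public:   { anonymous: false, guest: false, reporter: true },
  internal: { anonymous: false, guest: true,  reporter: true },
  private:  { anonymous: false, guest: true,  reporter: true }
}.freeze

def read_grafana_allowed?(visibility, role)
  READ_GRAFANA.fetch(visibility).fetch(role)
end

read_grafana_allowed?(:public, :guest)     # => false
read_grafana_allowed?(:private, :reporter) # => true
```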
describe 'update_max_artifacts_size' do
context 'when no user' do
let(:current_user) { anonymous }

View File

@@ -68,7 +68,7 @@ RSpec.describe RuboCop::CheckGracefulTask do
let(:user_name) { 'GitLab Bot' }
let(:job_name) { 'some job name' }
let(:job_url) { 'some job url' }
let(:docs_link) { 'https://docs.gitlab.com/ee/development/contributing/style_guides.html#silenced-offenses' }
let(:docs_link) { 'https://docs.gitlab.com/ee/development/rubocop_development_guide.html#silenced-offenses' }
before do
env = {

View File

@@ -0,0 +1,49 @@
# frozen_string_literal: true
require 'spec_helper'
RSpec.describe Environments::StopStaleService,
:clean_gitlab_redis_shared_state,
:sidekiq_inline,
feature_category: :continuous_delivery do
let_it_be(:project) { create(:project, :repository) }
let_it_be(:user) { create(:user) }
let(:params) { { after: nil } }
let(:service) { described_class.new(project, user, params) }
describe '#execute' do
subject { service.execute }
let_it_be(:project) { create(:project, :repository) }
let_it_be(:user) { create(:user) }
let_it_be(:stale_environment) { create(:environment, project: project, updated_at: 2.weeks.ago) }
let_it_be(:stale_environment2) { create(:environment, project: project, updated_at: 2.weeks.ago) }
let_it_be(:recent_environment) { create(:environment, project: project, updated_at: Date.today) }
let_it_be(:params) { { before: 1.week.ago } }
before do
allow(service).to receive(:can?).with(user, :stop_environment, project).and_return(true)
end
it 'only stops stale environments' do
spy_service = Environments::AutoStopWorker.new
allow(Environments::AutoStopWorker).to receive(:new) { spy_service }
expect(spy_service).to receive(:perform).with(stale_environment.id).and_call_original
expect(spy_service).to receive(:perform).with(stale_environment2.id).and_call_original
expect(spy_service).not_to receive(:perform).with(recent_environment.id)
expect(Environment).to receive(:deployed_and_updated_before).with(project.id, params[:before]).and_call_original
expect(Environment).to receive(:without_protected).with(project).and_call_original
expect(subject.success?).to be_truthy
expect(stale_environment.reload).to be_stopped
expect(stale_environment2.reload).to be_stopped
expect(recent_environment.reload).to be_available
end
end
end

View File

@@ -2,7 +2,7 @@
require 'spec_helper'
RSpec.describe ErrorTracking::ListProjectsService do
RSpec.describe ErrorTracking::ListProjectsService, feature_category: :integrations do
let_it_be(:user) { create(:user) }
let_it_be(:project, reload: true) { create(:project) }
@@ -51,15 +51,33 @@ RSpec.describe ErrorTracking::ListProjectsService do
end
context 'masked param token' do
let(:params) { ActionController::Parameters.new(token: "*********", api_host: new_api_host) }
let(:params) { ActionController::Parameters.new(token: "*********", api_host: api_host) }
before do
expect(error_tracking_setting).to receive(:list_sentry_projects)
context 'with the current api host' do
let(:api_host) { 'https://sentrytest.gitlab.com' }
before do
expect(error_tracking_setting).to receive(:list_sentry_projects)
.and_return({ projects: [] })
end
it 'uses database token' do
expect { subject.execute }.not_to change { error_tracking_setting.token }
end
end
it 'uses database token' do
expect { subject.execute }.not_to change { error_tracking_setting.token }
context 'with a new api host' do
let(:api_host) { new_api_host }
it 'returns an error' do
expect(result[:message]).to start_with('Token is a required field')
expect(error_tracking_setting).not_to be_valid
expect(error_tracking_setting).not_to receive(:list_sentry_projects)
end
it 'resets the token' do
expect { subject.execute }.to change { error_tracking_setting.token }.from(token).to(nil)
end
end
end

View File

@@ -56,21 +56,11 @@ RSpec.describe Groups::ImportExport::ExportService do
end
it 'saves the models using ndjson tree saver' do
stub_feature_flags(group_export_ndjson: true)
expect(Gitlab::ImportExport::Group::TreeSaver).to receive(:new).and_call_original
service.execute
end
it 'saves the models using legacy tree saver' do
stub_feature_flags(group_export_ndjson: false)
expect(Gitlab::ImportExport::Group::LegacyTreeSaver).to receive(:new).and_call_original
service.execute
end
it 'compresses and removes tmp files' do
expect(group.import_export_upload).to be_nil
expect(Gitlab::ImportExport::Saver).to receive(:new).and_call_original

View File

@@ -59,32 +59,32 @@ RSpec.describe Groups::ImportExport::ImportService do
end
end
context 'with group_import_ndjson feature flag disabled' do
context 'when importing a ndjson export' do
let(:user) { create(:user) }
let(:group) { create(:group) }
let(:import_file) { fixture_file_upload('spec/fixtures/group_export.tar.gz') }
let(:import_logger) { instance_double(Gitlab::Import::Logger) }
subject(:service) { described_class.new(group: group, user: user) }
before do
stub_feature_flags(group_import_ndjson: false)
group.add_owner(user)
ImportExportUpload.create!(group: group, import_file: import_file)
allow(Gitlab::Import::Logger).to receive(:build).and_return(import_logger)
allow(import_logger).to receive(:error)
allow(import_logger).to receive(:info)
allow(import_logger).to receive(:warn)
allow(FileUtils).to receive(:rm_rf).and_call_original
end
context 'with a json file' do
let(:import_file) { fixture_file_upload('spec/fixtures/legacy_group_export.tar.gz') }
context 'when user has correct permissions' do
before do
group.add_owner(user)
end
it 'uses LegacyTreeRestorer to import the file' do
expect(Gitlab::ImportExport::Group::LegacyTreeRestorer).to receive(:new).and_call_original
service.execute
it 'imports group structure successfully' do
expect(service.execute).to be_truthy
end
it 'tracks the event' do
@@ -95,317 +95,151 @@ RSpec.describe Groups::ImportExport::ImportService do
action: 'create',
label: 'import_group_from_file'
)
expect_snowplow_event(
category: 'Groups::ImportExport::ImportService',
action: 'create',
label: 'import_access_level',
user: user,
extra: { user_role: 'Owner', import_type: 'import_group_from_file' }
)
end
it 'removes import file' do
service.execute
expect(group.import_export_upload.import_file.file).to be_nil
end
it 'removes tmp files' do
shared = Gitlab::ImportExport::Shared.new(group)
allow(Gitlab::ImportExport::Shared).to receive(:new).and_return(shared)
service.execute
expect(FileUtils).to have_received(:rm_rf).with(shared.base_path)
expect(Dir.exist?(shared.base_path)).to eq(false)
end
it 'logs the import success' do
expect(import_logger).to receive(:info).with(
group_id: group.id,
group_name: group.name,
message: 'Group Import/Export: Import succeeded'
).once
service.execute
end
end
context 'with a ndjson file' do
let(:import_file) { fixture_file_upload('spec/fixtures/group_export.tar.gz') }
context 'when user does not have correct permissions' do
it 'logs the error and raises an exception' do
expect(import_logger).to receive(:error).with(
group_id: group.id,
group_name: group.name,
message: a_string_including('Errors occurred')
)
it 'fails to import' do
expect { service.execute }.to raise_error(Gitlab::ImportExport::Error, 'Incorrect JSON format')
expect { service.execute }.to raise_error(Gitlab::ImportExport::Error)
end
it 'tracks the error' do
shared = Gitlab::ImportExport::Shared.new(group)
allow(Gitlab::ImportExport::Shared).to receive(:new).and_return(shared)
expect(shared).to receive(:error) do |param|
expect(param.message).to include 'does not have required permissions for'
end
expect { service.execute }.to raise_error(Gitlab::ImportExport::Error)
end
end
context 'when there are errors with the import file' do
let(:import_file) { fixture_file_upload('spec/fixtures/symlink_export.tar.gz') }
it 'logs the error and raises an exception' do
expect(import_logger).to receive(:error).with(
group_id: group.id,
group_name: group.name,
message: a_string_including('Errors occurred')
).once
expect { service.execute }.to raise_error(Gitlab::ImportExport::Error)
end
end
context 'when there are errors with the sub-relations' do
let(:import_file) { fixture_file_upload('spec/fixtures/group_export_invalid_subrelations.tar.gz') }
before do
group.add_owner(user)
end
it 'successfully imports the group' do
expect(service.execute).to be_truthy
end
it 'logs the import success' do
allow(Gitlab::Import::Logger).to receive(:build).and_return(import_logger)
expect(import_logger).to receive(:info).with(
group_id: group.id,
group_name: group.name,
message: 'Group Import/Export: Import succeeded'
)
service.execute
expect_snowplow_event(
category: 'Groups::ImportExport::ImportService',
action: 'create',
label: 'import_access_level',
user: user,
extra: { user_role: 'Owner', import_type: 'import_group_from_file' }
)
end
end
end
context 'with group_import_ndjson feature flag enabled' do
context 'when importing a json export' do
let(:user) { create(:user) }
let(:group) { create(:group) }
let(:import_file) { fixture_file_upload('spec/fixtures/legacy_group_export.tar.gz') }
let(:import_logger) { instance_double(Gitlab::Import::Logger) }
subject(:service) { described_class.new(group: group, user: user) }
before do
stub_feature_flags(group_import_ndjson: true)
group.add_owner(user)
ImportExportUpload.create!(group: group, import_file: import_file)
allow(Gitlab::Import::Logger).to receive(:build).and_return(import_logger)
allow(import_logger).to receive(:error)
allow(import_logger).to receive(:warn)
allow(import_logger).to receive(:info)
end
context 'when importing a ndjson export' do
let(:user) { create(:user) }
let(:group) { create(:group) }
let(:import_file) { fixture_file_upload('spec/fixtures/group_export.tar.gz') }
it 'logs the error and raises an exception' do
expect(import_logger).to receive(:error).with(
group_id: group.id,
group_name: group.name,
message: a_string_including('Errors occurred')
).once
let(:import_logger) { instance_double(Gitlab::Import::Logger) }
subject(:service) { described_class.new(group: group, user: user) }
before do
ImportExportUpload.create!(group: group, import_file: import_file)
allow(Gitlab::Import::Logger).to receive(:build).and_return(import_logger)
allow(import_logger).to receive(:error)
allow(import_logger).to receive(:info)
allow(import_logger).to receive(:warn)
allow(FileUtils).to receive(:rm_rf).and_call_original
end
context 'when user has correct permissions' do
before do
group.add_owner(user)
end
it 'imports group structure successfully' do
expect(service.execute).to be_truthy
end
it 'tracks the event' do
service.execute
expect_snowplow_event(
category: 'Groups::ImportExport::ImportService',
action: 'create',
label: 'import_group_from_file'
)
expect_snowplow_event(
category: 'Groups::ImportExport::ImportService',
action: 'create',
label: 'import_access_level',
user: user,
extra: { user_role: 'Owner', import_type: 'import_group_from_file' }
)
end
it 'removes import file' do
service.execute
expect(group.import_export_upload.import_file.file).to be_nil
end
it 'removes tmp files' do
shared = Gitlab::ImportExport::Shared.new(group)
allow(Gitlab::ImportExport::Shared).to receive(:new).and_return(shared)
service.execute
expect(FileUtils).to have_received(:rm_rf).with(shared.base_path)
expect(Dir.exist?(shared.base_path)).to eq(false)
end
it 'logs the import success' do
expect(import_logger).to receive(:info).with(
group_id: group.id,
group_name: group.name,
message: 'Group Import/Export: Import succeeded'
).once
service.execute
end
end
context 'when user does not have correct permissions' do
it 'logs the error and raises an exception' do
expect(import_logger).to receive(:error).with(
group_id: group.id,
group_name: group.name,
message: a_string_including('Errors occurred')
)
expect { service.execute }.to raise_error(Gitlab::ImportExport::Error)
end
it 'tracks the error' do
shared = Gitlab::ImportExport::Shared.new(group)
allow(Gitlab::ImportExport::Shared).to receive(:new).and_return(shared)
expect(shared).to receive(:error) do |param|
expect(param.message).to include 'does not have required permissions for'
end
expect { service.execute }.to raise_error(Gitlab::ImportExport::Error)
end
end
context 'when there are errors with the import file' do
let(:import_file) { fixture_file_upload('spec/fixtures/symlink_export.tar.gz') }
it 'logs the error and raises an exception' do
expect(import_logger).to receive(:error).with(
group_id: group.id,
group_name: group.name,
message: a_string_including('Errors occurred')
).once
expect { service.execute }.to raise_error(Gitlab::ImportExport::Error)
end
end
context 'when there are errors with the sub-relations' do
let(:import_file) { fixture_file_upload('spec/fixtures/group_export_invalid_subrelations.tar.gz') }
before do
group.add_owner(user)
end
it 'successfully imports the group' do
expect(service.execute).to be_truthy
end
it 'logs the import success' do
allow(Gitlab::Import::Logger).to receive(:build).and_return(import_logger)
expect(import_logger).to receive(:info).with(
group_id: group.id,
group_name: group.name,
message: 'Group Import/Export: Import succeeded'
)
service.execute
expect_snowplow_event(
category: 'Groups::ImportExport::ImportService',
action: 'create',
label: 'import_access_level',
user: user,
extra: { user_role: 'Owner', import_type: 'import_group_from_file' }
)
end
end
expect { service.execute }.to raise_error(Gitlab::ImportExport::Error)
end
context 'when importing a json export' do
let(:user) { create(:user) }
let(:group) { create(:group) }
let(:import_file) { fixture_file_upload('spec/fixtures/legacy_group_export.tar.gz') }
it 'tracks the error' do
shared = Gitlab::ImportExport::Shared.new(group)
allow(Gitlab::ImportExport::Shared).to receive(:new).and_return(shared)
let(:import_logger) { instance_double(Gitlab::Import::Logger) }
subject(:service) { described_class.new(group: group, user: user) }
before do
ImportExportUpload.create!(group: group, import_file: import_file)
allow(Gitlab::Import::Logger).to receive(:build).and_return(import_logger)
allow(import_logger).to receive(:error)
allow(import_logger).to receive(:warn)
allow(import_logger).to receive(:info)
allow(FileUtils).to receive(:rm_rf).and_call_original
expect(shared).to receive(:error) do |param|
expect(param.message).to include 'The import file is incompatible'
end
context 'when user has correct permissions' do
before do
group.add_owner(user)
end
it 'imports group structure successfully' do
expect(service.execute).to be_truthy
end
it 'tracks the event' do
service.execute
expect_snowplow_event(
category: 'Groups::ImportExport::ImportService',
action: 'create',
label: 'import_group_from_file'
)
expect_snowplow_event(
category: 'Groups::ImportExport::ImportService',
action: 'create',
label: 'import_access_level',
user: user,
extra: { user_role: 'Owner', import_type: 'import_group_from_file' }
)
end
it 'removes import file' do
service.execute
expect(group.import_export_upload.import_file.file).to be_nil
end
it 'removes tmp files' do
shared = Gitlab::ImportExport::Shared.new(group)
allow(Gitlab::ImportExport::Shared).to receive(:new).and_return(shared)
service.execute
expect(FileUtils).to have_received(:rm_rf).with(shared.base_path)
expect(Dir.exist?(shared.base_path)).to eq(false)
end
it 'logs the import success' do
expect(import_logger).to receive(:info).with(
group_id: group.id,
group_name: group.name,
message: 'Group Import/Export: Import succeeded'
).once
service.execute
end
end
context 'when user does not have correct permissions' do
it 'logs the error and raises an exception' do
expect(import_logger).to receive(:error).with(
group_id: group.id,
group_name: group.name,
message: a_string_including('Errors occurred')
)
expect { service.execute }.to raise_error(Gitlab::ImportExport::Error)
end
it 'tracks the error' do
shared = Gitlab::ImportExport::Shared.new(group)
allow(Gitlab::ImportExport::Shared).to receive(:new).and_return(shared)
expect(shared).to receive(:error) do |param|
expect(param.message).to include 'does not have required permissions for'
end
expect { service.execute }.to raise_error(Gitlab::ImportExport::Error)
end
end
context 'when there are errors with the import file' do
let(:import_file) { fixture_file_upload('spec/fixtures/legacy_symlink_export.tar.gz') }
it 'logs the error and raises an exception' do
expect(import_logger).to receive(:error).with(
group_id: group.id,
group_name: group.name,
message: a_string_including('Errors occurred')
).once
expect { service.execute }.to raise_error(Gitlab::ImportExport::Error)
end
end
context 'when there are errors with the sub-relations' do
let(:import_file) { fixture_file_upload('spec/fixtures/legacy_group_export_invalid_subrelations.tar.gz') }
before do
group.add_owner(user)
end
it 'successfully imports the group' do
expect(service.execute).to be_truthy
end
it 'tracks the event' do
service.execute
expect_snowplow_event(
category: 'Groups::ImportExport::ImportService',
action: 'create',
label: 'import_group_from_file'
)
expect_snowplow_event(
category: 'Groups::ImportExport::ImportService',
action: 'create',
label: 'import_access_level',
user: user,
extra: { user_role: 'Owner', import_type: 'import_group_from_file' }
)
end
it 'logs the import success' do
allow(Gitlab::Import::Logger).to receive(:build).and_return(import_logger)
expect(import_logger).to receive(:info).with(
group_id: group.id,
group_name: group.name,
message: 'Group Import/Export: Import succeeded'
)
service.execute
end
end
expect { service.execute }.to raise_error(Gitlab::ImportExport::Error)
end
end
end

View File

@@ -77,6 +77,34 @@ RSpec.describe Users::UpdateService do
subject
end
context 'when race condition' do
# See https://gitlab.com/gitlab-org/gitlab/-/issues/382957
it 'updates email for stale user', :aggregate_failures do
unconfirmed_email = 'unconfirmed-email-user-has-access-to@example.com'
forgery_email = 'forgery@example.com'
user.update!(email: unconfirmed_email)
stale_user = User.find(user.id)
service1 = described_class.new(stale_user, { email: unconfirmed_email }.merge(user: stale_user))
service2 = described_class.new(user, { email: forgery_email }.merge(user: user))
service2.execute
reloaded_user = User.find(user.id)
expect(reloaded_user.unconfirmed_email).to eq(forgery_email)
expect(stale_user.confirmation_token).not_to eq(user.confirmation_token)
expect(reloaded_user.confirmation_token).to eq(user.confirmation_token)
service1.execute
reloaded_user = User.find(user.id)
expect(reloaded_user.unconfirmed_email).to eq(unconfirmed_email)
expect(stale_user.confirmation_token).not_to eq(user.confirmation_token)
expect(reloaded_user.confirmation_token).to eq(stale_user.confirmation_token)
end
end
context 'when check_password is true' do
def update_user(user, opts)
described_class.new(user, opts.merge(user: user)).execute(check_password: true)
@@ -139,9 +167,24 @@ RSpec.describe Users::UpdateService do
update_user(user, job_title: 'supreme leader of the universe')
end.not_to change { user.user_canonical_email }
end
it 'does not reset unconfirmed email' do
unconfirmed_email = 'unconfirmed-email@example.com'
user.update!(email: unconfirmed_email)
expect do
update_user(user, job_title: 'supreme leader of the universe')
end.not_to change { user.unconfirmed_email }
end
end
end
it 'does not try to reset unconfirmed email for a new user' do
expect do
update_user(build(:user), job_title: 'supreme leader of the universe')
end.not_to raise_error
end
def update_user(user, opts)
described_class.new(user, opts.merge(user: user)).execute
end

View File

@@ -129,7 +129,10 @@ RSpec.describe WebHookService, :request_store, :clean_gitlab_redis_shared_state
context 'there is userinfo' do
before do
project_hook.update!(url: 'http://{one}:{two}@example.com')
project_hook.update!(
url: 'http://{one}:{two}@example.com',
url_variables: { 'one' => 'a', 'two' => 'b' }
)
stub_full_request('http://example.com', method: :post)
end

View File

@@ -71,26 +71,3 @@ RSpec.shared_examples 'Self-managed Core resource access tokens' do
end
end
end
RSpec.shared_examples 'GitLab.com Core resource access tokens' do
before do
allow(::Gitlab).to receive(:com?).and_return(true)
stub_ee_application_setting(should_check_namespace_plan: true)
end
context 'with owner access' do
let(:current_user) { owner }
context 'create resource access tokens' do
it { is_expected.not_to be_allowed(:create_resource_access_tokens) }
end
context 'read resource access tokens' do
it { is_expected.not_to be_allowed(:read_resource_access_tokens) }
end
context 'destroy resource access tokens' do
it { is_expected.not_to be_allowed(:destroy_resource_access_tokens) }
end
end
end