Add latest changes from gitlab-org/gitlab@master

Commit: d04f2be14d
Parent: 676430584d
Author: GitLab Bot, 2021-06-23 18:07:10 +00:00
65 changed files with 699 additions and 577 deletions

File: (name not shown)

@@ -13,62 +13,28 @@
 <!-- Link related issues below. -->

-## Author's checklist (required)
+## Author's checklist

 - [ ] Follow the [Documentation Guidelines](https://docs.gitlab.com/ee/development/documentation/) and [Style Guide](https://docs.gitlab.com/ee/development/documentation/styleguide/).
-- [ ] Ensure that the [product tier badge](https://docs.gitlab.com/ee/development/documentation/styleguide/index.html#product-tier-badges) is added to doc's `h1`.
-- [ ] Apply the ~documentation label, plus:
-  - The corresponding DevOps stage and group labels, if applicable.
-  - ~"development guidelines" when changing docs under `doc/development/*`, `CONTRIBUTING.md`, or `README.md`.
-  - ~"development guidelines" and ~"Documentation guidelines" when changing docs under `development/documentation/*`.
-  - ~"development guidelines" and ~"Description templates (.gitlab/\*)" when creating/updating issue and MR description templates.
-- [ ] [Request a review](https://docs.gitlab.com/ee/development/code_review.html#dogfooding-the-reviewers-feature)
-  from the [designated Technical Writer](https://about.gitlab.com/handbook/engineering/ux/technical-writing/#assignments).
-
-/label ~documentation
-/assign me
-
-Do not add the ~"feature", ~"frontend", ~"backend", ~"bug", or ~"database" labels if you are only updating documentation. These labels will cause the MR to be added to code verification QA issues.
-
-When applicable:
-
-- [ ] Update the [permissions table](https://docs.gitlab.com/ee/user/permissions.html).
-- [ ] Link docs to and from the higher-level index page, plus other related docs where helpful.
-- [ ] Add the [product tier badge](https://docs.gitlab.com/ee/development/documentation/styleguide/index.html#product-tier-badges) accordingly.
-- [ ] Add [GitLab's version history note(s)](https://docs.gitlab.com/ee/development/documentation/styleguide/index.html#gitlab-versions).
-- [ ] Add/update the [feature flag section](https://docs.gitlab.com/ee/development/documentation/feature_flags.html).
+- If you have **Developer** permissions or higher:
+  - [ ] Ensure that the [product tier badge](https://docs.gitlab.com/ee/development/documentation/styleguide/index.html#product-tier-badges) is added to doc's `h1`.
+  - [ ] [Request a review](https://docs.gitlab.com/ee/development/code_review.html#dogfooding-the-reviewers-feature) based on the documentation page's metadata and [associated Technical Writer](https://about.gitlab.com/handbook/product/categories/#devops-stages).
+
+To avoid having this MR be added to code verification QA issues, don't add these labels: ~"feature", ~"frontend", ~"backend", ~"bug", or ~"database"

 ## Review checklist

-All reviewers can help ensure accuracy, clarity, completeness, and adherence to the [Documentation Guidelines](https://docs.gitlab.com/ee/development/documentation/) and [Style Guide](https://docs.gitlab.com/ee/development/documentation/styleguide/).
+Documentation-related MRs should be reviewed by a Technical Writer for a non-blocking review, based on [Documentation Guidelines](https://docs.gitlab.com/ee/development/documentation/) and the [Style Guide](https://docs.gitlab.com/ee/development/documentation/styleguide/).

-**1. Primary Reviewer**
-
-* [ ] Review by a code reviewer or other selected colleague to confirm accuracy, clarity, and completeness. This can be skipped for minor fixes without substantive content changes.
-
-**2. Technical Writer**
-
-- [ ] Technical writer review. If not requested for this MR, must be scheduled post-merge. To request for this MR, assign the writer listed for the applicable [DevOps stage](https://about.gitlab.com/handbook/product/categories/#devops-stages).
-  - [ ] Ensure docs metadata are present and up-to-date.
-  - [ ] Ensure ~"Technical Writing" and ~"documentation" are added.
-  - [ ] Add the corresponding `docs::` [scoped label](https://gitlab.com/groups/gitlab-org/-/labels?subscribed=&search=docs%3A%3A).
-  - [ ] If working on UI text, add the corresponding `UI Text` [scoped label](https://gitlab.com/groups/gitlab-org/-/labels?subscribed=&search=ui+text).
-  - [ ] Add ~"tw::doing" when starting work on the MR.
-  - [ ] Add ~"tw::finished" if Technical Writing team work on the MR is complete but it remains open.
+- [ ] If the content requires it, ensure the information is reviewed by a subject matter expert.
+- Technical writer review items:
+  - [ ] Ensure docs metadata is present and up-to-date.
+  - [ ] Ensure the appropriate [labels](https://about.gitlab.com/handbook/engineering/ux/technical-writing/workflow/#labels) are added to this MR.
+  - If relevant to this MR, ensure [content topic type](https://docs.gitlab.com/ee/development/documentation/structure.html) principles are in use, including:
+    - [ ] The headings should be something you'd do a Google search for. Instead of `Default behavior`, say something like `Default behavior when you close an issue`.
+    - [ ] The headings (other than the page title) should be active. Instead of `Configuring GDK`, say something like `Configure GDK`.
+    - [ ] Any task steps should be written as a numbered list.
+- [ ] Review by assigned maintainer, who can always request/require the above reviews. Maintainer's review can occur before or after a technical writer review.
+- [ ] Ensure a release milestone is set.
+
+/label ~documentation
+/assign me

 For more information about labels, see [Technical Writing workflows - Labels](https://about.gitlab.com/handbook/engineering/ux/technical-writing/workflow/#labels).

 For suggestions that you are confident don't need to be reviewed, change them locally
 and push a commit directly to save others from unneeded reviews. For example:

 - Clear typos, like `this is a typpo`.
 - Minor issues, like single quotes instead of double quotes, Oxford commas, and periods.

 For more information, see our documentation on [Merging a merge request](https://docs.gitlab.com/ee/development/code_review.html#merging-a-merge-request).

-**3. Maintainer**
-
-1. [ ] Review by assigned maintainer, who can always request/require the above reviews. Maintainer's review can occur before or after a technical writer review.
-1. [ ] Ensure a release milestone is set.
-1. [ ] If there has not been a technical writer review, [create an issue for one using the Doc Review template](https://gitlab.com/gitlab-org/gitlab/issues/new?issuable_template=Doc%20Review).

File: (name not shown)

@@ -1,5 +1,11 @@
-query getBlobContent($projectPath: ID!, $path: String, $ref: String!) {
-  blobContent(projectPath: $projectPath, path: $path, ref: $ref) @client {
-    rawData
+query getBlobContent($projectPath: ID!, $path: String!, $ref: String) {
+  project(fullPath: $projectPath) {
+    repository {
+      blobs(paths: [$path], ref: $ref) {
+        nodes {
+          rawBlob
+        }
+      }
+    }
   }
 }

File: (name not shown)

@@ -1,20 +1,9 @@
 import produce from 'immer';
-import Api from '~/api';
 import axios from '~/lib/utils/axios_utils';

 import getCurrentBranchQuery from './queries/client/current_branch.graphql';
 import getLastCommitBranchQuery from './queries/client/last_commit_branch.query.graphql';

 export const resolvers = {
-  Query: {
-    blobContent(_, { projectPath, path, ref }) {
-      return {
-        __typename: 'BlobContent',
-        rawData: Api.getRawFile(projectPath, path, { ref }).then(({ data }) => {
-          return data;
-        }),
-      };
-    },
-  },
   Mutation: {
     lintCI: (_, { endpoint, content, dry_run }) => {
       return axios.post(endpoint, { content, dry_run }).then(({ data }) => ({

File: (name not shown)

@@ -1,7 +1,6 @@
 <script>
 import { GlLoadingIcon } from '@gitlab/ui';
 import { fetchPolicies } from '~/lib/graphql';
-import httpStatusCodes from '~/lib/utils/http_status';
 import { s__ } from '~/locale';
 import { unwrapStagesWithNeeds } from '~/pipelines/components/unwrapping_utils';
@@ -76,22 +75,40 @@
       };
     },
     update(data) {
-      return data?.blobContent?.rawData;
+      return data?.project?.repository?.blobs?.nodes[0]?.rawBlob;
     },
     result({ data }) {
-      const fileContent = data?.blobContent?.rawData ?? '';
-      this.lastCommittedContent = fileContent;
-      this.currentCiFileContent = fileContent;
-      // make sure to reset the start screen flag during a refetch
-      // e.g. when switching branches
-      if (fileContent.length) {
-        this.showStartScreen = false;
-      }
+      const nodes = data?.project?.repository?.blobs?.nodes;
+      if (!nodes) {
+        this.reportFailure(LOAD_FAILURE_UNKNOWN);
+      } else {
+        const rawBlob = nodes[0]?.rawBlob;
+        const fileContent = rawBlob ?? '';
+        this.lastCommittedContent = fileContent;
+        this.currentCiFileContent = fileContent;
+        // If rawBlob is defined and returns a string, it means that there is
+        // a CI config file with empty content. If `rawBlob` is not defined
+        // at all, it means there was no file found.
+        const hasCIFile = rawBlob === '' || fileContent.length > 0;
+
+        if (!fileContent.length) {
+          this.setAppStatus(EDITOR_APP_STATUS_EMPTY);
+        }
+
+        if (!hasCIFile) {
+          this.showStartScreen = true;
+        } else if (fileContent.length) {
+          // If the file content is > 0, then we make sure to reset the
+          // start screen flag during a refetch
+          // e.g. when switching branches
+          this.showStartScreen = false;
+        }
+      }
     },
-    error(error) {
-      this.handleBlobContentError(error);
+    error() {
+      this.reportFailure(LOAD_FAILURE_UNKNOWN);
     },
     watchLoading(isLoading) {
       if (isLoading) {
@@ -187,22 +204,6 @@
     },
   },
   methods: {
-    handleBlobContentError(error = {}) {
-      const { networkError } = error;
-      const { response } = networkError;
-      // 404 for missing CI file
-      // 400 for blank projects with no repository
-      if (
-        response?.status === httpStatusCodes.NOT_FOUND ||
-        response?.status === httpStatusCodes.BAD_REQUEST
-      ) {
-        this.setAppStatus(EDITOR_APP_STATUS_EMPTY);
-        this.showStartScreen = true;
-      } else {
-        this.reportFailure(LOAD_FAILURE_UNKNOWN);
-      }
-    },
     hideFailure() {
       this.showFailure = false;
     },
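The new `result()` handler distinguishes an empty CI file (the query returns an empty string) from a missing one (the query returns no `rawBlob` at all), replacing the old 404/400 status-code sniffing. A standalone sketch of that check (not GitLab source):

```javascript
// An empty string means a CI config file exists but is blank;
// undefined means no file was found for the given path and ref.
const hasCiFile = (rawBlob) => rawBlob === '' || (rawBlob ?? '').length > 0;

console.log(hasCiFile('stages: [test]')); // true — file with content
console.log(hasCiFile('')); // true — empty file still exists
console.log(hasCiFile(undefined)); // false — no file found
```

Only the `false` case sends the user to the start screen; the empty-string case keeps the editor open with empty content.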

File: (name not shown)

@@ -101,9 +101,6 @@
     showJobLinks() {
       return !this.isStageView && this.showLinks;
     },
-    shouldShowStageName() {
-      return !this.isStageView;
-    },
     // The show downstream check prevents showing redundant linked columns
     showDownstreamPipelines() {
       return (
@@ -202,7 +199,7 @@
           :groups="column.groups"
           :action="column.status.action"
           :highlighted-jobs="highlightedJobs"
-          :show-stage-name="shouldShowStageName"
+          :is-stage-view="isStageView"
           :job-hovered="hoveredJobName"
           :source-job-hovered="hoveredSourceJobName"
           :pipeline-expanded="pipelineExpanded"

File: (name not shown)

@@ -40,6 +40,11 @@
       required: false,
       default: () => [],
     },
+    isStageView: {
+      type: Boolean,
+      required: false,
+      default: false,
+    },
     jobHovered: {
       type: String,
       required: false,
@@ -50,11 +55,6 @@
       required: false,
       default: () => ({}),
     },
-    showStageName: {
-      type: Boolean,
-      required: false,
-      default: false,
-    },
     sourceJobHovered: {
       type: String,
       required: false,
@@ -73,6 +73,12 @@
     'gl-pl-3',
   ],
   computed: {
+    canUpdatePipeline() {
+      return this.userPermissions.updatePipeline;
+    },
+    columnSpacingClass() {
+      return this.isStageView ? 'gl-px-6' : 'gl-px-9';
+    },
     /*
       currentGroups and filteredGroups are part of
       a test to hunt down a bug
@@ -94,8 +100,8 @@
     hasAction() {
       return !isEmpty(this.action);
     },
-    canUpdatePipeline() {
-      return this.userPermissions.updatePipeline;
+    showStageName() {
+      return !this.isStageView;
     },
   },
   errorCaptured(err, _vm, info) {
@@ -130,7 +136,7 @@
   };
 </script>
 <template>
-  <main-graph-wrapper class="gl-px-6" data-testid="stage-column">
+  <main-graph-wrapper :class="columnSpacingClass" data-testid="stage-column">
     <template #stages>
       <div
         data-testid="stage-column-title"

File: (name not shown)

@@ -75,11 +75,11 @@ export const generateLinksData = ({ links }, containerID, modifier = '') => {
     // until we can safely draw the bezier to look nice.
     // The adjustment number here is a magic number to make things
     // look nice and should change if the padding changes. This goes well
-    // with gl-px-6. gl-px-8 is more like 100.
-    const straightLineDestinationX = targetNodeX - 60;
+    // with gl-px-9 which we translate with 100px here.
+    const straightLineDestinationX = targetNodeX - 100;
     const controlPointX = straightLineDestinationX + (targetNodeX - straightLineDestinationX) / 2;

-    if (straightLineDestinationX > 0) {
+    if (straightLineDestinationX > firstPointCoordinateX) {
       path.lineTo(straightLineDestinationX, sourceNodeY);
     }

File: (name not shown)

@@ -14,7 +14,7 @@
       type: Number,
       required: true,
     },
-    isHighlighted: {
+    isHovered: {
       type: Boolean,
       required: false,
       default: false,
@@ -42,7 +42,7 @@
     jobPillClasses() {
       return [
         { 'gl-opacity-3': this.isFadedOut },
-        this.isHighlighted ? 'gl-shadow-blue-200-x0-y0-b4-s2' : 'gl-inset-border-2-green-400',
+        { 'gl-bg-gray-50 gl-inset-border-1-gray-200': this.isHovered },
       ];
     },
   },
@@ -57,10 +57,11 @@
   };
 </script>
 <template>
+  <div class="gl-w-full">
     <tooltip-on-truncate :title="jobName" truncate-target="child" placement="top">
       <div
         :id="id"
-        class="gl-w-15 gl-bg-white gl-text-center gl-text-truncate gl-rounded-pill gl-mb-3 gl-px-5 gl-py-2 gl-relative gl-z-index-1 gl-transition-duration-slow gl-transition-timing-function-ease"
+        class="gl-bg-white gl-inset-border-1-gray-100 gl-text-center gl-text-truncate gl-rounded-6 gl-mb-3 gl-px-5 gl-py-3 gl-relative gl-z-index-1 gl-transition-duration-slow gl-transition-timing-function-ease"
         :class="jobPillClasses"
         @mouseover="onMouseEnter"
         @mouseleave="onMouseLeave"
@@ -68,4 +69,5 @@
         {{ jobName }}
       </div>
     </tooltip-on-truncate>
+  </div>
 </template>

File: (name not shown)

@@ -4,14 +4,14 @@ import { __ } from '~/locale';
 import { DRAW_FAILURE, DEFAULT } from '../../constants';
 import LinksLayer from '../graph_shared/links_layer.vue';
 import JobPill from './job_pill.vue';
-import StagePill from './stage_pill.vue';
+import StageName from './stage_name.vue';

 export default {
   components: {
     GlAlert,
     JobPill,
     LinksLayer,
-    StagePill,
+    StageName,
   },
   CONTAINER_REF: 'PIPELINE_GRAPH_CONTAINER_REF',
   BASE_CONTAINER_ID: 'pipeline-graph-container',
@@ -21,6 +21,11 @@
     [DRAW_FAILURE]: __('Could not draw the lines for job relationships'),
     [DEFAULT]: __('An unknown error occurred.'),
   },
+  // The combination of gl-w-full gl-min-w-full and gl-max-w-15 is necessary.
+  // The max width and the width make sure the ellipsis to work and the min width
+  // is for when there is less text than the stage column width (which the width 100% does not fix)
+  jobWrapperClasses:
+    'gl-display-flex gl-flex-direction-column gl-align-items-center gl-w-full gl-px-8 gl-min-w-full gl-max-w-15',
   props: {
     pipelineData: {
       required: true,
@@ -85,23 +90,8 @@
         height: this.$refs[this.$options.CONTAINER_REF].scrollHeight,
       };
     },
-    getStageBackgroundClasses(index) {
-      const { length } = this.pipelineStages;
-      // It's possible for a graph to have only one stage, in which
-      // case we concatenate both the left and right rounding classes
-      if (length === 1) {
-        return 'gl-rounded-bottom-left-6 gl-rounded-top-left-6 gl-rounded-bottom-right-6 gl-rounded-top-right-6';
-      }
-      if (index === 0) {
-        return 'gl-rounded-bottom-left-6 gl-rounded-top-left-6';
-      }
-      if (index === length - 1) {
-        return 'gl-rounded-bottom-right-6 gl-rounded-top-right-6';
-      }
-      return '';
+    isFadedOut(jobName) {
+      return this.highlightedJobs.length > 1 && !this.isJobHighlighted(jobName);
     },
     isJobHighlighted(jobName) {
       return this.highlightedJobs.includes(jobName);
@@ -137,7 +127,12 @@
     >
       {{ failure.text }}
     </gl-alert>
-    <div :id="containerId" :ref="$options.CONTAINER_REF" data-testid="graph-container">
+    <div
+      :id="containerId"
+      :ref="$options.CONTAINER_REF"
+      class="gl-bg-gray-10 gl-overflow-auto"
+      data-testid="graph-container"
+    >
       <links-layer
         :pipeline-data="pipelineStages"
         :pipeline-id="$options.PIPELINE_ID"
@@ -152,23 +147,17 @@
           :key="`${stage.name}-${index}`"
           class="gl-flex-direction-column"
         >
-          <div
-            class="gl-display-flex gl-align-items-center gl-bg-white gl-w-full gl-px-8 gl-py-4 gl-mb-5"
-            :class="getStageBackgroundClasses(index)"
-            data-testid="stage-background"
-          >
-            <stage-pill :stage-name="stage.name" :is-empty="stage.groups.length === 0" />
+          <div class="gl-display-flex gl-align-items-center gl-w-full gl-px-9 gl-py-4 gl-mb-5">
+            <stage-name :stage-name="stage.name" />
           </div>
-          <div
-            class="gl-display-flex gl-flex-direction-column gl-align-items-center gl-w-full gl-px-8"
-          >
+          <div :class="$options.jobWrapperClasses">
             <job-pill
               v-for="group in stage.groups"
               :key="group.name"
               :job-name="group.name"
               :pipeline-id="$options.PIPELINE_ID"
-              :is-highlighted="hasHighlightedJob && isJobHighlighted(group.name)"
-              :is-faded-out="hasHighlightedJob && !isJobHighlighted(group.name)"
+              :is-hovered="highlightedJob === group.name"
+              :is-faded-out="isFadedOut(group.name)"
               @on-mouse-enter="setHoveredJob"
               @on-mouse-leave="removeHoveredJob"
             />

File: (name not shown)

@@ -1,4 +1,5 @@
 <script>
+import { capitalize, escape } from 'lodash';
 import tooltipOnTruncate from '~/vue_shared/components/tooltip_on_truncate.vue';

 export default {
@@ -10,26 +11,18 @@
       type: String,
       required: true,
     },
-    isEmpty: {
-      type: Boolean,
-      required: false,
-      default: false,
-    },
   },
   computed: {
-    emptyClass() {
-      return this.isEmpty ? 'gl-bg-gray-200' : 'gl-bg-gray-600';
+    formattedTitle() {
+      return capitalize(escape(this.stageName));
     },
   },
 };
 </script>
 <template>
   <tooltip-on-truncate :title="stageName" truncate-target="child" placement="top">
-    <div
-      class="gl-px-5 gl-py-2 gl-text-white gl-text-center gl-text-truncate gl-rounded-pill gl-w-20"
-      :class="emptyClass"
-    >
-      {{ stageName }}
+    <div class="gl-py-2 gl-text-truncate gl-font-weight-bold gl-w-20">
+      {{ formattedTitle }}
     </div>
   </tooltip-on-truncate>
 </template>
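The new `formattedTitle` computed property escapes the stage name and then capitalizes it via lodash. A rough standalone approximation without lodash (the helper implementations here are simplified stand-ins, not the lodash versions — lodash's `escape` also handles quotes, for instance):

```javascript
// Escape the common HTML entities, then capitalize the way lodash's
// `capitalize` does: first character upper-cased, the rest lower-cased.
const escapeHtml = (s) => s.replace(/&/g, '&amp;').replace(/</g, '&lt;').replace(/>/g, '&gt;');
const capitalize = (s) => s.charAt(0).toUpperCase() + s.slice(1).toLowerCase();
const formattedTitle = (stageName) => capitalize(escapeHtml(stageName));

console.log(formattedTitle('build')); // 'Build'
console.log(formattedTitle('deploy & test')); // 'Deploy &amp; test'
```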

File: (name not shown)

@@ -7,7 +7,7 @@ class Projects::MattermostsController < Projects::ApplicationController
   layout 'project_settings'

   before_action :authorize_admin_project!
-  before_action :service
+  before_action :integration
   before_action :teams, only: [:new]

   feature_category :integrations
@@ -16,11 +16,11 @@
   end

   def create
-    result, message = @service.configure(current_user, configure_params)
+    result, message = integration.configure(current_user, configure_params)

     if result
       flash[:notice] = 'This service is now configured'
-      redirect_to edit_project_service_path(@project, service)
+      redirect_to edit_project_service_path(@project, integration)
     else
       flash[:alert] = message || 'Failed to configure service'
       redirect_to new_project_mattermost_path(@project)
@@ -31,15 +31,16 @@
   def configure_params
     params.require(:mattermost).permit(:trigger, :team_id).merge(
-      url: service_trigger_url(@service),
+      url: service_trigger_url(integration),
       icon_url: asset_url('slash-command-logo.png', skip_pipeline: true))
   end

   def teams
-    @teams, @teams_error_message = @service.list_teams(current_user)
+    @teams, @teams_error_message = integration.list_teams(current_user)
   end

-  def service
-    @service ||= @project.find_or_initialize_service('mattermost_slash_commands')
+  def integration
+    @integration ||= @project.find_or_initialize_integration('mattermost_slash_commands')
+    @service = @integration # TODO: remove when https://gitlab.com/gitlab-org/gitlab/-/issues/330300 is complete
   end
 end

File: (name not shown)

@@ -1,20 +1,21 @@
 # frozen_string_literal: true

 class Projects::ServiceHookLogsController < Projects::HookLogsController
-  before_action :service, only: [:show, :retry]
+  before_action :integration, only: [:show, :retry]

   def retry
     execute_hook
-    redirect_to edit_project_service_path(@project, @service)
+    redirect_to edit_project_service_path(@project, @integration)
   end

   private

   def hook
-    @hook ||= service.service_hook
+    @hook ||= integration.service_hook
   end

-  def service
-    @service ||= @project.find_or_initialize_service(params[:service_id])
+  def integration
+    @integration ||= @project.find_or_initialize_integration(params[:service_id])
+    @service = @integration # TODO: remove when https://gitlab.com/gitlab-org/gitlab/-/issues/330300 is complete
   end
 end

File: (name not shown)

@@ -84,7 +84,7 @@ class Projects::ServicesController < Projects::ApplicationController
   end

   def integration
-    @integration ||= @project.find_or_initialize_service(params[:id])
+    @integration ||= @project.find_or_initialize_integration(params[:id])
   end
   alias_method :service, :integration

File: (name not shown)

@@ -9,7 +9,7 @@ module Projects
     feature_category :integrations

     def show
-      @integrations = @project.find_or_initialize_services
+      @integrations = @project.find_or_initialize_integrations
     end
   end
 end

File: (name not shown)

@@ -5,7 +5,7 @@ module Types
   class ServiceTypeEnum < BaseEnum
     graphql_name 'ServiceType'

-    ::Integration.available_services_types(include_dev: false).each do |type|
+    ::Integration.available_integration_types(include_dev: false).each do |type|
       value type.underscore.upcase, value: type, description: "#{type} type"
     end
   end

File: (name not shown)

@@ -5,7 +5,7 @@ module OperationsHelper
   def prometheus_integration
     strong_memoize(:prometheus_integration) do
-      @project.find_or_initialize_service(::Integrations::Prometheus.to_param)
+      @project.find_or_initialize_integration(::Integrations::Prometheus.to_param)
     end
   end

File: (name not shown)

@@ -108,9 +108,9 @@ class Integration < ApplicationRecord
   scope :by_active_flag, -> (flag) { where(active: flag) }
   scope :inherit_from_id, -> (id) { where(inherit_from_id: id) }
   scope :inherit, -> { where.not(inherit_from_id: nil) }
-  scope :for_group, -> (group) { where(group_id: group, type: available_services_types(include_project_specific: false)) }
-  scope :for_template, -> { where(template: true, type: available_services_types(include_project_specific: false)) }
-  scope :for_instance, -> { where(instance: true, type: available_services_types(include_project_specific: false)) }
+  scope :for_group, -> (group) { where(group_id: group, type: available_integration_types(include_project_specific: false)) }
+  scope :for_template, -> { where(template: true, type: available_integration_types(include_project_specific: false)) }
+  scope :for_instance, -> { where(instance: true, type: available_integration_types(include_project_specific: false)) }
   scope :push_hooks, -> { where(push_events: true, active: true) }
   scope :tag_push_hooks, -> { where(tag_push_events: true, active: true) }
@@ -217,7 +217,7 @@
   private_class_method :create_nonexistent_templates

   def self.find_or_initialize_non_project_specific_integration(name, instance: false, group_id: nil)
-    return unless name.in?(available_services_names(include_project_specific: false))
+    return unless name.in?(available_integration_names(include_project_specific: false))

     integration_name_to_model(name).find_or_initialize_by(instance: instance, group_id: group_id)
   end
@@ -238,19 +238,19 @@
   def self.nonexistent_services_types_for(scope)
     # Using #map instead of #pluck to save one query count. This is because
     # ActiveRecord loaded the object here, so we don't need to query again later.
-    available_services_types(include_project_specific: false) - scope.map(&:type)
+    available_integration_types(include_project_specific: false) - scope.map(&:type)
   end
   private_class_method :nonexistent_services_types_for

-  # Returns a list of available service names.
+  # Returns a list of available integration names.
   # Example: ["asana", ...]
   # @deprecated
-  def self.available_services_names(include_project_specific: true, include_dev: true)
-    service_names = services_names
-    service_names += project_specific_services_names if include_project_specific
-    service_names += dev_services_names if include_dev
-    service_names.sort_by(&:downcase)
+  def self.available_integration_names(include_project_specific: true, include_dev: true)
+    names = integration_names
+    names += project_specific_integration_names if include_project_specific
+    names += dev_integration_names if include_dev
+    names.sort_by(&:downcase)
   end

   def self.integration_names
@@ -261,21 +261,21 @@
     integration_names
   end

-  def self.dev_services_names
+  def self.dev_integration_names
     return [] unless Rails.env.development?

     DEV_INTEGRATION_NAMES
   end

-  def self.project_specific_services_names
+  def self.project_specific_integration_names
     PROJECT_SPECIFIC_INTEGRATION_NAMES
   end

-  # Returns a list of available service types.
+  # Returns a list of available integration types.
   # Example: ["AsanaService", ...]
-  def self.available_services_types(include_project_specific: true, include_dev: true)
-    available_services_names(include_project_specific: include_project_specific, include_dev: include_dev).map do |service_name|
-      integration_name_to_type(service_name)
+  def self.available_integration_types(include_project_specific: true, include_dev: true)
+    available_integration_names(include_project_specific: include_project_specific, include_dev: include_dev).map do
+      integration_name_to_type(_1)
     end
   end

File: (name not shown)

@@ -120,8 +120,6 @@ module Integrations
     end

     def execute(data)
-      return if project.disabled_services.include?(to_param)
-
       object_kind = data[:object_kind]
       object_kind = 'job' if object_kind == 'build'
       return unless supported_events.include?(object_kind)

View File

@ -550,7 +550,7 @@ class Project < ApplicationRecord
scope :with_namespace, -> { includes(:namespace) } scope :with_namespace, -> { includes(:namespace) }
scope :with_import_state, -> { includes(:import_state) } scope :with_import_state, -> { includes(:import_state) }
scope :include_project_feature, -> { includes(:project_feature) } scope :include_project_feature, -> { includes(:project_feature) }
scope :with_service, ->(service) { joins(service).eager_load(service) } scope :with_integration, ->(integration) { joins(integration).eager_load(integration) }
scope :with_shared_runners, -> { where(shared_runners_enabled: true) } scope :with_shared_runners, -> { where(shared_runners_enabled: true) }
scope :with_container_registry, -> { where(container_registry_enabled: true) } scope :with_container_registry, -> { where(container_registry_enabled: true) }
scope :inside_path, ->(path) do scope :inside_path, ->(path) do
@@ -1398,22 +1398,22 @@ class Project < ApplicationRecord
     @external_wiki ||= integrations.external_wikis.first
   end

-  def find_or_initialize_services
-    available_services_names = Integration.available_services_names - disabled_services
-
-    available_services_names.map do |service_name|
-      find_or_initialize_service(service_name)
-    end.sort_by(&:title)
+  def find_or_initialize_integrations
+    Integration
+      .available_integration_names
+      .difference(disabled_integrations)
+      .map { find_or_initialize_integration(_1) }
+      .sort_by(&:title)
   end

-  def disabled_services
+  def disabled_integrations
     []
   end

-  def find_or_initialize_service(name)
-    return if disabled_services.include?(name)
-
-    find_service(integrations, name) || build_from_instance_or_template(name) || build_service(name)
+  def find_or_initialize_integration(name)
+    return if disabled_integrations.include?(name)
+
+    find_integration(integrations, name) || build_from_instance_or_template(name) || build_integration(name)
   end

   # rubocop: disable CodeReuse/ServiceClass
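The rewritten `find_or_initialize_integrations` above swaps `-` for `Array#difference`, which reads more naturally in a method chain. A minimal sketch of the same pipeline shape, with hypothetical integration names:

```ruby
# Array#difference returns a new array, like `-`, but chains cleanly
# and can subtract several arrays in a single call.
available = %w[prometheus jira slack datadog]
disabled  = %w[slack]

titles = available
  .difference(disabled)
  .map { _1.capitalize }
  .sort
```

Both `difference` and `-` leave the receiver untouched, so `available` keeps all four names after the chain runs.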
# rubocop: disable CodeReuse/ServiceClass # rubocop: disable CodeReuse/ServiceClass
@@ -2659,19 +2659,19 @@ class Project < ApplicationRecord
     project_feature.update!(container_registry_access_level: access_level)
   end

-  def find_service(services, name)
-    services.find { |service| service.to_param == name }
+  def find_integration(integrations, name)
+    integrations.find { _1.to_param == name }
   end

   def build_from_instance_or_template(name)
-    instance = find_service(services_instances, name)
+    instance = find_integration(services_instances, name)
     return Integration.build_from_integration(instance, project_id: id) if instance

-    template = find_service(services_templates, name)
+    template = find_integration(services_templates, name)
     return Integration.build_from_integration(template, project_id: id) if template
   end

-  def build_service(name)
+  def build_integration(name)
     Integration.integration_name_to_model(name).new(project_id: id)
   end

@@ -193,14 +193,14 @@ module Projects
   # Deprecated: https://gitlab.com/gitlab-org/gitlab/-/issues/326665
   def create_prometheus_integration
-    service = @project.find_or_initialize_service(::Integrations::Prometheus.to_param)
+    integration = @project.find_or_initialize_integration(::Integrations::Prometheus.to_param)

     # If the service has already been inserted in the database, that
     # means it came from a template, and there's nothing more to do.
-    return if service.persisted?
+    return if integration.persisted?

-    if service.prometheus_available?
-      service.save!
+    if integration.prometheus_available?
+      integration.save!
     else
       @project.prometheus_integration = nil
     end

@@ -102,10 +102,10 @@ module Projects
   def prometheus_integration_params
     return {} unless attrs = params[:prometheus_integration_attributes]

-    service = project.find_or_initialize_service(::Integrations::Prometheus.to_param)
-    service.assign_attributes(attrs)
+    integration = project.find_or_initialize_integration(::Integrations::Prometheus.to_param)
+    integration.assign_attributes(attrs)

-    { prometheus_integration_attributes: service.attributes.except(*%w(id project_id created_at updated_at)) }
+    { prometheus_integration_attributes: integration.attributes.except(*%w[id project_id created_at updated_at]) }
   end

   def incident_management_setting_params
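The hunk above also switches the word-array literal from `%w()` to `%w[]` and filters the attributes through `Hash#except`. A sketch of that behavior, assuming plain Ruby 3.0+ (`Hash#except` is also available on earlier Rails via ActiveSupport); the attribute names here are illustrative:

```ruby
# Hash#except returns a copy without the listed keys; the receiver
# is not mutated. %w[] and %w() build the same array of strings.
attrs = { 'id' => 1, 'project_id' => 2, 'api_url' => 'https://example.test' }

kept = attrs.except(*%w[id project_id created_at updated_at])
```

Keys that are absent from the hash (here `created_at` and `updated_at`) are simply ignored.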

@@ -67,7 +67,7 @@ module Projects
   end

   def valid_for_manual?(token)
-    prometheus = project.find_or_initialize_service('prometheus')
+    prometheus = project.find_or_initialize_integration('prometheus')
     return false unless prometheus.manual_configuration?

     if setting = project.alerting_setting

@@ -1309,6 +1309,15 @@
   :idempotent: true
   :tags:
   - :exclude_from_kubernetes
+- :name: package_repositories:packages_helm_extraction
+  :worker_name: Packages::Helm::ExtractionWorker
+  :feature_category: :package_registry
+  :has_external_dependencies:
+  :urgency: :low
+  :resource_boundary: :unknown
+  :weight: 1
+  :idempotent: true
+  :tags: []
 - :name: package_repositories:packages_maven_metadata_sync
   :worker_name: Packages::Maven::Metadata::SyncWorker
   :feature_category: :package_registry

@@ -15,7 +15,7 @@ module Clusters
     return unless cluster

     cluster.all_projects.find_each do |project|
-      project.find_or_initialize_service(service_name).update!(active: true)
+      project.find_or_initialize_integration(service_name).update!(active: true)
     end
   end
 end

@@ -15,7 +15,7 @@ module Clusters
     raise cluster_missing_error(integration_name) unless cluster

     integration = ::Project.integration_association_name(integration_name).to_sym
-    cluster.all_projects.with_service(integration).find_each do |project|
+    cluster.all_projects.with_integration(integration).find_each do |project|
       project.public_send(integration).update!(active: false) # rubocop:disable GitlabSecurity/PublicSend
     end
   end

@@ -0,0 +1,29 @@
+# frozen_string_literal: true
+
+module Packages
+  module Helm
+    class ExtractionWorker
+      include ApplicationWorker
+
+      queue_namespace :package_repositories
+      feature_category :package_registry
+
+      deduplicate :until_executing
+      idempotent!
+
+      def perform(channel, package_file_id)
+        package_file = ::Packages::PackageFile.find_by_id(package_file_id)
+
+        return unless package_file && !package_file.package.default?
+
+        ::Packages::Helm::ProcessFileService.new(channel, package_file).execute
+      rescue ::Packages::Helm::ExtractFileMetadataService::ExtractionError,
+             ::Packages::Helm::ProcessFileService::ExtractionError,
+             ::ActiveModel::ValidationError => e
+        Gitlab::ErrorTracking.log_exception(e, project_id: package_file.project_id)
+
+        package_file.package.update_column(:status, :error)
+      end
+    end
+  end
+end

@@ -21,15 +21,15 @@ module Projects
   private

   def create_prometheus_integration(project)
-    service = project.find_or_initialize_service(::Integrations::Prometheus.to_param)
+    integration = project.find_or_initialize_integration(::Integrations::Prometheus.to_param)

     # If the service has already been inserted in the database, that
     # means it came from a template, and there's nothing more to do.
-    return if service.persisted?
+    return if integration.persisted?

-    return unless service.prometheus_available?
+    return unless integration.prometheus_available?

-    service.save!
+    integration.save!
   rescue ActiveRecord::RecordInvalid => e
     Gitlab::ErrorTracking.track_exception(e, extra: { project_id: project.id })
   end

@@ -385,7 +385,15 @@ module Gitlab
     initializer :correct_precompile_targets, after: :set_default_precompile do |app|
       app.config.assets.precompile.reject! { |entry| entry == Sprockets::Railtie::LOOSE_APP_ASSETS }

-      asset_roots = [config.root.join("app/assets").to_s]
+      # if two files in assets are named the same, it'll likely resolve to the normal app/assets version.
+      # See https://gitlab.com/gitlab-jh/gitlab/-/merge_requests/27#note_609101582 for more details
+      asset_roots = []
+
+      if Gitlab.jh?
+        asset_roots << config.root.join("jh/app/assets").to_s
+      end
+
+      asset_roots << config.root.join("app/assets").to_s

       if Gitlab.ee?
         asset_roots << config.root.join("ee/app/assets").to_s
@@ -413,16 +421,18 @@ module Gitlab
         end
       end

-      # Add EE assets. They should take precedence over CE. This means if two files exist, e.g.:
+      # Add assets for variants of GitLab. They should take precedence over CE.
+      # This means if multiple files exist, e.g.:
       #
+      # jh/app/assets/stylesheets/example.scss
       # ee/app/assets/stylesheets/example.scss
       # app/assets/stylesheets/example.scss
       #
-      # The ee/ version will be preferred.
-      initializer :prefer_ee_assets, after: :append_assets_path do |app|
-        if Gitlab.ee?
+      # The jh/ version will be preferred.
+      initializer :prefer_specialized_assets, after: :append_assets_path do |app|
+        Gitlab.extensions.each do |extension|
           %w[images javascripts stylesheets].each do |path|
-            app.config.assets.paths.unshift("#{config.root}/ee/app/assets/#{path}")
+            app.config.assets.paths.unshift("#{config.root}/#{extension}/app/assets/#{path}")
           end
         end
       end
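The `prefer_specialized_assets` initializer above works by `unshift`-ing each extension's asset directories onto the front of the lookup path, so the most specialized root is consulted first. A sketch of that precedence, assuming the extensions are processed in the order `ee` then `jh` (the actual order comes from `Gitlab.extensions`):

```ruby
# Each unshift pushes a root to the front of the path list, so the
# last extension processed ends up with the highest precedence.
paths = ['app/assets']

%w[ee jh].each do |extension|
  paths.unshift("#{extension}/app/assets")
end
```

A path-style lookup that takes the first matching root would therefore prefer `jh/`, then `ee/`, then the plain `app/assets` version, mirroring the comment in the diff.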

@@ -5,4 +5,4 @@ rollout_issue_url: https://gitlab.com/gitlab-org/gitlab/-/issues/325528
 milestone: '13.12'
 type: development
 group: group::gitaly
-default_enabled: false
+default_enabled: true

@@ -5,4 +5,4 @@ rollout_issue_url: https://gitlab.com/gitlab-org/gitlab/-/issues/333517
 milestone: '14.0'
 type: development
 group: group::gitaly
-default_enabled: false
+default_enabled: true

@@ -6,10 +6,10 @@ info: To determine the technical writer assigned to the Stage/Group associated w

 # Incoming email **(FREE SELF)**

-GitLab has several features based on receiving incoming emails:
+GitLab has several features based on receiving incoming email messages:

 - [Reply by Email](reply_by_email.md): allow GitLab users to comment on issues
-  and merge requests by replying to notification emails.
+  and merge requests by replying to notification email.
 - [New issue by email](../user/project/issues/managing_issues.md#new-issue-via-email):
   allow GitLab users to create a new issue by sending an email to a
   user-specific email address.
@@ -22,9 +22,9 @@ GitLab has several features based on receiving incoming emails:
 ## Requirements

 We recommend using an email address that receives **only** messages that are intended for
-the GitLab instance. Any incoming emails not intended for GitLab receive a reject notice.
+the GitLab instance. Any incoming email messages not intended for GitLab receive a reject notice.

-Handling incoming emails requires an [IMAP](https://en.wikipedia.org/wiki/Internet_Message_Access_Protocol)-enabled
+Handling incoming email messages requires an [IMAP](https://en.wikipedia.org/wiki/Internet_Message_Access_Protocol)-enabled
 email account. GitLab requires one of the following three strategies:

 - Email sub-addressing (recommended)
@@ -53,7 +53,7 @@ leaving a catch-all available for other purposes beyond GitLab.
 ### Catch-all mailbox

 A [catch-all mailbox](https://en.wikipedia.org/wiki/Catch-all) for a domain
-receives all emails addressed to the domain that do not match any addresses that
+receives all email messages addressed to the domain that do not match any addresses that
 exist on the mail server.

 As of GitLab 11.7, catch-all mailboxes support the same features as
@@ -68,7 +68,7 @@ this method only supports replies, and not the other features of [incoming email
 ## Set it up

-If you want to use Gmail / Google Apps for incoming emails, make sure you have
+If you want to use Gmail / Google Apps for incoming email, make sure you have
 [IMAP access enabled](https://support.google.com/mail/answer/7126229)
 and [allowed less secure apps to access the account](https://support.google.com/accounts/answer/6010255)
 or [turn-on 2-step validation](https://support.google.com/accounts/answer/185839)

@@ -12540,6 +12540,7 @@ Represents summary of a security report.
 | <a id="securityreportsummarycoveragefuzzing"></a>`coverageFuzzing` | [`SecurityReportSummarySection`](#securityreportsummarysection) | Aggregated counts for the `coverage_fuzzing` scan. |
 | <a id="securityreportsummarydast"></a>`dast` | [`SecurityReportSummarySection`](#securityreportsummarysection) | Aggregated counts for the `dast` scan. |
 | <a id="securityreportsummarydependencyscanning"></a>`dependencyScanning` | [`SecurityReportSummarySection`](#securityreportsummarysection) | Aggregated counts for the `dependency_scanning` scan. |
+| <a id="securityreportsummaryrunningcontainerscanning"></a>`runningContainerScanning` | [`SecurityReportSummarySection`](#securityreportsummarysection) | Aggregated counts for the `running_container_scanning` scan. |
 | <a id="securityreportsummarysast"></a>`sast` | [`SecurityReportSummarySection`](#securityreportsummarysection) | Aggregated counts for the `sast` scan. |
 | <a id="securityreportsummarysecretdetection"></a>`secretDetection` | [`SecurityReportSummarySection`](#securityreportsummarysection) | Aggregated counts for the `secret_detection` scan. |
@@ -13393,7 +13394,7 @@ Represents a vulnerability.
 | <a id="vulnerabilitynotes"></a>`notes` | [`NoteConnection!`](#noteconnection) | All notes on this noteable. (see [Connections](#connections)) |
 | <a id="vulnerabilityprimaryidentifier"></a>`primaryIdentifier` | [`VulnerabilityIdentifier`](#vulnerabilityidentifier) | Primary identifier of the vulnerability. |
 | <a id="vulnerabilityproject"></a>`project` | [`Project`](#project) | The project on which the vulnerability was found. |
-| <a id="vulnerabilityreporttype"></a>`reportType` | [`VulnerabilityReportType`](#vulnerabilityreporttype) | Type of the security report that found the vulnerability (SAST, DEPENDENCY_SCANNING, CONTAINER_SCANNING, DAST, SECRET_DETECTION, COVERAGE_FUZZING, API_FUZZING). `Scan Type` in the UI. |
+| <a id="vulnerabilityreporttype"></a>`reportType` | [`VulnerabilityReportType`](#vulnerabilityreporttype) | Type of the security report that found the vulnerability (SAST, DEPENDENCY_SCANNING, CONTAINER_SCANNING, DAST, SECRET_DETECTION, COVERAGE_FUZZING, API_FUZZING, RUNNING_CONTAINER_SCANNING). `Scan Type` in the UI. |
 | <a id="vulnerabilityresolvedat"></a>`resolvedAt` | [`Time`](#time) | Timestamp of when the vulnerability state was changed to resolved. |
 | <a id="vulnerabilityresolvedby"></a>`resolvedBy` | [`UserCore`](#usercore) | The user that resolved the vulnerability. |
 | <a id="vulnerabilityresolvedondefaultbranch"></a>`resolvedOnDefaultBranch` | [`Boolean!`](#boolean) | Indicates whether the vulnerability is fixed on the default branch or not. |
@@ -15065,6 +15066,7 @@ The type of the security scan that found the vulnerability.
 | <a id="vulnerabilityreporttypecoverage_fuzzing"></a>`COVERAGE_FUZZING` | |
 | <a id="vulnerabilityreporttypedast"></a>`DAST` | |
 | <a id="vulnerabilityreporttypedependency_scanning"></a>`DEPENDENCY_SCANNING` | |
+| <a id="vulnerabilityreporttyperunning_container_scanning"></a>`RUNNING_CONTAINER_SCANNING` | |
 | <a id="vulnerabilityreporttypesast"></a>`SAST` | |
 | <a id="vulnerabilityreporttypesecret_detection"></a>`SECRET_DETECTION` | |

@@ -17330,6 +17330,18 @@ Status: `data_available`

 Tiers: `ultimate`

+### `usage_activity_by_stage.secure.running_container_scanning_scans`
+
+Counts running container scanning jobs
+
+[YAML definition](https://gitlab.com/gitlab-org/gitlab/-/blob/master/ee/config/metrics/counts_all/20210618124854_running_container_scanning_scans.yml)
+
+Group: `group::container security`
+
+Status: `data_available`
+
+Tiers: `ultimate`
+
 ### `usage_activity_by_stage.secure.sast_scans`

 Counts sast jobs
@@ -19430,6 +19442,30 @@ Status: `data_available`

 Tiers: `ultimate`

+### `usage_activity_by_stage_monthly.secure.running_container_scanning_pipeline`
+
+Pipelines containing a Running Container Scanning job
+
+[YAML definition](https://gitlab.com/gitlab-org/gitlab/-/blob/master/ee/config/metrics/counts_28d/20210618125224_running_container_scanning_pipeline.yml)
+
+Group: `group::container security`
+
+Status: `data_available`
+
+Tiers: `ultimate`
+
+### `usage_activity_by_stage_monthly.secure.running_container_scanning_scans`
+
+Counts running container scanning jobs
+
+[YAML definition](https://gitlab.com/gitlab-org/gitlab/-/blob/master/ee/config/metrics/counts_28d/20210618101233_running_container_scanning_scans.yml)
+
+Group: `group::container security`
+
+Status: `data_available`
+
+Tiers: `ultimate`
+
 ### `usage_activity_by_stage_monthly.secure.sast_pipeline`

 Counts of Pipelines that have at least 1 SAST job

@@ -706,51 +706,53 @@ dast:

 ### Available CI/CD variables

-DAST can be [configured](#customizing-the-dast-settings) using CI/CD variables.
+You can use CI/CD variables to customize DAST.

 | CI/CD variable | Type | Description |
-|:--------------------------------------------|:--------------|:-----------------------------------|
+|:------------------------------------------------|:--------------|:-------------------------------|
 | `SECURE_ANALYZERS_PREFIX` | URL | Set the Docker registry base address from which to download the analyzer. |
-| `DAST_WEBSITE` (**1**) | URL | The URL of the website to scan. `DAST_API_OPENAPI` must be specified if this is omitted. |
+| `DAST_WEBSITE` <sup>1</sup> | URL | The URL of the website to scan. `DAST_API_OPENAPI` must be specified if this is omitted. |
 | `DAST_API_OPENAPI` | URL or string | The API specification to import. The specification can be hosted at a URL, or the name of a file present in the `/zap/wrk` directory. `DAST_WEBSITE` must be specified if this is omitted. |
-| `DAST_API_SPECIFICATION` (**1**) | URL or string | [Deprecated](https://gitlab.com/gitlab-org/gitlab/-/issues/290241) in GitLab 13.12 and replaced by `DAST_API_OPENAPI`. To be removed in GitLab 15.0. The API specification to import. The specification can be hosted at a URL, or the name of a file present in the `/zap/wrk` directory. `DAST_WEBSITE` must be specified if this is omitted. |
+| `DAST_API_SPECIFICATION` <sup>1</sup> | URL or string | [Deprecated](https://gitlab.com/gitlab-org/gitlab/-/issues/290241) in GitLab 13.12 and replaced by `DAST_API_OPENAPI`. To be removed in GitLab 15.0. The API specification to import. The specification can be hosted at a URL, or the name of a file present in the `/zap/wrk` directory. `DAST_WEBSITE` must be specified if this is omitted. |
-| `DAST_SPIDER_START_AT_HOST` | boolean | Set to `false` to prevent DAST from resetting the target to its host before scanning. When `true`, non-host targets `http://test.site/some_path` is reset to `http://test.site` before scan. Default: `false`. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/258805) in GitLab 13.6. |
+| `DAST_SPIDER_START_AT_HOST` | boolean | Set to `false` to prevent DAST from resetting the target to its host before scanning. When `true`, non-host targets `http://test.site/some_path` is reset to `http://test.site` before scan. Default: `true`. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/258805) in GitLab 13.6. |
-| `DAST_AUTH_URL` (**1**) | URL | The URL of the page containing the sign-in HTML form on the target website. `DAST_USERNAME` and `DAST_PASSWORD` are submitted with the login form to create an authenticated scan. Not supported for API scans. |
+| `DAST_AUTH_URL` <sup>1</sup> | URL | The URL of the page containing the sign-in HTML form on the target website. `DAST_USERNAME` and `DAST_PASSWORD` are submitted with the login form to create an authenticated scan. Not supported for API scans. |
-| `DAST_AUTH_VERIFICATION_URL` (**1**) | URL | A URL only accessible to logged in users that DAST can use to confirm successful authentication. If provided, DAST exits if it cannot access the URL. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/207335) in GitLab 13.8. |
+| `DAST_AUTH_VERIFICATION_URL` <sup>1</sup> | URL | A URL only accessible to logged in users that DAST can use to confirm successful authentication. If provided, DAST exits if it cannot access the URL. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/207335) in GitLab 13.8. |
-| `DAST_USERNAME` (**1**) | string | The username to enter into the username field on the sign-in HTML form. |
+| `DAST_USERNAME` <sup>1</sup> | string | The username to authenticate to in the website. |
-| `DAST_PASSWORD` (**1**) | string | The password to enter into the password field on the sign-in HTML form. |
+| `DAST_PASSWORD` <sup>1</sup> | string | The password to authenticate to in the website. |
-| `DAST_USERNAME_FIELD` (**1**) | selector | A selector describing the username field on the sign-in HTML form. Example: `id:user` |
+| `DAST_USERNAME_FIELD` <sup>1</sup> | string | The name of username field at the sign-in HTML form. |
-| `DAST_PASSWORD_FIELD` (**1**) | selector | A selector describing the password field on the sign-in HTML form. Example: `css:.password-field` |
+| `DAST_PASSWORD_FIELD` <sup>1</sup> | string | The name of password field at the sign-in HTML form. |
 | `DAST_SKIP_TARGET_CHECK` | boolean | Set to `true` to prevent DAST from checking that the target is available before scanning. Default: `false`. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/229067) in GitLab 13.8. |
 | `DAST_MASK_HTTP_HEADERS` | string | Comma-separated list of request and response headers to be masked (GitLab 13.1). Must contain **all** headers to be masked. Refer to [list of headers that are masked by default](#hide-sensitive-information). |
-| `DAST_EXCLUDE_URLS` (**1**) | URLs | The URLs to skip during the authenticated scan; comma-separated. Regular expression syntax can be used to match multiple URLs. For example, `.*` matches an arbitrary character sequence. Not supported for API scans. |
+| `DAST_EXCLUDE_URLS` <sup>1</sup> | URLs | The URLs to skip during the authenticated scan; comma-separated. Regular expression syntax can be used to match multiple URLs. For example, `.*` matches an arbitrary character sequence. Not supported for API scans. |
-| `DAST_FULL_SCAN_ENABLED` (**1**) | boolean | Set to `true` to run a [ZAP Full Scan](https://github.com/zaproxy/zaproxy/wiki/ZAP-Full-Scan) instead of a [ZAP Baseline Scan](https://github.com/zaproxy/zaproxy/wiki/ZAP-Baseline-Scan). Default: `false` |
+| `DAST_FULL_SCAN_ENABLED` <sup>1</sup> | boolean | Set to `true` to run a [ZAP Full Scan](https://github.com/zaproxy/zaproxy/wiki/ZAP-Full-Scan) instead of a [ZAP Baseline Scan](https://github.com/zaproxy/zaproxy/wiki/ZAP-Baseline-Scan). Default: `false` |
-| `DAST_FULL_SCAN_DOMAIN_VALIDATION_REQUIRED` | boolean | **{warning}** **[Removed](https://gitlab.com/gitlab-org/gitlab/-/issues/293595)** in GitLab 14.0. Set to `true` to require domain validation when running DAST full scans. Not supported for API scans. Default: `false` |
 | `DAST_AUTO_UPDATE_ADDONS` | boolean | ZAP add-ons are pinned to specific versions in the DAST Docker image. Set to `true` to download the latest versions when the scan starts. Default: `false` |
-| `DAST_API_HOST_OVERRIDE` (**1**) | string | Used to override domains defined in API specification files. Only supported when importing the API specification from a URL. Example: `example.com:8080` |
+| `DAST_API_HOST_OVERRIDE` <sup>1</sup> | string | Used to override domains defined in API specification files. Only supported when importing the API specification from a URL. Example: `example.com:8080` |
 | `DAST_EXCLUDE_RULES` | string | Set to a comma-separated list of Vulnerability Rule IDs to exclude them from running during the scan. Rule IDs are numbers and can be found from the DAST log or on the [ZAP project](https://www.zaproxy.org/docs/alerts/). For example, `HTTP Parameter Override` has a rule ID of `10026`. Cannot be used when `DAST_ONLY_INCLUDE_RULES` is set. **Note:** In earlier versions of GitLab the excluded rules were executed but vulnerabilities they generated were suppressed. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/118641) in GitLab 12.10. |
 | `DAST_ONLY_INCLUDE_RULES` | string | Set to a comma-separated list of Vulnerability Rule IDs to configure the scan to run only them. Rule IDs are numbers and can be found from the DAST log or on the [ZAP project](https://www.zaproxy.org/docs/alerts/). Cannot be used when `DAST_EXCLUDE_RULES` is set. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/250651) in GitLab 13.12. |
-| `DAST_REQUEST_HEADERS` (**1**) | string | Set to a comma-separated list of request header names and values. Headers are added to every request made by DAST. For example, `Cache-control: no-cache,User-Agent: DAST/1.0` |
+| `DAST_REQUEST_HEADERS` <sup>1</sup> | string | Set to a comma-separated list of request header names and values. Headers are added to every request made by DAST. For example, `Cache-control: no-cache,User-Agent: DAST/1.0` |
-| `DAST_DEBUG` (**1**) | boolean | Enable debug message output. Default: `false`. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/12652) in GitLab 13.1. |
+| `DAST_DEBUG` <sup>1</sup> | boolean | Enable debug message output. Default: `false`. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/12652) in GitLab 13.1. |
-| `DAST_TARGET_AVAILABILITY_TIMEOUT` (**1**) | number | Time limit in seconds to wait for target availability. |
+| `DAST_TARGET_AVAILABILITY_TIMEOUT` <sup>1</sup> | number | Time limit in seconds to wait for target availability. |
-| `DAST_SPIDER_MINS` (**1**) | number | The maximum duration of the spider scan in minutes. Set to `0` for unlimited. Default: One minute, or unlimited when the scan is a full scan. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/12652) in GitLab 13.1. |
+| `DAST_SPIDER_MINS` <sup>1</sup> | number | The maximum duration of the spider scan in minutes. Set to `0` for unlimited. Default: One minute, or unlimited when the scan is a full scan. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/12652) in GitLab 13.1. |
 | `DAST_HTML_REPORT` | string | The filename of the HTML report written at the end of a scan. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/12652) in GitLab 13.1. |
 | `DAST_MARKDOWN_REPORT` | string | The filename of the Markdown report written at the end of a scan. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/12652) in GitLab 13.1. |
 | `DAST_XML_REPORT` | string | The filename of the XML report written at the end of a scan. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/12652) in GitLab 13.1. |
 | `DAST_INCLUDE_ALPHA_VULNERABILITIES` | boolean | Set to `true` to include alpha passive and active scan rules. Default: `false`. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/12652) in GitLab 13.1. |
-| `DAST_USE_AJAX_SPIDER` (**1**) | boolean | Set to `true` to use the AJAX spider in addition to the traditional spider, useful for crawling sites that require JavaScript. Default: `false`. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/12652) in GitLab 13.1. |
+| `DAST_USE_AJAX_SPIDER` <sup>1</sup> | boolean | Set to `true` to use the AJAX spider in addition to the traditional spider, useful for crawling sites that require JavaScript. Default: `false`. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/12652) in GitLab 13.1. |
 | `DAST_PATHS` | string | Set to a comma-separated list of URLs for DAST to scan. For example, `/page1.html,/category1/page3.html,/page2.html`. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/214120) in GitLab 13.4. |
| `DAST_PATHS_FILE` | string | The file path containing the paths within `DAST_WEBSITE` to scan. The file must be plain text with one path per line. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/258825) in GitLab 13.6. | | `DAST_PATHS_FILE` | string | The file path containing the paths within `DAST_WEBSITE` to scan. The file must be plain text with one path per line. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/258825) in GitLab 13.6. |
| `DAST_SUBMIT_FIELD` | selector | A selector describing the element that when clicked submits the login form, or the password form of a multi-page login process. Example: `xpath://input[@value='Login']`. [Introduced](https://gitlab.com/gitlab-org/gitlab-ee/issues/9894) in GitLab 12.4. | | `DAST_SUBMIT_FIELD` | string | The `id` or `name` of the element that when clicked submits the login form or the password form of a multi-page login process. [Introduced](https://gitlab.com/gitlab-org/gitlab-ee/issues/9894) in GitLab 12.4. |
| `DAST_FIRST_SUBMIT_FIELD` | selector | A selector describing the element that when clicked submits the username form of a multi-page login process. Example: `.submit`. [Introduced](https://gitlab.com/gitlab-org/gitlab-ee/issues/9894) in GitLab 12.4. | | `DAST_FIRST_SUBMIT_FIELD` | string | The `id` or `name` of the element that when clicked submits the username form of a multi-page login process. [Introduced](https://gitlab.com/gitlab-org/gitlab-ee/issues/9894) in GitLab 12.4. |
| `DAST_ZAP_CLI_OPTIONS` | string | ZAP server command-line options. For example, `-Xmx3072m` would set the Java maximum memory allocation pool size. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/12652) in GitLab 13.1. | | `DAST_ZAP_CLI_OPTIONS` | string | ZAP server command-line options. For example, `-Xmx3072m` would set the Java maximum memory allocation pool size. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/12652) in GitLab 13.1. |
| `DAST_ZAP_LOG_CONFIGURATION` | string | Set to a semicolon-separated list of additional log4j properties for the ZAP Server. | | `DAST_ZAP_LOG_CONFIGURATION` | string | Set to a semicolon-separated list of additional log4j properties for the ZAP Server. For example, `log4j.logger.org.parosproxy.paros.network.HttpSender=DEBUG;log4j.logger.com.crawljax=DEBUG` |
| `DAST_AUTH_EXCLUDE_URLS` | URLs | **{warning}** **[Removed](https://gitlab.com/gitlab-org/gitlab/-/issues/289959)** in GitLab 14.0. Replaced by `DAST_EXCLUDE_URLS`. The URLs to skip during the authenticated scan; comma-separated. Regular expression syntax can be used to match multiple URLs. For example, `.*` matches an arbitrary character sequence. Not supported for API scans. |
| `DAST_AGGREGATE_VULNERABILITIES` | boolean | Vulnerability aggregation is set to `true` by default. To disable this feature and see each vulnerability individually set to `false`. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/254043) in GitLab 14.0. | | `DAST_AGGREGATE_VULNERABILITIES` | boolean | Vulnerability aggregation is set to `true` by default. To disable this feature and see each vulnerability individually set to `false`. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/254043) in GitLab 14.0. |
| `DAST_MAX_URLS_PER_VULNERABILITY` | number | The maximum number of URLs reported for a single vulnerability. `DAST_MAX_URLS_PER_VULNERABILITY` is set to `50` by default. To list all the URLs set to `0`. [Introduced](https://gitlab.com/gitlab-org/security-products/dast/-/merge_requests/433) in GitLab 13.12. | | `DAST_MAX_URLS_PER_VULNERABILITY` | number | The maximum number of URLs reported for a single vulnerability. `DAST_MAX_URLS_PER_VULNERABILITY` is set to `50` by default. To list all the URLs set to `0`. [Introduced](https://gitlab.com/gitlab-org/security-products/dast/-/merge_requests/433) in GitLab 13.12. |
| `DAST_AUTH_REPORT` | boolean | Used in combination with exporting the `gl-dast-debug-auth-report.html` artifact to aid in debugging authentication issues. | | `DAST_AUTH_REPORT` | boolean | Used in combination with exporting the `gl-dast-debug-auth-report.html` artifact to aid in debugging authentication issues. |
| `DAST_AUTH_VERIFICATION_SELECTOR` | selector | Verifies successful authentication by checking for presence of a selector once the login form has been submitted. Example: `css:.user-photo` | | `DAST_AUTH_VERIFICATION_SELECTOR` | selector | Verifies successful authentication by checking for presence of a selector once the login form has been submitted. Example: `css:.user-photo` |
| `DAST_AUTH_VERIFICATION_LOGIN_FORM` | boolean | Verifies successful authentication by checking for the lack of a login form once the login form has been submitted. | | `DAST_AUTH_VERIFICATION_LOGIN_FORM` | boolean | Verifies successful authentication by checking for the lack of a login form once the login form has been submitted. |
1. DAST CI/CD variable available to an on-demand scan. 1. Available to an on-demand DAST scan.
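These variables are set under the `variables` keyword of the `dast` job in `.gitlab-ci.yml`. A minimal sketch, assuming the DAST template is included and `https://example.com` stands in for the target site:

```yaml
include:
  - template: DAST.gitlab-ci.yml

dast:
  variables:
    DAST_WEBSITE: "https://example.com"
    DAST_SPIDER_MINS: "5"
    DAST_REQUEST_HEADERS: "Cache-control: no-cache,User-Agent: DAST/1.0"
```

Numeric and boolean values are quoted because CI/CD variables are passed to the scanner as strings.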
#### Selectors


@@ -332,6 +332,36 @@ If you forget to set the service alias, the `docker:19.03.12` image is unable to
error during connect: Get http://docker:2376/v1.39/info: dial tcp: lookup docker on 192.168.0.1:53: no such host
```
### Using a Docker-in-Docker image with Dependency Proxy
To use your own Docker images with Dependency Proxy, follow these steps
in addition to the steps in the
[Docker-in-Docker](../../../ci/docker/using_docker_build.md#use-the-docker-executor-with-the-docker-image-docker-in-docker) section:
1. Update the `image` and `service` to point to your registry.
1. Add a service [alias](../../../ci/yaml/README.md#servicesalias).
Below is an example of what your `.gitlab-ci.yml` should look like:
```yaml
build:
image: ${CI_DEPENDENCY_PROXY_GROUP_IMAGE_PREFIX}/group/project/docker:19.03.12
services:
- name: ${CI_DEPENDENCY_PROXY_GROUP_IMAGE_PREFIX}/docker:18.09.7-dind
alias: docker
stage: build
script:
- docker build -t my-docker-image .
- docker run my-docker-image /script/to/run/tests
```
If you forget to set the service alias, the `docker:19.03.12` image is unable to find the
`dind` service, and an error like the following is thrown:
```plaintext
error during connect: Get http://docker:2376/v1.39/info: dial tcp: lookup docker on 192.168.0.1:53: no such host
```
## Delete images

You can delete images from your Container Registry in multiple ways.


@@ -252,3 +252,21 @@ hub_docker_quota_check:
    - |
      TOKEN=$(curl "https://auth.docker.io/token?service=registry.docker.io&scope=repository:ratelimitpreview/test:pull" | jq --raw-output .token) && curl --head --header "Authorization: Bearer $TOKEN" "https://registry-1.docker.io/v2/ratelimitpreview/test/manifests/latest" 2>&1
```
## Troubleshooting

### Dependency Proxy Connection Failure

If a service alias is not set, the `docker:19.03.12` image is unable to find the
`dind` service, and an error like the following is thrown:

```plaintext
error during connect: Get http://docker:2376/v1.39/info: dial tcp: lookup docker on 192.168.0.1:53: no such host
```

This can be resolved by setting a service alias for the Docker service:

```yaml
services:
  - name: ${CI_DEPENDENCY_PROXY_GROUP_IMAGE_PREFIX}/docker:18.09.7-dind
    alias: docker
```


@@ -745,7 +745,7 @@ You can create a new package each time the `master` branch is updated.
<repositories>
  <repository>
    <id>gitlab-maven</id>
    <url>${env.CI_API_V4_URL}/projects/${env.CI_PROJECT_ID}/packages/maven</url>
  </repository>
</repositories>
<distributionManagement>


@@ -337,7 +337,7 @@ updated:
  stage: deploy
  script:
    - dotnet pack -c Release
    - dotnet nuget add source "${CI_API_V4_URL}/projects/${CI_PROJECT_ID}/packages/nuget/index.json" --name gitlab --username gitlab-ci-token --password $CI_JOB_TOKEN --store-password-in-clear-text
    - dotnet nuget push "bin/Release/*.nupkg" --source gitlab
  only:
    - master


@@ -320,7 +320,7 @@ python -m twine upload --repository <source_name> dist/<package_file>
You cannot publish a package if a package of the same name and version already exists.
You must delete the existing package first. If you attempt to publish the same package
more than once, a `400 Bad Request` error occurs.

## Install a PyPI package
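When the duplicate genuinely needs to be replaced, one approach is to remove the existing package through the GitLab Packages API before re-publishing. A sketch, with `<project_id>`, `<package_id>`, the token, and the instance hostname left as placeholders:

```shell
# List packages in the project to find the ID of the conflicting package.
curl --header "PRIVATE-TOKEN: <your_access_token>" \
  "https://gitlab.example.com/api/v4/projects/<project_id>/packages"

# Delete the existing package by ID, then re-run `python -m twine upload`.
curl --request DELETE --header "PRIVATE-TOKEN: <your_access_token>" \
  "https://gitlab.example.com/api/v4/projects/<project_id>/packages/<package_id>"
```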


@@ -14,6 +14,20 @@ module API
          detail 'This feature was introduced in GitLab 14.0'
        end
        get ':id/avatar' do
          avatar = user_group.avatar

          not_found!('Avatar') if avatar.blank?

          filename = File.basename(avatar.file.file)

          header(
            'Content-Disposition',
            ActionDispatch::Http::ContentDisposition.format(
              disposition: 'attachment',
              filename: filename
            )
          )

          present_carrierwave_file!(user_group.avatar)
        end
      end


@@ -77,8 +77,8 @@ module API
        present services, with: Entities::ProjectServiceBasic
      end

      SERVICES.each do |slug, settings|
        desc "Set #{slug} service for project"
        params do
          settings.each do |setting|
            if setting[:required]
@@ -88,12 +88,12 @@
            end
          end
        end
        put ":id/services/#{slug}" do
          integration = user_project.find_or_initialize_integration(slug.underscore)
          params = declared_params(include_missing: false).merge(active: true)

          if integration.update(params)
            present integration, with: Entities::ProjectService
          else
            render_api_error!('400 Bad Request', 400)
          end
@@ -102,19 +102,15 @@
      desc "Delete a service for project"
      params do
        requires :slug, type: String, values: SERVICES.keys, desc: 'The name of the service'
      end
      delete ":id/services/:slug" do
        integration = user_project.find_or_initialize_integration(params[:slug].underscore)

        destroy_conditionally!(integration) do
          attrs = service_attributes(integration).index_with { nil }.merge(active: false)

          render_api_error!('400 Bad Request', 400) unless integration.update(attrs)
        end
      end
@@ -122,10 +118,10 @@
        success Entities::ProjectService
      end
      params do
        requires :slug, type: String, values: SERVICES.keys, desc: 'The name of the service'
      end
      get ":id/services/:slug" do
        integration = user_project.find_or_initialize_integration(params[:slug].underscore)

        not_found!('Service') unless integration&.persisted?


@@ -156,10 +156,10 @@ module Gitlab
          underscored_service = matched_login['service'].underscore

          return unless Integration.available_integration_names.include?(underscored_service)

          # We treat underscored_service as a trusted input because it is included
          # in the Integration.available_integration_names allowlist.
          accessor = Project.integration_association_name(underscored_service)
          service = project.public_send(accessor) # rubocop:disable GitlabSecurity/PublicSend


@@ -107,10 +107,10 @@ module Gitlab
          return success(result) unless prometheus_enabled?
          return success(result) unless prometheus_server_address.present?

          prometheus = result[:project].find_or_initialize_integration('prometheus')

          unless prometheus.update(prometheus_integration_attributes)
            log_error('Could not save prometheus manual configuration for self-monitoring project. Errors: %{errors}' % { errors: prometheus.errors.full_messages })
            return error(_('Could not save prometheus manual configuration'))
          end


@@ -26,7 +26,7 @@ module Gitlab
      private

      def service_prometheus_adapter
        project.find_or_initialize_integration('prometheus')
      end
    end
  end


@@ -403,7 +403,7 @@ module Gitlab
      def services_usage
        # rubocop: disable UsageData/LargeTable:
        Integration.available_integration_names(include_dev: false).each_with_object({}) do |name, response|
          type = Integration.integration_name_to_type(name)
          response[:"projects_#{name}_active"] = count(Integration.active.where.not(project: nil).where(type: type))


@@ -10,7 +10,7 @@ RSpec.describe Admin::IntegrationsController do
  end

  describe '#edit' do
    Integration.available_integration_names.each do |integration_name|
      context "#{integration_name}" do
        it 'successfully displays the template' do
          get :edit, params: { id: integration_name }
@@ -27,7 +27,7 @@
      end

      it 'returns 404' do
        get :edit, params: { id: Integration.available_integration_names.sample }

        expect(response).to have_gitlab_http_status(:not_found)
      end


@@ -36,7 +36,7 @@ RSpec.describe Groups::Settings::IntegrationsController do
  describe '#edit' do
    context 'when user is not owner' do
      it 'renders not_found' do
        get :edit, params: { group_id: group, id: Integration.available_integration_names(include_project_specific: false).sample }

        expect(response).to have_gitlab_http_status(:not_found)
      end
@@ -47,8 +47,8 @@
        group.add_owner(user)
      end

      Integration.available_integration_names(include_project_specific: false).each do |integration_name|
        context integration_name do
          it 'successfully displays the template' do
            get :edit, params: { group_id: group, id: integration_name }


@@ -1,15 +1,8 @@
import MockAdapter from 'axios-mock-adapter';
import axios from '~/lib/utils/axios_utils';
import httpStatus from '~/lib/utils/http_status';
import { resolvers } from '~/pipeline_editor/graphql/resolvers';
import { mockLintResponse } from '../mock_data';

jest.mock('~/api', () => {
  return {
@@ -18,36 +11,6 @@
});

describe('~/pipeline_editor/graphql/resolvers', () => {
  describe('Mutation', () => {
    describe('lintCI', () => {
      let mock;


@@ -35,6 +35,23 @@ job_build:
    - echo "build"
  needs: ["job_test_2"]
`;

export const mockBlobContentQueryResponse = {
  data: {
    project: { repository: { blobs: { nodes: [{ rawBlob: mockCiYml }] } } },
  },
};

export const mockBlobContentQueryResponseNoCiFile = {
  data: {
    project: { repository: { blobs: { nodes: [] } } },
  },
};

export const mockBlobContentQueryResponseEmptyCiFile = {
  data: {
    project: { repository: { blobs: { nodes: [{ rawBlob: '' }] } } },
  },
};

const mockJobFields = {
  beforeScript: [],


@@ -3,7 +3,6 @@ import { shallowMount, createLocalVue } from '@vue/test-utils';
import VueApollo from 'vue-apollo';
import createMockApollo from 'helpers/mock_apollo_helper';
import waitForPromises from 'helpers/wait_for_promises';
import CommitForm from '~/pipeline_editor/components/commit/commit_form.vue';
import TextEditor from '~/pipeline_editor/components/editor/text_editor.vue';
@@ -11,15 +10,19 @@
import PipelineEditorEmptyState from '~/pipeline_editor/components/ui/pipeline_editor_empty_state.vue';
import PipelineEditorMessages from '~/pipeline_editor/components/ui/pipeline_editor_messages.vue';
import { COMMIT_SUCCESS, COMMIT_FAILURE } from '~/pipeline_editor/constants';
import getBlobContent from '~/pipeline_editor/graphql/queries/blob_content.graphql';
import getCiConfigData from '~/pipeline_editor/graphql/queries/ci_config.graphql';
import PipelineEditorApp from '~/pipeline_editor/pipeline_editor_app.vue';
import PipelineEditorHome from '~/pipeline_editor/pipeline_editor_home.vue';
import {
  mockCiConfigPath,
  mockCiConfigQueryResponse,
  mockBlobContentQueryResponse,
  mockBlobContentQueryResponseEmptyCiFile,
  mockBlobContentQueryResponseNoCiFile,
  mockDefaultBranch,
  mockProjectFullPath,
  mockCiYml,
} from './mock_data';

const localVue = createLocalVue();
@@ -75,19 +78,12 @@ describe('Pipeline editor app component', () => {
  };

  const createComponentWithApollo = async ({ props = {}, provide = {} } = {}) => {
    const handlers = [
      [getBlobContent, mockBlobContentData],
      [getCiConfigData, mockCiConfigData],
    ];

    mockApollo = createMockApollo(handlers);

    const options = {
      localVue,
@@ -133,7 +129,7 @@
  describe('when queries are called', () => {
    beforeEach(() => {
      mockBlobContentData.mockResolvedValue(mockBlobContentQueryResponse);
      mockCiConfigData.mockResolvedValue(mockCiConfigQueryResponse);
    });
@@ -159,35 +155,14 @@
  });

  describe('when no CI config file exists', () => {
    it('shows an empty state and does not show editor home component', async () => {
      mockBlobContentData.mockResolvedValue(mockBlobContentQueryResponseNoCiFile);
      await createComponentWithApollo();

      expect(findEmptyState().exists()).toBe(true);
      expect(findAlert().exists()).toBe(false);
      expect(findEditorHome().exists()).toBe(false);
    });

    describe('because of a fetching error', () => {
      it('shows a unkown error message', async () => {
@@ -204,14 +179,29 @@
      });
    });
  describe('with an empty CI config file', () => {
    describe('with empty state feature flag on', () => {
      it('does not show the empty screen state', async () => {
        mockBlobContentData.mockResolvedValue(mockBlobContentQueryResponseEmptyCiFile);

        await createComponentWithApollo({
          provide: {
            glFeatures: {
              pipelineEditorEmptyStateAction: true,
            },
          },
        });

        expect(findEmptyState().exists()).toBe(false);
        expect(findTextEditor().exists()).toBe(true);
      });
    });
  });

  describe('when landing on the empty state with feature flag on', () => {
    it('user can click on CTA button and see an empty editor', async () => {
      mockBlobContentData.mockResolvedValue(mockBlobContentQueryResponseNoCiFile);

      await createComponentWithApollo({
        provide: {
          glFeatures: {
@@ -315,17 +305,13 @@
    });

    it('hides start screen when refetch fetches CI file', async () => {
      mockBlobContentData.mockResolvedValue(mockBlobContentQueryResponseNoCiFile);
      await createComponentWithApollo();

      expect(findEmptyState().exists()).toBe(true);
      expect(findEditorHome().exists()).toBe(false);

      mockBlobContentData.mockResolvedValue(mockBlobContentQueryResponse);
      await wrapper.vm.$apollo.queries.initialCiFileContent.refetch();

      expect(findEmptyState().exists()).toBe(false);


@@ -56,7 +56,6 @@ describe('stage column component', () => {
  afterEach(() => {
    wrapper.destroy();
  });

  describe('when mounted', () => {


@@ -2,29 +2,29 @@
exports[`Links Inner component with a large number of needs matches snapshot and has expected path 1`] = `
"<div class=\\"gl-display-flex gl-relative\\" totalgroups=\\"10\\"><svg id=\\"link-svg\\" viewBox=\\"0,0,1019,445\\" width=\\"1019px\\" height=\\"445px\\" class=\\"gl-absolute gl-pointer-events-none\\">
  <path d=\\"M202,118C52,118,52,138,102,138\\" stroke-width=\\"2\\" class=\\"gl-fill-transparent gl-transition-duration-slow gl-transition-timing-function-ease gl-stroke-gray-200\\"></path>
  <path d=\\"M202,118C62,118,62,148,112,148\\" stroke-width=\\"2\\" class=\\"gl-fill-transparent gl-transition-duration-slow gl-transition-timing-function-ease gl-stroke-gray-200\\"></path>
  <path d=\\"M222,138C72,138,72,158,122,158\\" stroke-width=\\"2\\" class=\\"gl-fill-transparent gl-transition-duration-slow gl-transition-timing-function-ease gl-stroke-gray-200\\"></path>
  <path d=\\"M212,128C82,128,82,168,132,168\\" stroke-width=\\"2\\" class=\\"gl-fill-transparent gl-transition-duration-slow gl-transition-timing-function-ease gl-stroke-gray-200\\"></path>
  <path d=\\"M232,148C92,148,92,178,142,178\\" stroke-width=\\"2\\" class=\\"gl-fill-transparent gl-transition-duration-slow gl-transition-timing-function-ease gl-stroke-gray-200\\"></path>
</svg> </div>"
`;

exports[`Links Inner component with a parallel need matches snapshot and has expected path 1`] = `
"<div class=\\"gl-display-flex gl-relative\\" totalgroups=\\"10\\"><svg id=\\"link-svg\\" viewBox=\\"0,0,1019,445\\" width=\\"1019px\\" height=\\"445px\\" class=\\"gl-absolute gl-pointer-events-none\\">
  <path d=\\"M192,108C32,108,32,118,82,118\\" stroke-width=\\"2\\" class=\\"gl-fill-transparent gl-transition-duration-slow gl-transition-timing-function-ease gl-stroke-gray-200\\"></path>
</svg> </div>"
`;

exports[`Links Inner component with one need matches snapshot and has expected path 1`] = `
"<div class=\\"gl-display-flex gl-relative\\" totalgroups=\\"10\\"><svg id=\\"link-svg\\" viewBox=\\"0,0,1019,445\\" width=\\"1019px\\" height=\\"445px\\" class=\\"gl-absolute gl-pointer-events-none\\">
  <path d=\\"M202,118C52,118,52,138,102,138\\" stroke-width=\\"2\\" class=\\"gl-fill-transparent gl-transition-duration-slow gl-transition-timing-function-ease gl-stroke-gray-200\\"></path>
</svg> </div>" </svg> </div>"
`; `;
exports[`Links Inner component with same stage needs matches snapshot and has expected path 1`] = ` exports[`Links Inner component with same stage needs matches snapshot and has expected path 1`] = `
"<div class=\\"gl-display-flex gl-relative\\" totalgroups=\\"10\\"><svg id=\\"link-svg\\" viewBox=\\"0,0,1019,445\\" width=\\"1019px\\" height=\\"445px\\" class=\\"gl-absolute gl-pointer-events-none\\"> "<div class=\\"gl-display-flex gl-relative\\" totalgroups=\\"10\\"><svg id=\\"link-svg\\" viewBox=\\"0,0,1019,445\\" width=\\"1019px\\" height=\\"445px\\" class=\\"gl-absolute gl-pointer-events-none\\">
<path d=\\"M192,108L22,108C52,108,52,118,82,118\\" stroke-width=\\"2\\" class=\\"gl-fill-transparent gl-transition-duration-slow gl-transition-timing-function-ease gl-stroke-gray-200\\"></path> <path d=\\"M192,108C32,108,32,118,82,118\\" stroke-width=\\"2\\" class=\\"gl-fill-transparent gl-transition-duration-slow gl-transition-timing-function-ease gl-stroke-gray-200\\"></path>
<path d=\\"M202,118L32,118C62,118,62,128,92,128\\" stroke-width=\\"2\\" class=\\"gl-fill-transparent gl-transition-duration-slow gl-transition-timing-function-ease gl-stroke-gray-200\\"></path> <path d=\\"M202,118C42,118,42,128,92,128\\" stroke-width=\\"2\\" class=\\"gl-fill-transparent gl-transition-duration-slow gl-transition-timing-function-ease gl-stroke-gray-200\\"></path>
</svg> </div>" </svg> </div>"
`; `;


@@ -6,7 +6,7 @@ import LinksInner from '~/pipelines/components/graph_shared/links_inner.vue';
 import LinksLayer from '~/pipelines/components/graph_shared/links_layer.vue';
 import JobPill from '~/pipelines/components/pipeline_graph/job_pill.vue';
 import PipelineGraph from '~/pipelines/components/pipeline_graph/pipeline_graph.vue';
-import StagePill from '~/pipelines/components/pipeline_graph/stage_pill.vue';
+import StageName from '~/pipelines/components/pipeline_graph/stage_name.vue';
 import { pipelineData, singleStageData } from './mock_data';

 describe('pipeline graph component', () => {
@@ -35,11 +35,9 @@ describe('pipeline graph component', () => {
   const findAlert = () => wrapper.findComponent(GlAlert);
   const findAllJobPills = () => wrapper.findAll(JobPill);
-  const findAllStageBackgroundElements = () => wrapper.findAll('[data-testid="stage-background"]');
-  const findAllStagePills = () => wrapper.findAllComponents(StagePill);
+  const findAllStageNames = () => wrapper.findAllComponents(StageName);
   const findLinksLayer = () => wrapper.findComponent(LinksLayer);
   const findPipelineGraph = () => wrapper.find('[data-testid="graph-container"]');
-  const findStageBackgroundElementAt = (index) => findAllStageBackgroundElements().at(index);

   afterEach(() => {
     wrapper.destroy();
@@ -67,10 +65,10 @@ describe('pipeline graph component', () => {
       wrapper = createComponent({ pipelineData: singleStageData });
     });

-    it('renders the right number of stage pills', () => {
+    it('renders the right number of stage titles', () => {
       const expectedStagesLength = singleStageData.stages.length;

-      expect(findAllStagePills()).toHaveLength(expectedStagesLength);
+      expect(findAllStageNames()).toHaveLength(expectedStagesLength);
     });

     it('renders the right number of job pills', () => {
@@ -81,20 +79,6 @@ describe('pipeline graph component', () => {
       expect(findAllJobPills()).toHaveLength(expectedJobsLength);
     });

-    describe('rounds corner', () => {
-      it.each`
-        cssClass                       | expectedState
-        ${'gl-rounded-bottom-left-6'}  | ${true}
-        ${'gl-rounded-top-left-6'}     | ${true}
-        ${'gl-rounded-top-right-6'}    | ${true}
-        ${'gl-rounded-bottom-right-6'} | ${true}
-      `('$cssClass should be $expectedState on the only element', ({ cssClass, expectedState }) => {
-        const classes = findStageBackgroundElementAt(0).classes();
-
-        expect(classes.includes(cssClass)).toBe(expectedState);
-      });
-    });
   });

   describe('with multiple stages and jobs', () => {
@@ -102,10 +86,10 @@ describe('pipeline graph component', () => {
       wrapper = createComponent();
     });

-    it('renders the right number of stage pills', () => {
+    it('renders the right number of stage titles', () => {
       const expectedStagesLength = pipelineData.stages.length;

-      expect(findAllStagePills()).toHaveLength(expectedStagesLength);
+      expect(findAllStageNames()).toHaveLength(expectedStagesLength);
     });

     it('renders the right number of job pills', () => {
@@ -116,34 +100,5 @@ describe('pipeline graph component', () => {
       expect(findAllJobPills()).toHaveLength(expectedJobsLength);
     });

-    describe('rounds corner', () => {
-      it.each`
-        cssClass                       | expectedState
-        ${'gl-rounded-bottom-left-6'}  | ${true}
-        ${'gl-rounded-top-left-6'}     | ${true}
-        ${'gl-rounded-top-right-6'}    | ${false}
-        ${'gl-rounded-bottom-right-6'} | ${false}
-      `(
-        '$cssClass should be $expectedState on the first element',
-        ({ cssClass, expectedState }) => {
-          const classes = findStageBackgroundElementAt(0).classes();
-
-          expect(classes.includes(cssClass)).toBe(expectedState);
-        },
-      );
-
-      it.each`
-        cssClass                       | expectedState
-        ${'gl-rounded-bottom-left-6'}  | ${false}
-        ${'gl-rounded-top-left-6'}     | ${false}
-        ${'gl-rounded-top-right-6'}    | ${true}
-        ${'gl-rounded-bottom-right-6'} | ${true}
-      `('$cssClass should be $expectedState on the last element', ({ cssClass, expectedState }) => {
-        const classes = findStageBackgroundElementAt(pipelineData.stages.length - 1).classes();
-
-        expect(classes.includes(cssClass)).toBe(expectedState);
-      });
-    });
   });
 });


@@ -8,6 +8,6 @@ RSpec.describe GitlabSchema.types['ServiceType'] do
   end

   def available_services_enum
-    ::Integration.available_services_types(include_dev: false).map(&:underscore).map(&:upcase)
+    ::Integration.available_integration_types(include_dev: false).map(&:underscore).map(&:upcase)
   end
 end


@@ -238,16 +238,16 @@ RSpec.describe EmailsHelper do
     it 'returns the default header logo' do
       create :appearance, header_logo: nil

-      expect(header_logo).to eq(
-        %{<img alt="GitLab" src="/images/mailers/gitlab_header_logo.gif" width="55" height="50" />}
+      expect(header_logo).to match(
+        %r{<img alt="GitLab" src="/images/mailers/gitlab_header_logo\.(?:gif|png)" width="\d+" height="\d+" />}
       )
     end
   end

   context 'there is no brand item' do
     it 'returns the default header logo' do
-      expect(header_logo).to eq(
-        %{<img alt="GitLab" src="/images/mailers/gitlab_header_logo.gif" width="55" height="50" />}
+      expect(header_logo).to match(
+        %r{<img alt="GitLab" src="/images/mailers/gitlab_header_logo\.(?:gif|png)" width="\d+" height="\d+" />}
       )
     end
   end


@@ -24,8 +24,8 @@ RSpec.describe OperationsHelper do
     let_it_be(:prometheus_integration) { ::Integrations::Prometheus.new(project: project) }

     before do
-      allow(project).to receive(:find_or_initialize_service).and_call_original
-      allow(project).to receive(:find_or_initialize_service).with('prometheus').and_return(prometheus_integration)
+      allow(project).to receive(:find_or_initialize_integration).and_call_original
+      allow(project).to receive(:find_or_initialize_integration).with('prometheus').and_return(prometheus_integration)
     end

     it 'returns the correct values' do


@@ -13,7 +13,7 @@ RSpec.describe Gitlab::Prometheus::Adapter do
       let(:prometheus_integration) { double(:prometheus_integration, can_query?: true) }

       before do
-        allow(project).to receive(:find_or_initialize_service).with('prometheus').and_return prometheus_integration
+        allow(project).to receive(:find_or_initialize_integration).with('prometheus').and_return prometheus_integration
       end

       it 'return prometheus integration as prometheus adapter' do
@@ -33,7 +33,7 @@ RSpec.describe Gitlab::Prometheus::Adapter do
       let(:prometheus_integration) { double(:prometheus_integration, can_query?: false) }

       before do
-        allow(project).to receive(:find_or_initialize_service).with('prometheus').and_return prometheus_integration
+        allow(project).to receive(:find_or_initialize_integration).with('prometheus').and_return prometheus_integration
       end

       context 'with cluster with prometheus disabled' do


@@ -23,7 +23,7 @@ RSpec.describe DeploymentMetrics do
       let(:prometheus_integration) { instance_double(::Integrations::Prometheus, can_query?: true, configured?: true) }

       before do
-        allow(deployment.project).to receive(:find_or_initialize_service).with('prometheus').and_return prometheus_integration
+        allow(deployment.project).to receive(:find_or_initialize_integration).with('prometheus').and_return prometheus_integration
       end

       it { is_expected.to be_truthy }
@@ -33,7 +33,7 @@ RSpec.describe DeploymentMetrics do
       let(:prometheus_integration) { instance_double(::Integrations::Prometheus, configured?: true, can_query?: false) }

       before do
-        allow(deployment.project).to receive(:find_or_initialize_service).with('prometheus').and_return prometheus_integration
+        allow(deployment.project).to receive(:find_or_initialize_integration).with('prometheus').and_return prometheus_integration
       end

       it { is_expected.to be_falsy }
@@ -43,7 +43,7 @@ RSpec.describe DeploymentMetrics do
       let(:prometheus_integration) { instance_double(::Integrations::Prometheus, configured?: false, can_query?: false) }

       before do
-        allow(deployment.project).to receive(:find_or_initialize_service).with('prometheus').and_return prometheus_integration
+        allow(deployment.project).to receive(:find_or_initialize_integration).with('prometheus').and_return prometheus_integration
       end

       it { is_expected.to be_falsy }


@@ -140,10 +140,10 @@ RSpec.describe Integration do
   end

   describe "Test Button" do
-    let(:service) { build(:service, project: project) }
+    let(:integration) { build(:service, project: project) }

     describe '#can_test?' do
-      subject { service.can_test? }
+      subject { integration.can_test? }

       context 'when repository is not empty' do
         let(:project) { build(:project, :repository) }
@@ -158,9 +158,9 @@ RSpec.describe Integration do
       end

       context 'when instance-level service' do
-        Integration.available_services_types.each do |service_type|
-          let(:service) do
-            described_class.send(:integration_type_to_model, service_type).new(instance: true)
+        Integration.available_integration_types.each do |type|
+          let(:integration) do
+            described_class.send(:integration_type_to_model, type).new(instance: true)
           end

           it { is_expected.to be_falsey }
@@ -168,9 +168,9 @@ RSpec.describe Integration do
       end

       context 'when group-level service' do
-        Integration.available_services_types.each do |service_type|
-          let(:service) do
-            described_class.send(:integration_type_to_model, service_type).new(group_id: group.id)
+        Integration.available_integration_types.each do |type|
+          let(:integration) do
+            described_class.send(:integration_type_to_model, type).new(group_id: group.id)
           end

           it { is_expected.to be_falsey }
@@ -185,9 +185,9 @@ RSpec.describe Integration do
         let(:project) { build(:project, :repository) }

         it 'test runs execute' do
-          expect(service).to receive(:execute).with(data)
+          expect(integration).to receive(:execute).with(data)

-          service.test(data)
+          integration.test(data)
         end
       end
@@ -195,9 +195,9 @@ RSpec.describe Integration do
         let(:project) { build(:project) }

         it 'test runs execute' do
-          expect(service).to receive(:execute).with(data)
+          expect(integration).to receive(:execute).with(data)

-          service.test(data)
+          integration.test(data)
         end
       end
     end
@@ -251,11 +251,13 @@ RSpec.describe Integration do
   describe '.find_or_initialize_all_non_project_specific' do
     shared_examples 'service instances' do
       it 'returns the available service instances' do
-        expect(Integration.find_or_initialize_all_non_project_specific(Integration.for_instance).map(&:to_param)).to match_array(Integration.available_services_names(include_project_specific: false))
+        expect(Integration.find_or_initialize_all_non_project_specific(Integration.for_instance).map(&:to_param))
+          .to match_array(Integration.available_integration_names(include_project_specific: false))
       end

       it 'does not create service instances' do
-        expect { Integration.find_or_initialize_all_non_project_specific(Integration.for_instance) }.not_to change { Integration.count }
+        expect { Integration.find_or_initialize_all_non_project_specific(Integration.for_instance) }
+          .not_to change(Integration, :count)
       end
     end
@@ -264,7 +266,7 @@ RSpec.describe Integration do
     context 'with all existing instances' do
       before do
         Integration.insert_all(
-          Integration.available_services_types(include_project_specific: false).map { |type| { instance: true, type: type } }
+          Integration.available_integration_types(include_project_specific: false).map { |type| { instance: true, type: type } }
         )
       end
@@ -292,13 +294,15 @@ RSpec.describe Integration do
   describe 'template' do
     shared_examples 'retrieves service templates' do
       it 'returns the available service templates' do
-        expect(Integration.find_or_create_templates.pluck(:type)).to match_array(Integration.available_services_types(include_project_specific: false))
+        expect(Integration.find_or_create_templates.pluck(:type)).to match_array(Integration.available_integration_types(include_project_specific: false))
       end
     end

     describe '.find_or_create_templates' do
       it 'creates service templates' do
-        expect { Integration.find_or_create_templates }.to change { Integration.count }.from(0).to(Integration.available_services_names(include_project_specific: false).size)
+        total = Integration.available_integration_names(include_project_specific: false).size
+
+        expect { Integration.find_or_create_templates }.to change(Integration, :count).from(0).to(total)
       end

       it_behaves_like 'retrieves service templates'
@@ -306,7 +310,7 @@ RSpec.describe Integration do
       context 'with all existing templates' do
         before do
           Integration.insert_all(
-            Integration.available_services_types(include_project_specific: false).map { |type| { template: true, type: type } }
+            Integration.available_integration_types(include_project_specific: false).map { |type| { template: true, type: type } }
           )
         end
@@ -332,7 +336,9 @@ RSpec.describe Integration do
       end

       it 'creates the rest of the service templates' do
-        expect { Integration.find_or_create_templates }.to change { Integration.count }.from(1).to(Integration.available_services_names(include_project_specific: false).size)
+        total = Integration.available_integration_names(include_project_specific: false).size
+
+        expect { Integration.find_or_create_templates }.to change(Integration, :count).from(1).to(total)
       end

       it_behaves_like 'retrieves service templates'
@@ -461,13 +467,15 @@ RSpec.describe Integration do
     describe 'is prefilled for projects pushover service' do
       it "has all fields prefilled" do
-        service = project.find_or_initialize_service('pushover')
+        integration = project.find_or_initialize_integration('pushover')

-        expect(service.template).to eq(false)
-        expect(service.device).to eq('MyDevice')
-        expect(service.sound).to eq('mic')
-        expect(service.priority).to eq(4)
-        expect(service.api_key).to eq('123456789')
+        expect(integration).to have_attributes(
+          template: eq(false),
+          device: eq('MyDevice'),
+          sound: eq('mic'),
+          priority: eq(4),
+          api_key: eq('123456789')
+        )
       end
     end
   end
@@ -896,37 +904,37 @@ RSpec.describe Integration do
     end
   end

-  describe '.available_services_names' do
+  describe '.available_integration_names' do
     it 'calls the right methods' do
-      expect(described_class).to receive(:services_names).and_call_original
-      expect(described_class).to receive(:dev_services_names).and_call_original
-      expect(described_class).to receive(:project_specific_services_names).and_call_original
+      expect(described_class).to receive(:integration_names).and_call_original
+      expect(described_class).to receive(:dev_integration_names).and_call_original
+      expect(described_class).to receive(:project_specific_integration_names).and_call_original

-      described_class.available_services_names
+      described_class.available_integration_names
     end

-    it 'does not call project_specific_services_names with include_project_specific false' do
-      expect(described_class).to receive(:services_names).and_call_original
-      expect(described_class).to receive(:dev_services_names).and_call_original
-      expect(described_class).not_to receive(:project_specific_services_names)
+    it 'does not call project_specific_integration_names with include_project_specific false' do
+      expect(described_class).to receive(:integration_names).and_call_original
+      expect(described_class).to receive(:dev_integration_names).and_call_original
+      expect(described_class).not_to receive(:project_specific_integration_names)

-      described_class.available_services_names(include_project_specific: false)
+      described_class.available_integration_names(include_project_specific: false)
     end

     it 'does not call dev_services_names with include_dev false' do
-      expect(described_class).to receive(:services_names).and_call_original
-      expect(described_class).not_to receive(:dev_services_names)
-      expect(described_class).to receive(:project_specific_services_names).and_call_original
+      expect(described_class).to receive(:integration_names).and_call_original
+      expect(described_class).not_to receive(:dev_integration_names)
+      expect(described_class).to receive(:project_specific_integration_names).and_call_original

-      described_class.available_services_names(include_dev: false)
+      described_class.available_integration_names(include_dev: false)
     end

-    it { expect(described_class.available_services_names).to include('jenkins') }
+    it { expect(described_class.available_integration_names).to include('jenkins') }
   end

-  describe '.project_specific_services_names' do
+  describe '.project_specific_integration_names' do
     it do
-      expect(described_class.project_specific_services_names)
+      expect(described_class.project_specific_integration_names)
         .to include(*described_class::PROJECT_SPECIFIC_INTEGRATION_NAMES)
     end
   end


@@ -1557,13 +1557,16 @@ RSpec.describe Project, factory_default: :keep do
     end
   end

-  describe '.with_service' do
+  describe '.with_integration' do
     before do
       create_list(:prometheus_project, 2)
     end

-    it 'avoid n + 1' do
-      expect { described_class.with_service(:prometheus_integration).map(&:prometheus_integration) }.not_to exceed_query_limit(1)
+    let(:integration) { :prometheus_integration }
+
+    it 'avoids n + 1' do
+      expect { described_class.with_integration(integration).map(&integration) }
+        .not_to exceed_query_limit(1)
     end
   end
@@ -5838,53 +5841,53 @@ RSpec.describe Project, factory_default: :keep do
     end
   end

-  describe '#find_or_initialize_services' do
+  describe '#find_or_initialize_integrations' do
     let_it_be(:subject) { create(:project) }

     it 'avoids N+1 database queries' do
-      control_count = ActiveRecord::QueryRecorder.new { subject.find_or_initialize_services }.count
+      control_count = ActiveRecord::QueryRecorder.new { subject.find_or_initialize_integrations }.count

       expect(control_count).to be <= 4
     end

-    it 'avoids N+1 database queries with more available services' do
-      allow(Integration).to receive(:available_services_names).and_return(%w[pushover])
-      control_count = ActiveRecord::QueryRecorder.new { subject.find_or_initialize_services }
+    it 'avoids N+1 database queries with more available integrations' do
+      allow(Integration).to receive(:available_integration_names).and_return(%w[pushover])
+      control_count = ActiveRecord::QueryRecorder.new { subject.find_or_initialize_integrations }

-      allow(Integration).to receive(:available_services_names).and_call_original
-      expect { subject.find_or_initialize_services }.not_to exceed_query_limit(control_count)
+      allow(Integration).to receive(:available_integration_names).and_call_original
+      expect { subject.find_or_initialize_integrations }.not_to exceed_query_limit(control_count)
     end

-    context 'with disabled services' do
+    context 'with disabled integrations' do
       before do
-        allow(Integration).to receive(:available_services_names).and_return(%w[prometheus pushover teamcity])
-        allow(subject).to receive(:disabled_services).and_return(%w[prometheus])
+        allow(Integration).to receive(:available_integration_names).and_return(%w[prometheus pushover teamcity])
+        allow(subject).to receive(:disabled_integrations).and_return(%w[prometheus])
       end

       it 'returns only enabled services sorted' do
-        services = subject.find_or_initialize_services
-
-        expect(services.size).to eq(2)
-        expect(services.map(&:title)).to eq(['JetBrains TeamCity', 'Pushover'])
+        expect(subject.find_or_initialize_integrations).to match [
+          have_attributes(title: 'JetBrains TeamCity'),
+          have_attributes(title: 'Pushover')
+        ]
       end
     end
   end

-  describe '#find_or_initialize_service' do
+  describe '#find_or_initialize_integration' do
     it 'avoids N+1 database queries' do
-      allow(Integration).to receive(:available_services_names).and_return(%w[prometheus pushover])
+      allow(Integration).to receive(:available_integration_names).and_return(%w[prometheus pushover])

-      control_count = ActiveRecord::QueryRecorder.new { subject.find_or_initialize_service('prometheus') }.count
+      control_count = ActiveRecord::QueryRecorder.new { subject.find_or_initialize_integration('prometheus') }.count

-      allow(Integration).to receive(:available_services_names).and_call_original
+      allow(Integration).to receive(:available_integration_names).and_call_original

-      expect { subject.find_or_initialize_service('prometheus') }.not_to exceed_query_limit(control_count)
+      expect { subject.find_or_initialize_integration('prometheus') }.not_to exceed_query_limit(control_count)
     end

     it 'returns nil if integration is disabled' do
-      allow(subject).to receive(:disabled_services).and_return(%w[prometheus])
+      allow(subject).to receive(:disabled_integrations).and_return(%w[prometheus])

-      expect(subject.find_or_initialize_service('prometheus')).to be_nil
+      expect(subject.find_or_initialize_integration('prometheus')).to be_nil
     end

     context 'with an existing integration' do
@@ -5895,7 +5898,7 @@ RSpec.describe Project, factory_default: :keep do
       end

       it 'retrieves the integration' do
-        expect(subject.find_or_initialize_service('prometheus').api_url).to eq('https://prometheus.project.com/')
+        expect(subject.find_or_initialize_integration('prometheus').api_url).to eq('https://prometheus.project.com/')
       end
     end
@@ -5905,25 +5908,25 @@ RSpec.describe Project, factory_default: :keep do
         create(:prometheus_integration, :template, api_url: 'https://prometheus.template.com/')
       end

-      it 'builds the service from the instance if exists' do
-        expect(subject.find_or_initialize_service('prometheus').api_url).to eq('https://prometheus.instance.com/')
+      it 'builds the service from the instance integration' do
+        expect(subject.find_or_initialize_integration('prometheus').api_url).to eq('https://prometheus.instance.com/')
       end
     end

-    context 'with an instance-level and template integrations' do
+    context 'with a template integration and no instance-level' do
       before do
         create(:prometheus_integration, :template, api_url: 'https://prometheus.template.com/')
       end

-      it 'builds the service from the template if instance does not exists' do
-        expect(subject.find_or_initialize_service('prometheus').api_url).to eq('https://prometheus.template.com/')
+      it 'builds the service from the template' do
+        expect(subject.find_or_initialize_integration('prometheus').api_url).to eq('https://prometheus.template.com/')
       end
     end

-    context 'without an exisiting integration, nor instance-level or template' do
-      it 'builds the service if instance or template does not exists' do
-        expect(subject.find_or_initialize_service('prometheus')).to be_a(::Integrations::Prometheus)
-        expect(subject.find_or_initialize_service('prometheus').api_url).to be_nil
+    context 'without an exisiting integration, or instance-level or template' do
+      it 'builds the service' do
+        expect(subject.find_or_initialize_integration('prometheus')).to be_a(::Integrations::Prometheus)
+        expect(subject.find_or_initialize_integration('prometheus').api_url).to be_nil
       end
     end
   end


@@ -15,6 +15,8 @@ RSpec.describe API::GroupAvatar do
       get api(avatar_path(group))

       expect(response).to have_gitlab_http_status(:ok)
+      expect(response.headers['Content-Disposition'])
+        .to eq(%(attachment; filename="dk.png"; filename*=UTF-8''dk.png))
     end

     context 'when the group does not have avatar' do
@@ -24,6 +26,8 @@ RSpec.describe API::GroupAvatar do
       get api(avatar_path(group))

       expect(response).to have_gitlab_http_status(:not_found)
+      expect(response.body)
+        .to eq(%({"message":"404 Avatar Not Found"}))
     end
   end


@@ -24,11 +24,11 @@ RSpec.describe API::Services do
        expect(response).to have_gitlab_http_status(:forbidden)
      end

-     context 'project with services' do
+     context 'with integrations' do
        let!(:active_integration) { create(:emails_on_push_integration, project: project, active: true) }
        let!(:integration) { create(:custom_issue_tracker_integration, project: project, active: false) }

-       it "returns a list of all active services" do
+       it "returns a list of all active integrations" do
          get api("/projects/#{project.id}/services", user)

          aggregate_failures 'expect successful response with all active services' do
@@ -42,7 +42,7 @@ RSpec.describe API::Services do
      end
    end

-  Integration.available_services_names.each do |service|
+  Integration.available_integration_names.each do |service|
    describe "PUT /projects/:id/services/#{service.dasherize}" do
      include_context service
@@ -99,7 +99,7 @@ RSpec.describe API::Services do
      include_context service

      before do
-       initialize_service(service)
+       initialize_integration(service)
      end

      it "deletes #{service}" do
@@ -114,7 +114,7 @@ RSpec.describe API::Services do
    describe "GET /projects/:id/services/#{service.dasherize}" do
      include_context service

-     let!(:initialized_service) { initialize_service(service, active: true) }
+     let!(:initialized_service) { initialize_integration(service, active: true) }
      let_it_be(:project2) do
        create(:project, creator_id: user.id, namespace: user.namespace)
@@ -141,7 +141,7 @@ RSpec.describe API::Services do
        expect(json_response['properties'].keys).to match_array(service_instance.api_field_names)
      end

-     it "returns all properties of inactive service #{service}" do
+     it "returns all properties of inactive integration #{service}" do
        deactive_service!

        get api("/projects/#{project.id}/services/#{dashed_service}", user)
@@ -151,16 +151,16 @@ RSpec.describe API::Services do
        expect(json_response['properties'].keys).to match_array(service_instance.api_field_names)
      end

-     it "returns not found if service does not exist" do
+     it "returns not found if integration does not exist" do
        get api("/projects/#{project2.id}/services/#{dashed_service}", user)

        expect(response).to have_gitlab_http_status(:not_found)
        expect(json_response['message']).to eq('404 Service Not Found')
      end

-     it "returns not found if service exists but is in `Project#disabled_services`" do
+     it "returns not found if service exists but is in `Project#disabled_integrations`" do
        expect_next_found_instance_of(Project) do |project|
-         expect(project).to receive(:disabled_services).at_least(:once).and_return([service])
+         expect(project).to receive(:disabled_integrations).at_least(:once).and_return([service])
        end

        get api("/projects/#{project.id}/services/#{dashed_service}", user)

View File

@@ -394,11 +394,11 @@ RSpec.describe Projects::Operations::UpdateService do
          }
        end

-       it 'uses Project#find_or_initialize_service to include instance defined defaults and pass them to Projects::UpdateService', :aggregate_failures do
+       it 'uses Project#find_or_initialize_integration to include instance defined defaults and pass them to Projects::UpdateService', :aggregate_failures do
          project_update_service = double(Projects::UpdateService)

          expect(project)
-           .to receive(:find_or_initialize_service)
+           .to receive(:find_or_initialize_integration)
            .with('prometheus')
            .and_return(prometheus_integration)
          expect(Projects::UpdateService).to receive(:new) do |project_arg, user_arg, update_params_hash|
@@ -413,13 +413,13 @@ RSpec.describe Projects::Operations::UpdateService do
        end
      end

-     context 'prometheus params were not passed into service' do
+     context 'when prometheus params are not passed into service' do
        let(:params) { { something: :else } }

        it 'does not pass any prometheus params into Projects::UpdateService', :aggregate_failures do
          project_update_service = double(Projects::UpdateService)

-         expect(project).not_to receive(:find_or_initialize_service)
+         expect(project).not_to receive(:find_or_initialize_integration)
          expect(Projects::UpdateService)
            .to receive(:new)
            .with(project, user, {})

View File

@@ -1,6 +1,6 @@
 # frozen_string_literal: true

-Integration.available_services_names.each do |service|
+Integration.available_integration_names.each do |service|
  RSpec.shared_context service do
    include JiraServiceHelper if service == 'jira'
@@ -49,12 +49,12 @@ Integration.available_services_names.each do |service|
      stub_jira_integration_test if service == 'jira'
    end

-   def initialize_service(service, attrs = {})
-     service_item = project.find_or_initialize_service(service)
-     service_item.attributes = attrs
-     service_item.properties = service_attrs
-     service_item.save!
-     service_item
+   def initialize_integration(integration, attrs = {})
+     record = project.find_or_initialize_integration(integration)
+     record.attributes = attrs
+     record.properties = service_attrs
+     record.save!
+     record
    end

    private
@@ -66,7 +66,7 @@ Integration.available_services_names.each do |service|
      return unless licensed_feature

      stub_licensed_features(licensed_feature => true)
-     project.clear_memoization(:disabled_services)
+     project.clear_memoization(:disabled_integrations)
    end
  end
end
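
The `initialize_integration` helper above relies on `Project#find_or_initialize_integration`, a find-or-initialize lookup keyed by integration name. As a rough stand-alone illustration of that pattern (the classes and names below are hypothetical stand-ins, not GitLab's actual implementation):

```ruby
# Hypothetical sketch of a find-or-initialize lookup keyed by name.
# Repeated lookups for the same name return the same in-memory record,
# so callers can assign attributes and then persist the record.
Integration = Struct.new(:name, :active, :properties)

class Project
  def initialize
    @integrations = {}
  end

  def find_or_initialize_integration(name)
    @integrations[name] ||= Integration.new(name, false, {})
  end
end

project = Project.new
record = project.find_or_initialize_integration('emails_on_push')
record.active = true

# The same record comes back on the next lookup.
project.find_or_initialize_integration('emails_on_push').active # => true
```

Memoizing per name is what lets the shared context initialize an integration once and have later expectations observe the same record.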

View File

@@ -0,0 +1,92 @@
# frozen_string_literal: true

require 'spec_helper'

RSpec.describe Packages::Helm::ExtractionWorker, type: :worker do
  describe '#perform' do
    let_it_be(:package) { create(:helm_package, without_package_files: true, status: 'processing') }

    let!(:package_file) { create(:helm_package_file, without_loaded_metadatum: true, package: package) }
    let(:package_file_id) { package_file.id }
    let(:channel) { 'stable' }

    let(:expected_metadata) do
      {
        'apiVersion' => 'v2',
        'description' => 'File, Block, and Object Storage Services for your Cloud-Native Environment',
        'icon' => 'https://rook.io/images/rook-logo.svg',
        'name' => 'rook-ceph',
        'sources' => ['https://github.com/rook/rook'],
        'version' => 'v1.5.8'
      }
    end

    subject { described_class.new.perform(channel, package_file_id) }

    shared_examples 'handling error' do
      it 'marks the package as errored', :aggregate_failures do
        expect(Gitlab::ErrorTracking).to receive(:log_exception).with(
          instance_of(Packages::Helm::ExtractFileMetadataService::ExtractionError),
          project_id: package_file.package.project_id
        )
        expect { subject }
          .to not_change { Packages::Package.count }
          .and not_change { Packages::PackageFile.count }
          .and change { package.reload.status }.from('processing').to('error')
      end
    end

    context 'with valid package file' do
      it_behaves_like 'an idempotent worker' do
        let(:job_args) { [channel, package_file_id] }

        it 'updates package and package file', :aggregate_failures do
          expect(Gitlab::ErrorTracking).not_to receive(:log_exception)
          expect { subject }
            .to not_change { Packages::Package.count }
            .and not_change { Packages::PackageFile.count }
            .and change { Packages::Helm::FileMetadatum.count }.from(0).to(1)
            .and change { package.reload.status }.from('processing').to('default')

          helm_file_metadatum = package_file.helm_file_metadatum

          expect(helm_file_metadatum.channel).to eq(channel)
          expect(helm_file_metadatum.metadata).to eq(expected_metadata)
        end
      end
    end

    context 'with invalid package file id' do
      let(:package_file_id) { 5555 }

      it "doesn't update helm_file_metadatum", :aggregate_failures do
        expect { subject }
          .to not_change { Packages::Package.count }
          .and not_change { Packages::PackageFile.count }
          .and not_change { Packages::Helm::FileMetadatum.count }
          .and not_change { package.reload.status }
      end
    end

    context 'with an empty package file' do
      before do
        expect_next_instance_of(Gem::Package::TarReader) do |tar_reader|
          expect(tar_reader).to receive(:each).and_return([])
        end
      end

      it_behaves_like 'handling error'
    end

    context 'with an invalid YAML' do
      before do
        expect_next_instance_of(Gem::Package::TarReader::Entry) do |entry|
          expect(entry).to receive(:read).and_return('{')
        end
      end

      it_behaves_like 'handling error'
    end
  end
end
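
The worker spec above exercises metadata extraction from a Helm package file, stubbing `Gem::Package::TarReader` for the error cases. For orientation, here is a minimal stand-alone sketch of how a chart's `Chart.yaml` can be read out of a `.tgz` archive with Ruby's stdlib; the method name is hypothetical, and this is not the actual `Packages::Helm::ExtractFileMetadataService` implementation:

```ruby
require 'rubygems/package'
require 'zlib'
require 'yaml'

# Walk the gzipped tarball and parse the first Chart.yaml entry found.
# Returns a Hash of chart metadata, or nil if no Chart.yaml exists.
def extract_chart_metadata(tgz_path)
  Zlib::GzipReader.open(tgz_path) do |gz|
    Gem::Package::TarReader.new(gz).each do |entry|
      next unless entry.file? && File.basename(entry.full_name) == 'Chart.yaml'

      return YAML.safe_load(entry.read)
    end
  end
  nil
end
```

A corrupt archive or malformed YAML raises (`Zlib::GzipFile::Error`, `Psych::SyntaxError`), which mirrors the failure paths the `'handling error'` shared examples cover: the worker catches extraction errors, logs them, and flips the package status to `error`.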