Add latest changes from gitlab-org/gitlab@master

This commit is contained in:
parent 676430584d
commit d04f2be14d
@@ -13,62 +13,28 @@
 <!-- Link related issues below. -->
 
-## Author's checklist (required)
+## Author's checklist
 
 - [ ] Follow the [Documentation Guidelines](https://docs.gitlab.com/ee/development/documentation/) and [Style Guide](https://docs.gitlab.com/ee/development/documentation/styleguide/).
-- If you have **Developer** permissions or higher:
-  - [ ] Ensure that the [product tier badge](https://docs.gitlab.com/ee/development/documentation/styleguide/index.html#product-tier-badges) is added to doc's `h1`.
-  - [ ] Apply the ~documentation label, plus:
-    - The corresponding DevOps stage and group labels, if applicable.
-    - ~"development guidelines" when changing docs under `doc/development/*`, `CONTRIBUTING.md`, or `README.md`.
-    - ~"development guidelines" and ~"Documentation guidelines" when changing docs under `development/documentation/*`.
-    - ~"development guidelines" and ~"Description templates (.gitlab/\*)" when creating/updating issue and MR description templates.
-  - [ ] [Request a review](https://docs.gitlab.com/ee/development/code_review.html#dogfooding-the-reviewers-feature)
-    from the [designated Technical Writer](https://about.gitlab.com/handbook/engineering/ux/technical-writing/#assignments).
+- [ ] Ensure that the [product tier badge](https://docs.gitlab.com/ee/development/documentation/styleguide/index.html#product-tier-badges) is added to doc's `h1`.
+- [ ] [Request a review](https://docs.gitlab.com/ee/development/code_review.html#dogfooding-the-reviewers-feature) based on the documentation page's metadata and [associated Technical Writer](https://about.gitlab.com/handbook/product/categories/#devops-stages).
 
-/label ~documentation
-/assign me
-
-Do not add the ~"feature", ~"frontend", ~"backend", ~"bug", or ~"database" labels if you are only updating documentation. These labels will cause the MR to be added to code verification QA issues.
-
-When applicable:
-
-- [ ] Update the [permissions table](https://docs.gitlab.com/ee/user/permissions.html).
-- [ ] Link docs to and from the higher-level index page, plus other related docs where helpful.
-- [ ] Add the [product tier badge](https://docs.gitlab.com/ee/development/documentation/styleguide/index.html#product-tier-badges) accordingly.
-- [ ] Add [GitLab's version history note(s)](https://docs.gitlab.com/ee/development/documentation/styleguide/index.html#gitlab-versions).
-- [ ] Add/update the [feature flag section](https://docs.gitlab.com/ee/development/documentation/feature_flags.html).
+To avoid having this MR be added to code verification QA issues, don't add these labels: ~"feature", ~"frontend", ~"backend", ~"bug", or ~"database"
 
 ## Review checklist
 
-All reviewers can help ensure accuracy, clarity, completeness, and adherence to the [Documentation Guidelines](https://docs.gitlab.com/ee/development/documentation/) and [Style Guide](https://docs.gitlab.com/ee/development/documentation/styleguide/).
+Documentation-related MRs should be reviewed by a Technical Writer for a non-blocking review, based on [Documentation Guidelines](https://docs.gitlab.com/ee/development/documentation/) and the [Style Guide](https://docs.gitlab.com/ee/development/documentation/styleguide/).
 
-**1. Primary Reviewer**
-
-* [ ] Review by a code reviewer or other selected colleague to confirm accuracy, clarity, and completeness. This can be skipped for minor fixes without substantive content changes.
-
-**2. Technical Writer**
-
-- [ ] Technical writer review. If not requested for this MR, must be scheduled post-merge. To request for this MR, assign the writer listed for the applicable [DevOps stage](https://about.gitlab.com/handbook/product/categories/#devops-stages).
-  - [ ] Ensure docs metadata are present and up-to-date.
-  - [ ] Ensure ~"Technical Writing" and ~"documentation" are added.
-  - [ ] Add the corresponding `docs::` [scoped label](https://gitlab.com/groups/gitlab-org/-/labels?subscribed=&search=docs%3A%3A).
-  - [ ] If working on UI text, add the corresponding `UI Text` [scoped label](https://gitlab.com/groups/gitlab-org/-/labels?subscribed=&search=ui+text).
-  - [ ] Add ~"tw::doing" when starting work on the MR.
-  - [ ] Add ~"tw::finished" if Technical Writing team work on the MR is complete but it remains open.
-
-For more information about labels, see [Technical Writing workflows - Labels](https://about.gitlab.com/handbook/engineering/ux/technical-writing/workflow/#labels).
-
-For suggestions that you are confident don't need to be reviewed, change them locally
-and push a commit directly to save others from unneeded reviews. For example:
-
-- Clear typos, like `this is a typpo`.
-- Minor issues, like single quotes instead of double quotes, Oxford commas, and periods.
-
-For more information, see our documentation on [Merging a merge request](https://docs.gitlab.com/ee/development/code_review.html#merging-a-merge-request).
-
-**3. Maintainer**
-
-1. [ ] Review by assigned maintainer, who can always request/require the above reviews. Maintainer's review can occur before or after a technical writer review.
-1. [ ] Ensure a release milestone is set.
-1. [ ] If there has not been a technical writer review, [create an issue for one using the Doc Review template](https://gitlab.com/gitlab-org/gitlab/issues/new?issuable_template=Doc%20Review).
+- [ ] If the content requires it, ensure the information is reviewed by a subject matter expert.
+- Technical writer review items:
+  - [ ] Ensure docs metadata is present and up-to-date.
+  - [ ] Ensure the appropriate [labels](https://about.gitlab.com/handbook/engineering/ux/technical-writing/workflow/#labels) are added to this MR.
+  - If relevant to this MR, ensure [content topic type](https://docs.gitlab.com/ee/development/documentation/structure.html) principles are in use, including:
+    - [ ] The headings should be something you'd do a Google search for. Instead of `Default behavior`, say something like `Default behavior when you close an issue`.
+    - [ ] The headings (other than the page title) should be active. Instead of `Configuring GDK`, say something like `Configure GDK`.
+    - [ ] Any task steps should be written as a numbered list.
+- [ ] Review by assigned maintainer, who can always request/require the above reviews. Maintainer's review can occur before or after a technical writer review.
+- [ ] Ensure a release milestone is set.
+
+/label ~documentation
+/assign me
@@ -1,5 +1,11 @@
-query getBlobContent($projectPath: ID!, $path: String, $ref: String!) {
-  blobContent(projectPath: $projectPath, path: $path, ref: $ref) @client {
-    rawData
+query getBlobContent($projectPath: ID!, $path: String!, $ref: String) {
+  project(fullPath: $projectPath) {
+    repository {
+      blobs(paths: [$path], ref: $ref) {
+        nodes {
+          rawBlob
+        }
+      }
+    }
   }
 }
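The query above moves the blob lookup from a client-side `blobContent` field to the real `project.repository.blobs` GraphQL tree. A minimal sketch of unwrapping that deeper response shape with optional chaining — the response object here is a hypothetical fixture, not real API output:

```javascript
// Hypothetical fixture shaped like the new getBlobContent response.
const data = {
  project: {
    repository: {
      blobs: { nodes: [{ rawBlob: 'stages:\n  - test\n' }] },
    },
  },
};

// Optional chaining walks the nested shape; a missing level yields undefined
// instead of throwing.
const rawBlob = data?.project?.repository?.blobs?.nodes[0]?.rawBlob;
console.log(rawBlob); // the raw file content string

// A repository with no matching blob returns an empty nodes array,
// and the same expression stays safe.
const empty = { project: { repository: { blobs: { nodes: [] } } } };
console.log(empty?.project?.repository?.blobs?.nodes[0]?.rawBlob); // undefined
```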
@@ -1,20 +1,9 @@
 import produce from 'immer';
-import Api from '~/api';
 import axios from '~/lib/utils/axios_utils';
 import getCurrentBranchQuery from './queries/client/current_branch.graphql';
 import getLastCommitBranchQuery from './queries/client/last_commit_branch.query.graphql';
 
 export const resolvers = {
-  Query: {
-    blobContent(_, { projectPath, path, ref }) {
-      return {
-        __typename: 'BlobContent',
-        rawData: Api.getRawFile(projectPath, path, { ref }).then(({ data }) => {
-          return data;
-        }),
-      };
-    },
-  },
   Mutation: {
     lintCI: (_, { endpoint, content, dry_run }) => {
       return axios.post(endpoint, { content, dry_run }).then(({ data }) => ({
@@ -1,7 +1,6 @@
 <script>
 import { GlLoadingIcon } from '@gitlab/ui';
 import { fetchPolicies } from '~/lib/graphql';
-import httpStatusCodes from '~/lib/utils/http_status';
 import { s__ } from '~/locale';
 
 import { unwrapStagesWithNeeds } from '~/pipelines/components/unwrapping_utils';

@@ -76,22 +75,40 @@ export default {
       };
     },
     update(data) {
-      return data?.blobContent?.rawData;
+      return data?.project?.repository?.blobs?.nodes[0]?.rawBlob;
     },
     result({ data }) {
-      const fileContent = data?.blobContent?.rawData ?? '';
+      const nodes = data?.project?.repository?.blobs?.nodes;
+      if (!nodes) {
+        this.reportFailure(LOAD_FAILURE_UNKNOWN);
+      } else {
+        const rawBlob = nodes[0]?.rawBlob;
+        const fileContent = rawBlob ?? '';
 
-      this.lastCommittedContent = fileContent;
-      this.currentCiFileContent = fileContent;
+        this.lastCommittedContent = fileContent;
+        this.currentCiFileContent = fileContent;
 
-      // make sure to reset the start screen flag during a refetch
-      // e.g. when switching branches
-      if (fileContent.length) {
-        this.showStartScreen = false;
-      }
+        // If rawBlob is defined and returns a string, it means that there is
+        // a CI config file with empty content. If `rawBlob` is not defined
+        // at all, it means there was no file found.
+        const hasCIFile = rawBlob === '' || fileContent.length > 0;
+
+        if (!fileContent.length) {
+          this.setAppStatus(EDITOR_APP_STATUS_EMPTY);
+        }
+
+        if (!hasCIFile) {
+          this.showStartScreen = true;
+        } else if (fileContent.length) {
+          // If the file content is > 0, then we make sure to reset the
+          // start screen flag during a refetch
+          // e.g. when switching branches
+          this.showStartScreen = false;
+        }
+      }
     },
-    error(error) {
-      this.handleBlobContentError(error);
+    error() {
+      this.reportFailure(LOAD_FAILURE_UNKNOWN);
     },
     watchLoading(isLoading) {
       if (isLoading) {
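The new `hasCIFile` check relies on the difference between an empty string (the blob exists but is empty) and `undefined` (no blob at all). A standalone sketch of that predicate, with illustrative inputs:

```javascript
// Mirrors the hasCIFile logic above: '' means a CI file exists with empty
// content; undefined means no CI file was found.
function hasCIFile(rawBlob) {
  // Nullish coalescing keeps fileContent a string in every case.
  const fileContent = rawBlob ?? '';
  return rawBlob === '' || fileContent.length > 0;
}

console.log(hasCIFile('stages: [test]')); // true: file with content
console.log(hasCIFile('')); // true: empty file still exists
console.log(hasCIFile(undefined)); // false: no file found
```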
@@ -187,22 +204,6 @@ export default {
     },
   },
   methods: {
-    handleBlobContentError(error = {}) {
-      const { networkError } = error;
-
-      const { response } = networkError;
-      // 404 for missing CI file
-      // 400 for blank projects with no repository
-      if (
-        response?.status === httpStatusCodes.NOT_FOUND ||
-        response?.status === httpStatusCodes.BAD_REQUEST
-      ) {
-        this.setAppStatus(EDITOR_APP_STATUS_EMPTY);
-        this.showStartScreen = true;
-      } else {
-        this.reportFailure(LOAD_FAILURE_UNKNOWN);
-      }
-    },
     hideFailure() {
       this.showFailure = false;
     },
@@ -101,9 +101,6 @@ export default {
     showJobLinks() {
       return !this.isStageView && this.showLinks;
     },
-    shouldShowStageName() {
-      return !this.isStageView;
-    },
     // The show downstream check prevents showing redundant linked columns
     showDownstreamPipelines() {
       return (

@@ -202,7 +199,7 @@ export default {
       :groups="column.groups"
       :action="column.status.action"
       :highlighted-jobs="highlightedJobs"
-      :show-stage-name="shouldShowStageName"
+      :is-stage-view="isStageView"
       :job-hovered="hoveredJobName"
       :source-job-hovered="hoveredSourceJobName"
       :pipeline-expanded="pipelineExpanded"
@@ -40,6 +40,11 @@ export default {
       required: false,
       default: () => [],
     },
+    isStageView: {
+      type: Boolean,
+      required: false,
+      default: false,
+    },
     jobHovered: {
       type: String,
       required: false,

@@ -50,11 +55,6 @@ export default {
       required: false,
       default: () => ({}),
     },
-    showStageName: {
-      type: Boolean,
-      required: false,
-      default: false,
-    },
     sourceJobHovered: {
       type: String,
       required: false,
@@ -73,6 +73,12 @@ export default {
     'gl-pl-3',
   ],
   computed: {
+    canUpdatePipeline() {
+      return this.userPermissions.updatePipeline;
+    },
+    columnSpacingClass() {
+      return this.isStageView ? 'gl-px-6' : 'gl-px-9';
+    },
     /*
       currentGroups and filteredGroups are part of
      a test to hunt down a bug

@@ -94,8 +100,8 @@ export default {
     hasAction() {
       return !isEmpty(this.action);
     },
-    canUpdatePipeline() {
-      return this.userPermissions.updatePipeline;
+    showStageName() {
+      return !this.isStageView;
     },
   },
   errorCaptured(err, _vm, info) {

@@ -130,7 +136,7 @@ export default {
   };
 </script>
 <template>
-  <main-graph-wrapper class="gl-px-6" data-testid="stage-column">
+  <main-graph-wrapper :class="columnSpacingClass" data-testid="stage-column">
     <template #stages>
       <div
         data-testid="stage-column-title"
@@ -75,11 +75,11 @@ export const generateLinksData = ({ links }, containerID, modifier = '') => {
     // until we can safely draw the bezier to look nice.
     // The adjustment number here is a magic number to make things
     // look nice and should change if the padding changes. This goes well
-    // with gl-px-6. gl-px-8 is more like 100.
-    const straightLineDestinationX = targetNodeX - 60;
+    // with gl-px-9 which we translate with 100px here.
+    const straightLineDestinationX = targetNodeX - 100;
     const controlPointX = straightLineDestinationX + (targetNodeX - straightLineDestinationX) / 2;
 
-    if (straightLineDestinationX > 0) {
+    if (straightLineDestinationX > firstPointCoordinateX) {
      path.lineTo(straightLineDestinationX, sourceNodeY);
    }
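The "magic number" comment above describes the geometry: the link is drawn straight until a fixed offset before the target node, and the bezier control point sits halfway through the remaining span. A small sketch of that arithmetic with made-up coordinates:

```javascript
// Illustrative coordinates only; the real values come from the rendered nodes.
const targetNodeX = 300;

// Straight segment ends 100px before the target (matches gl-px-9 padding).
const straightLineDestinationX = targetNodeX - 100;

// Control point sits midway between the end of the straight segment
// and the target node, giving the curve a gentle entry.
const controlPointX =
  straightLineDestinationX + (targetNodeX - straightLineDestinationX) / 2;

console.log(straightLineDestinationX, controlPointX); // 200 250
```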
@@ -14,7 +14,7 @@ export default {
     type: Number,
     required: true,
   },
-  isHighlighted: {
+  isHovered: {
     type: Boolean,
     required: false,
     default: false,

@@ -42,7 +42,7 @@ export default {
   jobPillClasses() {
     return [
       { 'gl-opacity-3': this.isFadedOut },
-      this.isHighlighted ? 'gl-shadow-blue-200-x0-y0-b4-s2' : 'gl-inset-border-2-green-400',
+      { 'gl-bg-gray-50 gl-inset-border-1-gray-200': this.isHovered },
     ];
   },
 },

@@ -57,10 +57,11 @@ export default {
   };
 </script>
 <template>
   <div class="gl-w-full">
+    <tooltip-on-truncate :title="jobName" truncate-target="child" placement="top">
       <div
         :id="id"
-        class="gl-w-15 gl-bg-white gl-text-center gl-text-truncate gl-rounded-pill gl-mb-3 gl-px-5 gl-py-2 gl-relative gl-z-index-1 gl-transition-duration-slow gl-transition-timing-function-ease"
+        class="gl-bg-white gl-inset-border-1-gray-100 gl-text-center gl-text-truncate gl-rounded-6 gl-mb-3 gl-px-5 gl-py-3 gl-relative gl-z-index-1 gl-transition-duration-slow gl-transition-timing-function-ease"
         :class="jobPillClasses"
         @mouseover="onMouseEnter"
         @mouseleave="onMouseLeave"

@@ -68,4 +69,5 @@ export default {
         {{ jobName }}
       </div>
+    </tooltip-on-truncate>
   </div>
 </template>
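The new `jobPillClasses` returns an array of object entries, which Vue resolves by keeping only the keys whose values are truthy. A standalone sketch of that resolution — the flattening helper is ours, written to imitate Vue's class-binding behavior:

```javascript
// Mirrors the computed property above, minus the component context.
function jobPillClasses({ isFadedOut, isHovered }) {
  return [
    { 'gl-opacity-3': isFadedOut },
    { 'gl-bg-gray-50 gl-inset-border-1-gray-200': isHovered },
  ];
}

// Hand-rolled stand-in for Vue's class-array resolution:
// keep each key whose value is truthy.
const resolve = (classes) =>
  classes.flatMap((entry) =>
    Object.entries(entry)
      .filter(([, on]) => on)
      .map(([name]) => name),
  );

console.log(resolve(jobPillClasses({ isFadedOut: false, isHovered: true })));
// ['gl-bg-gray-50 gl-inset-border-1-gray-200']
```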
@@ -4,14 +4,14 @@ import { __ } from '~/locale';
 import { DRAW_FAILURE, DEFAULT } from '../../constants';
 import LinksLayer from '../graph_shared/links_layer.vue';
 import JobPill from './job_pill.vue';
-import StagePill from './stage_pill.vue';
+import StageName from './stage_name.vue';
 
 export default {
   components: {
     GlAlert,
     JobPill,
     LinksLayer,
-    StagePill,
+    StageName,
   },
   CONTAINER_REF: 'PIPELINE_GRAPH_CONTAINER_REF',
   BASE_CONTAINER_ID: 'pipeline-graph-container',

@@ -21,6 +21,11 @@ export default {
     [DRAW_FAILURE]: __('Could not draw the lines for job relationships'),
     [DEFAULT]: __('An unknown error occurred.'),
   },
+  // The combination of gl-w-full gl-min-w-full and gl-max-w-15 is necessary.
+  // The max width and the width make sure the ellipsis to work and the min width
+  // is for when there is less text than the stage column width (which the width 100% does not fix)
+  jobWrapperClasses:
+    'gl-display-flex gl-flex-direction-column gl-align-items-center gl-w-full gl-px-8 gl-min-w-full gl-max-w-15',
   props: {
     pipelineData: {
       required: true,

@@ -85,23 +90,8 @@ export default {
         height: this.$refs[this.$options.CONTAINER_REF].scrollHeight,
       };
     },
-    getStageBackgroundClasses(index) {
-      const { length } = this.pipelineStages;
-      // It's possible for a graph to have only one stage, in which
-      // case we concatenate both the left and right rounding classes
-      if (length === 1) {
-        return 'gl-rounded-bottom-left-6 gl-rounded-top-left-6 gl-rounded-bottom-right-6 gl-rounded-top-right-6';
-      }
-
-      if (index === 0) {
-        return 'gl-rounded-bottom-left-6 gl-rounded-top-left-6';
-      }
-
-      if (index === length - 1) {
-        return 'gl-rounded-bottom-right-6 gl-rounded-top-right-6';
-      }
-
-      return '';
+    isFadedOut(jobName) {
+      return this.highlightedJobs.length > 1 && !this.isJobHighlighted(jobName);
     },
     isJobHighlighted(jobName) {
       return this.highlightedJobs.includes(jobName);

@@ -137,7 +127,12 @@ export default {
     >
       {{ failure.text }}
     </gl-alert>
-    <div :id="containerId" :ref="$options.CONTAINER_REF" data-testid="graph-container">
+    <div
+      :id="containerId"
+      :ref="$options.CONTAINER_REF"
+      class="gl-bg-gray-10 gl-overflow-auto"
+      data-testid="graph-container"
+    >
       <links-layer
         :pipeline-data="pipelineStages"
         :pipeline-id="$options.PIPELINE_ID"

@@ -152,23 +147,17 @@ export default {
           :key="`${stage.name}-${index}`"
           class="gl-flex-direction-column"
         >
-          <div
-            class="gl-display-flex gl-align-items-center gl-bg-white gl-w-full gl-px-8 gl-py-4 gl-mb-5"
-            :class="getStageBackgroundClasses(index)"
-            data-testid="stage-background"
-          >
-            <stage-pill :stage-name="stage.name" :is-empty="stage.groups.length === 0" />
+          <div class="gl-display-flex gl-align-items-center gl-w-full gl-px-9 gl-py-4 gl-mb-5">
+            <stage-name :stage-name="stage.name" />
           </div>
-          <div
-            class="gl-display-flex gl-flex-direction-column gl-align-items-center gl-w-full gl-px-8"
-          >
+          <div :class="$options.jobWrapperClasses">
             <job-pill
               v-for="group in stage.groups"
               :key="group.name"
               :job-name="group.name"
               :pipeline-id="$options.PIPELINE_ID"
-              :is-highlighted="hasHighlightedJob && isJobHighlighted(group.name)"
-              :is-faded-out="hasHighlightedJob && !isJobHighlighted(group.name)"
+              :is-hovered="highlightedJob === group.name"
+              :is-faded-out="isFadedOut(group.name)"
               @on-mouse-enter="setHoveredJob"
               @on-mouse-leave="removeHoveredJob"
             />
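The new `isFadedOut` rule above fades a pill only when more than one job is highlighted and the pill is not among them. A standalone sketch of those two predicates, with illustrative job names:

```javascript
// Illustrative highlighted set; in the component this comes from props.
const highlightedJobs = ['build', 'test'];

const isJobHighlighted = (jobName) => highlightedJobs.includes(jobName);

// Fade only when a multi-job highlight is active and this job is excluded.
const isFadedOut = (jobName) =>
  highlightedJobs.length > 1 && !isJobHighlighted(jobName);

console.log(isFadedOut('deploy')); // true: not in the highlighted set
console.log(isFadedOut('build')); // false: highlighted itself
```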
@@ -1,4 +1,5 @@
 <script>
+import { capitalize, escape } from 'lodash';
 import tooltipOnTruncate from '~/vue_shared/components/tooltip_on_truncate.vue';
 
 export default {

@@ -10,26 +11,18 @@ export default {
       type: String,
       required: true,
     },
-    isEmpty: {
-      type: Boolean,
-      required: false,
-      default: false,
-    },
   },
   computed: {
-    emptyClass() {
-      return this.isEmpty ? 'gl-bg-gray-200' : 'gl-bg-gray-600';
+    formattedTitle() {
+      return capitalize(escape(this.stageName));
     },
   },
 };
 </script>
 <template>
   <tooltip-on-truncate :title="stageName" truncate-target="child" placement="top">
-    <div
-      class="gl-px-5 gl-py-2 gl-text-white gl-text-center gl-text-truncate gl-rounded-pill gl-w-20"
-      :class="emptyClass"
-    >
-      {{ stageName }}
+    <div class="gl-py-2 gl-text-truncate gl-font-weight-bold gl-w-20">
+      {{ formattedTitle }}
     </div>
   </tooltip-on-truncate>
 </template>
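`formattedTitle` escapes the stage name first, then capitalizes it; lodash's `capitalize` also downcases the rest of the string. A sketch with hand-rolled stand-ins so it runs without lodash (they only approximate the lodash behavior needed here):

```javascript
// Minimal stand-in for lodash escape: handles &, <, > only.
const escape = (s) =>
  s.replace(/&/g, '&amp;').replace(/</g, '&lt;').replace(/>/g, '&gt;');

// Minimal stand-in for lodash capitalize: first char up, rest down.
const capitalize = (s) => s.charAt(0).toUpperCase() + s.slice(1).toLowerCase();

// Escape before capitalizing, matching capitalize(escape(stageName)) above.
console.log(capitalize(escape('build'))); // 'Build'
console.log(capitalize(escape('a<b'))); // 'A&lt;b'
```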
@@ -7,7 +7,7 @@ class Projects::MattermostsController < Projects::ApplicationController
   layout 'project_settings'
 
   before_action :authorize_admin_project!
-  before_action :service
+  before_action :integration
   before_action :teams, only: [:new]
 
   feature_category :integrations

@@ -16,11 +16,11 @@ class Projects::MattermostsController < Projects::ApplicationController
   end
 
   def create
-    result, message = @service.configure(current_user, configure_params)
+    result, message = integration.configure(current_user, configure_params)
 
     if result
       flash[:notice] = 'This service is now configured'
-      redirect_to edit_project_service_path(@project, service)
+      redirect_to edit_project_service_path(@project, integration)
     else
       flash[:alert] = message || 'Failed to configure service'
       redirect_to new_project_mattermost_path(@project)

@@ -31,15 +31,16 @@ class Projects::MattermostsController < Projects::ApplicationController
 
   def configure_params
     params.require(:mattermost).permit(:trigger, :team_id).merge(
-      url: service_trigger_url(@service),
+      url: service_trigger_url(integration),
       icon_url: asset_url('slash-command-logo.png', skip_pipeline: true))
   end
 
   def teams
-    @teams, @teams_error_message = @service.list_teams(current_user)
+    @teams, @teams_error_message = integration.list_teams(current_user)
   end
 
-  def service
-    @service ||= @project.find_or_initialize_service('mattermost_slash_commands')
+  def integration
+    @integration ||= @project.find_or_initialize_integration('mattermost_slash_commands')
+    @service = @integration # TODO: remove when https://gitlab.com/gitlab-org/gitlab/-/issues/330300 is complete
   end
 end
@@ -1,20 +1,21 @@
 # frozen_string_literal: true
 
 class Projects::ServiceHookLogsController < Projects::HookLogsController
-  before_action :service, only: [:show, :retry]
+  before_action :integration, only: [:show, :retry]
 
   def retry
     execute_hook
-    redirect_to edit_project_service_path(@project, @service)
+    redirect_to edit_project_service_path(@project, @integration)
   end
 
   private
 
   def hook
-    @hook ||= service.service_hook
+    @hook ||= integration.service_hook
   end
 
-  def service
-    @service ||= @project.find_or_initialize_service(params[:service_id])
+  def integration
+    @integration ||= @project.find_or_initialize_integration(params[:service_id])
+    @service = @integration # TODO: remove when https://gitlab.com/gitlab-org/gitlab/-/issues/330300 is complete
   end
 end
@@ -84,7 +84,7 @@ class Projects::ServicesController < Projects::ApplicationController
   end
 
   def integration
-    @integration ||= @project.find_or_initialize_service(params[:id])
+    @integration ||= @project.find_or_initialize_integration(params[:id])
   end
 
   alias_method :service, :integration
@@ -9,7 +9,7 @@ module Projects
     feature_category :integrations
 
     def show
-      @integrations = @project.find_or_initialize_services
+      @integrations = @project.find_or_initialize_integrations
     end
   end
 end
@@ -5,7 +5,7 @@ module Types
   class ServiceTypeEnum < BaseEnum
     graphql_name 'ServiceType'
 
-    ::Integration.available_services_types(include_dev: false).each do |type|
+    ::Integration.available_integration_types(include_dev: false).each do |type|
       value type.underscore.upcase, value: type, description: "#{type} type"
     end
   end
@@ -5,7 +5,7 @@ module OperationsHelper
 
   def prometheus_integration
     strong_memoize(:prometheus_integration) do
-      @project.find_or_initialize_service(::Integrations::Prometheus.to_param)
+      @project.find_or_initialize_integration(::Integrations::Prometheus.to_param)
     end
   end
 
@@ -108,9 +108,9 @@ class Integration < ApplicationRecord
   scope :by_active_flag, -> (flag) { where(active: flag) }
   scope :inherit_from_id, -> (id) { where(inherit_from_id: id) }
   scope :inherit, -> { where.not(inherit_from_id: nil) }
-  scope :for_group, -> (group) { where(group_id: group, type: available_services_types(include_project_specific: false)) }
-  scope :for_template, -> { where(template: true, type: available_services_types(include_project_specific: false)) }
-  scope :for_instance, -> { where(instance: true, type: available_services_types(include_project_specific: false)) }
+  scope :for_group, -> (group) { where(group_id: group, type: available_integration_types(include_project_specific: false)) }
+  scope :for_template, -> { where(template: true, type: available_integration_types(include_project_specific: false)) }
+  scope :for_instance, -> { where(instance: true, type: available_integration_types(include_project_specific: false)) }
 
   scope :push_hooks, -> { where(push_events: true, active: true) }
   scope :tag_push_hooks, -> { where(tag_push_events: true, active: true) }

@@ -217,7 +217,7 @@ class Integration < ApplicationRecord
   private_class_method :create_nonexistent_templates
 
   def self.find_or_initialize_non_project_specific_integration(name, instance: false, group_id: nil)
-    return unless name.in?(available_services_names(include_project_specific: false))
+    return unless name.in?(available_integration_names(include_project_specific: false))
 
     integration_name_to_model(name).find_or_initialize_by(instance: instance, group_id: group_id)
   end

@@ -238,19 +238,19 @@ class Integration < ApplicationRecord
   def self.nonexistent_services_types_for(scope)
     # Using #map instead of #pluck to save one query count. This is because
     # ActiveRecord loaded the object here, so we don't need to query again later.
-    available_services_types(include_project_specific: false) - scope.map(&:type)
+    available_integration_types(include_project_specific: false) - scope.map(&:type)
   end
   private_class_method :nonexistent_services_types_for
 
-  # Returns a list of available service names.
+  # Returns a list of available integration names.
   # Example: ["asana", ...]
   # @deprecated
-  def self.available_services_names(include_project_specific: true, include_dev: true)
-    service_names = services_names
-    service_names += project_specific_services_names if include_project_specific
-    service_names += dev_services_names if include_dev
+  def self.available_integration_names(include_project_specific: true, include_dev: true)
+    names = integration_names
+    names += project_specific_integration_names if include_project_specific
+    names += dev_integration_names if include_dev
 
-    service_names.sort_by(&:downcase)
+    names.sort_by(&:downcase)
   end
 
   def self.integration_names

@@ -261,21 +261,21 @@ class Integration < ApplicationRecord
     integration_names
   end
 
-  def self.dev_services_names
+  def self.dev_integration_names
     return [] unless Rails.env.development?
 
     DEV_INTEGRATION_NAMES
   end
 
-  def self.project_specific_services_names
+  def self.project_specific_integration_names
     PROJECT_SPECIFIC_INTEGRATION_NAMES
   end
 
-  # Returns a list of available service types.
+  # Returns a list of available integration types.
   # Example: ["AsanaService", ...]
-  def self.available_services_types(include_project_specific: true, include_dev: true)
-    available_services_names(include_project_specific: include_project_specific, include_dev: include_dev).map do |service_name|
-      integration_name_to_type(service_name)
+  def self.available_integration_types(include_project_specific: true, include_dev: true)
+    available_integration_names(include_project_specific: include_project_specific, include_dev: include_dev).map do
+      integration_name_to_type(_1)
     end
   end
 
@@ -120,8 +120,6 @@ module Integrations
     end
 
     def execute(data)
-      return if project.disabled_services.include?(to_param)
-
       object_kind = data[:object_kind]
       object_kind = 'job' if object_kind == 'build'
       return unless supported_events.include?(object_kind)
@@ -550,7 +550,7 @@ class Project < ApplicationRecord
   scope :with_namespace, -> { includes(:namespace) }
   scope :with_import_state, -> { includes(:import_state) }
   scope :include_project_feature, -> { includes(:project_feature) }
-  scope :with_service, ->(service) { joins(service).eager_load(service) }
+  scope :with_integration, ->(integration) { joins(integration).eager_load(integration) }
   scope :with_shared_runners, -> { where(shared_runners_enabled: true) }
   scope :with_container_registry, -> { where(container_registry_enabled: true) }
   scope :inside_path, ->(path) do
@@ -1398,22 +1398,22 @@ class Project < ApplicationRecord
     @external_wiki ||= integrations.external_wikis.first
   end
 
-  def find_or_initialize_services
-    available_services_names = Integration.available_services_names - disabled_services
-
-    available_services_names.map do |service_name|
-      find_or_initialize_service(service_name)
-    end.sort_by(&:title)
+  def find_or_initialize_integrations
+    Integration
+      .available_integration_names
+      .difference(disabled_integrations)
+      .map { find_or_initialize_integration(_1) }
+      .sort_by(&:title)
   end
 
-  def disabled_services
+  def disabled_integrations
     []
   end
 
-  def find_or_initialize_service(name)
-    return if disabled_services.include?(name)
+  def find_or_initialize_integration(name)
+    return if disabled_integrations.include?(name)
 
-    find_service(integrations, name) || build_from_instance_or_template(name) || build_service(name)
+    find_integration(integrations, name) || build_from_instance_or_template(name) || build_integration(name)
   end
 
   # rubocop: disable CodeReuse/ServiceClass

@@ -2659,19 +2659,19 @@ class Project < ApplicationRecord
     project_feature.update!(container_registry_access_level: access_level)
   end
 
-  def find_service(services, name)
-    services.find { |service| service.to_param == name }
+  def find_integration(integrations, name)
+    integrations.find { _1.to_param == name }
   end
 
   def build_from_instance_or_template(name)
-    instance = find_service(services_instances, name)
+    instance = find_integration(services_instances, name)
     return Integration.build_from_integration(instance, project_id: id) if instance
 
-    template = find_service(services_templates, name)
+    template = find_integration(services_templates, name)
     return Integration.build_from_integration(template, project_id: id) if template
   end
 
-  def build_service(name)
+  def build_integration(name)
     Integration.integration_name_to_model(name).new(project_id: id)
   end
@@ -193,14 +193,14 @@ module Projects

    # Deprecated: https://gitlab.com/gitlab-org/gitlab/-/issues/326665
    def create_prometheus_integration
-      service = @project.find_or_initialize_service(::Integrations::Prometheus.to_param)
+      integration = @project.find_or_initialize_integration(::Integrations::Prometheus.to_param)

      # If the service has already been inserted in the database, that
      # means it came from a template, and there's nothing more to do.
-      return if service.persisted?
+      return if integration.persisted?

-      if service.prometheus_available?
-        service.save!
+      if integration.prometheus_available?
+        integration.save!
      else
        @project.prometheus_integration = nil
      end
@@ -102,10 +102,10 @@ module Projects
    def prometheus_integration_params
      return {} unless attrs = params[:prometheus_integration_attributes]

-      service = project.find_or_initialize_service(::Integrations::Prometheus.to_param)
-      service.assign_attributes(attrs)
+      integration = project.find_or_initialize_integration(::Integrations::Prometheus.to_param)
+      integration.assign_attributes(attrs)

-      { prometheus_integration_attributes: service.attributes.except(*%w(id project_id created_at updated_at)) }
+      { prometheus_integration_attributes: integration.attributes.except(*%w[id project_id created_at updated_at]) }
    end

    def incident_management_setting_params
@@ -67,7 +67,7 @@ module Projects
    end

    def valid_for_manual?(token)
-      prometheus = project.find_or_initialize_service('prometheus')
+      prometheus = project.find_or_initialize_integration('prometheus')
      return false unless prometheus.manual_configuration?

      if setting = project.alerting_setting
@@ -1309,6 +1309,15 @@
   :idempotent: true
   :tags:
   - :exclude_from_kubernetes
+- :name: package_repositories:packages_helm_extraction
+  :worker_name: Packages::Helm::ExtractionWorker
+  :feature_category: :package_registry
+  :has_external_dependencies:
+  :urgency: :low
+  :resource_boundary: :unknown
+  :weight: 1
+  :idempotent: true
+  :tags: []
 - :name: package_repositories:packages_maven_metadata_sync
   :worker_name: Packages::Maven::Metadata::SyncWorker
   :feature_category: :package_registry
@@ -15,7 +15,7 @@ module Clusters
      return unless cluster

      cluster.all_projects.find_each do |project|
-        project.find_or_initialize_service(service_name).update!(active: true)
+        project.find_or_initialize_integration(service_name).update!(active: true)
      end
    end
  end
@@ -15,7 +15,7 @@ module Clusters
      raise cluster_missing_error(integration_name) unless cluster

      integration = ::Project.integration_association_name(integration_name).to_sym
-      cluster.all_projects.with_service(integration).find_each do |project|
+      cluster.all_projects.with_integration(integration).find_each do |project|
        project.public_send(integration).update!(active: false) # rubocop:disable GitlabSecurity/PublicSend
      end
    end
@@ -0,0 +1,29 @@
+# frozen_string_literal: true
+
+module Packages
+  module Helm
+    class ExtractionWorker
+      include ApplicationWorker
+
+      queue_namespace :package_repositories
+      feature_category :package_registry
+      deduplicate :until_executing
+
+      idempotent!
+
+      def perform(channel, package_file_id)
+        package_file = ::Packages::PackageFile.find_by_id(package_file_id)
+
+        return unless package_file && !package_file.package.default?
+
+        ::Packages::Helm::ProcessFileService.new(channel, package_file).execute
+
+      rescue ::Packages::Helm::ExtractFileMetadataService::ExtractionError,
+             ::Packages::Helm::ProcessFileService::ExtractionError,
+             ::ActiveModel::ValidationError => e
+        Gitlab::ErrorTracking.log_exception(e, project_id: package_file.project_id)
+        package_file.package.update_column(:status, :error)
+      end
+    end
+  end
+end
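The new worker rescues only the expected extraction failures, logs them, and marks the package as errored instead of letting the job retry indefinitely. A plain-Ruby sketch of that control flow, using stub classes in place of the real services:

```ruby
# Stub error and package classes illustrating the worker's
# "rescue expected errors, log, mark package as errored" flow.
class ExtractionError < StandardError; end

class FakePackage
  attr_reader :status

  def initialize
    @status = :processing
  end

  def update_column(_column, value)
    @status = value
  end
end

def process(package, fail_extraction:)
  raise ExtractionError, "bad chart" if fail_extraction

  package.update_column(:status, :default)
rescue ExtractionError => e
  # The real worker calls Gitlab::ErrorTracking.log_exception here.
  warn "extraction failed: #{e.message}"
  package.update_column(:status, :error)
end

ok = FakePackage.new
process(ok, fail_extraction: false)

bad = FakePackage.new
process(bad, fail_extraction: true)

p [ok.status, bad.status] # => [:default, :error]
```

Rescuing a narrow list of error classes keeps genuinely unexpected exceptions visible to the job runner rather than silently swallowing them.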
@@ -21,15 +21,15 @@ module Projects
      private

      def create_prometheus_integration(project)
-        service = project.find_or_initialize_service(::Integrations::Prometheus.to_param)
+        integration = project.find_or_initialize_integration(::Integrations::Prometheus.to_param)

        # If the service has already been inserted in the database, that
        # means it came from a template, and there's nothing more to do.
-        return if service.persisted?
+        return if integration.persisted?

-        return unless service.prometheus_available?
+        return unless integration.prometheus_available?

-        service.save!
+        integration.save!
      rescue ActiveRecord::RecordInvalid => e
        Gitlab::ErrorTracking.track_exception(e, extra: { project_id: project.id })
      end
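Both Prometheus hooks in this commit share an idempotent find-or-create shape: return early if the record is already persisted, check availability, then save. A schematic version with a stub in place of `Integrations::Prometheus`:

```ruby
# Stub standing in for Integrations::Prometheus; only the three
# methods the create hook touches are modeled.
class StubIntegration
  def initialize(persisted:, available:)
    @persisted = persisted
    @available = available
  end

  def persisted?
    @persisted
  end

  def prometheus_available?
    @available
  end

  def save!
    @saved = true
  end
end

def create_prometheus_integration(integration)
  return :already_persisted if integration.persisted?   # came from a template; nothing to do
  return :not_available unless integration.prometheus_available?

  integration.save!
  :saved
end

p create_prometheus_integration(StubIntegration.new(persisted: true, available: true))   # => :already_persisted
p create_prometheus_integration(StubIntegration.new(persisted: false, available: false)) # => :not_available
p create_prometheus_integration(StubIntegration.new(persisted: false, available: true))  # => :saved
```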
@@ -385,7 +385,15 @@ module Gitlab
    initializer :correct_precompile_targets, after: :set_default_precompile do |app|
      app.config.assets.precompile.reject! { |entry| entry == Sprockets::Railtie::LOOSE_APP_ASSETS }

-      asset_roots = [config.root.join("app/assets").to_s]
+      # if two files in assets are named the same, it'll likely resolve to the normal app/assets version.
+      # See https://gitlab.com/gitlab-jh/gitlab/-/merge_requests/27#note_609101582 for more details
+      asset_roots = []
+
+      if Gitlab.jh?
+        asset_roots << config.root.join("jh/app/assets").to_s
+      end
+
+      asset_roots << config.root.join("app/assets").to_s

      if Gitlab.ee?
        asset_roots << config.root.join("ee/app/assets").to_s

@@ -413,16 +421,18 @@ module Gitlab
      end
    end

-    # Add EE assets. They should take precedence over CE. This means if two files exist, e.g.:
+    # Add assets for variants of GitLab. They should take precedence over CE.
+    # This means if multiple files exist, e.g.:
    #
+    # jh/app/assets/stylesheets/example.scss
    # ee/app/assets/stylesheets/example.scss
    # app/assets/stylesheets/example.scss
    #
-    # The ee/ version will be preferred.
-    initializer :prefer_ee_assets, after: :append_assets_path do |app|
-      if Gitlab.ee?
+    # The jh/ version will be preferred.
+    initializer :prefer_specialized_assets, after: :append_assets_path do |app|
+      Gitlab.extensions.each do |extension|
        %w[images javascripts stylesheets].each do |path|
-          app.config.assets.paths.unshift("#{config.root}/ee/app/assets/#{path}")
+          app.config.assets.paths.unshift("#{config.root}/#{extension}/app/assets/#{path}")
        end
      end
    end
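The initializer change relies on `Array#unshift`: roots added later end up earlier in the asset lookup path, so `jh/` shadows `ee/`, which shadows `app/`. A small sketch of that precedence rule (paths are illustrative, not the real Sprockets internals):

```ruby
# Simulate lookup-path resolution: the first root containing the
# requested file wins. Roots unshifted later sit earlier in the list.
paths = ["app/assets"]
["ee", "jh"].each { |ext| paths.unshift("#{ext}/app/assets") }

def resolve(paths, file, existing)
  paths.map { |root| "#{root}/#{file}" }.find { |candidate| existing.include?(candidate) }
end

# Only CE and EE provide this file, so the EE copy shadows the CE copy.
existing = ["app/assets/example.scss", "ee/app/assets/example.scss"]

p paths
# => ["jh/app/assets", "ee/app/assets", "app/assets"]
p resolve(paths, "example.scss", existing)
# => "ee/app/assets/example.scss"
```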
@@ -5,4 +5,4 @@ rollout_issue_url: https://gitlab.com/gitlab-org/gitlab/-/issues/325528
 milestone: '13.12'
 type: development
 group: group::gitaly
-default_enabled: false
+default_enabled: true
@@ -5,4 +5,4 @@ rollout_issue_url: https://gitlab.com/gitlab-org/gitlab/-/issues/333517
 milestone: '14.0'
 type: development
 group: group::gitaly
-default_enabled: false
+default_enabled: true
@@ -6,10 +6,10 @@ info: To determine the technical writer assigned to the Stage/Group associated w

 # Incoming email **(FREE SELF)**

-GitLab has several features based on receiving incoming emails:
+GitLab has several features based on receiving incoming email messages:

 - [Reply by Email](reply_by_email.md): allow GitLab users to comment on issues
-  and merge requests by replying to notification emails.
+  and merge requests by replying to notification email.
 - [New issue by email](../user/project/issues/managing_issues.md#new-issue-via-email):
   allow GitLab users to create a new issue by sending an email to a
   user-specific email address.

@@ -22,9 +22,9 @@ GitLab has several features based on receiving incoming emails:
 ## Requirements

 We recommend using an email address that receives **only** messages that are intended for
-the GitLab instance. Any incoming emails not intended for GitLab receive a reject notice.
+the GitLab instance. Any incoming email messages not intended for GitLab receive a reject notice.

-Handling incoming emails requires an [IMAP](https://en.wikipedia.org/wiki/Internet_Message_Access_Protocol)-enabled
+Handling incoming email messages requires an [IMAP](https://en.wikipedia.org/wiki/Internet_Message_Access_Protocol)-enabled
 email account. GitLab requires one of the following three strategies:

 - Email sub-addressing (recommended)

@@ -53,7 +53,7 @@ leaving a catch-all available for other purposes beyond GitLab.
 ### Catch-all mailbox

 A [catch-all mailbox](https://en.wikipedia.org/wiki/Catch-all) for a domain
-receives all emails addressed to the domain that do not match any addresses that
+receives all email messages addressed to the domain that do not match any addresses that
 exist on the mail server.

 As of GitLab 11.7, catch-all mailboxes support the same features as

@@ -68,7 +68,7 @@ this method only supports replies, and not the other features of [incoming email
 ## Set it up

-If you want to use Gmail / Google Apps for incoming emails, make sure you have
+If you want to use Gmail / Google Apps for incoming email, make sure you have
 [IMAP access enabled](https://support.google.com/mail/answer/7126229)
 and [allowed less secure apps to access the account](https://support.google.com/accounts/answer/6010255)
 or [turn-on 2-step validation](https://support.google.com/accounts/answer/185839)
@@ -12540,6 +12540,7 @@ Represents summary of a security report.
 | <a id="securityreportsummarycoveragefuzzing"></a>`coverageFuzzing` | [`SecurityReportSummarySection`](#securityreportsummarysection) | Aggregated counts for the `coverage_fuzzing` scan. |
 | <a id="securityreportsummarydast"></a>`dast` | [`SecurityReportSummarySection`](#securityreportsummarysection) | Aggregated counts for the `dast` scan. |
 | <a id="securityreportsummarydependencyscanning"></a>`dependencyScanning` | [`SecurityReportSummarySection`](#securityreportsummarysection) | Aggregated counts for the `dependency_scanning` scan. |
+| <a id="securityreportsummaryrunningcontainerscanning"></a>`runningContainerScanning` | [`SecurityReportSummarySection`](#securityreportsummarysection) | Aggregated counts for the `running_container_scanning` scan. |
 | <a id="securityreportsummarysast"></a>`sast` | [`SecurityReportSummarySection`](#securityreportsummarysection) | Aggregated counts for the `sast` scan. |
 | <a id="securityreportsummarysecretdetection"></a>`secretDetection` | [`SecurityReportSummarySection`](#securityreportsummarysection) | Aggregated counts for the `secret_detection` scan. |

@@ -13393,7 +13394,7 @@ Represents a vulnerability.
 | <a id="vulnerabilitynotes"></a>`notes` | [`NoteConnection!`](#noteconnection) | All notes on this noteable. (see [Connections](#connections)) |
 | <a id="vulnerabilityprimaryidentifier"></a>`primaryIdentifier` | [`VulnerabilityIdentifier`](#vulnerabilityidentifier) | Primary identifier of the vulnerability. |
 | <a id="vulnerabilityproject"></a>`project` | [`Project`](#project) | The project on which the vulnerability was found. |
-| <a id="vulnerabilityreporttype"></a>`reportType` | [`VulnerabilityReportType`](#vulnerabilityreporttype) | Type of the security report that found the vulnerability (SAST, DEPENDENCY_SCANNING, CONTAINER_SCANNING, DAST, SECRET_DETECTION, COVERAGE_FUZZING, API_FUZZING). `Scan Type` in the UI. |
+| <a id="vulnerabilityreporttype"></a>`reportType` | [`VulnerabilityReportType`](#vulnerabilityreporttype) | Type of the security report that found the vulnerability (SAST, DEPENDENCY_SCANNING, CONTAINER_SCANNING, DAST, SECRET_DETECTION, COVERAGE_FUZZING, API_FUZZING, RUNNING_CONTAINER_SCANNING). `Scan Type` in the UI. |
 | <a id="vulnerabilityresolvedat"></a>`resolvedAt` | [`Time`](#time) | Timestamp of when the vulnerability state was changed to resolved. |
 | <a id="vulnerabilityresolvedby"></a>`resolvedBy` | [`UserCore`](#usercore) | The user that resolved the vulnerability. |
 | <a id="vulnerabilityresolvedondefaultbranch"></a>`resolvedOnDefaultBranch` | [`Boolean!`](#boolean) | Indicates whether the vulnerability is fixed on the default branch or not. |

@@ -15065,6 +15066,7 @@ The type of the security scan that found the vulnerability.
 | <a id="vulnerabilityreporttypecoverage_fuzzing"></a>`COVERAGE_FUZZING` | |
 | <a id="vulnerabilityreporttypedast"></a>`DAST` | |
 | <a id="vulnerabilityreporttypedependency_scanning"></a>`DEPENDENCY_SCANNING` | |
+| <a id="vulnerabilityreporttyperunning_container_scanning"></a>`RUNNING_CONTAINER_SCANNING` | |
 | <a id="vulnerabilityreporttypesast"></a>`SAST` | |
 | <a id="vulnerabilityreporttypesecret_detection"></a>`SECRET_DETECTION` | |
@@ -17330,6 +17330,18 @@ Status: `data_available`

 Tiers: `ultimate`

+### `usage_activity_by_stage.secure.running_container_scanning_scans`
+
+Counts running container scanning jobs
+
+[YAML definition](https://gitlab.com/gitlab-org/gitlab/-/blob/master/ee/config/metrics/counts_all/20210618124854_running_container_scanning_scans.yml)
+
+Group: `group::container security`
+
+Status: `data_available`
+
+Tiers: `ultimate`
+
 ### `usage_activity_by_stage.secure.sast_scans`

 Counts sast jobs

@@ -19430,6 +19442,30 @@ Status: `data_available`

 Tiers: `ultimate`

+### `usage_activity_by_stage_monthly.secure.running_container_scanning_pipeline`
+
+Pipelines containing a Running Container Scanning job
+
+[YAML definition](https://gitlab.com/gitlab-org/gitlab/-/blob/master/ee/config/metrics/counts_28d/20210618125224_running_container_scanning_pipeline.yml)
+
+Group: `group::container security`
+
+Status: `data_available`
+
+Tiers: `ultimate`
+
+### `usage_activity_by_stage_monthly.secure.running_container_scanning_scans`
+
+Counts running container scanning jobs
+
+[YAML definition](https://gitlab.com/gitlab-org/gitlab/-/blob/master/ee/config/metrics/counts_28d/20210618101233_running_container_scanning_scans.yml)
+
+Group: `group::container security`
+
+Status: `data_available`
+
+Tiers: `ultimate`
+
 ### `usage_activity_by_stage_monthly.secure.sast_pipeline`

 Counts of Pipelines that have at least 1 SAST job
@@ -706,51 +706,53 @@ dast:

 ### Available CI/CD variables

-DAST can be [configured](#customizing-the-dast-settings) using CI/CD variables.
+You can use CI/CD variables to customize DAST.

 | CI/CD variable | Type | Description |
-|:--------------------------------------------|:--------------|:-----------------------------------|
+|:------------------------------------------------|:--------------|:-------------------------------|
 | `SECURE_ANALYZERS_PREFIX` | URL | Set the Docker registry base address from which to download the analyzer. |
-| `DAST_WEBSITE` (**1**) | URL | The URL of the website to scan. `DAST_API_OPENAPI` must be specified if this is omitted. |
+| `DAST_WEBSITE` <sup>1</sup> | URL | The URL of the website to scan. `DAST_API_OPENAPI` must be specified if this is omitted. |
 | `DAST_API_OPENAPI` | URL or string | The API specification to import. The specification can be hosted at a URL, or the name of a file present in the `/zap/wrk` directory. `DAST_WEBSITE` must be specified if this is omitted. |
-| `DAST_API_SPECIFICATION` (**1**) | URL or string | [Deprecated](https://gitlab.com/gitlab-org/gitlab/-/issues/290241) in GitLab 13.12 and replaced by `DAST_API_OPENAPI`. To be removed in GitLab 15.0. The API specification to import. The specification can be hosted at a URL, or the name of a file present in the `/zap/wrk` directory. `DAST_WEBSITE` must be specified if this is omitted. |
-| `DAST_SPIDER_START_AT_HOST` | boolean | Set to `false` to prevent DAST from resetting the target to its host before scanning. When `true`, non-host targets `http://test.site/some_path` is reset to `http://test.site` before scan. Default: `false`. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/258805) in GitLab 13.6. |
-| `DAST_AUTH_URL` (**1**) | URL | The URL of the page containing the sign-in HTML form on the target website. `DAST_USERNAME` and `DAST_PASSWORD` are submitted with the login form to create an authenticated scan. Not supported for API scans. |
-| `DAST_AUTH_VERIFICATION_URL` (**1**) | URL | A URL only accessible to logged in users that DAST can use to confirm successful authentication. If provided, DAST exits if it cannot access the URL. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/207335) in GitLab 13.8. |
-| `DAST_USERNAME` (**1**) | string | The username to enter into the username field on the sign-in HTML form. |
-| `DAST_PASSWORD` (**1**) | string | The password to enter into the password field on the sign-in HTML form. |
-| `DAST_USERNAME_FIELD` (**1**) | selector | A selector describing the username field on the sign-in HTML form. Example: `id:user` |
-| `DAST_PASSWORD_FIELD` (**1**) | selector | A selector describing the password field on the sign-in HTML form. Example: `css:.password-field` |
+| `DAST_API_SPECIFICATION` <sup>1</sup> | URL or string | [Deprecated](https://gitlab.com/gitlab-org/gitlab/-/issues/290241) in GitLab 13.12 and replaced by `DAST_API_OPENAPI`. To be removed in GitLab 15.0. The API specification to import. The specification can be hosted at a URL, or the name of a file present in the `/zap/wrk` directory. `DAST_WEBSITE` must be specified if this is omitted. |
+| `DAST_SPIDER_START_AT_HOST` | boolean | Set to `false` to prevent DAST from resetting the target to its host before scanning. When `true`, non-host targets `http://test.site/some_path` is reset to `http://test.site` before scan. Default: `true`. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/258805) in GitLab 13.6. |
+| `DAST_AUTH_URL` <sup>1</sup> | URL | The URL of the page containing the sign-in HTML form on the target website. `DAST_USERNAME` and `DAST_PASSWORD` are submitted with the login form to create an authenticated scan. Not supported for API scans. |
+| `DAST_AUTH_VERIFICATION_URL` <sup>1</sup> | URL | A URL only accessible to logged in users that DAST can use to confirm successful authentication. If provided, DAST exits if it cannot access the URL. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/207335) in GitLab 13.8. |
+| `DAST_USERNAME` <sup>1</sup> | string | The username to authenticate to in the website. |
+| `DAST_PASSWORD` <sup>1</sup> | string | The password to authenticate to in the website. |
+| `DAST_USERNAME_FIELD` <sup>1</sup> | string | The name of username field at the sign-in HTML form. |
+| `DAST_PASSWORD_FIELD` <sup>1</sup> | string | The name of password field at the sign-in HTML form. |
 | `DAST_SKIP_TARGET_CHECK` | boolean | Set to `true` to prevent DAST from checking that the target is available before scanning. Default: `false`. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/229067) in GitLab 13.8. |
 | `DAST_MASK_HTTP_HEADERS` | string | Comma-separated list of request and response headers to be masked (GitLab 13.1). Must contain **all** headers to be masked. Refer to [list of headers that are masked by default](#hide-sensitive-information). |
-| `DAST_EXCLUDE_URLS` (**1**) | URLs | The URLs to skip during the authenticated scan; comma-separated. Regular expression syntax can be used to match multiple URLs. For example, `.*` matches an arbitrary character sequence. Not supported for API scans. |
-| `DAST_FULL_SCAN_ENABLED` (**1**) | boolean | Set to `true` to run a [ZAP Full Scan](https://github.com/zaproxy/zaproxy/wiki/ZAP-Full-Scan) instead of a [ZAP Baseline Scan](https://github.com/zaproxy/zaproxy/wiki/ZAP-Baseline-Scan). Default: `false` |
+| `DAST_EXCLUDE_URLS` <sup>1</sup> | URLs | The URLs to skip during the authenticated scan; comma-separated. Regular expression syntax can be used to match multiple URLs. For example, `.*` matches an arbitrary character sequence. Not supported for API scans. |
+| `DAST_FULL_SCAN_ENABLED` <sup>1</sup> | boolean | Set to `true` to run a [ZAP Full Scan](https://github.com/zaproxy/zaproxy/wiki/ZAP-Full-Scan) instead of a [ZAP Baseline Scan](https://github.com/zaproxy/zaproxy/wiki/ZAP-Baseline-Scan). Default: `false` |
 | `DAST_FULL_SCAN_DOMAIN_VALIDATION_REQUIRED` | boolean | **{warning}** **[Removed](https://gitlab.com/gitlab-org/gitlab/-/issues/293595)** in GitLab 14.0. Set to `true` to require domain validation when running DAST full scans. Not supported for API scans. Default: `false` |
 | `DAST_AUTO_UPDATE_ADDONS` | boolean | ZAP add-ons are pinned to specific versions in the DAST Docker image. Set to `true` to download the latest versions when the scan starts. Default: `false` |
-| `DAST_API_HOST_OVERRIDE` (**1**) | string | Used to override domains defined in API specification files. Only supported when importing the API specification from a URL. Example: `example.com:8080` |
+| `DAST_API_HOST_OVERRIDE` <sup>1</sup> | string | Used to override domains defined in API specification files. Only supported when importing the API specification from a URL. Example: `example.com:8080` |
 | `DAST_EXCLUDE_RULES` | string | Set to a comma-separated list of Vulnerability Rule IDs to exclude them from running during the scan. Rule IDs are numbers and can be found from the DAST log or on the [ZAP project](https://www.zaproxy.org/docs/alerts/). For example, `HTTP Parameter Override` has a rule ID of `10026`. Cannot be used when `DAST_ONLY_INCLUDE_RULES` is set. **Note:** In earlier versions of GitLab the excluded rules were executed but vulnerabilities they generated were suppressed. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/118641) in GitLab 12.10. |
 | `DAST_ONLY_INCLUDE_RULES` | string | Set to a comma-separated list of Vulnerability Rule IDs to configure the scan to run only them. Rule IDs are numbers and can be found from the DAST log or on the [ZAP project](https://www.zaproxy.org/docs/alerts/). Cannot be used when `DAST_EXCLUDE_RULES` is set. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/250651) in GitLab 13.12. |
-| `DAST_REQUEST_HEADERS` (**1**) | string | Set to a comma-separated list of request header names and values. Headers are added to every request made by DAST. For example, `Cache-control: no-cache,User-Agent: DAST/1.0` |
-| `DAST_DEBUG` (**1**) | boolean | Enable debug message output. Default: `false`. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/12652) in GitLab 13.1. |
-| `DAST_TARGET_AVAILABILITY_TIMEOUT` (**1**) | number | Time limit in seconds to wait for target availability.
-| `DAST_SPIDER_MINS` (**1**) | number | The maximum duration of the spider scan in minutes. Set to `0` for unlimited. Default: One minute, or unlimited when the scan is a full scan. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/12652) in GitLab 13.1. |
+| `DAST_REQUEST_HEADERS` <sup>1</sup> | string | Set to a comma-separated list of request header names and values. Headers are added to every request made by DAST. For example, `Cache-control: no-cache,User-Agent: DAST/1.0` |
+| `DAST_DEBUG` <sup>1</sup> | boolean | Enable debug message output. Default: `false`. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/12652) in GitLab 13.1. |
+| `DAST_TARGET_AVAILABILITY_TIMEOUT` <sup>1</sup> | number | Time limit in seconds to wait for target availability. |
+| `DAST_SPIDER_MINS` <sup>1</sup> | number | The maximum duration of the spider scan in minutes. Set to `0` for unlimited. Default: One minute, or unlimited when the scan is a full scan. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/12652) in GitLab 13.1. |
 | `DAST_HTML_REPORT` | string | The filename of the HTML report written at the end of a scan. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/12652) in GitLab 13.1. |
 | `DAST_MARKDOWN_REPORT` | string | The filename of the Markdown report written at the end of a scan. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/12652) in GitLab 13.1. |
 | `DAST_XML_REPORT` | string | The filename of the XML report written at the end of a scan. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/12652) in GitLab 13.1. |
 | `DAST_INCLUDE_ALPHA_VULNERABILITIES` | boolean | Set to `true` to include alpha passive and active scan rules. Default: `false`. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/12652) in GitLab 13.1. |
-| `DAST_USE_AJAX_SPIDER` (**1**) | boolean | Set to `true` to use the AJAX spider in addition to the traditional spider, useful for crawling sites that require JavaScript. Default: `false`. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/12652) in GitLab 13.1. |
+| `DAST_USE_AJAX_SPIDER` <sup>1</sup> | boolean | Set to `true` to use the AJAX spider in addition to the traditional spider, useful for crawling sites that require JavaScript. Default: `false`. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/12652) in GitLab 13.1. |
 | `DAST_PATHS` | string | Set to a comma-separated list of URLs for DAST to scan. For example, `/page1.html,/category1/page3.html,/page2.html`. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/214120) in GitLab 13.4. |
 | `DAST_PATHS_FILE` | string | The file path containing the paths within `DAST_WEBSITE` to scan. The file must be plain text with one path per line. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/258825) in GitLab 13.6. |
-| `DAST_SUBMIT_FIELD` | selector | A selector describing the element that when clicked submits the login form, or the password form of a multi-page login process. Example: `xpath://input[@value='Login']`. [Introduced](https://gitlab.com/gitlab-org/gitlab-ee/issues/9894) in GitLab 12.4. |
-| `DAST_FIRST_SUBMIT_FIELD` | selector | A selector describing the element that when clicked submits the username form of a multi-page login process. Example: `.submit`. [Introduced](https://gitlab.com/gitlab-org/gitlab-ee/issues/9894) in GitLab 12.4. |
+| `DAST_SUBMIT_FIELD` | string | The `id` or `name` of the element that when clicked submits the login form or the password form of a multi-page login process. [Introduced](https://gitlab.com/gitlab-org/gitlab-ee/issues/9894) in GitLab 12.4. |
+| `DAST_FIRST_SUBMIT_FIELD` | string | The `id` or `name` of the element that when clicked submits the username form of a multi-page login process. [Introduced](https://gitlab.com/gitlab-org/gitlab-ee/issues/9894) in GitLab 12.4. |
 | `DAST_ZAP_CLI_OPTIONS` | string | ZAP server command-line options. For example, `-Xmx3072m` would set the Java maximum memory allocation pool size. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/12652) in GitLab 13.1. |
-| `DAST_ZAP_LOG_CONFIGURATION` | string | Set to a semicolon-separated list of additional log4j properties for the ZAP Server. |
+| `DAST_ZAP_LOG_CONFIGURATION` | string | Set to a semicolon-separated list of additional log4j properties for the ZAP Server. For example, `log4j.logger.org.parosproxy.paros.network.HttpSender=DEBUG;log4j.logger.com.crawljax=DEBUG` |
 | `DAST_AUTH_EXCLUDE_URLS` | URLs | **{warning}** **[Removed](https://gitlab.com/gitlab-org/gitlab/-/issues/289959)** in GitLab 14.0. Replaced by `DAST_EXCLUDE_URLS`. The URLs to skip during the authenticated scan; comma-separated. Regular expression syntax can be used to match multiple URLs. For example, `.*` matches an arbitrary character sequence. Not supported for API scans. |
 | `DAST_AGGREGATE_VULNERABILITIES` | boolean | Vulnerability aggregation is set to `true` by default. To disable this feature and see each vulnerability individually set to `false`. [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/254043) in GitLab 14.0. |
 | `DAST_MAX_URLS_PER_VULNERABILITY` | number | The maximum number of URLs reported for a single vulnerability. `DAST_MAX_URLS_PER_VULNERABILITY` is set to `50` by default. To list all the URLs set to `0`. [Introduced](https://gitlab.com/gitlab-org/security-products/dast/-/merge_requests/433) in GitLab 13.12. |
 | `DAST_AUTH_REPORT` | boolean | Used in combination with exporting the `gl-dast-debug-auth-report.html` artifact to aid in debugging authentication issues. |
 | `DAST_AUTH_VERIFICATION_SELECTOR` | selector | Verifies successful authentication by checking for presence of a selector once the login form has been submitted. Example: `css:.user-photo` |
 | `DAST_AUTH_VERIFICATION_LOGIN_FORM` | boolean | Verifies successful authentication by checking for the lack of a login form once the login form has been submitted. |

-1. DAST CI/CD variable available to an on-demand scan.
+1. Available to an on-demand DAST scan.

 #### Selectors
@@ -332,6 +332,36 @@ If you forget to set the service alias, the `docker:19.03.12` image is unable to
 error during connect: Get http://docker:2376/v1.39/info: dial tcp: lookup docker on 192.168.0.1:53: no such host
 ```

+### Using a Docker-in-Docker image with Dependency Proxy
+
+To use your own Docker images with Dependency Proxy, follow these steps
+in addition to the steps in the
+[Docker-in-Docker](../../../ci/docker/using_docker_build.md#use-the-docker-executor-with-the-docker-image-docker-in-docker) section:
+
+1. Update the `image` and `service` to point to your registry.
+1. Add a service [alias](../../../ci/yaml/README.md#servicesalias).
+
+Below is an example of what your `.gitlab-ci.yml` should look like:
+
+```yaml
+build:
+  image: ${CI_DEPENDENCY_PROXY_GROUP_IMAGE_PREFIX}/group/project/docker:19.03.12
+  services:
+    - name: ${CI_DEPENDENCY_PROXY_GROUP_IMAGE_PREFIX}/docker:18.09.7-dind
+      alias: docker
+  stage: build
+  script:
+    - docker build -t my-docker-image .
+    - docker run my-docker-image /script/to/run/tests
+```
+
+If you forget to set the service alias, the `docker:19.03.12` image is unable to find the
+`dind` service, and an error like the following is thrown:
+
+```plaintext
+error during connect: Get http://docker:2376/v1.39/info: dial tcp: lookup docker on 192.168.0.1:53: no such host
+```
+
 ## Delete images

 You can delete images from your Container Registry in multiple ways.
@@ -252,3 +252,21 @@ hub_docker_quota_check:
     - |
       TOKEN=$(curl "https://auth.docker.io/token?service=registry.docker.io&scope=repository:ratelimitpreview/test:pull" | jq --raw-output .token) && curl --head --header "Authorization: Bearer $TOKEN" "https://registry-1.docker.io/v2/ratelimitpreview/test/manifests/latest" 2>&1
 ```
+
+## Troubleshooting
+
+### Dependency Proxy Connection Failure
+
+If a service alias is not set, the `docker:19.03.12` image is unable to find the
+`dind` service, and an error like the following is thrown:
+
+```plaintext
+error during connect: Get http://docker:2376/v1.39/info: dial tcp: lookup docker on 192.168.0.1:53: no such host
+```
+
+This can be resolved by setting a service alias for the Docker service:
+
+```plaintext
+services:
+  - name: ${CI_DEPENDENCY_PROXY_GROUP_IMAGE_PREFIX}/docker:18.09.7-dind
+    alias: docker
+```
@@ -745,7 +745,7 @@ You can create a new package each time the `master` branch is updated.
  <repositories>
    <repository>
      <id>gitlab-maven</id>
      <url>$env{CI_API_V4_URL}/projects/${env.CI_PROJECT_ID}/packages/maven</url>
      <url>${env.CI_API_V4_URL}/projects/${env.CI_PROJECT_ID}/packages/maven</url>
    </repository>
  </repositories>
  <distributionManagement>
|||
|
|
@@ -337,7 +337,7 @@ updated:
  stage: deploy
  script:
    - dotnet pack -c Release
    - dotnet nuget add source "${CI_API_V4_URL}/${CI_PROJECT_ID}/packages/nuget/index.json" --name gitlab --username gitlab-ci-token --password $CI_JOB_TOKEN --store-password-in-clear-text
    - dotnet nuget add source "${CI_API_V4_URL}/projects/${CI_PROJECT_ID}/packages/nuget/index.json" --name gitlab --username gitlab-ci-token --password $CI_JOB_TOKEN --store-password-in-clear-text
    - dotnet nuget push "bin/Release/*.nupkg" --source gitlab
  only:
    - master
|||
|
|
@@ -320,7 +320,7 @@ python -m twine upload --repository <source_name> dist/<package_file>

You cannot publish a package if a package of the same name and version already exists.
You must delete the existing package first. If you attempt to publish the same package
more than once, a `404 Bad Request` error occurs.
more than once, a `400 Bad Request` error occurs.

## Install a PyPI package
|||
|
|
@@ -14,6 +14,20 @@ module API
        detail 'This feature was introduced in GitLab 14.0'
      end
      get ':id/avatar' do
        avatar = user_group.avatar

        not_found!('Avatar') if avatar.blank?

        filename = File.basename(avatar.file.file)

        header(
          'Content-Disposition',
          ActionDispatch::Http::ContentDisposition.format(
            disposition: 'attachment',
            filename: filename
          )
        )

        present_carrierwave_file!(user_group.avatar)
      end
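The header built in the hunk above comes from Rails' `ActionDispatch::Http::ContentDisposition.format`. As a framework-free sketch (hand-rolled, not the Rails implementation), the simple ASCII case reduces to:

```ruby
# Hand-rolled approximation of the Content-Disposition value the endpoint
# sends; the real Rails helper additionally emits an RFC 5987 `filename*`
# parameter for non-ASCII names.
def content_disposition(disposition, filename)
  %(#{disposition}; filename="#{filename}")
end

puts content_disposition('attachment', 'avatar.png')
```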
@@ -77,8 +77,8 @@ module API
        present services, with: Entities::ProjectServiceBasic
      end

      SERVICES.each do |service_slug, settings|
        desc "Set #{service_slug} service for project"
      SERVICES.each do |slug, settings|
        desc "Set #{slug} service for project"
        params do
          settings.each do |setting|
            if setting[:required]
@@ -88,12 +88,12 @@ module API
            end
          end
        end
        put ":id/services/#{service_slug}" do
          service = user_project.find_or_initialize_service(service_slug.underscore)
          service_params = declared_params(include_missing: false).merge(active: true)
        put ":id/services/#{slug}" do
          integration = user_project.find_or_initialize_integration(slug.underscore)
          params = declared_params(include_missing: false).merge(active: true)

          if service.update(service_params)
            present service, with: Entities::ProjectService
          if integration.update(params)
            present integration, with: Entities::ProjectService
          else
            render_api_error!('400 Bad Request', 400)
          end
@@ -102,19 +102,15 @@ module API

        desc "Delete a service for project"
        params do
          requires :service_slug, type: String, values: SERVICES.keys, desc: 'The name of the service'
          requires :slug, type: String, values: SERVICES.keys, desc: 'The name of the service'
        end
        delete ":id/services/:service_slug" do
          service = user_project.find_or_initialize_service(params[:service_slug].underscore)
        delete ":id/services/:slug" do
          integration = user_project.find_or_initialize_integration(params[:slug].underscore)

          destroy_conditionally!(service) do
            attrs = service_attributes(service).inject({}) do |hash, key|
              hash.merge!(key => nil)
            end
          destroy_conditionally!(integration) do
            attrs = service_attributes(integration).index_with { nil }.merge(active: false)

            unless service.update(attrs.merge(active: false))
              render_api_error!('400 Bad Request', 400)
            end
            render_api_error!('400 Bad Request', 400) unless integration.update(attrs)
          end
        end
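The refactor above swaps the `inject`/`merge!` loop for ActiveSupport's `Enumerable#index_with`. A plain-Ruby sketch of what that one-liner produces (the attribute names here are hypothetical, not taken from any real integration):

```ruby
# Plain-Ruby stand-in for ActiveSupport's Enumerable#index_with:
# build a hash mapping every key to the same value.
def index_with(keys, value)
  keys.each_with_object({}) { |key, hash| hash[key] = value }
end

# Reset every (hypothetical) integration attribute, then deactivate it,
# mirroring `service_attributes(integration).index_with { nil }.merge(active: false)`.
attrs = index_with(%i[url token], nil).merge(active: false)
```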
@@ -122,10 +118,10 @@ module API
          success Entities::ProjectService
        end
        params do
          requires :service_slug, type: String, values: SERVICES.keys, desc: 'The name of the service'
          requires :slug, type: String, values: SERVICES.keys, desc: 'The name of the service'
        end
        get ":id/services/:service_slug" do
          integration = user_project.find_or_initialize_service(params[:service_slug].underscore)
        get ":id/services/:slug" do
          integration = user_project.find_or_initialize_integration(params[:slug].underscore)

          not_found!('Service') unless integration&.persisted?

@@ -156,10 +156,10 @@ module Gitlab

        underscored_service = matched_login['service'].underscore

        return unless Integration.available_services_names.include?(underscored_service)
        return unless Integration.available_integration_names.include?(underscored_service)

        # We treat underscored_service as a trusted input because it is included
        # in the Integration.available_services_names allowlist.
        # in the Integration.available_integration_names allowlist.
        accessor = Project.integration_association_name(underscored_service)
        service = project.public_send(accessor) # rubocop:disable GitlabSecurity/PublicSend

@@ -107,10 +107,10 @@ module Gitlab
        return success(result) unless prometheus_enabled?
        return success(result) unless prometheus_server_address.present?

        service = result[:project].find_or_initialize_service('prometheus')
        prometheus = result[:project].find_or_initialize_integration('prometheus')

        unless service.update(prometheus_integration_attributes)
          log_error('Could not save prometheus manual configuration for self-monitoring project. Errors: %{errors}' % { errors: service.errors.full_messages })
        unless prometheus.update(prometheus_integration_attributes)
          log_error('Could not save prometheus manual configuration for self-monitoring project. Errors: %{errors}' % { errors: prometheus.errors.full_messages })
          return error(_('Could not save prometheus manual configuration'))
        end

@@ -26,7 +26,7 @@ module Gitlab
      private

      def service_prometheus_adapter
        project.find_or_initialize_service('prometheus')
        project.find_or_initialize_integration('prometheus')
      end
    end
  end
@@ -403,7 +403,7 @@ module Gitlab

    def services_usage
      # rubocop: disable UsageData/LargeTable:
      Integration.available_services_names(include_dev: false).each_with_object({}) do |name, response|
      Integration.available_integration_names(include_dev: false).each_with_object({}) do |name, response|
        type = Integration.integration_name_to_type(name)

        response[:"projects_#{name}_active"] = count(Integration.active.where.not(project: nil).where(type: type))
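The `each_with_object({})` pattern above builds one `projects_<name>_active` key per integration. A minimal standalone sketch (the integration names and the constant `0` standing in for the real `count(...)` query are made up):

```ruby
# Build a usage hash keyed by integration name, as services_usage does;
# each iteration writes one symbol key into the accumulator hash.
names = %w[jira slack]
response = names.each_with_object({}) do |name, memo|
  memo[:"projects_#{name}_active"] = 0
end
```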
@@ -10,7 +10,7 @@ RSpec.describe Admin::IntegrationsController do
  end

  describe '#edit' do
    Integration.available_services_names.each do |integration_name|
    Integration.available_integration_names.each do |integration_name|
      context "#{integration_name}" do
        it 'successfully displays the template' do
          get :edit, params: { id: integration_name }

@@ -27,7 +27,7 @@ RSpec.describe Admin::IntegrationsController do
    end

    it 'returns 404' do
      get :edit, params: { id: Integration.available_services_names.sample }
      get :edit, params: { id: Integration.available_integration_names.sample }

      expect(response).to have_gitlab_http_status(:not_found)
    end
@@ -36,7 +36,7 @@ RSpec.describe Groups::Settings::IntegrationsController do
  describe '#edit' do
    context 'when user is not owner' do
      it 'renders not_found' do
        get :edit, params: { group_id: group, id: Integration.available_services_names(include_project_specific: false).sample }
        get :edit, params: { group_id: group, id: Integration.available_integration_names(include_project_specific: false).sample }

        expect(response).to have_gitlab_http_status(:not_found)
      end

@@ -47,8 +47,8 @@ RSpec.describe Groups::Settings::IntegrationsController do
        group.add_owner(user)
      end

      Integration.available_services_names(include_project_specific: false).each do |integration_name|
        context "#{integration_name}" do
      Integration.available_integration_names(include_project_specific: false).each do |integration_name|
        context integration_name do
          it 'successfully displays the template' do
            get :edit, params: { group_id: group, id: integration_name }

@@ -1,15 +1,8 @@
import MockAdapter from 'axios-mock-adapter';
import Api from '~/api';
import axios from '~/lib/utils/axios_utils';
import httpStatus from '~/lib/utils/http_status';
import { resolvers } from '~/pipeline_editor/graphql/resolvers';
import {
  mockCiConfigPath,
  mockCiYml,
  mockDefaultBranch,
  mockLintResponse,
  mockProjectFullPath,
} from '../mock_data';
import { mockLintResponse } from '../mock_data';

jest.mock('~/api', () => {
  return {

@@ -18,36 +11,6 @@ jest.mock('~/api', () => {
});

describe('~/pipeline_editor/graphql/resolvers', () => {
  describe('Query', () => {
    describe('blobContent', () => {
      beforeEach(() => {
        Api.getRawFile.mockResolvedValue({
          data: mockCiYml,
        });
      });

      afterEach(() => {
        Api.getRawFile.mockReset();
      });

      it('resolves lint data with type names', async () => {
        const result = resolvers.Query.blobContent(null, {
          projectPath: mockProjectFullPath,
          path: mockCiConfigPath,
          ref: mockDefaultBranch,
        });

        expect(Api.getRawFile).toHaveBeenCalledWith(mockProjectFullPath, mockCiConfigPath, {
          ref: mockDefaultBranch,
        });

        // eslint-disable-next-line no-underscore-dangle
        expect(result.__typename).toBe('BlobContent');
        await expect(result.rawData).resolves.toBe(mockCiYml);
      });
    });
  });

  describe('Mutation', () => {
    describe('lintCI', () => {
      let mock;

@@ -35,6 +35,23 @@ job_build:
    - echo "build"
  needs: ["job_test_2"]
`;

export const mockBlobContentQueryResponse = {
  data: {
    project: { repository: { blobs: { nodes: [{ rawBlob: mockCiYml }] } } },
  },
};

export const mockBlobContentQueryResponseNoCiFile = {
  data: {
    project: { repository: { blobs: { nodes: [] } } },
  },
};

export const mockBlobContentQueryResponseEmptyCiFile = {
  data: {
    project: { repository: { blobs: { nodes: [{ rawBlob: '' }] } } },
  },
};

const mockJobFields = {
  beforeScript: [],

@@ -3,7 +3,6 @@ import { shallowMount, createLocalVue } from '@vue/test-utils';
import VueApollo from 'vue-apollo';
import createMockApollo from 'helpers/mock_apollo_helper';
import waitForPromises from 'helpers/wait_for_promises';
import httpStatusCodes from '~/lib/utils/http_status';
import CommitForm from '~/pipeline_editor/components/commit/commit_form.vue';
import TextEditor from '~/pipeline_editor/components/editor/text_editor.vue';

@@ -11,15 +10,19 @@ import PipelineEditorTabs from '~/pipeline_editor/components/pipeline_editor_tab
import PipelineEditorEmptyState from '~/pipeline_editor/components/ui/pipeline_editor_empty_state.vue';
import PipelineEditorMessages from '~/pipeline_editor/components/ui/pipeline_editor_messages.vue';
import { COMMIT_SUCCESS, COMMIT_FAILURE } from '~/pipeline_editor/constants';
import getBlobContent from '~/pipeline_editor/graphql/queries/blob_content.graphql';
import getCiConfigData from '~/pipeline_editor/graphql/queries/ci_config.graphql';
import PipelineEditorApp from '~/pipeline_editor/pipeline_editor_app.vue';
import PipelineEditorHome from '~/pipeline_editor/pipeline_editor_home.vue';
import {
  mockCiConfigPath,
  mockCiConfigQueryResponse,
  mockCiYml,
  mockBlobContentQueryResponse,
  mockBlobContentQueryResponseEmptyCiFile,
  mockBlobContentQueryResponseNoCiFile,
  mockDefaultBranch,
  mockProjectFullPath,
  mockCiYml,
} from './mock_data';

const localVue = createLocalVue();

@@ -75,19 +78,12 @@ describe('Pipeline editor app component', () => {
  };

  const createComponentWithApollo = async ({ props = {}, provide = {} } = {}) => {
    const handlers = [[getCiConfigData, mockCiConfigData]];
    const resolvers = {
      Query: {
        blobContent() {
          return {
            __typename: 'BlobContent',
            rawData: mockBlobContentData(),
          };
        },
      },
    };
    const handlers = [
      [getBlobContent, mockBlobContentData],
      [getCiConfigData, mockCiConfigData],
    ];

    mockApollo = createMockApollo(handlers, resolvers);
    mockApollo = createMockApollo(handlers);

    const options = {
      localVue,

@@ -133,7 +129,7 @@ describe('Pipeline editor app component', () => {

  describe('when queries are called', () => {
    beforeEach(() => {
      mockBlobContentData.mockResolvedValue(mockCiYml);
      mockBlobContentData.mockResolvedValue(mockBlobContentQueryResponse);
      mockCiConfigData.mockResolvedValue(mockCiConfigQueryResponse);
    });

@@ -159,35 +155,14 @@ describe('Pipeline editor app component', () => {
  });

  describe('when no CI config file exists', () => {
    describe('in a project without a repository', () => {
      it('shows an empty state and does not show editor home component', async () => {
        mockBlobContentData.mockRejectedValueOnce({
          response: {
            status: httpStatusCodes.BAD_REQUEST,
          },
        });
        mockBlobContentData.mockResolvedValue(mockBlobContentQueryResponseNoCiFile);
        await createComponentWithApollo();

        expect(findEmptyState().exists()).toBe(true);
        expect(findAlert().exists()).toBe(false);
        expect(findEditorHome().exists()).toBe(false);
      });
    });

    describe('in a project with a repository', () => {
      it('shows an empty state and does not show editor home component', async () => {
        mockBlobContentData.mockRejectedValueOnce({
          response: {
            status: httpStatusCodes.NOT_FOUND,
          },
        });
        await createComponentWithApollo();

        expect(findEmptyState().exists()).toBe(true);
        expect(findAlert().exists()).toBe(false);
        expect(findEditorHome().exists()).toBe(false);
      });
    });

    describe('because of a fetching error', () => {
      it('shows a unkown error message', async () => {
@@ -204,14 +179,29 @@ describe('Pipeline editor app component', () => {
    });
  });

  describe('when landing on the empty state with feature flag on', () => {
    it('user can click on CTA button and see an empty editor', async () => {
      mockBlobContentData.mockRejectedValueOnce({
        response: {
          status: httpStatusCodes.NOT_FOUND,
  describe('with an empty CI config file', () => {
    describe('with empty state feature flag on', () => {
      it('does not show the empty screen state', async () => {
        mockBlobContentData.mockResolvedValue(mockBlobContentQueryResponseEmptyCiFile);

        await createComponentWithApollo({
          provide: {
            glFeatures: {
              pipelineEditorEmptyStateAction: true,
            },
          },
        });

        expect(findEmptyState().exists()).toBe(false);
        expect(findTextEditor().exists()).toBe(true);
      });
    });
  });

  describe('when landing on the empty state with feature flag on', () => {
    it('user can click on CTA button and see an empty editor', async () => {
      mockBlobContentData.mockResolvedValue(mockBlobContentQueryResponseNoCiFile);

      await createComponentWithApollo({
        provide: {
          glFeatures: {
@@ -315,17 +305,13 @@ describe('Pipeline editor app component', () => {
    });

    it('hides start screen when refetch fetches CI file', async () => {
      mockBlobContentData.mockRejectedValue({
        response: {
          status: httpStatusCodes.NOT_FOUND,
        },
      });
      mockBlobContentData.mockResolvedValue(mockBlobContentQueryResponseNoCiFile);
      await createComponentWithApollo();

      expect(findEmptyState().exists()).toBe(true);
      expect(findEditorHome().exists()).toBe(false);

      mockBlobContentData.mockResolvedValue(mockCiYml);
      mockBlobContentData.mockResolvedValue(mockBlobContentQueryResponse);
      await wrapper.vm.$apollo.queries.initialCiFileContent.refetch();

      expect(findEmptyState().exists()).toBe(false);
@@ -56,7 +56,6 @@ describe('stage column component', () => {

  afterEach(() => {
    wrapper.destroy();
    wrapper = null;
  });

  describe('when mounted', () => {
@@ -2,29 +2,29 @@

exports[`Links Inner component with a large number of needs matches snapshot and has expected path 1`] = `
"<div class=\\"gl-display-flex gl-relative\\" totalgroups=\\"10\\"><svg id=\\"link-svg\\" viewBox=\\"0,0,1019,445\\" width=\\"1019px\\" height=\\"445px\\" class=\\"gl-absolute gl-pointer-events-none\\">
<path d=\\"M202,118L42,118C72,118,72,138,102,138\\" stroke-width=\\"2\\" class=\\"gl-fill-transparent gl-transition-duration-slow gl-transition-timing-function-ease gl-stroke-gray-200\\"></path>
<path d=\\"M202,118L52,118C82,118,82,148,112,148\\" stroke-width=\\"2\\" class=\\"gl-fill-transparent gl-transition-duration-slow gl-transition-timing-function-ease gl-stroke-gray-200\\"></path>
<path d=\\"M222,138L62,138C92,138,92,158,122,158\\" stroke-width=\\"2\\" class=\\"gl-fill-transparent gl-transition-duration-slow gl-transition-timing-function-ease gl-stroke-gray-200\\"></path>
<path d=\\"M212,128L72,128C102,128,102,168,132,168\\" stroke-width=\\"2\\" class=\\"gl-fill-transparent gl-transition-duration-slow gl-transition-timing-function-ease gl-stroke-gray-200\\"></path>
<path d=\\"M232,148L82,148C112,148,112,178,142,178\\" stroke-width=\\"2\\" class=\\"gl-fill-transparent gl-transition-duration-slow gl-transition-timing-function-ease gl-stroke-gray-200\\"></path>
<path d=\\"M202,118C52,118,52,138,102,138\\" stroke-width=\\"2\\" class=\\"gl-fill-transparent gl-transition-duration-slow gl-transition-timing-function-ease gl-stroke-gray-200\\"></path>
<path d=\\"M202,118C62,118,62,148,112,148\\" stroke-width=\\"2\\" class=\\"gl-fill-transparent gl-transition-duration-slow gl-transition-timing-function-ease gl-stroke-gray-200\\"></path>
<path d=\\"M222,138C72,138,72,158,122,158\\" stroke-width=\\"2\\" class=\\"gl-fill-transparent gl-transition-duration-slow gl-transition-timing-function-ease gl-stroke-gray-200\\"></path>
<path d=\\"M212,128C82,128,82,168,132,168\\" stroke-width=\\"2\\" class=\\"gl-fill-transparent gl-transition-duration-slow gl-transition-timing-function-ease gl-stroke-gray-200\\"></path>
<path d=\\"M232,148C92,148,92,178,142,178\\" stroke-width=\\"2\\" class=\\"gl-fill-transparent gl-transition-duration-slow gl-transition-timing-function-ease gl-stroke-gray-200\\"></path>
</svg> </div>"
`;

exports[`Links Inner component with a parallel need matches snapshot and has expected path 1`] = `
"<div class=\\"gl-display-flex gl-relative\\" totalgroups=\\"10\\"><svg id=\\"link-svg\\" viewBox=\\"0,0,1019,445\\" width=\\"1019px\\" height=\\"445px\\" class=\\"gl-absolute gl-pointer-events-none\\">
<path d=\\"M192,108L22,108C52,108,52,118,82,118\\" stroke-width=\\"2\\" class=\\"gl-fill-transparent gl-transition-duration-slow gl-transition-timing-function-ease gl-stroke-gray-200\\"></path>
<path d=\\"M192,108C32,108,32,118,82,118\\" stroke-width=\\"2\\" class=\\"gl-fill-transparent gl-transition-duration-slow gl-transition-timing-function-ease gl-stroke-gray-200\\"></path>
</svg> </div>"
`;

exports[`Links Inner component with one need matches snapshot and has expected path 1`] = `
"<div class=\\"gl-display-flex gl-relative\\" totalgroups=\\"10\\"><svg id=\\"link-svg\\" viewBox=\\"0,0,1019,445\\" width=\\"1019px\\" height=\\"445px\\" class=\\"gl-absolute gl-pointer-events-none\\">
<path d=\\"M202,118L42,118C72,118,72,138,102,138\\" stroke-width=\\"2\\" class=\\"gl-fill-transparent gl-transition-duration-slow gl-transition-timing-function-ease gl-stroke-gray-200\\"></path>
<path d=\\"M202,118C52,118,52,138,102,138\\" stroke-width=\\"2\\" class=\\"gl-fill-transparent gl-transition-duration-slow gl-transition-timing-function-ease gl-stroke-gray-200\\"></path>
</svg> </div>"
`;

exports[`Links Inner component with same stage needs matches snapshot and has expected path 1`] = `
"<div class=\\"gl-display-flex gl-relative\\" totalgroups=\\"10\\"><svg id=\\"link-svg\\" viewBox=\\"0,0,1019,445\\" width=\\"1019px\\" height=\\"445px\\" class=\\"gl-absolute gl-pointer-events-none\\">
<path d=\\"M192,108L22,108C52,108,52,118,82,118\\" stroke-width=\\"2\\" class=\\"gl-fill-transparent gl-transition-duration-slow gl-transition-timing-function-ease gl-stroke-gray-200\\"></path>
<path d=\\"M202,118L32,118C62,118,62,128,92,128\\" stroke-width=\\"2\\" class=\\"gl-fill-transparent gl-transition-duration-slow gl-transition-timing-function-ease gl-stroke-gray-200\\"></path>
<path d=\\"M192,108C32,108,32,118,82,118\\" stroke-width=\\"2\\" class=\\"gl-fill-transparent gl-transition-duration-slow gl-transition-timing-function-ease gl-stroke-gray-200\\"></path>
<path d=\\"M202,118C42,118,42,128,92,128\\" stroke-width=\\"2\\" class=\\"gl-fill-transparent gl-transition-duration-slow gl-transition-timing-function-ease gl-stroke-gray-200\\"></path>
</svg> </div>"
`;

@@ -6,7 +6,7 @@ import LinksInner from '~/pipelines/components/graph_shared/links_inner.vue';
import LinksLayer from '~/pipelines/components/graph_shared/links_layer.vue';
import JobPill from '~/pipelines/components/pipeline_graph/job_pill.vue';
import PipelineGraph from '~/pipelines/components/pipeline_graph/pipeline_graph.vue';
import StagePill from '~/pipelines/components/pipeline_graph/stage_pill.vue';
import StageName from '~/pipelines/components/pipeline_graph/stage_name.vue';
import { pipelineData, singleStageData } from './mock_data';

describe('pipeline graph component', () => {
@@ -35,11 +35,9 @@ describe('pipeline graph component', () => {

  const findAlert = () => wrapper.findComponent(GlAlert);
  const findAllJobPills = () => wrapper.findAll(JobPill);
  const findAllStageBackgroundElements = () => wrapper.findAll('[data-testid="stage-background"]');
  const findAllStagePills = () => wrapper.findAllComponents(StagePill);
  const findAllStageNames = () => wrapper.findAllComponents(StageName);
  const findLinksLayer = () => wrapper.findComponent(LinksLayer);
  const findPipelineGraph = () => wrapper.find('[data-testid="graph-container"]');
  const findStageBackgroundElementAt = (index) => findAllStageBackgroundElements().at(index);

  afterEach(() => {
    wrapper.destroy();
@@ -67,10 +65,10 @@ describe('pipeline graph component', () => {
      wrapper = createComponent({ pipelineData: singleStageData });
    });

    it('renders the right number of stage pills', () => {
    it('renders the right number of stage titles', () => {
      const expectedStagesLength = singleStageData.stages.length;

      expect(findAllStagePills()).toHaveLength(expectedStagesLength);
      expect(findAllStageNames()).toHaveLength(expectedStagesLength);
    });

    it('renders the right number of job pills', () => {
|
@ -81,20 +79,6 @@ describe('pipeline graph component', () => {
|
|||
|
||||
expect(findAllJobPills()).toHaveLength(expectedJobsLength);
|
||||
});
|
||||
|
||||
describe('rounds corner', () => {
|
||||
it.each`
|
||||
cssClass | expectedState
|
||||
${'gl-rounded-bottom-left-6'} | ${true}
|
||||
${'gl-rounded-top-left-6'} | ${true}
|
||||
${'gl-rounded-top-right-6'} | ${true}
|
||||
${'gl-rounded-bottom-right-6'} | ${true}
|
||||
`('$cssClass should be $expectedState on the only element', ({ cssClass, expectedState }) => {
|
||||
const classes = findStageBackgroundElementAt(0).classes();
|
||||
|
||||
expect(classes.includes(cssClass)).toBe(expectedState);
|
||||
});
|
||||
});
|
||||
});
|
||||
|
||||
describe('with multiple stages and jobs', () => {
|
||||
|
|
@@ -102,10 +86,10 @@ describe('pipeline graph component', () => {
      wrapper = createComponent();
    });

    it('renders the right number of stage pills', () => {
    it('renders the right number of stage titles', () => {
      const expectedStagesLength = pipelineData.stages.length;

      expect(findAllStagePills()).toHaveLength(expectedStagesLength);
      expect(findAllStageNames()).toHaveLength(expectedStagesLength);
    });

    it('renders the right number of job pills', () => {
@@ -116,34 +100,5 @@ describe('pipeline graph component', () => {

      expect(findAllJobPills()).toHaveLength(expectedJobsLength);
    });

    describe('rounds corner', () => {
      it.each`
        cssClass                       | expectedState
        ${'gl-rounded-bottom-left-6'}  | ${true}
        ${'gl-rounded-top-left-6'}     | ${true}
        ${'gl-rounded-top-right-6'}    | ${false}
        ${'gl-rounded-bottom-right-6'} | ${false}
      `(
        '$cssClass should be $expectedState on the first element',
        ({ cssClass, expectedState }) => {
          const classes = findStageBackgroundElementAt(0).classes();

          expect(classes.includes(cssClass)).toBe(expectedState);
        },
      );

      it.each`
        cssClass                       | expectedState
        ${'gl-rounded-bottom-left-6'}  | ${false}
        ${'gl-rounded-top-left-6'}     | ${false}
        ${'gl-rounded-top-right-6'}    | ${true}
        ${'gl-rounded-bottom-right-6'} | ${true}
      `('$cssClass should be $expectedState on the last element', ({ cssClass, expectedState }) => {
        const classes = findStageBackgroundElementAt(pipelineData.stages.length - 1).classes();

        expect(classes.includes(cssClass)).toBe(expectedState);
      });
    });
  });
});

@@ -8,6 +8,6 @@ RSpec.describe GitlabSchema.types['ServiceType'] do
  end

  def available_services_enum
    ::Integration.available_services_types(include_dev: false).map(&:underscore).map(&:upcase)
    ::Integration.available_integration_types(include_dev: false).map(&:underscore).map(&:upcase)
  end
end
@@ -238,16 +238,16 @@ RSpec.describe EmailsHelper do
    it 'returns the default header logo' do
      create :appearance, header_logo: nil

      expect(header_logo).to eq(
        %{<img alt="GitLab" src="/images/mailers/gitlab_header_logo.gif" width="55" height="50" />}
      expect(header_logo).to match(
        %r{<img alt="GitLab" src="/images/mailers/gitlab_header_logo\.(?:gif|png)" width="\d+" height="\d+" />}
      )
    end
  end

  context 'there is no brand item' do
    it 'returns the default header logo' do
      expect(header_logo).to eq(
        %{<img alt="GitLab" src="/images/mailers/gitlab_header_logo.gif" width="55" height="50" />}
      expect(header_logo).to match(
        %r{<img alt="GitLab" src="/images/mailers/gitlab_header_logo\.(?:gif|png)" width="\d+" height="\d+" />}
      )
    end
  end

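The spec change above loosens an exact `eq` match to a regex so the assertion survives logo file-format and dimension changes. A standalone sketch of the new pattern against a hypothetical rendered tag (the `.png` filename and `width="70"` are made up to show the flexibility):

```ruby
# The regex accepts either .gif or .png and any integer dimensions,
# unlike the old exact-string comparison.
logo = %{<img alt="GitLab" src="/images/mailers/gitlab_header_logo.png" width="70" height="50" />}
pattern = %r{<img alt="GitLab" src="/images/mailers/gitlab_header_logo\.(?:gif|png)" width="\d+" height="\d+" />}

puts logo.match?(pattern)
```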
@@ -24,8 +24,8 @@ RSpec.describe OperationsHelper do
    let_it_be(:prometheus_integration) { ::Integrations::Prometheus.new(project: project) }

    before do
      allow(project).to receive(:find_or_initialize_service).and_call_original
      allow(project).to receive(:find_or_initialize_service).with('prometheus').and_return(prometheus_integration)
      allow(project).to receive(:find_or_initialize_integration).and_call_original
      allow(project).to receive(:find_or_initialize_integration).with('prometheus').and_return(prometheus_integration)
    end

    it 'returns the correct values' do
@@ -13,7 +13,7 @@ RSpec.describe Gitlab::Prometheus::Adapter do
    let(:prometheus_integration) { double(:prometheus_integration, can_query?: true) }

    before do
      allow(project).to receive(:find_or_initialize_service).with('prometheus').and_return prometheus_integration
      allow(project).to receive(:find_or_initialize_integration).with('prometheus').and_return prometheus_integration
    end

    it 'return prometheus integration as prometheus adapter' do

@@ -33,7 +33,7 @@ RSpec.describe Gitlab::Prometheus::Adapter do
    let(:prometheus_integration) { double(:prometheus_integration, can_query?: false) }

    before do
      allow(project).to receive(:find_or_initialize_service).with('prometheus').and_return prometheus_integration
      allow(project).to receive(:find_or_initialize_integration).with('prometheus').and_return prometheus_integration
    end

    context 'with cluster with prometheus disabled' do
@@ -23,7 +23,7 @@ RSpec.describe DeploymentMetrics do
      let(:prometheus_integration) { instance_double(::Integrations::Prometheus, can_query?: true, configured?: true) }

      before do
        allow(deployment.project).to receive(:find_or_initialize_service).with('prometheus').and_return prometheus_integration
        allow(deployment.project).to receive(:find_or_initialize_integration).with('prometheus').and_return prometheus_integration
      end

      it { is_expected.to be_truthy }

@@ -33,7 +33,7 @@ RSpec.describe DeploymentMetrics do
      let(:prometheus_integration) { instance_double(::Integrations::Prometheus, configured?: true, can_query?: false) }

      before do
        allow(deployment.project).to receive(:find_or_initialize_service).with('prometheus').and_return prometheus_integration
        allow(deployment.project).to receive(:find_or_initialize_integration).with('prometheus').and_return prometheus_integration
      end

      it { is_expected.to be_falsy }

@@ -43,7 +43,7 @@ RSpec.describe DeploymentMetrics do
      let(:prometheus_integration) { instance_double(::Integrations::Prometheus, configured?: false, can_query?: false) }

      before do
        allow(deployment.project).to receive(:find_or_initialize_service).with('prometheus').and_return prometheus_integration
        allow(deployment.project).to receive(:find_or_initialize_integration).with('prometheus').and_return prometheus_integration
      end

      it { is_expected.to be_falsy }

@@ -140,10 +140,10 @@ RSpec.describe Integration do
   end

   describe "Test Button" do
-    let(:service) { build(:service, project: project) }
+    let(:integration) { build(:service, project: project) }

     describe '#can_test?' do
-      subject { service.can_test? }
+      subject { integration.can_test? }

       context 'when repository is not empty' do
         let(:project) { build(:project, :repository) }
@@ -158,9 +158,9 @@ RSpec.describe Integration do
       end

       context 'when instance-level service' do
-        Integration.available_services_types.each do |service_type|
-          let(:service) do
-            described_class.send(:integration_type_to_model, service_type).new(instance: true)
+        Integration.available_integration_types.each do |type|
+          let(:integration) do
+            described_class.send(:integration_type_to_model, type).new(instance: true)
           end

           it { is_expected.to be_falsey }
@@ -168,9 +168,9 @@ RSpec.describe Integration do
       end

       context 'when group-level service' do
-        Integration.available_services_types.each do |service_type|
-          let(:service) do
-            described_class.send(:integration_type_to_model, service_type).new(group_id: group.id)
+        Integration.available_integration_types.each do |type|
+          let(:integration) do
+            described_class.send(:integration_type_to_model, type).new(group_id: group.id)
           end

           it { is_expected.to be_falsey }
@@ -185,9 +185,9 @@ RSpec.describe Integration do
         let(:project) { build(:project, :repository) }

         it 'test runs execute' do
-          expect(service).to receive(:execute).with(data)
+          expect(integration).to receive(:execute).with(data)

-          service.test(data)
+          integration.test(data)
         end
       end

@@ -195,9 +195,9 @@ RSpec.describe Integration do
         let(:project) { build(:project) }

         it 'test runs execute' do
-          expect(service).to receive(:execute).with(data)
+          expect(integration).to receive(:execute).with(data)

-          service.test(data)
+          integration.test(data)
         end
       end
     end
@@ -251,11 +251,13 @@ RSpec.describe Integration do
   describe '.find_or_initialize_all_non_project_specific' do
     shared_examples 'service instances' do
       it 'returns the available service instances' do
-        expect(Integration.find_or_initialize_all_non_project_specific(Integration.for_instance).map(&:to_param)).to match_array(Integration.available_services_names(include_project_specific: false))
+        expect(Integration.find_or_initialize_all_non_project_specific(Integration.for_instance).map(&:to_param))
+          .to match_array(Integration.available_integration_names(include_project_specific: false))
       end

       it 'does not create service instances' do
-        expect { Integration.find_or_initialize_all_non_project_specific(Integration.for_instance) }.not_to change { Integration.count }
+        expect { Integration.find_or_initialize_all_non_project_specific(Integration.for_instance) }
+          .not_to change(Integration, :count)
       end
     end

@@ -264,7 +266,7 @@ RSpec.describe Integration do
     context 'with all existing instances' do
       before do
         Integration.insert_all(
-          Integration.available_services_types(include_project_specific: false).map { |type| { instance: true, type: type } }
+          Integration.available_integration_types(include_project_specific: false).map { |type| { instance: true, type: type } }
         )
       end

@@ -292,13 +294,15 @@ RSpec.describe Integration do
   describe 'template' do
     shared_examples 'retrieves service templates' do
       it 'returns the available service templates' do
-        expect(Integration.find_or_create_templates.pluck(:type)).to match_array(Integration.available_services_types(include_project_specific: false))
+        expect(Integration.find_or_create_templates.pluck(:type)).to match_array(Integration.available_integration_types(include_project_specific: false))
       end
     end

     describe '.find_or_create_templates' do
       it 'creates service templates' do
-        expect { Integration.find_or_create_templates }.to change { Integration.count }.from(0).to(Integration.available_services_names(include_project_specific: false).size)
+        total = Integration.available_integration_names(include_project_specific: false).size
+
+        expect { Integration.find_or_create_templates }.to change(Integration, :count).from(0).to(total)
       end

       it_behaves_like 'retrieves service templates'
@@ -306,7 +310,7 @@ RSpec.describe Integration do
     context 'with all existing templates' do
       before do
         Integration.insert_all(
-          Integration.available_services_types(include_project_specific: false).map { |type| { template: true, type: type } }
+          Integration.available_integration_types(include_project_specific: false).map { |type| { template: true, type: type } }
         )
       end

@@ -332,7 +336,9 @@ RSpec.describe Integration do
       end

       it 'creates the rest of the service templates' do
-        expect { Integration.find_or_create_templates }.to change { Integration.count }.from(1).to(Integration.available_services_names(include_project_specific: false).size)
+        total = Integration.available_integration_names(include_project_specific: false).size
+
+        expect { Integration.find_or_create_templates }.to change(Integration, :count).from(1).to(total)
       end

       it_behaves_like 'retrieves service templates'
@@ -461,13 +467,15 @@ RSpec.describe Integration do

     describe 'is prefilled for projects pushover service' do
       it "has all fields prefilled" do
-        service = project.find_or_initialize_service('pushover')
+        integration = project.find_or_initialize_integration('pushover')

-        expect(service.template).to eq(false)
-        expect(service.device).to eq('MyDevice')
-        expect(service.sound).to eq('mic')
-        expect(service.priority).to eq(4)
-        expect(service.api_key).to eq('123456789')
+        expect(integration).to have_attributes(
+          template: eq(false),
+          device: eq('MyDevice'),
+          sound: eq('mic'),
+          priority: eq(4),
+          api_key: eq('123456789')
+        )
       end
     end
   end
@@ -896,37 +904,37 @@ RSpec.describe Integration do
     end
   end

-  describe '.available_services_names' do
+  describe '.available_integration_names' do
     it 'calls the right methods' do
-      expect(described_class).to receive(:services_names).and_call_original
-      expect(described_class).to receive(:dev_services_names).and_call_original
-      expect(described_class).to receive(:project_specific_services_names).and_call_original
+      expect(described_class).to receive(:integration_names).and_call_original
+      expect(described_class).to receive(:dev_integration_names).and_call_original
+      expect(described_class).to receive(:project_specific_integration_names).and_call_original

-      described_class.available_services_names
+      described_class.available_integration_names
     end

-    it 'does not call project_specific_services_names with include_project_specific false' do
-      expect(described_class).to receive(:services_names).and_call_original
-      expect(described_class).to receive(:dev_services_names).and_call_original
-      expect(described_class).not_to receive(:project_specific_services_names)
+    it 'does not call project_specific_integration_names with include_project_specific false' do
+      expect(described_class).to receive(:integration_names).and_call_original
+      expect(described_class).to receive(:dev_integration_names).and_call_original
+      expect(described_class).not_to receive(:project_specific_integration_names)

-      described_class.available_services_names(include_project_specific: false)
+      described_class.available_integration_names(include_project_specific: false)
     end

     it 'does not call dev_services_names with include_dev false' do
-      expect(described_class).to receive(:services_names).and_call_original
-      expect(described_class).not_to receive(:dev_services_names)
-      expect(described_class).to receive(:project_specific_services_names).and_call_original
+      expect(described_class).to receive(:integration_names).and_call_original
+      expect(described_class).not_to receive(:dev_integration_names)
+      expect(described_class).to receive(:project_specific_integration_names).and_call_original

-      described_class.available_services_names(include_dev: false)
+      described_class.available_integration_names(include_dev: false)
     end

-    it { expect(described_class.available_services_names).to include('jenkins') }
+    it { expect(described_class.available_integration_names).to include('jenkins') }
   end

-  describe '.project_specific_services_names' do
+  describe '.project_specific_integration_names' do
     it do
-      expect(described_class.project_specific_services_names)
+      expect(described_class.project_specific_integration_names)
         .to include(*described_class::PROJECT_SPECIFIC_INTEGRATION_NAMES)
     end
   end
@@ -1557,13 +1557,16 @@ RSpec.describe Project, factory_default: :keep do
     end
   end

-  describe '.with_service' do
+  describe '.with_integration' do
     before do
       create_list(:prometheus_project, 2)
     end

-    it 'avoid n + 1' do
-      expect { described_class.with_service(:prometheus_integration).map(&:prometheus_integration) }.not_to exceed_query_limit(1)
+    let(:integration) { :prometheus_integration }
+
+    it 'avoids n + 1' do
+      expect { described_class.with_integration(integration).map(&integration) }
+        .not_to exceed_query_limit(1)
     end
   end

@@ -5838,53 +5841,53 @@ RSpec.describe Project, factory_default: :keep do
     end
   end

-  describe '#find_or_initialize_services' do
+  describe '#find_or_initialize_integrations' do
     let_it_be(:subject) { create(:project) }

     it 'avoids N+1 database queries' do
-      control_count = ActiveRecord::QueryRecorder.new { subject.find_or_initialize_services }.count
+      control_count = ActiveRecord::QueryRecorder.new { subject.find_or_initialize_integrations }.count

       expect(control_count).to be <= 4
     end

-    it 'avoids N+1 database queries with more available services' do
-      allow(Integration).to receive(:available_services_names).and_return(%w[pushover])
-      control_count = ActiveRecord::QueryRecorder.new { subject.find_or_initialize_services }
+    it 'avoids N+1 database queries with more available integrations' do
+      allow(Integration).to receive(:available_integration_names).and_return(%w[pushover])
+      control_count = ActiveRecord::QueryRecorder.new { subject.find_or_initialize_integrations }

-      allow(Integration).to receive(:available_services_names).and_call_original
-      expect { subject.find_or_initialize_services }.not_to exceed_query_limit(control_count)
+      allow(Integration).to receive(:available_integration_names).and_call_original
+      expect { subject.find_or_initialize_integrations }.not_to exceed_query_limit(control_count)
     end

-    context 'with disabled services' do
+    context 'with disabled integrations' do
       before do
-        allow(Integration).to receive(:available_services_names).and_return(%w[prometheus pushover teamcity])
-        allow(subject).to receive(:disabled_services).and_return(%w[prometheus])
+        allow(Integration).to receive(:available_integration_names).and_return(%w[prometheus pushover teamcity])
+        allow(subject).to receive(:disabled_integrations).and_return(%w[prometheus])
       end

       it 'returns only enabled services sorted' do
-        services = subject.find_or_initialize_services
-
-        expect(services.size).to eq(2)
-        expect(services.map(&:title)).to eq(['JetBrains TeamCity', 'Pushover'])
+        expect(subject.find_or_initialize_integrations).to match [
+          have_attributes(title: 'JetBrains TeamCity'),
+          have_attributes(title: 'Pushover')
+        ]
       end
     end
   end

-  describe '#find_or_initialize_service' do
+  describe '#find_or_initialize_integration' do
     it 'avoids N+1 database queries' do
-      allow(Integration).to receive(:available_services_names).and_return(%w[prometheus pushover])
+      allow(Integration).to receive(:available_integration_names).and_return(%w[prometheus pushover])

-      control_count = ActiveRecord::QueryRecorder.new { subject.find_or_initialize_service('prometheus') }.count
+      control_count = ActiveRecord::QueryRecorder.new { subject.find_or_initialize_integration('prometheus') }.count

-      allow(Integration).to receive(:available_services_names).and_call_original
+      allow(Integration).to receive(:available_integration_names).and_call_original

-      expect { subject.find_or_initialize_service('prometheus') }.not_to exceed_query_limit(control_count)
+      expect { subject.find_or_initialize_integration('prometheus') }.not_to exceed_query_limit(control_count)
     end

     it 'returns nil if integration is disabled' do
-      allow(subject).to receive(:disabled_services).and_return(%w[prometheus])
+      allow(subject).to receive(:disabled_integrations).and_return(%w[prometheus])

-      expect(subject.find_or_initialize_service('prometheus')).to be_nil
+      expect(subject.find_or_initialize_integration('prometheus')).to be_nil
     end

     context 'with an existing integration' do
@@ -5895,7 +5898,7 @@ RSpec.describe Project, factory_default: :keep do
       end

       it 'retrieves the integration' do
-        expect(subject.find_or_initialize_service('prometheus').api_url).to eq('https://prometheus.project.com/')
+        expect(subject.find_or_initialize_integration('prometheus').api_url).to eq('https://prometheus.project.com/')
       end
     end

@@ -5905,25 +5908,25 @@ RSpec.describe Project, factory_default: :keep do
         create(:prometheus_integration, :template, api_url: 'https://prometheus.template.com/')
       end

-      it 'builds the service from the instance if exists' do
-        expect(subject.find_or_initialize_service('prometheus').api_url).to eq('https://prometheus.instance.com/')
+      it 'builds the service from the instance integration' do
+        expect(subject.find_or_initialize_integration('prometheus').api_url).to eq('https://prometheus.instance.com/')
       end
     end

-    context 'with an instance-level and template integrations' do
+    context 'with a template integration and no instance-level' do
       before do
         create(:prometheus_integration, :template, api_url: 'https://prometheus.template.com/')
       end

-      it 'builds the service from the template if instance does not exists' do
-        expect(subject.find_or_initialize_service('prometheus').api_url).to eq('https://prometheus.template.com/')
+      it 'builds the service from the template' do
+        expect(subject.find_or_initialize_integration('prometheus').api_url).to eq('https://prometheus.template.com/')
       end
     end

-    context 'without an exisiting integration, nor instance-level or template' do
-      it 'builds the service if instance or template does not exists' do
-        expect(subject.find_or_initialize_service('prometheus')).to be_a(::Integrations::Prometheus)
-        expect(subject.find_or_initialize_service('prometheus').api_url).to be_nil
+    context 'without an exisiting integration, or instance-level or template' do
+      it 'builds the service' do
+        expect(subject.find_or_initialize_integration('prometheus')).to be_a(::Integrations::Prometheus)
+        expect(subject.find_or_initialize_integration('prometheus').api_url).to be_nil
       end
     end
   end
@@ -15,6 +15,8 @@ RSpec.describe API::GroupAvatar do
         get api(avatar_path(group))

         expect(response).to have_gitlab_http_status(:ok)
+        expect(response.headers['Content-Disposition'])
+          .to eq(%(attachment; filename="dk.png"; filename*=UTF-8''dk.png))
       end

       context 'when the group does not have avatar' do
@@ -24,6 +26,8 @@ RSpec.describe API::GroupAvatar do
         get api(avatar_path(group))

         expect(response).to have_gitlab_http_status(:not_found)
+        expect(response.body)
+          .to eq(%({"message":"404 Avatar Not Found"}))
       end
     end

@@ -24,11 +24,11 @@ RSpec.describe API::Services do
       expect(response).to have_gitlab_http_status(:forbidden)
     end

-    context 'project with services' do
+    context 'with integrations' do
       let!(:active_integration) { create(:emails_on_push_integration, project: project, active: true) }
       let!(:integration) { create(:custom_issue_tracker_integration, project: project, active: false) }

-      it "returns a list of all active services" do
+      it "returns a list of all active integrations" do
         get api("/projects/#{project.id}/services", user)

         aggregate_failures 'expect successful response with all active services' do
@@ -42,7 +42,7 @@ RSpec.describe API::Services do
     end
   end

-  Integration.available_services_names.each do |service|
+  Integration.available_integration_names.each do |service|
     describe "PUT /projects/:id/services/#{service.dasherize}" do
       include_context service

@@ -99,7 +99,7 @@ RSpec.describe API::Services do
       include_context service

       before do
-        initialize_service(service)
+        initialize_integration(service)
       end

       it "deletes #{service}" do
@@ -114,7 +114,7 @@ RSpec.describe API::Services do
     describe "GET /projects/:id/services/#{service.dasherize}" do
       include_context service

-      let!(:initialized_service) { initialize_service(service, active: true) }
+      let!(:initialized_service) { initialize_integration(service, active: true) }

       let_it_be(:project2) do
         create(:project, creator_id: user.id, namespace: user.namespace)
@@ -141,7 +141,7 @@ RSpec.describe API::Services do
         expect(json_response['properties'].keys).to match_array(service_instance.api_field_names)
       end

-      it "returns all properties of inactive service #{service}" do
+      it "returns all properties of inactive integration #{service}" do
         deactive_service!

         get api("/projects/#{project.id}/services/#{dashed_service}", user)
@@ -151,16 +151,16 @@ RSpec.describe API::Services do
         expect(json_response['properties'].keys).to match_array(service_instance.api_field_names)
       end

-      it "returns not found if service does not exist" do
+      it "returns not found if integration does not exist" do
         get api("/projects/#{project2.id}/services/#{dashed_service}", user)

         expect(response).to have_gitlab_http_status(:not_found)
         expect(json_response['message']).to eq('404 Service Not Found')
       end

-      it "returns not found if service exists but is in `Project#disabled_services`" do
+      it "returns not found if service exists but is in `Project#disabled_integrations`" do
         expect_next_found_instance_of(Project) do |project|
-          expect(project).to receive(:disabled_services).at_least(:once).and_return([service])
+          expect(project).to receive(:disabled_integrations).at_least(:once).and_return([service])
         end

         get api("/projects/#{project.id}/services/#{dashed_service}", user)
@@ -394,11 +394,11 @@ RSpec.describe Projects::Operations::UpdateService do
         }
       end

-      it 'uses Project#find_or_initialize_service to include instance defined defaults and pass them to Projects::UpdateService', :aggregate_failures do
+      it 'uses Project#find_or_initialize_integration to include instance defined defaults and pass them to Projects::UpdateService', :aggregate_failures do
         project_update_service = double(Projects::UpdateService)

         expect(project)
-          .to receive(:find_or_initialize_service)
+          .to receive(:find_or_initialize_integration)
           .with('prometheus')
           .and_return(prometheus_integration)
         expect(Projects::UpdateService).to receive(:new) do |project_arg, user_arg, update_params_hash|
@@ -413,13 +413,13 @@ RSpec.describe Projects::Operations::UpdateService do
         end
       end

-      context 'prometheus params were not passed into service' do
+      context 'when prometheus params are not passed into service' do
         let(:params) { { something: :else } }

         it 'does not pass any prometheus params into Projects::UpdateService', :aggregate_failures do
           project_update_service = double(Projects::UpdateService)

-          expect(project).not_to receive(:find_or_initialize_service)
+          expect(project).not_to receive(:find_or_initialize_integration)
           expect(Projects::UpdateService)
             .to receive(:new)
             .with(project, user, {})
@@ -1,6 +1,6 @@
 # frozen_string_literal: true

-Integration.available_services_names.each do |service|
+Integration.available_integration_names.each do |service|
   RSpec.shared_context service do
     include JiraServiceHelper if service == 'jira'

@@ -49,12 +49,12 @@ Integration.available_services_names.each do |service|
       stub_jira_integration_test if service == 'jira'
     end

-    def initialize_service(service, attrs = {})
-      service_item = project.find_or_initialize_service(service)
-      service_item.attributes = attrs
-      service_item.properties = service_attrs
-      service_item.save!
-      service_item
+    def initialize_integration(integration, attrs = {})
+      record = project.find_or_initialize_integration(integration)
+      record.attributes = attrs
+      record.properties = service_attrs
+      record.save!
+      record
     end

     private
@@ -66,7 +66,7 @@ Integration.available_services_names.each do |service|
       return unless licensed_feature

       stub_licensed_features(licensed_feature => true)
-      project.clear_memoization(:disabled_services)
+      project.clear_memoization(:disabled_integrations)
     end
   end
 end
@@ -0,0 +1,92 @@
+# frozen_string_literal: true
+
+require 'spec_helper'
+
+RSpec.describe Packages::Helm::ExtractionWorker, type: :worker do
+  describe '#perform' do
+    let_it_be(:package) { create(:helm_package, without_package_files: true, status: 'processing')}
+
+    let!(:package_file) { create(:helm_package_file, without_loaded_metadatum: true, package: package) }
+    let(:package_file_id) { package_file.id }
+    let(:channel) { 'stable' }
+
+    let(:expected_metadata) do
+      {
+        'apiVersion' => 'v2',
+        'description' => 'File, Block, and Object Storage Services for your Cloud-Native Environment',
+        'icon' => 'https://rook.io/images/rook-logo.svg',
+        'name' => 'rook-ceph',
+        'sources' => ['https://github.com/rook/rook'],
+        'version' => 'v1.5.8'
+      }
+    end
+
+    subject { described_class.new.perform(channel, package_file_id) }
+
+    shared_examples 'handling error' do
+      it 'mark the package as errored', :aggregate_failures do
+        expect(Gitlab::ErrorTracking).to receive(:log_exception).with(
+          instance_of(Packages::Helm::ExtractFileMetadataService::ExtractionError),
+          project_id: package_file.package.project_id
+        )
+        expect { subject }
+          .to not_change { Packages::Package.count }
+          .and not_change { Packages::PackageFile.count }
+          .and change { package.reload.status }.from('processing').to('error')
+      end
+    end
+
+    context 'with valid package file' do
+      it_behaves_like 'an idempotent worker' do
+        let(:job_args) { [channel, package_file_id] }
+
+        it 'updates package and package file', :aggregate_failures do
+          expect(Gitlab::ErrorTracking).not_to receive(:log_exception)
+
+          expect { subject }
+            .to not_change { Packages::Package.count }
+            .and not_change { Packages::PackageFile.count }
+            .and change { Packages::Helm::FileMetadatum.count }.from(0).to(1)
+            .and change { package.reload.status }.from('processing').to('default')
+
+          helm_file_metadatum = package_file.helm_file_metadatum
+
+          expect(helm_file_metadatum.channel).to eq(channel)
+          expect(helm_file_metadatum.metadata).to eq(expected_metadata)
+        end
+      end
+    end
+
+    context 'with invalid package file id' do
+      let(:package_file_id) { 5555 }
+
+      it "doesn't update helm_file_metadatum", :aggregate_failures do
+        expect { subject }
+          .to not_change { Packages::Package.count }
+          .and not_change { Packages::PackageFile.count }
+          .and not_change { Packages::Helm::FileMetadatum.count }
+          .and not_change { package.reload.status }
+      end
+    end
+
+    context 'with an empty package file' do
+      before do
+        expect_next_instance_of(Gem::Package::TarReader) do |tar_reader|
+          expect(tar_reader).to receive(:each).and_return([])
+        end
+      end
+
+      it_behaves_like 'handling error'
+    end
+
+    context 'with an invalid YAML' do
+      before do
+        expect_next_instance_of(Gem::Package::TarReader::Entry) do |entry|
+          expect(entry).to receive(:read).and_return('{')
+        end
+      end
+
+      it_behaves_like 'handling error'
+    end
+  end
+end