---
stage: AI-powered
group: AI Model Validation
info: To determine the technical writer assigned to the Stage/Group associated with this page, see https://about.gitlab.com/handbook/product/ux/technical-writing/#assignments
type: index, reference
---

# GitLab Duo

GitLab is creating AI-assisted features across our DevSecOps platform. These features aim to help increase velocity and solve key pain points across the software development lifecycle.

| Feature | Purpose | Large language model | Current availability | Maturity |
|---------|---------|----------------------|----------------------|----------|
| Suggested Reviewers | Assists in creating faster and higher-quality reviews by automatically suggesting reviewers for your merge request. View the issue. | GitLab creates a machine learning model for each project, which is used to generate reviewers. | SaaS only <br> Ultimate tier | Generally Available (GA) |
| Code Suggestions | Helps you write code more efficiently by viewing code suggestions as you type. | code-gecko and code-bison <br> Anthropic's Claude model | SaaS <br> Self-managed <br> All tiers | Beta |
| Vulnerability summary | Helps you remediate vulnerabilities more efficiently, uplevel your skills, and write more secure code. | text-bison <br> Anthropic's Claude model in case of degraded performance | SaaS only <br> Ultimate tier | Beta |
| Code explanation | Helps you understand code by explaining it in natural language. | codechat-bison | SaaS only <br> Ultimate tier | Experiment |
| GitLab Duo Chat | Processes and generates text and code in a conversational manner. Helps you quickly identify useful information in large volumes of text in issues, epics, code, and GitLab documentation. | Anthropic's Claude model <br> OpenAI Embeddings | SaaS only <br> Ultimate tier | Experiment |
| Value stream forecasting | Assists you with predicting productivity metrics and identifying anomalies across your software development lifecycle. | Statistical forecasting | SaaS <br> Self-managed <br> Ultimate tier | Experiment |
| Discussion summary | Assists with quickly getting everyone up to speed on lengthy conversations to help ensure you are all on the same page. | OpenAI's GPT-3 | SaaS only <br> Ultimate tier | Experiment |
| Merge request summary | Efficiently communicate the impact of your merge request changes. | text-bison | SaaS only <br> Ultimate tier | Experiment |
| Code review summary | Helps ease merge request handoff between authors and reviewers and helps reviewers efficiently understand suggestions. | text-bison | SaaS only <br> Ultimate tier | Experiment |
| Merge request template population | Generates a description for the merge request based on the contents of the template. | text-bison | SaaS only <br> Ultimate tier | Experiment |
| Test generation | Automates repetitive tasks and helps catch bugs early. | text-bison | SaaS only <br> Ultimate tier | Experiment |
| Git suggestions | Helps you discover or recall Git commands when and where you need them. | Google Vertex Codey APIs | SaaS only <br> Ultimate tier | Experiment |
| Root cause analysis | Assists you in determining the root cause of a pipeline failure and failed CI/CD build. | Google Vertex Codey APIs | SaaS only <br> Ultimate tier | Experiment |
| Issue description generation | Generates issue descriptions. | OpenAI's GPT-3 | SaaS only <br> Ultimate tier | Experiment |

## Enable AI/ML features

- Third-party AI features
  - All features built on large language models (LLMs) from Google, Anthropic, or OpenAI (besides Code Suggestions) require that this setting is enabled at the group level.
  - Generally Available features are available when third-party AI features are enabled.
  - Third-party AI features are enabled by default.
  - This setting is available to Ultimate groups on SaaS and can be set by a user who has the Owner role in the group.
  - View how to enable this setting.
- Experiment and Beta features
  - All features categorized as Experiment features or Beta features (besides Code Suggestions) require that this setting is enabled at the group level, in addition to the Third-party AI features setting.
  - Their usage is subject to the Testing Terms of Use.
  - Experiment and Beta features are disabled by default.
  - This setting is available to Ultimate groups on SaaS and can be set by a user who has the Owner role in the group.
  - View how to enable this setting.
- Code Suggestions

## Experimental AI features and how to use them

The following subsections describe the experimental AI features in more detail.

### Explain code in the Web UI with Code explanation **(ULTIMATE SAAS EXPERIMENT)**

Introduced in GitLab 15.11 as an Experiment on GitLab.com.

To use this feature, the Experiment features setting must be enabled for your group.

GitLab can help you get up to speed faster if you:

- Spend a lot of time trying to understand pieces of code that others have created, or
- Struggle to understand code written in a language that you are not familiar with.

By using a large language model, GitLab can explain the code in natural language.

To explain your code:

1. On the left sidebar, select **Search or go to** and find your project.
2. Select any file in your project that contains code.
3. On the file, select the lines that you want to have explained.
4. On the left side, select the question mark ({question}). You might have to scroll to the first line of your selection to view it. This sends the selected code, together with a prompt, to the large language model, asking it to provide an explanation.
5. A drawer is displayed on the right side of the page. Wait a moment for the explanation to be generated.
6. Provide feedback about how satisfied you are with the explanation, so we can improve the results.

You can also have code explained in the context of a merge request. To explain code in a merge request:

1. On the left sidebar, select **Search or go to** and find your project.
2. On the left sidebar, select **Code > Merge requests**, then select your merge request.
3. On the secondary menu, select **Changes**.
4. On the file you would like explained, select the three dots ({ellipsis_v}) and select **View File @ $SHA**.

   A separate browser tab opens and shows the full file with the latest changes.

5. On the new tab, select the lines that you want to have explained.
6. On the left side, select the question mark ({question}). You might have to scroll to the first line of your selection to view it. This sends the selected code, together with a prompt, to the large language model, asking it to provide an explanation.
7. A drawer is displayed on the right side of the page. Wait a moment for the explanation to be generated.
8. Provide feedback about how satisfied you are with the explanation, so we can improve the results.

How to use the Explain Code Experiment

We cannot guarantee that the large language model produces results that are correct. Use the explanation with caution.

### Answer questions with GitLab Duo Chat **(ULTIMATE SAAS EXPERIMENT)**

Introduced in GitLab 16.0 as an Experiment.

To use this feature, at least one group you're a member of must have the Experiment features setting enabled.

You can get AI-generated support from GitLab Duo Chat about the following topics:

- How to use GitLab.
- Questions about an issue.
- Summarizing an issue.

Example questions you might ask:

- What is a fork?
- How to reset my password
- Summarize the issue `<link to your issue>`
- Summarize the description of the current issue

The examples above all use data from either the issue or the GitLab documentation. However, you can also ask it to generate code, CI/CD configurations, or to explain code. For example:

- Write a hello world function in Ruby
- Write a tic tac toe game in JavaScript
- Write a `.gitlab-ci.yml` file to test and build a rails application
- Explain the following code: `def sum(a, b) a + b end`
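As an illustration, the Ruby snippet in the last example prompt is itself a complete, runnable method once written out on separate lines (this is just the code being asked about, not the chat's response):

```ruby
# The snippet from the example prompt, written out as a standalone method.
# It returns the sum of its two arguments.
def sum(a, b)
  a + b
end

puts sum(2, 3) # prints 5
```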

You can also ask follow-up questions.

This is an experimental feature and we're continuously extending the capabilities and reliability of the chat.

1. In the lower-left corner, select the **Help** icon. (The new left sidebar must be enabled.)
2. Select **Ask in GitLab Duo Chat**. A drawer opens on the right side of your screen.
3. Enter your question in the chat input box and press **Enter** or select **Send**. It may take a few seconds for the interactive AI chat to produce an answer.
4. You can ask a follow-up question.
5. If you want to ask a new question unrelated to the previous conversation, you may receive better answers if you clear the context by typing `/reset` into the input box and selecting **Send**.

To give feedback about a specific response, use the feedback buttons in the response message. Or, you can add a comment in the feedback issue.

NOTE: Only the last 50 messages are retained in the chat history. The chat history expires 3 days after last use.

### Summarize issue discussions with Discussion summary **(ULTIMATE SAAS EXPERIMENT)**

Introduced in GitLab 16.0 as an Experiment.

To use this feature, the Experiment features setting must be enabled for your group.

You can generate a summary of discussions on an issue:

1. In an issue, scroll to the **Activity** section.
2. Select **View summary**.

The comments in the issue are summarized in as many as 10 list items. The summary is displayed only for you.

Provide feedback on this experimental feature in issue 407779.

Data usage: When you use this feature, the text of public comments on the issue is sent to the large language model referenced above.

### Forecast deployment frequency with Value stream forecasting **(ULTIMATE ALL EXPERIMENT)**

Introduced in GitLab 16.2 as an Experiment.

To use this feature, the Experiment features setting must be enabled.

In CI/CD Analytics, you can view a forecast of deployment frequency:

1. On the left sidebar, select **Search or go to** and find your project.
2. Select **Analyze > CI/CD analytics**.
3. Select the **Deployment frequency** tab.
4. Turn on the **Show forecast** toggle.
5. On the confirmation dialog, select **Accept testing terms**.

The forecast is displayed as a dotted line on the chart. Data is forecasted for a duration that is half of the selected date range. For example, if you select a 30-day range, a forecast for the following 15 days is displayed.
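The forecast horizon described above can be sketched as a simple calculation. This is an illustration of the documented behavior only, not GitLab's actual implementation:

```ruby
# Illustrative only: per the documented behavior, the forecast covers
# half of the selected date range (integer division for even ranges).
def forecast_days(selected_range_days)
  selected_range_days / 2
end

puts forecast_days(30) # a 30-day range yields a 15-day forecast
```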

Forecast deployment frequency

Provide feedback on this experimental feature in issue 416833.

### Root cause analysis **(ULTIMATE SAAS EXPERIMENT)**

Introduced in GitLab 16.2 as an Experiment.

To use this feature, the Experiment features setting must be enabled for your group.

When the feature is available, the **Root cause analysis** button appears on a failed CI/CD job. Selecting this button generates an analysis of the reason for the failure.

### Summarize an issue with Issue description generation **(ULTIMATE SAAS EXPERIMENT)**

Introduced in GitLab 16.3 as an Experiment.

To use this feature, the Experiment features setting must be enabled for your group.

You can generate the description for an issue from a short summary.

1. Create a new issue.
2. Above the **Description** field, select **AI actions > Generate issue description**.
3. Write a short description and select **Submit**.

The issue description is replaced with AI-generated text.

Provide feedback on this experimental feature in issue 409844.

Data usage: When you use this feature, the text you enter is sent to the large language model referenced above.

## Data usage

GitLab AI features leverage generative AI to help increase velocity and aim to help make you more productive. Each feature operates independently of other features and is not required for other features to function.

### Progressive enhancement

These features are designed as a progressive enhancement to existing GitLab features across our DevSecOps platform. They are designed to fail gracefully and should not prevent the core functionality of the underlying feature. Note that each feature is subject to its expected functionality as defined by the relevant feature support policy.

### Stability and performance

These features are in a variety of feature support levels. Due to the nature of these features, there may be high demand for usage which may cause degraded performance or unexpected downtime of the feature. We have built these features to gracefully degrade and have controls in place to allow us to mitigate abuse or misuse. GitLab may disable beta and experimental features for any or all customers at any time at our discretion.

## Third-party services

### Data privacy

Some AI features require the use of third-party AI services, models, and APIs from Google AI and OpenAI. The processing of any personal data is in accordance with our Privacy Statement. You may also visit the Sub-Processors page to see the list of our Sub-Processors that we use to provide these features.

Group owners can control which top-level groups have access to third-party AI features by using the group level third-party AI features setting.

### Model accuracy and quality

Generative AI may produce unexpected results that may be:

- Low quality
- Incoherent or incomplete
- The cause of failed pipelines
- Insecure code
- Offensive or insensitive
- Out-of-date information

GitLab is actively iterating on all our AI-assisted capabilities to improve the quality of the generated content. We improve the quality through prompt engineering, evaluating new AI/ML models to power these features, and through novel heuristics built into these features directly.