Add anchored and smoothed to vector selectors. (#16457)

* Add anchored and smoothed to vector selectors.

This adds "anchored" and "smoothed" keywords that can be used following a matrix selector.

"Anchored" selects the last point before the range (or, if there is none, the first point within the range) and adds it at the boundary of the matrix selector.

"Smoothed" applies linear interpolation at the edges of the range, using the points surrounding each edge. In the absence of a point before or after an edge, the first or last point is added to that edge without interpolation.
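The edge interpolation can be sketched as follows (a simplified, hypothetical helper for illustration; the engine's actual `interpolate` function additionally handles counter resets):

```go
package main

import "fmt"

// FPoint is a simplified (timestamp, value) sample, standing in for the
// promql package's FPoint.
type FPoint struct {
	T int64   // timestamp in milliseconds
	F float64 // sample value
}

// interpolateAt linearly interpolates the value at timestamp t between two
// surrounding samples, mirroring what "smoothed" does at range edges.
func interpolateAt(p1, p2 FPoint, t int64) float64 {
	return p1.F + (p2.F-p1.F)*float64(t-p1.T)/float64(p2.T-p1.T)
}

func main() {
	// Samples at t=0 (value 1) and t=10000 (value 2); a range edge at
	// t=5000 gets the midpoint value.
	fmt.Println(interpolateAt(FPoint{T: 0, F: 1}, FPoint{T: 10000, F: 2}, 5000)) // 1.5
}
```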

*Example usage*

* `increase(caddy_http_requests_total[5m] anchored)` (equivalent to *caddy_http_requests_total - caddy_http_requests_total offset 5m*, but takes counter resets into consideration)
* `rate(caddy_http_requests_total[step()] smoothed)`

Signed-off-by: Julien Pivotto <291750+roidelapluie@users.noreply.github.com>

* Update docs/feature_flags.md

Co-authored-by: Charles Korn <charleskorn@users.noreply.github.com>
Signed-off-by: Julien <291750+roidelapluie@users.noreply.github.com>

* Smoothed/Anchored rate: Add more tests

Signed-off-by: Julien Pivotto <291750+roidelapluie@users.noreply.github.com>

* Anchored/Smoothed modifier: error out with histograms

Signed-off-by: Julien Pivotto <291750+roidelapluie@users.noreply.github.com>

---------

Signed-off-by: Julien Pivotto <291750+roidelapluie@users.noreply.github.com>
Signed-off-by: Julien <291750+roidelapluie@users.noreply.github.com>
Co-authored-by: Charles Korn <charleskorn@users.noreply.github.com>
This commit is contained in:
Julien, 2025-09-25 11:34:59 +02:00, committed by GitHub
commit 4199c2f45a (parent 55a4782eb7)
GPG Key ID: B5690EEEBB952194 (no known key found for this signature in database)
17 changed files with 1715 additions and 615 deletions


@ -275,6 +275,9 @@ func (c *flagConfig) setFeatureListOptions(logger *slog.Logger) error {
case "promql-delayed-name-removal":
c.promqlEnableDelayedNameRemoval = true
logger.Info("Experimental PromQL delayed name removal enabled.")
case "promql-extended-range-selectors":
parser.EnableExtendedRangeSelectors = true
logger.Info("Experimental PromQL extended range selectors enabled.")
case "":
continue
case "old-ui":
@ -561,7 +564,7 @@ func main() {
a.Flag("scrape.discovery-reload-interval", "Interval used by scrape manager to throttle target groups updates.").
Hidden().Default("5s").SetValue(&cfg.scrape.DiscoveryReloadInterval)
a.Flag("enable-feature", "Comma separated feature names to enable. Valid options: exemplar-storage, expand-external-labels, memory-snapshot-on-shutdown, promql-per-step-stats, promql-experimental-functions, extra-scrape-metrics, auto-gomaxprocs, native-histograms, created-timestamp-zero-ingestion, concurrent-rule-eval, delayed-compaction, old-ui, otlp-deltatocumulative, promql-duration-expr, use-uncached-io. See https://prometheus.io/docs/prometheus/latest/feature_flags/ for more details.").
a.Flag("enable-feature", "Comma separated feature names to enable. Valid options: exemplar-storage, expand-external-labels, memory-snapshot-on-shutdown, promql-per-step-stats, promql-experimental-functions, extra-scrape-metrics, auto-gomaxprocs, native-histograms, created-timestamp-zero-ingestion, concurrent-rule-eval, delayed-compaction, old-ui, otlp-deltatocumulative, promql-duration-expr, use-uncached-io, promql-extended-range-selectors. See https://prometheus.io/docs/prometheus/latest/feature_flags/ for more details.").
Default("").StringsVar(&cfg.featureList)
a.Flag("agent", "Run Prometheus in 'Agent mode'.").BoolVar(&agentMode)


@ -58,7 +58,7 @@ The Prometheus monitoring server
| <code class="text-nowrap">--query.timeout</code> | Maximum time a query may take before being aborted. Use with server mode only. | `2m` |
| <code class="text-nowrap">--query.max-concurrency</code> | Maximum number of queries executed concurrently. Use with server mode only. | `20` |
| <code class="text-nowrap">--query.max-samples</code> | Maximum number of samples a single query can load into memory. Note that queries will fail if they try to load more samples than this into memory, so this also limits the number of samples a query can return. Use with server mode only. | `50000000` |
| <code class="text-nowrap">--enable-feature</code> <code class="text-nowrap">...</code> | Comma separated feature names to enable. Valid options: exemplar-storage, expand-external-labels, memory-snapshot-on-shutdown, promql-per-step-stats, promql-experimental-functions, extra-scrape-metrics, auto-gomaxprocs, native-histograms, created-timestamp-zero-ingestion, concurrent-rule-eval, delayed-compaction, old-ui, otlp-deltatocumulative, promql-duration-expr, use-uncached-io. See https://prometheus.io/docs/prometheus/latest/feature_flags/ for more details. | |
| <code class="text-nowrap">--enable-feature</code> <code class="text-nowrap">...</code> | Comma separated feature names to enable. Valid options: exemplar-storage, expand-external-labels, memory-snapshot-on-shutdown, promql-per-step-stats, promql-experimental-functions, extra-scrape-metrics, auto-gomaxprocs, native-histograms, created-timestamp-zero-ingestion, concurrent-rule-eval, delayed-compaction, old-ui, otlp-deltatocumulative, promql-duration-expr, use-uncached-io, promql-extended-range-selectors. See https://prometheus.io/docs/prometheus/latest/feature_flags/ for more details. | |
| <code class="text-nowrap">--agent</code> | Run Prometheus in 'Agent mode'. | |
| <code class="text-nowrap">--log.level</code> | Only log messages with the given severity or above. One of: [debug, info, warn, error] | `info` |
| <code class="text-nowrap">--log.format</code> | Output format of log messages. One of: [logfmt, json] | `logfmt` |


@ -302,3 +302,42 @@ memory in response to misleading cache growth.
This is currently implemented using direct I/O.
For more details, see the [proposal](https://github.com/prometheus/proposals/pull/45).
## Extended Range Selectors

`--enable-feature=promql-extended-range-selectors`

Enables experimental `anchored` and `smoothed` modifiers for PromQL range and instant selectors. These modifiers provide more control over how range boundaries are handled in functions like `rate` and `increase`, especially with missing or irregular data.

Native histograms are not yet supported by the extended range selectors.
### `anchored`

Uses the most recent sample (within the lookback delta) at the beginning of the range, or alternatively the first sample within the range if there is no sample within the lookback delta. The last sample within the range is also used at the end of the range. No extrapolation or interpolation is applied, so this is useful to get the direct difference between sample values.

Anchored range selectors work with: `resets`, `changes`, `rate`, `increase`, and `delta`.

Example query:

`increase(http_requests_total[5m] anchored)`

**Note**: When using the anchored modifier with the `increase` function, the results returned are integers.
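The boundary lookup behind `anchored` can be illustrated with a minimal sketch (simplified and hypothetical; the engine's own helper additionally accounts for the lookback delta and selector offset):

```go
package main

import (
	"fmt"
	"sort"
)

// FPoint is a simplified (timestamp, value) sample.
type FPoint struct {
	T int64
	F float64
}

// anchorLeft returns the index of the last sample at or before rangeStart,
// or 0 if every sample lies after it — the sample "anchored" places at the
// start boundary of the range.
func anchorLeft(floats []FPoint, rangeStart int64) int {
	i := sort.Search(len(floats), func(i int) bool { return floats[i].T > rangeStart })
	if i == 0 {
		return 0
	}
	return i - 1
}

func main() {
	samples := []FPoint{{T: 0, F: 1}, {T: 10000, F: 2}, {T: 20000, F: 3}}
	// Range starts at t=15000: anchored uses the sample at t=10000.
	fmt.Println(anchorLeft(samples, 15000)) // 1
}
```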
### `smoothed`

In range selectors, linearly interpolates values at the range boundaries, using the sample values before and after the boundaries for an improved estimation that is robust against irregular scrapes and missing samples. Note, however, that it requires a sample after the evaluation interval to work properly; see the note below.

For instant selectors, values are linearly interpolated at the evaluation timestamp using the samples immediately before and after that point.

Smoothed range selectors work with: `rate`, `increase`, and `delta`.

Example query:

`rate(http_requests_total[step()] smoothed)`

> **Note for alerting and recording rules:**
> The `smoothed` modifier requires samples after the evaluation interval, so using it directly in alerting or recording rules will typically *underestimate* the result, as future samples are not available at evaluation time.
> To use `smoothed` safely in rules, you **must** apply a `query_offset` to the rule group (see [documentation](https://prometheus.io/docs/prometheus/latest/configuration/recording_rules/#rule_group)) to ensure the calculation window is fully in the past and all needed samples are available.
> For critical alerting, set the offset to at least one scrape interval; for less critical or more resilient use cases, consider a larger offset (multiple scrape intervals) to tolerate missed scrapes.

For more details, see the [design doc](https://github.com/prometheus/proposals/blob/main/proposals/2025-04-04_extended-range-selectors-semantics.md).

**Note**: Extended Range Selectors are not supported for subqueries.
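Why anchored `increase` is not just a subtraction of the two boundary values: counter resets inside the range are corrected for. A simplified, hypothetical sketch of that correction (the engine's version works on full sample points rather than bare values):

```go
package main

import "fmt"

// extendedIncrease computes right-left plus a correction for counter resets
// among the samples strictly inside the range: whenever a value drops below
// its predecessor, the counter is assumed to have restarted, so the value
// reached before the reset is added back.
func extendedIncrease(left, right float64, inner []float64) float64 {
	correction := 0.0
	prev := left
	for _, v := range inner {
		if v < prev { // counter reset inside the range
			correction += prev
		}
		prev = v
	}
	if right < prev { // reset between the last inner sample and the right edge
		correction += prev
	}
	return right - left + correction
}

func main() {
	// Counter climbs 1 -> 3 -> 5, resets, then reaches 3 again: the total
	// increase is (5-1) + 3 = 7, while right-left alone would give 3-1 = 2.
	fmt.Println(extendedIncrease(1, 3, []float64{3, 5})) // 7
}
```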


@ -117,6 +117,18 @@ func rangeQueryCases() []benchCase {
expr: "rate(sparse[1m])",
steps: 10000,
},
// Smoothed rate.
{
expr: "rate(a_X[1m] smoothed)",
},
{
expr: "rate(a_X[1m] smoothed)",
steps: 10000,
},
{
expr: "rate(sparse[1m] smoothed)",
steps: 10000,
},
// Holt-Winters and long ranges.
{
expr: "double_exponential_smoothing(a_X[1d], 0.3, 0.3)",
@ -266,6 +278,10 @@ func rangeQueryCases() []benchCase {
}
func BenchmarkRangeQuery(b *testing.B) {
parser.EnableExtendedRangeSelectors = true
b.Cleanup(func() {
parser.EnableExtendedRangeSelectors = false
})
stor := teststorage.New(b)
stor.DisableCompactions() // Don't want auto-compaction disrupting timings.
defer stor.Close()


@ -21,6 +21,7 @@ import (
"fmt"
"io"
"log/slog"
"maps"
"math"
"reflect"
"runtime"
@ -926,14 +927,28 @@ func getTimeRangesForSelector(s *parser.EvalStmt, n *parser.VectorSelector, path
// because we want to exclude samples that are precisely the
// lookback delta before the eval time.
start -= durationMilliseconds(s.LookbackDelta) - 1
if n.Smoothed {
end += durationMilliseconds(s.LookbackDelta)
}
} else {
// For all matrix queries we want to ensure that we have
// (end-start) + range selected this way we have `range` data
// before the start time. We subtract one from the range to
// exclude samples positioned directly at the lower boundary of
// the range.
// For matrix queries, adjust the start and end times to ensure the
// correct range of data is selected. For "anchored" selectors, extend
// the start time backwards by the lookback delta plus the evaluation
// range. For "smoothed" selectors, extend both the start and end times
// by the lookback delta, and also extend the start time by the
// evaluation range to cover the smoothing window. For standard range
// queries, extend the start time backwards by the range (minus one
// millisecond) to exclude samples exactly at the lower boundary.
switch {
case n.Anchored:
start -= durationMilliseconds(s.LookbackDelta+evalRange) - 1
case n.Smoothed:
start -= durationMilliseconds(s.LookbackDelta+evalRange) - 1
end += durationMilliseconds(s.LookbackDelta)
default:
start -= durationMilliseconds(evalRange) - 1
}
}
offsetMilliseconds := durationMilliseconds(n.OriginalOffset)
start -= offsetMilliseconds
@ -979,7 +994,6 @@ func (ng *Engine) populateSeries(ctx context.Context, querier storage.Querier, s
evalRange = 0
hints.By, hints.Grouping = extractGroupsFromPath(path)
n.UnexpandedSeriesSet = querier.Select(ctx, false, hints, n.LabelMatchers...)
case *parser.MatrixSelector:
evalRange = n.Range
}
@ -1524,6 +1538,76 @@ func (ev *evaluator) rangeEvalAgg(ctx context.Context, aggExpr *parser.Aggregate
return result, annos
}
// smoothSeries is a helper function that smooths the series by interpolating the values
// based on values before and after the timestamp.
func (ev *evaluator) smoothSeries(series []storage.Series, offset time.Duration) Matrix {
dur := ev.endTimestamp - ev.startTimestamp
it := storage.NewBuffer(dur + 2*durationMilliseconds(ev.lookbackDelta))
offMS := offset.Milliseconds()
start := ev.startTimestamp - offMS
end := ev.endTimestamp - offMS
step := ev.interval
lb := durationMilliseconds(ev.lookbackDelta)
var chkIter chunkenc.Iterator
mat := make(Matrix, 0, len(series))
for _, s := range series {
ss := Series{Metric: s.Labels()}
chkIter = s.Iterator(chkIter)
it.Reset(chkIter)
var floats []FPoint
var hists []HPoint
for ts := start; ts <= end; ts += step {
matrixStart := ts - lb
matrixEnd := ts + lb
floats, hists = ev.matrixIterSlice(it, matrixStart, matrixEnd, floats, hists)
if len(floats) == 0 && len(hists) == 0 {
continue
}
if len(hists) > 0 {
// TODO: support native histograms.
ev.errorf("smoothed and anchored modifiers do not work with native histograms")
}
// Binary search for the first index with T >= ts.
i := sort.Search(len(floats), func(i int) bool { return floats[i].T >= ts })
switch {
case i < len(floats) && floats[i].T == ts:
// Exact match.
ss.Floats = append(ss.Floats, floats[i])
case i > 0 && i < len(floats):
// Interpolate between prev and next.
// TODO: detect if the sample is a counter, based on __type__ or metadata.
prev, next := floats[i-1], floats[i]
val := interpolate(prev, next, ts, false, false)
ss.Floats = append(ss.Floats, FPoint{F: val, T: ts})
case i > 0:
// No next point yet; carry forward previous value.
prev := floats[i-1]
ss.Floats = append(ss.Floats, FPoint{F: prev.F, T: ts})
default:
// i == 0 and floats[0].T > ts: there is no previous data yet; skip.
}
}
mat = append(mat, ss)
}
return mat
}
// evalSeries generates a Matrix between ev.startTimestamp and ev.endTimestamp (inclusive), each point spaced ev.interval apart, from series given offset.
// For every storage.Series iterator in series, the method iterates in ev.interval sized steps from ev.startTimestamp until and including ev.endTimestamp,
// collecting every corresponding sample (obtained via ev.vectorSelectorSingle) into a Series.
@ -1784,6 +1868,17 @@ func (ev *evaluator) eval(ctx context.Context, expr parser.Expr) (parser.Value,
sel := arg.(*parser.MatrixSelector)
selVS := sel.VectorSelector.(*parser.VectorSelector)
switch {
case selVS.Anchored:
if _, ok := AnchoredSafeFunctions[e.Func.Name]; !ok {
ev.errorf("anchored modifier can only be used with: %s - not with %s", strings.Join(slices.Sorted(maps.Keys(AnchoredSafeFunctions)), ", "), e.Func.Name)
}
case selVS.Smoothed:
if _, ok := SmoothedSafeFunctions[e.Func.Name]; !ok {
ev.errorf("smoothed modifier can only be used with: %s - not with %s", strings.Join(slices.Sorted(maps.Keys(SmoothedSafeFunctions)), ", "), e.Func.Name)
}
}
ws, err := checkAndExpandSeriesSet(ctx, sel)
warnings.Merge(ws)
if err != nil {
@ -1792,7 +1887,17 @@ func (ev *evaluator) eval(ctx context.Context, expr parser.Expr) (parser.Value,
mat := make(Matrix, 0, len(selVS.Series)) // Output matrix.
offset := durationMilliseconds(selVS.Offset)
selRange := durationMilliseconds(sel.Range)
stepRange := min(selRange, ev.interval)
var stepRange int64
switch {
case selVS.Anchored:
stepRange = min(selRange+durationMilliseconds(ev.lookbackDelta), ev.interval)
case selVS.Smoothed:
stepRange = min(selRange+durationMilliseconds(2*ev.lookbackDelta), ev.interval)
default:
stepRange = min(selRange, ev.interval)
}
// Reuse objects across steps to save memory allocations.
var floats []FPoint
var histograms []HPoint
@ -1800,7 +1905,18 @@ func (ev *evaluator) eval(ctx context.Context, expr parser.Expr) (parser.Value,
inMatrix := make(Matrix, 1)
enh := &EvalNodeHelper{Out: make(Vector, 0, 1), enableDelayedNameRemoval: ev.enableDelayedNameRemoval}
// Process all the calls for one time series at a time.
it := storage.NewBuffer(selRange)
// For anchored and smoothed selectors, we need to iterate over a
// larger range than the query range to account for the lookback delta.
// For standard range queries, we iterate over the query range.
bufferRange := selRange
switch {
case selVS.Anchored:
bufferRange += durationMilliseconds(ev.lookbackDelta)
case selVS.Smoothed:
bufferRange += durationMilliseconds(2 * ev.lookbackDelta)
}
it := storage.NewBuffer(bufferRange)
var chkIter chunkenc.Iterator
// The last_over_time and first_over_time functions act like
@ -1849,11 +1965,24 @@ func (ev *evaluator) eval(ctx context.Context, expr parser.Expr) (parser.Value,
if ts == ev.startTimestamp || selVS.Timestamp == nil {
maxt := ts - offset
mint := maxt - selRange
switch {
case selVS.Anchored:
mint -= durationMilliseconds(ev.lookbackDelta)
case selVS.Smoothed:
mint -= durationMilliseconds(ev.lookbackDelta)
maxt += durationMilliseconds(ev.lookbackDelta)
}
floats, histograms = ev.matrixIterSlice(it, mint, maxt, floats, histograms)
}
if len(floats)+len(histograms) == 0 {
continue
}
if selVS.Anchored || selVS.Smoothed {
if len(histograms) > 0 {
// TODO: support native histograms.
ev.errorf("smoothed and anchored modifiers do not work with native histograms")
}
}
inMatrix[0].Floats = floats
inMatrix[0].Histograms = histograms
enh.Ts = ts
@ -2052,6 +2181,10 @@ func (ev *evaluator) eval(ctx context.Context, expr parser.Expr) (parser.Value,
if err != nil {
ev.error(errWithWarnings{fmt.Errorf("expanding series: %w", err), ws})
}
if e.Smoothed {
mat := ev.smoothSeries(e.Series, e.Offset)
return mat, ws
}
mat := ev.evalSeries(ctx, e.Series, e.Offset, false)
return mat, ws
@ -2348,10 +2481,23 @@ func (ev *evaluator) matrixSelector(ctx context.Context, node *parser.MatrixSele
offset = durationMilliseconds(vs.Offset)
maxt = ev.startTimestamp - offset
mint = maxt - durationMilliseconds(node.Range)
// matrixMint keeps the original mint for smoothed and anchored selectors.
matrixMint = mint
// matrixMaxt keeps the original maxt for smoothed and anchored selectors.
matrixMaxt = maxt
matrix = make(Matrix, 0, len(vs.Series))
it = storage.NewBuffer(durationMilliseconds(node.Range))
bufferRange = durationMilliseconds(node.Range)
)
switch {
case vs.Anchored:
bufferRange += durationMilliseconds(ev.lookbackDelta)
mint -= durationMilliseconds(ev.lookbackDelta)
case vs.Smoothed:
bufferRange += 2 * durationMilliseconds(ev.lookbackDelta)
mint -= durationMilliseconds(ev.lookbackDelta)
maxt += durationMilliseconds(ev.lookbackDelta)
}
it := storage.NewBuffer(bufferRange)
ws, err := checkAndExpandSeriesSet(ctx, node)
if err != nil {
ev.error(errWithWarnings{fmt.Errorf("expanding series: %w", err), ws})
@ -2370,6 +2516,18 @@ func (ev *evaluator) matrixSelector(ctx context.Context, node *parser.MatrixSele
}
ss.Floats, ss.Histograms = ev.matrixIterSlice(it, mint, maxt, nil, nil)
switch {
case vs.Anchored:
if ss.Histograms != nil {
ev.errorf("anchored modifier is not supported with histograms")
}
ss.Floats = extendFloats(ss.Floats, matrixMint, matrixMaxt, false)
case vs.Smoothed:
if ss.Histograms != nil {
ev.errorf("smoothed modifier is not supported with histograms")
}
ss.Floats = extendFloats(ss.Floats, matrixMint, matrixMaxt, true)
}
totalSize := int64(len(ss.Floats)) + int64(totalHPointSize(ss.Histograms))
ev.samplesStats.IncrementSamplesAtTimestamp(ev.startTimestamp, totalSize)
@ -4035,3 +4193,39 @@ func (ev *evaluator) gatherVector(ts int64, input Matrix, output Vector, bufHelp
return output, bufHelpers
}
// extendFloats extends the floats to the given mint and maxt.
// This function is used with matrix selectors that are smoothed or anchored.
func extendFloats(floats []FPoint, mint, maxt int64, smoothed bool) []FPoint {
lastSampleIndex := len(floats) - 1
firstSampleIndex := max(0, sort.Search(lastSampleIndex, func(i int) bool { return floats[i].T > mint })-1)
if smoothed {
lastSampleIndex = sort.Search(lastSampleIndex, func(i int) bool { return floats[i].T >= maxt })
}
if floats[lastSampleIndex].T <= mint {
return []FPoint{}
}
// TODO: detect if the sample is a counter, based on __type__ or metadata.
left := pickOrInterpolateLeft(floats, firstSampleIndex, mint, smoothed, false)
right := pickOrInterpolateRight(floats, lastSampleIndex, maxt, smoothed, false)
// Filter out samples at boundaries or outside the range.
if floats[firstSampleIndex].T <= mint {
firstSampleIndex++
}
if floats[lastSampleIndex].T >= maxt {
lastSampleIndex--
}
// TODO: Preallocate the length of the new list.
out := make([]FPoint, 0)
// Create the new floats list with the boundary samples and the inner samples.
out = append(out, FPoint{T: mint, F: left})
out = append(out, floats[firstSampleIndex:lastSampleIndex+1]...)
out = append(out, FPoint{T: maxt, F: right})
return out
}


@ -1513,6 +1513,160 @@ load 10s
}
}
func TestExtendedRangeSelectors(t *testing.T) {
parser.EnableExtendedRangeSelectors = true
t.Cleanup(func() {
parser.EnableExtendedRangeSelectors = false
})
engine := newTestEngine(t)
storage := promqltest.LoadedStorage(t, `
load 10s
metric 1+1x10
withreset 1+1x4 1+1x5
notregular 0 5 100 2 8
`)
t.Cleanup(func() { storage.Close() })
tc := []struct {
query string
t time.Time
expected promql.Matrix
}{
{
query: "metric[10s] smoothed",
t: time.Unix(10, 0),
expected: promql.Matrix{
promql.Series{
Floats: []promql.FPoint{{F: 1, T: 0}, {F: 2, T: 10000}},
Metric: labels.FromStrings("__name__", "metric"),
},
},
},
{
query: "metric[10s] smoothed",
t: time.Unix(15, 0),
expected: promql.Matrix{
promql.Series{
Floats: []promql.FPoint{{F: 1.5, T: 5000}, {F: 2, T: 10000}, {F: 2.5, T: 15000}},
Metric: labels.FromStrings("__name__", "metric"),
},
},
},
{
query: "metric[10s] smoothed",
t: time.Unix(5, 0),
expected: promql.Matrix{
promql.Series{
Floats: []promql.FPoint{{F: 1, T: -5000}, {F: 1, T: 0}, {F: 1.5, T: 5000}},
Metric: labels.FromStrings("__name__", "metric"),
},
},
},
{
query: "metric[10s] smoothed",
t: time.Unix(105, 0),
expected: promql.Matrix{
promql.Series{
Floats: []promql.FPoint{{F: 10.5, T: 95000}, {F: 11, T: 100000}, {F: 11, T: 105000}},
Metric: labels.FromStrings("__name__", "metric"),
},
},
},
{
query: "withreset[10s] smoothed",
t: time.Unix(45, 0),
expected: promql.Matrix{
promql.Series{
Floats: []promql.FPoint{{F: 4.5, T: 35000}, {F: 5, T: 40000}, {F: 3, T: 45000}},
Metric: labels.FromStrings("__name__", "withreset"),
},
},
},
{
query: "metric[10s] anchored",
t: time.Unix(10, 0),
expected: promql.Matrix{
promql.Series{
Floats: []promql.FPoint{{F: 1, T: 0}, {F: 2, T: 10000}},
Metric: labels.FromStrings("__name__", "metric"),
},
},
},
{
query: "metric[10s] anchored",
t: time.Unix(15, 0),
expected: promql.Matrix{
promql.Series{
Floats: []promql.FPoint{{F: 1, T: 5000}, {F: 2, T: 10000}, {F: 2, T: 15000}},
Metric: labels.FromStrings("__name__", "metric"),
},
},
},
{
query: "metric[10s] anchored",
t: time.Unix(5, 0),
expected: promql.Matrix{
promql.Series{
Floats: []promql.FPoint{{F: 1, T: -5000}, {F: 1, T: 0}, {F: 1, T: 5000}},
Metric: labels.FromStrings("__name__", "metric"),
},
},
},
{
query: "metric[10s] anchored",
t: time.Unix(105, 0),
expected: promql.Matrix{
promql.Series{
Floats: []promql.FPoint{{F: 10, T: 95000}, {F: 11, T: 100000}, {F: 11, T: 105000}},
Metric: labels.FromStrings("__name__", "metric"),
},
},
},
{
query: "withreset[10s] anchored",
t: time.Unix(45, 0),
expected: promql.Matrix{
promql.Series{
Floats: []promql.FPoint{{F: 4, T: 35000}, {F: 5, T: 40000}, {F: 5, T: 45000}},
Metric: labels.FromStrings("__name__", "withreset"),
},
},
},
{
query: "notregular[20s] smoothed",
t: time.Unix(30, 0),
expected: promql.Matrix{
promql.Series{
Floats: []promql.FPoint{{F: 5, T: 10000}, {F: 100, T: 20000}, {F: 2, T: 30000}},
Metric: labels.FromStrings("__name__", "notregular"),
},
},
},
{
query: "notregular[20s] anchored",
t: time.Unix(30, 0),
expected: promql.Matrix{
promql.Series{
Floats: []promql.FPoint{{F: 5, T: 10000}, {F: 100, T: 20000}, {F: 2, T: 30000}},
Metric: labels.FromStrings("__name__", "notregular"),
},
},
},
}
for _, tc := range tc {
t.Run(tc.query, func(t *testing.T) {
engine = promqltest.NewTestEngine(t, false, 0, 100)
qry, err := engine.NewInstantQuery(context.Background(), storage, nil, tc.query, tc.t)
require.NoError(t, err)
res := qry.Exec(context.Background())
require.NoError(t, res.Err)
require.Equal(t, tc.expected, res.Value)
})
}
}
func TestAtModifier(t *testing.T) {
engine := newTestEngine(t)
storage := promqltest.LoadedStorage(t, `


@ -65,13 +65,127 @@ func funcTime(_ []Vector, _ Matrix, _ parser.Expressions, enh *EvalNodeHelper) (
}}, nil
}
// pickOrInterpolateLeft returns the value at the left boundary of the range.
// If interpolation is needed (when smoothed is true and the first sample is before the range start),
// it returns the interpolated value at the left boundary; otherwise, it returns the first sample's value.
func pickOrInterpolateLeft(floats []FPoint, first int, rangeStart int64, smoothed, isCounter bool) float64 {
if smoothed && floats[first].T < rangeStart {
return interpolate(floats[first], floats[first+1], rangeStart, isCounter, true)
}
return floats[first].F
}
// pickOrInterpolateRight returns the value at the right boundary of the range.
// If interpolation is needed (when smoothed is true and the last sample is after the range end),
// it returns the interpolated value at the right boundary; otherwise, it returns the last sample's value.
func pickOrInterpolateRight(floats []FPoint, last int, rangeEnd int64, smoothed, isCounter bool) float64 {
if smoothed && last > 0 && floats[last].T > rangeEnd {
return interpolate(floats[last-1], floats[last], rangeEnd, isCounter, false)
}
return floats[last].F
}
// interpolate performs linear interpolation between two points.
// If isCounter is true and there is a counter reset:
// - on the left edge, it sets the value to 0.
// - on the right edge, it adds the left value to the right value.
// It then calculates the interpolated value at the given timestamp.
func interpolate(p1, p2 FPoint, t int64, isCounter, leftEdge bool) float64 {
y1 := p1.F
y2 := p2.F
if isCounter && y2 < y1 {
if leftEdge {
y1 = 0
} else {
y2 += y1
}
}
return y1 + (y2-y1)*float64(t-p1.T)/float64(p2.T-p1.T)
}
// correctForCounterResets calculates the correction for counter resets.
// This function is only used for extendedRate functions with smoothed or anchored rates.
func correctForCounterResets(left, right float64, points []FPoint) float64 {
var correction float64
prev := left
for _, p := range points {
if p.F < prev {
correction += prev
}
prev = p.F
}
if right < prev {
correction += prev
}
return correction
}
// extendedRate is a utility function for anchored/smoothed rate/increase/delta.
// It calculates the rate (allowing for counter resets if isCounter is true),
// picks or interpolates the first/last sample at the range boundaries as
// needed, and returns the result as either per-second (if isRate is true) or
// overall.
func extendedRate(vals Matrix, args parser.Expressions, enh *EvalNodeHelper, isCounter, isRate bool) (Vector, annotations.Annotations) {
var (
ms = args[0].(*parser.MatrixSelector)
vs = ms.VectorSelector.(*parser.VectorSelector)
samples = vals[0]
f = samples.Floats
lastSampleIndex = len(f) - 1
rangeStart = enh.Ts - durationMilliseconds(ms.Range+vs.Offset)
rangeEnd = enh.Ts - durationMilliseconds(vs.Offset)
annos annotations.Annotations
smoothed = vs.Smoothed
)
firstSampleIndex := max(0, sort.Search(lastSampleIndex, func(i int) bool { return f[i].T > rangeStart })-1)
if smoothed {
lastSampleIndex = sort.Search(lastSampleIndex, func(i int) bool { return f[i].T >= rangeEnd })
}
if f[lastSampleIndex].T <= rangeStart {
return enh.Out, annos
}
left := pickOrInterpolateLeft(f, firstSampleIndex, rangeStart, smoothed, isCounter)
right := pickOrInterpolateRight(f, lastSampleIndex, rangeEnd, smoothed, isCounter)
resultFloat := right - left
if isCounter {
// We only need to consider samples exactly within the range
// for counter resets correction, as pickOrInterpolateLeft and
// pickOrInterpolateRight already handle the resets at boundaries.
if f[firstSampleIndex].T <= rangeStart {
firstSampleIndex++
}
if f[lastSampleIndex].T >= rangeEnd {
lastSampleIndex--
}
resultFloat += correctForCounterResets(left, right, f[firstSampleIndex:lastSampleIndex+1])
}
if isRate {
resultFloat /= ms.Range.Seconds()
}
return append(enh.Out, Sample{F: resultFloat}), annos
}
// extrapolatedRate is a utility function for rate/increase/delta.
// It calculates the rate (allowing for counter resets if isCounter is true),
// extrapolates if the first/last sample is close to the boundary, and returns
// the result as either per-second (if isRate is true) or overall.
//
// Note: If the vector selector is smoothed or anchored, it will use the
// extendedRate function instead.
func extrapolatedRate(vals Matrix, args parser.Expressions, enh *EvalNodeHelper, isCounter, isRate bool) (Vector, annotations.Annotations) {
ms := args[0].(*parser.MatrixSelector)
vs := ms.VectorSelector.(*parser.VectorSelector)
if vs.Anchored || vs.Smoothed {
return extendedRate(vals, args, enh, isCounter, isRate)
}
var (
samples = vals[0]
rangeStart = enh.Ts - durationMilliseconds(ms.Range+vs.Offset)
@ -1548,8 +1662,21 @@ func funcHistogramQuantile(vectorVals []Vector, _ Matrix, args parser.Expression
return enh.Out, annos
}
// pickFirstSampleIndex returns the index of the last sample before
// or at the range start, or 0 if none exist before the range start.
// If the vector selector is not anchored, it always returns 0.
func pickFirstSampleIndex(floats []FPoint, args parser.Expressions, enh *EvalNodeHelper) int {
ms := args[0].(*parser.MatrixSelector)
vs := ms.VectorSelector.(*parser.VectorSelector)
if !vs.Anchored {
return 0
}
rangeStart := enh.Ts - durationMilliseconds(ms.Range+vs.Offset)
return max(0, sort.Search(len(floats)-1, func(i int) bool { return floats[i].T > rangeStart })-1)
}
// === resets(Matrix parser.ValueTypeMatrix) (Vector, Annotations) ===
func funcResets(_ []Vector, matrixVal Matrix, _ parser.Expressions, enh *EvalNodeHelper) (Vector, annotations.Annotations) {
func funcResets(_ []Vector, matrixVal Matrix, args parser.Expressions, enh *EvalNodeHelper) (Vector, annotations.Annotations) {
floats := matrixVal[0].Floats
histograms := matrixVal[0].Histograms
resets := 0
@ -1558,7 +1685,8 @@ func funcResets(_ []Vector, matrixVal Matrix, _ parser.Expressions, enh *EvalNod
}
var prevSample, curSample Sample
for iFloat, iHistogram := 0, 0; iFloat < len(floats) || iHistogram < len(histograms); {
firstSampleIndex := pickFirstSampleIndex(floats, args, enh)
for iFloat, iHistogram := firstSampleIndex, 0; iFloat < len(floats) || iHistogram < len(histograms); {
switch {
// Process a float sample if no histogram sample remains or its timestamp is earlier.
// Process a histogram sample if no float sample remains or its timestamp is earlier.
@ -1571,7 +1699,7 @@ func funcResets(_ []Vector, matrixVal Matrix, _ parser.Expressions, enh *EvalNod
iHistogram++
}
// Skip the comparison for the first sample, just initialize prevSample.
if iFloat+iHistogram == 1 {
if iFloat+iHistogram == 1+firstSampleIndex {
prevSample = curSample
continue
}
@ -1594,7 +1722,7 @@ func funcResets(_ []Vector, matrixVal Matrix, _ parser.Expressions, enh *EvalNod
}
// === changes(Matrix parser.ValueTypeMatrix) (Vector, Annotations) ===
func funcChanges(_ []Vector, matrixVal Matrix, _ parser.Expressions, enh *EvalNodeHelper) (Vector, annotations.Annotations) {
func funcChanges(_ []Vector, matrixVal Matrix, args parser.Expressions, enh *EvalNodeHelper) (Vector, annotations.Annotations) {
floats := matrixVal[0].Floats
histograms := matrixVal[0].Histograms
changes := 0
@ -1603,7 +1731,8 @@ func funcChanges(_ []Vector, matrixVal Matrix, _ parser.Expressions, enh *EvalNo
}
var prevSample, curSample Sample
for iFloat, iHistogram := 0, 0; iFloat < len(floats) || iHistogram < len(histograms); {
firstSampleIndex := pickFirstSampleIndex(floats, args, enh)
for iFloat, iHistogram := firstSampleIndex, 0; iFloat < len(floats) || iHistogram < len(histograms); {
switch {
// Process a float sample if no histogram sample remains or its timestamp is earlier.
// Process a histogram sample if no float sample remains or its timestamp is earlier.
@ -1616,7 +1745,7 @@ func funcChanges(_ []Vector, matrixVal Matrix, _ parser.Expressions, enh *EvalNo
iHistogram++
}
// Skip the comparison for the first sample, just initialize prevSample.
if iFloat+iHistogram == 1 {
if iFloat+iHistogram == 1+firstSampleIndex {
prevSample = curSample
continue
}
@ -1920,6 +2049,26 @@ var AtModifierUnsafeFunctions = map[string]struct{}{
"timestamp": {},
}
// AnchoredSafeFunctions are the functions that can be used with the anchored
// modifier. The anchored modifier returns matrices with samples outside the
// range boundaries, so not every function can be used with it.
var AnchoredSafeFunctions = map[string]struct{}{
"resets": {},
"changes": {},
"rate": {},
"increase": {},
"delta": {},
}
// SmoothedSafeFunctions are the functions that can be used with the smoothed
// modifier. Smoothed modifier returns matrices with samples outside of the
// boundaries, so not every function can be used with it.
var SmoothedSafeFunctions = map[string]struct{}{
"rate": {},
"increase": {},
"delta": {},
}
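The two maps above gate which functions may receive an anchored or smoothed range. A minimal standalone sketch of how such a gate could be applied (the function `checkExtendedRangeModifier` and its wiring are illustrative, not the engine's actual validation path, though the error wording matches the messages asserted in the new test file):

```go
package main

import (
	"fmt"
	"sort"
	"strings"
)

// Mirrors AnchoredSafeFunctions and SmoothedSafeFunctions from the diff.
var anchoredSafe = map[string]struct{}{
	"resets": {}, "changes": {}, "rate": {}, "increase": {}, "delta": {},
}
var smoothedSafe = map[string]struct{}{
	"rate": {}, "increase": {}, "delta": {},
}

// checkExtendedRangeModifier is a hypothetical gate: it rejects a function
// call whose range selector carries a modifier the function cannot handle.
func checkExtendedRangeModifier(fn, modifier string) error {
	safe := anchoredSafe
	if modifier == "smoothed" {
		safe = smoothedSafe
	}
	if _, ok := safe[fn]; ok {
		return nil
	}
	names := make([]string, 0, len(safe))
	for n := range safe {
		names = append(names, n)
	}
	sort.Strings(names)
	return fmt.Errorf("%s modifier can only be used with: %s - not with %s",
		modifier, strings.Join(names, ", "), fn)
}

func main() {
	fmt.Println(checkExtendedRangeModifier("rate", "smoothed"))  // <nil>
	fmt.Println(checkExtendedRangeModifier("deriv", "anchored"))
	// anchored modifier can only be used with: changes, delta, increase, rate, resets - not with deriv
}
```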
type vectorByValueHeap Vector
func (s vectorByValueHeap) Len() int {


@ -79,3 +79,24 @@ func TestKahanSumInc(t *testing.T) {
})
}
}
func TestInterpolate(t *testing.T) {
tests := []struct {
p1, p2 FPoint
t int64
isCounter bool
expected float64
}{
{FPoint{T: 1, F: 100}, FPoint{T: 2, F: 200}, 1, false, 100},
{FPoint{T: 0, F: 100}, FPoint{T: 2, F: 200}, 1, false, 150},
{FPoint{T: 0, F: 200}, FPoint{T: 2, F: 100}, 1, false, 150},
{FPoint{T: 0, F: 200}, FPoint{T: 2, F: 0}, 1, true, 200},
{FPoint{T: 0, F: 200}, FPoint{T: 2, F: 100}, 1, true, 250},
{FPoint{T: 0, F: 500}, FPoint{T: 2, F: 100}, 1, true, 550},
{FPoint{T: 0, F: 500}, FPoint{T: 10, F: 0}, 1, true, 500},
}
for _, test := range tests {
result := interpolate(test.p1, test.p2, test.t, test.isCounter, false)
require.Equal(t, test.expected, result)
}
}
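The expected values in this table pin down the interpolation rule: plain linear interpolation between the two points, except that in the counter case a drop from p1 to p2 is treated as a reset, so p1's value is added back to p2 before interpolating. A self-contained sketch consistent with every row of the table (`interpolateSketch` is illustrative; the engine's `interpolate` also takes a histogram flag not modeled here):

```go
package main

import "fmt"

// FPoint mirrors promql's float sample point.
type FPoint struct {
	T int64
	F float64
}

// interpolateSketch linearly interpolates between p1 and p2 at time t.
// For counters, a drop from p1.F to p2.F is treated as a counter reset,
// so p1.F is added back to p2.F before interpolating.
func interpolateSketch(p1, p2 FPoint, t int64, isCounter bool) float64 {
	f2 := p2.F
	if isCounter && p2.F < p1.F {
		f2 += p1.F // undo the counter reset
	}
	frac := float64(t-p1.T) / float64(p2.T-p1.T)
	return p1.F + (f2-p1.F)*frac
}

func main() {
	fmt.Println(interpolateSketch(FPoint{0, 100}, FPoint{2, 200}, 1, false)) // 150
	fmt.Println(interpolateSketch(FPoint{0, 200}, FPoint{2, 0}, 1, true))   // 200
	fmt.Println(interpolateSketch(FPoint{0, 500}, FPoint{10, 0}, 1, true))  // 500
}
```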


@ -226,6 +226,11 @@ type VectorSelector struct {
// This is the case when VectorSelector is used to represent the info function's second argument.
BypassEmptyMatcherCheck bool
// Anchored is true when the VectorSelector is anchored.
Anchored bool
// Smoothed is true when the VectorSelector is smoothed.
Smoothed bool
PosRange posrange.PositionRange
}


@ -141,6 +141,8 @@ GROUP_LEFT
GROUP_RIGHT
IGNORING
OFFSET
SMOOTHED
ANCHORED
ON
WITHOUT
%token keywordsEnd
@ -187,7 +189,7 @@ START_METRIC_SELECTOR
%type <int> int
%type <uint> uint
%type <float> number series_value signed_number signed_or_unsigned_number
%type <node> step_invariant_expr aggregate_expr aggregate_modifier bin_modifier binary_expr bool_modifier expr function_call function_call_args function_call_body group_modifiers label_matchers matrix_selector number_duration_literal offset_expr on_or_ignoring paren_expr string_literal subquery_expr unary_expr vector_selector duration_expr paren_duration_expr positive_duration_expr offset_duration_expr
%type <node> step_invariant_expr aggregate_expr aggregate_modifier bin_modifier binary_expr bool_modifier expr function_call function_call_args function_call_body group_modifiers label_matchers matrix_selector number_duration_literal offset_expr anchored_expr smoothed_expr on_or_ignoring paren_expr string_literal subquery_expr unary_expr vector_selector duration_expr paren_duration_expr positive_duration_expr offset_duration_expr
%start start
@ -230,6 +232,8 @@ expr :
| matrix_selector
| number_duration_literal
| offset_expr
| anchored_expr
| smoothed_expr
| paren_expr
| string_literal
| subquery_expr
@ -464,6 +468,20 @@ offset_expr: expr OFFSET offset_duration_expr
{ yylex.(*parser).unexpected("offset", "number, duration, or step()"); $$ = $1 }
;
/*
* Anchored and smoothed modifiers
*/
anchored_expr: expr ANCHORED
{
yylex.(*parser).setAnchored($1)
}
smoothed_expr: expr SMOOTHED
{
yylex.(*parser).setSmoothed($1)
}
/*
* @ modifiers.
*/

File diff suppressed because it is too large.


@ -129,6 +129,8 @@ var key = map[string]ItemType{
// Keywords.
"offset": OFFSET,
"smoothed": SMOOTHED,
"anchored": ANCHORED,
"by": BY,
"without": WITHOUT,
"on": ON,


@ -42,6 +42,9 @@ var parserPool = sync.Pool{
// ExperimentalDurationExpr is a flag to enable experimental duration expression parsing.
var ExperimentalDurationExpr bool
// EnableExtendedRangeSelectors is a flag to enable experimental extended range selectors.
var EnableExtendedRangeSelectors bool
type Parser interface {
ParseExpr() (Expr, error)
Close()
@ -1021,6 +1024,52 @@ func (p *parser) addOffsetExpr(e Node, expr *DurationExpr) {
*endPosp = p.lastClosing
}
func (p *parser) setAnchored(e Node) {
if !EnableExtendedRangeSelectors {
p.addParseErrf(e.PositionRange(), "anchored modifier is experimental and not enabled")
return
}
switch s := e.(type) {
case *VectorSelector:
s.Anchored = true
if s.Smoothed {
p.addParseErrf(e.PositionRange(), "anchored and smoothed modifiers cannot be used together")
}
case *MatrixSelector:
s.VectorSelector.(*VectorSelector).Anchored = true
if s.VectorSelector.(*VectorSelector).Smoothed {
p.addParseErrf(e.PositionRange(), "anchored and smoothed modifiers cannot be used together")
}
case *SubqueryExpr:
p.addParseErrf(e.PositionRange(), "anchored modifier is not supported for subqueries")
default:
p.addParseErrf(e.PositionRange(), "anchored modifier not implemented")
}
}
func (p *parser) setSmoothed(e Node) {
if !EnableExtendedRangeSelectors {
p.addParseErrf(e.PositionRange(), "smoothed modifier is experimental and not enabled")
return
}
switch s := e.(type) {
case *VectorSelector:
s.Smoothed = true
if s.Anchored {
p.addParseErrf(e.PositionRange(), "anchored and smoothed modifiers cannot be used together")
}
case *MatrixSelector:
s.VectorSelector.(*VectorSelector).Smoothed = true
if s.VectorSelector.(*VectorSelector).Anchored {
p.addParseErrf(e.PositionRange(), "anchored and smoothed modifiers cannot be used together")
}
case *SubqueryExpr:
p.addParseErrf(e.PositionRange(), "smoothed modifier is not supported for subqueries")
default:
p.addParseErrf(e.PositionRange(), "smoothed modifier not implemented")
}
}
// setTimestamp is used to set the timestamp from the @ modifier in the generated parser.
func (p *parser) setTimestamp(e Node, ts float64) {
if math.IsInf(ts, -1) || math.IsInf(ts, 1) || math.IsNaN(ts) ||


@ -263,11 +263,18 @@ func (node *MatrixSelector) String() string {
vecSelector.Timestamp = nil
vecSelector.StartOrEnd = 0
extendedAttribute := ""
switch {
case vecSelector.Anchored:
extendedAttribute = " anchored"
case vecSelector.Smoothed:
extendedAttribute = " smoothed"
}
rangeStr := model.Duration(node.Range).String()
if node.RangeExpr != nil {
rangeStr = node.RangeExpr.String()
}
str := fmt.Sprintf("%s[%s]%s%s", vecSelector.String(), rangeStr, at, offset)
str := fmt.Sprintf("%s[%s]%s%s%s", vecSelector.String(), rangeStr, extendedAttribute, at, offset)
vecSelector.OriginalOffset, vecSelector.OriginalOffsetExpr, vecSelector.Timestamp, vecSelector.StartOrEnd = offsetVal, offsetExprVal, atVal, preproc
@ -380,6 +387,12 @@ func (node *VectorSelector) String() string {
b.WriteString(" @ end()")
}
switch {
case node.Anchored:
b.WriteString(" anchored")
case node.Smoothed:
b.WriteString(" smoothed")
}
switch {
case node.OriginalOffsetExpr != nil:
b.WriteString(" offset ")
node.OriginalOffsetExpr.writeTo(b)


@ -48,6 +48,11 @@ func TestConcurrentRangeQueries(t *testing.T) {
}
// Enable experimental functions testing
parser.EnableExperimentalFunctions = true
parser.EnableExtendedRangeSelectors = true
t.Cleanup(func() {
parser.EnableExperimentalFunctions = false
parser.EnableExtendedRangeSelectors = false
})
engine := promqltest.NewTestEngineWithOpts(t, opts)
const interval = 10000 // 10s interval.


@ -123,9 +123,11 @@ func RunBuiltinTestsWithStorage(t TBRun, engine promql.QueryEngine, newStorage f
t.Cleanup(func() {
parser.EnableExperimentalFunctions = false
parser.ExperimentalDurationExpr = false
parser.EnableExtendedRangeSelectors = false
})
parser.EnableExperimentalFunctions = true
parser.ExperimentalDurationExpr = true
parser.EnableExtendedRangeSelectors = true
files, err := fs.Glob(testsFs, "*/*.test")
require.NoError(t, err)


@ -0,0 +1,414 @@
# Reference from PROM-52: Complete dataset
load 15s
metric 1+1x4 9+1x4
eval instant at 5s increase(metric[1m])
eval instant at 20s increase(metric[1m])
{} 1.833333333
eval instant at 35s increase(metric[1m])
{} 2.833333333
eval instant at 50s increase(metric[1m])
{} 4
eval instant at 65s increase(metric[1m])
{} 4
eval instant at 80s increase(metric[1m])
{} 8
eval instant at 95s increase(metric[1m])
{} 8
eval instant at 110s increase(metric[1m])
{} 8
eval instant at 125s increase(metric[1m])
{} 4
eval instant at 5s increase(metric[1m] anchored)
{} 0
eval instant at 20s increase(metric[1m] anchored)
{} 1
eval instant at 35s increase(metric[1m] anchored)
{} 2
eval instant at 50s increase(metric[1m] anchored)
{} 3
eval instant at 65s increase(metric[1m] anchored)
{} 4
eval instant at 80s increase(metric[1m] anchored)
{} 7
eval instant at 95s increase(metric[1m] anchored)
{} 7
eval instant at 110s increase(metric[1m] anchored)
{} 7
eval instant at 125s increase(metric[1m] anchored)
{} 7
eval instant at 5s increase(metric[1m] smoothed)
{} 0.333333333
eval instant at 20s increase(metric[1m] smoothed)
{} 1.333333333
eval instant at 35s increase(metric[1m] smoothed)
{} 2.333333333
eval instant at 50s increase(metric[1m] smoothed)
{} 3.333333333
eval instant at 65s increase(metric[1m] smoothed)
{} 5
eval instant at 80s increase(metric[1m] smoothed)
{} 7
eval instant at 95s increase(metric[1m] smoothed)
{} 7
eval instant at 110s increase(metric[1m] smoothed)
{} 7
eval instant at 125s increase(metric[1m] smoothed)
{} 6
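The anchored expectations above reduce to simple differences once the anchor sample is in place: at evaluation time T with a 1m range, the last sample at or before T-1m is re-added at the window's left edge (when none exists, the first in-window sample is duplicated there, giving the 0 at 5s), and `increase` then walks the window correcting for counter resets. A sketch under that reading (illustrative only; the engine's extrapolation and histogram handling are ignored):

```go
package main

import "fmt"

type FPoint struct {
	T int64 // milliseconds
	F float64
}

// anchoredIncreaseSketch computes increase() over the left-open window
// (start, end], with the last sample at or before start re-added at the
// left edge, falling back to the first in-window sample when none exists.
func anchoredIncreaseSketch(samples []FPoint, start, end int64) float64 {
	var window []FPoint
	var anchor FPoint
	haveAnchor := false
	for _, s := range samples {
		switch {
		case s.T <= start:
			anchor, haveAnchor = s, true // keep the latest pre-window sample
		case s.T <= end:
			window = append(window, s)
		}
	}
	if len(window) == 0 {
		return 0
	}
	if !haveAnchor {
		anchor = window[0] // duplicate the first sample at the edge
	}
	inc, prev := 0.0, anchor.F
	for _, s := range window {
		if s.F >= prev {
			inc += s.F - prev // monotone step
		} else {
			inc += s.F // counter reset: restart from zero
		}
		prev = s.F
	}
	return inc
}

func main() {
	// metric 1+1x4 9+1x4 at a 15s scrape interval, as in the load above.
	var pts []FPoint
	for i := 0; i < 5; i++ {
		pts = append(pts, FPoint{T: int64(i) * 15000, F: float64(1 + i)})
	}
	for i := 0; i < 5; i++ {
		pts = append(pts, FPoint{T: int64(5+i) * 15000, F: float64(9 + i)})
	}
	fmt.Println(anchoredIncreaseSketch(pts, 5000-60000, 5000))    // 0
	fmt.Println(anchoredIncreaseSketch(pts, 65000-60000, 65000))  // 4
	fmt.Println(anchoredIncreaseSketch(pts, 80000-60000, 80000))  // 7
}
```

These reproduce the `anchored` expectations at 5s, 65s, and 80s in the complete dataset.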
# Reference from PROM-52: Partial dataset
clear
load 15s
metric 1+1x2 _ _ 9+1x4
eval instant at 5s increase(metric[1m])
eval instant at 20s increase(metric[1m])
{} 1.833333333
eval instant at 35s increase(metric[1m])
{} 2.833333333
eval instant at 50s increase(metric[1m])
{} 3.166666666
eval instant at 65s increase(metric[1m])
{} 2.166666666
eval instant at 80s increase(metric[1m])
{} 8
eval instant at 95s increase(metric[1m])
{} 1.833333333
eval instant at 110s increase(metric[1m])
{} 2.833333333
eval instant at 125s increase(metric[1m])
{} 4
eval instant at 5s increase(metric[1m] anchored)
{} 0
eval instant at 20s increase(metric[1m] anchored)
{} 1
eval instant at 35s increase(metric[1m] anchored)
{} 2
eval instant at 50s increase(metric[1m] anchored)
{} 2
eval instant at 65s increase(metric[1m] anchored)
{} 2
eval instant at 80s increase(metric[1m] anchored)
{} 7
eval instant at 95s increase(metric[1m] anchored)
{} 7
eval instant at 110s increase(metric[1m] anchored)
{} 8
eval instant at 125s increase(metric[1m] anchored)
{} 9
eval instant at 5s increase(metric[1m] smoothed)
{} 0.333333333
eval instant at 20s increase(metric[1m] smoothed)
{} 1.333333333
eval instant at 35s increase(metric[1m] smoothed)
{} 2.666666666
eval instant at 50s increase(metric[1m] smoothed)
{} 4.666666666
eval instant at 65s increase(metric[1m] smoothed)
{} 6.333333333
eval instant at 80s increase(metric[1m] smoothed)
{} 7
eval instant at 95s increase(metric[1m] smoothed)
{} 6.666666666
eval instant at 110s increase(metric[1m] smoothed)
{} 5.666666666
eval instant at 125s increase(metric[1m] smoothed)
{} 4.666666666
# Test that the interval is left-open.
clear
load 1m
metric 1 2 _ 4 5
eval instant at 2m increase(metric[1m] smoothed)
{} 1
eval instant at 2m increase(metric[1m] anchored)
# Basic test with counter resets
clear
load 1m
metric{id="1"} 1+1x4 1+1x4
metric{id="2"} 3 2+2x9
metric{id="3"} 5+3x2 3+3x6
eval instant at 1m30s increase(metric[1m])
eval instant at 1m30s increase(metric[1m] smoothed)
{id="1"} 1
{id="2"} 2
{id="3"} 3
eval instant at 1m30s increase(metric[1m] anchored)
{id="1"} 1
{id="2"} 2
{id="3"} 3
eval instant at 1m30s delta(metric[1m])
eval instant at 1m30s delta(metric[1m] anchored)
{id="1"} 1
{id="2"} -1
{id="3"} 3
eval instant at 3m0s delta(metric[1m] anchored)
{id="1"} 1
{id="2"} 2
{id="3"} -8
eval instant at 3m30s delta(metric[1m] anchored)
{id="1"} 1
{id="2"} 2
{id="3"} -8
eval instant at 6m increase(metric[5m])
{id="1"} 5
{id="2"} 10
{id="3"} 15
eval instant at 6m15s increase(metric[5m] smoothed)
{id="1"} 5
{id="2"} 10
{id="3"} 15
eval instant at 6m increase(metric[5m] smoothed)
{id="1"} 5
{id="2"} 10
{id="3"} 15
eval instant at 5m increase(metric[5m] anchored)
{id="1"} 5
{id="2"} 10
{id="3"} 15
eval instant at 15m increase(metric[5m] anchored)
clear
load 1m
metric{id="1"} 11 -1 100 0
metric{id="2"} 0 0 100 0 0 11 -1
eval instant at 5m30s delta(metric[5m] smoothed)
{id="1"} -5
{id="2"} 5
eval instant at 5m45s delta(metric[5m] smoothed)
{id="1"} -2
{id="2"} 2
clear
load 1m
metric{id="1"} 1+1x10
metric{id="2"} 1 1+1x10
metric{id="3"} 99-1x10
metric{id="4"} 99 99-1x10
eval instant at 5m changes(metric[5m])
{id="1"} 4
{id="2"} 4
{id="3"} 4
{id="4"} 4
eval instant at 5m30s changes(metric[5m])
{id="1"} 4
{id="2"} 4
{id="3"} 4
{id="4"} 4
eval instant at 5m0s changes(metric[5m] anchored)
{id="1"} 5
{id="2"} 4
{id="3"} 5
{id="4"} 4
eval instant at 6m changes(metric[5m] anchored)
{id="1"} 5
{id="2"} 5
{id="3"} 5
{id="4"} 5
eval instant at 5m30s changes(metric[5m] anchored)
{id="1"} 5
{id="2"} 4
{id="3"} 5
{id="4"} 4
eval instant at 5m30s resets(metric[5m])
{id="1"} 0
{id="2"} 0
{id="3"} 4
{id="4"} 4
eval instant at 5m30s resets(metric[5m] anchored)
{id="1"} 0
{id="2"} 0
{id="3"} 5
{id="4"} 4
clear
load 1m
metric{id="1"} 2 _ 1 _ _ _ _ _ 0
metric{id="2"} 99-1x10
eval instant at 2m changes(metric[1m])
{id="1"} 0
{id="2"} 0
eval instant at 3m changes(metric[1m])
{id="2"} 0
eval instant at 2m changes(metric[1m] anchored)
{id="1"} 1
{id="2"} 1
eval instant at 3m changes(metric[1m] anchored)
{id="1"} 1
{id="2"} 1
eval instant at 8m changes(metric[1m] anchored)
{id="1"} 0
{id="2"} 1
eval instant at 8m changes(metric[1m1ms] anchored)
{id="1"} 1
{id="2"} 2
eval instant at 2m resets(metric[1m])
{id="1"} 0
{id="2"} 0
eval instant at 3m resets(metric[1m])
{id="2"} 0
eval instant at 2m resets(metric[1m] anchored)
{id="1"} 1
{id="2"} 1
eval instant at 3m resets(metric[1m] anchored)
{id="1"} 1
{id="2"} 1
eval instant at 8m resets(metric[1m] anchored)
{id="1"} 0
{id="2"} 1
eval instant at 8m resets(metric[1m1ms] anchored)
{id="1"} 1
{id="2"} 2
clear
load 1m
metric 9 8 5 4
eval instant at 2m15s increase(metric[2m] smoothed)
{} 12
clear
eval instant at 1m deriv(foo[3m] smoothed)
expect fail msg: smoothed modifier can only be used with: delta, increase, rate - not with deriv
eval instant at 1m resets(foo[3m] smoothed)
expect fail msg: smoothed modifier can only be used with: delta, increase, rate - not with resets
eval instant at 1m changes(foo[3m] smoothed)
expect fail msg: smoothed modifier can only be used with: delta, increase, rate - not with changes
eval instant at 1m max_over_time(foo[3m] smoothed)
expect fail msg: smoothed modifier can only be used with: delta, increase, rate - not with max_over_time
eval instant at 1m predict_linear(foo[3m] smoothed, 4)
expect fail msg: smoothed modifier can only be used with: delta, increase, rate - not with predict_linear
eval instant at 1m deriv(foo[3m] anchored)
expect fail msg: anchored modifier can only be used with: changes, delta, increase, rate, resets - not with deriv
eval instant at 1m resets(foo[3m] anchored)
eval instant at 1m changes(foo[3m] anchored)
eval instant at 1m max_over_time(foo[3m] anchored)
expect fail msg: anchored modifier can only be used with: changes, delta, increase, rate, resets - not with max_over_time
eval instant at 1m predict_linear(foo[3m] anchored, 4)
expect fail msg: anchored modifier can only be used with: changes, delta, increase, rate, resets - not with predict_linear
clear
load 10s
metric 1+1x10
withreset 1+1x4 1+1x5
notregular 0 5 100 2 8
eval instant at 10s metric smoothed
metric 2
eval instant at 15s metric smoothed
metric 2.5
eval instant at 5s metric smoothed
metric 1.5
eval instant at 105s metric smoothed
metric 11
eval instant at 45s withreset smoothed
withreset 3
eval instant at 30s notregular smoothed
notregular 2