Compare commits


59 Commits

Author SHA1 Message Date
Amlan Kumar Nandy
0ac4d0f99d Merge branch 'main' into SIG-7727 2025-11-15 14:19:44 +07:00
Karan Balani
fbb66f14ba chore: improve otel demo app setup with docker based signoz (#9567)
## 📄 Summary

Minor improvements to the local setup guide doc.
2025-11-14 22:11:22 +05:30
Karan Balani
54b67d9cfd feat: add bounded cache for opaque tokenizer only for last observed at cache (#9581)
Move the cache for the `lastObservedAt` stat away from BigCache, which was unbounded, to Ristretto, a bounded in-memory cache (https://github.com/dgraph-io/ristretto).

This PR is the first step towards moving away from unbounded caches in the system; more PRs will follow.
2025-11-14 21:23:57 +05:30
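For context on the Ristretto change above, here is a minimal sketch of a bounded in-memory cache holding a last-observed-at timestamp. The config values and the key format are illustrative assumptions, not the limits or names SigNoz actually uses:

```go
package main

import (
	"fmt"
	"time"

	"github.com/dgraph-io/ristretto"
)

func main() {
	// Bounded cache: once MaxCost is reached, Ristretto evicts entries
	// instead of growing without limit (unlike an unbounded BigCache setup).
	cache, err := ristretto.NewCache(&ristretto.Config{
		NumCounters: 1e6,     // number of keys to track access frequency for
		MaxCost:     1 << 26, // total cost budget (~64 MB if cost == bytes)
		BufferItems: 64,      // recommended default
	})
	if err != nil {
		panic(err)
	}

	// Record a "last observed at" timestamp for a token, with cost 1 per entry.
	cache.Set("token:abc123", time.Now(), 1)
	cache.Wait() // Set is asynchronous; Wait makes the write visible for this demo

	if v, ok := cache.Get("token:abc123"); ok {
		fmt.Println("lastObservedAt:", v.(time.Time))
	}
}
```

Unlike an unbounded map keyed by token, the eviction policy caps memory at roughly MaxCost regardless of how many distinct tokens are observed.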
Abhishek Kumar Singh
1a193015a7 refactor: PostableRule struct (#9448)
* refactor: PostableRule struct

- made validation part of `UnmarshalJSON`
- removed validation from `processRuleDefaults` and updated signature to remove error from return type

* refactor: updated error message for missing composite query

---------

Co-authored-by: Srikanth Chekuri <srikanth.chekuri92@gmail.com>
2025-11-13 19:45:19 +00:00
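As a rough illustration of the pattern described in the commit above (validation performed inside `UnmarshalJSON` rather than in a separate defaults step), the sketch below uses a simplified, hypothetical rule shape; it is not the actual `PostableRule` definition:

```go
package main

import (
	"encoding/json"
	"errors"
	"fmt"
)

// Simplified stand-in for a rule payload; the fields are assumptions for
// illustration only.
type PostableRule struct {
	AlertName      string          `json:"alert"`
	CompositeQuery json.RawMessage `json:"compositeQuery"`
}

// UnmarshalJSON decodes and validates in one step, so callers never hold a
// half-valid rule and no separate validation pass is needed afterwards.
func (r *PostableRule) UnmarshalJSON(data []byte) error {
	type alias PostableRule // alias drops methods and avoids recursion
	var tmp alias
	if err := json.Unmarshal(data, &tmp); err != nil {
		return err
	}
	if tmp.AlertName == "" {
		return errors.New("alert name is required")
	}
	if len(tmp.CompositeQuery) == 0 {
		return errors.New("composite query is required")
	}
	*r = PostableRule(tmp)
	return nil
}

func main() {
	var rule PostableRule
	err := json.Unmarshal([]byte(`{"alert":"High error rate"}`), &rule)
	fmt.Println(err) // composite query is required
}
```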
Vikrant Gupta
245179cbf7 feat(authz): openfga sql migration (#9580)
* feat(authz): openfga sql migration

* feat(authz): formatting and naming

* feat(authz): formatting and naming

* feat(authz): extract function for store and model id

* feat(authz): reorder the provider
2025-11-14 00:43:02 +05:30
Yunus M
dbb6b333c8 feat: reset error boundary on pathname change (#9570) 2025-11-13 15:58:33 +05:30
Shaheer Kochai
56f8e53d88 refactor: migrate status code table to v5 (#9546)
* fix: fix the issue of aggregation incorrectly falling back to count

* refactor: add support for v5 queries in endPointDetailsDataQueries of EndPointDetails

* chore: add common utility functions

* refactor: migrate status code table to v5

* fix: status code table formatting

* chore: add tests for status code table v5 migration

* chore: add convertFiltersWithUrlHandling helper

* chore: remove unnecessary tests

* chore: aggregateAttribute to aggregations

* fix: remove the aggregateOperator fallback logic changes

* fix: fix the failing test

* fix: add response_status_code exists to the status code table query
2025-11-12 13:55:00 +00:00
Aditya Singh
2f4e371dac Fix: Preserve query on navigation b/w views | Logs Explorer code cleanup (#9496)
* feat: synchronise panel type state

* feat: refactor explorer queries

* feat: use explorer util queries

* feat: minor refactor

* feat: update test cases

* feat: remove code

* feat: minor refactor

* feat: minor refactor

* feat: update tests

* feat: update list query logic to only support first staged query

* feat: fix export query and saved views change

* feat: test fix

* feat: export link fix

---------

Co-authored-by: Nityananda Gohain <nityanandagohain@gmail.com>
2025-11-12 17:25:24 +05:30
Nikhil Mantri
db75ec56bc chore: update Services to use QBV5 (#9287) 2025-11-12 14:02:07 +05:30
primus-bot[bot]
02755a6527 chore(release): bump to v0.101.0 (#9566)
Co-authored-by: primus-bot[bot] <171087277+primus-bot[bot]@users.noreply.github.com>
Co-authored-by: Priyanshu Shrivastava <priyanshu@signoz.io>
2025-11-12 12:39:55 +05:30
Srikanth Chekuri
9f089e0784 fix(pagerduty): add severity for labels (#9538) 2025-11-12 05:51:26 +05:30
Srikanth Chekuri
fb9a7ad3cd chore: update integration dashboard json to v5 (#9534) 2025-11-12 00:09:15 +05:30
Aditya Singh
ad631d70b6 fix: add key to allow side bar nav on error thrown (#9560) 2025-11-11 17:21:06 +05:30
Vikrant Gupta
c44efeab33 fix(sessions): do not use axios base instance (#9556)
* fix(sessions): do not use axios base instance

* fix(sessions): fix test cases

* fix(sessions): add trailing slashes
2025-11-11 08:42:16 +00:00
Tushar Vats
e9743fa7ac feat: bump cloud agent version to 0.0.6 (#9298) 2025-11-11 13:58:34 +05:30
Amlan Kumar Nandy
b7ece08d3e fix: aggregation options for metric in alert condition do not get updated (#9485)
Co-authored-by: Srikanth Chekuri <srikanth.chekuri92@gmail.com>
2025-11-11 13:47:58 +07:00
Pranjul Kalsi
e5f4f5cc72 fix: preserve SMTPRequireTLS during default merge (#8478) (#9418)
The issue was with how Mergo treats zero values: Mergo only fills **zero-value** fields in the destination.
Since `false` is the zero value for `bool`, it always gets **replaced** by `true` from the source. Using pointers doesn't help, because `Merge` dereferences them and still treats `false` as zero.
2025-11-11 01:28:16 +05:30
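A minimal sketch reproducing the Mergo behaviour described above, using a hypothetical config struct rather than SigNoz's actual alertmanager configuration:

```go
package main

import (
	"fmt"

	"github.com/imdario/mergo"
)

// Hypothetical shape used only to demonstrate the zero-value issue.
type smtpConfig struct {
	Host           string
	SMTPRequireTLS bool
}

func main() {
	// The user explicitly disabled TLS, but false is also bool's zero value.
	user := smtpConfig{Host: "smtp.example.com", SMTPRequireTLS: false}
	defaults := smtpConfig{Host: "localhost", SMTPRequireTLS: true}

	// Merge only fills zero-value fields in the destination, so the explicit
	// false is indistinguishable from "unset" and gets overwritten.
	if err := mergo.Merge(&user, defaults); err != nil {
		panic(err)
	}

	fmt.Println(user.Host)           // smtp.example.com (non-zero, preserved)
	fmt.Println(user.SMTPRequireTLS) // true (the user's false was lost)
}
```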
Vikrant Gupta
4437630127 fix(tokenizer): do not retry 401 email_password session request (#9541) 2025-11-10 14:04:16 +00:00
Yunus M
89639b239e feat: convert duration ms to string to be passed to getYAxisFormattedValue (#9539) 2025-11-10 18:03:32 +05:30
Yunus M
785ae9f0bd feat: pass email if username is not set - pylon (#9526) 2025-11-10 17:30:32 +05:30
Abhi kumar
8752022cef fix: updated dashboard panel colors for better contrast ratio (#9500)
* fix: updated dashboard panel colors for better contrast ratio

* chore: prettier fix

* feat: added changes for the tooltip to follow cursor
2025-11-06 17:17:33 +05:30
Aditya Singh
c7e4a9c45d Fix: uplot dense points selection (#9469)
* feat: fix uplot focused series logic selection

* fix: stop propagation only if drilldown enabled

* feat: minor refactor

* feat: minor refactor

* feat: minor refactor

* feat: minor refactor

---------

Co-authored-by: Srikanth Chekuri <srikanth.chekuri92@gmail.com>
2025-11-06 11:14:02 +00:00
primus-bot[bot]
bf92c92204 chore(release): bump to v0.100.1 (#9499)
Co-authored-by: primus-bot[bot] <171087277+primus-bot[bot]@users.noreply.github.com>
2025-11-06 13:22:09 +05:30
Srikanth Chekuri
bd63633be7 fix: do not format for non aggregation columns (#9492) 2025-11-05 19:24:56 +05:30
Nikhil Mantri
1158e1199b Fix: filter with time in span scope condition builder (#9426) 2025-11-05 13:11:36 +05:30
primus-bot[bot]
0a60c49314 chore(release): bump to v0.100.0 (#9488)
Co-authored-by: primus-bot[bot] <171087277+primus-bot[bot]@users.noreply.github.com>
Co-authored-by: Priyanshu Shrivastava <priyanshu@signoz.io>
2025-11-05 12:06:42 +05:30
Ekansh Gupta
c25e3beb81 feat: changed description of span percentile calculation (#9487) 2025-11-05 06:23:24 +00:00
SagarRajput-7
c9e0f2b9ca fix: removed cleanup variable url function to avoid url resetting (#9449) 2025-11-05 00:33:11 +05:30
Abhi kumar
6d831849c1 perf: optimize tooltip plugin with caching, memoization, and improved… (#9421)
* perf: optimize tooltip plugin with caching, memoization, and improved DOM operations

* perf(uplot): optimize tooltip with focused sorting and O(n²) to O(n) reduction

* perf(uplot): optimize threshold rendering with batched canvas operations

* chore: pr review changes

* chore: removed last index check for tooltip generation

* chore: shifted to rendering only one point when hovered

---------

Co-authored-by: Srikanth Chekuri <srikanth.chekuri92@gmail.com>
2025-11-04 17:34:15 +00:00
aniketio-ctrl
83eeb46f99 feat(sqlstore): added sql formatter for json (#9420)
* chore: added sql formatter for json

* chore: updated json extract columns

* chore: added append ident

* chore: resolved pr comments

* chore: resolved pr comments

* chore: resolved pr comments

* chore: resolved pr comments

* chore: minor changes

* chore: minor changes

* chore: minor changes

* chore: minor changes

* chore: resolve comments

* chore: added append value

* chore: added append value

* chore: added append value

* chore: added append value

* chore: added append value

* chore: added append value

* chore: added append value

* chore: added append value

---------

Co-authored-by: Vikrant Gupta <vikrant@signoz.io>
2025-11-04 22:05:23 +05:30
Shaheer Kochai
287558dc9d refactor: migrate External API's top 10 errors query_range request to v5 (#9476)
* feat: migrate top 10 errors query_range request to v5

* chore: remove unnecessary tests

* chore: improve the top error tests

* fix: send status_message EXISTS only if the toggle is on

* fix: get the count value and simplify the null check

* fix: send has_error = true

* chore: fall back to url.full if url.path doesn't exist

* refactor: address the PR review requested changes

* chore: add test to check if we're sending the correct filters

---------

Co-authored-by: Nityananda Gohain <nityanandagohain@gmail.com>
2025-11-04 20:09:32 +05:30
Yunus M
83aad793c2 fix: alignment issues in home page (#9459) 2025-11-04 13:13:01 +05:30
Shaheer Kochai
3eff689c85 fix: fix the issue of save button incorrectly enabled when cold_storage_ttl_days is -1 (#9458)
* fix: logs retention save button enabled when S3 disabled

* test: add test for save button state when S3 is disabled
2025-11-04 12:10:17 +05:30
Yunus M
f5bcd65e2e feat: update styles for percentile value (#9477)
* feat: update styles for percentile value

* feat: reset data on span change, remove unnecessary useMemo
2025-11-03 23:40:02 +05:30
Yunus M
e7772d93af fix: flaky multi ingestion settings test (#9478) 2025-11-03 22:21:13 +05:30
swapnil-signoz
bbf987ebd7 fix: removing duplicate creation of user if user does not exist already (#9455)
* fix: removing duplicate creation of user if user does not exist already

* test: adding api test case

* fix: updated test cases

* fix: remove unnecessary logging and clean up connection params API

* feat: add gateway fixture and integrate with signoz for connection parameters

* feat: add cloudintegrations to the test job matrix in integrationci.yaml

* fix: remove outdated comments from make_http_mocks fixture

* fix: remove deprecated ZeusURL from build configurations
2025-11-03 16:45:08 +05:30
Nityananda Gohain
105c3a3b8c fix: return coldstorage -1 if not set for logs (#9471) 2025-11-03 08:10:53 +00:00
Aditya Singh
c1a4a5b8db Log Details minor ui fix (#9463)
* feat: fix copy btn styles

* feat: minor refactor
2025-11-03 11:59:06 +05:30
aniketio-ctrl
c9591f4341 fix: formatted threshold unit in description and summary (#9350) 2025-11-02 14:27:21 +00:00
Yunus M
fd216fdee1 feat(meter): add ability to query meter data across product modules (#9142)
* feat: enable users to query meter specific data in alerts

* feat: enable metrics / meter selection in alerts and dashboards

* feat: enable setting alerts for ingestion limits

* feat: set where clause when setting alert for ingestion key

* feat(meter): handle the where clause changes

* feat: remove add alert for infinite values

* feat: add unit test cases for set alert flow

* feat: handle initial and onchange state for meter source

* feat: pass thresholds array from ingestion settings

* feat: derive source from value change rather than local state

---------

Co-authored-by: Vikrant Gupta <vikrant@signoz.io>
Co-authored-by: Srikanth Chekuri <srikanth.chekuri92@gmail.com>
2025-11-02 19:02:56 +05:30
Yunus M
f5bf4293a1 feat: span percentile - UI (#9397)
* feat: show span percentile in span details

* feat: resource attribute selection for span percentile

* feat: wait for 2 secs for the first fetch of span percentile

* feat: add unit test cases for span percentiles

* feat: use style tokens

* feat: remove redundant test assertion

* chore: resolve conflicts

* feat: reset initial wait state on span change

* feat: update payload , endpoint as per new backend changes

* feat: address review comments

* feat: fetch span percentile without specific resource attributes - first time
2025-11-01 22:57:36 +05:30
Shaheer Kochai
155a44a25d feat: add support for infra metrics in trace details (#8911)
* feat: add support for infra metrics in trace details v2

* fix: adjust the empty state if the data source is traces

* refactor: logLineTimestamp prop to timestamp

* chore: write tests for span infra metrics

* chore: return search from useLocation mock

* chore: address review changes to move inline options to useMemo

* refactor: simplify infrastructure metadata extraction logic in SpanRelatedSignals

* refactor: extract infrastructure metadata logic into utility function

* test(infraMetrics): club the similar tests

* fix: improve logs and infra tabs switching assertions

* feat: update Infra option icon to Metrics in SpanDetailsDrawer

* chore: change infra to metrics in span details drawer

* fix: fix the failing tests

---------

Co-authored-by: Nityananda Gohain <nityanandagohain@gmail.com>
2025-11-01 21:26:05 +04:30
Vishal Sharma
4b21c9d5f9 feat: add result count to data source search analytics event (#9444) 2025-10-31 12:35:24 +00:00
Yunus M
5ef0a18867 Update CODEOWNERS for frontend code (#9456) 2025-10-31 12:52:37 +05:30
SagarRajput-7
c8266d1aec fix: upgraded the axios resolution to fix vulnerability (#9454) 2025-10-31 11:53:10 +05:30
amlannandy
7503cbf902 chore: move util 2025-10-31 12:25:28 +07:00
amlannandy
13bf14474d Merge branch 'main' into SIG-7727 2025-10-31 12:22:50 +07:00
amlannandy
c7ea032298 chore: code cleanup 2025-10-31 11:55:05 +07:00
SagarRajput-7
adfd16ce1b fix: adapt the scroll reset fix in alert and histogram panels (#9322) 2025-10-30 13:31:17 +00:00
SagarRajput-7
6db74a5585 feat: allow custom precision in dashboard panels (#9054) 2025-10-30 18:50:40 +05:30
Pandey
f8e0db0085 chore: bump golangci-lint to the latest version (#9445) 2025-10-30 11:21:35 +00:00
Srikanth Chekuri
30776d6179 Merge branch 'main' into SIG-7727 2025-10-29 16:28:43 +05:30
Amlan Kumar Nandy
66564c807d Merge branch 'main' into SIG-7727 2025-10-28 15:30:01 +07:00
amlannandy
c40120dd3a chore: minor fixes 2025-10-28 14:48:27 +07:00
amlannandy
c1309072de chore: update unit formatting 2025-10-28 12:46:13 +07:00
amlannandy
3278af0327 chore: add unit tests 2025-10-27 15:52:47 +07:00
amlannandy
ccfcec73e9 chore: add unit tests 2025-10-27 15:52:47 +07:00
amlannandy
5ffd717953 chore: minor change 2025-10-27 15:52:47 +07:00
amlannandy
8cd9f5f25a chore: y axis unit management in explorer section 2025-10-27 15:52:47 +07:00
279 changed files with 45638 additions and 45481 deletions

View File

@@ -42,7 +42,7 @@ services:
timeout: 5s
retries: 3
schema-migrator-sync:
image: signoz/signoz-schema-migrator:v0.129.7
image: signoz/signoz-schema-migrator:v0.129.8
container_name: schema-migrator-sync
command:
- sync
@@ -55,7 +55,7 @@ services:
condition: service_healthy
restart: on-failure
schema-migrator-async:
image: signoz/signoz-schema-migrator:v0.129.7
image: signoz/signoz-schema-migrator:v0.129.8
container_name: schema-migrator-async
command:
- async

2
.github/CODEOWNERS vendored
View File

@@ -2,7 +2,7 @@
# Owners are automatically requested for review for PRs that changes code
# that they own.
/frontend/ @SigNoz/frontend @YounixM
/frontend/ @YounixM @aks07
/frontend/src/container/MetricsApplication @srikanthccv
/frontend/src/container/NewWidget/RightContainer/types.ts @srikanthccv

View File

@@ -107,7 +107,6 @@ jobs:
-X github.com/SigNoz/signoz/pkg/version.branch=${{ needs.prepare.outputs.branch }}
-X github.com/SigNoz/signoz/ee/zeus.url=https://api.signoz.cloud
-X github.com/SigNoz/signoz/ee/zeus.deprecatedURL=https://license.signoz.io
-X github.com/SigNoz/signoz/ee/query-service/constants.ZeusURL=https://api.signoz.cloud
-X github.com/SigNoz/signoz/ee/query-service/constants.LicenseSignozIo=https://license.signoz.io/api/v1
-X github.com/SigNoz/signoz/pkg/analytics.key=9kRrJ7oPCGPEJLF6QjMPLt5bljFhRQBr'
DOCKER_BASE_IMAGES: '{"alpine": "alpine:3.20.3"}'

View File

@@ -106,7 +106,6 @@ jobs:
-X github.com/SigNoz/signoz/pkg/version.branch=${{ needs.prepare.outputs.branch }}
-X github.com/SigNoz/signoz/ee/zeus.url=https://api.staging.signoz.cloud
-X github.com/SigNoz/signoz/ee/zeus.deprecatedURL=https://license.staging.signoz.cloud
-X github.com/SigNoz/signoz/ee/query-service/constants.ZeusURL=https://api.staging.signoz.cloud
-X github.com/SigNoz/signoz/ee/query-service/constants.LicenseSignozIo=https://license.staging.signoz.cloud/api/v1
-X github.com/SigNoz/signoz/pkg/analytics.key=9kRrJ7oPCGPEJLF6QjMPLt5bljFhRQBr'
DOCKER_BASE_IMAGES: '{"alpine": "alpine:3.20.3"}'

View File

@@ -17,6 +17,7 @@ jobs:
- bootstrap
- passwordauthn
- callbackauthn
- cloudintegrations
- querier
- ttl
sqlstore-provider:

1
.gitignore vendored
View File

@@ -106,7 +106,6 @@ downloads/
eggs/
.eggs/
lib/
!frontend/src/lib/
lib64/
parts/
sdist/

View File

@@ -1,43 +1,63 @@
version: "2"
linters:
default: standard
default: none
enable:
- bodyclose
- depguard
- errcheck
- forbidigo
- govet
- iface
- ineffassign
- misspell
- nilnil
- sloglint
- depguard
- iface
- unparam
- forbidigo
linters-settings:
sloglint:
no-mixed-args: true
kv-only: true
no-global: all
context: all
static-msg: true
msg-style: lowercased
key-naming-case: snake
depguard:
rules:
nozap:
deny:
- pkg: "go.uber.org/zap"
desc: "Do not use zap logger. Use slog instead."
noerrors:
deny:
- pkg: "errors"
desc: "Do not use errors package. Use github.com/SigNoz/signoz/pkg/errors instead."
iface:
enable:
- identical
forbidigo:
forbid:
- fmt.Errorf
- ^(fmt\.Print.*|print|println)$
issues:
exclude-dirs:
- "pkg/query-service"
- "ee/query-service"
- "scripts/"
- unused
settings:
depguard:
rules:
noerrors:
deny:
- pkg: errors
desc: Do not use errors package. Use github.com/SigNoz/signoz/pkg/errors instead.
nozap:
deny:
- pkg: go.uber.org/zap
desc: Do not use zap logger. Use slog instead.
forbidigo:
forbid:
- pattern: fmt.Errorf
- pattern: ^(fmt\.Print.*|print|println)$
iface:
enable:
- identical
sloglint:
no-mixed-args: true
kv-only: true
no-global: all
context: all
static-msg: true
key-naming-case: snake
exclusions:
generated: lax
presets:
- comments
- common-false-positives
- legacy
- std-error-handling
paths:
- pkg/query-service
- ee/query-service
- scripts/
- tmp/
- third_party$
- builtin$
- examples$
formatters:
exclusions:
generated: lax
paths:
- third_party$
- builtin$
- examples$

View File

@@ -84,10 +84,9 @@ go-run-enterprise: ## Runs the enterprise go backend server
SIGNOZ_ALERTMANAGER_PROVIDER=signoz \
SIGNOZ_TELEMETRYSTORE_PROVIDER=clickhouse \
SIGNOZ_TELEMETRYSTORE_CLICKHOUSE_DSN=tcp://127.0.0.1:9000 \
SIGNOZ_TELEMETRYSTORE_CLICKHOUSE_CLUSTER=cluster \
go run -race \
$(GO_BUILD_CONTEXT_ENTERPRISE)/*.go \
--config ./conf/prometheus.yml \
--cluster cluster
$(GO_BUILD_CONTEXT_ENTERPRISE)/*.go
.PHONY: go-test
go-test: ## Runs go unit tests
@@ -102,10 +101,9 @@ go-run-community: ## Runs the community go backend server
SIGNOZ_ALERTMANAGER_PROVIDER=signoz \
SIGNOZ_TELEMETRYSTORE_PROVIDER=clickhouse \
SIGNOZ_TELEMETRYSTORE_CLICKHOUSE_DSN=tcp://127.0.0.1:9000 \
SIGNOZ_TELEMETRYSTORE_CLICKHOUSE_CLUSTER=cluster \
go run -race \
$(GO_BUILD_CONTEXT_COMMUNITY)/*.go server \
--config ./conf/prometheus.yml \
--cluster cluster
$(GO_BUILD_CONTEXT_COMMUNITY)/*.go server
.PHONY: go-build-community $(GO_BUILD_ARCHS_COMMUNITY)
go-build-community: ## Builds the go backend server for community
@@ -208,4 +206,4 @@ py-lint: ## Run lint for integration tests
.PHONY: py-test
py-test: ## Runs integration tests
@cd tests/integration && poetry run pytest --basetemp=./tmp/ -vv --capture=no src/
@cd tests/integration && poetry run pytest --basetemp=./tmp/ -vv --capture=no src/

View File

@@ -31,7 +31,6 @@ builds:
- -X github.com/SigNoz/signoz/pkg/version.branch={{ .Branch }}
- -X github.com/SigNoz/signoz/ee/zeus.url=https://api.signoz.cloud
- -X github.com/SigNoz/signoz/ee/zeus.deprecatedURL=https://license.signoz.io
- -X github.com/SigNoz/signoz/ee/query-service/constants.ZeusURL=https://api.signoz.cloud
- -X github.com/SigNoz/signoz/ee/query-service/constants.LicenseSignozIo=https://license.signoz.io/api/v1
- -X github.com/SigNoz/signoz/pkg/analytics.key=9kRrJ7oPCGPEJLF6QjMPLt5bljFhRQBr
mod_timestamp: "{{ .CommitTimestamp }}"

View File

@@ -176,7 +176,7 @@ services:
# - ../common/clickhouse/storage.xml:/etc/clickhouse-server/config.d/storage.xml
signoz:
!!merge <<: *db-depend
image: signoz/signoz:v0.99.0
image: signoz/signoz:v0.101.0
command:
- --config=/root/config/prometheus.yml
ports:
@@ -209,7 +209,7 @@ services:
retries: 3
otel-collector:
!!merge <<: *db-depend
image: signoz/signoz-otel-collector:v0.129.7
image: signoz/signoz-otel-collector:v0.129.8
command:
- --config=/etc/otel-collector-config.yaml
- --manager-config=/etc/manager-config.yaml
@@ -233,7 +233,7 @@ services:
- signoz
schema-migrator:
!!merge <<: *common
image: signoz/signoz-schema-migrator:v0.129.7
image: signoz/signoz-schema-migrator:v0.129.8
deploy:
restart_policy:
condition: on-failure

View File

@@ -117,7 +117,7 @@ services:
# - ../common/clickhouse/storage.xml:/etc/clickhouse-server/config.d/storage.xml
signoz:
!!merge <<: *db-depend
image: signoz/signoz:v0.99.0
image: signoz/signoz:v0.101.0
command:
- --config=/root/config/prometheus.yml
ports:
@@ -150,7 +150,7 @@ services:
retries: 3
otel-collector:
!!merge <<: *db-depend
image: signoz/signoz-otel-collector:v0.129.7
image: signoz/signoz-otel-collector:v0.129.8
command:
- --config=/etc/otel-collector-config.yaml
- --manager-config=/etc/manager-config.yaml
@@ -176,7 +176,7 @@ services:
- signoz
schema-migrator:
!!merge <<: *common
image: signoz/signoz-schema-migrator:v0.129.7
image: signoz/signoz-schema-migrator:v0.129.8
deploy:
restart_policy:
condition: on-failure

View File

@@ -179,7 +179,7 @@ services:
# - ../common/clickhouse/storage.xml:/etc/clickhouse-server/config.d/storage.xml
signoz:
!!merge <<: *db-depend
image: signoz/signoz:${VERSION:-v0.99.0}
image: signoz/signoz:${VERSION:-v0.101.0}
container_name: signoz
command:
- --config=/root/config/prometheus.yml
@@ -213,7 +213,7 @@ services:
# TODO: support otel-collector multiple replicas. Nginx/Traefik for loadbalancing?
otel-collector:
!!merge <<: *db-depend
image: signoz/signoz-otel-collector:${OTELCOL_TAG:-v0.129.7}
image: signoz/signoz-otel-collector:${OTELCOL_TAG:-v0.129.8}
container_name: signoz-otel-collector
command:
- --config=/etc/otel-collector-config.yaml
@@ -239,7 +239,7 @@ services:
condition: service_healthy
schema-migrator-sync:
!!merge <<: *common
image: signoz/signoz-schema-migrator:${OTELCOL_TAG:-v0.129.7}
image: signoz/signoz-schema-migrator:${OTELCOL_TAG:-v0.129.8}
container_name: schema-migrator-sync
command:
- sync
@@ -250,7 +250,7 @@ services:
condition: service_healthy
schema-migrator-async:
!!merge <<: *db-depend
image: signoz/signoz-schema-migrator:${OTELCOL_TAG:-v0.129.7}
image: signoz/signoz-schema-migrator:${OTELCOL_TAG:-v0.129.8}
container_name: schema-migrator-async
command:
- async

View File

@@ -111,7 +111,7 @@ services:
# - ../common/clickhouse/storage.xml:/etc/clickhouse-server/config.d/storage.xml
signoz:
!!merge <<: *db-depend
image: signoz/signoz:${VERSION:-v0.99.0}
image: signoz/signoz:${VERSION:-v0.101.0}
container_name: signoz
command:
- --config=/root/config/prometheus.yml
@@ -144,7 +144,7 @@ services:
retries: 3
otel-collector:
!!merge <<: *db-depend
image: signoz/signoz-otel-collector:${OTELCOL_TAG:-v0.129.7}
image: signoz/signoz-otel-collector:${OTELCOL_TAG:-v0.129.8}
container_name: signoz-otel-collector
command:
- --config=/etc/otel-collector-config.yaml
@@ -166,7 +166,7 @@ services:
condition: service_healthy
schema-migrator-sync:
!!merge <<: *common
image: signoz/signoz-schema-migrator:${OTELCOL_TAG:-v0.129.7}
image: signoz/signoz-schema-migrator:${OTELCOL_TAG:-v0.129.8}
container_name: schema-migrator-sync
command:
- sync
@@ -178,7 +178,7 @@ services:
restart: on-failure
schema-migrator-async:
!!merge <<: *db-depend
image: signoz/signoz-schema-migrator:${OTELCOL_TAG:-v0.129.7}
image: signoz/signoz-schema-migrator:${OTELCOL_TAG:-v0.129.8}
container_name: schema-migrator-async
command:
- async

View File

@@ -103,9 +103,19 @@ Remember to replace the region and ingestion key with proper values as obtained
Both SigNoz and the OTel demo app [the frontend-proxy service, to be accurate] allocate port 8080. To prevent port allocation conflicts, modify the OTel demo application config to use port 8081 as the `ENVOY_PORT` value as shown below, and run the docker compose command.
Also, both SigNoz and the OTel Demo App have the same `PROMETHEUS_PORT` configured; by default both try to start on `9090`, which may cause either of them to fail depending on which one acquires the port first. To prevent this, we need to modify the value of `PROMETHEUS_PORT` too.
```sh
ENVOY_PORT=8081 docker compose up -d
ENVOY_PORT=8081 PROMETHEUS_PORT=9091 docker compose up -d
```
Alternatively, we can set these values in the `.env` file, which reduces the command to just:
```sh
docker compose up -d
```
This spins up multiple microservices with OpenTelemetry instrumentation enabled. You can verify this by running:
```sh
docker compose ps -a

View File

@@ -21,12 +21,12 @@ type role
define update: [user, role#assignee]
define delete: [user, role#assignee]
type resources
type metaresources
relations
define create: [user, role#assignee]
define list: [user, role#assignee]
type resource
type metaresource
relations
define read: [user, anonymous, role#assignee]
define update: [user, role#assignee]
@@ -35,6 +35,6 @@ type resource
define block: [user, role#assignee]
type telemetry
type telemetryresource
relations
define read: [user, anonymous, role#assignee]

View File

@@ -10,7 +10,6 @@ import (
"strings"
"time"
"github.com/SigNoz/signoz/ee/query-service/constants"
"github.com/SigNoz/signoz/pkg/errors"
"github.com/SigNoz/signoz/pkg/http/render"
"github.com/SigNoz/signoz/pkg/modules/user"
@@ -77,7 +76,7 @@ func (ah *APIHandler) CloudIntegrationsGenerateConnectionParams(w http.ResponseW
return
}
ingestionUrl, signozApiUrl, apiErr := getIngestionUrlAndSigNozAPIUrl(r.Context(), license.Key)
ingestionUrl, signozApiUrl, apiErr := ah.getIngestionUrlAndSigNozAPIUrl(r.Context(), license.Key)
if apiErr != nil {
RespondError(w, basemodel.WrapApiError(
apiErr, "couldn't deduce ingestion url and signoz api url",
@@ -186,48 +185,37 @@ func (ah *APIHandler) getOrCreateCloudIntegrationUser(
return cloudIntegrationUser, nil
}
func getIngestionUrlAndSigNozAPIUrl(ctx context.Context, licenseKey string) (
func (ah *APIHandler) getIngestionUrlAndSigNozAPIUrl(ctx context.Context, licenseKey string) (
string, string, *basemodel.ApiError,
) {
url := fmt.Sprintf(
"%s%s",
strings.TrimSuffix(constants.ZeusURL, "/"),
"/v2/deployments/me",
)
// TODO: remove this struct from here
type deploymentResponse struct {
Status string `json:"status"`
Error string `json:"error"`
Data struct {
Name string `json:"name"`
ClusterInfo struct {
Region struct {
DNS string `json:"dns"`
} `json:"region"`
} `json:"cluster"`
} `json:"data"`
Name string `json:"name"`
ClusterInfo struct {
Region struct {
DNS string `json:"dns"`
} `json:"region"`
} `json:"cluster"`
}
resp, apiErr := requestAndParseResponse[deploymentResponse](
ctx, url, map[string]string{"X-Signoz-Cloud-Api-Key": licenseKey}, nil,
)
if apiErr != nil {
return "", "", basemodel.WrapApiError(
apiErr, "couldn't query for deployment info",
)
}
if resp.Status != "success" {
respBytes, err := ah.Signoz.Zeus.GetDeployment(ctx, licenseKey)
if err != nil {
return "", "", basemodel.InternalError(fmt.Errorf(
"couldn't query for deployment info: status: %s, error: %s",
resp.Status, resp.Error,
"couldn't query for deployment info: error: %w", err,
))
}
regionDns := resp.Data.ClusterInfo.Region.DNS
deploymentName := resp.Data.Name
resp := new(deploymentResponse)
err = json.Unmarshal(respBytes, resp)
if err != nil {
return "", "", basemodel.InternalError(fmt.Errorf(
"couldn't unmarshal deployment info response: error: %w", err,
))
}
regionDns := resp.ClusterInfo.Region.DNS
deploymentName := resp.Name
if len(regionDns) < 1 || len(deploymentName) < 1 {
// Fail early if actual response structure and expectation here ever diverge

View File

@@ -10,9 +10,6 @@ var SaasSegmentKey = GetOrDefaultEnv("SIGNOZ_SAAS_SEGMENT_KEY", "")
var FetchFeatures = GetOrDefaultEnv("FETCH_FEATURES", "false")
var ZeusFeaturesURL = GetOrDefaultEnv("ZEUS_FEATURES_URL", "ZeusFeaturesURL")
// this is set via build time variable
var ZeusURL = "https://api.signoz.cloud"
func GetOrDefaultEnv(key string, fallback string) string {
v := os.Getenv(key)
if len(v) == 0 {

View File

@@ -0,0 +1,153 @@
package postgressqlstore
import (
"strings"
"github.com/SigNoz/signoz/pkg/sqlstore"
"github.com/uptrace/bun/schema"
)
type formatter struct {
bunf schema.Formatter
}
func newFormatter(dialect schema.Dialect) sqlstore.SQLFormatter {
return &formatter{bunf: schema.NewFormatter(dialect)}
}
func (f *formatter) JSONExtractString(column, path string) []byte {
var sql []byte
sql = f.bunf.AppendIdent(sql, column)
sql = append(sql, f.convertJSONPathToPostgres(path)...)
return sql
}
func (f *formatter) JSONType(column, path string) []byte {
var sql []byte
sql = append(sql, "jsonb_typeof("...)
sql = f.bunf.AppendIdent(sql, column)
sql = append(sql, f.convertJSONPathToPostgresWithMode(path, false)...)
sql = append(sql, ')')
return sql
}
func (f *formatter) JSONIsArray(column, path string) []byte {
var sql []byte
sql = append(sql, f.JSONType(column, path)...)
sql = append(sql, " = "...)
sql = schema.Append(f.bunf, sql, "array")
return sql
}
func (f *formatter) JSONArrayElements(column, path, alias string) ([]byte, []byte) {
var sql []byte
sql = append(sql, "jsonb_array_elements("...)
sql = f.bunf.AppendIdent(sql, column)
sql = append(sql, f.convertJSONPathToPostgresWithMode(path, false)...)
sql = append(sql, ") AS "...)
sql = f.bunf.AppendIdent(sql, alias)
return sql, []byte(alias)
}
func (f *formatter) JSONArrayOfStrings(column, path, alias string) ([]byte, []byte) {
var sql []byte
sql = append(sql, "jsonb_array_elements_text("...)
sql = f.bunf.AppendIdent(sql, column)
sql = append(sql, f.convertJSONPathToPostgresWithMode(path, false)...)
sql = append(sql, ") AS "...)
sql = f.bunf.AppendIdent(sql, alias)
return sql, append([]byte(alias), "::text"...)
}
func (f *formatter) JSONKeys(column, path, alias string) ([]byte, []byte) {
var sql []byte
sql = append(sql, "jsonb_each("...)
sql = f.bunf.AppendIdent(sql, column)
sql = append(sql, f.convertJSONPathToPostgresWithMode(path, false)...)
sql = append(sql, ") AS "...)
sql = f.bunf.AppendIdent(sql, alias)
return sql, append([]byte(alias), ".key"...)
}
func (f *formatter) JSONArrayAgg(expression string) []byte {
var sql []byte
sql = append(sql, "jsonb_agg("...)
sql = append(sql, expression...)
sql = append(sql, ')')
return sql
}
func (f *formatter) JSONArrayLiteral(values ...string) []byte {
var sql []byte
sql = append(sql, "jsonb_build_array("...)
for idx, value := range values {
if idx > 0 {
sql = append(sql, ", "...)
}
sql = schema.Append(f.bunf, sql, value)
}
sql = append(sql, ')')
return sql
}
func (f *formatter) TextToJsonColumn(column string) []byte {
var sql []byte
sql = f.bunf.AppendIdent(sql, column)
sql = append(sql, "::jsonb"...)
return sql
}
func (f *formatter) convertJSONPathToPostgres(jsonPath string) []byte {
return f.convertJSONPathToPostgresWithMode(jsonPath, true)
}
func (f *formatter) convertJSONPathToPostgresWithMode(jsonPath string, asText bool) []byte {
path := strings.TrimPrefix(strings.TrimPrefix(jsonPath, "$"), ".")
if path == "" {
return nil
}
parts := strings.Split(path, ".")
var validParts []string
for _, part := range parts {
if part != "" {
validParts = append(validParts, part)
}
}
if len(validParts) == 0 {
return nil
}
var result []byte
for idx, part := range validParts {
if idx == len(validParts)-1 {
if asText {
result = append(result, "->>"...)
} else {
result = append(result, "->"...)
}
result = schema.Append(f.bunf, result, part)
return result
}
result = append(result, "->"...)
result = schema.Append(f.bunf, result, part)
}
return result
}
func (f *formatter) LowerExpression(expression string) []byte {
var sql []byte
sql = append(sql, "lower("...)
sql = append(sql, expression...)
sql = append(sql, ')')
return sql
}

View File

@@ -0,0 +1,500 @@
package postgressqlstore
import (
"testing"
"github.com/stretchr/testify/assert"
"github.com/uptrace/bun/dialect/pgdialect"
)
func TestJSONExtractString(t *testing.T) {
tests := []struct {
name string
column string
path string
expected string
}{
{
name: "simple path",
column: "data",
path: "$.field",
expected: `"data"->>'field'`,
},
{
name: "nested path",
column: "metadata",
path: "$.user.name",
expected: `"metadata"->'user'->>'name'`,
},
{
name: "deeply nested path",
column: "json_col",
path: "$.level1.level2.level3",
expected: `"json_col"->'level1'->'level2'->>'level3'`,
},
{
name: "root path",
column: "json_col",
path: "$",
expected: `"json_col"`,
},
{
name: "empty path",
column: "data",
path: "",
expected: `"data"`,
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
f := newFormatter(pgdialect.New())
got := string(f.JSONExtractString(tt.column, tt.path))
assert.Equal(t, tt.expected, got)
})
}
}
func TestJSONType(t *testing.T) {
tests := []struct {
name string
column string
path string
expected string
}{
{
name: "simple path",
column: "data",
path: "$.field",
expected: `jsonb_typeof("data"->'field')`,
},
{
name: "nested path",
column: "metadata",
path: "$.user.age",
expected: `jsonb_typeof("metadata"->'user'->'age')`,
},
{
name: "root path",
column: "json_col",
path: "$",
expected: `jsonb_typeof("json_col")`,
},
{
name: "empty path",
column: "data",
path: "",
expected: `jsonb_typeof("data")`,
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
f := newFormatter(pgdialect.New())
got := string(f.JSONType(tt.column, tt.path))
assert.Equal(t, tt.expected, got)
})
}
}
func TestJSONIsArray(t *testing.T) {
tests := []struct {
name string
column string
path string
expected string
}{
{
name: "simple path",
column: "data",
path: "$.items",
expected: `jsonb_typeof("data"->'items') = 'array'`,
},
{
name: "nested path",
column: "metadata",
path: "$.user.tags",
expected: `jsonb_typeof("metadata"->'user'->'tags') = 'array'`,
},
{
name: "root path",
column: "json_col",
path: "$",
expected: `jsonb_typeof("json_col") = 'array'`,
},
{
name: "empty path",
column: "data",
path: "",
expected: `jsonb_typeof("data") = 'array'`,
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
f := newFormatter(pgdialect.New())
got := string(f.JSONIsArray(tt.column, tt.path))
assert.Equal(t, tt.expected, got)
})
}
}
func TestJSONArrayElements(t *testing.T) {
tests := []struct {
name string
column string
path string
alias string
expected string
}{
{
name: "root path with dollar sign",
column: "data",
path: "$",
alias: "elem",
expected: `jsonb_array_elements("data") AS "elem"`,
},
{
name: "root path empty",
column: "data",
path: "",
alias: "elem",
expected: `jsonb_array_elements("data") AS "elem"`,
},
{
name: "nested path",
column: "metadata",
path: "$.items",
alias: "item",
expected: `jsonb_array_elements("metadata"->'items') AS "item"`,
},
{
name: "deeply nested path",
column: "json_col",
path: "$.user.tags",
alias: "tag",
expected: `jsonb_array_elements("json_col"->'user'->'tags') AS "tag"`,
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
f := newFormatter(pgdialect.New())
got, _ := f.JSONArrayElements(tt.column, tt.path, tt.alias)
assert.Equal(t, tt.expected, string(got))
})
}
}
func TestJSONArrayOfStrings(t *testing.T) {
tests := []struct {
name string
column string
path string
alias string
expected string
}{
{
name: "root path with dollar sign",
column: "data",
path: "$",
alias: "str",
expected: `jsonb_array_elements_text("data") AS "str"`,
},
{
name: "root path empty",
column: "data",
path: "",
alias: "str",
expected: `jsonb_array_elements_text("data") AS "str"`,
},
{
name: "nested path",
column: "metadata",
path: "$.strings",
alias: "s",
expected: `jsonb_array_elements_text("metadata"->'strings') AS "s"`,
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
f := newFormatter(pgdialect.New())
got, _ := f.JSONArrayOfStrings(tt.column, tt.path, tt.alias)
assert.Equal(t, tt.expected, string(got))
})
}
}
func TestJSONKeys(t *testing.T) {
tests := []struct {
name string
column string
path string
alias string
expected string
}{
{
name: "root path with dollar sign",
column: "data",
path: "$",
alias: "k",
expected: `jsonb_each("data") AS "k"`,
},
{
name: "root path empty",
column: "data",
path: "",
alias: "k",
expected: `jsonb_each("data") AS "k"`,
},
{
name: "nested path",
column: "metadata",
path: "$.object",
alias: "key",
expected: `jsonb_each("metadata"->'object') AS "key"`,
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
f := newFormatter(pgdialect.New())
got, _ := f.JSONKeys(tt.column, tt.path, tt.alias)
assert.Equal(t, tt.expected, string(got))
})
}
}
func TestJSONArrayAgg(t *testing.T) {
tests := []struct {
name string
expression string
expected string
}{
{
name: "simple column",
expression: "id",
expected: "jsonb_agg(id)",
},
{
name: "expression with function",
expression: "DISTINCT name",
expected: "jsonb_agg(DISTINCT name)",
},
{
name: "complex expression",
expression: "data->>'field'",
expected: "jsonb_agg(data->>'field')",
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
f := newFormatter(pgdialect.New())
got := string(f.JSONArrayAgg(tt.expression))
assert.Equal(t, tt.expected, got)
})
}
}
func TestJSONArrayLiteral(t *testing.T) {
tests := []struct {
name string
values []string
expected string
}{
{
name: "empty array",
values: []string{},
expected: "jsonb_build_array()",
},
{
name: "single value",
values: []string{"value1"},
expected: "jsonb_build_array('value1')",
},
{
name: "multiple values",
values: []string{"value1", "value2", "value3"},
expected: "jsonb_build_array('value1', 'value2', 'value3')",
},
{
name: "values with special characters",
values: []string{"test", "with space", "with-dash"},
expected: "jsonb_build_array('test', 'with space', 'with-dash')",
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
f := newFormatter(pgdialect.New())
got := string(f.JSONArrayLiteral(tt.values...))
assert.Equal(t, tt.expected, got)
})
}
}
func TestConvertJSONPathToPostgresWithMode(t *testing.T) {
tests := []struct {
name string
jsonPath string
asText bool
expected string
}{
{
name: "simple path as text",
jsonPath: "$.field",
asText: true,
expected: "->>'field'",
},
{
name: "simple path as json",
jsonPath: "$.field",
asText: false,
expected: "->'field'",
},
{
name: "nested path as text",
jsonPath: "$.user.name",
asText: true,
expected: "->'user'->>'name'",
},
{
name: "nested path as json",
jsonPath: "$.user.name",
asText: false,
expected: "->'user'->'name'",
},
{
name: "deeply nested as text",
jsonPath: "$.a.b.c.d",
asText: true,
expected: "->'a'->'b'->'c'->>'d'",
},
{
name: "root path",
jsonPath: "$",
asText: true,
expected: "",
},
{
name: "empty path",
jsonPath: "",
asText: true,
expected: "",
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
f := newFormatter(pgdialect.New()).(*formatter)
got := string(f.convertJSONPathToPostgresWithMode(tt.jsonPath, tt.asText))
assert.Equal(t, tt.expected, got)
})
}
}
func TestTextToJsonColumn(t *testing.T) {
tests := []struct {
name string
column string
expected string
}{
{
name: "simple column name",
column: "data",
expected: `"data"::jsonb`,
},
{
name: "column with underscore",
column: "user_data",
expected: `"user_data"::jsonb`,
},
{
name: "column with special characters",
column: "json-col",
expected: `"json-col"::jsonb`,
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
f := newFormatter(pgdialect.New())
got := string(f.TextToJsonColumn(tt.column))
assert.Equal(t, tt.expected, got)
})
}
}
func TestLowerExpression(t *testing.T) {
tests := []struct {
name string
expr string
expected string
}{
{
name: "simple column name",
expr: "name",
expected: "lower(name)",
},
{
name: "quoted column identifier",
expr: `"column_name"`,
expected: `lower("column_name")`,
},
{
name: "jsonb text extraction",
expr: "data->>'field'",
expected: "lower(data->>'field')",
},
{
name: "nested jsonb extraction",
expr: "metadata->'user'->>'name'",
expected: "lower(metadata->'user'->>'name')",
},
{
name: "jsonb_typeof expression",
expr: "jsonb_typeof(data->'field')",
expected: "lower(jsonb_typeof(data->'field'))",
},
{
name: "string concatenation",
expr: "first_name || ' ' || last_name",
expected: "lower(first_name || ' ' || last_name)",
},
{
name: "CAST expression",
expr: "CAST(value AS TEXT)",
expected: "lower(CAST(value AS TEXT))",
},
{
name: "COALESCE expression",
expr: "COALESCE(name, 'default')",
expected: "lower(COALESCE(name, 'default'))",
},
{
name: "subquery column",
expr: "users.email",
expected: "lower(users.email)",
},
{
name: "quoted identifier with special chars",
expr: `"user-name"`,
expected: `lower("user-name")`,
},
{
name: "jsonb to text cast",
expr: "data::text",
expected: "lower(data::text)",
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
f := newFormatter(pgdialect.New())
got := string(f.LowerExpression(tt.expr))
assert.Equal(t, tt.expected, got)
})
}
}

View File

@@ -15,10 +15,11 @@ import (
)
type provider struct {
settings factory.ScopedProviderSettings
sqldb *sql.DB
bundb *sqlstore.BunDB
dialect *dialect
settings factory.ScopedProviderSettings
sqldb *sql.DB
bundb *sqlstore.BunDB
dialect *dialect
formatter sqlstore.SQLFormatter
}
func NewFactory(hookFactories ...factory.ProviderFactory[sqlstore.SQLStoreHook, sqlstore.Config]) factory.ProviderFactory[sqlstore.SQLStore, sqlstore.Config] {
@@ -55,11 +56,14 @@ func New(ctx context.Context, providerSettings factory.ProviderSettings, config
sqldb := stdlib.OpenDBFromPool(pool)
pgDialect := pgdialect.New()
bunDB := sqlstore.NewBunDB(settings, sqldb, pgDialect, hooks)
return &provider{
settings: settings,
sqldb: sqldb,
bundb: sqlstore.NewBunDB(settings, sqldb, pgdialect.New(), hooks),
dialect: new(dialect),
settings: settings,
sqldb: sqldb,
bundb: bunDB,
dialect: new(dialect),
formatter: newFormatter(bunDB.Dialect()),
}, nil
}
@@ -75,6 +79,10 @@ func (provider *provider) Dialect() sqlstore.SQLDialect {
return provider.dialect
}
func (provider *provider) Formatter() sqlstore.SQLFormatter {
return provider.formatter
}
func (provider *provider) BunDBCtx(ctx context.Context) bun.IDB {
return provider.bundb.BunDBCtx(ctx)
}

View File

@@ -1,6 +1,4 @@
// Mock for useSafeNavigate hook to avoid React Router version conflicts in tests
export { isEventObject } from '../src/utils/isEventObject';
interface SafeNavigateOptions {
replace?: boolean;
state?: unknown;

View File

@@ -69,7 +69,7 @@
"antd": "5.11.0",
"antd-table-saveas-excel": "2.2.1",
"antlr4": "4.13.2",
"axios": "1.8.2",
"axios": "1.12.0",
"babel-eslint": "^10.1.0",
"babel-jest": "^29.6.4",
"babel-loader": "9.1.3",

View File

@@ -274,7 +274,7 @@ function App(): JSX.Element {
chat_settings: {
app_id: process.env.PYLON_APP_ID,
email: user.email,
name: user.displayName,
name: user.displayName || user.email,
},
};
}

View File

@@ -1,4 +1,4 @@
import { ApiBaseInstance as axios } from 'api';
import { LogEventAxiosInstance as axios } from 'api';
import { ErrorResponseHandler } from 'api/ErrorResponseHandler';
import { AxiosError } from 'axios';
import { ErrorResponse, SuccessResponse } from 'types/api';

View File

@@ -1,13 +1,11 @@
/* eslint-disable sonarjs/no-duplicate-string */
import { ApiBaseInstance } from 'api';
import axios from 'api';
import { getFieldKeys } from '../getFieldKeys';
// Mock the API instance
jest.mock('api', () => ({
ApiBaseInstance: {
get: jest.fn(),
},
get: jest.fn(),
}));
describe('getFieldKeys API', () => {
@@ -31,33 +29,33 @@ describe('getFieldKeys API', () => {
it('should call API with correct parameters when no args provided', async () => {
// Mock successful API response
(ApiBaseInstance.get as jest.Mock).mockResolvedValueOnce(mockSuccessResponse);
(axios.get as jest.Mock).mockResolvedValueOnce(mockSuccessResponse);
// Call function with no parameters
await getFieldKeys();
// Verify API was called correctly with empty params object
expect(ApiBaseInstance.get).toHaveBeenCalledWith('/fields/keys', {
expect(axios.get).toHaveBeenCalledWith('/fields/keys', {
params: {},
});
});
it('should call API with signal parameter when provided', async () => {
// Mock successful API response
(ApiBaseInstance.get as jest.Mock).mockResolvedValueOnce(mockSuccessResponse);
(axios.get as jest.Mock).mockResolvedValueOnce(mockSuccessResponse);
// Call function with signal parameter
await getFieldKeys('traces');
// Verify API was called with signal parameter
expect(ApiBaseInstance.get).toHaveBeenCalledWith('/fields/keys', {
expect(axios.get).toHaveBeenCalledWith('/fields/keys', {
params: { signal: 'traces' },
});
});
it('should call API with name parameter when provided', async () => {
// Mock successful API response
(ApiBaseInstance.get as jest.Mock).mockResolvedValueOnce({
(axios.get as jest.Mock).mockResolvedValueOnce({
status: 200,
data: {
status: 'success',
@@ -72,14 +70,14 @@ describe('getFieldKeys API', () => {
await getFieldKeys(undefined, 'service');
// Verify API was called with name parameter
expect(ApiBaseInstance.get).toHaveBeenCalledWith('/fields/keys', {
expect(axios.get).toHaveBeenCalledWith('/fields/keys', {
params: { name: 'service' },
});
});
it('should call API with both signal and name when provided', async () => {
// Mock successful API response
(ApiBaseInstance.get as jest.Mock).mockResolvedValueOnce({
(axios.get as jest.Mock).mockResolvedValueOnce({
status: 200,
data: {
status: 'success',
@@ -94,14 +92,14 @@ describe('getFieldKeys API', () => {
await getFieldKeys('logs', 'service');
// Verify API was called with both parameters
expect(ApiBaseInstance.get).toHaveBeenCalledWith('/fields/keys', {
expect(axios.get).toHaveBeenCalledWith('/fields/keys', {
params: { signal: 'logs', name: 'service' },
});
});
it('should return properly formatted response', async () => {
// Mock API to return our response
(ApiBaseInstance.get as jest.Mock).mockResolvedValueOnce(mockSuccessResponse);
(axios.get as jest.Mock).mockResolvedValueOnce(mockSuccessResponse);
// Call the function
const result = await getFieldKeys('traces');

View File

@@ -1,13 +1,11 @@
/* eslint-disable sonarjs/no-duplicate-string */
import { ApiBaseInstance } from 'api';
import axios from 'api';
import { getFieldValues } from '../getFieldValues';
// Mock the API instance
jest.mock('api', () => ({
ApiBaseInstance: {
get: jest.fn(),
},
get: jest.fn(),
}));
describe('getFieldValues API', () => {
@@ -17,7 +15,7 @@ describe('getFieldValues API', () => {
it('should call the API with correct parameters (no options)', async () => {
// Mock API response
(ApiBaseInstance.get as jest.Mock).mockResolvedValueOnce({
(axios.get as jest.Mock).mockResolvedValueOnce({
status: 200,
data: {
status: 'success',
@@ -34,14 +32,14 @@ describe('getFieldValues API', () => {
await getFieldValues();
// Verify API was called correctly with empty params
expect(ApiBaseInstance.get).toHaveBeenCalledWith('/fields/values', {
expect(axios.get).toHaveBeenCalledWith('/fields/values', {
params: {},
});
});
it('should call the API with signal parameter', async () => {
// Mock API response
(ApiBaseInstance.get as jest.Mock).mockResolvedValueOnce({
(axios.get as jest.Mock).mockResolvedValueOnce({
status: 200,
data: {
status: 'success',
@@ -58,14 +56,14 @@ describe('getFieldValues API', () => {
await getFieldValues('traces');
// Verify API was called with signal parameter
expect(ApiBaseInstance.get).toHaveBeenCalledWith('/fields/values', {
expect(axios.get).toHaveBeenCalledWith('/fields/values', {
params: { signal: 'traces' },
});
});
it('should call the API with name parameter', async () => {
// Mock API response
(ApiBaseInstance.get as jest.Mock).mockResolvedValueOnce({
(axios.get as jest.Mock).mockResolvedValueOnce({
status: 200,
data: {
status: 'success',
@@ -82,14 +80,14 @@ describe('getFieldValues API', () => {
await getFieldValues(undefined, 'service.name');
// Verify API was called with name parameter
expect(ApiBaseInstance.get).toHaveBeenCalledWith('/fields/values', {
expect(axios.get).toHaveBeenCalledWith('/fields/values', {
params: { name: 'service.name' },
});
});
it('should call the API with value parameter', async () => {
// Mock API response
(ApiBaseInstance.get as jest.Mock).mockResolvedValueOnce({
(axios.get as jest.Mock).mockResolvedValueOnce({
status: 200,
data: {
status: 'success',
@@ -106,14 +104,14 @@ describe('getFieldValues API', () => {
await getFieldValues(undefined, 'service.name', 'front');
// Verify API was called with value parameter
expect(ApiBaseInstance.get).toHaveBeenCalledWith('/fields/values', {
expect(axios.get).toHaveBeenCalledWith('/fields/values', {
params: { name: 'service.name', searchText: 'front' },
});
});
it('should call the API with time range parameters', async () => {
// Mock API response
(ApiBaseInstance.get as jest.Mock).mockResolvedValueOnce({
(axios.get as jest.Mock).mockResolvedValueOnce({
status: 200,
data: {
status: 'success',
@@ -138,7 +136,7 @@ describe('getFieldValues API', () => {
);
// Verify API was called with time range parameters (converted to milliseconds)
expect(ApiBaseInstance.get).toHaveBeenCalledWith('/fields/values', {
expect(axios.get).toHaveBeenCalledWith('/fields/values', {
params: {
signal: 'logs',
name: 'service.name',
@@ -165,7 +163,7 @@ describe('getFieldValues API', () => {
},
};
(ApiBaseInstance.get as jest.Mock).mockResolvedValueOnce(mockResponse);
(axios.get as jest.Mock).mockResolvedValueOnce(mockResponse);
// Call the function
const result = await getFieldValues('traces', 'mixed.values');
@@ -196,7 +194,7 @@ describe('getFieldValues API', () => {
};
// Mock API to return our response
(ApiBaseInstance.get as jest.Mock).mockResolvedValueOnce(mockApiResponse);
(axios.get as jest.Mock).mockResolvedValueOnce(mockApiResponse);
// Call the function
const result = await getFieldValues('traces', 'service.name');

View File

@@ -1,4 +1,4 @@
import { ApiBaseInstance } from 'api';
import axios from 'api';
import { ErrorResponseHandlerV2 } from 'api/ErrorResponseHandlerV2';
import { AxiosError } from 'axios';
import { ErrorV2Resp, SuccessResponseV2 } from 'types/api';
@@ -24,7 +24,7 @@ export const getFieldKeys = async (
}
try {
const response = await ApiBaseInstance.get('/fields/keys', { params });
const response = await axios.get('/fields/keys', { params });
return {
httpStatusCode: response.status,

View File

@@ -1,5 +1,5 @@
/* eslint-disable sonarjs/cognitive-complexity */
import { ApiBaseInstance } from 'api';
import axios from 'api';
import { ErrorResponseHandlerV2 } from 'api/ErrorResponseHandlerV2';
import { AxiosError } from 'axios';
import { ErrorV2Resp, SuccessResponseV2 } from 'types/api';
@@ -47,7 +47,7 @@ export const getFieldValues = async (
}
try {
const response = await ApiBaseInstance.get('/fields/values', { params });
const response = await axios.get('/fields/values', { params });
// Normalize values from different types (stringValues, boolValues, etc.)
if (response.data?.data?.values) {

View File

@@ -86,8 +86,9 @@ const interceptorRejected = async (
if (
response.status === 401 &&
// if the session rotate call errors out with 401 or the delete sessions call returns 401 then we do not retry!
// if the session rotate or create session call errors out with 401, or the delete sessions call returns 401, then we do not retry!
response.config.url !== '/sessions/rotate' &&
response.config.url !== '/sessions/email_password' &&
!(
response.config.url === '/sessions' && response.config.method === 'delete'
)
@@ -199,15 +200,15 @@ ApiV5Instance.interceptors.request.use(interceptorsRequestResponse);
//
// axios Base
export const ApiBaseInstance = axios.create({
export const LogEventAxiosInstance = axios.create({
baseURL: `${ENVIRONMENT.baseURL}${apiV1}`,
});
ApiBaseInstance.interceptors.response.use(
LogEventAxiosInstance.interceptors.response.use(
interceptorsResponse,
interceptorRejectedBase,
);
ApiBaseInstance.interceptors.request.use(interceptorsRequestResponse);
LogEventAxiosInstance.interceptors.request.use(interceptorsRequestResponse);
//
// gateway Api V1

View File

@@ -1,4 +1,4 @@
import { ApiBaseInstance } from 'api';
import axios from 'api';
import { ErrorResponseHandler } from 'api/ErrorResponseHandler';
import { AxiosError, AxiosResponse } from 'axios';
import { baseAutoCompleteIdKeysOrder } from 'constants/queryBuilder';
@@ -17,7 +17,7 @@ export const getHostAttributeKeys = async (
try {
const response: AxiosResponse<{
data: IQueryAutocompleteResponse;
}> = await ApiBaseInstance.get(
}> = await axios.get(
`/${entity}/attribute_keys?dataSource=metrics&searchText=${searchText}`,
{
params: {

View File

@@ -1,4 +1,4 @@
import { ApiBaseInstance } from 'api';
import axios from 'api';
import { ErrorResponseHandler } from 'api/ErrorResponseHandler';
import { AxiosError } from 'axios';
import { SOMETHING_WENT_WRONG } from 'constants/api';
@@ -20,7 +20,7 @@ const getOnboardingStatus = async (props: {
}): Promise<SuccessResponse<OnboardingStatusResponse> | ErrorResponse> => {
const { endpointService, ...rest } = props;
try {
const response = await ApiBaseInstance.post(
const response = await axios.post(
`/messaging-queues/kafka/onboarding/${endpointService || 'consumers'}`,
rest,
);

View File

@@ -1,13 +1,20 @@
import axios from 'api';
import { ApiV2Instance } from 'api';
import { ErrorResponseHandlerV2 } from 'api/ErrorResponseHandlerV2';
import { AxiosError } from 'axios';
import { ErrorV2Resp } from 'types/api';
import { PayloadProps, Props } from 'types/api/metrics/getService';
const getService = async (props: Props): Promise<PayloadProps> => {
const response = await axios.post(`/services`, {
start: `${props.start}`,
end: `${props.end}`,
tags: props.selectedTags,
});
return response.data;
try {
const response = await ApiV2Instance.post(`/services`, {
start: `${props.start}`,
end: `${props.end}`,
tags: props.selectedTags,
});
return response.data.data;
} catch (error) {
ErrorResponseHandlerV2(error as AxiosError<ErrorV2Resp>);
}
};
export default getService;

View File

@@ -1,22 +1,27 @@
import axios from 'api';
import { ApiV2Instance } from 'api';
import { ErrorResponseHandlerV2 } from 'api/ErrorResponseHandlerV2';
import { AxiosError } from 'axios';
import { ErrorV2Resp } from 'types/api';
import { PayloadProps, Props } from 'types/api/metrics/getTopOperations';
const getTopOperations = async (props: Props): Promise<PayloadProps> => {
const endpoint = props.isEntryPoint
? '/service/entry_point_operations'
: '/service/top_operations';
try {
const endpoint = props.isEntryPoint
? '/service/entry_point_operations'
: '/service/top_operations';
const response = await axios.post(endpoint, {
start: `${props.start}`,
end: `${props.end}`,
service: props.service,
tags: props.selectedTags,
});
const response = await ApiV2Instance.post(endpoint, {
start: `${props.start}`,
end: `${props.end}`,
service: props.service,
tags: props.selectedTags,
limit: 5000,
});
if (props.isEntryPoint) {
return response.data.data;
} catch (error) {
ErrorResponseHandlerV2(error as AxiosError<ErrorV2Resp>);
}
return response.data;
};
export default getTopOperations;

View File

@@ -1,4 +1,4 @@
import { ApiBaseInstance } from 'api';
import axios from 'api';
import { ErrorResponseHandler } from 'api/ErrorResponseHandler';
import { AxiosError } from 'axios';
import { ErrorResponse, SuccessResponse } from 'types/api';
@@ -9,7 +9,7 @@ const getCustomFilters = async (
): Promise<SuccessResponse<PayloadProps> | ErrorResponse> => {
const { signal } = props;
try {
const response = await ApiBaseInstance.get(`orgs/me/filters/${signal}`);
const response = await axios.get(`/orgs/me/filters/${signal}`);
return {
statusCode: 200,

View File

@@ -1,4 +1,4 @@
import { ApiBaseInstance } from 'api';
import axios from 'api';
import { AxiosError } from 'axios';
import { SuccessResponse } from 'types/api';
import { UpdateCustomFiltersProps } from 'types/api/quickFilters/updateCustomFilters';
@@ -6,7 +6,7 @@ import { UpdateCustomFiltersProps } from 'types/api/quickFilters/updateCustomFil
const updateCustomFiltersAPI = async (
props: UpdateCustomFiltersProps,
): Promise<SuccessResponse<void> | AxiosError> =>
ApiBaseInstance.put(`orgs/me/filters`, {
axios.put(`/orgs/me/filters`, {
...props.data,
});

View File

@@ -1,4 +1,4 @@
import { ApiBaseInstance } from 'api';
import axios from 'api';
import { ErrorResponseHandlerV2 } from 'api/ErrorResponseHandlerV2';
import { AxiosError } from 'axios';
import { ErrorV2Resp, SuccessResponseV2 } from 'types/api';
@@ -9,15 +9,12 @@ const listOverview = async (
): Promise<SuccessResponseV2<PayloadProps>> => {
const { start, end, show_ip: showIp, filter } = props;
try {
const response = await ApiBaseInstance.post(
`/third-party-apis/overview/list`,
{
start,
end,
show_ip: showIp,
filter,
},
);
const response = await axios.post(`/third-party-apis/overview/list`, {
start,
end,
show_ip: showIp,
filter,
});
return {
httpStatusCode: response.status,

View File

@@ -0,0 +1,28 @@
import axios from 'api';
import { ErrorResponseHandlerV2 } from 'api/ErrorResponseHandlerV2';
import { AxiosError } from 'axios';
import { ErrorV2Resp, SuccessResponseV2 } from 'types/api';
import {
GetSpanPercentilesProps,
GetSpanPercentilesResponseDataProps,
} from 'types/api/trace/getSpanPercentiles';
const getSpanPercentiles = async (
props: GetSpanPercentilesProps,
): Promise<SuccessResponseV2<GetSpanPercentilesResponseDataProps>> => {
try {
const response = await axios.post('/span_percentile', {
...props,
});
return {
httpStatusCode: response.status,
data: response.data.data,
};
} catch (error) {
ErrorResponseHandlerV2(error as AxiosError<ErrorV2Resp>);
throw error;
}
};
export default getSpanPercentiles;
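Since getSpanPercentiles is a new client, a minimal consumption sketch may help. This is a hypothetical hook: the import paths and the exact shape of `props` are assumed rather than taken from this diff, while REACT_QUERY_KEY.GET_SPAN_PERCENTILES is the key added later in this change.
import { useQuery } from 'react-query';
import { REACT_QUERY_KEY } from 'constants/reactQueryKeys';
import { GetSpanPercentilesProps } from 'types/api/trace/getSpanPercentiles';
import getSpanPercentiles from 'api/trace/getSpanPercentiles'; // path assumed for illustration

function useSpanPercentiles(props: GetSpanPercentilesProps) {
  // Errors are routed through ErrorResponseHandlerV2 and then re-thrown,
  // so the query still lands in the error state for the caller to handle.
  return useQuery({
    queryKey: [REACT_QUERY_KEY.GET_SPAN_PERCENTILES, props],
    queryFn: () => getSpanPercentiles(props),
  });
}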

View File

@@ -11,7 +11,7 @@ import {
export const getQueryRangeV5 = async (
props: QueryRangePayloadV5,
version: string,
signal: AbortSignal,
signal?: AbortSignal,
headers?: Record<string, string>,
): Promise<SuccessResponseV2<MetricRangePayloadV5>> => {
try {

View File

@@ -0,0 +1,371 @@
/* eslint-disable sonarjs/no-duplicate-string */
import { getYAxisFormattedValue, PrecisionOptionsEnum } from '../yAxisConfig';
const testFullPrecisionGetYAxisFormattedValue = (
value: string,
format: string,
): string => getYAxisFormattedValue(value, format, PrecisionOptionsEnum.FULL);
describe('getYAxisFormattedValue - none (full precision legacy assertions)', () => {
test('large integers and decimals', () => {
expect(testFullPrecisionGetYAxisFormattedValue('250034', 'none')).toBe(
'250034',
);
expect(
testFullPrecisionGetYAxisFormattedValue('250034897.12345', 'none'),
).toBe('250034897.12345');
expect(
testFullPrecisionGetYAxisFormattedValue('250034897.02354', 'none'),
).toBe('250034897.02354');
expect(testFullPrecisionGetYAxisFormattedValue('9999999.9999', 'none')).toBe(
'9999999.9999',
);
});
test('preserves leading zeros after decimal until first non-zero', () => {
expect(testFullPrecisionGetYAxisFormattedValue('1.0000234', 'none')).toBe(
'1.0000234',
);
expect(testFullPrecisionGetYAxisFormattedValue('0.00003', 'none')).toBe(
'0.00003',
);
});
test('trims to three significant decimals and removes trailing zeros', () => {
expect(
testFullPrecisionGetYAxisFormattedValue('0.000000250034', 'none'),
).toBe('0.000000250034');
expect(testFullPrecisionGetYAxisFormattedValue('0.00000025', 'none')).toBe(
'0.00000025',
);
// Very long decimals are limited by JavaScript's floating-point precision (~16 significant digits)
expect(
testFullPrecisionGetYAxisFormattedValue('1.0000000000000001', 'none'),
).toBe('1');
expect(
testFullPrecisionGetYAxisFormattedValue('1.00555555559595876', 'none'),
).toBe('1.005555555595958');
expect(testFullPrecisionGetYAxisFormattedValue('0.000000001', 'none')).toBe(
'0.000000001',
);
expect(
testFullPrecisionGetYAxisFormattedValue('0.000000250000', 'none'),
).toBe('0.00000025');
});
test('whole numbers normalize', () => {
expect(testFullPrecisionGetYAxisFormattedValue('1000', 'none')).toBe('1000');
expect(testFullPrecisionGetYAxisFormattedValue('99.5458', 'none')).toBe(
'99.5458',
);
expect(testFullPrecisionGetYAxisFormattedValue('1.234567', 'none')).toBe(
'1.234567',
);
expect(testFullPrecisionGetYAxisFormattedValue('99.998', 'none')).toBe(
'99.998',
);
});
test('strip redundant decimal zeros', () => {
expect(testFullPrecisionGetYAxisFormattedValue('1000.000', 'none')).toBe(
'1000',
);
expect(testFullPrecisionGetYAxisFormattedValue('99.500', 'none')).toBe(
'99.5',
);
expect(testFullPrecisionGetYAxisFormattedValue('1.000', 'none')).toBe('1');
});
test('edge values', () => {
expect(testFullPrecisionGetYAxisFormattedValue('0', 'none')).toBe('0');
expect(testFullPrecisionGetYAxisFormattedValue('-0', 'none')).toBe('0');
expect(testFullPrecisionGetYAxisFormattedValue('Infinity', 'none')).toBe('∞');
expect(testFullPrecisionGetYAxisFormattedValue('-Infinity', 'none')).toBe(
'-∞',
);
expect(testFullPrecisionGetYAxisFormattedValue('invalid', 'none')).toBe(
'NaN',
);
expect(testFullPrecisionGetYAxisFormattedValue('', 'none')).toBe('NaN');
expect(testFullPrecisionGetYAxisFormattedValue('abc123', 'none')).toBe('NaN');
});
test('small decimals keep precision as-is', () => {
expect(testFullPrecisionGetYAxisFormattedValue('0.0001', 'none')).toBe(
'0.0001',
);
expect(testFullPrecisionGetYAxisFormattedValue('-0.0001', 'none')).toBe(
'-0.0001',
);
expect(testFullPrecisionGetYAxisFormattedValue('0.000000001', 'none')).toBe(
'0.000000001',
);
});
test('simple decimals preserved', () => {
expect(testFullPrecisionGetYAxisFormattedValue('0.1', 'none')).toBe('0.1');
expect(testFullPrecisionGetYAxisFormattedValue('0.2', 'none')).toBe('0.2');
expect(testFullPrecisionGetYAxisFormattedValue('0.3', 'none')).toBe('0.3');
expect(testFullPrecisionGetYAxisFormattedValue('1.0000000001', 'none')).toBe(
'1.0000000001',
);
});
});
describe('getYAxisFormattedValue - units (full precision legacy assertions)', () => {
test('ms', () => {
expect(testFullPrecisionGetYAxisFormattedValue('1500', 'ms')).toBe('1.5 s');
expect(testFullPrecisionGetYAxisFormattedValue('500', 'ms')).toBe('500 ms');
expect(testFullPrecisionGetYAxisFormattedValue('60000', 'ms')).toBe('1 min');
expect(testFullPrecisionGetYAxisFormattedValue('295.429', 'ms')).toBe(
'295.429 ms',
);
expect(testFullPrecisionGetYAxisFormattedValue('4353.81', 'ms')).toBe(
'4.35381 s',
);
});
test('s', () => {
expect(testFullPrecisionGetYAxisFormattedValue('90', 's')).toBe('1.5 mins');
expect(testFullPrecisionGetYAxisFormattedValue('30', 's')).toBe('30 s');
expect(testFullPrecisionGetYAxisFormattedValue('3600', 's')).toBe('1 hour');
});
test('m', () => {
expect(testFullPrecisionGetYAxisFormattedValue('90', 'm')).toBe('1.5 hours');
expect(testFullPrecisionGetYAxisFormattedValue('30', 'm')).toBe('30 min');
expect(testFullPrecisionGetYAxisFormattedValue('1440', 'm')).toBe('1 day');
});
test('bytes', () => {
expect(testFullPrecisionGetYAxisFormattedValue('1024', 'bytes')).toBe(
'1 KiB',
);
expect(testFullPrecisionGetYAxisFormattedValue('512', 'bytes')).toBe('512 B');
expect(testFullPrecisionGetYAxisFormattedValue('1536', 'bytes')).toBe(
'1.5 KiB',
);
});
test('mbytes', () => {
expect(testFullPrecisionGetYAxisFormattedValue('1024', 'mbytes')).toBe(
'1 GiB',
);
expect(testFullPrecisionGetYAxisFormattedValue('512', 'mbytes')).toBe(
'512 MiB',
);
expect(testFullPrecisionGetYAxisFormattedValue('1536', 'mbytes')).toBe(
'1.5 GiB',
);
});
test('kbytes', () => {
expect(testFullPrecisionGetYAxisFormattedValue('1024', 'kbytes')).toBe(
'1 MiB',
);
expect(testFullPrecisionGetYAxisFormattedValue('512', 'kbytes')).toBe(
'512 KiB',
);
expect(testFullPrecisionGetYAxisFormattedValue('1536', 'kbytes')).toBe(
'1.5 MiB',
);
});
test('short', () => {
expect(testFullPrecisionGetYAxisFormattedValue('1000', 'short')).toBe('1 K');
expect(testFullPrecisionGetYAxisFormattedValue('1500', 'short')).toBe(
'1.5 K',
);
expect(testFullPrecisionGetYAxisFormattedValue('999', 'short')).toBe('999');
expect(testFullPrecisionGetYAxisFormattedValue('1000000', 'short')).toBe(
'1 Mil',
);
expect(testFullPrecisionGetYAxisFormattedValue('1555600', 'short')).toBe(
'1.5556 Mil',
);
expect(testFullPrecisionGetYAxisFormattedValue('999999', 'short')).toBe(
'999.999 K',
);
expect(testFullPrecisionGetYAxisFormattedValue('1000000000', 'short')).toBe(
'1 Bil',
);
expect(testFullPrecisionGetYAxisFormattedValue('1500000000', 'short')).toBe(
'1.5 Bil',
);
expect(testFullPrecisionGetYAxisFormattedValue('999999999', 'short')).toBe(
'999.999999 Mil',
);
});
test('percent', () => {
expect(testFullPrecisionGetYAxisFormattedValue('0.15', 'percent')).toBe(
'0.15%',
);
expect(testFullPrecisionGetYAxisFormattedValue('0.1234', 'percent')).toBe(
'0.1234%',
);
expect(testFullPrecisionGetYAxisFormattedValue('0.123499', 'percent')).toBe(
'0.123499%',
);
expect(testFullPrecisionGetYAxisFormattedValue('1.5', 'percent')).toBe(
'1.5%',
);
expect(testFullPrecisionGetYAxisFormattedValue('0.0001', 'percent')).toBe(
'0.0001%',
);
expect(
testFullPrecisionGetYAxisFormattedValue('0.000000001', 'percent'),
).toBe('1e-9%');
expect(
testFullPrecisionGetYAxisFormattedValue('0.000000250034', 'percent'),
).toBe('0.000000250034%');
expect(testFullPrecisionGetYAxisFormattedValue('0.00000025', 'percent')).toBe(
'0.00000025%',
);
// Very long decimals are limited by JavaScript's floating-point precision (~16 significant digits)
expect(
testFullPrecisionGetYAxisFormattedValue('1.0000000000000001', 'percent'),
).toBe('1%');
expect(
testFullPrecisionGetYAxisFormattedValue('1.00555555559595876', 'percent'),
).toBe('1.005555555595958%');
});
test('ratio', () => {
expect(testFullPrecisionGetYAxisFormattedValue('0.5', 'ratio')).toBe(
'0.5 ratio',
);
expect(testFullPrecisionGetYAxisFormattedValue('1.25', 'ratio')).toBe(
'1.25 ratio',
);
expect(testFullPrecisionGetYAxisFormattedValue('2.0', 'ratio')).toBe(
'2 ratio',
);
});
test('temperature units', () => {
expect(testFullPrecisionGetYAxisFormattedValue('25', 'celsius')).toBe(
'25 °C',
);
expect(testFullPrecisionGetYAxisFormattedValue('0', 'celsius')).toBe('0 °C');
expect(testFullPrecisionGetYAxisFormattedValue('-10', 'celsius')).toBe(
'-10 °C',
);
expect(testFullPrecisionGetYAxisFormattedValue('77', 'fahrenheit')).toBe(
'77 °F',
);
expect(testFullPrecisionGetYAxisFormattedValue('32', 'fahrenheit')).toBe(
'32 °F',
);
expect(testFullPrecisionGetYAxisFormattedValue('14', 'fahrenheit')).toBe(
'14 °F',
);
});
test('ms edge cases', () => {
expect(testFullPrecisionGetYAxisFormattedValue('0', 'ms')).toBe('0 ms');
expect(testFullPrecisionGetYAxisFormattedValue('-1500', 'ms')).toBe('-1.5 s');
expect(testFullPrecisionGetYAxisFormattedValue('Infinity', 'ms')).toBe('∞');
});
test('bytes edge cases', () => {
expect(testFullPrecisionGetYAxisFormattedValue('0', 'bytes')).toBe('0 B');
expect(testFullPrecisionGetYAxisFormattedValue('-1024', 'bytes')).toBe(
'-1 KiB',
);
});
});
describe('getYAxisFormattedValue - precision option tests', () => {
test('precision 0 drops decimal part', () => {
expect(getYAxisFormattedValue('1.2345', 'none', 0)).toBe('1');
expect(getYAxisFormattedValue('0.9999', 'none', 0)).toBe('0');
expect(getYAxisFormattedValue('12345.6789', 'none', 0)).toBe('12345');
expect(getYAxisFormattedValue('0.0000123456', 'none', 0)).toBe('0');
expect(getYAxisFormattedValue('1000.000', 'none', 0)).toBe('1000');
expect(getYAxisFormattedValue('0.000000250034', 'none', 0)).toBe('0');
expect(getYAxisFormattedValue('1.00555555559595876', 'none', 0)).toBe('1');
// with unit
expect(getYAxisFormattedValue('4353.81', 'ms', 0)).toBe('4 s');
});
test('precision 1,2,3,4 decimals', () => {
expect(getYAxisFormattedValue('1.2345', 'none', 1)).toBe('1.2');
expect(getYAxisFormattedValue('1.2345', 'none', 2)).toBe('1.23');
expect(getYAxisFormattedValue('1.2345', 'none', 3)).toBe('1.234');
expect(getYAxisFormattedValue('1.2345', 'none', 4)).toBe('1.2345');
expect(getYAxisFormattedValue('0.0000123456', 'none', 1)).toBe('0.00001');
expect(getYAxisFormattedValue('0.0000123456', 'none', 2)).toBe('0.000012');
expect(getYAxisFormattedValue('0.0000123456', 'none', 3)).toBe('0.0000123');
expect(getYAxisFormattedValue('0.0000123456', 'none', 4)).toBe('0.00001234');
expect(getYAxisFormattedValue('1000.000', 'none', 1)).toBe('1000');
expect(getYAxisFormattedValue('1000.000', 'none', 2)).toBe('1000');
expect(getYAxisFormattedValue('1000.000', 'none', 3)).toBe('1000');
expect(getYAxisFormattedValue('1000.000', 'none', 4)).toBe('1000');
expect(getYAxisFormattedValue('0.000000250034', 'none', 1)).toBe('0.0000002');
expect(getYAxisFormattedValue('0.000000250034', 'none', 2)).toBe(
'0.00000025',
); // leading zeros + 2 significant => same trimmed
expect(getYAxisFormattedValue('0.000000250034', 'none', 3)).toBe(
'0.00000025',
);
expect(getYAxisFormattedValue('0.000000250304', 'none', 4)).toBe(
'0.0000002503',
);
expect(getYAxisFormattedValue('1.00555555559595876', 'none', 1)).toBe(
'1.005',
);
expect(getYAxisFormattedValue('1.00555555559595876', 'none', 2)).toBe(
'1.0055',
);
expect(getYAxisFormattedValue('1.00555555559595876', 'none', 3)).toBe(
'1.00555',
);
expect(getYAxisFormattedValue('1.00555555559595876', 'none', 4)).toBe(
'1.005555',
);
// with unit
expect(getYAxisFormattedValue('4353.81', 'ms', 1)).toBe('4.4 s');
expect(getYAxisFormattedValue('4353.81', 'ms', 2)).toBe('4.35 s');
expect(getYAxisFormattedValue('4353.81', 'ms', 3)).toBe('4.354 s');
expect(getYAxisFormattedValue('4353.81', 'ms', 4)).toBe('4.3538 s');
// Percentages
expect(getYAxisFormattedValue('0.123456', 'percent', 2)).toBe('0.12%');
expect(getYAxisFormattedValue('0.123456', 'percent', 4)).toBe('0.1235%'); // approximation
});
test('precision full uses up to DEFAULT_SIGNIFICANT_DIGITS significant digits', () => {
expect(
getYAxisFormattedValue(
'0.00002625429914148441',
'none',
PrecisionOptionsEnum.FULL,
),
).toBe('0.000026254299141');
expect(
getYAxisFormattedValue(
'0.000026254299141484417',
's',
PrecisionOptionsEnum.FULL,
),
).toBe('26254299141484417000000 µs');
expect(
getYAxisFormattedValue('4353.81', 'ms', PrecisionOptionsEnum.FULL),
).toBe('4.35381 s');
expect(getYAxisFormattedValue('500', 'ms', PrecisionOptionsEnum.FULL)).toBe(
'500 ms',
);
});
});

View File

@@ -1,58 +1,168 @@
/* eslint-disable sonarjs/cognitive-complexity */
import { formattedValueToString, getValueFormat } from '@grafana/data';
import * as Sentry from '@sentry/react';
import { UniversalYAxisUnit } from 'components/YAxisUnitSelector/types';
import { isUniversalUnit } from 'components/YAxisUnitSelector/utils';
import { isNaN } from 'lodash-es';
import { formatUniversalUnit } from '../YAxisUnitSelector/formatter';
const DEFAULT_SIGNIFICANT_DIGITS = 15;
// Cap the number of decimals kept at 15 places to avoid floating-point precision artifacts
const MAX_DECIMALS = 15;
export enum PrecisionOptionsEnum {
ZERO = 0,
ONE = 1,
TWO = 2,
THREE = 3,
FOUR = 4,
FULL = 'full',
}
export type PrecisionOption = 0 | 1 | 2 | 3 | 4 | PrecisionOptionsEnum.FULL;
/**
* Formats a number for display, preserving leading zeros after the decimal point
* and showing up to DEFAULT_SIGNIFICANT_DIGITS digits after the first non-zero decimal digit.
* It avoids scientific notation and removes unnecessary trailing zeros.
*
* @example
* formatDecimalWithLeadingZeros(1.2345); // "1.2345"
* formatDecimalWithLeadingZeros(0.0012345); // "0.0012345"
* formatDecimalWithLeadingZeros(5.0); // "5"
*
* @param value The number to format.
* @param precision The number of significant decimal digits to keep, or PrecisionOptionsEnum.FULL.
* @returns The formatted string.
*/
const formatDecimalWithLeadingZeros = (
value: number,
precision: PrecisionOption,
): string => {
if (value === 0) {
return '0';
}
// Use toLocaleString to get a full decimal representation without scientific notation.
const numStr = value.toLocaleString('en-US', {
useGrouping: false,
maximumFractionDigits: 20,
});
const [integerPart, decimalPart = ''] = numStr.split('.');
// If there's no decimal part, the integer part is the result.
if (!decimalPart) {
return integerPart;
}
// Find the index of the first non-zero digit in the decimal part.
const firstNonZeroIndex = decimalPart.search(/[^0]/);
// If the decimal part consists only of zeros, return just the integer part.
if (firstNonZeroIndex === -1) {
return integerPart;
}
// Determine the number of decimals to keep: leading zeros + up to N significant digits.
const significantDigits =
precision === PrecisionOptionsEnum.FULL
? DEFAULT_SIGNIFICANT_DIGITS
: precision;
const decimalsToKeep = firstNonZeroIndex + (significantDigits || 0);
// Clamp to MAX_DECIMALS to avoid floating-point precision artifacts.
const finalDecimalsToKeep = Math.min(decimalsToKeep, MAX_DECIMALS);
const trimmedDecimalPart = decimalPart.substring(0, finalDecimalsToKeep);
// If precision is 0, we drop the decimal part entirely.
if (precision === 0) {
return integerPart;
}
// Remove any trailing zeros from the result to keep it clean.
const finalDecimalPart = trimmedDecimalPart.replace(/0+$/, '');
// Return the integer part, or the integer and decimal parts combined.
return finalDecimalPart ? `${integerPart}.${finalDecimalPart}` : integerPart;
};
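// Worked example (values taken from the accompanying precision tests): for
// value = 0.00002625429914148441 and precision = FULL, the decimal part is
// "00002625429914148441", so firstNonZeroIndex = 4 and
// decimalsToKeep = 4 + DEFAULT_SIGNIFICANT_DIGITS = 19, which is clamped to
// MAX_DECIMALS = 15. The first 15 decimals are "000026254299141", giving
// "0.000026254299141" after trailing-zero trimming (none needed here).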
/**
* Formats a Y-axis value based on a given format string.
*
* @param value The string value from the axis.
* @param format The format identifier (e.g. 'none', 'ms', 'bytes', 'short').
* @param precision How many significant decimal digits to keep (defaults to 2), or PrecisionOptionsEnum.FULL.
* @returns A formatted string ready for display.
*/
export const getYAxisFormattedValue = (
value: string,
format: string,
precision: PrecisionOption = 2, // precision used when the caller does not pass one
): string => {
let decimalPrecision: number | undefined;
const parsedValue = getValueFormat(format)(
parseFloat(value),
undefined,
undefined,
undefined,
);
const numValue = parseFloat(value);
// Handle non-numeric or special values first.
if (isNaN(numValue)) return 'NaN';
if (numValue === Infinity) return '∞';
if (numValue === -Infinity) return '-∞';
const decimalPlaces = value.split('.')[1]?.length || undefined;
// Use custom formatter for the 'none' format honoring precision
if (format === 'none') {
return formatDecimalWithLeadingZeros(numValue, precision);
}
// For all other standard formats, delegate to grafana/data's built-in formatter.
const computeDecimals = (): number | undefined => {
if (precision === PrecisionOptionsEnum.FULL) {
return decimalPlaces && decimalPlaces >= DEFAULT_SIGNIFICANT_DIGITS
? decimalPlaces
: DEFAULT_SIGNIFICANT_DIGITS;
}
return precision;
};
const fallbackFormat = (): string => {
if (precision === PrecisionOptionsEnum.FULL) return numValue.toString();
if (precision === 0) return Math.round(numValue).toString();
return precision !== undefined
? numValue
.toFixed(precision)
.replace(/(\.[0-9]*[1-9])0+$/, '$1') // trimming zeros
.replace(/\.$/, '')
: numValue.toString();
};
try {
const decimalSplitted = parsedValue.text.split('.');
if (decimalSplitted.length === 1) {
decimalPrecision = 0;
} else {
const decimalDigits = decimalSplitted[1].split('');
decimalPrecision = decimalDigits.length;
let nonZeroCtr = 0;
for (let idx = 0; idx < decimalDigits.length; idx += 1) {
if (decimalDigits[idx] !== '0') {
nonZeroCtr += 1;
if (nonZeroCtr >= 2) {
decimalPrecision = idx + 1;
}
} else if (nonZeroCtr) {
decimalPrecision = idx;
break;
}
}
const formatter = getValueFormat(format);
const formattedValue = formatter(numValue, computeDecimals(), undefined);
if (formattedValue.text && formattedValue.text.includes('.')) {
formattedValue.text = formatDecimalWithLeadingZeros(
parseFloat(formattedValue.text),
precision,
);
}
return formattedValueToString(
getValueFormat(format)(
parseFloat(value),
decimalPrecision,
undefined,
undefined,
),
);
// Separate logic for universal units
if (format && isUniversalUnit(format)) {
return formatUniversalUnit(parseFloat(value), format as UniversalYAxisUnit);
}
return formattedValueToString(formattedValue);
} catch (error) {
console.error(error);
Sentry.captureEvent({
message: `Error applying formatter: ${
error instanceof Error ? error.message : 'Unknown error'
}`,
level: 'error',
});
return fallbackFormat();
}
return `${parseFloat(value)}`;
};
export const getToolTipValue = (value: string, format?: string): string => {
try {
return formattedValueToString(
getValueFormat(format)(parseFloat(value), undefined, undefined, undefined),
);
} catch (error) {
console.error(error);
}
return `${value}`;
};
export const getToolTipValue = (
value: string | number,
format?: string,
precision?: PrecisionOption,
): string =>
getYAxisFormattedValue(value?.toString(), format || 'none', precision);
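getToolTipValue is now a thin wrapper over getYAxisFormattedValue, so tooltips pick up the same precision semantics as the axis. Two quick illustrations, using values already asserted in the precision tests earlier in this change (precision defaults to 2 when none is passed):
getToolTipValue('4353.81', 'ms'); // '4.35 s' — same as getYAxisFormattedValue('4353.81', 'ms', 2)
getToolTipValue(1.2345, 'none', 4); // '1.2345' — numbers are stringified before formatting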

View File

@@ -60,6 +60,14 @@ function Metrics({
setElement,
} = useMultiIntersectionObserver(hostWidgetInfo.length, { threshold: 0.1 });
const legendScrollPositionRef = useRef<{
scrollTop: number;
scrollLeft: number;
}>({
scrollTop: 0,
scrollLeft: 0,
});
const queryPayloads = useMemo(
() =>
getHostQueryPayload(
@@ -147,6 +155,13 @@ function Metrics({
maxTimeScale: graphTimeIntervals[idx].end,
onDragSelect: (start, end) => onDragSelect(start, end, idx),
query: currentQuery,
legendScrollPosition: legendScrollPositionRef.current,
setLegendScrollPosition: (position: {
scrollTop: number;
scrollLeft: number;
}) => {
legendScrollPositionRef.current = position;
},
}),
),
[

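The `legendScrollPosition` / `setLegendScrollPosition` pair threads a ref through the chart options so the legend's scroll offset survives chart re-renders without causing extra renders itself. The chart-side consumer is not shown in this diff; the following is a purely hypothetical sketch of how such a consumer could behave, assuming a plain scrollable legend element:
import { ReactNode, useLayoutEffect, useRef } from 'react';

type ScrollPos = { scrollTop: number; scrollLeft: number };

function LegendContainer({
  legendScrollPosition,
  setLegendScrollPosition,
  children,
}: {
  legendScrollPosition: ScrollPos;
  setLegendScrollPosition: (pos: ScrollPos) => void;
  children: ReactNode;
}): JSX.Element {
  const ref = useRef<HTMLDivElement>(null);

  // Restore the last reported offset after every re-render of the legend.
  useLayoutEffect(() => {
    if (ref.current) {
      ref.current.scrollTop = legendScrollPosition.scrollTop;
      ref.current.scrollLeft = legendScrollPosition.scrollLeft;
    }
  });

  return (
    <div
      ref={ref}
      style={{ overflow: 'auto' }}
      // Report offsets back into the parent's ref; mutating a ref avoids re-renders.
      onScroll={(e): void =>
        setLegendScrollPosition({
          scrollTop: e.currentTarget.scrollTop,
          scrollLeft: e.currentTarget.scrollLeft,
        })
      }
    >
      {children}
    </div>
  );
}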
View File

@@ -132,9 +132,9 @@
justify-content: center;
}
.json-action-btn {
.log-detail-drawer__actions {
display: flex;
gap: 8px;
gap: 4px;
}
}

View File

@@ -319,31 +319,35 @@ function LogDetailInner({
</Radio.Button>
</Radio.Group>
{selectedView === VIEW_TYPES.JSON && (
<div className="json-action-btn">
<div className="log-detail-drawer__actions">
{selectedView === VIEW_TYPES.CONTEXT && (
<Tooltip
title="Show Filters"
placement="topLeft"
aria-label="Show Filters"
>
<Button
className="action-btn"
icon={<Filter size={16} />}
onClick={handleFilterVisible}
/>
</Tooltip>
)}
<Tooltip
title={selectedView === VIEW_TYPES.JSON ? 'Copy JSON' : 'Copy Log Link'}
placement="topLeft"
aria-label={
selectedView === VIEW_TYPES.JSON ? 'Copy JSON' : 'Copy Log Link'
}
>
<Button
className="action-btn"
icon={<Copy size={16} />}
onClick={handleJSONCopy}
onClick={selectedView === VIEW_TYPES.JSON ? handleJSONCopy : onLogCopy}
/>
</div>
)}
{selectedView === VIEW_TYPES.CONTEXT && (
<Button
className="action-btn"
icon={<Filter size={16} />}
onClick={handleFilterVisible}
/>
)}
<Tooltip title="Copy Log Link" placement="left" aria-label="Copy Log Link">
<Button
className="action-btn"
icon={<Copy size={16} />}
onClick={onLogCopy}
/>
</Tooltip>
</Tooltip>
</div>
</div>
{isFilterVisible && contextQuery?.builder.queryData[0] && (
<div className="log-detail-drawer-query-container">
@@ -383,7 +387,8 @@ function LogDetailInner({
podName={log.resources_string?.[RESOURCE_KEYS.POD_NAME] || ''}
nodeName={log.resources_string?.[RESOURCE_KEYS.NODE_NAME] || ''}
hostName={log.resources_string?.[RESOURCE_KEYS.HOST_NAME] || ''}
logLineTimestamp={log.timestamp.toString()}
timestamp={log.timestamp.toString()}
dataSource={DataSource.LOGS}
/>
)}
</Drawer>

View File

@@ -398,7 +398,7 @@
}
.qb-search-container {
.metrics-select-container {
.metrics-container {
margin-bottom: 12px;
}
}

View File

@@ -22,6 +22,8 @@ export const QueryBuilderV2 = memo(function QueryBuilderV2({
showOnlyWhereClause = false,
showTraceOperator = false,
version,
onSignalSourceChange,
signalSourceChangeEnabled = false,
}: QueryBuilderProps): JSX.Element {
const {
currentQuery,
@@ -175,6 +177,8 @@ export const QueryBuilderV2 = memo(function QueryBuilderV2({
queryVariant={config?.queryVariant || 'dropdown'}
showOnlyWhereClause={showOnlyWhereClause}
isListViewPanel={isListViewPanel}
onSignalSourceChange={onSignalSourceChange || ((): void => {})}
signalSourceChangeEnabled={signalSourceChangeEnabled}
/>
) : (
currentQuery.builder.queryData.map((query, index) => (
@@ -193,7 +197,9 @@ export const QueryBuilderV2 = memo(function QueryBuilderV2({
queryVariant={config?.queryVariant || 'dropdown'}
showOnlyWhereClause={showOnlyWhereClause}
isListViewPanel={isListViewPanel}
signalSource={config?.signalSource || ''}
signalSource={query.source as 'meter' | ''}
onSignalSourceChange={onSignalSourceChange || ((): void => {})}
signalSourceChangeEnabled={signalSourceChangeEnabled}
/>
))
)}

View File

@@ -1,5 +1,14 @@
.metrics-select-container {
.metrics-source-select-container {
margin-bottom: 8px;
display: flex;
flex-direction: row;
align-items: flex-start;
gap: 8px;
width: 100%;
.source-selector {
width: 120px;
}
.ant-select-selector {
width: 100%;
@@ -42,7 +51,7 @@
}
.lightMode {
.metrics-select-container {
.metrics-source-select-container {
.ant-select-selector {
border: 1px solid var(--bg-vanilla-300) !important;
background: var(--bg-vanilla-100);

View File

@@ -1,21 +1,39 @@
import './MetricsSelect.styles.scss';
import { Select } from 'antd';
import {
initialQueriesMap,
initialQueryMeterWithType,
PANEL_TYPES,
} from 'constants/queryBuilder';
import { AggregatorFilter } from 'container/QueryBuilder/filters';
import { useQueryBuilder } from 'hooks/queryBuilder/useQueryBuilder';
import { useQueryOperations } from 'hooks/queryBuilder/useQueryBuilderOperations';
import { memo, useCallback, useState } from 'react';
import { memo, useCallback, useMemo, useState } from 'react';
import { BaseAutocompleteData } from 'types/api/queryBuilder/queryAutocompleteResponse';
import { IBuilderQuery } from 'types/api/queryBuilder/queryBuilderData';
import { DataSource } from 'types/common/queryBuilder';
import { SelectOption } from 'types/common/select';
export const SOURCE_OPTIONS: SelectOption<string, string>[] = [
{ value: 'metrics', label: 'Metrics' },
{ value: 'meter', label: 'Meter' },
];
export const MetricsSelect = memo(function MetricsSelect({
query,
index,
version,
signalSource,
onSignalSourceChange,
signalSourceChangeEnabled = false,
}: {
query: IBuilderQuery;
index: number;
version: string;
signalSource: 'meter' | '';
onSignalSourceChange: (value: string) => void;
signalSourceChangeEnabled: boolean;
}): JSX.Element {
const [attributeKeys, setAttributeKeys] = useState<BaseAutocompleteData[]>([]);
@@ -31,8 +49,67 @@ export const MetricsSelect = memo(function MetricsSelect({
},
[handleChangeAggregatorAttribute, attributeKeys],
);
const { updateAllQueriesOperators, handleSetQueryData } = useQueryBuilder();
const source = useMemo(
() => (signalSource === 'meter' ? 'meter' : 'metrics'),
[signalSource],
);
const defaultMeterQuery = useMemo(
() =>
updateAllQueriesOperators(
initialQueryMeterWithType,
PANEL_TYPES.BAR,
DataSource.METRICS,
'meter' as 'meter' | '',
),
[updateAllQueriesOperators],
);
const defaultMetricsQuery = useMemo(
() =>
updateAllQueriesOperators(
initialQueriesMap.metrics,
PANEL_TYPES.BAR,
DataSource.METRICS,
'',
),
[updateAllQueriesOperators],
);
const handleSignalSourceChange = (value: string): void => {
onSignalSourceChange(value);
handleSetQueryData(
index,
value === 'meter'
? {
...defaultMeterQuery.builder.queryData[0],
source: 'meter',
queryName: query.queryName,
}
: {
...defaultMetricsQuery.builder.queryData[0],
source: '',
queryName: query.queryName,
},
);
};
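// Design note: switching the source swaps in a fresh default query template
// (meter vs. metrics) instead of mutating the existing query, so stale
// aggregations and filters from the previous source cannot leak across;
// only the queryName is preserved.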
return (
<div className="metrics-select-container">
<div className="metrics-source-select-container">
{signalSourceChangeEnabled && (
<Select
className="source-selector"
placeholder="Source"
options={SOURCE_OPTIONS}
value={source}
defaultValue="metrics"
onChange={handleSignalSourceChange}
/>
)}
<AggregatorFilter
onChange={handleAggregatorAttributeChange}
query={query}

View File

@@ -33,7 +33,13 @@ export const QueryV2 = memo(function QueryV2({
showOnlyWhereClause = false,
signalSource = '',
isMultiQueryAllowed = false,
}: QueryProps & { ref: React.RefObject<HTMLDivElement> }): JSX.Element {
onSignalSourceChange,
signalSourceChangeEnabled = false,
}: QueryProps & {
ref: React.RefObject<HTMLDivElement>;
onSignalSourceChange: (value: string) => void;
signalSourceChangeEnabled: boolean;
}): JSX.Element {
const { cloneQuery, panelType } = useQueryBuilder();
const showFunctions = query?.functions?.length > 0;
@@ -207,12 +213,14 @@ export const QueryV2 = memo(function QueryV2({
<div className="qb-elements-container">
<div className="qb-search-container">
{dataSource === DataSource.METRICS && (
<div className="metrics-select-container">
<div className="metrics-container">
<MetricsSelect
query={query}
index={index}
version={ENTITY_VERSION_V5}
signalSource={signalSource as 'meter' | ''}
onSignalSourceChange={onSignalSourceChange}
signalSourceChangeEnabled={signalSourceChangeEnabled}
/>
</div>
)}
@@ -258,7 +266,7 @@ export const QueryV2 = memo(function QueryV2({
panelType={panelType}
query={query}
index={index}
key={`metrics-aggregate-section-${query.queryName}-${query.dataSource}`}
key={`metrics-aggregate-section-${query.queryName}-${query.dataSource}-${signalSource}`}
version="v4"
signalSource={signalSource as 'meter' | ''}
/>

View File

@@ -0,0 +1,327 @@
import { UniversalYAxisUnit } from 'components/YAxisUnitSelector/types';
import {
AdditionalLabelsMappingForGrafanaUnits,
UniversalUnitToGrafanaUnit,
} from '../constants';
import { formatUniversalUnit } from '../formatter';
const VALUE_BELOW_THRESHOLD = 900;
describe('formatUniversalUnit', () => {
describe('Time', () => {
it('formats time values below conversion threshold', () => {
expect(formatUniversalUnit(31, UniversalYAxisUnit.DAYS)).toBe('4.43 weeks');
expect(formatUniversalUnit(25, UniversalYAxisUnit.HOURS)).toBe('1.04 days');
expect(formatUniversalUnit(61, UniversalYAxisUnit.MINUTES)).toBe(
'1.02 hours',
);
expect(formatUniversalUnit(61, UniversalYAxisUnit.SECONDS)).toBe(
'1.02 mins',
);
expect(formatUniversalUnit(1006, UniversalYAxisUnit.MILLISECONDS)).toBe(
'1.01 s',
);
expect(formatUniversalUnit(100006, UniversalYAxisUnit.MICROSECONDS)).toBe(
'100 ms',
);
expect(formatUniversalUnit(1006, UniversalYAxisUnit.NANOSECONDS)).toBe(
'1.01 µs',
);
});
});
describe('Data', () => {
it('formats data values below conversion threshold', () => {
expect(
formatUniversalUnit(VALUE_BELOW_THRESHOLD, UniversalYAxisUnit.BYTES),
).toBe('900 B');
expect(formatUniversalUnit(900, UniversalYAxisUnit.KILOBYTES)).toBe(
'900 kB',
);
expect(formatUniversalUnit(900, UniversalYAxisUnit.MEGABYTES)).toBe(
'900 MB',
);
expect(formatUniversalUnit(900, UniversalYAxisUnit.GIGABYTES)).toBe(
'900 GB',
);
expect(formatUniversalUnit(900, UniversalYAxisUnit.TERABYTES)).toBe(
'900 TB',
);
expect(formatUniversalUnit(900, UniversalYAxisUnit.PETABYTES)).toBe(
'900 PB',
);
expect(formatUniversalUnit(900, UniversalYAxisUnit.EXABYTES)).toBe('900 EB');
expect(formatUniversalUnit(900, UniversalYAxisUnit.ZETTABYTES)).toBe(
'900 ZB',
);
expect(formatUniversalUnit(900, UniversalYAxisUnit.YOTTABYTES)).toBe(
'900 YB',
);
});
it('formats data values above conversion threshold with scaling', () => {
expect(formatUniversalUnit(1000, UniversalYAxisUnit.BYTES)).toBe('1 kB');
expect(formatUniversalUnit(1000, UniversalYAxisUnit.KILOBYTES)).toBe('1 MB');
expect(formatUniversalUnit(1000, UniversalYAxisUnit.MEGABYTES)).toBe('1 GB');
expect(formatUniversalUnit(1000, UniversalYAxisUnit.GIGABYTES)).toBe('1 TB');
expect(formatUniversalUnit(1000, UniversalYAxisUnit.TERABYTES)).toBe('1 PB');
expect(formatUniversalUnit(1000, UniversalYAxisUnit.PETABYTES)).toBe('1 EB');
expect(formatUniversalUnit(1000, UniversalYAxisUnit.EXABYTES)).toBe('1 ZB');
expect(formatUniversalUnit(1000, UniversalYAxisUnit.ZETTABYTES)).toBe(
'1 YB',
);
});
it('formats data values above conversion threshold with decimals', () => {
expect(formatUniversalUnit(1034, UniversalYAxisUnit.BYTES)).toBe('1.03 kB');
expect(formatUniversalUnit(1034, UniversalYAxisUnit.KILOBYTES)).toBe(
'1.03 MB',
);
expect(formatUniversalUnit(1034, UniversalYAxisUnit.MEGABYTES)).toBe(
'1.03 GB',
);
expect(formatUniversalUnit(1034, UniversalYAxisUnit.TERABYTES)).toBe(
'1.03 PB',
);
expect(formatUniversalUnit(1034, UniversalYAxisUnit.PETABYTES)).toBe(
'1.03 EB',
);
expect(formatUniversalUnit(1034, UniversalYAxisUnit.EXABYTES)).toBe(
'1.03 ZB',
);
expect(formatUniversalUnit(1034, UniversalYAxisUnit.ZETTABYTES)).toBe(
'1.03 YB',
);
expect(formatUniversalUnit(1034, UniversalYAxisUnit.YOTTABYTES)).toBe(
'1034 YB',
);
});
});
describe('Data rate', () => {
it('formats data rate values below conversion threshold', () => {
expect(formatUniversalUnit(900, UniversalYAxisUnit.BYTES_SECOND)).toBe(
'900 B/s',
);
expect(formatUniversalUnit(900, UniversalYAxisUnit.KILOBYTES_SECOND)).toBe(
'900 kB/s',
);
expect(formatUniversalUnit(900, UniversalYAxisUnit.MEGABYTES_SECOND)).toBe(
'900 MB/s',
);
expect(formatUniversalUnit(900, UniversalYAxisUnit.GIGABYTES_SECOND)).toBe(
'900 GB/s',
);
expect(formatUniversalUnit(900, UniversalYAxisUnit.TERABYTES_SECOND)).toBe(
'900 TB/s',
);
expect(formatUniversalUnit(900, UniversalYAxisUnit.PETABYTES_SECOND)).toBe(
'900 PB/s',
);
expect(formatUniversalUnit(900, UniversalYAxisUnit.EXABYTES_SECOND)).toBe(
'900 EB/s',
);
expect(formatUniversalUnit(900, UniversalYAxisUnit.ZETTABYTES_SECOND)).toBe(
'900 ZB/s',
);
expect(formatUniversalUnit(900, UniversalYAxisUnit.YOTTABYTES_SECOND)).toBe(
'900 YB/s',
);
});
it('formats data rate values above conversion threshold with scaling (1000)', () => {
expect(formatUniversalUnit(1000, UniversalYAxisUnit.BYTES_SECOND)).toBe(
'1 kB/s',
);
expect(formatUniversalUnit(1000, UniversalYAxisUnit.KILOBYTES_SECOND)).toBe(
'1 MB/s',
);
expect(formatUniversalUnit(1000, UniversalYAxisUnit.MEGABYTES_SECOND)).toBe(
'1 GB/s',
);
expect(formatUniversalUnit(1000, UniversalYAxisUnit.GIGABYTES_SECOND)).toBe(
'1 TB/s',
);
expect(formatUniversalUnit(1000, UniversalYAxisUnit.TERABYTES_SECOND)).toBe(
'1 PB/s',
);
});
});
describe('Bit', () => {
it('formats bit values below conversion threshold', () => {
expect(formatUniversalUnit(900, UniversalYAxisUnit.BITS)).toBe('900 b');
expect(formatUniversalUnit(900, UniversalYAxisUnit.KILOBITS)).toBe('900 kb');
expect(formatUniversalUnit(900, UniversalYAxisUnit.MEGABITS)).toBe('900 Mb');
expect(formatUniversalUnit(900, UniversalYAxisUnit.GIGABITS)).toBe('900 Gb');
expect(formatUniversalUnit(900, UniversalYAxisUnit.TERABITS)).toBe('900 Tb');
expect(formatUniversalUnit(900, UniversalYAxisUnit.PETABITS)).toBe('900 Pb');
expect(formatUniversalUnit(900, UniversalYAxisUnit.EXABITS)).toBe('900 Eb');
expect(formatUniversalUnit(900, UniversalYAxisUnit.ZETTABITS)).toBe(
'900 Zb',
);
expect(formatUniversalUnit(900, UniversalYAxisUnit.YOTTABITS)).toBe(
'900 Yb',
);
});
it('formats bit values above conversion threshold with scaling (1000)', () => {
expect(formatUniversalUnit(1000, UniversalYAxisUnit.BITS)).toBe('1 kb');
expect(formatUniversalUnit(1000, UniversalYAxisUnit.KILOBITS)).toBe('1 Mb');
expect(formatUniversalUnit(1000, UniversalYAxisUnit.MEGABITS)).toBe('1 Gb');
expect(formatUniversalUnit(1000, UniversalYAxisUnit.GIGABITS)).toBe('1 Tb');
expect(formatUniversalUnit(1000, UniversalYAxisUnit.TERABITS)).toBe('1 Pb');
expect(formatUniversalUnit(1000, UniversalYAxisUnit.PETABITS)).toBe('1 Eb');
expect(formatUniversalUnit(1000, UniversalYAxisUnit.EXABITS)).toBe('1 Zb');
expect(formatUniversalUnit(1000, UniversalYAxisUnit.ZETTABITS)).toBe('1 Yb');
});
it('formats bit values below conversion threshold with decimals (0.5)', () => {
expect(formatUniversalUnit(0.5, UniversalYAxisUnit.KILOBITS)).toBe('500 b');
expect(formatUniversalUnit(0.001, UniversalYAxisUnit.MEGABITS)).toBe('1 kb');
});
});
describe('Bit rate', () => {
it('formats bit rate values below conversion threshold', () => {
expect(formatUniversalUnit(900, UniversalYAxisUnit.BITS_SECOND)).toBe(
'900 b/s',
);
});
it('formats bit rate values above conversion threshold with scaling (1000)', () => {
expect(formatUniversalUnit(1000, UniversalYAxisUnit.BITS_SECOND)).toBe(
'1 kb/s',
);
expect(formatUniversalUnit(1000, UniversalYAxisUnit.KILOBITS_SECOND)).toBe(
'1 Mb/s',
);
});
});
describe('Count', () => {
it('formats small values without abbreviation', () => {
expect(formatUniversalUnit(100, UniversalYAxisUnit.COUNT)).toBe('100');
expect(formatUniversalUnit(900, UniversalYAxisUnit.COUNT)).toBe('900');
});
it('formats count values above conversion threshold with scaling (1000)', () => {
expect(formatUniversalUnit(1000, UniversalYAxisUnit.COUNT)).toBe('1 K');
expect(formatUniversalUnit(10000, UniversalYAxisUnit.COUNT)).toBe('10 K');
expect(formatUniversalUnit(100000, UniversalYAxisUnit.COUNT)).toBe('100 K');
expect(formatUniversalUnit(1000000, UniversalYAxisUnit.COUNT)).toBe('1 Mil');
expect(formatUniversalUnit(10000000, UniversalYAxisUnit.COUNT)).toBe(
'10 Mil',
);
expect(formatUniversalUnit(100000000, UniversalYAxisUnit.COUNT)).toBe(
'100 Mil',
);
expect(formatUniversalUnit(1000000000, UniversalYAxisUnit.COUNT)).toBe(
'1 Bil',
);
expect(formatUniversalUnit(1000000000000, UniversalYAxisUnit.COUNT)).toBe(
'1 Tri',
);
});
it('formats count per time units', () => {
expect(formatUniversalUnit(900, UniversalYAxisUnit.COUNT_SECOND)).toBe(
'900 c/s',
);
expect(formatUniversalUnit(900, UniversalYAxisUnit.COUNT_MINUTE)).toBe(
'900 c/m',
);
});
});
describe('Operations units', () => {
it('formats operations per time', () => {
expect(formatUniversalUnit(900, UniversalYAxisUnit.OPS_SECOND)).toBe(
'900 ops/s',
);
expect(formatUniversalUnit(900, UniversalYAxisUnit.OPS_MINUTE)).toBe(
'900 ops/m',
);
});
});
describe('Request units', () => {
it('formats requests per time', () => {
expect(formatUniversalUnit(900, UniversalYAxisUnit.REQUESTS_SECOND)).toBe(
'900 req/s',
);
expect(formatUniversalUnit(900, UniversalYAxisUnit.REQUESTS_MINUTE)).toBe(
'900 req/m',
);
});
});
describe('Read/Write units', () => {
it('formats reads and writes per time', () => {
expect(formatUniversalUnit(900, UniversalYAxisUnit.READS_SECOND)).toBe(
'900 rd/s',
);
expect(formatUniversalUnit(900, UniversalYAxisUnit.WRITES_SECOND)).toBe(
'900 wr/s',
);
expect(formatUniversalUnit(900, UniversalYAxisUnit.READS_MINUTE)).toBe(
'900 rd/m',
);
expect(formatUniversalUnit(900, UniversalYAxisUnit.WRITES_MINUTE)).toBe(
'900 wr/m',
);
});
});
describe('IO Operations units', () => {
it('formats IOPS', () => {
expect(formatUniversalUnit(900, UniversalYAxisUnit.IOOPS_SECOND)).toBe(
'900 io/s',
);
});
});
describe('Percent units', () => {
it('formats percent as-is', () => {
expect(formatUniversalUnit(900, UniversalYAxisUnit.PERCENT)).toBe('900%');
});
it('multiplies percent_unit by 100', () => {
expect(formatUniversalUnit(9, UniversalYAxisUnit.PERCENT_UNIT)).toBe('900%');
});
});
describe('None unit', () => {
it('formats as plain number', () => {
expect(formatUniversalUnit(900, UniversalYAxisUnit.NONE)).toBe('900');
});
});
describe('High-order byte scaling', () => {
it('scales between EB, ZB, YB', () => {
expect(formatUniversalUnit(1000, UniversalYAxisUnit.EXABYTES)).toBe('1 ZB');
expect(formatUniversalUnit(1000, UniversalYAxisUnit.ZETTABYTES)).toBe(
'1 YB',
);
});
});
});
describe('Mapping Validator', () => {
it('validates that all units have a mapping', () => {
// Each universal unit should have either a 1:1 Grafana mapping in UniversalUnitToGrafanaUnit or a custom label in AdditionalLabelsMappingForGrafanaUnits
const units = Object.values(UniversalYAxisUnit);
expect(
units.every((unit) => {
const hasBaseMapping = unit in UniversalUnitToGrafanaUnit;
const hasAdditionalMapping = unit in AdditionalLabelsMappingForGrafanaUnits;
const hasMapping = hasBaseMapping || hasAdditionalMapping;
if (!hasMapping) {
console.error(`Unit ${unit} does not have a mapping`);
}
return hasMapping;
}),
).toBe(true);
});
});

View File

@@ -1,4 +1,4 @@
import { UniversalYAxisUnit, YAxisUnit } from './types';
import { UnitFamilyConfig, UniversalYAxisUnit, YAxisUnit } from './types';
// Mapping of universal y-axis units to their AWS, UCUM, and OpenMetrics equivalents
export const UniversalYAxisUnitMappings: Record<
@@ -625,3 +625,152 @@ export const Y_AXIS_CATEGORIES = [
],
},
];
export const UniversalUnitToGrafanaUnit: Partial<
Record<UniversalYAxisUnit, string>
> = {
// Time
[UniversalYAxisUnit.WEEKS]: 'wk',
[UniversalYAxisUnit.DAYS]: 'd',
[UniversalYAxisUnit.HOURS]: 'h',
[UniversalYAxisUnit.MINUTES]: 'm',
[UniversalYAxisUnit.SECONDS]: 's',
[UniversalYAxisUnit.MILLISECONDS]: 'ms',
[UniversalYAxisUnit.MICROSECONDS]: 'µs',
[UniversalYAxisUnit.NANOSECONDS]: 'ns',
// Data (mapped to Grafana's decimal 'dec*' units, since Grafana's plain byte units are 1024-based IEC)
[UniversalYAxisUnit.BYTES]: 'decbytes',
[UniversalYAxisUnit.KILOBYTES]: 'deckbytes',
[UniversalYAxisUnit.MEGABYTES]: 'decmbytes',
[UniversalYAxisUnit.GIGABYTES]: 'decgbytes',
[UniversalYAxisUnit.TERABYTES]: 'dectbytes',
[UniversalYAxisUnit.PETABYTES]: 'decpbytes',
// Data Rate
[UniversalYAxisUnit.BYTES_SECOND]: 'Bps',
[UniversalYAxisUnit.KILOBYTES_SECOND]: 'KBs',
[UniversalYAxisUnit.MEGABYTES_SECOND]: 'MBs',
[UniversalYAxisUnit.GIGABYTES_SECOND]: 'GBs',
[UniversalYAxisUnit.TERABYTES_SECOND]: 'TBs',
[UniversalYAxisUnit.PETABYTES_SECOND]: 'PBs',
// Bits
[UniversalYAxisUnit.BITS]: 'bits',
// Bit Rate
[UniversalYAxisUnit.BITS_SECOND]: 'bps',
[UniversalYAxisUnit.KILOBITS_SECOND]: 'Kbits',
[UniversalYAxisUnit.MEGABITS_SECOND]: 'Mbits',
[UniversalYAxisUnit.GIGABITS_SECOND]: 'Gbits',
[UniversalYAxisUnit.TERABITS_SECOND]: 'Tbits',
[UniversalYAxisUnit.PETABITS_SECOND]: 'Pbits',
// Count
[UniversalYAxisUnit.COUNT]: 'short',
[UniversalYAxisUnit.COUNT_SECOND]: 'cps',
[UniversalYAxisUnit.COUNT_MINUTE]: 'cpm',
// Operations
[UniversalYAxisUnit.OPS_SECOND]: 'ops',
[UniversalYAxisUnit.OPS_MINUTE]: 'opm',
// Requests
[UniversalYAxisUnit.REQUESTS_SECOND]: 'reqps',
[UniversalYAxisUnit.REQUESTS_MINUTE]: 'reqpm',
// Reads/Writes
[UniversalYAxisUnit.READS_SECOND]: 'rps',
[UniversalYAxisUnit.WRITES_SECOND]: 'wps',
[UniversalYAxisUnit.READS_MINUTE]: 'rpm',
[UniversalYAxisUnit.WRITES_MINUTE]: 'wpm',
// IO Operations
[UniversalYAxisUnit.IOOPS_SECOND]: 'iops',
// Percent
[UniversalYAxisUnit.PERCENT]: 'percent',
[UniversalYAxisUnit.PERCENT_UNIT]: 'percentunit',
// None
[UniversalYAxisUnit.NONE]: 'none',
};
export const AdditionalLabelsMappingForGrafanaUnits: Partial<
Record<UniversalYAxisUnit, string>
> = {
// Data
[UniversalYAxisUnit.EXABYTES]: 'EB',
[UniversalYAxisUnit.ZETTABYTES]: 'ZB',
[UniversalYAxisUnit.YOTTABYTES]: 'YB',
// Data Rate
[UniversalYAxisUnit.EXABYTES_SECOND]: 'EB/s',
[UniversalYAxisUnit.ZETTABYTES_SECOND]: 'ZB/s',
[UniversalYAxisUnit.YOTTABYTES_SECOND]: 'YB/s',
// Bits
[UniversalYAxisUnit.BITS]: 'b',
[UniversalYAxisUnit.KILOBITS]: 'kb',
[UniversalYAxisUnit.MEGABITS]: 'Mb',
[UniversalYAxisUnit.GIGABITS]: 'Gb',
[UniversalYAxisUnit.TERABITS]: 'Tb',
[UniversalYAxisUnit.PETABITS]: 'Pb',
[UniversalYAxisUnit.EXABITS]: 'Eb',
[UniversalYAxisUnit.ZETTABITS]: 'Zb',
[UniversalYAxisUnit.YOTTABITS]: 'Yb',
// Bit Rate
[UniversalYAxisUnit.EXABITS_SECOND]: 'Eb/s',
[UniversalYAxisUnit.ZETTABITS_SECOND]: 'Zb/s',
[UniversalYAxisUnit.YOTTABITS_SECOND]: 'Yb/s',
};
/**
* Configuration for unit families that need custom scaling
* These are units where Grafana doesn't auto-scale between levels
*/
export const CUSTOM_SCALING_FAMILIES: UnitFamilyConfig[] = [
// Bits (b → kb → Mb → Gb → Tb → Pb → Eb → Zb → Yb)
{
units: [
UniversalYAxisUnit.BITS,
UniversalYAxisUnit.KILOBITS,
UniversalYAxisUnit.MEGABITS,
UniversalYAxisUnit.GIGABITS,
UniversalYAxisUnit.TERABITS,
UniversalYAxisUnit.PETABITS,
UniversalYAxisUnit.EXABITS,
UniversalYAxisUnit.ZETTABITS,
UniversalYAxisUnit.YOTTABITS,
],
scaleFactor: 1000,
},
// High-order bit rates (Eb/s → Zb/s → Yb/s)
{
units: [
UniversalYAxisUnit.EXABITS_SECOND,
UniversalYAxisUnit.ZETTABITS_SECOND,
UniversalYAxisUnit.YOTTABITS_SECOND,
],
scaleFactor: 1000,
},
// High-order bytes (EB → ZB → YB)
{
units: [
UniversalYAxisUnit.EXABYTES,
UniversalYAxisUnit.ZETTABYTES,
UniversalYAxisUnit.YOTTABYTES,
],
scaleFactor: 1000,
},
// High-order byte rates (EB/s → ZB/s → YB/s)
{
units: [
UniversalYAxisUnit.EXABYTES_SECOND,
UniversalYAxisUnit.ZETTABYTES_SECOND,
UniversalYAxisUnit.YOTTABYTES_SECOND,
],
scaleFactor: 1000,
},
];

View File

@@ -0,0 +1,70 @@
import { formattedValueToString, getValueFormat } from '@grafana/data';
import {
AdditionalLabelsMappingForGrafanaUnits,
CUSTOM_SCALING_FAMILIES,
UniversalUnitToGrafanaUnit,
} from 'components/YAxisUnitSelector/constants';
import { UniversalYAxisUnit } from 'components/YAxisUnitSelector/types';
const format = (
formatStr: string,
value: number,
): ReturnType<ReturnType<typeof getValueFormat>> =>
getValueFormat(formatStr)(value, undefined, undefined, undefined);
function scaleValue(
value: number,
unit: UniversalYAxisUnit,
family: UniversalYAxisUnit[],
factor: number,
): { value: number; label: string } {
let idx = family.indexOf(unit);
// If the unit is not part of the family, return the value unscaled with the unit's custom label
if (idx === -1) {
return { value, label: AdditionalLabelsMappingForGrafanaUnits[unit] || '' };
}
// Scale the value up or down to the nearest unit in the family
let scaled = value;
// Scale up
while (scaled >= factor && idx < family.length - 1) {
scaled /= factor;
idx += 1;
}
// Scale down
while (scaled < 1 && idx > 0) {
scaled *= factor;
idx -= 1;
}
// Return the scaled value and the label of the nearest unit in the family
return {
value: scaled,
label: AdditionalLabelsMappingForGrafanaUnits[family[idx]] || '',
};
}
export function formatUniversalUnit(
value: number,
unit: UniversalYAxisUnit,
): string {
// Check if this unit belongs to a family that needs custom scaling
const family = CUSTOM_SCALING_FAMILIES.find((family) =>
family.units.includes(unit),
);
if (family) {
const scaled = scaleValue(value, unit, family.units, family.scaleFactor);
const formatted = format('none', scaled.value);
return `${formatted.text} ${scaled.label}`;
}
// Use Grafana formatting with custom label mappings
const grafanaFormat = UniversalUnitToGrafanaUnit[unit];
if (grafanaFormat) {
const formatted = format(grafanaFormat, value);
return formattedValueToString(formatted);
}
// Fallback to short format for other units
return `${format('short', value).text} ${unit}`;
}
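A few examples of the three code paths, with outputs taken from the unit tests earlier in this change (import paths assumed):
import { UniversalYAxisUnit } from 'components/YAxisUnitSelector/types';
import { formatUniversalUnit } from 'components/YAxisUnitSelector/formatter';

// Custom-scaling family: units Grafana does not auto-scale between, so the
// helper walks the family itself (EB → ZB → YB) with a factor of 1000.
formatUniversalUnit(1000, UniversalYAxisUnit.EXABYTES); // '1 ZB'

// Scaling down inside the bit family when the value drops below 1.
formatUniversalUnit(0.5, UniversalYAxisUnit.KILOBITS); // '500 b'

// 1:1 Grafana mapping: delegated to getValueFormat via UniversalUnitToGrafanaUnit.
formatUniversalUnit(900, UniversalYAxisUnit.OPS_SECOND); // '900 ops/s'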

View File

@@ -14,7 +14,7 @@ export enum UniversalYAxisUnit {
HOURS = 'h',
MINUTES = 'min',
SECONDS = 's',
MICROSECONDS = 'us',
MICROSECONDS = 'µs',
MILLISECONDS = 'ms',
NANOSECONDS = 'ns',
@@ -364,3 +364,13 @@ export enum YAxisUnit {
OPEN_METRICS_PERCENT_UNIT = 'percentunit',
}
export interface ScaledValue {
value: number;
label: string;
}
export interface UnitFamilyConfig {
units: UniversalYAxisUnit[];
scaleFactor: number;
}

View File

@@ -31,3 +31,9 @@ export const getUniversalNameFromMetricUnit = (
return universalName || unit || '-';
};
export function isUniversalUnit(format: string): boolean {
return Object.values(UniversalYAxisUnit).includes(
format as UniversalYAxisUnit,
);
}

View File

@@ -24,6 +24,7 @@ export const DATE_TIME_FORMATS = {
TIME_SECONDS: 'HH:mm:ss',
TIME_UTC: 'HH:mm:ss (UTC Z)',
TIME_UTC_MS: 'HH:mm:ss.SSS (UTC Z)',
TIME_SPAN_PERCENTILE: 'HH:mm:ss MMM DD',
// Short date formats
DATE_SHORT: 'MM/DD',

View File

@@ -90,4 +90,7 @@ export const REACT_QUERY_KEY = {
// Routing Policies Query Keys
GET_ROUTING_POLICIES: 'GET_ROUTING_POLICIES',
// Span Percentiles Query Keys
GET_SPAN_PERCENTILES: 'GET_SPAN_PERCENTILES',
} as const;

View File

@@ -3,4 +3,5 @@ export const USER_PREFERENCES = {
NAV_SHORTCUTS: 'nav_shortcuts',
LAST_SEEN_CHANGELOG_VERSION: 'last_seen_changelog_version',
SPAN_DETAILS_PINNED_ATTRIBUTES: 'span_details_pinned_attributes',
SPAN_PERCENTILE_RESOURCE_ATTRIBUTES: 'span_percentile_resource_attributes',
};

View File

@@ -1,5 +1,6 @@
import { ENTITY_VERSION_V4 } from 'constants/app';
import { ENTITY_VERSION_V4, ENTITY_VERSION_V5 } from 'constants/app';
import { initialQueriesMap } from 'constants/queryBuilder';
import { REACT_QUERY_KEY } from 'constants/reactQueryKeys';
import { useApiMonitoringParams } from 'container/ApiMonitoring/queryParams';
import {
END_POINT_DETAILS_QUERY_KEYS_ARRAY,
@@ -178,18 +179,26 @@ function EndPointDetails({
[domainName, filters, minTime, maxTime],
);
const V5_QUERIES = [REACT_QUERY_KEY.GET_ENDPOINT_STATUS_CODE_DATA] as const;
const endPointDetailsDataQueries = useQueries(
endPointDetailsQueryPayload.map((payload, index) => ({
queryKey: [
END_POINT_DETAILS_QUERY_KEYS_ARRAY[index],
payload,
filters?.items, // Include filters.items in queryKey for better caching
ENTITY_VERSION_V4,
],
queryFn: (): Promise<SuccessResponse<MetricRangePayloadProps>> =>
GetMetricQueryRange(payload, ENTITY_VERSION_V4),
enabled: !!payload,
})),
endPointDetailsQueryPayload.map((payload, index) => {
const queryKey = END_POINT_DETAILS_QUERY_KEYS_ARRAY[index];
const version = (V5_QUERIES as readonly string[]).includes(queryKey)
? ENTITY_VERSION_V5
: ENTITY_VERSION_V4;
return {
queryKey: [
END_POINT_DETAILS_QUERY_KEYS_ARRAY[index],
payload,
filters?.items, // Include filters.items in queryKey for better caching
version,
],
queryFn: (): Promise<SuccessResponse<MetricRangePayloadProps>> =>
GetMetricQueryRange(payload, version),
enabled: !!payload,
};
}),
);
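// Design note: keys listed in V5_QUERIES are routed to the v5 query-range API
// while the remaining endpoint-details panels stay on v4, so panels can be
// migrated one at a time. A hypothetical extraction of that choice (not part
// of this diff) would look like:
//   const versionForKey = (key: string): string =>
//     (V5_QUERIES as readonly string[]).includes(key)
//       ? ENTITY_VERSION_V5
//       : ENTITY_VERSION_V4;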
const [

View File

@@ -1,8 +1,10 @@
import { LoadingOutlined } from '@ant-design/icons';
import { Spin, Switch, Table, Tooltip, Typography } from 'antd';
import { getQueryRangeV5 } from 'api/v5/queryRange/getQueryRange';
import { MetricRangePayloadV5, ScalarData } from 'api/v5/v5';
import { useNavigateToExplorer } from 'components/CeleryTask/useNavigateToExplorer';
import { withErrorBoundary } from 'components/ErrorBoundaryHOC';
import { DEFAULT_ENTITY_VERSION, ENTITY_VERSION_V4 } from 'constants/app';
import { ENTITY_VERSION_V4, ENTITY_VERSION_V5 } from 'constants/app';
import { REACT_QUERY_KEY } from 'constants/reactQueryKeys';
import {
END_POINT_DETAILS_QUERY_KEYS_ARRAY,
@@ -11,13 +13,12 @@ import {
getTopErrorsColumnsConfig,
getTopErrorsCoRelationQueryFilters,
getTopErrorsQueryPayload,
TopErrorsResponseRow,
} from 'container/ApiMonitoring/utils';
import { GetMetricQueryRange } from 'lib/dashboard/getQueryResults';
import { Info } from 'lucide-react';
import { useMemo, useState } from 'react';
import { useQueries } from 'react-query';
import { SuccessResponse } from 'types/api';
import { QueryFunctionContext, useQueries, useQuery } from 'react-query';
import { SuccessResponse, SuccessResponseV2 } from 'types/api';
import { MetricRangePayloadProps } from 'types/api/metrics/getQueryRange';
import { DataTypes } from 'types/api/queryBuilder/queryAutocompleteResponse';
import { IBuilderQuery } from 'types/api/queryBuilder/queryBuilderData';
@@ -46,7 +47,7 @@ function TopErrors({
true,
);
const queryPayloads = useMemo(
const queryPayload = useMemo(
() =>
getTopErrorsQueryPayload(
domainName,
@@ -82,37 +83,34 @@ function TopErrors({
],
);
const topErrorsDataQueries = useQueries(
queryPayloads.map((payload) => ({
queryKey: [
REACT_QUERY_KEY.GET_TOP_ERRORS_BY_DOMAIN,
payload,
DEFAULT_ENTITY_VERSION,
showStatusCodeErrors,
],
queryFn: (): Promise<SuccessResponse<MetricRangePayloadProps>> =>
GetMetricQueryRange(payload, DEFAULT_ENTITY_VERSION),
enabled: !!payload,
staleTime: 0,
cacheTime: 0,
})),
);
const topErrorsDataQuery = topErrorsDataQueries[0];
const {
data: topErrorsData,
isLoading,
isRefetching,
isError,
refetch,
} = topErrorsDataQuery;
} = useQuery({
queryKey: [
REACT_QUERY_KEY.GET_TOP_ERRORS_BY_DOMAIN,
queryPayload,
ENTITY_VERSION_V5,
showStatusCodeErrors,
],
queryFn: ({
signal,
}: QueryFunctionContext): Promise<SuccessResponseV2<MetricRangePayloadV5>> =>
getQueryRangeV5(queryPayload, ENTITY_VERSION_V5, signal),
enabled: !!queryPayload,
staleTime: 0,
cacheTime: 0,
});
const topErrorsColumnsConfig = useMemo(() => getTopErrorsColumnsConfig(), []);
const formattedTopErrorsData = useMemo(
() =>
formatTopErrorsDataForTable(
topErrorsData?.payload?.data?.result as TopErrorsResponseRow[],
topErrorsData?.data?.data?.data?.results[0] as ScalarData,
),
[topErrorsData],
);

View File

@@ -69,6 +69,13 @@ function StatusCodeBarCharts({
} = endPointStatusCodeLatencyBarChartsDataQuery;
const { startTime: minTime, endTime: maxTime } = timeRange;
const legendScrollPositionRef = useRef<{
scrollTop: number;
scrollLeft: number;
}>({
scrollTop: 0,
scrollLeft: 0,
});
const graphRef = useRef<HTMLDivElement>(null);
const dimensions = useResizeObserver(graphRef);
@@ -207,6 +214,13 @@ function StatusCodeBarCharts({
onDragSelect,
colorMapping,
query: currentQuery,
legendScrollPosition: legendScrollPositionRef.current,
setLegendScrollPosition: (position: {
scrollTop: number;
scrollLeft: number;
}) => {
legendScrollPositionRef.current = position;
},
}),
[
minTime,

View File

@@ -8,10 +8,8 @@ import {
endPointStatusCodeColumns,
extractPortAndEndpoint,
formatDataForTable,
formatTopErrorsDataForTable,
getAllEndpointsWidgetData,
getCustomFiltersForBarChart,
getEndPointDetailsQueryPayload,
getFormattedDependentServicesData,
getFormattedEndPointDropDownData,
getFormattedEndPointMetricsData,
@@ -23,8 +21,6 @@ import {
getStatusCodeBarChartWidgetData,
getTopErrorsColumnsConfig,
getTopErrorsCoRelationQueryFilters,
getTopErrorsQueryPayload,
TopErrorsResponseRow,
} from '../utils';
import { APIMonitoringColumnsMock } from './mock';
@@ -344,49 +340,6 @@ describe('API Monitoring Utils', () => {
});
});
describe('formatTopErrorsDataForTable', () => {
it('should format top errors data correctly', () => {
// Arrange
const inputData = [
{
metric: {
[SPAN_ATTRIBUTES.URL_PATH]: '/api/test',
[SPAN_ATTRIBUTES.RESPONSE_STATUS_CODE]: '500',
status_message: 'Internal Server Error',
},
values: [[1000000100, '10']],
queryName: 'A',
legend: 'Test Legend',
},
];
// Act
const result = formatTopErrorsDataForTable(
inputData as TopErrorsResponseRow[],
);
// Assert
expect(result).toBeDefined();
expect(result.length).toBe(1);
// Check first item is formatted correctly
expect(result[0].endpointName).toBe('/api/test');
expect(result[0].statusCode).toBe('500');
expect(result[0].statusMessage).toBe('Internal Server Error');
expect(result[0].count).toBe('10');
expect(result[0].key).toBeDefined();
});
it('should handle empty input', () => {
// Act
const result = formatTopErrorsDataForTable(undefined);
// Assert
expect(result).toBeDefined();
expect(result).toEqual([]);
});
});
describe('getTopErrorsColumnsConfig', () => {
it('should return column configuration with expected fields', () => {
// Act
@@ -453,72 +406,6 @@ describe('API Monitoring Utils', () => {
});
});
describe('getTopErrorsQueryPayload', () => {
it('should create correct query payload with filters', () => {
// Arrange
const domainName = 'test-domain';
const start = 1000000000;
const end = 1000010000;
const filters = {
items: [
{
id: 'test-filter',
key: {
dataType: DataTypes.String,
key: 'test-key',
type: '',
},
op: '=',
value: 'test-value',
},
],
op: 'AND',
};
// Act
const result = getTopErrorsQueryPayload(
domainName,
start,
end,
filters as IBuilderQuery['filters'],
);
// Assert
expect(result).toBeDefined();
expect(result.length).toBeGreaterThan(0);
// Verify query params
expect(result[0].start).toBe(start);
expect(result[0].end).toBe(end);
// Verify correct structure
expect(result[0].graphType).toBeDefined();
expect(result[0].query).toBeDefined();
expect(result[0].query.builder).toBeDefined();
expect(result[0].query.builder.queryData).toBeDefined();
// Verify domain filter is included
const queryData = result[0].query.builder.queryData[0];
expect(queryData.filters).toBeDefined();
// Check for domain filter
const domainFilter = queryData.filters?.items?.find(
// eslint-disable-next-line sonarjs/no-identical-functions
(item) =>
item.key &&
item.key.key === SPAN_ATTRIBUTES.SERVER_NAME &&
item.value === domainName,
);
expect(domainFilter).toBeDefined();
// Check that custom filters were included
const testFilter = queryData.filters?.items?.find(
(item) => item.id === 'test-filter',
);
expect(testFilter).toBeDefined();
});
});
// Add new tests for EndPointDetails utility functions
describe('extractPortAndEndpoint', () => {
it('should extract port and endpoint from a valid URL', () => {
@@ -564,77 +451,6 @@ describe('API Monitoring Utils', () => {
});
});
describe('getEndPointDetailsQueryPayload', () => {
it('should generate proper query payload with all parameters', () => {
// Arrange
const domainName = 'test-domain';
const startTime = 1609459200000; // 2021-01-01
const endTime = 1609545600000; // 2021-01-02
const filters = {
items: [
{
id: 'test-filter',
key: {
dataType: 'string',
key: 'test.key',
type: '',
},
op: '=',
value: 'test-value',
},
],
op: 'AND',
};
// Act
const result = getEndPointDetailsQueryPayload(
domainName,
startTime,
endTime,
filters as IBuilderQuery['filters'],
);
// Assert
expect(result).toHaveLength(6); // Should return 6 queries
// Check that each query includes proper parameters
result.forEach((query) => {
expect(query).toHaveProperty('start', startTime);
expect(query).toHaveProperty('end', endTime);
// Should have query property with builder data
expect(query).toHaveProperty('query');
expect(query.query).toHaveProperty('builder');
// All queries should include the domain filter
const {
query: {
builder: { queryData },
},
} = query;
queryData.forEach((qd) => {
if (qd.filters && qd.filters.items) {
const serverNameFilter = qd.filters?.items?.find(
(item) => item.key && item.key.key === SPAN_ATTRIBUTES.SERVER_NAME,
);
expect(serverNameFilter).toBeDefined();
// Only check if the serverNameFilter exists, as the actual value might vary
// depending on implementation details or domain defaults
if (serverNameFilter) {
expect(typeof serverNameFilter.value).toBe('string');
}
}
// Should include our custom filter
const customFilter = qd.filters?.items?.find(
(item) => item.id === 'test-filter',
);
expect(customFilter).toBeDefined();
});
});
});
});
describe('getRateOverTimeWidgetData', () => {
it('should generate widget configuration for rate over time', () => {
// Arrange

View File

@@ -0,0 +1,226 @@
/* eslint-disable @typescript-eslint/no-explicit-any */
/* eslint-disable sonarjs/no-duplicate-string */
/**
* V5 Migration Tests for Status Code Table Query
*
* These tests validate the migration from V4 to V5 format for the second payload
* in getEndPointDetailsQueryPayload (status code table data):
* - Filter format change: filters.items[] → filter.expression
* - URL handling: Special logic for (http.url OR url.full)
* - Domain filter: (net.peer.name OR server.address)
* - Kind filter: kind_string = 'Client'
* - Existence filter: response_status_code EXISTS
* - Three queries: A (count), B (p99 latency), C (rate)
* - All grouped by response_status_code
*/
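/*
 * For orientation, the shape change these tests pin down (values illustrative;
 * the exact expression string is assembled by the implementation, and only its
 * parts are asserted below):
 *
 * V4: filters: {
 *       items: [{ key: { key: 'net.peer.name', ... }, op: '=', value: 'api.example.com' }, ...],
 *       op: 'AND',
 *     }
 *
 * V5: filter: {
 *       expression:
 *         "(net.peer.name = 'api.example.com' OR server.address = 'api.example.com') " +
 *         "AND kind_string = 'Client' AND response_status_code EXISTS",
 *     }
 */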
import { TraceAggregation } from 'api/v5/v5';
import { getEndPointDetailsQueryPayload } from 'container/ApiMonitoring/utils';
import { IBuilderQuery } from 'types/api/queryBuilder/queryBuilderData';
describe('StatusCodeTable - V5 Migration Validation', () => {
const mockDomainName = 'api.example.com';
const mockStartTime = 1000;
const mockEndTime = 2000;
const emptyFilters: IBuilderQuery['filters'] = {
items: [],
op: 'AND',
};
describe('1. V5 Format Migration with Base Filters', () => {
it('migrates to V5 format with correct filter expression structure and base filters', () => {
const payload = getEndPointDetailsQueryPayload(
mockDomainName,
mockStartTime,
mockEndTime,
emptyFilters,
);
// Second payload is the status code table query
const statusCodeQuery = payload[1];
const queryA = statusCodeQuery.query.builder.queryData[0];
// CRITICAL V5 MIGRATION: filter.expression (not filters.items)
expect(queryA.filter).toBeDefined();
expect(queryA.filter?.expression).toBeDefined();
expect(typeof queryA.filter?.expression).toBe('string');
expect(queryA).not.toHaveProperty('filters.items');
// Base filter 1: Domain (net.peer.name OR server.address)
expect(queryA.filter?.expression).toContain(
`(net.peer.name = '${mockDomainName}' OR server.address = '${mockDomainName}')`,
);
// Base filter 2: Kind
expect(queryA.filter?.expression).toContain("kind_string = 'Client'");
// Base filter 3: response_status_code EXISTS
expect(queryA.filter?.expression).toContain('response_status_code EXISTS');
});
});
describe('2. Three Queries Structure and Consistency', () => {
it('generates three queries (count, p99, rate) all grouped by response_status_code with identical filters', () => {
const payload = getEndPointDetailsQueryPayload(
mockDomainName,
mockStartTime,
mockEndTime,
emptyFilters,
);
const statusCodeQuery = payload[1];
const [queryA, queryB, queryC] = statusCodeQuery.query.builder.queryData;
// Query A: Count
expect(queryA.queryName).toBe('A');
expect(queryA.aggregateOperator).toBe('count');
expect(queryA.aggregations?.[0]).toBeDefined();
expect((queryA.aggregations?.[0] as TraceAggregation)?.expression).toBe(
'count(span_id)',
);
expect(queryA.disabled).toBe(false);
// Query B: P99 Latency
expect(queryB.queryName).toBe('B');
expect(queryB.aggregateOperator).toBe('p99');
expect((queryB.aggregations?.[0] as TraceAggregation)?.expression).toBe(
'p99(duration_nano)',
);
expect(queryB.disabled).toBe(false);
// Query C: Rate
expect(queryC.queryName).toBe('C');
expect(queryC.aggregateOperator).toBe('rate');
expect(queryC.disabled).toBe(false);
// All group by response_status_code
[queryA, queryB, queryC].forEach((query) => {
expect(query.groupBy).toContainEqual(
expect.objectContaining({
key: 'response_status_code',
dataType: 'string',
type: 'span',
}),
);
});
// CRITICAL: All have identical filter expressions
expect(queryA.filter?.expression).toBe(queryB.filter?.expression);
expect(queryB.filter?.expression).toBe(queryC.filter?.expression);
});
});
describe('3. Custom Filters Integration', () => {
it('merges custom filters into filter expression with AND logic', () => {
const customFilters: IBuilderQuery['filters'] = {
items: [
{
id: 'test-1',
key: {
key: 'service.name',
dataType: 'string' as any,
type: 'resource',
},
op: '=',
value: 'user-service',
},
{
id: 'test-2',
key: {
key: 'deployment.environment',
dataType: 'string' as any,
type: 'resource',
},
op: '=',
value: 'production',
},
],
op: 'AND',
};
const payload = getEndPointDetailsQueryPayload(
mockDomainName,
mockStartTime,
mockEndTime,
customFilters,
);
const statusCodeQuery = payload[1];
const expression =
statusCodeQuery.query.builder.queryData[0].filter?.expression;
// Base filters present
expect(expression).toContain('net.peer.name');
expect(expression).toContain("kind_string = 'Client'");
expect(expression).toContain('response_status_code EXISTS');
// Custom filters merged
expect(expression).toContain('service.name');
expect(expression).toContain('user-service');
expect(expression).toContain('deployment.environment');
expect(expression).toContain('production');
// All three queries have the same merged expression
const queries = statusCodeQuery.query.builder.queryData;
expect(queries[0].filter?.expression).toBe(queries[1].filter?.expression);
expect(queries[1].filter?.expression).toBe(queries[2].filter?.expression);
});
});
describe('4. HTTP URL Filter Handling', () => {
it('converts http.url filter to (http.url OR url.full) expression', () => {
const filtersWithHttpUrl: IBuilderQuery['filters'] = {
items: [
{
id: 'http-url-filter',
key: {
key: 'http.url',
dataType: 'string' as any,
type: 'tag',
},
op: '=',
value: '/api/users',
},
{
id: 'service-filter',
key: {
key: 'service.name',
dataType: 'string' as any,
type: 'resource',
},
op: '=',
value: 'user-service',
},
],
op: 'AND',
};
const payload = getEndPointDetailsQueryPayload(
mockDomainName,
mockStartTime,
mockEndTime,
filtersWithHttpUrl,
);
const statusCodeQuery = payload[1];
const expression =
statusCodeQuery.query.builder.queryData[0].filter?.expression;
// CRITICAL: http.url converted to OR logic
expect(expression).toContain(
"(http.url = '/api/users' OR url.full = '/api/users')",
);
// Other filters still present
expect(expression).toContain('service.name');
expect(expression).toContain('user-service');
// Base filters present
expect(expression).toContain('net.peer.name');
expect(expression).toContain("kind_string = 'Client'");
expect(expression).toContain('response_status_code EXISTS');
// All ANDed together (at least 2 ANDs: domain+kind, custom filter, url condition)
expect(expression?.match(/AND/g)?.length).toBeGreaterThanOrEqual(2);
});
});
});

View File

@@ -1,14 +1,6 @@
import { fireEvent, render, screen, within } from '@testing-library/react';
import { useNavigateToExplorer } from 'components/CeleryTask/useNavigateToExplorer';
import { REACT_QUERY_KEY } from 'constants/reactQueryKeys';
import {
formatTopErrorsDataForTable,
getEndPointDetailsQueryPayload,
getTopErrorsColumnsConfig,
getTopErrorsCoRelationQueryFilters,
getTopErrorsQueryPayload,
} from 'container/ApiMonitoring/utils';
import { useQueries } from 'react-query';
import { rest, server } from 'mocks-server/server';
import { fireEvent, render, screen, waitFor, within } from 'tests/test-utils';
import { DataSource } from 'types/common/queryBuilder';
import TopErrors from '../Explorer/Domains/DomainDetails/TopErrors';
@@ -35,28 +27,15 @@ jest.mock(
}),
);
// Mock dependencies
jest.mock('react-query', () => ({
...jest.requireActual('react-query'),
useQueries: jest.fn(),
}));
jest.mock('components/CeleryTask/useNavigateToExplorer', () => ({
useNavigateToExplorer: jest.fn(),
}));
jest.mock('container/ApiMonitoring/utils', () => ({
END_POINT_DETAILS_QUERY_KEYS_ARRAY: ['key1', 'key2', 'key3', 'key4', 'key5'],
formatTopErrorsDataForTable: jest.fn(),
getEndPointDetailsQueryPayload: jest.fn(),
getTopErrorsColumnsConfig: jest.fn(),
getTopErrorsCoRelationQueryFilters: jest.fn(),
getTopErrorsQueryPayload: jest.fn(),
}));
describe('TopErrors', () => {
const TABLE_BODY_SELECTOR = '.ant-table-tbody';
const V5_QUERY_RANGE_API_PATH = '*/api/v5/query_range';
const mockProps = {
// eslint-disable-next-line sonarjs/no-duplicate-string
domainName: 'test-domain',
timeRange: {
startTime: 1000000000,
@@ -68,75 +47,72 @@ describe('TopErrors', () => {
},
};
// Setup basic mocks
// Helper function to wait for table data to load
const waitForTableDataToLoad = async (
container: HTMLElement,
): Promise<void> => {
await waitFor(() => {
const tableBody = container.querySelector(TABLE_BODY_SELECTOR);
expect(tableBody).not.toBeNull();
if (tableBody) {
expect(
within(tableBody as HTMLElement).queryByText('/api/test'),
).toBeInTheDocument();
}
});
};
beforeEach(() => {
jest.clearAllMocks();
// Mock getTopErrorsColumnsConfig
(getTopErrorsColumnsConfig as jest.Mock).mockReturnValue([
{
title: 'Endpoint',
dataIndex: 'endpointName',
key: 'endpointName',
},
{
title: 'Status Code',
dataIndex: 'statusCode',
key: 'statusCode',
},
{
title: 'Status Message',
dataIndex: 'statusMessage',
key: 'statusMessage',
},
{
title: 'Count',
dataIndex: 'count',
key: 'count',
},
]);
// Mock useNavigateToExplorer
(useNavigateToExplorer as jest.Mock).mockReturnValue(jest.fn());
// Mock useQueries
(useQueries as jest.Mock).mockImplementation((queryConfigs) => {
// For topErrorsDataQueries
if (
queryConfigs.length === 1 &&
queryConfigs[0].queryKey &&
queryConfigs[0].queryKey[0] === REACT_QUERY_KEY.GET_TOP_ERRORS_BY_DOMAIN
) {
return [
{
// Mock V5 API endpoint for top errors
server.use(
rest.post(V5_QUERY_RANGE_API_PATH, (_req, res, ctx) =>
res(
ctx.status(200),
ctx.json({
data: {
payload: {
data: {
result: [
{
metric: {
'http.url': '/api/test',
status_code: '500',
// eslint-disable-next-line sonarjs/no-duplicate-string
status_message: 'Internal Server Error',
data: {
results: [
{
columns: [
{
name: 'http.url',
fieldDataType: 'string',
fieldContext: 'attribute',
},
values: [[1000000100, '10']],
queryName: 'A',
legend: 'Test Legend',
},
],
},
{
name: 'response_status_code',
fieldDataType: 'string',
fieldContext: 'span',
},
{
name: 'status_message',
fieldDataType: 'string',
fieldContext: 'span',
},
{ name: 'count()', fieldDataType: 'int64', fieldContext: '' },
],
// eslint-disable-next-line sonarjs/no-duplicate-string
data: [['/api/test', '500', 'Internal Server Error', 10]],
},
],
},
},
isLoading: false,
isRefetching: false,
isError: false,
refetch: jest.fn(),
},
];
}
}),
),
),
);
// For endPointDropDownDataQueries
return [
{
data: {
// Mock V4 API endpoint for dropdown data
server.use(
rest.post('*/api/v1/query_range', (_req, res, ctx) =>
res(
ctx.status(200),
ctx.json({
payload: {
data: {
result: [
@@ -153,62 +129,13 @@ describe('TopErrors', () => {
],
},
},
},
isLoading: false,
isRefetching: false,
isError: false,
},
];
});
// Mock formatTopErrorsDataForTable
(formatTopErrorsDataForTable as jest.Mock).mockReturnValue([
{
key: '1',
endpointName: '/api/test',
statusCode: '500',
statusMessage: 'Internal Server Error',
count: 10,
},
]);
// Mock getTopErrorsQueryPayload
(getTopErrorsQueryPayload as jest.Mock).mockReturnValue([
{
queryName: 'TopErrorsQuery',
start: mockProps.timeRange.startTime,
end: mockProps.timeRange.endTime,
step: 60,
},
]);
// Mock getEndPointDetailsQueryPayload
(getEndPointDetailsQueryPayload as jest.Mock).mockReturnValue([
{},
{},
{
queryName: 'EndpointDropdownQuery',
start: mockProps.timeRange.startTime,
end: mockProps.timeRange.endTime,
step: 60,
},
]);
// Mock useNavigateToExplorer
(useNavigateToExplorer as jest.Mock).mockReturnValue(jest.fn());
// Mock getTopErrorsCoRelationQueryFilters
(getTopErrorsCoRelationQueryFilters as jest.Mock).mockReturnValue({
items: [
{ id: 'test1', key: { key: 'domain' }, op: '=', value: 'test-domain' },
{ id: 'test2', key: { key: 'endpoint' }, op: '=', value: '/api/test' },
{ id: 'test3', key: { key: 'status' }, op: '=', value: '500' },
],
op: 'AND',
});
}),
),
),
);
});
it('renders component correctly', () => {
it('renders component correctly', async () => {
// eslint-disable-next-line react/jsx-props-no-spreading
const { container } = render(<TopErrors {...mockProps} />);
@@ -216,10 +143,11 @@ describe('TopErrors', () => {
expect(screen.getByText('Errors with Status Message')).toBeInTheDocument();
expect(screen.getByText('Status Message Exists')).toBeInTheDocument();
// Find the table row and verify content
const tableBody = container.querySelector('.ant-table-tbody');
expect(tableBody).not.toBeNull();
// Wait for data to load
await waitForTableDataToLoad(container);
// Find the table row and verify content
const tableBody = container.querySelector(TABLE_BODY_SELECTOR);
if (tableBody) {
const row = within(tableBody as HTMLElement).getByRole('row');
expect(within(row).getByText('/api/test')).toBeInTheDocument();
@@ -228,35 +156,40 @@ describe('TopErrors', () => {
}
});
it('renders error state when isError is true', () => {
// Mock useQueries to return isError: true
(useQueries as jest.Mock).mockImplementationOnce(() => [
{
isError: true,
refetch: jest.fn(),
},
]);
it('renders error state when API fails', async () => {
// Mock API to return error
server.use(
rest.post(V5_QUERY_RANGE_API_PATH, (_req, res, ctx) =>
res(ctx.status(500), ctx.json({ error: 'Internal Server Error' })),
),
);
// eslint-disable-next-line react/jsx-props-no-spreading
render(<TopErrors {...mockProps} />);
// Error state should be shown with the actual text displayed in the UI
expect(
screen.getByText('Uh-oh :/ We ran into an error.'),
).toBeInTheDocument();
// Wait for error state
await waitFor(() => {
expect(
screen.getByText('Uh-oh :/ We ran into an error.'),
).toBeInTheDocument();
});
expect(screen.getByText('Please refresh this panel.')).toBeInTheDocument();
expect(screen.getByText('Refresh this panel')).toBeInTheDocument();
});
it('handles row click correctly', () => {
it('handles row click correctly', async () => {
const navigateMock = jest.fn();
(useNavigateToExplorer as jest.Mock).mockReturnValue(navigateMock);
// eslint-disable-next-line react/jsx-props-no-spreading
const { container } = render(<TopErrors {...mockProps} />);
// Wait for data to load
await waitForTableDataToLoad(container);
// Find and click on the table cell containing the endpoint
const tableBody = container.querySelector('.ant-table-tbody');
const tableBody = container.querySelector(TABLE_BODY_SELECTOR);
expect(tableBody).not.toBeNull();
if (tableBody) {
@@ -267,11 +200,28 @@ describe('TopErrors', () => {
// Check if navigateToExplorer was called with correct params
expect(navigateMock).toHaveBeenCalledWith({
filters: [
{ id: 'test1', key: { key: 'domain' }, op: '=', value: 'test-domain' },
{ id: 'test2', key: { key: 'endpoint' }, op: '=', value: '/api/test' },
{ id: 'test3', key: { key: 'status' }, op: '=', value: '500' },
],
filters: expect.arrayContaining([
expect.objectContaining({
key: expect.objectContaining({ key: 'http.url' }),
op: '=',
value: '/api/test',
}),
expect.objectContaining({
key: expect.objectContaining({ key: 'has_error' }),
op: '=',
value: 'true',
}),
expect.objectContaining({
key: expect.objectContaining({ key: 'net.peer.name' }),
op: '=',
value: 'test-domain',
}),
expect.objectContaining({
key: expect.objectContaining({ key: 'response_status_code' }),
op: '=',
value: '500',
}),
]),
dataSource: DataSource.TRACES,
startTime: mockProps.timeRange.startTime,
endTime: mockProps.timeRange.endTime,
@@ -279,24 +229,34 @@ describe('TopErrors', () => {
});
});
it('updates endpoint filter when dropdown value changes', () => {
it('updates endpoint filter when dropdown value changes', async () => {
// eslint-disable-next-line react/jsx-props-no-spreading
render(<TopErrors {...mockProps} />);
// Wait for initial load
await waitFor(() => {
expect(screen.getByRole('combobox')).toBeInTheDocument();
});
// Find the dropdown
const dropdown = screen.getByRole('combobox');
// Mock the change
fireEvent.change(dropdown, { target: { value: '/api/new-endpoint' } });
// Check if getTopErrorsQueryPayload was called with updated parameters
expect(getTopErrorsQueryPayload).toHaveBeenCalled();
// Component should re-render with new filter
expect(dropdown).toBeInTheDocument();
});
it('handles status message toggle correctly', () => {
it('handles status message toggle correctly', async () => {
// eslint-disable-next-line react/jsx-props-no-spreading
render(<TopErrors {...mockProps} />);
// Wait for initial load
await waitFor(() => {
expect(screen.getByRole('switch')).toBeInTheDocument();
});
// Find the toggle switch
const toggle = screen.getByRole('switch');
expect(toggle).toBeInTheDocument();
@@ -307,69 +267,104 @@ describe('TopErrors', () => {
// Click the toggle to turn it off
fireEvent.click(toggle);
// Check if getTopErrorsQueryPayload was called with showStatusCodeErrors=false
expect(getTopErrorsQueryPayload).toHaveBeenCalledWith(
mockProps.domainName,
mockProps.timeRange.startTime,
mockProps.timeRange.endTime,
expect.any(Object),
false,
);
// Title should change
expect(screen.getByText('All Errors')).toBeInTheDocument();
await waitFor(() => {
expect(screen.getByText('All Errors')).toBeInTheDocument();
});
// Click the toggle to turn it back on
fireEvent.click(toggle);
// Check if getTopErrorsQueryPayload was called with showStatusCodeErrors=true
expect(getTopErrorsQueryPayload).toHaveBeenCalledWith(
mockProps.domainName,
mockProps.timeRange.startTime,
mockProps.timeRange.endTime,
expect.any(Object),
true,
);
// Title should change back
expect(screen.getByText('Errors with Status Message')).toBeInTheDocument();
await waitFor(() => {
expect(screen.getByText('Errors with Status Message')).toBeInTheDocument();
});
});
it('includes toggle state in query key for cache busting', () => {
it('includes toggle state in query key for cache busting', async () => {
// eslint-disable-next-line react/jsx-props-no-spreading
render(<TopErrors {...mockProps} />);
const toggle = screen.getByRole('switch');
// Wait for initial load
await waitFor(() => {
expect(screen.getByRole('switch')).toBeInTheDocument();
});
// Initial query should include showStatusCodeErrors=true
expect(useQueries).toHaveBeenCalledWith(
expect.arrayContaining([
expect.objectContaining({
queryKey: expect.arrayContaining([
REACT_QUERY_KEY.GET_TOP_ERRORS_BY_DOMAIN,
expect.any(Object),
expect.any(String),
true,
]),
}),
]),
);
const toggle = screen.getByRole('switch');
// Click toggle
fireEvent.click(toggle);
// Query should be called with showStatusCodeErrors=false in key
expect(useQueries).toHaveBeenCalledWith(
expect.arrayContaining([
expect.objectContaining({
queryKey: expect.arrayContaining([
REACT_QUERY_KEY.GET_TOP_ERRORS_BY_DOMAIN,
expect.any(Object),
expect.any(String),
false,
]),
}),
]),
// Wait for title to change, indicating query was refetched with new key
await waitFor(() => {
expect(screen.getByText('All Errors')).toBeInTheDocument();
});
// The fact that data refetches when toggle changes proves the query key includes the toggle state
expect(toggle).toBeInTheDocument();
});
it('sends query_range v5 API call with required filters including has_error', async () => {
let capturedRequest: any;
// Override the v5 API mock to capture the request
server.use(
rest.post(V5_QUERY_RANGE_API_PATH, async (req, res, ctx) => {
capturedRequest = await req.json();
return res(
ctx.status(200),
ctx.json({
data: {
data: {
results: [
{
columns: [
{
name: 'http.url',
fieldDataType: 'string',
fieldContext: 'attribute',
},
{
name: 'response_status_code',
fieldDataType: 'string',
fieldContext: 'span',
},
{
name: 'status_message',
fieldDataType: 'string',
fieldContext: 'span',
},
{ name: 'count()', fieldDataType: 'int64', fieldContext: '' },
],
data: [['/api/test', '500', 'Internal Server Error', 10]],
},
],
},
},
}),
);
}),
);
// eslint-disable-next-line react/jsx-props-no-spreading
render(<TopErrors {...mockProps} />);
// Wait for the API call to be made
await waitFor(() => {
expect(capturedRequest).toBeDefined();
});
// Extract the filter expression from the captured request
const filterExpression =
capturedRequest.compositeQuery.queries[0].spec.filter.expression;
// Verify all required filters are present
expect(filterExpression).toContain(`kind_string = 'Client'`);
expect(filterExpression).toContain(`(http.url EXISTS OR url.full EXISTS)`);
expect(filterExpression).toContain(
`(net.peer.name = 'test-domain' OR server.address = 'test-domain')`,
);
expect(filterExpression).toContain(`has_error = true`);
expect(filterExpression).toContain(`status_message EXISTS`); // toggle is on by default
});
});

View File

@@ -2,6 +2,7 @@
import { Color } from '@signozhq/design-tokens';
import { Progress, Tag, Tooltip } from 'antd';
import { ColumnType } from 'antd/es/table';
import { convertFiltersToExpressionWithExistingQuery } from 'components/QueryBuilderV2/utils';
import {
FiltersType,
IQuickFiltersConfig,
@@ -27,6 +28,11 @@ import {
OrderByPayload,
TagFilterItem,
} from 'types/api/queryBuilder/queryBuilderData';
import {
ColumnDescriptor,
QueryRangePayloadV5,
ScalarData,
} from 'types/api/v5/queryRange';
import { QueryData } from 'types/api/widgets/getQuery';
import { EQueryType } from 'types/common/dashboard';
import { DataSource } from 'types/common/queryBuilder';
@@ -40,6 +46,18 @@ import {
EndPointsResponseRow,
} from './types';
export const isEmptyFilterValue = (value: unknown): boolean =>
value === '' || value === null || value === undefined || value === 'n/a';
/**
* Returns '-' if value is empty, otherwise returns value as string
*/
export const getDisplayValue = (value: unknown): string =>
isEmptyFilterValue(value) ? '-' : String(value);
export const getDomainNameFilterExpression = (domainName: string): string =>
`(net.peer.name = '${domainName}' OR server.address = '${domainName}')`;
export const clientKindExpression = `kind_string = 'Client'`;
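// Example: getDomainNameFilterExpression('api.example.com') returns
// "(net.peer.name = 'api.example.com' OR server.address = 'api.example.com')";
// it is ANDed with clientKindExpression when the base trace filter expressions are built below.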
export const ApiMonitoringQuickFiltersConfig: IQuickFiltersConfig[] = [
{
type: FiltersType.CHECKBOX,
@@ -816,153 +834,149 @@ export const getEndPointsQueryPayload = (
];
};
/**
* Converts filters to expression, handling http.url specially by creating (http.url OR url.full) condition
* @param filters Filters to convert
* @param baseExpression Base expression to combine with filters
* @returns Filter expression string
*/
export const convertFiltersWithUrlHandling = (
filters: IBuilderQuery['filters'],
baseExpression: string,
): string => {
if (!filters) {
return baseExpression;
}
// Check if filters contain http.url (SPAN_ATTRIBUTES.URL_PATH)
const httpUrlFilter = filters.items?.find(
(item) => item.key?.key === SPAN_ATTRIBUTES.URL_PATH,
);
// If http.url filter exists, create modified filters with (http.url OR url.full)
if (httpUrlFilter && httpUrlFilter.value) {
// Remove ALL http.url filters from items (guards against duplicates)
const otherFilters = filters.items?.filter(
(item) => item.key?.key !== SPAN_ATTRIBUTES.URL_PATH,
);
// Convert the remaining (non-URL) filters to an expression first
const {
filter: intermediateFilter,
} = convertFiltersToExpressionWithExistingQuery(
{ ...filters, items: otherFilters || [] },
baseExpression,
);
// Add the OR condition for http.url and url.full
const urlValue = httpUrlFilter.value;
const urlCondition = `(http.url = '${urlValue}' OR url.full = '${urlValue}')`;
return intermediateFilter.expression.trim()
? `${intermediateFilter.expression} AND ${urlCondition}`
: urlCondition;
}
const { filter } = convertFiltersToExpressionWithExistingQuery(
filters,
baseExpression,
);
return filter.expression;
};
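// Usage sketch (illustrative values): an http.url filter is expanded into an OR
// over http.url and url.full and ANDed with the base expression, e.g.
//   convertFiltersWithUrlHandling(
//     {
//       op: 'AND',
//       items: [
//         {
//           id: 'sketch-1',
//           key: { key: 'http.url', dataType: DataTypes.String, type: 'tag' },
//           op: '=',
//           value: '/api/users',
//         },
//       ],
//     },
//     clientKindExpression,
//   )
// produces an expression containing "kind_string = 'Client'" and
// "(http.url = '/api/users' OR url.full = '/api/users')".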
// eslint-disable-next-line sonarjs/cognitive-complexity
function buildFilterExpression(
domainName: string,
filters: IBuilderQuery['filters'],
showStatusCodeErrors: boolean,
): string {
const baseFilterParts = [
`kind_string = 'Client'`,
`(http.url EXISTS OR url.full EXISTS)`,
`(net.peer.name = '${domainName}' OR server.address = '${domainName}')`,
`has_error = true`,
];
if (showStatusCodeErrors) {
baseFilterParts.push('status_message EXISTS');
}
const filterExpression = baseFilterParts.join(' AND ');
if (!filters) {
return filterExpression;
}
const { filter } = convertFiltersToExpressionWithExistingQuery(
filters,
filterExpression,
);
return filter.expression;
}
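// Sketch of the expression produced when no extra filters are passed and
// showStatusCodeErrors is true:
//   kind_string = 'Client' AND (http.url EXISTS OR url.full EXISTS)
//   AND (net.peer.name = '<domain>' OR server.address = '<domain>')
//   AND has_error = true AND status_message EXISTS
// Any user-supplied filters are then merged into this base expression via
// convertFiltersToExpressionWithExistingQuery.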
export const getTopErrorsQueryPayload = (
domainName: string,
start: number,
end: number,
filters: IBuilderQuery['filters'],
showStatusCodeErrors = true,
): GetQueryResultsProps[] => [
{
selectedTime: 'GLOBAL_TIME',
graphType: PANEL_TYPES.TABLE,
query: {
builder: {
queryData: [
{
dataSource: DataSource.TRACES,
queryName: 'A',
aggregateOperator: 'count',
aggregateAttribute: {
id: '------false',
dataType: DataTypes.String,
key: '',
type: '',
},
timeAggregation: 'rate',
spaceAggregation: 'sum',
functions: [],
filters: {
op: 'AND',
items: [
{
id: '04da97bd',
key: {
key: 'kind_string',
dataType: DataTypes.String,
type: '',
},
op: '=',
value: 'Client',
},
{
id: 'b1af6bdb',
key: {
key: SPAN_ATTRIBUTES.URL_PATH,
dataType: DataTypes.String,
type: 'tag',
},
op: 'exists',
value: '',
},
...(showStatusCodeErrors
? [
{
id: '75d65388',
key: {
key: 'status_message',
dataType: DataTypes.String,
type: '',
},
op: 'exists',
value: '',
},
]
: []),
{
id: '4872bf91',
key: {
key: SPAN_ATTRIBUTES.SERVER_NAME,
dataType: DataTypes.String,
type: 'tag',
},
op: '=',
value: domainName,
},
{
id: 'ab4c885d',
key: {
key: 'has_error',
dataType: DataTypes.bool,
type: '',
},
op: '=',
value: true,
},
...(filters?.items || []),
],
},
expression: 'A',
disabled: false,
stepInterval: 60,
having: [],
limit: 10,
orderBy: [
{
columnName: 'timestamp',
order: 'desc',
},
],
groupBy: [
{
key: SPAN_ATTRIBUTES.URL_PATH,
dataType: DataTypes.String,
type: 'tag',
},
{
dataType: DataTypes.String,
key: 'response_status_code',
type: '',
id: 'response_status_code--string----true',
},
{
key: 'status_message',
dataType: DataTypes.String,
type: '',
},
],
legend: '',
reduceTo: 'avg',
},
],
queryFormulas: [],
queryTraceOperator: [],
},
clickhouse_sql: [
{
disabled: false,
legend: '',
name: 'A',
query: '',
},
],
id: '315b15fa-ff0c-442f-89f8-2bf4fb1af2f2',
promql: [
{
disabled: false,
legend: '',
name: 'A',
query: '',
},
],
queryType: EQueryType.QUERY_BUILDER,
},
variables: {},
): QueryRangePayloadV5 => {
const filterExpression = buildFilterExpression(
domainName,
filters,
showStatusCodeErrors,
);
return {
schemaVersion: 'v1',
start,
end,
step: 240,
},
];
requestType: 'scalar',
compositeQuery: {
queries: [
{
type: 'builder_query',
spec: {
name: 'A',
signal: 'traces',
stepInterval: 60,
disabled: false,
aggregations: [{ expression: 'count()' }],
filter: { expression: filterExpression },
groupBy: [
{
name: 'http.url',
fieldDataType: 'string',
fieldContext: 'attribute',
},
{
name: 'url.full',
fieldDataType: 'string',
fieldContext: 'attribute',
},
{
name: 'response_status_code',
fieldDataType: 'string',
fieldContext: 'span',
},
{
name: 'status_message',
fieldDataType: 'string',
fieldContext: 'span',
},
],
limit: 10,
order: [
{
key: {
name: 'count()',
},
direction: 'desc',
},
],
},
},
],
},
formatOptions: { formatTableResultForUI: true, fillGaps: false },
variables: {},
};
};
export interface EndPointsTableRowData {
key: string;
@@ -1242,63 +1256,48 @@ export const formatEndPointsDataForTable = (
return formattedData;
};
export interface TopErrorsResponseRow {
metric: {
[SPAN_ATTRIBUTES.URL_PATH]: string;
[SPAN_ATTRIBUTES.RESPONSE_STATUS_CODE]: string;
status_message: string;
};
values: [number, string][];
queryName: string;
legend: string;
}
export type TopErrorsResponseRow = ScalarData;
export interface TopErrorsTableRowData {
key: string;
endpointName: string;
statusCode: string;
statusMessage: string;
count: number | string;
count: string;
}
export const formatTopErrorsDataForTable = (
data: TopErrorsResponseRow[] | undefined,
scalarResult: TopErrorsResponseRow | undefined,
): TopErrorsTableRowData[] => {
if (!data) return [];
if (!scalarResult?.data) return [];
return data.map((row) => ({
key: v4(),
endpointName:
row.metric[SPAN_ATTRIBUTES.URL_PATH] === 'n/a' ||
row.metric[SPAN_ATTRIBUTES.URL_PATH] === undefined
? '-'
: row.metric[SPAN_ATTRIBUTES.URL_PATH],
statusCode:
row.metric[SPAN_ATTRIBUTES.RESPONSE_STATUS_CODE] === 'n/a' ||
row.metric[SPAN_ATTRIBUTES.RESPONSE_STATUS_CODE] === undefined
? '-'
: row.metric[SPAN_ATTRIBUTES.RESPONSE_STATUS_CODE],
statusMessage:
row.metric.status_message === 'n/a' ||
row.metric.status_message === undefined
? '-'
: row.metric.status_message,
count:
row.values &&
row.values[0] &&
row.values[0][1] !== undefined &&
row.values[0][1] !== 'n/a'
? row.values[0][1]
: '-',
}));
const columns = scalarResult.columns || [];
const rows = scalarResult.data || [];
return rows.map((rowData: unknown[]) => {
const rowObj: Record<string, unknown> = {};
columns.forEach((col: ColumnDescriptor, index: number) => {
rowObj[col.name] = rowData[index];
});
return {
key: v4(),
endpointName: getDisplayValue(
rowObj[SPAN_ATTRIBUTES.URL_PATH] || rowObj['url.full'],
),
statusCode: getDisplayValue(rowObj[SPAN_ATTRIBUTES.RESPONSE_STATUS_CODE]),
statusMessage: getDisplayValue(rowObj.status_message),
count: getDisplayValue(rowObj.__result_0),
};
});
};
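// Shape sketch (illustrative values): a scalar result such as
//   columns: [{ name: 'http.url' }, { name: 'response_status_code' }, { name: 'status_message' }, { name: '__result_0' }]
//   data:    [['/api/test', '500', 'Internal Server Error', 10]]
// maps to
//   { endpointName: '/api/test', statusCode: '500', statusMessage: 'Internal Server Error', count: '10' }
// with empty or 'n/a' cells rendered as '-' via getDisplayValue.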
export const getTopErrorsCoRelationQueryFilters = (
domainName: string,
endPointName: string,
statusCode: string,
): IBuilderQuery['filters'] => ({
items: [
): IBuilderQuery['filters'] => {
const items: TagFilterItem[] = [
{
id: 'ea16470b',
key: {
@@ -1330,7 +1329,10 @@ export const getTopErrorsCoRelationQueryFilters = (
op: '=',
value: domainName,
},
{
];
if (statusCode !== '-') {
items.push({
id: 'f6891e27',
key: {
key: 'response_status_code',
@@ -1340,10 +1342,14 @@ export const getTopErrorsCoRelationQueryFilters = (
},
op: '=',
value: statusCode,
},
],
op: 'AND',
});
});
}
return {
items,
op: 'AND',
};
};
export const getTopErrorsColumnsConfig = (): ColumnType<TopErrorsTableRowData>[] => [
{
@@ -1735,47 +1741,29 @@ export const getEndPointDetailsQueryPayload = (
builder: {
queryData: [
{
aggregateAttribute: {
dataType: DataTypes.String,
key: 'span_id',
type: '',
},
aggregations: [
{
expression: 'count(span_id)',
},
],
aggregateOperator: 'count',
dataSource: DataSource.TRACES,
disabled: false,
expression: 'A',
filters: {
items: [
{
id: '23450eb8',
key: {
dataType: DataTypes.String,
key: SPAN_ATTRIBUTES.SERVER_NAME,
type: 'tag',
},
op: '=',
value: domainName,
},
{
id: '212678b9',
key: {
key: 'kind_string',
dataType: DataTypes.String,
type: '',
},
op: '=',
value: 'Client',
},
...(filters?.items || []),
],
op: 'AND',
filter: {
expression: convertFiltersWithUrlHandling(
filters || { items: [], op: 'AND' },
`${getDomainNameFilterExpression(
domainName,
)} AND ${clientKindExpression} AND response_status_code EXISTS`,
),
},
functions: [],
groupBy: [
{
dataType: DataTypes.String,
key: 'response_status_code',
type: '',
type: 'span',
},
],
having: [],
@@ -1789,47 +1777,29 @@ export const getEndPointDetailsQueryPayload = (
timeAggregation: 'count',
},
{
aggregateAttribute: {
dataType: DataTypes.Float64,
key: 'duration_nano',
type: '',
},
aggregations: [
{
expression: 'p99(duration_nano)',
},
],
aggregateOperator: 'p99',
dataSource: DataSource.TRACES,
disabled: false,
expression: 'B',
filters: {
items: [
{
id: '2687dc18',
key: {
dataType: DataTypes.String,
key: SPAN_ATTRIBUTES.SERVER_NAME,
type: 'tag',
},
op: '=',
value: domainName,
},
{
id: '212678b9',
key: {
key: 'kind_string',
dataType: DataTypes.String,
type: '',
},
op: '=',
value: 'Client',
},
...(filters?.items || []),
],
op: 'AND',
filter: {
expression: convertFiltersWithUrlHandling(
filters || { items: [], op: 'AND' },
`${getDomainNameFilterExpression(
domainName,
)} AND ${clientKindExpression} AND response_status_code EXISTS`,
),
},
functions: [],
groupBy: [
{
dataType: DataTypes.String,
key: 'response_status_code',
type: '',
type: 'span',
},
],
having: [],
@@ -1846,41 +1816,21 @@ export const getEndPointDetailsQueryPayload = (
dataSource: DataSource.TRACES,
queryName: 'C',
aggregateOperator: 'rate',
aggregateAttribute: {
dataType: DataTypes.String,
id: '------false',
key: '',
type: '',
},
aggregations: [
{
expression: 'rate()',
},
],
timeAggregation: 'rate',
spaceAggregation: 'sum',
functions: [],
filters: {
items: [
{
id: '334840be',
key: {
dataType: DataTypes.String,
id: 'net.peer.name--string--tag--false',
key: 'net.peer.name',
type: 'tag',
},
op: '=',
value: domainName,
},
{
id: '212678b9',
key: {
key: 'kind_string',
dataType: DataTypes.String,
type: '',
},
op: '=',
value: 'Client',
},
...(filters?.items || []),
],
op: 'AND',
filter: {
expression: convertFiltersWithUrlHandling(
filters || { items: [], op: 'AND' },
`${getDomainNameFilterExpression(
domainName,
)} AND ${clientKindExpression} AND response_status_code EXISTS`,
),
},
expression: 'C',
disabled: false,
@@ -1892,7 +1842,7 @@ export const getEndPointDetailsQueryPayload = (
{
dataType: DataTypes.String,
key: 'response_status_code',
type: '',
type: 'span',
id: 'response_status_code--string----true',
},
],
@@ -2635,17 +2585,12 @@ export const getFormattedEndPointStatusCodeData = (
if (!data) return [];
return data.map((row) => ({
key: v4(),
statusCode:
row.data.response_status_code === 'n/a' ||
row.data.response_status_code === undefined
? '-'
: row.data.response_status_code,
count: row.data.A === 'n/a' || row.data.A === undefined ? '-' : row.data.A,
rate: row.data.C === 'n/a' || row.data.C === undefined ? '-' : row.data.C,
p99Latency:
row.data.B === 'n/a' || row.data.B === undefined
? '-'
: Math.round(Number(row.data.B) / 1000000), // Convert from nanoseconds to milliseconds,
statusCode: getDisplayValue(row.data.response_status_code),
count: isEmptyFilterValue(row.data.A) ? '-' : (row.data.A as number),
rate: isEmptyFilterValue(row.data.C) ? '-' : (row.data.C as number),
p99Latency: isEmptyFilterValue(row.data.B)
? '-'
: Math.round(Number(row.data.B) / 1000000),
}));
};
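// Example: row.data.B = '4200000' (p99 in nanoseconds) renders as
// Math.round(4200000 / 1000000) = 4 (milliseconds); 'n/a', null, or undefined
// values for A, B, or C render as '-'.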
@@ -2706,7 +2651,9 @@ export const endPointStatusCodeColumns: ColumnType<EndPointStatusCodeData>[] = [
b.p99Latency === '-' || b.p99Latency === 'n/a' ? 0 : Number(b.p99Latency);
return p99LatencyA - p99LatencyB;
},
render: (latency: number): ReactNode => <span>{latency || '-'}ms</span>,
render: (latency: number | string): ReactNode => (
<span>{latency !== '-' ? `${latency}ms` : '-'}</span>
),
},
];

View File

@@ -112,6 +112,8 @@ function AppLayout(props: AppLayoutProps): JSX.Element {
setShowPaymentFailedWarning,
] = useState<boolean>(false);
const errorBoundaryRef = useRef<Sentry.ErrorBoundary>(null);
const [showSlowApiWarning, setShowSlowApiWarning] = useState(false);
const [slowApiWarningShown, setSlowApiWarningShown] = useState(false);
@@ -378,6 +380,13 @@ function AppLayout(props: AppLayoutProps): JSX.Element {
getChangelogByVersionResponse.isSuccess,
]);
// reset error boundary on route change
useEffect(() => {
if (errorBoundaryRef.current) {
errorBoundaryRef.current.resetErrorBoundary();
}
}, [pathname]);
const isToDisplayLayout = isLoggedIn;
const routeKey = useMemo(() => getRouteKey(pathname), [pathname]);
@@ -836,7 +845,10 @@ function AppLayout(props: AppLayoutProps): JSX.Element {
})}
data-overlayscrollbars-initialize
>
<Sentry.ErrorBoundary fallback={<ErrorBoundaryFallback />}>
<Sentry.ErrorBoundary
fallback={<ErrorBoundaryFallback />}
ref={errorBoundaryRef}
>
<LayoutContent data-overlayscrollbars-initialize>
<OverlayScrollbar>
<ChildrenContainer>

View File

@@ -11,12 +11,14 @@ import { v4 } from 'uuid';
import { useCreateAlertState } from '../context';
import {
INITIAL_EVALUATION_WINDOW_STATE,
INITIAL_INFO_THRESHOLD,
INITIAL_RANDOM_THRESHOLD,
INITIAL_WARNING_THRESHOLD,
THRESHOLD_MATCH_TYPE_OPTIONS,
THRESHOLD_OPERATOR_OPTIONS,
} from '../context/constants';
import { AlertThresholdMatchType } from '../context/types';
import EvaluationSettings from '../EvaluationSettings/EvaluationSettings';
import ThresholdItem from './ThresholdItem';
import { AnomalyAndThresholdProps, UpdateThreshold } from './types';
@@ -38,12 +40,12 @@ function AlertThreshold({
alertState,
thresholdState,
setThresholdState,
setEvaluationWindow,
notificationSettings,
setNotificationSettings,
} = useCreateAlertState();
const { currentQuery } = useQueryBuilder();
const queryNames = getQueryNames(currentQuery);
useEffect(() => {
@@ -160,6 +162,54 @@ function AlertThreshold({
}),
);
const handleSetEvaluationDetailsForMeter = (): void => {
setEvaluationWindow({
type: 'SET_INITIAL_STATE_FOR_METER',
});
setThresholdState({
type: 'SET_MATCH_TYPE',
payload: AlertThresholdMatchType.IN_TOTAL,
});
};
const handleSelectedQueryChange = (value: string): void => {
// Loop through currentQuery and find the query that matches the selected query
const query = currentQuery?.builder?.queryData.find(
(query) => query.queryName === value,
);
const currentSelectedQuery = currentQuery?.builder?.queryData.find(
(query) => query.queryName === thresholdState.selectedQuery,
);
const newSelectedQuerySource = query?.source || '';
const currentSelectedQuerySource = currentSelectedQuery?.source || '';
if (newSelectedQuerySource === currentSelectedQuerySource) {
setThresholdState({
type: 'SET_SELECTED_QUERY',
payload: value,
});
return;
}
if (newSelectedQuerySource === 'meter') {
handleSetEvaluationDetailsForMeter();
} else {
setEvaluationWindow({
type: 'SET_INITIAL_STATE',
payload: INITIAL_EVALUATION_WINDOW_STATE,
});
}
setThresholdState({
type: 'SET_SELECTED_QUERY',
payload: value,
});
};
return (
<div
className={classNames(
@@ -175,12 +225,7 @@ function AlertThreshold({
</Typography.Text>
<Select
value={thresholdState.selectedQuery}
onChange={(value): void => {
setThresholdState({
type: 'SET_SELECTED_QUERY',
payload: value,
});
}}
onChange={handleSelectedQueryChange}
style={{ width: 80 }}
options={queryNames}
data-testid="alert-threshold-query-select"

View File

@@ -10,6 +10,7 @@ import { getEvaluationWindowTypeText, getTimeframeText } from './utils';
function EvaluationSettings(): JSX.Element {
const { evaluationWindow, setEvaluationWindow } = useCreateAlertState();
const [
isEvaluationWindowPopoverOpen,
setIsEvaluationWindowPopoverOpen,

View File

@@ -24,7 +24,11 @@ import {
INITIAL_EVALUATION_WINDOW_STATE,
INITIAL_NOTIFICATION_SETTINGS_STATE,
} from './constants';
import { ICreateAlertContextProps, ICreateAlertProviderProps } from './types';
import {
AlertThresholdMatchType,
ICreateAlertContextProps,
ICreateAlertProviderProps,
} from './types';
import {
advancedOptionsReducer,
alertCreationReducer,
@@ -67,6 +71,7 @@ export function CreateAlertProvider(
const { currentQuery, redirectWithQueryBuilderData } = useQueryBuilder();
const location = useLocation();
const queryParams = new URLSearchParams(location.search);
const thresholdsFromURL = queryParams.get(QueryParams.thresholds);
const [alertType, setAlertType] = useState<AlertTypes>(() => {
if (isEditMode) {
@@ -122,7 +127,28 @@ export function CreateAlertProvider(
setThresholdState({
type: 'RESET',
});
}, [alertType]);
if (thresholdsFromURL) {
try {
const thresholds = JSON.parse(thresholdsFromURL);
setThresholdState({
type: 'SET_THRESHOLDS',
payload: thresholds,
});
} catch (error) {
console.error('Error parsing thresholds from URL:', error);
}
setEvaluationWindow({
type: 'SET_INITIAL_STATE_FOR_METER',
});
setThresholdState({
type: 'SET_MATCH_TYPE',
payload: AlertThresholdMatchType.IN_TOTAL,
});
}
}, [alertType, thresholdsFromURL]);
useEffect(() => {
if (isEditMode && initialAlertState) {

View File

@@ -237,6 +237,7 @@ export type EvaluationWindowAction =
}
| { type: 'SET_EVALUATION_CADENCE_MODE'; payload: EvaluationCadenceMode }
| { type: 'SET_INITIAL_STATE'; payload: EvaluationWindowState }
| { type: 'SET_INITIAL_STATE_FOR_METER' }
| { type: 'RESET' };
export type EvaluationCadenceMode = 'default' | 'custom' | 'rrule';

View File

@@ -1,3 +1,5 @@
import { UTC_TIMEZONE } from 'components/CustomTimePicker/timezoneUtils';
import { UniversalYAxisUnit } from 'components/YAxisUnitSelector/types';
import { QueryParams } from 'constants/query';
import {
alertDefaults,
@@ -11,6 +13,7 @@ import { AlertDef } from 'types/api/alerts/def';
import { Query } from 'types/api/queryBuilder/queryBuilderData';
import { DataSource } from 'types/common/queryBuilder';
import { CumulativeWindowTimeframes } from '../EvaluationSettings/types';
import {
INITIAL_ADVANCED_OPTIONS_STATE,
INITIAL_ALERT_STATE,
@@ -210,6 +213,18 @@ export const evaluationWindowReducer = (
return INITIAL_EVALUATION_WINDOW_STATE;
case 'SET_INITIAL_STATE':
return action.payload;
case 'SET_INITIAL_STATE_FOR_METER':
return {
...state,
windowType: 'cumulative',
timeframe: CumulativeWindowTimeframes.CURRENT_DAY,
startingAt: {
time: '00:00:00',
number: '0',
timezone: UTC_TIMEZONE.value,
unit: UniversalYAxisUnit.MINUTES,
},
};
default:
return state;
}

View File

@@ -17,6 +17,7 @@ function ExplorerOptionWrapper({
isOneChartPerQuery,
splitedQueries,
signalSource,
handleChangeSelectedView,
}: ExplorerOptionsWrapperProps): JSX.Element {
const [isExplorerOptionHidden, setIsExplorerOptionHidden] = useState(false);
@@ -38,6 +39,7 @@ function ExplorerOptionWrapper({
setIsExplorerOptionHidden={setIsExplorerOptionHidden}
isOneChartPerQuery={isOneChartPerQuery}
splitedQueries={splitedQueries}
handleChangeSelectedView={handleChangeSelectedView}
/>
);
}

View File

@@ -72,10 +72,11 @@ import { Query } from 'types/api/queryBuilder/queryBuilderData';
import { ViewProps } from 'types/api/saveViews/types';
import { DataSource, StringOperators } from 'types/common/queryBuilder';
import { USER_ROLES } from 'types/roles';
import { panelTypeToExplorerView } from 'utils/explorerUtils';
import { PreservedViewsTypes } from './constants';
import ExplorerOptionsHideArea from './ExplorerOptionsHideArea';
import { PreservedViewsInLocalStorage } from './types';
import { ChangeViewFunctionType, PreservedViewsInLocalStorage } from './types';
import {
DATASOURCE_VS_ROUTES,
generateRGBAFromHex,
@@ -98,6 +99,7 @@ function ExplorerOptions({
setIsExplorerOptionHidden,
isOneChartPerQuery = false,
splitedQueries = [],
handleChangeSelectedView,
}: ExplorerOptionsProps): JSX.Element {
const [isExport, setIsExport] = useState<boolean>(false);
const [isSaveModalOpen, setIsSaveModalOpen] = useState(false);
@@ -412,13 +414,22 @@ function ExplorerOptions({
if (!currentViewDetails) return;
const { query, name, id, panelType: currentPanelType } = currentViewDetails;
handleExplorerTabChange(currentPanelType, {
query,
name,
id,
});
if (handleChangeSelectedView) {
handleChangeSelectedView(panelTypeToExplorerView[currentPanelType], {
query,
name,
id,
});
} else {
// to remove this after traces cleanup
handleExplorerTabChange(currentPanelType, {
query,
name,
id,
});
}
},
[viewsData, handleExplorerTabChange],
[viewsData, handleExplorerTabChange, handleChangeSelectedView],
);
const updatePreservedViewInLocalStorage = (option: {
@@ -524,6 +535,10 @@ function ExplorerOptions({
return;
}
if (handleChangeSelectedView) {
handleChangeSelectedView(panelTypeToExplorerView[PANEL_TYPES.LIST]);
}
history.replace(DATASOURCE_VS_ROUTES[sourcepage]);
};
@@ -1020,6 +1035,7 @@ export interface ExplorerOptionsProps {
setIsExplorerOptionHidden?: Dispatch<SetStateAction<boolean>>;
isOneChartPerQuery?: boolean;
splitedQueries?: Query[];
handleChangeSelectedView?: ChangeViewFunctionType;
}
ExplorerOptions.defaultProps = {
@@ -1029,6 +1045,7 @@ ExplorerOptions.defaultProps = {
isOneChartPerQuery: false,
splitedQueries: [],
signalSource: '',
handleChangeSelectedView: undefined,
};
export default ExplorerOptions;

View File

@@ -2,6 +2,8 @@ import { NotificationInstance } from 'antd/es/notification/interface';
import { AxiosResponse } from 'axios';
import { SaveViewWithNameProps } from 'components/ExplorerCard/types';
import { PANEL_TYPES } from 'constants/queryBuilder';
import { ICurrentQueryData } from 'hooks/useHandleExplorerTabChange';
import { ExplorerViews } from 'pages/LogsExplorer/utils';
import { Dispatch, SetStateAction } from 'react';
import { UseMutateAsyncFunction } from 'react-query';
import { ICompositeMetricQuery } from 'types/api/alerts/compositeQuery';
@@ -38,3 +40,8 @@ export type PreservedViewType =
export type PreservedViewsInLocalStorage = Partial<
Record<PreservedViewType, { key: string; value: string }>
>;
export type ChangeViewFunctionType = (
view: ExplorerViews,
querySearchParameters?: ICurrentQueryData,
) => void;

View File

@@ -108,6 +108,13 @@ function ChartPreview({
const [minTimeScale, setMinTimeScale] = useState<number>();
const [maxTimeScale, setMaxTimeScale] = useState<number>();
const [graphVisibility, setGraphVisibility] = useState<boolean[]>([]);
const legendScrollPositionRef = useRef<{
scrollTop: number;
scrollLeft: number;
}>({
scrollTop: 0,
scrollLeft: 0,
});
const { currentQuery } = useQueryBuilder();
const { minTime, maxTime, selectedTime: globalSelectedInterval } = useSelector<
@@ -296,6 +303,13 @@ function ChartPreview({
setGraphsVisibilityStates: setGraphVisibility,
enhancedLegend: true,
legendPosition,
legendScrollPosition: legendScrollPositionRef.current,
setLegendScrollPosition: (position: {
scrollTop: number;
scrollLeft: number;
}) => {
legendScrollPositionRef.current = position;
},
}),
[
yAxisUnit,

View File

@@ -36,6 +36,7 @@ function QuerySection({
// init namespace for translations
const { t } = useTranslation('alerts');
const [currentTab, setCurrentTab] = useState(queryCategory);
const [signalSource, setSignalSource] = useState<string>('metrics');
const handleQueryCategoryChange = (queryType: string): void => {
setQueryCategory(queryType as EQueryType);
@@ -48,12 +49,17 @@ function QuerySection({
const isDarkMode = useIsDarkMode();
const handleSignalSourceChange = (value: string): void => {
setSignalSource(value);
};
const renderMetricUI = (): JSX.Element => (
<QueryBuilderV2
panelType={panelType}
config={{
queryVariant: 'static',
initialDataSource: ALERTS_DATA_SOURCE_MAP[alertType],
signalSource: signalSource === 'meter' ? 'meter' : '',
}}
showTraceOperator={alertType === AlertTypes.TRACES_BASED_ALERT}
showFunctions={
@@ -62,6 +68,8 @@ function QuerySection({
alertType === AlertTypes.LOGS_BASED_ALERT
}
version={alertDef.version || 'v3'}
onSignalSourceChange={handleSignalSourceChange}
signalSourceChangeEnabled
/>
);

View File

@@ -137,8 +137,7 @@ function GeneralSettings({
if (logsCurrentTTLValues) {
setLogsTotalRetentionPeriod(logsCurrentTTLValues.default_ttl_days * 24);
setLogsS3RetentionPeriod(
logsCurrentTTLValues.cold_storage_ttl_days &&
logsCurrentTTLValues.cold_storage_ttl_days > 0
logsCurrentTTLValues.cold_storage_ttl_days
? logsCurrentTTLValues.cold_storage_ttl_days * 24
: null,
);

View File

@@ -94,6 +94,9 @@ const mockDisksWithoutS3: IDiskType[] = [
];
describe('GeneralSettings - S3 Logs Retention', () => {
const BUTTON_SELECTOR = 'button[type="button"]';
const PRIMARY_BUTTON_CLASS = 'ant-btn-primary';
beforeEach(() => {
jest.clearAllMocks();
(setRetentionApiV2 as jest.Mock).mockResolvedValue({
@@ -155,10 +158,10 @@ describe('GeneralSettings - S3 Logs Retention', () => {
await user.type(s3Input, '5');
// Find the save button in the Logs card
const buttons = logsCard?.querySelectorAll('button[type="button"]');
const buttons = logsCard?.querySelectorAll(BUTTON_SELECTOR);
// The primary button should be the save button
const saveButton = Array.from(buttons || []).find((btn) =>
btn.className.includes('ant-btn-primary'),
btn.className.includes(PRIMARY_BUTTON_CLASS),
) as HTMLButtonElement;
expect(saveButton).toBeInTheDocument();
@@ -262,9 +265,9 @@ describe('GeneralSettings - S3 Logs Retention', () => {
await user.type(totalInput, '60');
// Find the save button
const buttons = logsCard?.querySelectorAll('button[type="button"]');
const buttons = logsCard?.querySelectorAll(BUTTON_SELECTOR);
const saveButton = Array.from(buttons || []).find((btn) =>
btn.className.includes('ant-btn-primary'),
btn.className.includes(PRIMARY_BUTTON_CLASS),
) as HTMLButtonElement;
expect(saveButton).toBeInTheDocument();
@@ -329,4 +332,59 @@ describe('GeneralSettings - S3 Logs Retention', () => {
expect(dropdowns?.[1]).toHaveTextContent('Days');
});
});
describe('Test 4: Save Button State with S3 Disabled', () => {
it('should disable save button when cold_storage_ttl_days is -1 and no changes made', async () => {
const user = userEvent.setup({ pointerEventsCheck: 0 });
render(
<GeneralSettings
metricsTtlValuesPayload={mockMetricsRetention}
tracesTtlValuesPayload={mockTracesRetention}
logsTtlValuesPayload={mockLogsRetentionWithoutS3}
getAvailableDiskPayload={mockDisksWithS3}
metricsTtlValuesRefetch={jest.fn()}
tracesTtlValuesRefetch={jest.fn()}
logsTtlValuesRefetch={jest.fn()}
/>,
);
// Find the Logs card
const logsCard = screen.getByText('Logs').closest('.ant-card');
expect(logsCard).toBeInTheDocument();
// Find the save button
const buttons = logsCard?.querySelectorAll(BUTTON_SELECTOR);
const saveButton = Array.from(buttons || []).find((btn) =>
btn.className.includes(PRIMARY_BUTTON_CLASS),
) as HTMLButtonElement;
expect(saveButton).toBeInTheDocument();
// Verify save button is disabled on initial load (no changes, S3 disabled with -1)
expect(saveButton).toBeDisabled();
// Find the total retention input
const inputs = logsCard?.querySelectorAll('input[type="text"]');
const totalInput = inputs?.[0] as HTMLInputElement;
// Change total retention value to trigger button enable
await user.clear(totalInput);
await user.type(totalInput, '60');
// Button should now be enabled after change
await waitFor(() => {
expect(saveButton).not.toBeDisabled();
});
// Revert to original value (30 days displays as 1 Month)
await user.clear(totalInput);
await user.type(totalInput, '1');
// Button should be disabled again (back to original state)
await waitFor(() => {
expect(saveButton).toBeDisabled();
});
});
});
});

View File

@@ -46,8 +46,7 @@ export const convertHoursValueToRelevantUnit = (
availableUnits?: ITimeUnit[],
): ITimeUnitConversion => {
const unitsToConsider = availableUnits?.length ? availableUnits : TimeUnits;
if (value) {
if (value >= 0) {
for (let idx = unitsToConsider.length - 1; idx >= 0; idx -= 1) {
const timeUnit = unitsToConsider[idx];
const convertedValue = timeUnit.multiplier * value;
@@ -62,7 +61,7 @@ export const convertHoursValueToRelevantUnit = (
}
// Fallback to the first available unit
return { value, timeUnitValue: unitsToConsider[0].value };
return { value: -1, timeUnitValue: unitsToConsider[0].value };
};
export const convertHoursValueToRelevantUnitString = (

View File

@@ -324,6 +324,7 @@ function FullView({
panelType={selectedPanelType}
version={selectedDashboard?.data?.version || 'v3'}
isListViewPanel={selectedPanelType === PANEL_TYPES.LIST}
signalSourceChangeEnabled
// filterConfigs={filterConfigs}
// queryComponents={queryComponents}
/>

View File

@@ -17,12 +17,6 @@ export const Card = styled(CardComponent)<CardProps>`
overflow: hidden;
border-radius: 3px;
border: 1px solid var(--bg-slate-500);
background: linear-gradient(
0deg,
rgba(171, 189, 255, 0) 0%,
rgba(171, 189, 255, 0) 100%
),
#0b0c0e;
${({ isDarkMode }): StyledCSS =>
!isDarkMode &&

View File

@@ -48,6 +48,7 @@ function GridTableComponent({
widgetId,
panelType,
queryRangeRequest,
decimalPrecision,
...props
}: GridTableComponentProps): JSX.Element {
const { t } = useTranslation(['valueGraph']);
@@ -87,10 +88,15 @@ function GridTableComponent({
const newValue = { ...val };
Object.keys(val).forEach((k) => {
const unit = getColumnUnit(k, columnUnits);
if (unit) {
// the check below takes care of not adding units for rows that have n/a or null values
if (val[k] !== 'n/a' && val[k] !== null) {
newValue[k] = getYAxisFormattedValue(String(val[k]), unit);
newValue[k] = getYAxisFormattedValue(
String(val[k]),
unit,
decimalPrecision,
);
} else if (val[k] === null) {
newValue[k] = 'n/a';
}
@@ -103,7 +109,7 @@ function GridTableComponent({
return mutateDataSource;
},
[columnUnits],
[columnUnits, decimalPrecision],
);
const dataSource = useMemo(() => applyColumnUnits(originalDataSource), [

View File

@@ -1,4 +1,5 @@
import { TableProps } from 'antd';
import { PrecisionOption } from 'components/Graph/yAxisConfig';
import { PANEL_TYPES } from 'constants/queryBuilder';
import { LogsExplorerTableProps } from 'container/LogsExplorerTable/LogsExplorerTable.interfaces';
import {
@@ -15,6 +16,7 @@ export type GridTableComponentProps = {
query: Query;
thresholds?: ThresholdProps[];
columnUnits?: ColumnUnit;
decimalPrecision?: PrecisionOption;
tableProcessedDataRef?: React.MutableRefObject<RowData[]>;
sticky?: TableProps<RowData>['sticky'];
searchTerm?: string;

View File

@@ -99,7 +99,11 @@ function GridValueComponent({
rawValue={value}
value={
yAxisUnit
? getYAxisFormattedValue(String(value), yAxisUnit)
? getYAxisFormattedValue(
String(value),
yAxisUnit,
widget?.decimalPrecision,
)
: value.toString()
}
/>

View File

@@ -423,6 +423,7 @@
display: flex;
flex-direction: row;
gap: 14px;
align-items: flex-start;
.section-icon {
display: flex;
@@ -461,7 +462,6 @@
flex-direction: column;
gap: 14px;
width: 150px;
justify-content: flex-end;
.ant-btn {

View File

@@ -115,6 +115,13 @@ function EntityMetrics<T>({
const graphRef = useRef<HTMLDivElement>(null);
const dimensions = useResizeObserver(graphRef);
const { currentQuery } = useQueryBuilder();
const legendScrollPositionRef = useRef<{
scrollTop: number;
scrollLeft: number;
}>({
scrollTop: 0,
scrollLeft: 0,
});
const chartData = useMemo(
() =>
@@ -184,6 +191,13 @@ function EntityMetrics<T>({
maxTimeScale: graphTimeIntervals[idx].end,
onDragSelect: (start, end) => onDragSelect(start, end, idx),
query: currentQuery,
legendScrollPosition: legendScrollPositionRef.current,
setLegendScrollPosition: (position: {
scrollTop: number;
scrollLeft: number;
}) => {
legendScrollPositionRef.current = position;
},
});
}),
[

View File

@@ -418,6 +418,11 @@
font-size: 12px;
font-weight: 600;
}
.set-alert-btn {
cursor: pointer;
margin-left: 24px;
}
}
}

View File

@@ -19,6 +19,7 @@ import {
TablePaginationConfig,
TableProps as AntDTableProps,
Tag,
Tooltip,
Typography,
} from 'antd';
import { NotificationInstance } from 'antd/es/notification/interface';
@@ -34,15 +35,20 @@ import { getYAxisFormattedValue } from 'components/Graph/yAxisConfig';
import Tags from 'components/Tags/Tags';
import { SOMETHING_WENT_WRONG } from 'constants/api';
import { DATE_TIME_FORMATS } from 'constants/dateTimeFormats';
import { QueryParams } from 'constants/query';
import { initialQueryMeterWithType } from 'constants/queryBuilder';
import ROUTES from 'constants/routes';
import { INITIAL_ALERT_THRESHOLD_STATE } from 'container/CreateAlertV2/context/constants';
import dayjs from 'dayjs';
import { useGetDeploymentsData } from 'hooks/CustomDomain/useGetDeploymentsData';
import { useGetAllIngestionsKeys } from 'hooks/IngestionKeys/useGetAllIngestionKeys';
import useDebouncedFn from 'hooks/useDebouncedFunction';
import { useGetTenantLicense } from 'hooks/useGetTenantLicense';
import { useNotifications } from 'hooks/useNotifications';
import { isNil, isUndefined } from 'lodash-es';
import { cloneDeep, isNil, isUndefined } from 'lodash-es';
import {
ArrowUpRight,
BellPlus,
CalendarClock,
Check,
Copy,
@@ -60,6 +66,7 @@ import { useTimezone } from 'providers/Timezone';
import { ChangeEvent, useEffect, useState } from 'react';
import { useTranslation } from 'react-i18next';
import { useMutation } from 'react-query';
import { useHistory } from 'react-router-dom';
import { useCopyToClipboard } from 'react-use';
import { ErrorResponse } from 'types/api';
import {
@@ -71,6 +78,7 @@ import {
IngestionKeyProps,
PaginationProps,
} from 'types/api/ingestionKeys/types';
import { MeterAggregateOperator } from 'types/common/queryBuilder';
import { USER_ROLES } from 'types/roles';
import { getDaysUntilExpiry } from 'utils/timeUtils';
@@ -170,6 +178,8 @@ function MultiIngestionSettings(): JSX.Element {
const { isEnterpriseSelfHostedUser } = useGetTenantLicense();
const history = useHistory();
const [
hasCreateLimitForIngestionKeyError,
setHasCreateLimitForIngestionKeyError,
@@ -694,6 +704,68 @@ function MultiIngestionSettings(): JSX.Element {
const { formatTimezoneAdjustedTimestamp } = useTimezone();
const handleCreateAlert = (
APIKey: IngestionKeyProps,
signal: LimitProps,
): void => {
let metricName = '';
switch (signal.signal) {
case 'metrics':
metricName = 'signoz.meter.metric.datapoint.count';
break;
case 'traces':
metricName = 'signoz.meter.span.size';
break;
case 'logs':
metricName = 'signoz.meter.log.size';
break;
default:
return;
}
const threshold =
signal.signal === 'metrics'
? signal.config?.day?.count || 0
: signal.config?.day?.size || 0;
const query = {
...initialQueryMeterWithType,
builder: {
...initialQueryMeterWithType.builder,
queryData: [
{
...initialQueryMeterWithType.builder.queryData[0],
aggregations: [
{
...initialQueryMeterWithType.builder.queryData[0].aggregations?.[0],
metricName,
timeAggregation: MeterAggregateOperator.INCREASE,
spaceAggregation: MeterAggregateOperator.SUM,
},
],
filter: {
expression: `signoz.workspace.key.id='${APIKey.id}'`,
},
},
],
},
};
const stringifiedQuery = JSON.stringify(query);
const thresholds = cloneDeep(INITIAL_ALERT_THRESHOLD_STATE.thresholds);
thresholds[0].thresholdValue = threshold;
const URL = `${ROUTES.ALERTS_NEW}?showNewCreateAlertsPage=true&${
QueryParams.compositeQuery
}=${encodeURIComponent(stringifiedQuery)}&${
QueryParams.thresholds
}=${encodeURIComponent(JSON.stringify(thresholds))}`;
history.push(URL);
};
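// Rough shape of the URL pushed for a metrics daily-count limit (illustrative;
// assumes ROUTES.ALERTS_NEW resolves to '/alerts/new' and QueryParams maps to
// the 'compositeQuery' and 'thresholds' query keys):
//   /alerts/new?showNewCreateAlertsPage=true
//     &compositeQuery=<encoded meter query filtered by signoz.workspace.key.id>
//     &thresholds=<encoded thresholds with thresholdValue set to the daily limit>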
const columns: AntDTableProps<IngestionKeyProps>['columns'] = [
{
title: 'Ingestion Key',
@@ -1183,6 +1255,27 @@ function MultiIngestionSettings(): JSX.Element {
</>
))}
</div>
{((signalCfg.usesSize &&
limit?.config?.day?.size !== undefined) ||
(signalCfg.usesCount &&
limit?.config?.day?.count !== undefined)) && (
<Tooltip
title="Set alert on this limit"
placement="top"
arrow={false}
>
<Button
icon={<BellPlus size={14} color={Color.BG_CHERRY_400} />}
className="set-alert-btn periscope-btn ghost"
type="text"
data-testid={`set-alert-btn-${signalName}`}
onClick={(): void =>
handleCreateAlert(APIKey, limitsDict[signalName])
}
/>
</Tooltip>
)}
</div>
{/* SECOND limit usage/limit */}

View File

@@ -1,10 +1,60 @@
import { render, screen } from 'tests/test-utils';
import { QueryParams } from 'constants/query';
import { rest, server } from 'mocks-server/server';
import { render, screen, userEvent, waitFor } from 'tests/test-utils';
import { LimitProps } from 'types/api/ingestionKeys/limits/types';
import {
AllIngestionKeyProps,
IngestionKeyProps,
} from 'types/api/ingestionKeys/types';
import MultiIngestionSettings from '../MultiIngestionSettings';
// Extend the existing types to include limits with proper structure
interface TestIngestionKeyProps extends Omit<IngestionKeyProps, 'limits'> {
limits?: LimitProps[];
}
interface TestAllIngestionKeyProps extends Omit<AllIngestionKeyProps, 'data'> {
data: TestIngestionKeyProps[];
}
// Mock useHistory.push to capture navigation URL used by MultiIngestionSettings
const mockPush = jest.fn() as jest.MockedFunction<(path: string) => void>;
jest.mock('react-router-dom', () => {
// eslint-disable-next-line @typescript-eslint/no-var-requires
const actual = jest.requireActual('react-router-dom');
return {
...actual,
useHistory: (): { push: typeof mockPush } => ({ push: mockPush }),
};
});
// Mock deployments data hook to avoid unrelated network calls in this page
jest.mock(
'hooks/CustomDomain/useGetDeploymentsData',
(): Record<string, unknown> => ({
useGetDeploymentsData: (): {
data: undefined;
isLoading: boolean;
isFetching: boolean;
isError: boolean;
} => ({
data: undefined,
isLoading: false,
isFetching: false,
isError: false,
}),
}),
);
const TEST_CREATED_UPDATED = '2024-01-01T00:00:00Z';
const TEST_EXPIRES_AT = '2030-01-01T00:00:00Z';
const TEST_WORKSPACE_ID = 'w1';
const INGESTION_SETTINGS_ROUTE = '/ingestion-settings';
describe('MultiIngestionSettings Page', () => {
beforeEach(() => {
render(<MultiIngestionSettings />);
mockPush.mockClear();
});
afterEach(() => {
@@ -12,6 +62,10 @@ describe('MultiIngestionSettings Page', () => {
});
it('renders MultiIngestionSettings page without crashing', () => {
render(<MultiIngestionSettings />, undefined, {
initialRoute: INGESTION_SETTINGS_ROUTE,
});
expect(screen.getByText('Ingestion Keys')).toBeInTheDocument();
expect(
@@ -27,4 +81,181 @@ describe('MultiIngestionSettings Page', () => {
expect(aboutKeyslink).toHaveClass('learn-more');
expect(aboutKeyslink).toHaveAttribute('rel', 'noreferrer');
});
it('navigates to create alert with metrics count threshold', async () => {
const user = userEvent.setup({ pointerEventsCheck: 0 });
// Arrange API response with a metrics daily count limit so the alert button is visible
const response: TestAllIngestionKeyProps = {
status: 'success',
data: [
{
name: 'Key One',
expires_at: TEST_EXPIRES_AT,
value: 'secret',
workspace_id: TEST_WORKSPACE_ID,
id: 'k1',
created_at: TEST_CREATED_UPDATED,
updated_at: TEST_CREATED_UPDATED,
tags: [],
limits: [
{
id: 'l1',
signal: 'metrics',
config: { day: { count: 1000 } },
},
],
},
],
_pagination: { page: 1, per_page: 10, pages: 1, total: 1 },
};
server.use(
rest.get('*/workspaces/me/keys*', (_req, res, ctx) =>
res(ctx.status(200), ctx.json(response)),
),
);
// Render with initial route to test navigation
render(<MultiIngestionSettings />, undefined, {
initialRoute: INGESTION_SETTINGS_ROUTE,
});
// Wait for ingestion key to load and expand the row to show limits
await screen.findByText('Key One');
const expandButton = screen.getByRole('button', { name: /right Key One/i });
await user.click(expandButton);
// Wait for limits section to render and click metrics alert button by test id
await screen.findByText('LIMITS');
const metricsAlertBtn = (await screen.findByTestId(
'set-alert-btn-metrics',
)) as HTMLButtonElement;
await user.click(metricsAlertBtn);
// Wait for navigation to occur
await waitFor(() => {
expect(mockPush).toHaveBeenCalledTimes(1);
});
// Assert: navigation occurred with correct query parameters
const navigationCall = mockPush.mock.calls[0][0] as string;
// Check URL contains alerts/new route
expect(navigationCall).toContain('/alerts/new');
expect(navigationCall).toContain('showNewCreateAlertsPage=true');
// Parse query parameters
const urlParams = new URLSearchParams(navigationCall.split('?')[1]);
const thresholds = JSON.parse(urlParams.get(QueryParams.thresholds) || '{}');
expect(thresholds).toBeDefined();
expect(thresholds[0].thresholdValue).toBe(1000);
// Verify compositeQuery parameter exists and contains correct data
const compositeQuery = JSON.parse(
urlParams.get(QueryParams.compositeQuery) || '{}',
);
expect(compositeQuery.builder).toBeDefined();
expect(compositeQuery.builder.queryData).toBeDefined();
// Check that the query contains the correct filter expression for the key
const firstQueryData = compositeQuery.builder.queryData[0];
expect(firstQueryData.filter.expression).toContain(
"signoz.workspace.key.id='k1'",
);
// Verify metric name for metrics signal
expect(firstQueryData.aggregations[0].metricName).toBe(
'signoz.meter.metric.datapoint.count',
);
});
it('navigates to create alert for logs with size threshold', async () => {
const user = userEvent.setup({ pointerEventsCheck: 0 });
// Arrange API response with a logs daily size limit so the alert button is visible
const response: TestAllIngestionKeyProps = {
status: 'success',
data: [
{
name: 'Key Two',
expires_at: TEST_EXPIRES_AT,
value: 'secret',
workspace_id: TEST_WORKSPACE_ID,
id: 'k2',
created_at: TEST_CREATED_UPDATED,
updated_at: TEST_CREATED_UPDATED,
tags: [],
limits: [
{
id: 'l2',
signal: 'logs',
config: { day: { size: 2048 } },
},
],
},
],
_pagination: { page: 1, per_page: 10, pages: 1, total: 1 },
};
server.use(
rest.get('*/workspaces/me/keys*', (_req, res, ctx) =>
res(ctx.status(200), ctx.json(response)),
),
);
render(<MultiIngestionSettings />, undefined, {
initialRoute: INGESTION_SETTINGS_ROUTE,
});
// Wait for ingestion key to load and expand the row to show limits
await screen.findByText('Key Two');
const expandButton = screen.getByRole('button', { name: /right Key Two/i });
await user.click(expandButton);
// Wait for limits section to render and click logs alert button by test id
await screen.findByText('LIMITS');
const logsAlertBtn = (await screen.findByTestId(
'set-alert-btn-logs',
)) as HTMLButtonElement;
await user.click(logsAlertBtn);
// Wait for navigation to occur
await waitFor(() => {
expect(mockPush).toHaveBeenCalledTimes(1);
});
// Assert: navigation occurred with correct query parameters
const navigationCall = mockPush.mock.calls[0][0] as string;
// Check URL contains alerts/new route
expect(navigationCall).toContain('/alerts/new');
expect(navigationCall).toContain('showNewCreateAlertsPage=true');
// Parse query parameters
const urlParams = new URLSearchParams(navigationCall.split('?')[1]);
// Verify thresholds parameter
const thresholds = JSON.parse(urlParams.get(QueryParams.thresholds) || '{}');
expect(thresholds).toBeDefined();
expect(thresholds[0].thresholdValue).toBe(2048);
// Verify compositeQuery parameter exists and contains correct data
const compositeQuery = JSON.parse(
urlParams.get(QueryParams.compositeQuery) || '{}',
);
expect(compositeQuery.builder).toBeDefined();
expect(compositeQuery.builder.queryData).toBeDefined();
// Check that the query contains the correct filter expression for the key
const firstQueryData = compositeQuery.builder.queryData[0];
expect(firstQueryData.filter.expression).toContain(
"signoz.workspace.key.id='k2'",
);
// Verify metric name for logs signal
expect(firstQueryData.aggregations[0].metricName).toBe(
'signoz.meter.log.size',
);
});
});

View File

@@ -1,9 +1,11 @@
import './InfraMetrics.styles.scss';
import { Empty, Radio } from 'antd';
import { Empty } from 'antd';
import { RadioChangeEvent } from 'antd/lib';
import SignozRadioGroup from 'components/SignozRadioGroup/SignozRadioGroup';
import { History, Table } from 'lucide-react';
import { useState } from 'react';
import { useMemo, useState } from 'react';
import { DataSource } from 'types/common/queryBuilder';
import { VIEW_TYPES } from './constants';
import NodeMetrics from './NodeMetrics';
@@ -14,7 +16,8 @@ interface MetricsDataProps {
nodeName: string;
hostName: string;
clusterName: string;
logLineTimestamp: string;
timestamp: string;
dataSource: DataSource.LOGS | DataSource.TRACES;
}
function InfraMetrics({
@@ -22,22 +25,56 @@ function InfraMetrics({
nodeName,
hostName,
clusterName,
logLineTimestamp,
timestamp,
dataSource = DataSource.LOGS,
}: MetricsDataProps): JSX.Element {
const [selectedView, setSelectedView] = useState<string>(() =>
podName ? VIEW_TYPES.POD : VIEW_TYPES.NODE,
);
const viewOptions = useMemo(() => {
const options = [
{
label: (
<div className="view-title">
<Table size={14} />
Node
</div>
),
value: VIEW_TYPES.NODE,
},
];
if (podName) {
options.push({
label: (
<div className="view-title">
<History size={14} />
Pod
</div>
),
value: VIEW_TYPES.POD,
});
}
return options;
}, [podName]);
const handleModeChange = (e: RadioChangeEvent): void => {
setSelectedView(e.target.value);
};
if (!podName && !nodeName && !hostName) {
const emptyStateDescription =
dataSource === DataSource.TRACES
? 'No data available. Please select a span containing pod, node, or host attributes to view metrics.'
: 'No data available. Please select a valid log line containing pod, node, or host attributes to view metrics.';
return (
<div className="empty-container">
<Empty
image={Empty.PRESENTED_IMAGE_SIMPLE}
description="No data available. Please select a valid log line containing a pod, node, or host attributes to view metrics."
description={emptyStateDescription}
/>
</div>
);
@@ -45,46 +82,26 @@ function InfraMetrics({
return (
<div className="infra-metrics-container">
<Radio.Group
className="views-tabs"
onChange={handleModeChange}
<SignozRadioGroup
value={selectedView}
>
<Radio.Button
className={selectedView === VIEW_TYPES.NODE ? 'selected_view tab' : 'tab'}
value={VIEW_TYPES.NODE}
>
<div className="view-title">
<Table size={14} />
Node
</div>
</Radio.Button>
{podName && (
<Radio.Button
className={selectedView === VIEW_TYPES.POD ? 'selected_view tab' : 'tab'}
value={VIEW_TYPES.POD}
>
<div className="view-title">
<History size={14} />
Pod
</div>
</Radio.Button>
)}
</Radio.Group>
onChange={handleModeChange}
className="views-tabs"
options={viewOptions}
/>
{/* TODO(Rahul): Make a common config driven component for this and other infra metrics components */}
{selectedView === VIEW_TYPES.NODE && (
<NodeMetrics
nodeName={nodeName}
clusterName={clusterName}
hostName={hostName}
logLineTimestamp={logLineTimestamp}
timestamp={timestamp}
/>
)}
{selectedView === VIEW_TYPES.POD && podName && (
<PodMetrics
podName={podName}
clusterName={clusterName}
logLineTimestamp={logLineTimestamp}
timestamp={timestamp}
/>
)}
</div>

View File
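The InfraMetrics change above swaps hand-written Radio.Button children for an options array fed to SignozRadioGroup. The option shape it relies on, sketched in isolation (ViewOption and useViewOptions are illustrative names, and 'node'/'pod' stand in for the VIEW_TYPES values; the real component builds richer labels with icons):

import { ReactNode, useMemo } from 'react';

// Assumed option shape, inferred from the diff: a renderable label plus a
// string value that the radio group reports on change.
interface ViewOption {
  label: ReactNode;
  value: string;
}

// Rebuild the option list only when podName changes; the Pod tab is offered
// only when the selected log line or span actually carries a pod name.
function useViewOptions(podName: string): ViewOption[] {
  return useMemo(() => {
    const options: ViewOption[] = [{ label: 'Node', value: 'node' }];
    if (podName) {
      options.push({ label: 'Pod', value: 'pod' });
    }
    return options;
  }, [podName]);
}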

@@ -29,15 +29,15 @@ function NodeMetrics({
nodeName,
clusterName,
hostName,
logLineTimestamp,
timestamp,
}: {
nodeName: string;
clusterName: string;
hostName: string;
logLineTimestamp: string;
timestamp: string;
}): JSX.Element {
const { start, end, verticalLineTimestamp } = useMemo(() => {
const logTimestamp = dayjs(logLineTimestamp);
const logTimestamp = dayjs(timestamp);
const now = dayjs();
const startTime = logTimestamp.subtract(3, 'hour');
@@ -50,7 +50,7 @@ function NodeMetrics({
end: endTime.unix(),
verticalLineTimestamp: logTimestamp.unix(),
};
}, [logLineTimestamp]);
}, [timestamp]);
const { featureFlags } = useAppContext();
const dotMetricsEnabled =
@@ -83,6 +83,13 @@ function NodeMetrics({
const isDarkMode = useIsDarkMode();
const graphRef = useRef<HTMLDivElement>(null);
const dimensions = useResizeObserver(graphRef);
const legendScrollPositionRef = useRef<{
scrollTop: number;
scrollLeft: number;
}>({
scrollTop: 0,
scrollLeft: 0,
});
const chartData = useMemo(
() => queries.map(({ data }) => getUPlotChartData(data?.payload)),
@@ -109,6 +116,13 @@ function NodeMetrics({
uPlot.tzDate(new Date(timestamp * 1e3), timezone.value),
timezone: timezone.value,
query: currentQuery,
legendScrollPosition: legendScrollPositionRef.current,
setLegendScrollPosition: (position: {
scrollTop: number;
scrollLeft: number;
}) => {
legendScrollPositionRef.current = position;
},
}),
),
[

View File
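NodeMetrics (and PodMetrics below) now keep the chart legend's scroll offset in a ref so it survives option rebuilds without causing re-renders. The pattern in isolation, as a hypothetical hook (useLegendScrollPosition is an illustrative name):

import { useRef } from 'react';

interface ScrollPosition {
  scrollTop: number;
  scrollLeft: number;
}

// A ref persists across renders but never triggers one, so the legend can
// record its offset on every scroll and the chart options can read the last
// value back whenever they are rebuilt.
function useLegendScrollPosition(): {
  get: () => ScrollPosition;
  set: (position: ScrollPosition) => void;
} {
  const positionRef = useRef<ScrollPosition>({ scrollTop: 0, scrollLeft: 0 });
  return {
    get: (): ScrollPosition => positionRef.current,
    set: (position: ScrollPosition): void => {
      positionRef.current = position;
    },
  };
}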

@@ -23,14 +23,14 @@ import { getPodQueryPayload, podWidgetInfo } from './constants';
function PodMetrics({
podName,
clusterName,
logLineTimestamp,
timestamp,
}: {
podName: string;
clusterName: string;
logLineTimestamp: string;
timestamp: string;
}): JSX.Element {
const { start, end, verticalLineTimestamp } = useMemo(() => {
const logTimestamp = dayjs(logLineTimestamp);
const logTimestamp = dayjs(timestamp);
const now = dayjs();
const startTime = logTimestamp.subtract(3, 'hour');
@@ -43,7 +43,15 @@ function PodMetrics({
end: endTime.unix(),
verticalLineTimestamp: logTimestamp.unix(),
};
}, [logLineTimestamp]);
}, [timestamp]);
const legendScrollPositionRef = useRef<{
scrollTop: number;
scrollLeft: number;
}>({
scrollTop: 0,
scrollLeft: 0,
});
const { featureFlags } = useAppContext();
const dotMetricsEnabled =
@@ -91,6 +99,13 @@ function PodMetrics({
uPlot.tzDate(new Date(timestamp * 1e3), timezone.value),
timezone: timezone.value,
query: currentQuery,
legendScrollPosition: legendScrollPositionRef.current,
setLegendScrollPosition: (position: {
scrollTop: number;
scrollLeft: number;
}) => {
legendScrollPositionRef.current = position;
},
}),
),
[

View File

@@ -6,7 +6,6 @@ import { useGetExplorerQueryRange } from 'hooks/queryBuilder/useGetExplorerQuery
import { logsQueryRangeEmptyResponse } from 'mocks-server/__mockdata__/logs_query_range';
import { server } from 'mocks-server/server';
import { rest } from 'msw';
import { ExplorerViews } from 'pages/LogsExplorer/utils';
import { PreferenceContextProvider } from 'providers/preferences/context/PreferenceContextProvider';
import { QueryBuilderContext } from 'providers/QueryBuilder';
import { render, screen } from 'tests/test-utils';
@@ -122,12 +121,12 @@ describe('LogsExplorerList - empty states', () => {
<QueryBuilderContext.Provider value={mockTraceToLogsContextValue as any}>
<PreferenceContextProvider>
<LogsExplorerViews
selectedView={ExplorerViews.LIST}
setIsLoadingQueries={(): void => {}}
listQueryKeyRef={{ current: {} }}
chartQueryKeyRef={{ current: {} }}
setWarning={(): void => {}}
showLiveLogs={false}
handleChangeSelectedView={(): void => {}}
/>
</PreferenceContextProvider>
</QueryBuilderContext.Provider>,
@@ -187,12 +186,12 @@ describe('LogsExplorerList - empty states', () => {
<QueryBuilderContext.Provider value={mockTraceToLogsContextValue as any}>
<PreferenceContextProvider>
<LogsExplorerViews
selectedView={ExplorerViews.LIST}
setIsLoadingQueries={(): void => {}}
listQueryKeyRef={{ current: {} }}
chartQueryKeyRef={{ current: {} }}
setWarning={(): void => {}}
showLiveLogs={false}
handleChangeSelectedView={(): void => {}}
/>
</PreferenceContextProvider>
</QueryBuilderContext.Provider>,

View File

@@ -0,0 +1,210 @@
import {
initialQueryBuilderFormValues,
OPERATORS,
PANEL_TYPES,
} from 'constants/queryBuilder';
import { getPaginationQueryDataV2 } from 'lib/newQueryBuilder/getPaginationQueryData';
import { DataTypes } from 'types/api/queryBuilder/queryAutocompleteResponse';
import {
IBuilderQuery,
Query,
TagFilter,
} from 'types/api/queryBuilder/queryBuilderData';
import { Filter } from 'types/api/v5/queryRange';
import { LogsAggregatorOperator } from 'types/common/queryBuilder';
import { v4 } from 'uuid';
export const getListQuery = (
stagedQuery: Query | null,
): IBuilderQuery | null => {
if (!stagedQuery || stagedQuery.builder.queryData.length < 1) return null;
return stagedQuery.builder.queryData[0] ?? null;
};
export const getFrequencyChartData = (
stagedQuery: Query | null,
activeLogId: string | null,
): Query | null => {
if (!stagedQuery) {
return null;
}
const baseFirstQuery = getListQuery(stagedQuery);
if (!baseFirstQuery) {
return null;
}
let updatedFilterExpression = baseFirstQuery.filter?.expression || '';
if (activeLogId) {
updatedFilterExpression = `${updatedFilterExpression} id <= '${activeLogId}'`.trim();
}
const modifiedQueryData: IBuilderQuery = {
...baseFirstQuery,
disabled: false,
aggregateOperator: LogsAggregatorOperator.COUNT,
filter: {
...baseFirstQuery.filter,
expression: updatedFilterExpression || '',
},
...(activeLogId && {
filters: {
...baseFirstQuery.filters,
items: [
...(baseFirstQuery?.filters?.items || []),
{
id: v4(),
key: {
key: 'id',
type: '',
dataType: DataTypes.String,
},
op: OPERATORS['<='],
value: activeLogId,
},
],
op: 'AND',
},
}),
groupBy: [
{
key: 'severity_text',
dataType: DataTypes.String,
type: '',
id: 'severity_text--string----true',
},
],
legend: '{{severity_text}}',
orderBy: [],
having: {
expression: '',
},
};
const modifiedQuery: Query = {
...stagedQuery,
builder: {
...stagedQuery.builder,
queryData: [modifiedQueryData], // the list view's frequency chart uses a single queryData entry
},
};
return modifiedQuery;
};
export const getQueryByPanelType = (
query: Query | null,
selectedPanelType: PANEL_TYPES,
params: {
page?: number;
pageSize?: number;
filters?: TagFilter;
filter?: Filter;
activeLogId?: string | null;
orderBy?: string;
},
): Query | null => {
if (!query) return null;
let queryData: IBuilderQuery[] = query.builder.queryData.map((item) => ({
...item,
}));
if (selectedPanelType === PANEL_TYPES.LIST) {
const { activeLogId = null, orderBy = 'timestamp:desc' } = params;
const paginateData = getPaginationQueryDataV2({
page: params.page ?? 1,
pageSize: params.pageSize ?? 10,
});
let updatedFilters = params.filters;
let updatedFilterExpression = params.filter?.expression || '';
if (activeLogId) {
updatedFilters = {
...params.filters,
items: [
...(params.filters?.items || []),
{
id: v4(),
key: {
key: 'id',
type: '',
dataType: DataTypes.String,
},
op: OPERATORS['<='],
value: activeLogId,
},
],
op: 'AND',
};
updatedFilterExpression = `${updatedFilterExpression} id <= '${activeLogId}'`.trim();
}
// Build the orderBy array from the 'column:direction' string (defaulting to timestamp desc), with id as a tiebreaker
const [columnName, order] = orderBy.split(':');
const newOrderBy = [
{ columnName: columnName || 'timestamp', order: order || 'desc' },
{ columnName: 'id', order: order || 'desc' },
];
queryData = [
{
...(getListQuery(query) || initialQueryBuilderFormValues),
...paginateData,
...(updatedFilters ? { filters: updatedFilters } : {}),
filter: { expression: updatedFilterExpression || '' },
groupBy: [],
having: {
expression: '',
},
orderBy: newOrderBy,
disabled: false,
},
];
}
const data: Query = {
...query,
builder: {
...query.builder,
queryData,
},
};
return data;
};
export const getExportQueryData = (
query: Query | null,
panelType: PANEL_TYPES,
): Query | null => {
if (!query) return null;
if (panelType === PANEL_TYPES.LIST) {
const listQuery = getListQuery(query);
if (!listQuery) return null;
return {
...query,
builder: {
...query.builder,
queryData: [
{
...listQuery,
orderBy: [
{
columnName: 'timestamp',
order: 'desc',
},
],
limit: null,
},
],
},
};
}
return query;
};

View File
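A short usage sketch for the new explorerUtils helpers; the parameter values are examples only, and stagedQuery stands for whatever the query builder currently holds:

import { PANEL_TYPES } from 'constants/queryBuilder';
import {
  getFrequencyChartData,
  getQueryByPanelType,
} from 'container/LogsExplorerViews/explorerUtils';
import { Query } from 'types/api/queryBuilder/queryBuilderData';

declare const stagedQuery: Query | null; // normally taken from useQueryBuilder()

// List view: first staged query, paginated, ordered by timestamp then id
// descending, optionally constrained to ids at or below an active log id.
const listRequest = getQueryByPanelType(stagedQuery, PANEL_TYPES.LIST, {
  page: 1,
  pageSize: 100,
  activeLogId: null,
  orderBy: 'timestamp:desc',
});

// Frequency chart: COUNT of the same query grouped by severity_text.
const chartRequest = getFrequencyChartData(stagedQuery, null);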

@@ -11,29 +11,29 @@ import { QueryParams } from 'constants/query';
import {
initialFilters,
initialQueriesMap,
initialQueryBuilderFormValues,
OPERATORS,
PANEL_TYPES,
} from 'constants/queryBuilder';
import { DEFAULT_PER_PAGE_VALUE } from 'container/Controls/config';
import ExplorerOptionWrapper from 'container/ExplorerOptions/ExplorerOptionWrapper';
import { ChangeViewFunctionType } from 'container/ExplorerOptions/types';
import GoToTop from 'container/GoToTop';
import {} from 'container/LiveLogs/constants';
import LogsExplorerChart from 'container/LogsExplorerChart';
import LogsExplorerList from 'container/LogsExplorerList';
import LogsExplorerTable from 'container/LogsExplorerTable';
import {
getExportQueryData,
getFrequencyChartData,
getListQuery,
getQueryByPanelType,
} from 'container/LogsExplorerViews/explorerUtils';
import TimeSeriesView from 'container/TimeSeriesView/TimeSeriesView';
import { useCopyLogLink } from 'hooks/logs/useCopyLogLink';
import { useGetExplorerQueryRange } from 'hooks/queryBuilder/useGetExplorerQueryRange';
import { useGetPanelTypesQueryParam } from 'hooks/queryBuilder/useGetPanelTypesQueryParam';
import { useQueryBuilder } from 'hooks/queryBuilder/useQueryBuilder';
import { useHandleExplorerTabChange } from 'hooks/useHandleExplorerTabChange';
import { useSafeNavigate } from 'hooks/useSafeNavigate';
import useUrlQueryData from 'hooks/useUrlQueryData';
import { getPaginationQueryDataV2 } from 'lib/newQueryBuilder/getPaginationQueryData';
import { cloneDeep, defaultTo, isEmpty, isUndefined, set } from 'lodash-es';
import { isEmpty, isUndefined } from 'lodash-es';
import LiveLogs from 'pages/LiveLogs';
import { ExplorerViews } from 'pages/LogsExplorer/utils';
import {
Dispatch,
memo,
@@ -52,15 +52,10 @@ import { Warning } from 'types/api';
import { Dashboard } from 'types/api/dashboard/getAll';
import APIError from 'types/api/error';
import { ILog } from 'types/api/logs/log';
import { DataTypes } from 'types/api/queryBuilder/queryAutocompleteResponse';
import {
IBuilderQuery,
Query,
TagFilter,
} from 'types/api/queryBuilder/queryBuilderData';
import { Query, TagFilter } from 'types/api/queryBuilder/queryBuilderData';
import { Filter } from 'types/api/v5/queryRange';
import { QueryDataV3 } from 'types/api/widgets/getQuery';
import { DataSource, LogsAggregatorOperator } from 'types/common/queryBuilder';
import { DataSource } from 'types/common/queryBuilder';
import { GlobalReducer } from 'types/reducer/globalTime';
import { generateExportToDashboardLink } from 'utils/dashboard/generateExportToDashboardLink';
import { v4 } from 'uuid';
@@ -68,14 +63,13 @@ import { v4 } from 'uuid';
import LogsActionsContainer from './LogsActionsContainer';
function LogsExplorerViewsContainer({
selectedView,
setIsLoadingQueries,
listQueryKeyRef,
chartQueryKeyRef,
setWarning,
showLiveLogs,
handleChangeSelectedView,
}: {
selectedView: ExplorerViews;
setIsLoadingQueries: React.Dispatch<React.SetStateAction<boolean>>;
// eslint-disable-next-line @typescript-eslint/no-explicit-any
listQueryKeyRef: MutableRefObject<any>;
@@ -83,19 +77,14 @@ function LogsExplorerViewsContainer({
chartQueryKeyRef: MutableRefObject<any>;
setWarning: Dispatch<SetStateAction<Warning | undefined>>;
showLiveLogs: boolean;
handleChangeSelectedView: ChangeViewFunctionType;
}): JSX.Element {
const { safeNavigate } = useSafeNavigate();
const dispatch = useDispatch();
const [showFrequencyChart, setShowFrequencyChart] = useState(false);
useEffect(() => {
const frequencyChart = getFromLocalstorage(LOCALSTORAGE.SHOW_FREQUENCY_CHART);
setShowFrequencyChart(frequencyChart === 'true');
}, []);
// this is to respect the panel type present in the URL rather than defaulting it to list always.
const panelTypes = useGetPanelTypesQueryParam(PANEL_TYPES.LIST);
const [showFrequencyChart, setShowFrequencyChart] = useState(
() => getFromLocalstorage(LOCALSTORAGE.SHOW_FREQUENCY_CHART) === 'true',
);
const { activeLogId } = useCopyLogLink();
@@ -117,14 +106,9 @@ function LogsExplorerViewsContainer({
stagedQuery,
panelType,
updateAllQueriesOperators,
handleSetConfig,
} = useQueryBuilder();
const [selectedPanelType, setSelectedPanelType] = useState<PANEL_TYPES>(
panelType || PANEL_TYPES.LIST,
);
const { handleExplorerTabChange } = useHandleExplorerTabChange();
const selectedPanelType = panelType || PANEL_TYPES.LIST;
// State
const [page, setPage] = useState<number>(1);
@@ -135,27 +119,9 @@ function LogsExplorerViewsContainer({
const [orderBy, setOrderBy] = useState<string>('timestamp:desc');
const listQuery = useMemo(() => {
if (!stagedQuery || stagedQuery.builder.queryData.length < 1) return null;
return stagedQuery.builder.queryData.find((item) => !item.disabled) || null;
}, [stagedQuery]);
const isMultipleQueries = useMemo(
() =>
currentQuery?.builder?.queryData?.length > 1 ||
currentQuery?.builder?.queryFormulas?.length > 0,
[currentQuery],
);
const isGroupByExist = useMemo(() => {
const groupByCount: number = currentQuery?.builder?.queryData?.reduce<number>(
(acc, query) => acc + query.groupBy.length,
0,
);
return groupByCount > 0;
}, [currentQuery]);
const listQuery = useMemo(() => getListQuery(stagedQuery) || null, [
stagedQuery,
]);
const isLimit: boolean = useMemo(() => {
if (!listQuery) return false;
@@ -165,66 +131,9 @@ function LogsExplorerViewsContainer({
}, [logs.length, listQuery]);
useEffect(() => {
if (!stagedQuery || !listQuery) {
setListChartQuery(null);
return;
}
let updatedFilterExpression = listQuery.filter?.expression || '';
if (activeLogId) {
updatedFilterExpression = `${updatedFilterExpression} id <= '${activeLogId}'`.trim();
}
const modifiedQueryData: IBuilderQuery = {
...listQuery,
aggregateOperator: LogsAggregatorOperator.COUNT,
groupBy: [
{
key: 'severity_text',
dataType: DataTypes.String,
type: '',
id: 'severity_text--string----true',
},
],
legend: '{{severity_text}}',
filter: {
...listQuery?.filter,
expression: updatedFilterExpression || '',
},
...(activeLogId && {
filters: {
...listQuery?.filters,
items: [
...(listQuery?.filters?.items || []),
{
id: v4(),
key: {
key: 'id',
type: '',
dataType: DataTypes.String,
},
op: OPERATORS['<='],
value: activeLogId,
},
],
op: 'AND',
},
}),
};
const modifiedQuery: Query = {
...stagedQuery,
builder: {
...stagedQuery.builder,
queryData: stagedQuery.builder.queryData.map((item) => ({
...item,
...modifiedQueryData,
})),
},
};
const modifiedQuery = getFrequencyChartData(stagedQuery, activeLogId);
setListChartQuery(modifiedQuery);
}, [stagedQuery, listQuery, activeLogId]);
}, [stagedQuery, activeLogId]);
const exportDefaultQuery = useMemo(
() =>
@@ -246,7 +155,9 @@ function LogsExplorerViewsContainer({
ENTITY_VERSION_V5,
{
enabled:
showFrequencyChart && !!listChartQuery && panelType === PANEL_TYPES.LIST,
showFrequencyChart &&
!!listChartQuery &&
selectedPanelType === PANEL_TYPES.LIST,
},
{},
undefined,
@@ -264,7 +175,7 @@ function LogsExplorerViewsContainer({
error,
} = useGetExplorerQueryRange(
requestData,
panelType,
selectedPanelType,
ENTITY_VERSION_V5,
{
keepPreviousData: true,
@@ -296,77 +207,13 @@ function LogsExplorerViewsContainer({
filters: TagFilter;
filter: Filter;
},
): Query | null => {
if (!query) return null;
const paginateData = getPaginationQueryDataV2({
page: params.page,
pageSize: params.pageSize,
});
// Add filter for activeLogId if present
let updatedFilters = params.filters;
let updatedFilterExpression = params.filter?.expression || '';
if (activeLogId) {
updatedFilters = {
...params.filters,
items: [
...(params.filters?.items || []),
{
id: v4(),
key: {
key: 'id',
type: '',
dataType: DataTypes.String,
},
op: OPERATORS['<='],
value: activeLogId,
},
],
op: 'AND',
};
updatedFilterExpression = `${updatedFilterExpression} id <= '${activeLogId}'`.trim();
}
// Create orderBy array based on orderDirection
const [columnName, order] = orderBy.split(':');
const newOrderBy = [
{ columnName: columnName || 'timestamp', order: order || 'desc' },
{ columnName: 'id', order: order || 'desc' },
];
const queryData: IBuilderQuery[] =
query.builder.queryData.length > 1
? query.builder.queryData.map((item) => ({
...item,
...(selectedView !== ExplorerViews.LIST ? { order: [] } : {}),
}))
: [
{
...(listQuery || initialQueryBuilderFormValues),
...paginateData,
...(updatedFilters ? { filters: updatedFilters } : {}),
filter: {
expression: updatedFilterExpression || '',
},
...(selectedView === ExplorerViews.LIST
? { order: newOrderBy, orderBy: newOrderBy }
: { order: [] }),
},
];
const data: Query = {
...query,
builder: {
...query.builder,
queryData,
},
};
return data;
},
[activeLogId, orderBy, listQuery, selectedView],
): Query | null =>
getQueryByPanelType(query, selectedPanelType, {
...params,
activeLogId,
orderBy,
}),
[activeLogId, orderBy, selectedPanelType],
);
useEffect(() => {
@@ -412,7 +259,7 @@ function LogsExplorerViewsContainer({
if (!logEventCalledRef.current && !isUndefined(data?.payload)) {
const currentData = data?.payload?.data?.newResult?.data?.result || [];
logEvent('Logs Explorer: Page visited', {
panelType,
panelType: selectedPanelType,
isEmpty: !currentData?.[0]?.list,
});
logEventCalledRef.current = true;
@@ -420,31 +267,24 @@ function LogsExplorerViewsContainer({
// eslint-disable-next-line react-hooks/exhaustive-deps
}, [data?.payload]);
const getUpdatedQueryForExport = useCallback((): Query => {
const updatedQuery = cloneDeep(currentQuery);
set(updatedQuery, 'builder.queryData[0].pageSize', 10);
return updatedQuery;
}, [currentQuery]);
const handleExport = useCallback(
(dashboard: Dashboard | null, isNewDashboard?: boolean): void => {
if (!dashboard || !panelType) return;
if (!dashboard || !selectedPanelType) return;
const panelTypeParam = AVAILABLE_EXPORT_PANEL_TYPES.includes(panelType)
? panelType
const panelTypeParam = AVAILABLE_EXPORT_PANEL_TYPES.includes(
selectedPanelType,
)
? selectedPanelType
: PANEL_TYPES.TIME_SERIES;
const widgetId = v4();
const query =
panelType === PANEL_TYPES.LIST
? getUpdatedQueryForExport()
: exportDefaultQuery;
const query = getExportQueryData(requestData, selectedPanelType);
if (!query) return;
logEvent('Logs Explorer: Add to dashboard successful', {
panelType,
panelType: selectedPanelType,
isNewDashboard,
dashboardName: dashboard?.data?.title,
});
@@ -458,36 +298,9 @@ function LogsExplorerViewsContainer({
safeNavigate(dashboardEditView);
},
[getUpdatedQueryForExport, exportDefaultQuery, safeNavigate, panelType],
[safeNavigate, requestData, selectedPanelType],
);
useEffect(() => {
const shouldChangeView = isMultipleQueries || isGroupByExist;
if (selectedPanelType === PANEL_TYPES.LIST && shouldChangeView) {
handleExplorerTabChange(PANEL_TYPES.TIME_SERIES);
setSelectedPanelType(PANEL_TYPES.TIME_SERIES);
}
if (panelType) {
setSelectedPanelType(panelType);
}
}, [
isMultipleQueries,
isGroupByExist,
selectedPanelType,
selectedView,
handleExplorerTabChange,
panelType,
]);
useEffect(() => {
if (selectedView && selectedView === ExplorerViews.LIST && handleSetConfig) {
handleSetConfig(defaultTo(panelTypes, PANEL_TYPES.LIST), DataSource.LOGS);
}
// eslint-disable-next-line react-hooks/exhaustive-deps
}, [handleSetConfig, panelTypes]);
useEffect(() => {
const currentData = data?.payload?.data?.newResult?.data?.result || [];
if (currentData.length > 0 && currentData[0].list) {
@@ -546,19 +359,17 @@ function LogsExplorerViewsContainer({
pageSize,
minTime,
activeLogId,
panelType,
selectedView,
selectedPanelType,
dispatch,
selectedTime,
maxTime,
orderBy,
selectedPanelType,
]);
const chartData = useMemo(() => {
if (!stagedQuery) return [];
if (panelType === PANEL_TYPES.LIST) {
if (selectedPanelType === PANEL_TYPES.LIST) {
if (listChartData && listChartData.payload.data?.result.length > 0) {
return listChartData.payload.data.result;
}
@@ -578,7 +389,7 @@ function LogsExplorerViewsContainer({
const firstPayloadQueryArray = firstPayloadQuery ? [firstPayloadQuery] : [];
return isGroupByExist ? data.payload.data.result : firstPayloadQueryArray;
}, [stagedQuery, panelType, data, listChartData, listQuery]);
}, [stagedQuery, selectedPanelType, data, listChartData, listQuery]);
useEffect(() => {
if (
@@ -639,7 +450,7 @@ function LogsExplorerViewsContainer({
className="logs-frequency-chart"
isLoading={isFetchingListChartData || isLoadingListChartData}
data={chartData}
isLogsExplorerViews={panelType === PANEL_TYPES.LIST}
isLogsExplorerViews={selectedPanelType === PANEL_TYPES.LIST}
/>
</div>
)}
@@ -695,6 +506,7 @@ function LogsExplorerViewsContainer({
query={exportDefaultQuery}
onExport={handleExport}
sourcepage={DataSource.LOGS}
handleChangeSelectedView={handleChangeSelectedView}
/>
</div>
);

View File

@@ -5,12 +5,12 @@ import { useGetExplorerQueryRange } from 'hooks/queryBuilder/useGetExplorerQuery
import { logsQueryRangeSuccessResponse } from 'mocks-server/__mockdata__/logs_query_range';
import { server } from 'mocks-server/server';
import { rest } from 'msw';
import { ExplorerViews } from 'pages/LogsExplorer/utils';
import { PreferenceContextProvider } from 'providers/preferences/context/PreferenceContextProvider';
import { QueryBuilderContext } from 'providers/QueryBuilder';
import { VirtuosoMockContext } from 'react-virtuoso';
import { fireEvent, render, RenderResult, waitFor } from 'tests/test-utils';
import { TagFilterItem } from 'types/api/queryBuilder/queryBuilderData';
import { LogsAggregatorOperator } from 'types/common/queryBuilder';
import LogsExplorerViews from '..';
import {
@@ -152,12 +152,12 @@ const renderer = (): RenderResult =>
>
<PreferenceContextProvider>
<LogsExplorerViews
selectedView={ExplorerViews.LIST}
setIsLoadingQueries={(): void => {}}
listQueryKeyRef={{ current: {} }}
chartQueryKeyRef={{ current: {} }}
setWarning={(): void => {}}
showLiveLogs={false}
handleChangeSelectedView={(): void => {}}
/>
</PreferenceContextProvider>
</VirtuosoMockContext.Provider>,
@@ -218,12 +218,12 @@ describe('LogsExplorerViews -', () => {
<QueryBuilderContext.Provider value={mockQueryBuilderContextValue}>
<PreferenceContextProvider>
<LogsExplorerViews
selectedView={ExplorerViews.LIST}
setIsLoadingQueries={(): void => {}}
listQueryKeyRef={{ current: {} }}
chartQueryKeyRef={{ current: {} }}
setWarning={(): void => {}}
showLiveLogs={false}
handleChangeSelectedView={(): void => {}}
/>
</PreferenceContextProvider>
</QueryBuilderContext.Provider>,
@@ -295,12 +295,12 @@ describe('LogsExplorerViews -', () => {
<QueryBuilderContext.Provider value={customContext as any}>
<PreferenceContextProvider>
<LogsExplorerViews
selectedView={ExplorerViews.LIST}
setIsLoadingQueries={(): void => {}}
listQueryKeyRef={{ current: {} }}
chartQueryKeyRef={{ current: {} }}
setWarning={(): void => {}}
showLiveLogs={false}
handleChangeSelectedView={(): void => {}}
/>
</PreferenceContextProvider>
</QueryBuilderContext.Provider>,
@@ -323,4 +323,120 @@ describe('LogsExplorerViews -', () => {
}
});
});
describe('Queries by View', () => {
it('builds Frequency Chart query with COUNT and severity_text grouping and activeLogId bound', async () => {
// Enable frequency chart via localstorage and provide activeLogId
(useCopyLogLink as jest.Mock).mockReturnValue({
activeLogId: ACTIVE_LOG_ID,
});
// Ensure default mock return exists
(useGetExplorerQueryRange as jest.Mock).mockReturnValue({
data: { payload: logsQueryRangeSuccessNewFormatResponse },
});
// Render with LIST panel type so the frequency chart hook runs with TIME_SERIES
render(
<VirtuosoMockContext.Provider
value={{ viewportHeight: 300, itemHeight: 100 }}
>
<PreferenceContextProvider>
<QueryBuilderContext.Provider
value={
{ ...mockQueryBuilderContextValue, panelType: PANEL_TYPES.LIST } as any
}
>
<LogsExplorerViews
setIsLoadingQueries={(): void => {}}
listQueryKeyRef={{ current: {} }}
chartQueryKeyRef={{ current: {} }}
setWarning={(): void => {}}
showLiveLogs={false}
handleChangeSelectedView={(): void => {}}
/>
</QueryBuilderContext.Provider>
</PreferenceContextProvider>
</VirtuosoMockContext.Provider>,
);
await waitFor(() => {
const chartCall = (useGetExplorerQueryRange as jest.Mock).mock.calls.find(
(call) => call[1] === PANEL_TYPES.TIME_SERIES && call[0],
);
expect(chartCall).toBeDefined();
if (chartCall) {
const frequencyQuery = chartCall[0];
const first = frequencyQuery.builder.queryData[0];
// Panel type used for chart fetch
expect(chartCall[1]).toBe(PANEL_TYPES.TIME_SERIES);
// Transformations
expect(first.aggregateOperator).toBe(LogsAggregatorOperator.COUNT);
expect(first.groupBy?.[0]?.key).toBe('severity_text');
expect(first.legend).toBe('{{severity_text}}');
expect(Array.isArray(first.orderBy) && first.orderBy.length === 0).toBe(
true,
);
expect(first.having?.expression).toBe('');
// activeLogId constraints
expect(first.filter?.expression).toContain(`id <= '${ACTIVE_LOG_ID}'`);
expect(
first.filters?.items?.some(
(it: any) =>
it.key?.key === 'id' && it.op === '<=' && it.value === ACTIVE_LOG_ID,
),
).toBe(true);
}
});
});
it('builds List View query with orderBy and clears groupBy/having', async () => {
(useCopyLogLink as jest.Mock).mockReturnValue({ activeLogId: undefined });
(useGetExplorerQueryRange as jest.Mock).mockReturnValue({
data: { payload: logsQueryRangeSuccessNewFormatResponse },
});
render(
<VirtuosoMockContext.Provider
value={{ viewportHeight: 300, itemHeight: 100 }}
>
<PreferenceContextProvider>
<QueryBuilderContext.Provider
value={
{ ...mockQueryBuilderContextValue, panelType: PANEL_TYPES.LIST } as any
}
>
<LogsExplorerViews
setIsLoadingQueries={(): void => {}}
listQueryKeyRef={{ current: {} }}
chartQueryKeyRef={{ current: {} }}
setWarning={(): void => {}}
showLiveLogs={false}
handleChangeSelectedView={(): void => {}}
/>
</QueryBuilderContext.Provider>
</PreferenceContextProvider>
</VirtuosoMockContext.Provider>,
);
await waitFor(() => {
const listCall = (useGetExplorerQueryRange as jest.Mock).mock.calls.find(
(call) => call[1] === PANEL_TYPES.LIST && call[0],
);
expect(listCall).toBeDefined();
if (listCall) {
const listQueryArg = listCall[0];
const first = listQueryArg.builder.queryData[0];
expect(first.groupBy?.length ?? 0).toBe(0);
expect(first.having?.expression).toBe('');
// Default orderBy should be timestamp desc, then id desc
expect(first.orderBy).toEqual([
{ columnName: 'timestamp', order: 'desc' },
{ columnName: 'id', order: 'desc' },
]);
// Ensure the query is enabled for fetch
expect(first.disabled).toBe(false);
}
});
});
});
});

View File

@@ -33,6 +33,7 @@ function Explorer(): JSX.Element {
handleRunQuery,
stagedQuery,
updateAllQueriesOperators,
handleSetQueryData,
currentQuery,
} = useQueryBuilder();
const { safeNavigate } = useSafeNavigate();
@@ -50,6 +51,15 @@ function Explorer(): JSX.Element {
[updateAllQueriesOperators],
);
useEffect(() => {
handleSetQueryData(0, {
...initialQueryMeterWithType.builder.queryData[0],
source: 'meter',
});
// eslint-disable-next-line react-hooks/exhaustive-deps
}, []);
const exportDefaultQuery = useMemo(
() =>
updateAllQueriesOperators(

View File

@@ -115,19 +115,25 @@ describe('TopOperation API Integration', () => {
server.use(
rest.post(
'http://localhost/api/v1/service/top_operations',
'http://localhost/api/v2/service/top_operations',
async (req, res, ctx) => {
const body = await req.json();
apiCalls.push({ endpoint: TOP_OPERATIONS_ENDPOINT, body });
return res(ctx.status(200), ctx.json(mockTopOperationsData));
return res(
ctx.status(200),
ctx.json({ status: 'success', data: mockTopOperationsData }),
);
},
),
rest.post(
'http://localhost/api/v1/service/entry_point_operations',
'http://localhost/api/v2/service/entry_point_operations',
async (req, res, ctx) => {
const body = await req.json();
apiCalls.push({ endpoint: ENTRY_POINT_OPERATIONS_ENDPOINT, body });
return res(ctx.status(200), ctx.json({ data: mockEntryPointData }));
return res(
ctx.status(200),
ctx.json({ status: 'success', data: mockEntryPointData }),
);
},
),
);
@@ -162,6 +168,7 @@ describe('TopOperation API Integration', () => {
end: `${defaultApiCallExpectation.end}`,
service: defaultApiCallExpectation.service,
tags: defaultApiCallExpectation.selectedTags,
limit: 5000,
});
});
@@ -195,6 +202,7 @@ describe('TopOperation API Integration', () => {
end: `${defaultApiCallExpectation.end}`,
service: defaultApiCallExpectation.service,
tags: defaultApiCallExpectation.selectedTags,
limit: 5000,
});
});

View File
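The mocked handlers above now target the v2 service endpoints, which wrap their payload in a status envelope. The assumed shape, for reference (V2Envelope is an illustrative name):

// Assumed response envelope for the v2 service endpoints mocked above.
interface V2Envelope<T> {
  status: 'success' | 'error';
  data: T;
}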

@@ -58,6 +58,20 @@
.explore-content {
padding: 0 8px;
.y-axis-unit-selector-container {
display: flex;
align-items: center;
gap: 10px;
padding-top: 10px;
margin-bottom: 10px;
.save-unit-container {
display: flex;
align-items: center;
gap: 10px;
}
}
.ant-space {
margin-top: 10px;
margin-bottom: 20px;
@@ -75,6 +89,14 @@
.time-series-view {
min-width: 100%;
width: 100%;
position: relative;
.no-unit-warning {
position: absolute;
top: 30px;
right: 40px;
z-index: 1000;
}
}
.time-series-container {

View File

@@ -1,7 +1,7 @@
import './Explorer.styles.scss';
import * as Sentry from '@sentry/react';
import { Switch } from 'antd';
import { Switch, Tooltip } from 'antd';
import logEvent from 'api/common/logEvent';
import { QueryBuilderV2 } from 'components/QueryBuilderV2/QueryBuilderV2';
import WarningPopover from 'components/WarningPopover/WarningPopover';
@@ -25,10 +25,10 @@ import { generateExportToDashboardLink } from 'utils/dashboard/generateExportToD
import { v4 as uuid } from 'uuid';
import { MetricsExplorerEventKeys, MetricsExplorerEvents } from '../events';
// import QuerySection from './QuerySection';
import MetricDetails from '../MetricDetails/MetricDetails';
import TimeSeries from './TimeSeries';
import { ExplorerTabs } from './types';
import { splitQueryIntoOneChartPerQuery } from './utils';
import { splitQueryIntoOneChartPerQuery, useGetMetricUnits } from './utils';
const ONE_CHART_PER_QUERY_ENABLED_KEY = 'isOneChartPerQueryEnabled';
@@ -40,6 +40,31 @@ function Explorer(): JSX.Element {
currentQuery,
} = useQueryBuilder();
const { safeNavigate } = useSafeNavigate();
const [isMetricDetailsOpen, setIsMetricDetailsOpen] = useState(false);
const metricNames = useMemo(
() =>
stagedQuery?.builder.queryData.map(
(query) => query.aggregateAttribute?.key ?? '',
) ?? [],
[stagedQuery],
);
const {
units,
metrics,
isLoading: isMetricUnitsLoading,
isError: isMetricUnitsError,
} = useGetMetricUnits(metricNames);
const areAllMetricUnitsSame = useMemo(
() =>
!isMetricUnitsLoading &&
!isMetricUnitsError &&
units.length > 0 &&
units.every((unit) => unit === units[0]),
[units, isMetricUnitsLoading, isMetricUnitsError],
);
const [searchParams, setSearchParams] = useSearchParams();
const isOneChartPerQueryEnabled =
@@ -48,7 +73,31 @@ function Explorer(): JSX.Element {
const [showOneChartPerQuery, toggleShowOneChartPerQuery] = useState(
isOneChartPerQueryEnabled,
);
const [disableOneChartPerQuery, toggleDisableOneChartPerQuery] = useState(
false,
);
const [selectedTab] = useState<ExplorerTabs>(ExplorerTabs.TIME_SERIES);
const [yAxisUnit, setYAxisUnit] = useState<string>('');
useEffect(() => {
if (units.length === 0) {
setYAxisUnit('');
} else if (units.length === 1 && units[0] !== '') {
setYAxisUnit(units[0]);
} else if (areAllMetricUnitsSame && units[0] !== '') {
setYAxisUnit(units[0]);
}
// eslint-disable-next-line react-hooks/exhaustive-deps
}, [JSON.stringify(units), areAllMetricUnitsSame]);
useEffect(() => {
if (units.length > 1 && !areAllMetricUnitsSame) {
toggleShowOneChartPerQuery(true);
toggleDisableOneChartPerQuery(true);
} else {
toggleDisableOneChartPerQuery(false);
}
}, [units, areAllMetricUnitsSame]);
const handleToggleShowOneChartPerQuery = (): void => {
toggleShowOneChartPerQuery(!showOneChartPerQuery);
@@ -68,15 +117,20 @@ function Explorer(): JSX.Element {
[updateAllQueriesOperators],
);
const exportDefaultQuery = useMemo(
() =>
updateAllQueriesOperators(
currentQuery || initialQueriesMap[DataSource.METRICS],
PANEL_TYPES.TIME_SERIES,
DataSource.METRICS,
),
[currentQuery, updateAllQueriesOperators],
);
const exportDefaultQuery = useMemo(() => {
const query = updateAllQueriesOperators(
currentQuery || initialQueriesMap[DataSource.METRICS],
PANEL_TYPES.TIME_SERIES,
DataSource.METRICS,
);
if (yAxisUnit) {
return {
...query,
unit: yAxisUnit,
};
}
return query;
}, [currentQuery, updateAllQueriesOperators, yAxisUnit]);
useShareBuilderUrl({ defaultValue: defaultQuery });
@@ -90,8 +144,16 @@ function Explorer(): JSX.Element {
const widgetId = uuid();
let query = queryToExport || exportDefaultQuery;
if (yAxisUnit) {
query = {
...query,
unit: yAxisUnit,
};
}
const dashboardEditView = generateExportToDashboardLink({
query: queryToExport || exportDefaultQuery,
query,
panelType: PANEL_TYPES.TIME_SERIES,
dashboardId: dashboard.id,
widgetId,
@@ -99,7 +161,7 @@ function Explorer(): JSX.Element {
safeNavigate(dashboardEditView);
},
[exportDefaultQuery, safeNavigate],
[exportDefaultQuery, safeNavigate, yAxisUnit],
);
const splitedQueries = useMemo(
@@ -129,11 +191,17 @@ function Explorer(): JSX.Element {
<div className="explore-header">
<div className="explore-header-left-actions">
<span>1 chart/query</span>
<Switch
checked={showOneChartPerQuery}
onChange={handleToggleShowOneChartPerQuery}
size="small"
/>
<Tooltip
open={disableOneChartPerQuery ? undefined : false}
title="One chart per query cannot be disabled for multiple queries with different units."
>
<Switch
checked={showOneChartPerQuery}
onChange={handleToggleShowOneChartPerQuery}
disabled={disableOneChartPerQuery}
size="small"
/>
</Tooltip>
</div>
<div className="explore-header-right-actions">
{!isEmpty(warning) && <WarningPopover warningData={warning} />}
@@ -174,6 +242,15 @@ function Explorer(): JSX.Element {
<TimeSeries
showOneChartPerQuery={showOneChartPerQuery}
setWarning={setWarning}
areAllMetricUnitsSame={areAllMetricUnitsSame}
isMetricUnitsLoading={isMetricUnitsLoading}
isMetricUnitsError={isMetricUnitsError}
metricUnits={units}
metricNames={metricNames}
metrics={metrics}
setIsMetricDetailsOpen={setIsMetricDetailsOpen}
yAxisUnit={yAxisUnit}
setYAxisUnit={setYAxisUnit}
/>
)}
{/* TODO: Enable once we have resolved all related metrics issues */}
@@ -190,6 +267,14 @@ function Explorer(): JSX.Element {
isOneChartPerQuery={false}
splitedQueries={splitedQueries}
/>
{isMetricDetailsOpen && (
<MetricDetails
metricName={metricNames[0]}
isOpen={isMetricDetailsOpen}
onClose={(): void => setIsMetricDetailsOpen(false)}
isModalTimeSelection={false}
/>
)}
</Sentry.ErrorBoundary>
);
}

View File
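The two effects added to Explorer above derive the y-axis unit and whether one-chart-per-query must stay on from the fetched metric units. An approximate restatement of that decision as a pure function (decideUnits is an illustrative name; unlike the effects, it does not preserve a previously selected unit):

interface UnitDecision {
  yAxisUnit: string;
  forceOneChartPerQuery: boolean;
}

// A single shared, non-empty unit is applied to the y axis; metrics with
// differing units force one chart per query and leave the shared unit empty.
function decideUnits(units: string[]): UnitDecision {
  const allSame = units.length > 0 && units.every((unit) => unit === units[0]);
  if (allSame && units[0] !== '') {
    return { yAxisUnit: units[0], forceOneChartPerQuery: false };
  }
  return {
    yAxisUnit: '',
    forceOneChartPerQuery: units.length > 1 && !allSame,
  };
}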

@@ -1,15 +1,21 @@
import { Color } from '@signozhq/design-tokens';
import { Button, Tooltip, Typography } from 'antd';
import { MetricType } from 'api/metricsExplorer/getMetricsList';
import { isAxiosError } from 'axios';
import classNames from 'classnames';
import YAxisUnitSelector from 'components/YAxisUnitSelector';
import { ENTITY_VERSION_V5 } from 'constants/app';
import { initialQueriesMap, PANEL_TYPES } from 'constants/queryBuilder';
import { REACT_QUERY_KEY } from 'constants/reactQueryKeys';
import { BuilderUnitsFilter } from 'container/QueryBuilder/filters/BuilderUnitsFilter/BuilderUnits';
import TimeSeriesView from 'container/TimeSeriesView/TimeSeriesView';
import { convertDataValueToMs } from 'container/TimeSeriesView/utils';
import { useUpdateMetricMetadata } from 'hooks/metricsExplorer/useUpdateMetricMetadata';
import { useQueryBuilder } from 'hooks/queryBuilder/useQueryBuilder';
import { useNotifications } from 'hooks/useNotifications';
import { GetMetricQueryRange } from 'lib/dashboard/getQueryResults';
import { useMemo, useState } from 'react';
import { useQueries } from 'react-query';
import { AlertTriangle } from 'lucide-react';
import { useMemo } from 'react';
import { useQueries, useQueryClient } from 'react-query';
import { useSelector } from 'react-redux';
import { AppState } from 'store/reducers';
import { SuccessResponse } from 'types/api';
@@ -24,8 +30,19 @@ import { splitQueryIntoOneChartPerQuery } from './utils';
function TimeSeries({
showOneChartPerQuery,
setWarning,
areAllMetricUnitsSame,
isMetricUnitsLoading,
isMetricUnitsError,
metricUnits,
metricNames,
metrics,
setIsMetricDetailsOpen,
yAxisUnit,
setYAxisUnit,
}: TimeSeriesProps): JSX.Element {
const { stagedQuery, currentQuery } = useQueryBuilder();
const { notifications } = useNotifications();
const queryClient = useQueryClient();
const { selectedTime: globalSelectedTime, maxTime, minTime } = useSelector<
AppState,
@@ -61,8 +78,6 @@ function TimeSeries({
[showOneChartPerQuery, stagedQuery],
);
const [yAxisUnit, setYAxisUnit] = useState<string>('');
const queries = useQueries(
queryPayloads.map((payload, index) => ({
queryKey: [
@@ -83,6 +98,7 @@ function TimeSeries({
globalSelectedInterval: globalSelectedTime,
params: {
dataSource: DataSource.METRICS,
source: 'metrics-explorer',
},
},
// ENTITY_VERSION_V4,
@@ -126,32 +142,141 @@ function TimeSeries({
setYAxisUnit(value);
};
const goToMetricDetails = (): void => {
setIsMetricDetailsOpen(true);
};
const showYAxisUnitSelector = useMemo(() => {
if (metricUnits.length <= 1) {
return true;
}
if (areAllMetricUnitsSame) {
return metricUnits[0] !== '';
}
return false;
}, [metricUnits, areAllMetricUnitsSame]);
const showSaveUnitButton = useMemo(
() =>
metricUnits.length === 1 &&
Boolean(metrics?.[0]) &&
metricUnits[0] === '' &&
yAxisUnit !== '',
[metricUnits, metrics, yAxisUnit],
);
const {
mutate: updateMetricMetadata,
isLoading: isUpdatingMetricMetadata,
} = useUpdateMetricMetadata();
const handleSaveUnit = (): void => {
updateMetricMetadata(
{
metricName: metricNames[0],
payload: {
unit: yAxisUnit,
description: metrics[0]?.metadata?.description ?? '',
metricType: metrics[0]?.metadata?.metric_type as MetricType,
temporality: metrics[0]?.metadata?.temporality,
},
},
{
onSuccess: () => {
notifications.success({
message: 'Unit saved successfully',
});
queryClient.invalidateQueries([
REACT_QUERY_KEY.GET_METRIC_DETAILS,
metricNames[0],
]);
},
onError: () => {
notifications.error({
message: 'Failed to save unit',
});
},
},
);
};
return (
<>
<BuilderUnitsFilter onChange={onUnitChangeHandler} yAxisUnit={yAxisUnit} />
<div className="y-axis-unit-selector-container">
{showYAxisUnitSelector && (
<>
<YAxisUnitSelector
value={yAxisUnit}
onChange={onUnitChangeHandler}
loading={isMetricUnitsLoading}
disabled={isMetricUnitsLoading || isMetricUnitsError}
data-testid="metrics-explorer-y-axis-unit-selector"
/>
{showSaveUnitButton && (
<div className="save-unit-container">
<Typography.Text>
Save the selected unit for this metric?
</Typography.Text>
<Button
type="primary"
size="small"
loading={isUpdatingMetricMetadata}
onClick={handleSaveUnit}
>
Yes
</Button>
</div>
)}
</>
)}
</div>
<div
className={classNames({
'time-series-container': changeLayoutForOneChartPerQuery,
})}
>
{responseData.map((datapoint, index) => (
<div
className="time-series-view"
// eslint-disable-next-line react/no-array-index-key
key={index}
>
<TimeSeriesView
isFilterApplied={false}
isError={queries[index].isError}
isLoading={queries[index].isLoading}
data={datapoint}
yAxisUnit={yAxisUnit}
dataSource={DataSource.METRICS}
error={queries[index].error as APIError}
setWarning={setWarning}
/>
</div>
))}
{responseData.map((datapoint, index) => {
const isMetricUnitEmpty =
!queries[index].isLoading &&
!isMetricUnitsLoading &&
metricUnits.length > 1 &&
metricUnits[index] === '';
return (
<div
className="time-series-view"
// eslint-disable-next-line react/no-array-index-key
key={index}
>
{isMetricUnitEmpty && (
<Tooltip
className="no-unit-warning"
title={
<Typography.Text>
This metric does not have a unit. Please set one for it in the{' '}
<Typography.Link onClick={goToMetricDetails}>
metric details
</Typography.Link>{' '}
drawer.
</Typography.Text>
}
>
<AlertTriangle size={16} color={Color.BG_AMBER_400} />
</Tooltip>
)}
<TimeSeriesView
isFilterApplied={false}
isError={queries[index].isError}
isLoading={queries[index].isLoading}
data={datapoint}
yAxisUnit={yAxisUnit}
dataSource={DataSource.METRICS}
error={queries[index].error as APIError}
setWarning={setWarning}
/>
</div>
);
})}
</div>
</>
);

View File
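The visibility rules TimeSeries now applies to the unit selector and the save-unit prompt, written as standalone predicates for clarity (illustrative helpers, not exports from the diff):

// Show the unit selector when there is at most one unit, or when every
// metric already shares the same non-empty unit.
function shouldShowUnitSelector(units: string[], allUnitsSame: boolean): boolean {
  if (units.length <= 1) return true;
  return allUnitsSame && units[0] !== '';
}

// Offer to persist the selected unit only when a single metric is plotted,
// it has no stored unit yet, and the user has picked one in the dropdown.
function shouldShowSaveUnit(
  units: string[],
  hasMetric: boolean,
  selectedUnit: string,
): boolean {
  return (
    units.length === 1 && hasMetric && units[0] === '' && selectedUnit !== ''
  );
}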

@@ -15,6 +15,7 @@ import { LicenseEvent } from 'types/api/licensesV3/getActive';
import { DataSource } from 'types/common/queryBuilder';
import Explorer from '../Explorer';
import * as useGetMetricUnitsHooks from '../utils';
const mockSetSearchParams = jest.fn();
const queryClient = new QueryClient();
@@ -126,6 +127,23 @@ jest.spyOn(useQueryBuilderHooks, 'useQueryBuilder').mockReturnValue({
...mockUseQueryBuilderData,
} as any);
const Y_AXIS_UNIT_SELECTOR_TEST_ID = 'metrics-explorer-y-axis-unit-selector';
const SECONDS_UNIT_LABEL = 'Seconds (s)';
function renderExplorer(): void {
render(
<QueryClientProvider client={queryClient}>
<MemoryRouter>
<Provider store={store}>
<ErrorModalProvider>
<Explorer />
</ErrorModalProvider>
</Provider>
</MemoryRouter>
</QueryClientProvider>,
);
}
describe('Explorer', () => {
beforeEach(() => {
jest.clearAllMocks();
@@ -142,17 +160,7 @@ describe('Explorer', () => {
mockSetSearchParams,
]);
render(
<QueryClientProvider client={queryClient}>
<MemoryRouter>
<Provider store={store}>
<ErrorModalProvider>
<Explorer />
</ErrorModalProvider>
</Provider>
</MemoryRouter>
</QueryClientProvider>,
);
renderExplorer();
expect(mockUpdateAllQueriesOperators).toHaveBeenCalledWith(
initialQueriesMap[DataSource.METRICS],
@@ -167,17 +175,7 @@ describe('Explorer', () => {
mockSetSearchParams,
]);
render(
<QueryClientProvider client={queryClient}>
<MemoryRouter>
<Provider store={store}>
<ErrorModalProvider>
<Explorer />
</ErrorModalProvider>
</Provider>
</MemoryRouter>
</QueryClientProvider>,
);
renderExplorer();
const toggle = screen.getByRole('switch');
expect(toggle).toBeChecked();
@@ -189,19 +187,68 @@ describe('Explorer', () => {
mockSetSearchParams,
]);
render(
<QueryClientProvider client={queryClient}>
<MemoryRouter>
<Provider store={store}>
<ErrorModalProvider>
<Explorer />
</ErrorModalProvider>
</Provider>
</MemoryRouter>
</QueryClientProvider>,
);
renderExplorer();
const toggle = screen.getByRole('switch');
expect(toggle).not.toBeChecked();
});
it('should render pre-populated y axis unit for single metric', () => {
jest.spyOn(useGetMetricUnitsHooks, 'useGetMetricUnits').mockReturnValue({
units: ['seconds'],
isLoading: false,
isError: false,
metrics: [],
} as any);
renderExplorer();
const yAxisUnitSelector = screen.getByTestId(Y_AXIS_UNIT_SELECTOR_TEST_ID);
expect(yAxisUnitSelector).toBeInTheDocument();
expect(yAxisUnitSelector).toHaveTextContent(SECONDS_UNIT_LABEL);
});
it('should render pre-populated y axis unit for multiple metrics with same unit', () => {
jest.spyOn(useGetMetricUnitsHooks, 'useGetMetricUnits').mockReturnValue({
units: ['seconds', 'seconds'],
isLoading: false,
isError: false,
metrics: [],
} as any);
renderExplorer();
const yAxisUnitSelector = screen.getByTestId(Y_AXIS_UNIT_SELECTOR_TEST_ID);
expect(yAxisUnitSelector).toBeInTheDocument();
expect(yAxisUnitSelector).toHaveTextContent(SECONDS_UNIT_LABEL);
});
it('should hide y axis unit selector for multiple metrics with different units', () => {
jest.spyOn(useGetMetricUnitsHooks, 'useGetMetricUnits').mockReturnValue({
units: ['seconds', 'milliseconds'],
isLoading: false,
isError: false,
metrics: [],
} as any);
renderExplorer();
const yAxisUnitSelector = screen.queryByTestId(Y_AXIS_UNIT_SELECTOR_TEST_ID);
expect(yAxisUnitSelector).not.toBeInTheDocument();
});
it('should render empty y axis unit selector for a single metric with no unit', () => {
jest.spyOn(useGetMetricUnitsHooks, 'useGetMetricUnits').mockReturnValue({
units: [],
isLoading: false,
isError: false,
metrics: [],
} as any);
renderExplorer();
const yAxisUnitSelector = screen.queryByTestId(Y_AXIS_UNIT_SELECTOR_TEST_ID);
expect(yAxisUnitSelector).toBeInTheDocument();
expect(yAxisUnitSelector).toHaveTextContent('Please select a unit');
});
});

Some files were not shown because too many files have changed in this diff.