Compare commits

...

83 Commits

Author SHA1 Message Date
primus-bot[bot]
0cfb809605 chore(release): bump to v0.102.0 (#9613)
Co-authored-by: primus-bot[bot] <171087277+primus-bot[bot]@users.noreply.github.com>
2025-11-19 12:22:05 +05:30
Abhi kumar
6a378ed7b4 fix: added fix for issue-3226, where the query was getting malformed (#9603)
* fix: added fix for issue-3226, where the query was getting malformed

* chore: added test + fixed previous tests
2025-11-19 11:57:53 +05:30
Shaheer Kochai
8e41847523 fix: minor improvements to trace explorer, exceptions, and trace funnels (#9578)
* chore: hide span selector in exceptions page

* refactor: remove unnecessary order by functionality and related components from TracesView

* chore: remove unnecessary icon from QB in trace funnels step

* chore: improve result table styles in trace funnels

* chore: fix formatting

* Revert "refactor: remove unnecessary order by functionality and related components from TracesView"

This reverts commit 724e9f67af.
2025-11-19 05:48:22 +00:00
Karan Balani
779df62093 feat: tokenizerstore package & role checks in JWT Tokenizer (#9594) 2025-11-19 09:11:02 +05:30
Shaheer Kochai
3763794531 refactor: external apis query range v5 migration (#9550)
* fix: fix the issue of aggregation incorrectly falling back to count

* refactor: add support for v5 queries in endPointDetailsDataQueries of EndPointDetails

* chore: add common utility functions

* chore: add convertFiltersWithUrlHandling helper

* fix: remove the aggregateOperator fallback logic changes

* refactor: migrate external APIs -> endpoint metrics query range request to v5  (#9494)

* refactor: migrate endpoint metrics api to v5

* fix: overall improvements

* fix: add url checks

* chore: remove unnecessary tests

* chore: remove old test

* chore: aggregateAttribute to aggregations

* refactor: migrate status bar charts to v5 (#9548)

* refactor: migrate status bar charts to v5

* chore: add tests

* chore: remove unnecessary tests

* chore: aggregateAttribute to aggregations

* fix: fix the failing test

* refactor: migrate external APIs -> domain metrics query range request to v5 (#9484)

* refactor: migrate domain metrics query_range to v5

* fix: overall bugfixes

* chore: fix the failing tests

* refactor: migrate dependent services to query range v5 (#9549)

* refactor: migrate dependent services to query_range v5

* chore: remove unnecessary tests

* chore: aggregateAttribute to aggregations

* refactor: migrate rate over time and latency charts query to v5 (#9544)

* fix: fix the issue of aggregation incorrectly falling back to count

* refactor: migrate rate over time and latency charts query to v5

* chore: write tests for rate over time and latency over time charts

* chore: overall improvements to the test

* fix: add url checks

* chore: remove the unnecessary tests

* chore: aggregateAttribute to aggregations

* fix: fix the failing tests

* chore: remove unnecessary test

* refactor: migrate "all endpoints" query range request to v5 (#9557)

* feat: add support for hiding columns in GridTableComponent

* refactor: migrate all endpoints section query payload to v5

* chore: aggregateAttribute to aggregations

* test: add V5 migration tests for all endpoints tab

* fix: add http.url exists or url.full exists to ensure we don't get null data

* fix: fallback to url.full while displaying endpoint value

* fix: update renderColumnCell type to accept variable arguments

* fix: remove type casting for renderColumnCell in getAllEndpointsWidgetData

* refactor: migrate external APIs -> domain dropdown query range request to v5 (#9495)

* refactor: migrate domain dropdown request to query_range v5

* fix: add utility to add http.url or url.full to the filter expression

* chore: aggregateAttribute to aggregations

* fix: add http.url exists or url.full exists to ensure we don't get null data

* fix: fallback to url.full if http.url doesn't exist

* fix: fix the failing test

* test: add V5 migration tests for endpoint dropdown query

* fix: fix the failing ts check

* fix: fix the failing tests

* fix: fix the failing tests
2025-11-18 16:50:47 +05:30
Vikrant Gupta
e9fa68e1f3 feat(authz): add stats reporting for public dashboards (#9605)
* feat(authz): add stats reporting for public dashboards

* feat(authz): add stats reporting for public dashboards

* feat(authz): add stats reporting for public dashboards
2025-11-18 15:52:46 +05:30
Vikrant Gupta
7bd3e1c453 feat(authz): publicly shareable dashboards (#9584)
* feat(authz): base setup for public shareable dashboards

* feat(authz): add support for public masking

* feat(authz): added public path for gettable public dashboard

* feat(authz): checkpoint-1 for widget query to query range conversion

* feat(authz): checkpoint-2 for widget query to query range conversion

* feat(authz): fix widget index issue

* feat(authz): better handling for dashboard json and query

* feat(authz): use the default time range if timerange is disabled

* feat(authz): use the default time range if timerange is disabled

* feat(authz): add authz changes

* feat(authz): integrate role with dashboard anonymous access

* feat(authz): integrate the new middleware

* feat(authz): integrate the new middleware

* feat(authz): add back licensing

* feat(authz): renaming selector callback

* feat(authz): self review

* feat(authz): self review

* feat(authz): change to promql
2025-11-18 00:21:46 +05:30
Amlan Kumar Nandy
a48455b2b3 chore: fix tmp related vulnerability (#9582) 2025-11-17 13:31:40 +00:00
Karan Balani
fbb66f14ba chore: improve otel demo app setup with docker based signoz (#9567)
## 📄 Summary

Minor improvements on local setup guide doc.
2025-11-14 22:11:22 +05:30
Karan Balani
54b67d9cfd feat: add bounded cache for opaque tokenizer only for last observed at cache (#9581)
Move away from unbounded cache for `lastObservedAt` stat, which was powered by BigCache (unbounded), to Ristretto, a bounded in-memory cache (https://github.com/dgraph-io/ristretto).

This PR is first step towards moving away from unbounded caches in the system, more PRs to follow.
2025-11-14 21:23:57 +05:30
Abhishek Kumar Singh
1a193015a7 refactor: PostableRule struct (#9448)
* refactor: PostableRule struct

- made validation part of `UnmarshalJSON`
- removed validation from `processRuleDefaults` and updated signature to remove error from return type

* refactor: updated error message for missing composite query

---------

Co-authored-by: Srikanth Chekuri <srikanth.chekuri92@gmail.com>
2025-11-13 19:45:19 +00:00
Vikrant Gupta
245179cbf7 feat(authz): openfga sql migration (#9580)
* feat(authz): openfga sql migration

* feat(authz): formatting and naming

* feat(authz): formatting and naming

* feat(authz): extract function for store and model id

* feat(authz): reorder the provider
2025-11-14 00:43:02 +05:30
Yunus M
dbb6b333c8 feat: reset error boundary on pathname change (#9570) 2025-11-13 15:58:33 +05:30
Shaheer Kochai
56f8e53d88 refactor: migrate status code table to v5 (#9546)
* fix: fix the issue of aggregation incorrectly falling back to count

* refactor: add support for v5 queries in endPointDetailsDataQueries of EndPointDetails

* chore: add common utility functions

* refactor: migrate status code table to v5

* fix: status code table formatting

* chore: add tests for status code table v5 migration

* chore: add convertFiltersWithUrlHandling helper

* chore: remove unnecessary tests

* chore: aggregateAttribute to aggregations

* fix: remove the aggregateOperator fallback logic changes

* fix: fix the failing test

* fix: add response_status_code exists to the status code table query
2025-11-12 13:55:00 +00:00
Aditya Singh
2f4e371dac Fix: Preserve query on navigation b/w views | Logs Explorer code cleanup (#9496)
* feat: synchronise panel type state

* feat: refactor explorer queries

* feat: use explorer util queries

* feat: minor refactor

* feat: update test cases

* feat: remove code

* feat: minor refactor

* feat: minor refactor

* feat: update tests

* feat: update list query logic to only support first staged query

* feat: fix export query and saved views change

* feat: test fix

* feat: export link fix

---------

Co-authored-by: Nityananda Gohain <nityanandagohain@gmail.com>
2025-11-12 17:25:24 +05:30
Nikhil Mantri
db75ec56bc chore: update Services to use QBV5 (#9287) 2025-11-12 14:02:07 +05:30
primus-bot[bot]
02755a6527 chore(release): bump to v0.101.0 (#9566)
Co-authored-by: primus-bot[bot] <171087277+primus-bot[bot]@users.noreply.github.com>
Co-authored-by: Priyanshu Shrivastava <priyanshu@signoz.io>
2025-11-12 12:39:55 +05:30
Srikanth Chekuri
9f089e0784 fix(pagerduty): add severity for labels (#9538) 2025-11-12 05:51:26 +05:30
Srikanth Chekuri
fb9a7ad3cd chore: update integration dashboard json to v5 (#9534) 2025-11-12 00:09:15 +05:30
Aditya Singh
ad631d70b6 fix: add key to allow side bar nav on error thrown (#9560) 2025-11-11 17:21:06 +05:30
Vikrant Gupta
c44efeab33 fix(sessions): do not use axios base instance (#9556)
* fix(sessions): do not use axios base instance

* fix(sessions): fix test cases

* fix(sessions): add trailing slashes
2025-11-11 08:42:16 +00:00
Tushar Vats
e9743fa7ac feat: bump cloud agent version to 0.0.6 (#9298) 2025-11-11 13:58:34 +05:30
Amlan Kumar Nandy
b7ece08d3e fix: aggregation options for metric in alert condition do not get updated (#9485)
Co-authored-by: Srikanth Chekuri <srikanth.chekuri92@gmail.com>
2025-11-11 13:47:58 +07:00
Pranjul Kalsi
e5f4f5cc72 fix: preserve SMTPRequireTLS during default merge (#8478) (#9418)
The issue was with how Mergo Treats Zero values, Mergo only fills **zero-value** fields in the destination.
Since `false` is the zero value for `bool`, it always gets **replaced** by `true` from the source. Using pointers doesn’t help—`Merge` dereferences them and still treats `false` as zero.
2025-11-11 01:28:16 +05:30
Vikrant Gupta
4437630127 fix(tokenizer): do not retry 401 email_password session request (#9541) 2025-11-10 14:04:16 +00:00
Yunus M
89639b239e feat: convert duration ms to string to be passed to getYAxisFormattedValue (#9539) 2025-11-10 18:03:32 +05:30
Yunus M
785ae9f0bd feat: pass email if username is not set - pylon (#9526) 2025-11-10 17:30:32 +05:30
Abhi kumar
8752022cef fix: updated dashboard panel colors for better contrast ratio (#9500)
* fix: updated dashboard panel colors for better contrast ratio

* chore: preetier fix

* feat: added changes for the tooltip to follow cursor
2025-11-06 17:17:33 +05:30
Aditya Singh
c7e4a9c45d Fix: uplot dense points selection (#9469)
* feat: fix uplot focused series logic selection

* fix: stop propogation only if drilldown enabled

* feat: minor refactor

* feat: minor refactor

* feat: minor refactor

* feat: minor refactor

---------

Co-authored-by: Srikanth Chekuri <srikanth.chekuri92@gmail.com>
2025-11-06 11:14:02 +00:00
primus-bot[bot]
bf92c92204 chore(release): bump to v0.100.1 (#9499)
Co-authored-by: primus-bot[bot] <171087277+primus-bot[bot]@users.noreply.github.com>
2025-11-06 13:22:09 +05:30
Srikanth Chekuri
bd63633be7 fix: do not format for non aggregation columns (#9492) 2025-11-05 19:24:56 +05:30
Nikhil Mantri
1158e1199b Fix: filter with time in span scope condition builder (#9426) 2025-11-05 13:11:36 +05:30
primus-bot[bot]
0a60c49314 chore(release): bump to v0.100.0 (#9488)
Co-authored-by: primus-bot[bot] <171087277+primus-bot[bot]@users.noreply.github.com>
Co-authored-by: Priyanshu Shrivastava <priyanshu@signoz.io>
2025-11-05 12:06:42 +05:30
Ekansh Gupta
c25e3beb81 feat: changed descirption of span percentile calculation (#9487) 2025-11-05 06:23:24 +00:00
SagarRajput-7
c9e0f2b9ca fix: removed cleanup variable url function to avoid url reseting (#9449) 2025-11-05 00:33:11 +05:30
Abhi kumar
6d831849c1 perf: optimize tooltip plugin with caching, memoization, and improved… (#9421)
* perf: optimize tooltip plugin with caching, memoization, and improved DOM operations

* perf(uplot): optimize tooltip with focused sorting and O(n²) to O(n) reduction

* perf(uplot): optimize threshold rendering with batched canvas operations

* chore: pr review changes

* chore: removed last index check for tooltip generation

* chore: shifted to rendering only one points when hovered

---------

Co-authored-by: Srikanth Chekuri <srikanth.chekuri92@gmail.com>
2025-11-04 17:34:15 +00:00
aniketio-ctrl
83eeb46f99 feat(sqlstore): added sql formatter for json (#9420)
* chore: added sql formatter for json

* chore: updated json extract columns

* chore: added apend ident

* chore: resolved pr comments

* chore: resolved pr comments

* chore: resolved pr comments

* chore: resolved pr comments

* chore: minor changes

* chore: minor changes

* chore: minor changes

* chore: minor changes

* chore: resolve comments

* chore: added append value

* chore: added append value

* chore: added append value

* chore: added append value

* chore: added append value

* chore: added append value

* chore: added append value

* chore: added append value

---------

Co-authored-by: Vikrant Gupta <vikrant@signoz.io>
2025-11-04 22:05:23 +05:30
Shaheer Kochai
287558dc9d refactor: migrate External API's top 10 errors query_range request to v5 (#9476)
* feat: migrate top 10 errors query_range request to v5

* chore: remove unnecessary tests

* chore: improve the top error tests

* fix: send status_message EXISTS only if the toggle is on

* fix: get the count value and simplify the null check

* fix: send has_error = true

* chore: fall back to url.full if url.path doesn't exist

* refactor: address the PR review requested changes

* chore: add test to check if we're sending the correct filters

---------

Co-authored-by: Nityananda Gohain <nityanandagohain@gmail.com>
2025-11-04 20:09:32 +05:30
Yunus M
83aad793c2 fix: alignment issues in home page (#9459) 2025-11-04 13:13:01 +05:30
Shaheer Kochai
3eff689c85 fix: fix the issue of save button incorrectly enabled when cold_storage_ttl_days is -1 (#9458)
* fix: logs retention save button enabled when S3 disabled

* test: add test for save button state when S3 is disabled
2025-11-04 12:10:17 +05:30
Yunus M
f5bcd65e2e feat: update styles for percentile value (#9477)
* feat: update styles for percentile value

* feat: reset data on span change, remove unnecessary useMemo
2025-11-03 23:40:02 +05:30
Yunus M
e7772d93af fix: flaky multi ingestion settings test (#9478) 2025-11-03 22:21:13 +05:30
swapnil-signoz
bbf987ebd7 fix: removing duplicate creation of user if user does not exist already (#9455)
* fix: removing duplicate creation of user if user does not exist already

* test: adding api test case

* fix: updated test cases

* fix: remove unnecessary logging and clean up connection params API

* feat: add gateway fixture and integrate with signoz for connection parameters

* feat: add cloudintegrations to the test job matrix in integrationci.yaml

* fix: remove outdated comments from make_http_mocks fixture

* fix: remove deprecated ZeusURL from build configurations
2025-11-03 16:45:08 +05:30
Nityananda Gohain
105c3a3b8c fix: return coldstorage -1 if not set for logs (#9471) 2025-11-03 08:10:53 +00:00
Aditya Singh
c1a4a5b8db Log Details minor ui fix (#9463)
* feat: fix copy btn styles

* feat: minor refactor
2025-11-03 11:59:06 +05:30
aniketio-ctrl
c9591f4341 fix: formatted threshold unit in description and summary (#9350) 2025-11-02 14:27:21 +00:00
Yunus M
fd216fdee1 feat(meter): add ability to query meter data across product modules (#9142)
* feat: enable users to query meter specific data in alerts

* feat: enable metrics / meter selection in alerts and dashboards

* feat: enable setting alerts for ingestion limits

* feat: set where clause when setting alert for ingestion key

* feat(meter): handle the where clause changes

* feat: remove add alert for infinite values

* feat: add unit test cases for set alert flow

* feat: handle inital and onchange state for meter source

* feat: pass thresholds array from ingestion settings

* feat: derive source from value change rather than local state

---------

Co-authored-by: Vikrant Gupta <vikrant@signoz.io>
Co-authored-by: Srikanth Chekuri <srikanth.chekuri92@gmail.com>
2025-11-02 19:02:56 +05:30
Yunus M
f5bf4293a1 feat: span percentile - UI (#9397)
* feat: show span percentile in span details

* feat: resource attribute selection for span percentile

* feat: wait for 2 secs for the first fetch of span percentile

* feat: add unit test cases for span percentiles

* feat: use style tokens

* feat: remove redundant test assertion

* chore: resolve conflicts

* feat: reset initial wait state on span change

* feat: update payload , endpoint as per new backend changes

* feat: address review comments

* feat: fetch span percentile without specific resource attributes - first time
2025-11-01 22:57:36 +05:30
Shaheer Kochai
155a44a25d feat: add support for infra metrics in trace details (#8911)
* feat: add support for infra metrics in trace details v2

* fix: adjust the empty state if the data source is traces

* refactor: logLineTimestamp prop to timestamp

* chore: write tests for span infra metrics

* chore: return search from useLocation mock

* chore: address review changes to move inline options to useMemo

* refactor: simplify infrastructure metadata extraction logic in SpanRelatedSignals

* refactor: extract infrastructure metadata logic into utility function

* test(infraMetrics): club the similar tests

* fix: improve logs and infra tabs switching assertions

* feat: update Infra option icon to Metrics in SpanDetailsDrawer

* chore: change infra to metrics in span details drawer

* fix: fix the failing tests

---------

Co-authored-by: Nityananda Gohain <nityanandagohain@gmail.com>
2025-11-01 21:26:05 +04:30
Vishal Sharma
4b21c9d5f9 feat: add result count to data source search analytics event (#9444) 2025-10-31 12:35:24 +00:00
Yunus M
5ef0a18867 Update CODEOWNERS for frontend code (#9456) 2025-10-31 12:52:37 +05:30
SagarRajput-7
c8266d1aec fix: upgraded the axios resolution to fix vulnerability (#9454) 2025-10-31 11:53:10 +05:30
SagarRajput-7
adfd16ce1b fix: adapt the scroll reset fix in alert and histogram panels (#9322) 2025-10-30 13:31:17 +00:00
SagarRajput-7
6db74a5585 feat: allow custom precision in dashboard panels (#9054) 2025-10-30 18:50:40 +05:30
Pandey
f8e0db0085 chore: bump golangci-lint to the latest version (#9445) 2025-10-30 11:21:35 +00:00
Shaheer Kochai
01e0b36d62 fix: overall improvements to span logs drawer empty state (i.e. trace logs empty state vs. span logs empty state + UI improvements) (#9252)
* chore: remove the applied filters in related signals drawer

* chore: make the span logs highlight color more prominent

* fix: add label to open trace logs in logs explorer button

* feat: improve the span logs empty state i.e. add support for no logs for trace_id

* refactor: refactor the span logs content and make it readable

* test: add tests for span logs

* chore: improve tests

* refactor: simplify condition

* chore: remove redundant test

* fix: make trace_id logs request only if drawer is open

* chore: fix failing tests + overall improvements

* Update frontend/src/container/SpanDetailsDrawer/__tests__/SpanDetailsDrawer.test.tsx

Co-authored-by: ellipsis-dev[bot] <65095814+ellipsis-dev[bot]@users.noreply.github.com>

* chore: fix the failing test

* fix: fix the light mode styles for empty logs component

* chore: update the empty state copy

* chore: fix the failing tests by updating the assertions with correct empty state copy

---------

Co-authored-by: ellipsis-dev[bot] <65095814+ellipsis-dev[bot]@users.noreply.github.com>
2025-10-29 16:20:52 +00:00
Ekansh Gupta
e90bb016f7 feat: add span percentile for traces (#8955)
* feat: add span percentile for traces

* feat: fixed merge conflicts

* feat: fixed merge conflicts

* feat: fixed merge conflicts

* feat: added span percentile

* feat: added span percentile

* feat: added test for span percentiles

* feat: added test for span percentiles

* feat: added test for span percentiles

* feat: added test for span percentiles

* feat: removed comments

* feat: moved everything to module

* feat: refactored span percentile

* feat: refactored span percentile

* feat: refactored module package

* feat: fixed tests for span percentile

* feat: refactored span percentile and changed query

* feat: refactored span percentile and changed query

* feat: refactored span percentile and changed query

* feat: refactored span percentile and changed query

* feat: added better error handling

* feat: added better error handling

* feat: addressed pr comments

* feat: addressed pr comments

* feat: renamed translator.go

* feat: added query settings

* feat: added full query test

* feat: added fingerprinting

* feat: refactored tests

* feat: refactored to use fingerprinting and changed tests

* feat: refactored to use fingerprinting and changed tests

* feat: refactored to use fingerprinting and changed tests

* feat: changed errors

* feat: removed redundant tests

* feat: removed redundant tests

* feat: moved everything to trace aggregation and updated tests

* feat: addressed comments regarding metadatastore

* feat: addressed comments regarding metadatastore

* feat: addressed comments regarding metadatastore

* feat: addressed comments for float64

* feat: cleaned up code

* feat: cleaned up code
2025-10-29 21:35:59 +05:30
Amlan Kumar Nandy
bdecbfb7f5 chore: add missing unit tests for getLegend (#9374) 2025-10-29 16:27:20 +05:30
Nageshbansal
3dced2b082 chore(costmeter): enable costmeter by default in docker installations (#9432)
* chore(costmeter): enable costmeter by default in docker installations

* chore(costmeter): enable costmeter by default in docker installations
2025-10-29 15:24:54 +05:30
primus-bot[bot]
1285666087 chore(release): bump to v0.99.0 (#9431)
Co-authored-by: primus-bot[bot] <171087277+primus-bot[bot]@users.noreply.github.com>
2025-10-29 11:49:48 +05:30
Yunus M
1655397eaa feat: allowing switching between views when groupby is present (#9386)
* feat: allowing switching between views when groupby is present

* feat: allowing switching between views when groupby is present

* chore: remove console log
2025-10-29 05:21:10 +00:00
Shaheer Kochai
718360a966 feat: enhance s3 logs retention handling (#9371)
* feat(s3-retention): enhance S3 logs retention handling

* chore: overall improvements

* test: add tests for GeneralSettings S3 logs retention functionality

* test: improve S3 logs retention dropdown interaction and validation

* refactor: change s3 and logs response / payload keys

* chore: update the teststo adjust based on the recent payload keys changes

* chore: update the test mock value

* chore: update tests

* chore: skip the flaky test

* fix: fix the condition that would cause infinite loop and the test would fail as a result

---------

Co-authored-by: Nityananda Gohain <nityanandagohain@gmail.com>
2025-10-28 17:09:55 +00:00
Ekansh Gupta
2f5995b071 feat: changed cold storage duration to seconds in v1 (#9405)
* feat: changed cold storage duration to seconds in v1

* feat: changed cold storage duration to seconds in v1

* feat: renamed json payload

* fix: response and integration tests

---------

Co-authored-by: nityanandagohain <nityanandagohain@gmail.com>
2025-10-28 16:57:43 +00:00
Aditya Singh
a061c9de0f feat: double encode view query (#9429)
* feat: double encode view query

* feat: update test cases
2025-10-28 16:33:53 +00:00
Aditya Singh
7b1ca9a1a6 Fix: Escape HTML rendering in log body (#9413)
* feat: logs html rendering fix

* feat: remove support for \n and \t in table explorer view
2025-10-28 04:29:52 +00:00
Amlan Kumar Nandy
0d1131e99f chore: add data test ids for alerts e2e tests (#9384) 2025-10-27 17:55:06 +00:00
Shaheer Kochai
44d1d0f994 feat(logs-context): implement priority-based resource attribute selection (#9303)
* feat(LogsExplorerContext): implement priority-based resource attribute selection

* chore: write tests for useInitialQuery custom hook

* fix: prevent duplicate context filters + revert the existing regex

* chore: improve the test

* chore: overall improvements

* refactor: make getFallbackItems single responsibility

* refactor: move util functions to util.ts

* refactor: simplify the findFirstPriorityItem util

* chore: improve assertions in useInitialQuery tests

* refactor: handle deduplication at the end

* chore: add comments to clarify the priority categories and prioritization strategy
2025-10-27 13:52:39 +00:00
Pranjul Kalsi
bdce97a727 fix: replace fmt.Errorf with signoz/pkg/errors and update golangci-li… (#9373)
This PR fulfills the requirements of #9069 by:

- Adding a golangci-lint directive (forbidigo) to disallow all fmt.Errorf usages.
- Replacing existing fmt.Errorf instances with structured errors from github.com/SigNoz/signoz/pkg/errors for consistent error classification and lint compliance.
- Verified lint and build integrity.
2025-10-27 16:30:18 +05:30
Shaheer Kochai
5f8cfbe474 feat(quick-filters): improve filter visibility and auto-open behavior (#9253)
* feat(quick-filters): improve filter visibility and auto-open behavior

- Prioritize checked filter values to top of list
- Add visual separator and count indicator when collapsed
- Auto-open filters when they contain active query filters

* chore: remove the unnecessary parentheses

* chore: write tests

* chore: overall improvements

* chore: remove the applied filters count from quick filters

* chore: run prettier on Checkbox.styles.scss

* test(quick-filters): consolidate the tests

* chore: memoize isSomeFilterPresentForCurrentAttribute
2025-10-26 17:24:31 +00:00
SagarRajput-7
55c2f98768 fix: removed option param cleanup from variable function (#9411) 2025-10-26 15:02:56 +05:30
Amlan Kumar Nandy
624bb5cc62 chore: enable editing of unit from metric details (#8839) 2025-10-25 16:33:48 +05:30
SagarRajput-7
95f8fa1566 fix: fix drag select not working in panel edit mode (#9130) 2025-10-25 10:46:22 +00:00
SagarRajput-7
fa97e63912 fix: added test cases for exportoption wrapper and export function (#9321) 2025-10-25 10:33:59 +00:00
SagarRajput-7
c8419c1f82 fix: changed metric time and space type reset and change logic (#9066) 2025-10-25 15:51:45 +05:30
SagarRajput-7
e05ede3978 fix: fix threshold validation mismatch (#9196) 2025-10-25 09:57:56 +00:00
SagarRajput-7
437d0d1345 feat: added variable in url and made dashboard sync around that and sharable with user friendly format (#8874) 2025-10-25 15:16:07 +05:30
Nageshbansal
64e379c413 chore(statsreporter): adds statscollector for config (#9407)
* chore(statsreporter): adds statscollector for config

* chore(statsreporter): resolves review comments
2025-10-24 19:28:19 +05:30
SagarRajput-7
d05d394f57 chore: update slow running test in tracesExplorer test (#9396) 2025-10-23 11:02:02 +05:30
Vikrant Gupta
b4e5085a5a fix(sqlschema): postgres sqlschema get table operation (#9395)
* fix(sqlschema): postgres sqlschema get table operation

* fix(sqlschema): postgres sqlschema get table operation
2025-10-22 19:02:15 +05:30
Abhi kumar
88f7502a15 fix: prevent memory leaks from uncleaned uPlot event listeners (#9320) 2025-10-22 07:19:11 +00:00
primus-bot[bot]
b0442761ac chore(release): bump to v0.98.0 (#9393)
Co-authored-by: primus-bot[bot] <171087277+primus-bot[bot]@users.noreply.github.com>
2025-10-22 12:09:31 +05:30
Vikrant Gupta
d539ca9bab feat(sql): swap mattn/sqlite with modernc.org/sqlite (#9343)
* feat(sql): swap mattn/sqlite with modernc.org/sqlite (#9325)

* feat(sql): swap mattn/sqlite with modernc.org/sqlite

* feat(sql): revert the dashboard testing changes

* feat(sql): enable WAL mode for sqlite

* feat(sql): revert enable WAL mode for sqlite

* feat(sql): use sensible defaults for busy_timeout

* feat(sql): add ldflags

* feat(sql): enable WAL mode for sqlite

* feat(sql): some fixes

* feat(sql): some fixes

* feat(sql): fix yarn lock and config defaults

* feat(sql): update the defaults in example.conf

* feat(sql): remove wal mode from integration tests
2025-10-21 18:45:48 +05:30
Vikrant Gupta
c8194e9abb fix(tokenizer): update the authn domains tooltips (#9388) 2025-10-21 11:25:44 +00:00
449 changed files with 54253 additions and 47247 deletions

View File

@@ -42,7 +42,7 @@ services:
timeout: 5s timeout: 5s
retries: 3 retries: 3
schema-migrator-sync: schema-migrator-sync:
image: signoz/signoz-schema-migrator:v0.129.7 image: signoz/signoz-schema-migrator:v0.129.11
container_name: schema-migrator-sync container_name: schema-migrator-sync
command: command:
- sync - sync
@@ -55,7 +55,7 @@ services:
condition: service_healthy condition: service_healthy
restart: on-failure restart: on-failure
schema-migrator-async: schema-migrator-async:
image: signoz/signoz-schema-migrator:v0.129.7 image: signoz/signoz-schema-migrator:v0.129.11
container_name: schema-migrator-async container_name: schema-migrator-async
command: command:
- async - async

2
.github/CODEOWNERS vendored
View File

@@ -2,7 +2,7 @@
# Owners are automatically requested for review for PRs that changes code # Owners are automatically requested for review for PRs that changes code
# that they own. # that they own.
/frontend/ @SigNoz/frontend @YounixM /frontend/ @YounixM @aks07
/frontend/src/container/MetricsApplication @srikanthccv /frontend/src/container/MetricsApplication @srikanthccv
/frontend/src/container/NewWidget/RightContainer/types.ts @srikanthccv /frontend/src/container/NewWidget/RightContainer/types.ts @srikanthccv

View File

@@ -3,8 +3,8 @@ name: build-community
on: on:
push: push:
tags: tags:
- 'v[0-9]+.[0-9]+.[0-9]+' - "v[0-9]+.[0-9]+.[0-9]+"
- 'v[0-9]+.[0-9]+.[0-9]+-rc.[0-9]+' - "v[0-9]+.[0-9]+.[0-9]+-rc.[0-9]+"
defaults: defaults:
run: run:
@@ -69,14 +69,13 @@ jobs:
GO_BUILD_CONTEXT: ./cmd/community GO_BUILD_CONTEXT: ./cmd/community
GO_BUILD_FLAGS: >- GO_BUILD_FLAGS: >-
-tags timetzdata -tags timetzdata
-ldflags='-linkmode external -extldflags \"-static\" -s -w -ldflags='-s -w
-X github.com/SigNoz/signoz/pkg/version.version=${{ needs.prepare.outputs.version }} -X github.com/SigNoz/signoz/pkg/version.version=${{ needs.prepare.outputs.version }}
-X github.com/SigNoz/signoz/pkg/version.variant=community -X github.com/SigNoz/signoz/pkg/version.variant=community
-X github.com/SigNoz/signoz/pkg/version.hash=${{ needs.prepare.outputs.hash }} -X github.com/SigNoz/signoz/pkg/version.hash=${{ needs.prepare.outputs.hash }}
-X github.com/SigNoz/signoz/pkg/version.time=${{ needs.prepare.outputs.time }} -X github.com/SigNoz/signoz/pkg/version.time=${{ needs.prepare.outputs.time }}
-X github.com/SigNoz/signoz/pkg/version.branch=${{ needs.prepare.outputs.branch }} -X github.com/SigNoz/signoz/pkg/version.branch=${{ needs.prepare.outputs.branch }}
-X github.com/SigNoz/signoz/pkg/analytics.key=9kRrJ7oPCGPEJLF6QjMPLt5bljFhRQBr' -X github.com/SigNoz/signoz/pkg/analytics.key=9kRrJ7oPCGPEJLF6QjMPLt5bljFhRQBr'
GO_CGO_ENABLED: 1
DOCKER_BASE_IMAGES: '{"alpine": "alpine:3.20.3"}' DOCKER_BASE_IMAGES: '{"alpine": "alpine:3.20.3"}'
DOCKER_DOCKERFILE_PATH: ./cmd/community/Dockerfile.multi-arch DOCKER_DOCKERFILE_PATH: ./cmd/community/Dockerfile.multi-arch
DOCKER_MANIFEST: true DOCKER_MANIFEST: true

View File

@@ -84,7 +84,7 @@ jobs:
JS_INPUT_ARTIFACT_CACHE_KEY: enterprise-dotenv-${{ github.sha }} JS_INPUT_ARTIFACT_CACHE_KEY: enterprise-dotenv-${{ github.sha }}
JS_INPUT_ARTIFACT_PATH: frontend/.env JS_INPUT_ARTIFACT_PATH: frontend/.env
JS_OUTPUT_ARTIFACT_CACHE_KEY: enterprise-jsbuild-${{ github.sha }} JS_OUTPUT_ARTIFACT_CACHE_KEY: enterprise-jsbuild-${{ github.sha }}
JS_OUTPUT_ARTIFACT_PATH: frontend/build JS_OUTPUT_ARTIFACT_PATH: frontend/build
DOCKER_BUILD: false DOCKER_BUILD: false
DOCKER_MANIFEST: false DOCKER_MANIFEST: false
go-build: go-build:
@@ -99,7 +99,7 @@ jobs:
GO_BUILD_CONTEXT: ./cmd/enterprise GO_BUILD_CONTEXT: ./cmd/enterprise
GO_BUILD_FLAGS: >- GO_BUILD_FLAGS: >-
-tags timetzdata -tags timetzdata
-ldflags='-linkmode external -extldflags \"-static\" -s -w -ldflags='-s -w
-X github.com/SigNoz/signoz/pkg/version.version=${{ needs.prepare.outputs.version }} -X github.com/SigNoz/signoz/pkg/version.version=${{ needs.prepare.outputs.version }}
-X github.com/SigNoz/signoz/pkg/version.variant=enterprise -X github.com/SigNoz/signoz/pkg/version.variant=enterprise
-X github.com/SigNoz/signoz/pkg/version.hash=${{ needs.prepare.outputs.hash }} -X github.com/SigNoz/signoz/pkg/version.hash=${{ needs.prepare.outputs.hash }}
@@ -107,10 +107,8 @@ jobs:
-X github.com/SigNoz/signoz/pkg/version.branch=${{ needs.prepare.outputs.branch }} -X github.com/SigNoz/signoz/pkg/version.branch=${{ needs.prepare.outputs.branch }}
-X github.com/SigNoz/signoz/ee/zeus.url=https://api.signoz.cloud -X github.com/SigNoz/signoz/ee/zeus.url=https://api.signoz.cloud
-X github.com/SigNoz/signoz/ee/zeus.deprecatedURL=https://license.signoz.io -X github.com/SigNoz/signoz/ee/zeus.deprecatedURL=https://license.signoz.io
-X github.com/SigNoz/signoz/ee/query-service/constants.ZeusURL=https://api.signoz.cloud
-X github.com/SigNoz/signoz/ee/query-service/constants.LicenseSignozIo=https://license.signoz.io/api/v1 -X github.com/SigNoz/signoz/ee/query-service/constants.LicenseSignozIo=https://license.signoz.io/api/v1
-X github.com/SigNoz/signoz/pkg/analytics.key=9kRrJ7oPCGPEJLF6QjMPLt5bljFhRQBr' -X github.com/SigNoz/signoz/pkg/analytics.key=9kRrJ7oPCGPEJLF6QjMPLt5bljFhRQBr'
GO_CGO_ENABLED: 1
DOCKER_BASE_IMAGES: '{"alpine": "alpine:3.20.3"}' DOCKER_BASE_IMAGES: '{"alpine": "alpine:3.20.3"}'
DOCKER_DOCKERFILE_PATH: ./cmd/enterprise/Dockerfile.multi-arch DOCKER_DOCKERFILE_PATH: ./cmd/enterprise/Dockerfile.multi-arch
DOCKER_MANIFEST: true DOCKER_MANIFEST: true

View File

@@ -98,7 +98,7 @@ jobs:
GO_BUILD_CONTEXT: ./cmd/enterprise GO_BUILD_CONTEXT: ./cmd/enterprise
GO_BUILD_FLAGS: >- GO_BUILD_FLAGS: >-
-tags timetzdata -tags timetzdata
-ldflags='-linkmode external -extldflags \"-static\" -s -w -ldflags='-s -w
-X github.com/SigNoz/signoz/pkg/version.version=${{ needs.prepare.outputs.version }} -X github.com/SigNoz/signoz/pkg/version.version=${{ needs.prepare.outputs.version }}
-X github.com/SigNoz/signoz/pkg/version.variant=enterprise -X github.com/SigNoz/signoz/pkg/version.variant=enterprise
-X github.com/SigNoz/signoz/pkg/version.hash=${{ needs.prepare.outputs.hash }} -X github.com/SigNoz/signoz/pkg/version.hash=${{ needs.prepare.outputs.hash }}
@@ -106,10 +106,8 @@ jobs:
-X github.com/SigNoz/signoz/pkg/version.branch=${{ needs.prepare.outputs.branch }} -X github.com/SigNoz/signoz/pkg/version.branch=${{ needs.prepare.outputs.branch }}
-X github.com/SigNoz/signoz/ee/zeus.url=https://api.staging.signoz.cloud -X github.com/SigNoz/signoz/ee/zeus.url=https://api.staging.signoz.cloud
-X github.com/SigNoz/signoz/ee/zeus.deprecatedURL=https://license.staging.signoz.cloud -X github.com/SigNoz/signoz/ee/zeus.deprecatedURL=https://license.staging.signoz.cloud
-X github.com/SigNoz/signoz/ee/query-service/constants.ZeusURL=https://api.staging.signoz.cloud
-X github.com/SigNoz/signoz/ee/query-service/constants.LicenseSignozIo=https://license.staging.signoz.cloud/api/v1 -X github.com/SigNoz/signoz/ee/query-service/constants.LicenseSignozIo=https://license.staging.signoz.cloud/api/v1
-X github.com/SigNoz/signoz/pkg/analytics.key=9kRrJ7oPCGPEJLF6QjMPLt5bljFhRQBr' -X github.com/SigNoz/signoz/pkg/analytics.key=9kRrJ7oPCGPEJLF6QjMPLt5bljFhRQBr'
GO_CGO_ENABLED: 1
DOCKER_BASE_IMAGES: '{"alpine": "alpine:3.20.3"}' DOCKER_BASE_IMAGES: '{"alpine": "alpine:3.20.3"}'
DOCKER_DOCKERFILE_PATH: ./cmd/enterprise/Dockerfile.multi-arch DOCKER_DOCKERFILE_PATH: ./cmd/enterprise/Dockerfile.multi-arch
DOCKER_MANIFEST: true DOCKER_MANIFEST: true
@@ -125,4 +123,4 @@ jobs:
GITHUB_SILENT: true GITHUB_SILENT: true
GITHUB_REPOSITORY_NAME: charts-saas-v3-staging GITHUB_REPOSITORY_NAME: charts-saas-v3-staging
GITHUB_EVENT_NAME: releaser GITHUB_EVENT_NAME: releaser
GITHUB_EVENT_PAYLOAD: "{\"deployment\": \"${{ needs.prepare.outputs.deployment }}\", \"signoz_version\": \"${{ needs.prepare.outputs.version }}\"}" GITHUB_EVENT_PAYLOAD: '{"deployment": "${{ needs.prepare.outputs.deployment }}", "signoz_version": "${{ needs.prepare.outputs.version }}"}'

View File

@@ -17,6 +17,7 @@ jobs:
- bootstrap - bootstrap
- passwordauthn - passwordauthn
- callbackauthn - callbackauthn
- cloudintegrations
- querier - querier
- ttl - ttl
sqlstore-provider: sqlstore-provider:

View File

@@ -1,39 +1,63 @@
version: "2"
linters: linters:
default: standard default: none
enable: enable:
- bodyclose - bodyclose
- depguard
- errcheck
- forbidigo
- govet
- iface
- ineffassign
- misspell - misspell
- nilnil - nilnil
- sloglint - sloglint
- depguard
- iface
- unparam - unparam
- forbidigo - unused
settings:
linters-settings: depguard:
sloglint: rules:
no-mixed-args: true noerrors:
kv-only: true deny:
no-global: all - pkg: errors
context: all desc: Do not use errors package. Use github.com/SigNoz/signoz/pkg/errors instead.
static-msg: true nozap:
msg-style: lowercased deny:
key-naming-case: snake - pkg: go.uber.org/zap
depguard: desc: Do not use zap logger. Use slog instead.
rules: forbidigo:
nozap: forbid:
deny: - pattern: fmt.Errorf
- pkg: "go.uber.org/zap" - pattern: ^(fmt\.Print.*|print|println)$
desc: "Do not use zap logger. Use slog instead." iface:
noerrors: enable:
deny: - identical
- pkg: "errors" sloglint:
desc: "Do not use errors package. Use github.com/SigNoz/signoz/pkg/errors instead." no-mixed-args: true
iface: kv-only: true
enable: no-global: all
- identical context: all
issues: static-msg: true
exclude-dirs: key-naming-case: snake
- "pkg/query-service" exclusions:
- "ee/query-service" generated: lax
- "scripts/" presets:
- comments
- common-false-positives
- legacy
- std-error-handling
paths:
- pkg/query-service
- ee/query-service
- scripts/
- tmp/
- third_party$
- builtin$
- examples$
formatters:
exclusions:
generated: lax
paths:
- third_party$
- builtin$
- examples$

View File

@@ -84,10 +84,9 @@ go-run-enterprise: ## Runs the enterprise go backend server
SIGNOZ_ALERTMANAGER_PROVIDER=signoz \ SIGNOZ_ALERTMANAGER_PROVIDER=signoz \
SIGNOZ_TELEMETRYSTORE_PROVIDER=clickhouse \ SIGNOZ_TELEMETRYSTORE_PROVIDER=clickhouse \
SIGNOZ_TELEMETRYSTORE_CLICKHOUSE_DSN=tcp://127.0.0.1:9000 \ SIGNOZ_TELEMETRYSTORE_CLICKHOUSE_DSN=tcp://127.0.0.1:9000 \
SIGNOZ_TELEMETRYSTORE_CLICKHOUSE_CLUSTER=cluster \
go run -race \ go run -race \
$(GO_BUILD_CONTEXT_ENTERPRISE)/*.go \ $(GO_BUILD_CONTEXT_ENTERPRISE)/*.go
--config ./conf/prometheus.yml \
--cluster cluster
.PHONY: go-test .PHONY: go-test
go-test: ## Runs go unit tests go-test: ## Runs go unit tests
@@ -102,10 +101,9 @@ go-run-community: ## Runs the community go backend server
SIGNOZ_ALERTMANAGER_PROVIDER=signoz \ SIGNOZ_ALERTMANAGER_PROVIDER=signoz \
SIGNOZ_TELEMETRYSTORE_PROVIDER=clickhouse \ SIGNOZ_TELEMETRYSTORE_PROVIDER=clickhouse \
SIGNOZ_TELEMETRYSTORE_CLICKHOUSE_DSN=tcp://127.0.0.1:9000 \ SIGNOZ_TELEMETRYSTORE_CLICKHOUSE_DSN=tcp://127.0.0.1:9000 \
SIGNOZ_TELEMETRYSTORE_CLICKHOUSE_CLUSTER=cluster \
go run -race \ go run -race \
$(GO_BUILD_CONTEXT_COMMUNITY)/*.go server \ $(GO_BUILD_CONTEXT_COMMUNITY)/*.go server
--config ./conf/prometheus.yml \
--cluster cluster
.PHONY: go-build-community $(GO_BUILD_ARCHS_COMMUNITY) .PHONY: go-build-community $(GO_BUILD_ARCHS_COMMUNITY)
go-build-community: ## Builds the go backend server for community go-build-community: ## Builds the go backend server for community
@@ -114,9 +112,9 @@ $(GO_BUILD_ARCHS_COMMUNITY): go-build-community-%: $(TARGET_DIR)
@mkdir -p $(TARGET_DIR)/$(OS)-$* @mkdir -p $(TARGET_DIR)/$(OS)-$*
@echo ">> building binary $(TARGET_DIR)/$(OS)-$*/$(NAME)-community" @echo ">> building binary $(TARGET_DIR)/$(OS)-$*/$(NAME)-community"
@if [ $* = "arm64" ]; then \ @if [ $* = "arm64" ]; then \
CC=aarch64-linux-gnu-gcc CGO_ENABLED=1 GOARCH=$* GOOS=$(OS) go build -C $(GO_BUILD_CONTEXT_COMMUNITY) -tags timetzdata -o $(TARGET_DIR)/$(OS)-$*/$(NAME)-community -ldflags "-linkmode external -extldflags '-static' -s -w $(GO_BUILD_LDFLAGS_COMMUNITY)"; \ GOARCH=$* GOOS=$(OS) go build -C $(GO_BUILD_CONTEXT_COMMUNITY) -tags timetzdata -o $(TARGET_DIR)/$(OS)-$*/$(NAME)-community -ldflags "-s -w $(GO_BUILD_LDFLAGS_COMMUNITY)"; \
else \ else \
CGO_ENABLED=1 GOARCH=$* GOOS=$(OS) go build -C $(GO_BUILD_CONTEXT_COMMUNITY) -tags timetzdata -o $(TARGET_DIR)/$(OS)-$*/$(NAME)-community -ldflags "-linkmode external -extldflags '-static' -s -w $(GO_BUILD_LDFLAGS_COMMUNITY)"; \ GOARCH=$* GOOS=$(OS) go build -C $(GO_BUILD_CONTEXT_COMMUNITY) -tags timetzdata -o $(TARGET_DIR)/$(OS)-$*/$(NAME)-community -ldflags "-s -w $(GO_BUILD_LDFLAGS_COMMUNITY)"; \
fi fi
@@ -127,9 +125,9 @@ $(GO_BUILD_ARCHS_ENTERPRISE): go-build-enterprise-%: $(TARGET_DIR)
@mkdir -p $(TARGET_DIR)/$(OS)-$* @mkdir -p $(TARGET_DIR)/$(OS)-$*
@echo ">> building binary $(TARGET_DIR)/$(OS)-$*/$(NAME)" @echo ">> building binary $(TARGET_DIR)/$(OS)-$*/$(NAME)"
@if [ $* = "arm64" ]; then \ @if [ $* = "arm64" ]; then \
CC=aarch64-linux-gnu-gcc CGO_ENABLED=1 GOARCH=$* GOOS=$(OS) go build -C $(GO_BUILD_CONTEXT_ENTERPRISE) -tags timetzdata -o $(TARGET_DIR)/$(OS)-$*/$(NAME) -ldflags "-linkmode external -extldflags '-static' -s -w $(GO_BUILD_LDFLAGS_ENTERPRISE)"; \ GOARCH=$* GOOS=$(OS) go build -C $(GO_BUILD_CONTEXT_ENTERPRISE) -tags timetzdata -o $(TARGET_DIR)/$(OS)-$*/$(NAME) -ldflags "-s -w $(GO_BUILD_LDFLAGS_ENTERPRISE)"; \
else \ else \
CGO_ENABLED=1 GOARCH=$* GOOS=$(OS) go build -C $(GO_BUILD_CONTEXT_ENTERPRISE) -tags timetzdata -o $(TARGET_DIR)/$(OS)-$*/$(NAME) -ldflags "-linkmode external -extldflags '-static' -s -w $(GO_BUILD_LDFLAGS_ENTERPRISE)"; \ GOARCH=$* GOOS=$(OS) go build -C $(GO_BUILD_CONTEXT_ENTERPRISE) -tags timetzdata -o $(TARGET_DIR)/$(OS)-$*/$(NAME) -ldflags "-s -w $(GO_BUILD_LDFLAGS_ENTERPRISE)"; \
fi fi
.PHONY: go-build-enterprise-race $(GO_BUILD_ARCHS_ENTERPRISE_RACE) .PHONY: go-build-enterprise-race $(GO_BUILD_ARCHS_ENTERPRISE_RACE)
@@ -139,9 +137,9 @@ $(GO_BUILD_ARCHS_ENTERPRISE_RACE): go-build-enterprise-race-%: $(TARGET_DIR)
@mkdir -p $(TARGET_DIR)/$(OS)-$* @mkdir -p $(TARGET_DIR)/$(OS)-$*
@echo ">> building binary $(TARGET_DIR)/$(OS)-$*/$(NAME)" @echo ">> building binary $(TARGET_DIR)/$(OS)-$*/$(NAME)"
@if [ $* = "arm64" ]; then \ @if [ $* = "arm64" ]; then \
CC=aarch64-linux-gnu-gcc CGO_ENABLED=1 GOARCH=$* GOOS=$(OS) go build -C $(GO_BUILD_CONTEXT_ENTERPRISE) -race -tags timetzdata -o $(TARGET_DIR)/$(OS)-$*/$(NAME) -ldflags "-linkmode external -extldflags '-static' -s -w $(GO_BUILD_LDFLAGS_ENTERPRISE)"; \ GOARCH=$* GOOS=$(OS) go build -C $(GO_BUILD_CONTEXT_ENTERPRISE) -race -tags timetzdata -o $(TARGET_DIR)/$(OS)-$*/$(NAME) -ldflags "-s -w $(GO_BUILD_LDFLAGS_ENTERPRISE)"; \
else \ else \
CGO_ENABLED=1 GOARCH=$* GOOS=$(OS) go build -C $(GO_BUILD_CONTEXT_ENTERPRISE) -race -tags timetzdata -o $(TARGET_DIR)/$(OS)-$*/$(NAME) -ldflags "-linkmode external -extldflags '-static' -s -w $(GO_BUILD_LDFLAGS_ENTERPRISE)"; \ GOARCH=$* GOOS=$(OS) go build -C $(GO_BUILD_CONTEXT_ENTERPRISE) -race -tags timetzdata -o $(TARGET_DIR)/$(OS)-$*/$(NAME) -ldflags "-s -w $(GO_BUILD_LDFLAGS_ENTERPRISE)"; \
fi fi
############################################################## ##############################################################
@@ -208,4 +206,4 @@ py-lint: ## Run lint for integration tests
.PHONY: py-test .PHONY: py-test
py-test: ## Runs integration tests py-test: ## Runs integration tests
@cd tests/integration && poetry run pytest --basetemp=./tmp/ -vv --capture=no src/ @cd tests/integration && poetry run pytest --basetemp=./tmp/ -vv --capture=no src/

View File

@@ -12,12 +12,6 @@ builds:
- id: signoz - id: signoz
binary: bin/signoz binary: bin/signoz
main: ./cmd/community main: ./cmd/community
env:
- CGO_ENABLED=1
- >-
{{- if eq .Os "linux" }}
{{- if eq .Arch "arm64" }}CC=aarch64-linux-gnu-gcc{{- end }}
{{- end }}
goos: goos:
- linux - linux
- darwin - darwin
@@ -36,8 +30,6 @@ builds:
- -X github.com/SigNoz/signoz/pkg/version.time={{ .CommitTimestamp }} - -X github.com/SigNoz/signoz/pkg/version.time={{ .CommitTimestamp }}
- -X github.com/SigNoz/signoz/pkg/version.branch={{ .Branch }} - -X github.com/SigNoz/signoz/pkg/version.branch={{ .Branch }}
- -X github.com/SigNoz/signoz/pkg/analytics.key=9kRrJ7oPCGPEJLF6QjMPLt5bljFhRQBr - -X github.com/SigNoz/signoz/pkg/analytics.key=9kRrJ7oPCGPEJLF6QjMPLt5bljFhRQBr
- >-
{{- if eq .Os "linux" }}-linkmode external -extldflags '-static'{{- end }}
mod_timestamp: "{{ .CommitTimestamp }}" mod_timestamp: "{{ .CommitTimestamp }}"
tags: tags:
- timetzdata - timetzdata

View File

@@ -5,9 +5,12 @@ import (
"log/slog" "log/slog"
"github.com/SigNoz/signoz/cmd" "github.com/SigNoz/signoz/cmd"
"github.com/SigNoz/signoz/ee/authz/openfgaauthz"
"github.com/SigNoz/signoz/ee/authz/openfgaschema"
"github.com/SigNoz/signoz/ee/sqlstore/postgressqlstore" "github.com/SigNoz/signoz/ee/sqlstore/postgressqlstore"
"github.com/SigNoz/signoz/pkg/analytics" "github.com/SigNoz/signoz/pkg/analytics"
"github.com/SigNoz/signoz/pkg/authn" "github.com/SigNoz/signoz/pkg/authn"
"github.com/SigNoz/signoz/pkg/authz"
"github.com/SigNoz/signoz/pkg/factory" "github.com/SigNoz/signoz/pkg/factory"
"github.com/SigNoz/signoz/pkg/licensing" "github.com/SigNoz/signoz/pkg/licensing"
"github.com/SigNoz/signoz/pkg/licensing/nooplicensing" "github.com/SigNoz/signoz/pkg/licensing/nooplicensing"
@@ -76,6 +79,9 @@ func runServer(ctx context.Context, config signoz.Config, logger *slog.Logger) e
func(ctx context.Context, providerSettings factory.ProviderSettings, store authtypes.AuthNStore, licensing licensing.Licensing) (map[authtypes.AuthNProvider]authn.AuthN, error) { func(ctx context.Context, providerSettings factory.ProviderSettings, store authtypes.AuthNStore, licensing licensing.Licensing) (map[authtypes.AuthNProvider]authn.AuthN, error) {
return signoz.NewAuthNs(ctx, providerSettings, store, licensing) return signoz.NewAuthNs(ctx, providerSettings, store, licensing)
}, },
func(ctx context.Context, sqlstore sqlstore.SQLStore) factory.ProviderFactory[authz.AuthZ, authz.Config] {
return openfgaauthz.NewProviderFactory(sqlstore, openfgaschema.NewSchema().Get(ctx))
},
) )
if err != nil { if err != nil {
logger.ErrorContext(ctx, "failed to create signoz", "error", err) logger.ErrorContext(ctx, "failed to create signoz", "error", err)

View File

@@ -12,12 +12,6 @@ builds:
- id: signoz - id: signoz
binary: bin/signoz binary: bin/signoz
main: ./cmd/enterprise main: ./cmd/enterprise
env:
- CGO_ENABLED=1
- >-
{{- if eq .Os "linux" }}
{{- if eq .Arch "arm64" }}CC=aarch64-linux-gnu-gcc{{- end }}
{{- end }}
goos: goos:
- linux - linux
- darwin - darwin
@@ -37,11 +31,8 @@ builds:
- -X github.com/SigNoz/signoz/pkg/version.branch={{ .Branch }} - -X github.com/SigNoz/signoz/pkg/version.branch={{ .Branch }}
- -X github.com/SigNoz/signoz/ee/zeus.url=https://api.signoz.cloud - -X github.com/SigNoz/signoz/ee/zeus.url=https://api.signoz.cloud
- -X github.com/SigNoz/signoz/ee/zeus.deprecatedURL=https://license.signoz.io - -X github.com/SigNoz/signoz/ee/zeus.deprecatedURL=https://license.signoz.io
- -X github.com/SigNoz/signoz/ee/query-service/constants.ZeusURL=https://api.signoz.cloud
- -X github.com/SigNoz/signoz/ee/query-service/constants.LicenseSignozIo=https://license.signoz.io/api/v1 - -X github.com/SigNoz/signoz/ee/query-service/constants.LicenseSignozIo=https://license.signoz.io/api/v1
- -X github.com/SigNoz/signoz/pkg/analytics.key=9kRrJ7oPCGPEJLF6QjMPLt5bljFhRQBr - -X github.com/SigNoz/signoz/pkg/analytics.key=9kRrJ7oPCGPEJLF6QjMPLt5bljFhRQBr
- >-
{{- if eq .Os "linux" }}-linkmode external -extldflags '-static'{{- end }}
mod_timestamp: "{{ .CommitTimestamp }}" mod_timestamp: "{{ .CommitTimestamp }}"
tags: tags:
- timetzdata - timetzdata

View File

@@ -8,6 +8,8 @@ import (
"github.com/SigNoz/signoz/cmd" "github.com/SigNoz/signoz/cmd"
"github.com/SigNoz/signoz/ee/authn/callbackauthn/oidccallbackauthn" "github.com/SigNoz/signoz/ee/authn/callbackauthn/oidccallbackauthn"
"github.com/SigNoz/signoz/ee/authn/callbackauthn/samlcallbackauthn" "github.com/SigNoz/signoz/ee/authn/callbackauthn/samlcallbackauthn"
"github.com/SigNoz/signoz/ee/authz/openfgaauthz"
"github.com/SigNoz/signoz/ee/authz/openfgaschema"
enterpriselicensing "github.com/SigNoz/signoz/ee/licensing" enterpriselicensing "github.com/SigNoz/signoz/ee/licensing"
"github.com/SigNoz/signoz/ee/licensing/httplicensing" "github.com/SigNoz/signoz/ee/licensing/httplicensing"
enterpriseapp "github.com/SigNoz/signoz/ee/query-service/app" enterpriseapp "github.com/SigNoz/signoz/ee/query-service/app"
@@ -17,6 +19,7 @@ import (
"github.com/SigNoz/signoz/ee/zeus/httpzeus" "github.com/SigNoz/signoz/ee/zeus/httpzeus"
"github.com/SigNoz/signoz/pkg/analytics" "github.com/SigNoz/signoz/pkg/analytics"
"github.com/SigNoz/signoz/pkg/authn" "github.com/SigNoz/signoz/pkg/authn"
"github.com/SigNoz/signoz/pkg/authz"
"github.com/SigNoz/signoz/pkg/factory" "github.com/SigNoz/signoz/pkg/factory"
"github.com/SigNoz/signoz/pkg/licensing" "github.com/SigNoz/signoz/pkg/licensing"
"github.com/SigNoz/signoz/pkg/modules/organization" "github.com/SigNoz/signoz/pkg/modules/organization"
@@ -105,6 +108,9 @@ func runServer(ctx context.Context, config signoz.Config, logger *slog.Logger) e
return authNs, nil return authNs, nil
}, },
func(ctx context.Context, sqlstore sqlstore.SQLStore) factory.ProviderFactory[authz.AuthZ, authz.Config] {
return openfgaauthz.NewProviderFactory(sqlstore, openfgaschema.NewSchema().Get(ctx))
},
) )
if err != nil { if err != nil {
logger.ErrorContext(ctx, "failed to create signoz", "error", err) logger.ErrorContext(ctx, "failed to create signoz", "error", err)

View File

@@ -1,5 +1,5 @@
##################### SigNoz Configuration Example ##################### ##################### SigNoz Configuration Example #####################
# #
# Do not modify this file # Do not modify this file
# #
@@ -58,7 +58,7 @@ cache:
# The port on which the Redis server is running. Default is usually 6379. # The port on which the Redis server is running. Default is usually 6379.
port: 6379 port: 6379
# The password for authenticating with the Redis server, if required. # The password for authenticating with the Redis server, if required.
password: password:
# The Redis database number to use # The Redis database number to use
db: 0 db: 0
@@ -71,6 +71,10 @@ sqlstore:
sqlite: sqlite:
# The path to the SQLite database file. # The path to the SQLite database file.
path: /var/lib/signoz/signoz.db path: /var/lib/signoz/signoz.db
# Mode is the mode to use for the sqlite database.
mode: delete
# BusyTimeout is the timeout for the sqlite database to wait for a lock.
busy_timeout: 10s
##################### APIServer ##################### ##################### APIServer #####################
apiserver: apiserver:
@@ -238,7 +242,6 @@ statsreporter:
# Whether to collect identities and traits (emails). # Whether to collect identities and traits (emails).
identities: true identities: true
##################### Gateway (License only) ##################### ##################### Gateway (License only) #####################
gateway: gateway:
# The URL of the gateway's api. # The URL of the gateway's api.

View File

@@ -176,7 +176,7 @@ services:
# - ../common/clickhouse/storage.xml:/etc/clickhouse-server/config.d/storage.xml # - ../common/clickhouse/storage.xml:/etc/clickhouse-server/config.d/storage.xml
signoz: signoz:
!!merge <<: *db-depend !!merge <<: *db-depend
image: signoz/signoz:v0.97.0 image: signoz/signoz:v0.102.0
command: command:
- --config=/root/config/prometheus.yml - --config=/root/config/prometheus.yml
ports: ports:
@@ -209,7 +209,7 @@ services:
retries: 3 retries: 3
otel-collector: otel-collector:
!!merge <<: *db-depend !!merge <<: *db-depend
image: signoz/signoz-otel-collector:v0.129.7 image: signoz/signoz-otel-collector:v0.129.11
command: command:
- --config=/etc/otel-collector-config.yaml - --config=/etc/otel-collector-config.yaml
- --manager-config=/etc/manager-config.yaml - --manager-config=/etc/manager-config.yaml
@@ -233,7 +233,7 @@ services:
- signoz - signoz
schema-migrator: schema-migrator:
!!merge <<: *common !!merge <<: *common
image: signoz/signoz-schema-migrator:v0.129.7 image: signoz/signoz-schema-migrator:v0.129.11
deploy: deploy:
restart_policy: restart_policy:
condition: on-failure condition: on-failure

View File

@@ -117,7 +117,7 @@ services:
# - ../common/clickhouse/storage.xml:/etc/clickhouse-server/config.d/storage.xml # - ../common/clickhouse/storage.xml:/etc/clickhouse-server/config.d/storage.xml
signoz: signoz:
!!merge <<: *db-depend !!merge <<: *db-depend
image: signoz/signoz:v0.97.0 image: signoz/signoz:v0.102.0
command: command:
- --config=/root/config/prometheus.yml - --config=/root/config/prometheus.yml
ports: ports:
@@ -150,7 +150,7 @@ services:
retries: 3 retries: 3
otel-collector: otel-collector:
!!merge <<: *db-depend !!merge <<: *db-depend
image: signoz/signoz-otel-collector:v0.129.7 image: signoz/signoz-otel-collector:v0.129.11
command: command:
- --config=/etc/otel-collector-config.yaml - --config=/etc/otel-collector-config.yaml
- --manager-config=/etc/manager-config.yaml - --manager-config=/etc/manager-config.yaml
@@ -176,7 +176,7 @@ services:
- signoz - signoz
schema-migrator: schema-migrator:
!!merge <<: *common !!merge <<: *common
image: signoz/signoz-schema-migrator:v0.129.7 image: signoz/signoz-schema-migrator:v0.129.11
deploy: deploy:
restart_policy: restart_policy:
condition: on-failure condition: on-failure

View File

@@ -1,3 +1,10 @@
connectors:
signozmeter:
metrics_flush_interval: 1h
dimensions:
- name: service.name
- name: deployment.environment
- name: host.name
receivers: receivers:
otlp: otlp:
protocols: protocols:
@@ -21,6 +28,10 @@ processors:
send_batch_size: 10000 send_batch_size: 10000
send_batch_max_size: 11000 send_batch_max_size: 11000
timeout: 10s timeout: 10s
batch/meter:
send_batch_max_size: 25000
send_batch_size: 20000
timeout: 1s
resourcedetection: resourcedetection:
# Using OTEL_RESOURCE_ATTRIBUTES envvar, env detector adds custom labels. # Using OTEL_RESOURCE_ATTRIBUTES envvar, env detector adds custom labels.
detectors: [env, system] detectors: [env, system]
@@ -66,6 +77,11 @@ exporters:
dsn: tcp://clickhouse:9000/signoz_logs dsn: tcp://clickhouse:9000/signoz_logs
timeout: 10s timeout: 10s
use_new_schema: true use_new_schema: true
signozclickhousemeter:
dsn: tcp://clickhouse:9000/signoz_meter
timeout: 45s
sending_queue:
enabled: false
service: service:
telemetry: telemetry:
logs: logs:
@@ -77,16 +93,20 @@ service:
traces: traces:
receivers: [otlp] receivers: [otlp]
processors: [signozspanmetrics/delta, batch] processors: [signozspanmetrics/delta, batch]
exporters: [clickhousetraces] exporters: [clickhousetraces, signozmeter]
metrics: metrics:
receivers: [otlp] receivers: [otlp]
processors: [batch] processors: [batch]
exporters: [signozclickhousemetrics] exporters: [signozclickhousemetrics, signozmeter]
metrics/prometheus: metrics/prometheus:
receivers: [prometheus] receivers: [prometheus]
processors: [batch] processors: [batch]
exporters: [signozclickhousemetrics] exporters: [signozclickhousemetrics, signozmeter]
logs: logs:
receivers: [otlp] receivers: [otlp]
processors: [batch] processors: [batch]
exporters: [clickhouselogsexporter] exporters: [clickhouselogsexporter, signozmeter]
metrics/meter:
receivers: [signozmeter]
processors: [batch/meter]
exporters: [signozclickhousemeter]

View File

@@ -179,7 +179,7 @@ services:
# - ../common/clickhouse/storage.xml:/etc/clickhouse-server/config.d/storage.xml # - ../common/clickhouse/storage.xml:/etc/clickhouse-server/config.d/storage.xml
signoz: signoz:
!!merge <<: *db-depend !!merge <<: *db-depend
image: signoz/signoz:${VERSION:-v0.97.0} image: signoz/signoz:${VERSION:-v0.102.0}
container_name: signoz container_name: signoz
command: command:
- --config=/root/config/prometheus.yml - --config=/root/config/prometheus.yml
@@ -213,7 +213,7 @@ services:
# TODO: support otel-collector multiple replicas. Nginx/Traefik for loadbalancing? # TODO: support otel-collector multiple replicas. Nginx/Traefik for loadbalancing?
otel-collector: otel-collector:
!!merge <<: *db-depend !!merge <<: *db-depend
image: signoz/signoz-otel-collector:${OTELCOL_TAG:-v0.129.7} image: signoz/signoz-otel-collector:${OTELCOL_TAG:-v0.129.11}
container_name: signoz-otel-collector container_name: signoz-otel-collector
command: command:
- --config=/etc/otel-collector-config.yaml - --config=/etc/otel-collector-config.yaml
@@ -239,7 +239,7 @@ services:
condition: service_healthy condition: service_healthy
schema-migrator-sync: schema-migrator-sync:
!!merge <<: *common !!merge <<: *common
image: signoz/signoz-schema-migrator:${OTELCOL_TAG:-v0.129.7} image: signoz/signoz-schema-migrator:${OTELCOL_TAG:-v0.129.11}
container_name: schema-migrator-sync container_name: schema-migrator-sync
command: command:
- sync - sync
@@ -250,7 +250,7 @@ services:
condition: service_healthy condition: service_healthy
schema-migrator-async: schema-migrator-async:
!!merge <<: *db-depend !!merge <<: *db-depend
image: signoz/signoz-schema-migrator:${OTELCOL_TAG:-v0.129.7} image: signoz/signoz-schema-migrator:${OTELCOL_TAG:-v0.129.11}
container_name: schema-migrator-async container_name: schema-migrator-async
command: command:
- async - async

View File

@@ -111,7 +111,7 @@ services:
# - ../common/clickhouse/storage.xml:/etc/clickhouse-server/config.d/storage.xml # - ../common/clickhouse/storage.xml:/etc/clickhouse-server/config.d/storage.xml
signoz: signoz:
!!merge <<: *db-depend !!merge <<: *db-depend
image: signoz/signoz:${VERSION:-v0.97.0} image: signoz/signoz:${VERSION:-v0.102.0}
container_name: signoz container_name: signoz
command: command:
- --config=/root/config/prometheus.yml - --config=/root/config/prometheus.yml
@@ -144,7 +144,7 @@ services:
retries: 3 retries: 3
otel-collector: otel-collector:
!!merge <<: *db-depend !!merge <<: *db-depend
image: signoz/signoz-otel-collector:${OTELCOL_TAG:-v0.129.7} image: signoz/signoz-otel-collector:${OTELCOL_TAG:-v0.129.11}
container_name: signoz-otel-collector container_name: signoz-otel-collector
command: command:
- --config=/etc/otel-collector-config.yaml - --config=/etc/otel-collector-config.yaml
@@ -166,7 +166,7 @@ services:
condition: service_healthy condition: service_healthy
schema-migrator-sync: schema-migrator-sync:
!!merge <<: *common !!merge <<: *common
image: signoz/signoz-schema-migrator:${OTELCOL_TAG:-v0.129.7} image: signoz/signoz-schema-migrator:${OTELCOL_TAG:-v0.129.11}
container_name: schema-migrator-sync container_name: schema-migrator-sync
command: command:
- sync - sync
@@ -178,7 +178,7 @@ services:
restart: on-failure restart: on-failure
schema-migrator-async: schema-migrator-async:
!!merge <<: *db-depend !!merge <<: *db-depend
image: signoz/signoz-schema-migrator:${OTELCOL_TAG:-v0.129.7} image: signoz/signoz-schema-migrator:${OTELCOL_TAG:-v0.129.11}
container_name: schema-migrator-async container_name: schema-migrator-async
command: command:
- async - async

View File

@@ -1,3 +1,10 @@
connectors:
signozmeter:
metrics_flush_interval: 1h
dimensions:
- name: service.name
- name: deployment.environment
- name: host.name
receivers: receivers:
otlp: otlp:
protocols: protocols:
@@ -21,6 +28,10 @@ processors:
send_batch_size: 10000 send_batch_size: 10000
send_batch_max_size: 11000 send_batch_max_size: 11000
timeout: 10s timeout: 10s
batch/meter:
send_batch_max_size: 25000
send_batch_size: 20000
timeout: 1s
resourcedetection: resourcedetection:
# Using OTEL_RESOURCE_ATTRIBUTES envvar, env detector adds custom labels. # Using OTEL_RESOURCE_ATTRIBUTES envvar, env detector adds custom labels.
detectors: [env, system] detectors: [env, system]
@@ -66,6 +77,11 @@ exporters:
dsn: tcp://clickhouse:9000/signoz_logs dsn: tcp://clickhouse:9000/signoz_logs
timeout: 10s timeout: 10s
use_new_schema: true use_new_schema: true
signozclickhousemeter:
dsn: tcp://clickhouse:9000/signoz_meter
timeout: 45s
sending_queue:
enabled: false
service: service:
telemetry: telemetry:
logs: logs:
@@ -77,16 +93,20 @@ service:
traces: traces:
receivers: [otlp] receivers: [otlp]
processors: [signozspanmetrics/delta, batch] processors: [signozspanmetrics/delta, batch]
exporters: [clickhousetraces] exporters: [clickhousetraces, signozmeter]
metrics: metrics:
receivers: [otlp] receivers: [otlp]
processors: [batch] processors: [batch]
exporters: [signozclickhousemetrics] exporters: [signozclickhousemetrics, signozmeter]
metrics/prometheus: metrics/prometheus:
receivers: [prometheus] receivers: [prometheus]
processors: [batch] processors: [batch]
exporters: [signozclickhousemetrics] exporters: [signozclickhousemetrics, signozmeter]
logs: logs:
receivers: [otlp] receivers: [otlp]
processors: [batch] processors: [batch]
exporters: [clickhouselogsexporter] exporters: [clickhouselogsexporter, signozmeter]
metrics/meter:
receivers: [signozmeter]
processors: [batch/meter]
exporters: [signozclickhousemeter]

View File

@@ -13,8 +13,6 @@ Before diving in, make sure you have these tools installed:
- Download from [go.dev/dl](https://go.dev/dl/) - Download from [go.dev/dl](https://go.dev/dl/)
- Check [go.mod](../../go.mod#L3) for the minimum version - Check [go.mod](../../go.mod#L3) for the minimum version
- **GCC** - Required for CGO dependencies
- Download from [gcc.gnu.org](https://gcc.gnu.org/)
- **Node** - Powers our frontend - **Node** - Powers our frontend
- Download from [nodejs.org](https://nodejs.org) - Download from [nodejs.org](https://nodejs.org)

View File

@@ -103,9 +103,19 @@ Remember to replace the region and ingestion key with proper values as obtained
Both SigNoz and OTel demo app [frontend-proxy service, to be accurate] share common port allocation at 8080. To prevent port allocation conflicts, modify the OTel demo application config to use port 8081 as the `ENVOY_PORT` value as shown below, and run docker compose command. Both SigNoz and OTel demo app [frontend-proxy service, to be accurate] share common port allocation at 8080. To prevent port allocation conflicts, modify the OTel demo application config to use port 8081 as the `ENVOY_PORT` value as shown below, and run docker compose command.
Also, both SigNoz and OTel Demo App have the same `PROMETHEUS_PORT` configured, by default both of them try to start at `9090`, which may cause either of them to fail depending upon which one acquires it first. To prevent this, we need to mofify the value of `PROMETHEUS_PORT` too.
```sh ```sh
ENVOY_PORT=8081 docker compose up -d ENVOY_PORT=8081 PROMETHEUS_PORT=9091 docker compose up -d
``` ```
Alternatively, we can modify these values using the `.env` file too, which reduces the command as just:
```sh
docker compose up -d
```
This spins up multiple microservices, with OpenTelemetry instrumentation enabled. you can verify this by, This spins up multiple microservices, with OpenTelemetry instrumentation enabled. you can verify this by,
```sh ```sh
docker compose ps -a docker compose ps -a

View File

@@ -48,7 +48,26 @@ func (provider *provider) Check(ctx context.Context, tuple *openfgav1.TupleKey)
} }
func (provider *provider) CheckWithTupleCreation(ctx context.Context, claims authtypes.Claims, orgID valuer.UUID, relation authtypes.Relation, _ authtypes.Relation, typeable authtypes.Typeable, selectors []authtypes.Selector) error { func (provider *provider) CheckWithTupleCreation(ctx context.Context, claims authtypes.Claims, orgID valuer.UUID, relation authtypes.Relation, _ authtypes.Relation, typeable authtypes.Typeable, selectors []authtypes.Selector) error {
subject, err := authtypes.NewSubject(authtypes.TypeUser, claims.UserID, authtypes.Relation{}) subject, err := authtypes.NewSubject(authtypes.TypeableUser, claims.UserID, orgID, nil)
if err != nil {
return err
}
tuples, err := typeable.Tuples(subject, relation, selectors, orgID)
if err != nil {
return err
}
err = provider.BatchCheck(ctx, tuples)
if err != nil {
return err
}
return nil
}
func (provider *provider) CheckWithTupleCreationWithoutClaims(ctx context.Context, orgID valuer.UUID, relation authtypes.Relation, _ authtypes.Relation, typeable authtypes.Typeable, selectors []authtypes.Selector) error {
subject, err := authtypes.NewSubject(authtypes.TypeableAnonymous, authtypes.AnonymousUser.String(), orgID, nil)
if err != nil { if err != nil {
return err return err
} }

View File

@@ -15,18 +15,18 @@ type anonymous
type role type role
relations relations
define assignee: [user] define assignee: [user, anonymous]
define read: [user, role#assignee] define read: [user, role#assignee]
define update: [user, role#assignee] define update: [user, role#assignee]
define delete: [user, role#assignee] define delete: [user, role#assignee]
type resources type metaresources
relations relations
define create: [user, role#assignee] define create: [user, role#assignee]
define list: [user, role#assignee] define list: [user, role#assignee]
type resource type metaresource
relations relations
define read: [user, anonymous, role#assignee] define read: [user, anonymous, role#assignee]
define update: [user, role#assignee] define update: [user, role#assignee]
@@ -35,6 +35,6 @@ type resource
define block: [user, role#assignee] define block: [user, role#assignee]
type telemetry type telemetryresource
relations relations
define read: [user, anonymous, role#assignee] define read: [user, role#assignee]

View File

@@ -1,10 +1,10 @@
package licensing package licensing
import ( import (
"fmt"
"sync" "sync"
"time" "time"
"github.com/SigNoz/signoz/pkg/errors"
"github.com/SigNoz/signoz/pkg/licensing" "github.com/SigNoz/signoz/pkg/licensing"
) )
@@ -18,7 +18,7 @@ func Config(pollInterval time.Duration, failureThreshold int) licensing.Config {
once.Do(func() { once.Do(func() {
config = licensing.Config{PollInterval: pollInterval, FailureThreshold: failureThreshold} config = licensing.Config{PollInterval: pollInterval, FailureThreshold: failureThreshold}
if err := config.Validate(); err != nil { if err := config.Validate(); err != nil {
panic(fmt.Errorf("invalid licensing config: %w", err)) panic(errors.WrapInternalf(err, errors.CodeInternal, "invalid licensing config"))
} }
}) })

View File

@@ -20,6 +20,10 @@ import (
basemodel "github.com/SigNoz/signoz/pkg/query-service/model" basemodel "github.com/SigNoz/signoz/pkg/query-service/model"
rules "github.com/SigNoz/signoz/pkg/query-service/rules" rules "github.com/SigNoz/signoz/pkg/query-service/rules"
"github.com/SigNoz/signoz/pkg/signoz" "github.com/SigNoz/signoz/pkg/signoz"
"github.com/SigNoz/signoz/pkg/types"
"github.com/SigNoz/signoz/pkg/types/authtypes"
"github.com/SigNoz/signoz/pkg/types/dashboardtypes"
"github.com/SigNoz/signoz/pkg/valuer"
"github.com/SigNoz/signoz/pkg/version" "github.com/SigNoz/signoz/pkg/version"
"github.com/gorilla/mux" "github.com/gorilla/mux"
) )
@@ -99,6 +103,39 @@ func (ah *APIHandler) RegisterRoutes(router *mux.Router, am *middleware.AuthZ) {
router.HandleFunc("/api/v1/billing", am.AdminAccess(ah.getBilling)).Methods(http.MethodGet) router.HandleFunc("/api/v1/billing", am.AdminAccess(ah.getBilling)).Methods(http.MethodGet)
router.HandleFunc("/api/v1/portal", am.AdminAccess(ah.LicensingAPI.Portal)).Methods(http.MethodPost) router.HandleFunc("/api/v1/portal", am.AdminAccess(ah.LicensingAPI.Portal)).Methods(http.MethodPost)
// dashboards
router.HandleFunc("/api/v1/dashboards/{id}/public", am.AdminAccess(ah.Signoz.Handlers.Dashboard.CreatePublic)).Methods(http.MethodPost)
router.HandleFunc("/api/v1/dashboards/{id}/public", am.AdminAccess(ah.Signoz.Handlers.Dashboard.GetPublic)).Methods(http.MethodGet)
router.HandleFunc("/api/v1/dashboards/{id}/public", am.AdminAccess(ah.Signoz.Handlers.Dashboard.UpdatePublic)).Methods(http.MethodPut)
router.HandleFunc("/api/v1/dashboards/{id}/public", am.AdminAccess(ah.Signoz.Handlers.Dashboard.DeletePublic)).Methods(http.MethodDelete)
// public access for dashboards
router.HandleFunc("/api/v1/public/dashboards/{id}", am.CheckWithoutClaims(
ah.Signoz.Handlers.Dashboard.GetPublicData,
authtypes.RelationRead, authtypes.RelationRead,
dashboardtypes.TypeableMetaResourcePublicDashboard,
func(req *http.Request, orgs []*types.Organization) ([]authtypes.Selector, valuer.UUID, error) {
id, err := valuer.NewUUID(mux.Vars(req)["id"])
if err != nil {
return nil, valuer.UUID{}, err
}
return ah.Signoz.Modules.Dashboard.GetPublicDashboardOrgAndSelectors(req.Context(), id, orgs)
})).Methods(http.MethodGet)
router.HandleFunc("/api/v1/public/dashboards/{id}/widgets/{index}/query_range", am.CheckWithoutClaims(
ah.Signoz.Handlers.Dashboard.GetPublicWidgetQueryRange,
authtypes.RelationRead, authtypes.RelationRead,
dashboardtypes.TypeableMetaResourcePublicDashboard,
func(req *http.Request, orgs []*types.Organization) ([]authtypes.Selector, valuer.UUID, error) {
id, err := valuer.NewUUID(mux.Vars(req)["id"])
if err != nil {
return nil, valuer.UUID{}, err
}
return ah.Signoz.Modules.Dashboard.GetPublicDashboardOrgAndSelectors(req.Context(), id, orgs)
})).Methods(http.MethodGet)
// v3 // v3
router.HandleFunc("/api/v3/licenses", am.AdminAccess(ah.LicensingAPI.Activate)).Methods(http.MethodPost) router.HandleFunc("/api/v3/licenses", am.AdminAccess(ah.LicensingAPI.Activate)).Methods(http.MethodPost)
router.HandleFunc("/api/v3/licenses", am.AdminAccess(ah.LicensingAPI.Refresh)).Methods(http.MethodPut) router.HandleFunc("/api/v3/licenses", am.AdminAccess(ah.LicensingAPI.Refresh)).Methods(http.MethodPut)

View File

@@ -10,7 +10,6 @@ import (
"strings" "strings"
"time" "time"
"github.com/SigNoz/signoz/ee/query-service/constants"
"github.com/SigNoz/signoz/pkg/errors" "github.com/SigNoz/signoz/pkg/errors"
"github.com/SigNoz/signoz/pkg/http/render" "github.com/SigNoz/signoz/pkg/http/render"
"github.com/SigNoz/signoz/pkg/modules/user" "github.com/SigNoz/signoz/pkg/modules/user"
@@ -77,7 +76,7 @@ func (ah *APIHandler) CloudIntegrationsGenerateConnectionParams(w http.ResponseW
return return
} }
ingestionUrl, signozApiUrl, apiErr := getIngestionUrlAndSigNozAPIUrl(r.Context(), license.Key) ingestionUrl, signozApiUrl, apiErr := ah.getIngestionUrlAndSigNozAPIUrl(r.Context(), license.Key)
if apiErr != nil { if apiErr != nil {
RespondError(w, basemodel.WrapApiError( RespondError(w, basemodel.WrapApiError(
apiErr, "couldn't deduce ingestion url and signoz api url", apiErr, "couldn't deduce ingestion url and signoz api url",
@@ -186,48 +185,37 @@ func (ah *APIHandler) getOrCreateCloudIntegrationUser(
return cloudIntegrationUser, nil return cloudIntegrationUser, nil
} }
func getIngestionUrlAndSigNozAPIUrl(ctx context.Context, licenseKey string) ( func (ah *APIHandler) getIngestionUrlAndSigNozAPIUrl(ctx context.Context, licenseKey string) (
string, string, *basemodel.ApiError, string, string, *basemodel.ApiError,
) { ) {
url := fmt.Sprintf( // TODO: remove this struct from here
"%s%s",
strings.TrimSuffix(constants.ZeusURL, "/"),
"/v2/deployments/me",
)
type deploymentResponse struct { type deploymentResponse struct {
Status string `json:"status"` Name string `json:"name"`
Error string `json:"error"` ClusterInfo struct {
Data struct { Region struct {
Name string `json:"name"` DNS string `json:"dns"`
} `json:"region"`
ClusterInfo struct { } `json:"cluster"`
Region struct {
DNS string `json:"dns"`
} `json:"region"`
} `json:"cluster"`
} `json:"data"`
} }
resp, apiErr := requestAndParseResponse[deploymentResponse]( respBytes, err := ah.Signoz.Zeus.GetDeployment(ctx, licenseKey)
ctx, url, map[string]string{"X-Signoz-Cloud-Api-Key": licenseKey}, nil, if err != nil {
)
if apiErr != nil {
return "", "", basemodel.WrapApiError(
apiErr, "couldn't query for deployment info",
)
}
if resp.Status != "success" {
return "", "", basemodel.InternalError(fmt.Errorf( return "", "", basemodel.InternalError(fmt.Errorf(
"couldn't query for deployment info: status: %s, error: %s", "couldn't query for deployment info: error: %w", err,
resp.Status, resp.Error,
)) ))
} }
regionDns := resp.Data.ClusterInfo.Region.DNS resp := new(deploymentResponse)
deploymentName := resp.Data.Name
err = json.Unmarshal(respBytes, resp)
if err != nil {
return "", "", basemodel.InternalError(fmt.Errorf(
"couldn't unmarshal deployment info response: error: %w", err,
))
}
regionDns := resp.ClusterInfo.Region.DNS
deploymentName := resp.Name
if len(regionDns) < 1 || len(deploymentName) < 1 { if len(regionDns) < 1 || len(deploymentName) < 1 {
// Fail early if actual response structure and expectation here ever diverge // Fail early if actual response structure and expectation here ever diverge

View File

@@ -192,7 +192,7 @@ func (s Server) HealthCheckStatus() chan healthcheck.Status {
func (s *Server) createPublicServer(apiHandler *api.APIHandler, web web.Web) (*http.Server, error) { func (s *Server) createPublicServer(apiHandler *api.APIHandler, web web.Web) (*http.Server, error) {
r := baseapp.NewRouter() r := baseapp.NewRouter()
am := middleware.NewAuthZ(s.signoz.Instrumentation.Logger()) am := middleware.NewAuthZ(s.signoz.Instrumentation.Logger(), s.signoz.Modules.OrgGetter, s.signoz.Authz)
r.Use(otelmux.Middleware( r.Use(otelmux.Middleware(
"apiserver", "apiserver",

View File

@@ -10,9 +10,6 @@ var SaasSegmentKey = GetOrDefaultEnv("SIGNOZ_SAAS_SEGMENT_KEY", "")
var FetchFeatures = GetOrDefaultEnv("FETCH_FEATURES", "false") var FetchFeatures = GetOrDefaultEnv("FETCH_FEATURES", "false")
var ZeusFeaturesURL = GetOrDefaultEnv("ZEUS_FEATURES_URL", "ZeusFeaturesURL") var ZeusFeaturesURL = GetOrDefaultEnv("ZEUS_FEATURES_URL", "ZeusFeaturesURL")
// this is set via build time variable
var ZeusURL = "https://api.signoz.cloud"
func GetOrDefaultEnv(key string, fallback string) string { func GetOrDefaultEnv(key string, fallback string) string {
v := os.Getenv(key) v := os.Getenv(key)
if len(v) == 0 { if len(v) == 0 {

View File

@@ -2,6 +2,7 @@ package postgressqlschema
import ( import (
"context" "context"
"database/sql"
"github.com/SigNoz/signoz/pkg/factory" "github.com/SigNoz/signoz/pkg/factory"
"github.com/SigNoz/signoz/pkg/sqlschema" "github.com/SigNoz/signoz/pkg/sqlschema"
@@ -47,50 +48,45 @@ func (provider *provider) Operator() sqlschema.SQLOperator {
} }
func (provider *provider) GetTable(ctx context.Context, tableName sqlschema.TableName) (*sqlschema.Table, []*sqlschema.UniqueConstraint, error) { func (provider *provider) GetTable(ctx context.Context, tableName sqlschema.TableName) (*sqlschema.Table, []*sqlschema.UniqueConstraint, error) {
rows, err := provider. columns := []struct {
ColumnName string `bun:"column_name"`
Nullable bool `bun:"nullable"`
SQLDataType string `bun:"udt_name"`
DefaultVal *string `bun:"column_default"`
}{}
err := provider.
sqlstore. sqlstore.
BunDB(). BunDB().
QueryContext(ctx, ` NewRaw(`
SELECT SELECT
c.column_name, c.column_name,
c.is_nullable = 'YES', c.is_nullable = 'YES' as nullable,
c.udt_name, c.udt_name,
c.column_default c.column_default
FROM FROM
information_schema.columns AS c information_schema.columns AS c
WHERE WHERE
c.table_name = ?`, string(tableName)) c.table_name = ?`, string(tableName)).
Scan(ctx, &columns)
if err != nil { if err != nil {
return nil, nil, err return nil, nil, err
} }
if len(columns) == 0 {
return nil, nil, sql.ErrNoRows
}
defer func() { sqlschemaColumns := make([]*sqlschema.Column, 0)
if err := rows.Close(); err != nil { for _, column := range columns {
provider.settings.Logger().ErrorContext(ctx, "error closing rows", "error", err)
}
}()
columns := make([]*sqlschema.Column, 0)
for rows.Next() {
var (
name string
sqlDataType string
nullable bool
defaultVal *string
)
if err := rows.Scan(&name, &nullable, &sqlDataType, &defaultVal); err != nil {
return nil, nil, err
}
columnDefault := "" columnDefault := ""
if defaultVal != nil { if column.DefaultVal != nil {
columnDefault = *defaultVal columnDefault = *column.DefaultVal
} }
columns = append(columns, &sqlschema.Column{ sqlschemaColumns = append(sqlschemaColumns, &sqlschema.Column{
Name: sqlschema.ColumnName(name), Name: sqlschema.ColumnName(column.ColumnName),
Nullable: nullable, Nullable: column.Nullable,
DataType: provider.fmter.DataTypeOf(sqlDataType), DataType: provider.fmter.DataTypeOf(column.SQLDataType),
Default: columnDefault, Default: columnDefault,
}) })
} }
@@ -208,7 +204,7 @@ WHERE
return &sqlschema.Table{ return &sqlschema.Table{
Name: tableName, Name: tableName,
Columns: columns, Columns: sqlschemaColumns,
PrimaryKeyConstraint: primaryKeyConstraint, PrimaryKeyConstraint: primaryKeyConstraint,
ForeignKeyConstraints: foreignKeyConstraints, ForeignKeyConstraints: foreignKeyConstraints,
}, uniqueConstraints, nil }, uniqueConstraints, nil

View File

@@ -0,0 +1,153 @@
package postgressqlstore
import (
"strings"
"github.com/SigNoz/signoz/pkg/sqlstore"
"github.com/uptrace/bun/schema"
)
type formatter struct {
bunf schema.Formatter
}
func newFormatter(dialect schema.Dialect) sqlstore.SQLFormatter {
return &formatter{bunf: schema.NewFormatter(dialect)}
}
func (f *formatter) JSONExtractString(column, path string) []byte {
var sql []byte
sql = f.bunf.AppendIdent(sql, column)
sql = append(sql, f.convertJSONPathToPostgres(path)...)
return sql
}
func (f *formatter) JSONType(column, path string) []byte {
var sql []byte
sql = append(sql, "jsonb_typeof("...)
sql = f.bunf.AppendIdent(sql, column)
sql = append(sql, f.convertJSONPathToPostgresWithMode(path, false)...)
sql = append(sql, ')')
return sql
}
func (f *formatter) JSONIsArray(column, path string) []byte {
var sql []byte
sql = append(sql, f.JSONType(column, path)...)
sql = append(sql, " = "...)
sql = schema.Append(f.bunf, sql, "array")
return sql
}
func (f *formatter) JSONArrayElements(column, path, alias string) ([]byte, []byte) {
var sql []byte
sql = append(sql, "jsonb_array_elements("...)
sql = f.bunf.AppendIdent(sql, column)
sql = append(sql, f.convertJSONPathToPostgresWithMode(path, false)...)
sql = append(sql, ") AS "...)
sql = f.bunf.AppendIdent(sql, alias)
return sql, []byte(alias)
}
func (f *formatter) JSONArrayOfStrings(column, path, alias string) ([]byte, []byte) {
var sql []byte
sql = append(sql, "jsonb_array_elements_text("...)
sql = f.bunf.AppendIdent(sql, column)
sql = append(sql, f.convertJSONPathToPostgresWithMode(path, false)...)
sql = append(sql, ") AS "...)
sql = f.bunf.AppendIdent(sql, alias)
return sql, append([]byte(alias), "::text"...)
}
func (f *formatter) JSONKeys(column, path, alias string) ([]byte, []byte) {
var sql []byte
sql = append(sql, "jsonb_each("...)
sql = f.bunf.AppendIdent(sql, column)
sql = append(sql, f.convertJSONPathToPostgresWithMode(path, false)...)
sql = append(sql, ") AS "...)
sql = f.bunf.AppendIdent(sql, alias)
return sql, append([]byte(alias), ".key"...)
}
func (f *formatter) JSONArrayAgg(expression string) []byte {
var sql []byte
sql = append(sql, "jsonb_agg("...)
sql = append(sql, expression...)
sql = append(sql, ')')
return sql
}
func (f *formatter) JSONArrayLiteral(values ...string) []byte {
var sql []byte
sql = append(sql, "jsonb_build_array("...)
for idx, value := range values {
if idx > 0 {
sql = append(sql, ", "...)
}
sql = schema.Append(f.bunf, sql, value)
}
sql = append(sql, ')')
return sql
}
func (f *formatter) TextToJsonColumn(column string) []byte {
var sql []byte
sql = f.bunf.AppendIdent(sql, column)
sql = append(sql, "::jsonb"...)
return sql
}
func (f *formatter) convertJSONPathToPostgres(jsonPath string) []byte {
return f.convertJSONPathToPostgresWithMode(jsonPath, true)
}
func (f *formatter) convertJSONPathToPostgresWithMode(jsonPath string, asText bool) []byte {
path := strings.TrimPrefix(strings.TrimPrefix(jsonPath, "$"), ".")
if path == "" {
return nil
}
parts := strings.Split(path, ".")
var validParts []string
for _, part := range parts {
if part != "" {
validParts = append(validParts, part)
}
}
if len(validParts) == 0 {
return nil
}
var result []byte
for idx, part := range validParts {
if idx == len(validParts)-1 {
if asText {
result = append(result, "->>"...)
} else {
result = append(result, "->"...)
}
result = schema.Append(f.bunf, result, part)
return result
}
result = append(result, "->"...)
result = schema.Append(f.bunf, result, part)
}
return result
}
func (f *formatter) LowerExpression(expression string) []byte {
var sql []byte
sql = append(sql, "lower("...)
sql = append(sql, expression...)
sql = append(sql, ')')
return sql
}

View File

@@ -0,0 +1,500 @@
package postgressqlstore
import (
"testing"
"github.com/stretchr/testify/assert"
"github.com/uptrace/bun/dialect/pgdialect"
)
func TestJSONExtractString(t *testing.T) {
tests := []struct {
name string
column string
path string
expected string
}{
{
name: "simple path",
column: "data",
path: "$.field",
expected: `"data"->>'field'`,
},
{
name: "nested path",
column: "metadata",
path: "$.user.name",
expected: `"metadata"->'user'->>'name'`,
},
{
name: "deeply nested path",
column: "json_col",
path: "$.level1.level2.level3",
expected: `"json_col"->'level1'->'level2'->>'level3'`,
},
{
name: "root path",
column: "json_col",
path: "$",
expected: `"json_col"`,
},
{
name: "empty path",
column: "data",
path: "",
expected: `"data"`,
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
f := newFormatter(pgdialect.New())
got := string(f.JSONExtractString(tt.column, tt.path))
assert.Equal(t, tt.expected, got)
})
}
}
func TestJSONType(t *testing.T) {
tests := []struct {
name string
column string
path string
expected string
}{
{
name: "simple path",
column: "data",
path: "$.field",
expected: `jsonb_typeof("data"->'field')`,
},
{
name: "nested path",
column: "metadata",
path: "$.user.age",
expected: `jsonb_typeof("metadata"->'user'->'age')`,
},
{
name: "root path",
column: "json_col",
path: "$",
expected: `jsonb_typeof("json_col")`,
},
{
name: "empty path",
column: "data",
path: "",
expected: `jsonb_typeof("data")`,
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
f := newFormatter(pgdialect.New())
got := string(f.JSONType(tt.column, tt.path))
assert.Equal(t, tt.expected, got)
})
}
}
func TestJSONIsArray(t *testing.T) {
tests := []struct {
name string
column string
path string
expected string
}{
{
name: "simple path",
column: "data",
path: "$.items",
expected: `jsonb_typeof("data"->'items') = 'array'`,
},
{
name: "nested path",
column: "metadata",
path: "$.user.tags",
expected: `jsonb_typeof("metadata"->'user'->'tags') = 'array'`,
},
{
name: "root path",
column: "json_col",
path: "$",
expected: `jsonb_typeof("json_col") = 'array'`,
},
{
name: "empty path",
column: "data",
path: "",
expected: `jsonb_typeof("data") = 'array'`,
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
f := newFormatter(pgdialect.New())
got := string(f.JSONIsArray(tt.column, tt.path))
assert.Equal(t, tt.expected, got)
})
}
}
func TestJSONArrayElements(t *testing.T) {
tests := []struct {
name string
column string
path string
alias string
expected string
}{
{
name: "root path with dollar sign",
column: "data",
path: "$",
alias: "elem",
expected: `jsonb_array_elements("data") AS "elem"`,
},
{
name: "root path empty",
column: "data",
path: "",
alias: "elem",
expected: `jsonb_array_elements("data") AS "elem"`,
},
{
name: "nested path",
column: "metadata",
path: "$.items",
alias: "item",
expected: `jsonb_array_elements("metadata"->'items') AS "item"`,
},
{
name: "deeply nested path",
column: "json_col",
path: "$.user.tags",
alias: "tag",
expected: `jsonb_array_elements("json_col"->'user'->'tags') AS "tag"`,
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
f := newFormatter(pgdialect.New())
got, _ := f.JSONArrayElements(tt.column, tt.path, tt.alias)
assert.Equal(t, tt.expected, string(got))
})
}
}
func TestJSONArrayOfStrings(t *testing.T) {
tests := []struct {
name string
column string
path string
alias string
expected string
}{
{
name: "root path with dollar sign",
column: "data",
path: "$",
alias: "str",
expected: `jsonb_array_elements_text("data") AS "str"`,
},
{
name: "root path empty",
column: "data",
path: "",
alias: "str",
expected: `jsonb_array_elements_text("data") AS "str"`,
},
{
name: "nested path",
column: "metadata",
path: "$.strings",
alias: "s",
expected: `jsonb_array_elements_text("metadata"->'strings') AS "s"`,
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
f := newFormatter(pgdialect.New())
got, _ := f.JSONArrayOfStrings(tt.column, tt.path, tt.alias)
assert.Equal(t, tt.expected, string(got))
})
}
}
func TestJSONKeys(t *testing.T) {
tests := []struct {
name string
column string
path string
alias string
expected string
}{
{
name: "root path with dollar sign",
column: "data",
path: "$",
alias: "k",
expected: `jsonb_each("data") AS "k"`,
},
{
name: "root path empty",
column: "data",
path: "",
alias: "k",
expected: `jsonb_each("data") AS "k"`,
},
{
name: "nested path",
column: "metadata",
path: "$.object",
alias: "key",
expected: `jsonb_each("metadata"->'object') AS "key"`,
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
f := newFormatter(pgdialect.New())
got, _ := f.JSONKeys(tt.column, tt.path, tt.alias)
assert.Equal(t, tt.expected, string(got))
})
}
}
func TestJSONArrayAgg(t *testing.T) {
tests := []struct {
name string
expression string
expected string
}{
{
name: "simple column",
expression: "id",
expected: "jsonb_agg(id)",
},
{
name: "expression with function",
expression: "DISTINCT name",
expected: "jsonb_agg(DISTINCT name)",
},
{
name: "complex expression",
expression: "data->>'field'",
expected: "jsonb_agg(data->>'field')",
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
f := newFormatter(pgdialect.New())
got := string(f.JSONArrayAgg(tt.expression))
assert.Equal(t, tt.expected, got)
})
}
}
func TestJSONArrayLiteral(t *testing.T) {
tests := []struct {
name string
values []string
expected string
}{
{
name: "empty array",
values: []string{},
expected: "jsonb_build_array()",
},
{
name: "single value",
values: []string{"value1"},
expected: "jsonb_build_array('value1')",
},
{
name: "multiple values",
values: []string{"value1", "value2", "value3"},
expected: "jsonb_build_array('value1', 'value2', 'value3')",
},
{
name: "values with special characters",
values: []string{"test", "with space", "with-dash"},
expected: "jsonb_build_array('test', 'with space', 'with-dash')",
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
f := newFormatter(pgdialect.New())
got := string(f.JSONArrayLiteral(tt.values...))
assert.Equal(t, tt.expected, got)
})
}
}
func TestConvertJSONPathToPostgresWithMode(t *testing.T) {
tests := []struct {
name string
jsonPath string
asText bool
expected string
}{
{
name: "simple path as text",
jsonPath: "$.field",
asText: true,
expected: "->>'field'",
},
{
name: "simple path as json",
jsonPath: "$.field",
asText: false,
expected: "->'field'",
},
{
name: "nested path as text",
jsonPath: "$.user.name",
asText: true,
expected: "->'user'->>'name'",
},
{
name: "nested path as json",
jsonPath: "$.user.name",
asText: false,
expected: "->'user'->'name'",
},
{
name: "deeply nested as text",
jsonPath: "$.a.b.c.d",
asText: true,
expected: "->'a'->'b'->'c'->>'d'",
},
{
name: "root path",
jsonPath: "$",
asText: true,
expected: "",
},
{
name: "empty path",
jsonPath: "",
asText: true,
expected: "",
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
f := newFormatter(pgdialect.New()).(*formatter)
got := string(f.convertJSONPathToPostgresWithMode(tt.jsonPath, tt.asText))
assert.Equal(t, tt.expected, got)
})
}
}
func TestTextToJsonColumn(t *testing.T) {
tests := []struct {
name string
column string
expected string
}{
{
name: "simple column name",
column: "data",
expected: `"data"::jsonb`,
},
{
name: "column with underscore",
column: "user_data",
expected: `"user_data"::jsonb`,
},
{
name: "column with special characters",
column: "json-col",
expected: `"json-col"::jsonb`,
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
f := newFormatter(pgdialect.New())
got := string(f.TextToJsonColumn(tt.column))
assert.Equal(t, tt.expected, got)
})
}
}
func TestLowerExpression(t *testing.T) {
tests := []struct {
name string
expr string
expected string
}{
{
name: "simple column name",
expr: "name",
expected: "lower(name)",
},
{
name: "quoted column identifier",
expr: `"column_name"`,
expected: `lower("column_name")`,
},
{
name: "jsonb text extraction",
expr: "data->>'field'",
expected: "lower(data->>'field')",
},
{
name: "nested jsonb extraction",
expr: "metadata->'user'->>'name'",
expected: "lower(metadata->'user'->>'name')",
},
{
name: "jsonb_typeof expression",
expr: "jsonb_typeof(data->'field')",
expected: "lower(jsonb_typeof(data->'field'))",
},
{
name: "string concatenation",
expr: "first_name || ' ' || last_name",
expected: "lower(first_name || ' ' || last_name)",
},
{
name: "CAST expression",
expr: "CAST(value AS TEXT)",
expected: "lower(CAST(value AS TEXT))",
},
{
name: "COALESCE expression",
expr: "COALESCE(name, 'default')",
expected: "lower(COALESCE(name, 'default'))",
},
{
name: "subquery column",
expr: "users.email",
expected: "lower(users.email)",
},
{
name: "quoted identifier with special chars",
expr: `"user-name"`,
expected: `lower("user-name")`,
},
{
name: "jsonb to text cast",
expr: "data::text",
expected: "lower(data::text)",
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
f := newFormatter(pgdialect.New())
got := string(f.LowerExpression(tt.expr))
assert.Equal(t, tt.expected, got)
})
}
}

View File

@@ -15,10 +15,11 @@ import (
) )
type provider struct { type provider struct {
settings factory.ScopedProviderSettings settings factory.ScopedProviderSettings
sqldb *sql.DB sqldb *sql.DB
bundb *sqlstore.BunDB bundb *sqlstore.BunDB
dialect *dialect dialect *dialect
formatter sqlstore.SQLFormatter
} }
func NewFactory(hookFactories ...factory.ProviderFactory[sqlstore.SQLStoreHook, sqlstore.Config]) factory.ProviderFactory[sqlstore.SQLStore, sqlstore.Config] { func NewFactory(hookFactories ...factory.ProviderFactory[sqlstore.SQLStoreHook, sqlstore.Config]) factory.ProviderFactory[sqlstore.SQLStore, sqlstore.Config] {
@@ -55,11 +56,14 @@ func New(ctx context.Context, providerSettings factory.ProviderSettings, config
sqldb := stdlib.OpenDBFromPool(pool) sqldb := stdlib.OpenDBFromPool(pool)
pgDialect := pgdialect.New()
bunDB := sqlstore.NewBunDB(settings, sqldb, pgDialect, hooks)
return &provider{ return &provider{
settings: settings, settings: settings,
sqldb: sqldb, sqldb: sqldb,
bundb: sqlstore.NewBunDB(settings, sqldb, pgdialect.New(), hooks), bundb: bunDB,
dialect: new(dialect), dialect: new(dialect),
formatter: newFormatter(bunDB.Dialect()),
}, nil }, nil
} }
@@ -75,6 +79,10 @@ func (provider *provider) Dialect() sqlstore.SQLDialect {
return provider.dialect return provider.dialect
} }
func (provider *provider) Formatter() sqlstore.SQLFormatter {
return provider.formatter
}
func (provider *provider) BunDBCtx(ctx context.Context) bun.IDB { func (provider *provider) BunDBCtx(ctx context.Context) bun.IDB {
return provider.bundb.BunDBCtx(ctx) return provider.bundb.BunDBCtx(ctx)
} }

View File

@@ -1,10 +1,10 @@
package zeus package zeus
import ( import (
"fmt"
neturl "net/url" neturl "net/url"
"sync" "sync"
"github.com/SigNoz/signoz/pkg/errors"
"github.com/SigNoz/signoz/pkg/zeus" "github.com/SigNoz/signoz/pkg/zeus"
) )
@@ -24,17 +24,17 @@ func Config() zeus.Config {
once.Do(func() { once.Do(func() {
parsedURL, err := neturl.Parse(url) parsedURL, err := neturl.Parse(url)
if err != nil { if err != nil {
panic(fmt.Errorf("invalid zeus URL: %w", err)) panic(errors.WrapInternalf(err, errors.CodeInternal, "invalid zeus URL"))
} }
deprecatedParsedURL, err := neturl.Parse(deprecatedURL) deprecatedParsedURL, err := neturl.Parse(deprecatedURL)
if err != nil { if err != nil {
panic(fmt.Errorf("invalid zeus deprecated URL: %w", err)) panic(errors.WrapInternalf(err, errors.CodeInternal, "invalid zeus deprecated URL"))
} }
config = zeus.Config{URL: parsedURL, DeprecatedURL: deprecatedParsedURL} config = zeus.Config{URL: parsedURL, DeprecatedURL: deprecatedParsedURL}
if err := config.Validate(); err != nil { if err := config.Validate(); err != nil {
panic(fmt.Errorf("invalid zeus config: %w", err)) panic(errors.WrapInternalf(err, errors.CodeInternal, "invalid zeus config"))
} }
}) })

View File

@@ -69,7 +69,7 @@
"antd": "5.11.0", "antd": "5.11.0",
"antd-table-saveas-excel": "2.2.1", "antd-table-saveas-excel": "2.2.1",
"antlr4": "4.13.2", "antlr4": "4.13.2",
"axios": "1.8.2", "axios": "1.12.0",
"babel-eslint": "^10.1.0", "babel-eslint": "^10.1.0",
"babel-jest": "^29.6.4", "babel-jest": "^29.6.4",
"babel-loader": "9.1.3", "babel-loader": "9.1.3",
@@ -280,6 +280,7 @@
"got": "11.8.5", "got": "11.8.5",
"form-data": "4.0.4", "form-data": "4.0.4",
"brace-expansion": "^2.0.2", "brace-expansion": "^2.0.2",
"on-headers": "^1.1.0" "on-headers": "^1.1.0",
"tmp": "0.2.4"
} }
} }

View File

@@ -274,7 +274,7 @@ function App(): JSX.Element {
chat_settings: { chat_settings: {
app_id: process.env.PYLON_APP_ID, app_id: process.env.PYLON_APP_ID,
email: user.email, email: user.email,
name: user.displayName, name: user.displayName || user.email,
}, },
}; };
} }

View File

@@ -1,4 +1,4 @@
import { ApiBaseInstance as axios } from 'api'; import { LogEventAxiosInstance as axios } from 'api';
import { ErrorResponseHandler } from 'api/ErrorResponseHandler'; import { ErrorResponseHandler } from 'api/ErrorResponseHandler';
import { AxiosError } from 'axios'; import { AxiosError } from 'axios';
import { ErrorResponse, SuccessResponse } from 'types/api'; import { ErrorResponse, SuccessResponse } from 'types/api';

View File

@@ -1,13 +1,11 @@
/* eslint-disable sonarjs/no-duplicate-string */ /* eslint-disable sonarjs/no-duplicate-string */
import { ApiBaseInstance } from 'api'; import axios from 'api';
import { getFieldKeys } from '../getFieldKeys'; import { getFieldKeys } from '../getFieldKeys';
// Mock the API instance // Mock the API instance
jest.mock('api', () => ({ jest.mock('api', () => ({
ApiBaseInstance: { get: jest.fn(),
get: jest.fn(),
},
})); }));
describe('getFieldKeys API', () => { describe('getFieldKeys API', () => {
@@ -31,33 +29,33 @@ describe('getFieldKeys API', () => {
it('should call API with correct parameters when no args provided', async () => { it('should call API with correct parameters when no args provided', async () => {
// Mock successful API response // Mock successful API response
(ApiBaseInstance.get as jest.Mock).mockResolvedValueOnce(mockSuccessResponse); (axios.get as jest.Mock).mockResolvedValueOnce(mockSuccessResponse);
// Call function with no parameters // Call function with no parameters
await getFieldKeys(); await getFieldKeys();
// Verify API was called correctly with empty params object // Verify API was called correctly with empty params object
expect(ApiBaseInstance.get).toHaveBeenCalledWith('/fields/keys', { expect(axios.get).toHaveBeenCalledWith('/fields/keys', {
params: {}, params: {},
}); });
}); });
it('should call API with signal parameter when provided', async () => { it('should call API with signal parameter when provided', async () => {
// Mock successful API response // Mock successful API response
(ApiBaseInstance.get as jest.Mock).mockResolvedValueOnce(mockSuccessResponse); (axios.get as jest.Mock).mockResolvedValueOnce(mockSuccessResponse);
// Call function with signal parameter // Call function with signal parameter
await getFieldKeys('traces'); await getFieldKeys('traces');
// Verify API was called with signal parameter // Verify API was called with signal parameter
expect(ApiBaseInstance.get).toHaveBeenCalledWith('/fields/keys', { expect(axios.get).toHaveBeenCalledWith('/fields/keys', {
params: { signal: 'traces' }, params: { signal: 'traces' },
}); });
}); });
it('should call API with name parameter when provided', async () => { it('should call API with name parameter when provided', async () => {
// Mock successful API response // Mock successful API response
(ApiBaseInstance.get as jest.Mock).mockResolvedValueOnce({ (axios.get as jest.Mock).mockResolvedValueOnce({
status: 200, status: 200,
data: { data: {
status: 'success', status: 'success',
@@ -72,14 +70,14 @@ describe('getFieldKeys API', () => {
await getFieldKeys(undefined, 'service'); await getFieldKeys(undefined, 'service');
// Verify API was called with name parameter // Verify API was called with name parameter
expect(ApiBaseInstance.get).toHaveBeenCalledWith('/fields/keys', { expect(axios.get).toHaveBeenCalledWith('/fields/keys', {
params: { name: 'service' }, params: { name: 'service' },
}); });
}); });
it('should call API with both signal and name when provided', async () => { it('should call API with both signal and name when provided', async () => {
// Mock successful API response // Mock successful API response
(ApiBaseInstance.get as jest.Mock).mockResolvedValueOnce({ (axios.get as jest.Mock).mockResolvedValueOnce({
status: 200, status: 200,
data: { data: {
status: 'success', status: 'success',
@@ -94,14 +92,14 @@ describe('getFieldKeys API', () => {
await getFieldKeys('logs', 'service'); await getFieldKeys('logs', 'service');
// Verify API was called with both parameters // Verify API was called with both parameters
expect(ApiBaseInstance.get).toHaveBeenCalledWith('/fields/keys', { expect(axios.get).toHaveBeenCalledWith('/fields/keys', {
params: { signal: 'logs', name: 'service' }, params: { signal: 'logs', name: 'service' },
}); });
}); });
it('should return properly formatted response', async () => { it('should return properly formatted response', async () => {
// Mock API to return our response // Mock API to return our response
(ApiBaseInstance.get as jest.Mock).mockResolvedValueOnce(mockSuccessResponse); (axios.get as jest.Mock).mockResolvedValueOnce(mockSuccessResponse);
// Call the function // Call the function
const result = await getFieldKeys('traces'); const result = await getFieldKeys('traces');

View File

@@ -1,13 +1,11 @@
/* eslint-disable sonarjs/no-duplicate-string */ /* eslint-disable sonarjs/no-duplicate-string */
import { ApiBaseInstance } from 'api'; import axios from 'api';
import { getFieldValues } from '../getFieldValues'; import { getFieldValues } from '../getFieldValues';
// Mock the API instance // Mock the API instance
jest.mock('api', () => ({ jest.mock('api', () => ({
ApiBaseInstance: { get: jest.fn(),
get: jest.fn(),
},
})); }));
describe('getFieldValues API', () => { describe('getFieldValues API', () => {
@@ -17,7 +15,7 @@ describe('getFieldValues API', () => {
it('should call the API with correct parameters (no options)', async () => { it('should call the API with correct parameters (no options)', async () => {
// Mock API response // Mock API response
(ApiBaseInstance.get as jest.Mock).mockResolvedValueOnce({ (axios.get as jest.Mock).mockResolvedValueOnce({
status: 200, status: 200,
data: { data: {
status: 'success', status: 'success',
@@ -34,14 +32,14 @@ describe('getFieldValues API', () => {
await getFieldValues(); await getFieldValues();
// Verify API was called correctly with empty params // Verify API was called correctly with empty params
expect(ApiBaseInstance.get).toHaveBeenCalledWith('/fields/values', { expect(axios.get).toHaveBeenCalledWith('/fields/values', {
params: {}, params: {},
}); });
}); });
it('should call the API with signal parameter', async () => { it('should call the API with signal parameter', async () => {
// Mock API response // Mock API response
(ApiBaseInstance.get as jest.Mock).mockResolvedValueOnce({ (axios.get as jest.Mock).mockResolvedValueOnce({
status: 200, status: 200,
data: { data: {
status: 'success', status: 'success',
@@ -58,14 +56,14 @@ describe('getFieldValues API', () => {
await getFieldValues('traces'); await getFieldValues('traces');
// Verify API was called with signal parameter // Verify API was called with signal parameter
expect(ApiBaseInstance.get).toHaveBeenCalledWith('/fields/values', { expect(axios.get).toHaveBeenCalledWith('/fields/values', {
params: { signal: 'traces' }, params: { signal: 'traces' },
}); });
}); });
it('should call the API with name parameter', async () => { it('should call the API with name parameter', async () => {
// Mock API response // Mock API response
(ApiBaseInstance.get as jest.Mock).mockResolvedValueOnce({ (axios.get as jest.Mock).mockResolvedValueOnce({
status: 200, status: 200,
data: { data: {
status: 'success', status: 'success',
@@ -82,14 +80,14 @@ describe('getFieldValues API', () => {
await getFieldValues(undefined, 'service.name'); await getFieldValues(undefined, 'service.name');
// Verify API was called with name parameter // Verify API was called with name parameter
expect(ApiBaseInstance.get).toHaveBeenCalledWith('/fields/values', { expect(axios.get).toHaveBeenCalledWith('/fields/values', {
params: { name: 'service.name' }, params: { name: 'service.name' },
}); });
}); });
it('should call the API with value parameter', async () => { it('should call the API with value parameter', async () => {
// Mock API response // Mock API response
(ApiBaseInstance.get as jest.Mock).mockResolvedValueOnce({ (axios.get as jest.Mock).mockResolvedValueOnce({
status: 200, status: 200,
data: { data: {
status: 'success', status: 'success',
@@ -106,14 +104,14 @@ describe('getFieldValues API', () => {
await getFieldValues(undefined, 'service.name', 'front'); await getFieldValues(undefined, 'service.name', 'front');
// Verify API was called with value parameter // Verify API was called with value parameter
expect(ApiBaseInstance.get).toHaveBeenCalledWith('/fields/values', { expect(axios.get).toHaveBeenCalledWith('/fields/values', {
params: { name: 'service.name', searchText: 'front' }, params: { name: 'service.name', searchText: 'front' },
}); });
}); });
it('should call the API with time range parameters', async () => { it('should call the API with time range parameters', async () => {
// Mock API response // Mock API response
(ApiBaseInstance.get as jest.Mock).mockResolvedValueOnce({ (axios.get as jest.Mock).mockResolvedValueOnce({
status: 200, status: 200,
data: { data: {
status: 'success', status: 'success',
@@ -138,7 +136,7 @@ describe('getFieldValues API', () => {
); );
// Verify API was called with time range parameters (converted to milliseconds) // Verify API was called with time range parameters (converted to milliseconds)
expect(ApiBaseInstance.get).toHaveBeenCalledWith('/fields/values', { expect(axios.get).toHaveBeenCalledWith('/fields/values', {
params: { params: {
signal: 'logs', signal: 'logs',
name: 'service.name', name: 'service.name',
@@ -165,7 +163,7 @@ describe('getFieldValues API', () => {
}, },
}; };
(ApiBaseInstance.get as jest.Mock).mockResolvedValueOnce(mockResponse); (axios.get as jest.Mock).mockResolvedValueOnce(mockResponse);
// Call the function // Call the function
const result = await getFieldValues('traces', 'mixed.values'); const result = await getFieldValues('traces', 'mixed.values');
@@ -196,7 +194,7 @@ describe('getFieldValues API', () => {
}; };
// Mock API to return our response // Mock API to return our response
(ApiBaseInstance.get as jest.Mock).mockResolvedValueOnce(mockApiResponse); (axios.get as jest.Mock).mockResolvedValueOnce(mockApiResponse);
// Call the function // Call the function
const result = await getFieldValues('traces', 'service.name'); const result = await getFieldValues('traces', 'service.name');

View File

@@ -1,4 +1,4 @@
import { ApiBaseInstance } from 'api'; import axios from 'api';
import { ErrorResponseHandlerV2 } from 'api/ErrorResponseHandlerV2'; import { ErrorResponseHandlerV2 } from 'api/ErrorResponseHandlerV2';
import { AxiosError } from 'axios'; import { AxiosError } from 'axios';
import { ErrorV2Resp, SuccessResponseV2 } from 'types/api'; import { ErrorV2Resp, SuccessResponseV2 } from 'types/api';
@@ -24,7 +24,7 @@ export const getFieldKeys = async (
} }
try { try {
const response = await ApiBaseInstance.get('/fields/keys', { params }); const response = await axios.get('/fields/keys', { params });
return { return {
httpStatusCode: response.status, httpStatusCode: response.status,

View File

@@ -1,5 +1,5 @@
/* eslint-disable sonarjs/cognitive-complexity */ /* eslint-disable sonarjs/cognitive-complexity */
import { ApiBaseInstance } from 'api'; import axios from 'api';
import { ErrorResponseHandlerV2 } from 'api/ErrorResponseHandlerV2'; import { ErrorResponseHandlerV2 } from 'api/ErrorResponseHandlerV2';
import { AxiosError } from 'axios'; import { AxiosError } from 'axios';
import { ErrorV2Resp, SuccessResponseV2 } from 'types/api'; import { ErrorV2Resp, SuccessResponseV2 } from 'types/api';
@@ -47,7 +47,7 @@ export const getFieldValues = async (
} }
try { try {
const response = await ApiBaseInstance.get('/fields/values', { params }); const response = await axios.get('/fields/values', { params });
// Normalize values from different types (stringValues, boolValues, etc.) // Normalize values from different types (stringValues, boolValues, etc.)
if (response.data?.data?.values) { if (response.data?.data?.values) {

View File

@@ -86,8 +86,9 @@ const interceptorRejected = async (
if ( if (
response.status === 401 && response.status === 401 &&
// if the session rotate call errors out with 401 or the delete sessions call returns 401 then we do not retry! // if the session rotate call or the create session errors out with 401 or the delete sessions call returns 401 then we do not retry!
response.config.url !== '/sessions/rotate' && response.config.url !== '/sessions/rotate' &&
response.config.url !== '/sessions/email_password' &&
!( !(
response.config.url === '/sessions' && response.config.method === 'delete' response.config.url === '/sessions' && response.config.method === 'delete'
) )
@@ -199,15 +200,15 @@ ApiV5Instance.interceptors.request.use(interceptorsRequestResponse);
// //
// axios Base // axios Base
export const ApiBaseInstance = axios.create({ export const LogEventAxiosInstance = axios.create({
baseURL: `${ENVIRONMENT.baseURL}${apiV1}`, baseURL: `${ENVIRONMENT.baseURL}${apiV1}`,
}); });
ApiBaseInstance.interceptors.response.use( LogEventAxiosInstance.interceptors.response.use(
interceptorsResponse, interceptorsResponse,
interceptorRejectedBase, interceptorRejectedBase,
); );
ApiBaseInstance.interceptors.request.use(interceptorsRequestResponse); LogEventAxiosInstance.interceptors.request.use(interceptorsRequestResponse);
// //
// gateway Api V1 // gateway Api V1

View File

@@ -1,4 +1,4 @@
import { ApiBaseInstance } from 'api'; import axios from 'api';
import { ErrorResponseHandler } from 'api/ErrorResponseHandler'; import { ErrorResponseHandler } from 'api/ErrorResponseHandler';
import { AxiosError, AxiosResponse } from 'axios'; import { AxiosError, AxiosResponse } from 'axios';
import { baseAutoCompleteIdKeysOrder } from 'constants/queryBuilder'; import { baseAutoCompleteIdKeysOrder } from 'constants/queryBuilder';
@@ -17,7 +17,7 @@ export const getHostAttributeKeys = async (
try { try {
const response: AxiosResponse<{ const response: AxiosResponse<{
data: IQueryAutocompleteResponse; data: IQueryAutocompleteResponse;
}> = await ApiBaseInstance.get( }> = await axios.get(
`/${entity}/attribute_keys?dataSource=metrics&searchText=${searchText}`, `/${entity}/attribute_keys?dataSource=metrics&searchText=${searchText}`,
{ {
params: { params: {

View File

@@ -1,4 +1,4 @@
import { ApiBaseInstance } from 'api'; import axios from 'api';
import { ErrorResponseHandler } from 'api/ErrorResponseHandler'; import { ErrorResponseHandler } from 'api/ErrorResponseHandler';
import { AxiosError } from 'axios'; import { AxiosError } from 'axios';
import { SOMETHING_WENT_WRONG } from 'constants/api'; import { SOMETHING_WENT_WRONG } from 'constants/api';
@@ -20,7 +20,7 @@ const getOnboardingStatus = async (props: {
}): Promise<SuccessResponse<OnboardingStatusResponse> | ErrorResponse> => { }): Promise<SuccessResponse<OnboardingStatusResponse> | ErrorResponse> => {
const { endpointService, ...rest } = props; const { endpointService, ...rest } = props;
try { try {
const response = await ApiBaseInstance.post( const response = await axios.post(
`/messaging-queues/kafka/onboarding/${endpointService || 'consumers'}`, `/messaging-queues/kafka/onboarding/${endpointService || 'consumers'}`,
rest, rest,
); );

View File

@@ -1,13 +1,20 @@
import axios from 'api'; import { ApiV2Instance } from 'api';
import { ErrorResponseHandlerV2 } from 'api/ErrorResponseHandlerV2';
import { AxiosError } from 'axios';
import { ErrorV2Resp } from 'types/api';
import { PayloadProps, Props } from 'types/api/metrics/getService'; import { PayloadProps, Props } from 'types/api/metrics/getService';
const getService = async (props: Props): Promise<PayloadProps> => { const getService = async (props: Props): Promise<PayloadProps> => {
const response = await axios.post(`/services`, { try {
start: `${props.start}`, const response = await ApiV2Instance.post(`/services`, {
end: `${props.end}`, start: `${props.start}`,
tags: props.selectedTags, end: `${props.end}`,
}); tags: props.selectedTags,
return response.data; });
return response.data.data;
} catch (error) {
ErrorResponseHandlerV2(error as AxiosError<ErrorV2Resp>);
}
}; };
export default getService; export default getService;

View File

@@ -1,22 +1,27 @@
import axios from 'api'; import { ApiV2Instance } from 'api';
import { ErrorResponseHandlerV2 } from 'api/ErrorResponseHandlerV2';
import { AxiosError } from 'axios';
import { ErrorV2Resp } from 'types/api';
import { PayloadProps, Props } from 'types/api/metrics/getTopOperations'; import { PayloadProps, Props } from 'types/api/metrics/getTopOperations';
const getTopOperations = async (props: Props): Promise<PayloadProps> => { const getTopOperations = async (props: Props): Promise<PayloadProps> => {
const endpoint = props.isEntryPoint try {
? '/service/entry_point_operations' const endpoint = props.isEntryPoint
: '/service/top_operations'; ? '/service/entry_point_operations'
: '/service/top_operations';
const response = await axios.post(endpoint, { const response = await ApiV2Instance.post(endpoint, {
start: `${props.start}`, start: `${props.start}`,
end: `${props.end}`, end: `${props.end}`,
service: props.service, service: props.service,
tags: props.selectedTags, tags: props.selectedTags,
}); limit: 5000,
});
if (props.isEntryPoint) {
return response.data.data; return response.data.data;
} catch (error) {
ErrorResponseHandlerV2(error as AxiosError<ErrorV2Resp>);
} }
return response.data;
}; };
export default getTopOperations; export default getTopOperations;

View File

@@ -9,6 +9,7 @@ export interface UpdateMetricMetadataProps {
metricType: MetricType; metricType: MetricType;
temporality?: Temporality; temporality?: Temporality;
isMonotonic?: boolean; isMonotonic?: boolean;
unit?: string;
} }
export interface UpdateMetricMetadataResponse { export interface UpdateMetricMetadataResponse {

View File

@@ -1,4 +1,4 @@
import { ApiBaseInstance } from 'api'; import axios from 'api';
import { ErrorResponseHandler } from 'api/ErrorResponseHandler'; import { ErrorResponseHandler } from 'api/ErrorResponseHandler';
import { AxiosError } from 'axios'; import { AxiosError } from 'axios';
import { ErrorResponse, SuccessResponse } from 'types/api'; import { ErrorResponse, SuccessResponse } from 'types/api';
@@ -9,7 +9,7 @@ const getCustomFilters = async (
): Promise<SuccessResponse<PayloadProps> | ErrorResponse> => { ): Promise<SuccessResponse<PayloadProps> | ErrorResponse> => {
const { signal } = props; const { signal } = props;
try { try {
const response = await ApiBaseInstance.get(`orgs/me/filters/${signal}`); const response = await axios.get(`/orgs/me/filters/${signal}`);
return { return {
statusCode: 200, statusCode: 200,

View File

@@ -1,4 +1,4 @@
import { ApiBaseInstance } from 'api'; import axios from 'api';
import { AxiosError } from 'axios'; import { AxiosError } from 'axios';
import { SuccessResponse } from 'types/api'; import { SuccessResponse } from 'types/api';
import { UpdateCustomFiltersProps } from 'types/api/quickFilters/updateCustomFilters'; import { UpdateCustomFiltersProps } from 'types/api/quickFilters/updateCustomFilters';
@@ -6,7 +6,7 @@ import { UpdateCustomFiltersProps } from 'types/api/quickFilters/updateCustomFil
const updateCustomFiltersAPI = async ( const updateCustomFiltersAPI = async (
props: UpdateCustomFiltersProps, props: UpdateCustomFiltersProps,
): Promise<SuccessResponse<void> | AxiosError> => ): Promise<SuccessResponse<void> | AxiosError> =>
ApiBaseInstance.put(`orgs/me/filters`, { axios.put(`/orgs/me/filters`, {
...props.data, ...props.data,
}); });

View File

@@ -8,7 +8,7 @@ const setRetentionV2 = async ({
type, type,
defaultTTLDays, defaultTTLDays,
coldStorageVolume, coldStorageVolume,
coldStorageDuration, coldStorageDurationDays,
ttlConditions, ttlConditions,
}: PropsV2): Promise<SuccessResponseV2<PayloadPropsV2>> => { }: PropsV2): Promise<SuccessResponseV2<PayloadPropsV2>> => {
try { try {
@@ -16,7 +16,7 @@ const setRetentionV2 = async ({
type, type,
defaultTTLDays, defaultTTLDays,
coldStorageVolume, coldStorageVolume,
coldStorageDuration, coldStorageDurationDays,
ttlConditions, ttlConditions,
}); });

View File

@@ -1,4 +1,4 @@
import { ApiBaseInstance } from 'api'; import axios from 'api';
import { ErrorResponseHandlerV2 } from 'api/ErrorResponseHandlerV2'; import { ErrorResponseHandlerV2 } from 'api/ErrorResponseHandlerV2';
import { AxiosError } from 'axios'; import { AxiosError } from 'axios';
import { ErrorV2Resp, SuccessResponseV2 } from 'types/api'; import { ErrorV2Resp, SuccessResponseV2 } from 'types/api';
@@ -9,15 +9,12 @@ const listOverview = async (
): Promise<SuccessResponseV2<PayloadProps>> => { ): Promise<SuccessResponseV2<PayloadProps>> => {
const { start, end, show_ip: showIp, filter } = props; const { start, end, show_ip: showIp, filter } = props;
try { try {
const response = await ApiBaseInstance.post( const response = await axios.post(`/third-party-apis/overview/list`, {
`/third-party-apis/overview/list`, start,
{ end,
start, show_ip: showIp,
end, filter,
show_ip: showIp, });
filter,
},
);
return { return {
httpStatusCode: response.status, httpStatusCode: response.status,

View File

@@ -0,0 +1,28 @@
import axios from 'api';
import { ErrorResponseHandlerV2 } from 'api/ErrorResponseHandlerV2';
import { AxiosError } from 'axios';
import { ErrorV2Resp, SuccessResponseV2 } from 'types/api';
import {
GetSpanPercentilesProps,
GetSpanPercentilesResponseDataProps,
} from 'types/api/trace/getSpanPercentiles';
const getSpanPercentiles = async (
props: GetSpanPercentilesProps,
): Promise<SuccessResponseV2<GetSpanPercentilesResponseDataProps>> => {
try {
const response = await axios.post('/span_percentile', {
...props,
});
return {
httpStatusCode: response.status,
data: response.data.data,
};
} catch (error) {
ErrorResponseHandlerV2(error as AxiosError<ErrorV2Resp>);
throw error;
}
};
export default getSpanPercentiles;

View File

@@ -11,7 +11,7 @@ import {
export const getQueryRangeV5 = async ( export const getQueryRangeV5 = async (
props: QueryRangePayloadV5, props: QueryRangePayloadV5,
version: string, version: string,
signal: AbortSignal, signal?: AbortSignal,
headers?: Record<string, string>, headers?: Record<string, string>,
): Promise<SuccessResponseV2<MetricRangePayloadV5>> => { ): Promise<SuccessResponseV2<MetricRangePayloadV5>> => {
try { try {

View File

@@ -0,0 +1,371 @@
/* eslint-disable sonarjs/no-duplicate-string */
import { getYAxisFormattedValue, PrecisionOptionsEnum } from '../yAxisConfig';
const testFullPrecisionGetYAxisFormattedValue = (
value: string,
format: string,
): string => getYAxisFormattedValue(value, format, PrecisionOptionsEnum.FULL);
describe('getYAxisFormattedValue - none (full precision legacy assertions)', () => {
test('large integers and decimals', () => {
expect(testFullPrecisionGetYAxisFormattedValue('250034', 'none')).toBe(
'250034',
);
expect(
testFullPrecisionGetYAxisFormattedValue('250034897.12345', 'none'),
).toBe('250034897.12345');
expect(
testFullPrecisionGetYAxisFormattedValue('250034897.02354', 'none'),
).toBe('250034897.02354');
expect(testFullPrecisionGetYAxisFormattedValue('9999999.9999', 'none')).toBe(
'9999999.9999',
);
});
test('preserves leading zeros after decimal until first non-zero', () => {
expect(testFullPrecisionGetYAxisFormattedValue('1.0000234', 'none')).toBe(
'1.0000234',
);
expect(testFullPrecisionGetYAxisFormattedValue('0.00003', 'none')).toBe(
'0.00003',
);
});
test('trims to three significant decimals and removes trailing zeros', () => {
expect(
testFullPrecisionGetYAxisFormattedValue('0.000000250034', 'none'),
).toBe('0.000000250034');
expect(testFullPrecisionGetYAxisFormattedValue('0.00000025', 'none')).toBe(
'0.00000025',
);
// Big precision, limiting the javascript precision (~16 digits)
expect(
testFullPrecisionGetYAxisFormattedValue('1.0000000000000001', 'none'),
).toBe('1');
expect(
testFullPrecisionGetYAxisFormattedValue('1.00555555559595876', 'none'),
).toBe('1.005555555595958');
expect(testFullPrecisionGetYAxisFormattedValue('0.000000001', 'none')).toBe(
'0.000000001',
);
expect(
testFullPrecisionGetYAxisFormattedValue('0.000000250000', 'none'),
).toBe('0.00000025');
});
test('whole numbers normalize', () => {
expect(testFullPrecisionGetYAxisFormattedValue('1000', 'none')).toBe('1000');
expect(testFullPrecisionGetYAxisFormattedValue('99.5458', 'none')).toBe(
'99.5458',
);
expect(testFullPrecisionGetYAxisFormattedValue('1.234567', 'none')).toBe(
'1.234567',
);
expect(testFullPrecisionGetYAxisFormattedValue('99.998', 'none')).toBe(
'99.998',
);
});
test('strip redundant decimal zeros', () => {
expect(testFullPrecisionGetYAxisFormattedValue('1000.000', 'none')).toBe(
'1000',
);
expect(testFullPrecisionGetYAxisFormattedValue('99.500', 'none')).toBe(
'99.5',
);
expect(testFullPrecisionGetYAxisFormattedValue('1.000', 'none')).toBe('1');
});
test('edge values', () => {
expect(testFullPrecisionGetYAxisFormattedValue('0', 'none')).toBe('0');
expect(testFullPrecisionGetYAxisFormattedValue('-0', 'none')).toBe('0');
expect(testFullPrecisionGetYAxisFormattedValue('Infinity', 'none')).toBe('∞');
expect(testFullPrecisionGetYAxisFormattedValue('-Infinity', 'none')).toBe(
'-∞',
);
expect(testFullPrecisionGetYAxisFormattedValue('invalid', 'none')).toBe(
'NaN',
);
expect(testFullPrecisionGetYAxisFormattedValue('', 'none')).toBe('NaN');
expect(testFullPrecisionGetYAxisFormattedValue('abc123', 'none')).toBe('NaN');
});
test('small decimals keep precision as-is', () => {
expect(testFullPrecisionGetYAxisFormattedValue('0.0001', 'none')).toBe(
'0.0001',
);
expect(testFullPrecisionGetYAxisFormattedValue('-0.0001', 'none')).toBe(
'-0.0001',
);
expect(testFullPrecisionGetYAxisFormattedValue('0.000000001', 'none')).toBe(
'0.000000001',
);
});
test('simple decimals preserved', () => {
expect(testFullPrecisionGetYAxisFormattedValue('0.1', 'none')).toBe('0.1');
expect(testFullPrecisionGetYAxisFormattedValue('0.2', 'none')).toBe('0.2');
expect(testFullPrecisionGetYAxisFormattedValue('0.3', 'none')).toBe('0.3');
expect(testFullPrecisionGetYAxisFormattedValue('1.0000000001', 'none')).toBe(
'1.0000000001',
);
});
});
describe('getYAxisFormattedValue - units (full precision legacy assertions)', () => {
test('ms', () => {
expect(testFullPrecisionGetYAxisFormattedValue('1500', 'ms')).toBe('1.5 s');
expect(testFullPrecisionGetYAxisFormattedValue('500', 'ms')).toBe('500 ms');
expect(testFullPrecisionGetYAxisFormattedValue('60000', 'ms')).toBe('1 min');
expect(testFullPrecisionGetYAxisFormattedValue('295.429', 'ms')).toBe(
'295.429 ms',
);
expect(testFullPrecisionGetYAxisFormattedValue('4353.81', 'ms')).toBe(
'4.35381 s',
);
});
test('s', () => {
expect(testFullPrecisionGetYAxisFormattedValue('90', 's')).toBe('1.5 mins');
expect(testFullPrecisionGetYAxisFormattedValue('30', 's')).toBe('30 s');
expect(testFullPrecisionGetYAxisFormattedValue('3600', 's')).toBe('1 hour');
});
test('m', () => {
expect(testFullPrecisionGetYAxisFormattedValue('90', 'm')).toBe('1.5 hours');
expect(testFullPrecisionGetYAxisFormattedValue('30', 'm')).toBe('30 min');
expect(testFullPrecisionGetYAxisFormattedValue('1440', 'm')).toBe('1 day');
});
test('bytes', () => {
expect(testFullPrecisionGetYAxisFormattedValue('1024', 'bytes')).toBe(
'1 KiB',
);
expect(testFullPrecisionGetYAxisFormattedValue('512', 'bytes')).toBe('512 B');
expect(testFullPrecisionGetYAxisFormattedValue('1536', 'bytes')).toBe(
'1.5 KiB',
);
});
test('mbytes', () => {
expect(testFullPrecisionGetYAxisFormattedValue('1024', 'mbytes')).toBe(
'1 GiB',
);
expect(testFullPrecisionGetYAxisFormattedValue('512', 'mbytes')).toBe(
'512 MiB',
);
expect(testFullPrecisionGetYAxisFormattedValue('1536', 'mbytes')).toBe(
'1.5 GiB',
);
});
test('kbytes', () => {
expect(testFullPrecisionGetYAxisFormattedValue('1024', 'kbytes')).toBe(
'1 MiB',
);
expect(testFullPrecisionGetYAxisFormattedValue('512', 'kbytes')).toBe(
'512 KiB',
);
expect(testFullPrecisionGetYAxisFormattedValue('1536', 'kbytes')).toBe(
'1.5 MiB',
);
});
test('short', () => {
expect(testFullPrecisionGetYAxisFormattedValue('1000', 'short')).toBe('1 K');
expect(testFullPrecisionGetYAxisFormattedValue('1500', 'short')).toBe(
'1.5 K',
);
expect(testFullPrecisionGetYAxisFormattedValue('999', 'short')).toBe('999');
expect(testFullPrecisionGetYAxisFormattedValue('1000000', 'short')).toBe(
'1 Mil',
);
expect(testFullPrecisionGetYAxisFormattedValue('1555600', 'short')).toBe(
'1.5556 Mil',
);
expect(testFullPrecisionGetYAxisFormattedValue('999999', 'short')).toBe(
'999.999 K',
);
expect(testFullPrecisionGetYAxisFormattedValue('1000000000', 'short')).toBe(
'1 Bil',
);
expect(testFullPrecisionGetYAxisFormattedValue('1500000000', 'short')).toBe(
'1.5 Bil',
);
expect(testFullPrecisionGetYAxisFormattedValue('999999999', 'short')).toBe(
'999.999999 Mil',
);
});
test('percent', () => {
expect(testFullPrecisionGetYAxisFormattedValue('0.15', 'percent')).toBe(
'0.15%',
);
expect(testFullPrecisionGetYAxisFormattedValue('0.1234', 'percent')).toBe(
'0.1234%',
);
expect(testFullPrecisionGetYAxisFormattedValue('0.123499', 'percent')).toBe(
'0.123499%',
);
expect(testFullPrecisionGetYAxisFormattedValue('1.5', 'percent')).toBe(
'1.5%',
);
expect(testFullPrecisionGetYAxisFormattedValue('0.0001', 'percent')).toBe(
'0.0001%',
);
expect(
testFullPrecisionGetYAxisFormattedValue('0.000000001', 'percent'),
).toBe('1e-9%');
expect(
testFullPrecisionGetYAxisFormattedValue('0.000000250034', 'percent'),
).toBe('0.000000250034%');
expect(testFullPrecisionGetYAxisFormattedValue('0.00000025', 'percent')).toBe(
'0.00000025%',
);
// Big precision, limiting the javascript precision (~16 digits)
expect(
testFullPrecisionGetYAxisFormattedValue('1.0000000000000001', 'percent'),
).toBe('1%');
expect(
testFullPrecisionGetYAxisFormattedValue('1.00555555559595876', 'percent'),
).toBe('1.005555555595958%');
});
test('ratio', () => {
expect(testFullPrecisionGetYAxisFormattedValue('0.5', 'ratio')).toBe(
'0.5 ratio',
);
expect(testFullPrecisionGetYAxisFormattedValue('1.25', 'ratio')).toBe(
'1.25 ratio',
);
expect(testFullPrecisionGetYAxisFormattedValue('2.0', 'ratio')).toBe(
'2 ratio',
);
});
test('temperature units', () => {
expect(testFullPrecisionGetYAxisFormattedValue('25', 'celsius')).toBe(
'25 °C',
);
expect(testFullPrecisionGetYAxisFormattedValue('0', 'celsius')).toBe('0 °C');
expect(testFullPrecisionGetYAxisFormattedValue('-10', 'celsius')).toBe(
'-10 °C',
);
expect(testFullPrecisionGetYAxisFormattedValue('77', 'fahrenheit')).toBe(
'77 °F',
);
expect(testFullPrecisionGetYAxisFormattedValue('32', 'fahrenheit')).toBe(
'32 °F',
);
expect(testFullPrecisionGetYAxisFormattedValue('14', 'fahrenheit')).toBe(
'14 °F',
);
});
test('ms edge cases', () => {
expect(testFullPrecisionGetYAxisFormattedValue('0', 'ms')).toBe('0 ms');
expect(testFullPrecisionGetYAxisFormattedValue('-1500', 'ms')).toBe('-1.5 s');
expect(testFullPrecisionGetYAxisFormattedValue('Infinity', 'ms')).toBe('∞');
});
test('bytes edge cases', () => {
expect(testFullPrecisionGetYAxisFormattedValue('0', 'bytes')).toBe('0 B');
expect(testFullPrecisionGetYAxisFormattedValue('-1024', 'bytes')).toBe(
'-1 KiB',
);
});
});
describe('getYAxisFormattedValue - precision option tests', () => {
test('precision 0 drops decimal part', () => {
expect(getYAxisFormattedValue('1.2345', 'none', 0)).toBe('1');
expect(getYAxisFormattedValue('0.9999', 'none', 0)).toBe('0');
expect(getYAxisFormattedValue('12345.6789', 'none', 0)).toBe('12345');
expect(getYAxisFormattedValue('0.0000123456', 'none', 0)).toBe('0');
expect(getYAxisFormattedValue('1000.000', 'none', 0)).toBe('1000');
expect(getYAxisFormattedValue('0.000000250034', 'none', 0)).toBe('0');
expect(getYAxisFormattedValue('1.00555555559595876', 'none', 0)).toBe('1');
// with unit
expect(getYAxisFormattedValue('4353.81', 'ms', 0)).toBe('4 s');
});
test('precision 1,2,3,4 decimals', () => {
expect(getYAxisFormattedValue('1.2345', 'none', 1)).toBe('1.2');
expect(getYAxisFormattedValue('1.2345', 'none', 2)).toBe('1.23');
expect(getYAxisFormattedValue('1.2345', 'none', 3)).toBe('1.234');
expect(getYAxisFormattedValue('1.2345', 'none', 4)).toBe('1.2345');
expect(getYAxisFormattedValue('0.0000123456', 'none', 1)).toBe('0.00001');
expect(getYAxisFormattedValue('0.0000123456', 'none', 2)).toBe('0.000012');
expect(getYAxisFormattedValue('0.0000123456', 'none', 3)).toBe('0.0000123');
expect(getYAxisFormattedValue('0.0000123456', 'none', 4)).toBe('0.00001234');
expect(getYAxisFormattedValue('1000.000', 'none', 1)).toBe('1000');
expect(getYAxisFormattedValue('1000.000', 'none', 2)).toBe('1000');
expect(getYAxisFormattedValue('1000.000', 'none', 3)).toBe('1000');
expect(getYAxisFormattedValue('1000.000', 'none', 4)).toBe('1000');
expect(getYAxisFormattedValue('0.000000250034', 'none', 1)).toBe('0.0000002');
expect(getYAxisFormattedValue('0.000000250034', 'none', 2)).toBe(
'0.00000025',
); // leading zeros + 2 significant => same trimmed
expect(getYAxisFormattedValue('0.000000250034', 'none', 3)).toBe(
'0.00000025',
);
expect(getYAxisFormattedValue('0.000000250304', 'none', 4)).toBe(
'0.0000002503',
);
expect(getYAxisFormattedValue('1.00555555559595876', 'none', 1)).toBe(
'1.005',
);
expect(getYAxisFormattedValue('1.00555555559595876', 'none', 2)).toBe(
'1.0055',
);
expect(getYAxisFormattedValue('1.00555555559595876', 'none', 3)).toBe(
'1.00555',
);
expect(getYAxisFormattedValue('1.00555555559595876', 'none', 4)).toBe(
'1.005555',
);
// with unit
expect(getYAxisFormattedValue('4353.81', 'ms', 1)).toBe('4.4 s');
expect(getYAxisFormattedValue('4353.81', 'ms', 2)).toBe('4.35 s');
expect(getYAxisFormattedValue('4353.81', 'ms', 3)).toBe('4.354 s');
expect(getYAxisFormattedValue('4353.81', 'ms', 4)).toBe('4.3538 s');
// Percentages
expect(getYAxisFormattedValue('0.123456', 'percent', 2)).toBe('0.12%');
expect(getYAxisFormattedValue('0.123456', 'percent', 4)).toBe('0.1235%'); // approximation
});
test('precision full uses up to DEFAULT_SIGNIFICANT_DIGITS significant digits', () => {
expect(
getYAxisFormattedValue(
'0.00002625429914148441',
'none',
PrecisionOptionsEnum.FULL,
),
).toBe('0.000026254299141');
expect(
getYAxisFormattedValue(
'0.000026254299141484417',
's',
PrecisionOptionsEnum.FULL,
),
).toBe('26254299141484417000000 µs');
expect(
getYAxisFormattedValue('4353.81', 'ms', PrecisionOptionsEnum.FULL),
).toBe('4.35381 s');
expect(getYAxisFormattedValue('500', 'ms', PrecisionOptionsEnum.FULL)).toBe(
'500 ms',
);
});
});

View File

@@ -1,58 +1,158 @@
/* eslint-disable sonarjs/cognitive-complexity */
import { formattedValueToString, getValueFormat } from '@grafana/data'; import { formattedValueToString, getValueFormat } from '@grafana/data';
import * as Sentry from '@sentry/react';
import { isNaN } from 'lodash-es';
const DEFAULT_SIGNIFICANT_DIGITS = 15;
// max decimals to keep should not exceed 15 decimal places to avoid floating point precision issues
const MAX_DECIMALS = 15;
export enum PrecisionOptionsEnum {
ZERO = 0,
ONE = 1,
TWO = 2,
THREE = 3,
FOUR = 4,
FULL = 'full',
}
export type PrecisionOption = 0 | 1 | 2 | 3 | 4 | PrecisionOptionsEnum.FULL;
/**
* Formats a number for display, preserving leading zeros after the decimal point
* and showing up to DEFAULT_SIGNIFICANT_DIGITS digits after the first non-zero decimal digit.
* It avoids scientific notation and removes unnecessary trailing zeros.
*
* @example
* formatDecimalWithLeadingZeros(1.2345); // "1.2345"
* formatDecimalWithLeadingZeros(0.0012345); // "0.0012345"
* formatDecimalWithLeadingZeros(5.0); // "5"
*
* @param value The number to format.
* @returns The formatted string.
*/
const formatDecimalWithLeadingZeros = (
value: number,
precision: PrecisionOption,
): string => {
if (value === 0) {
return '0';
}
// Use toLocaleString to get a full decimal representation without scientific notation.
const numStr = value.toLocaleString('en-US', {
useGrouping: false,
maximumFractionDigits: 20,
});
const [integerPart, decimalPart = ''] = numStr.split('.');
// If there's no decimal part, the integer part is the result.
if (!decimalPart) {
return integerPart;
}
// Find the index of the first non-zero digit in the decimal part.
const firstNonZeroIndex = decimalPart.search(/[^0]/);
// If the decimal part consists only of zeros, return just the integer part.
if (firstNonZeroIndex === -1) {
return integerPart;
}
// Determine the number of decimals to keep: leading zeros + up to N significant digits.
const significantDigits =
precision === PrecisionOptionsEnum.FULL
? DEFAULT_SIGNIFICANT_DIGITS
: precision;
const decimalsToKeep = firstNonZeroIndex + (significantDigits || 0);
// max decimals to keep should not exceed 15 decimal places to avoid floating point precision issues
const finalDecimalsToKeep = Math.min(decimalsToKeep, MAX_DECIMALS);
const trimmedDecimalPart = decimalPart.substring(0, finalDecimalsToKeep);
// If precision is 0, we drop the decimal part entirely.
if (precision === 0) {
return integerPart;
}
// Remove any trailing zeros from the result to keep it clean.
const finalDecimalPart = trimmedDecimalPart.replace(/0+$/, '');
// Return the integer part, or the integer and decimal parts combined.
return finalDecimalPart ? `${integerPart}.${finalDecimalPart}` : integerPart;
};
/**
* Formats a Y-axis value based on a given format string.
*
* @param value The string value from the axis.
* @param format The format identifier (e.g. 'none', 'ms', 'bytes', 'short').
* @returns A formatted string ready for display.
*/
export const getYAxisFormattedValue = ( export const getYAxisFormattedValue = (
value: string, value: string,
format: string, format: string,
precision: PrecisionOption = 2, // default precision requested
): string => { ): string => {
let decimalPrecision: number | undefined; const numValue = parseFloat(value);
const parsedValue = getValueFormat(format)(
parseFloat(value), // Handle non-numeric or special values first.
undefined, if (isNaN(numValue)) return 'NaN';
undefined, if (numValue === Infinity) return '∞';
undefined, if (numValue === -Infinity) return '-∞';
);
try { const decimalPlaces = value.split('.')[1]?.length || undefined;
const decimalSplitted = parsedValue.text.split('.');
if (decimalSplitted.length === 1) { // Use custom formatter for the 'none' format honoring precision
decimalPrecision = 0; if (format === 'none') {
} else { return formatDecimalWithLeadingZeros(numValue, precision);
const decimalDigits = decimalSplitted[1].split(''); }
decimalPrecision = decimalDigits.length;
let nonZeroCtr = 0; // For all other standard formats, delegate to grafana/data's built-in formatter.
for (let idx = 0; idx < decimalDigits.length; idx += 1) { const computeDecimals = (): number | undefined => {
if (decimalDigits[idx] !== '0') { if (precision === PrecisionOptionsEnum.FULL) {
nonZeroCtr += 1; return decimalPlaces && decimalPlaces >= DEFAULT_SIGNIFICANT_DIGITS
if (nonZeroCtr >= 2) { ? decimalPlaces
decimalPrecision = idx + 1; : DEFAULT_SIGNIFICANT_DIGITS;
}
} else if (nonZeroCtr) {
decimalPrecision = idx;
break;
}
}
} }
return precision;
};
return formattedValueToString( const fallbackFormat = (): string => {
getValueFormat(format)( if (precision === PrecisionOptionsEnum.FULL) return numValue.toString();
parseFloat(value), if (precision === 0) return Math.round(numValue).toString();
decimalPrecision, return precision !== undefined
undefined, ? numValue
undefined, .toFixed(precision)
), .replace(/(\.[0-9]*[1-9])0+$/, '$1') // trimming zeros
); .replace(/\.$/, '')
} catch (error) { : numValue.toString();
console.error(error); };
}
return `${parseFloat(value)}`;
};
export const getToolTipValue = (value: string, format?: string): string => {
try { try {
return formattedValueToString( const formatter = getValueFormat(format);
getValueFormat(format)(parseFloat(value), undefined, undefined, undefined), const formattedValue = formatter(numValue, computeDecimals(), undefined);
); if (formattedValue.text && formattedValue.text.includes('.')) {
formattedValue.text = formatDecimalWithLeadingZeros(
parseFloat(formattedValue.text),
precision,
);
}
return formattedValueToString(formattedValue);
} catch (error) { } catch (error) {
console.error(error); Sentry.captureEvent({
message: `Error applying formatter: ${
error instanceof Error ? error.message : 'Unknown error'
}`,
level: 'error',
});
return fallbackFormat();
} }
return `${value}`;
}; };
export const getToolTipValue = (
value: string | number,
format?: string,
precision?: PrecisionOption,
): string =>
getYAxisFormattedValue(value?.toString(), format || 'none', precision);

View File

@@ -60,6 +60,14 @@ function Metrics({
setElement, setElement,
} = useMultiIntersectionObserver(hostWidgetInfo.length, { threshold: 0.1 }); } = useMultiIntersectionObserver(hostWidgetInfo.length, { threshold: 0.1 });
const legendScrollPositionRef = useRef<{
scrollTop: number;
scrollLeft: number;
}>({
scrollTop: 0,
scrollLeft: 0,
});
const queryPayloads = useMemo( const queryPayloads = useMemo(
() => () =>
getHostQueryPayload( getHostQueryPayload(
@@ -147,6 +155,13 @@ function Metrics({
maxTimeScale: graphTimeIntervals[idx].end, maxTimeScale: graphTimeIntervals[idx].end,
onDragSelect: (start, end) => onDragSelect(start, end, idx), onDragSelect: (start, end) => onDragSelect(start, end, idx),
query: currentQuery, query: currentQuery,
legendScrollPosition: legendScrollPositionRef.current,
setLegendScrollPosition: (position: {
scrollTop: number;
scrollLeft: number;
}) => {
legendScrollPositionRef.current = position;
},
}), }),
), ),
[ [

View File

@@ -132,9 +132,9 @@
justify-content: center; justify-content: center;
} }
.json-action-btn { .log-detail-drawer__actions {
display: flex; display: flex;
gap: 8px; gap: 4px;
} }
} }

View File

@@ -319,31 +319,35 @@ function LogDetailInner({
</Radio.Button> </Radio.Button>
</Radio.Group> </Radio.Group>
{selectedView === VIEW_TYPES.JSON && ( <div className="log-detail-drawer__actions">
<div className="json-action-btn"> {selectedView === VIEW_TYPES.CONTEXT && (
<Tooltip
title="Show Filters"
placement="topLeft"
aria-label="Show Filters"
>
<Button
className="action-btn"
icon={<Filter size={16} />}
onClick={handleFilterVisible}
/>
</Tooltip>
)}
<Tooltip
title={selectedView === VIEW_TYPES.JSON ? 'Copy JSON' : 'Copy Log Link'}
placement="topLeft"
aria-label={
selectedView === VIEW_TYPES.JSON ? 'Copy JSON' : 'Copy Log Link'
}
>
<Button <Button
className="action-btn" className="action-btn"
icon={<Copy size={16} />} icon={<Copy size={16} />}
onClick={handleJSONCopy} onClick={selectedView === VIEW_TYPES.JSON ? handleJSONCopy : onLogCopy}
/> />
</div> </Tooltip>
)} </div>
{selectedView === VIEW_TYPES.CONTEXT && (
<Button
className="action-btn"
icon={<Filter size={16} />}
onClick={handleFilterVisible}
/>
)}
<Tooltip title="Copy Log Link" placement="left" aria-label="Copy Log Link">
<Button
className="action-btn"
icon={<Copy size={16} />}
onClick={onLogCopy}
/>
</Tooltip>
</div> </div>
{isFilterVisible && contextQuery?.builder.queryData[0] && ( {isFilterVisible && contextQuery?.builder.queryData[0] && (
<div className="log-detail-drawer-query-container"> <div className="log-detail-drawer-query-container">
@@ -383,7 +387,8 @@ function LogDetailInner({
podName={log.resources_string?.[RESOURCE_KEYS.POD_NAME] || ''} podName={log.resources_string?.[RESOURCE_KEYS.POD_NAME] || ''}
nodeName={log.resources_string?.[RESOURCE_KEYS.NODE_NAME] || ''} nodeName={log.resources_string?.[RESOURCE_KEYS.NODE_NAME] || ''}
hostName={log.resources_string?.[RESOURCE_KEYS.HOST_NAME] || ''} hostName={log.resources_string?.[RESOURCE_KEYS.HOST_NAME] || ''}
logLineTimestamp={log.timestamp.toString()} timestamp={log.timestamp.toString()}
dataSource={DataSource.LOGS}
/> />
)} )}
</Drawer> </Drawer>

View File

@@ -57,8 +57,8 @@ export const RawLogViewContainer = styled(Row)<{
transition: background-color 2s ease-in;` transition: background-color 2s ease-in;`
: ''} : ''}
${({ $isCustomHighlighted, $isDarkMode, $logType }): string => ${({ $isCustomHighlighted }): string =>
getCustomHighlightBackground($isCustomHighlighted, $isDarkMode, $logType)} getCustomHighlightBackground($isCustomHighlighted)}
`; `;
export const InfoIconWrapper = styled(Info)` export const InfoIconWrapper = styled(Info)`

View File

@@ -153,7 +153,9 @@ export const useTableView = (props: UseTableViewProps): UseTableViewResult => {
children: ( children: (
<TableBodyContent <TableBodyContent
dangerouslySetInnerHTML={{ dangerouslySetInnerHTML={{
__html: getSanitizedLogBody(field as string), __html: getSanitizedLogBody(field as string, {
shouldEscapeHtml: true,
}),
}} }}
fontSize={fontSize} fontSize={fontSize}
linesPerRow={linesPerRow} linesPerRow={linesPerRow}

View File

@@ -32,6 +32,7 @@ import { popupContainer } from 'utils/selectPopupContainer';
import { CustomMultiSelectProps, CustomTagProps, OptionData } from './types'; import { CustomMultiSelectProps, CustomTagProps, OptionData } from './types';
import { import {
ALL_SELECTED_VALUE,
filterOptionsBySearch, filterOptionsBySearch,
handleScrollToBottom, handleScrollToBottom,
prioritizeOrAddOptionForMultiSelect, prioritizeOrAddOptionForMultiSelect,
@@ -43,8 +44,6 @@ enum ToggleTagValue {
All = 'All', All = 'All',
} }
const ALL_SELECTED_VALUE = '__ALL__'; // Constant for the special value
const CustomMultiSelect: React.FC<CustomMultiSelectProps> = ({ const CustomMultiSelect: React.FC<CustomMultiSelectProps> = ({
placeholder = 'Search...', placeholder = 'Search...',
className, className,

View File

@@ -5,6 +5,8 @@ import { OptionData } from './types';
export const SPACEKEY = ' '; export const SPACEKEY = ' ';
export const ALL_SELECTED_VALUE = '__ALL__'; // Constant for the special value
export const prioritizeOrAddOptionForSingleSelect = ( export const prioritizeOrAddOptionForSingleSelect = (
options: OptionData[], options: OptionData[],
value: string, value: string,

View File

@@ -398,7 +398,7 @@
} }
.qb-search-container { .qb-search-container {
.metrics-select-container { .metrics-container {
margin-bottom: 12px; margin-bottom: 12px;
} }
} }

View File

@@ -22,6 +22,8 @@ export const QueryBuilderV2 = memo(function QueryBuilderV2({
showOnlyWhereClause = false, showOnlyWhereClause = false,
showTraceOperator = false, showTraceOperator = false,
version, version,
onSignalSourceChange,
signalSourceChangeEnabled = false,
}: QueryBuilderProps): JSX.Element { }: QueryBuilderProps): JSX.Element {
const { const {
currentQuery, currentQuery,
@@ -175,6 +177,8 @@ export const QueryBuilderV2 = memo(function QueryBuilderV2({
queryVariant={config?.queryVariant || 'dropdown'} queryVariant={config?.queryVariant || 'dropdown'}
showOnlyWhereClause={showOnlyWhereClause} showOnlyWhereClause={showOnlyWhereClause}
isListViewPanel={isListViewPanel} isListViewPanel={isListViewPanel}
onSignalSourceChange={onSignalSourceChange || ((): void => {})}
signalSourceChangeEnabled={signalSourceChangeEnabled}
/> />
) : ( ) : (
currentQuery.builder.queryData.map((query, index) => ( currentQuery.builder.queryData.map((query, index) => (
@@ -193,7 +197,9 @@ export const QueryBuilderV2 = memo(function QueryBuilderV2({
queryVariant={config?.queryVariant || 'dropdown'} queryVariant={config?.queryVariant || 'dropdown'}
showOnlyWhereClause={showOnlyWhereClause} showOnlyWhereClause={showOnlyWhereClause}
isListViewPanel={isListViewPanel} isListViewPanel={isListViewPanel}
signalSource={config?.signalSource || ''} signalSource={query.source as 'meter' | ''}
onSignalSourceChange={onSignalSourceChange || ((): void => {})}
signalSourceChangeEnabled={signalSourceChangeEnabled}
/> />
)) ))
)} )}

View File

@@ -1,5 +1,14 @@
.metrics-select-container { .metrics-source-select-container {
margin-bottom: 8px; margin-bottom: 8px;
display: flex;
flex-direction: row;
align-items: flex-start;
gap: 8px;
width: 100%;
.source-selector {
width: 120px;
}
.ant-select-selector { .ant-select-selector {
width: 100%; width: 100%;
@@ -42,7 +51,7 @@
} }
.lightMode { .lightMode {
.metrics-select-container { .metrics-source-select-container {
.ant-select-selector { .ant-select-selector {
border: 1px solid var(--bg-vanilla-300) !important; border: 1px solid var(--bg-vanilla-300) !important;
background: var(--bg-vanilla-100); background: var(--bg-vanilla-100);

View File

@@ -1,34 +1,121 @@
import './MetricsSelect.styles.scss'; import './MetricsSelect.styles.scss';
import { Select } from 'antd';
import {
initialQueriesMap,
initialQueryMeterWithType,
PANEL_TYPES,
} from 'constants/queryBuilder';
import { AggregatorFilter } from 'container/QueryBuilder/filters'; import { AggregatorFilter } from 'container/QueryBuilder/filters';
import { useQueryBuilder } from 'hooks/queryBuilder/useQueryBuilder';
import { useQueryOperations } from 'hooks/queryBuilder/useQueryBuilderOperations'; import { useQueryOperations } from 'hooks/queryBuilder/useQueryBuilderOperations';
import { memo } from 'react'; import { memo, useCallback, useMemo, useState } from 'react';
import { BaseAutocompleteData } from 'types/api/queryBuilder/queryAutocompleteResponse';
import { IBuilderQuery } from 'types/api/queryBuilder/queryBuilderData'; import { IBuilderQuery } from 'types/api/queryBuilder/queryBuilderData';
import { DataSource } from 'types/common/queryBuilder';
import { SelectOption } from 'types/common/select';
export const SOURCE_OPTIONS: SelectOption<string, string>[] = [
{ value: 'metrics', label: 'Metrics' },
{ value: 'meter', label: 'Meter' },
];
export const MetricsSelect = memo(function MetricsSelect({ export const MetricsSelect = memo(function MetricsSelect({
query, query,
index, index,
version, version,
signalSource, signalSource,
onSignalSourceChange,
signalSourceChangeEnabled = false,
}: { }: {
query: IBuilderQuery; query: IBuilderQuery;
index: number; index: number;
version: string; version: string;
signalSource: 'meter' | ''; signalSource: 'meter' | '';
onSignalSourceChange: (value: string) => void;
signalSourceChangeEnabled: boolean;
}): JSX.Element { }): JSX.Element {
const [attributeKeys, setAttributeKeys] = useState<BaseAutocompleteData[]>([]);
const { handleChangeAggregatorAttribute } = useQueryOperations({ const { handleChangeAggregatorAttribute } = useQueryOperations({
index, index,
query, query,
entityVersion: version, entityVersion: version,
}); });
const handleAggregatorAttributeChange = useCallback(
(value: BaseAutocompleteData, isEditMode?: boolean) => {
handleChangeAggregatorAttribute(value, isEditMode, attributeKeys || []);
},
[handleChangeAggregatorAttribute, attributeKeys],
);
const { updateAllQueriesOperators, handleSetQueryData } = useQueryBuilder();
const source = useMemo(
() => (signalSource === 'meter' ? 'meter' : 'metrics'),
[signalSource],
);
const defaultMeterQuery = useMemo(
() =>
updateAllQueriesOperators(
initialQueryMeterWithType,
PANEL_TYPES.BAR,
DataSource.METRICS,
'meter' as 'meter' | '',
),
[updateAllQueriesOperators],
);
const defaultMetricsQuery = useMemo(
() =>
updateAllQueriesOperators(
initialQueriesMap.metrics,
PANEL_TYPES.BAR,
DataSource.METRICS,
'',
),
[updateAllQueriesOperators],
);
const handleSignalSourceChange = (value: string): void => {
onSignalSourceChange(value);
handleSetQueryData(
index,
value === 'meter'
? {
...defaultMeterQuery.builder.queryData[0],
source: 'meter',
queryName: query.queryName,
}
: {
...defaultMetricsQuery.builder.queryData[0],
source: '',
queryName: query.queryName,
},
);
};
return ( return (
<div className="metrics-select-container"> <div className="metrics-source-select-container">
{signalSourceChangeEnabled && (
<Select
className="source-selector"
placeholder="Source"
options={SOURCE_OPTIONS}
value={source}
defaultValue="metrics"
onChange={handleSignalSourceChange}
/>
)}
<AggregatorFilter <AggregatorFilter
onChange={handleChangeAggregatorAttribute} onChange={handleAggregatorAttributeChange}
query={query} query={query}
index={index} index={index}
signalSource={signalSource || ''} signalSource={signalSource || ''}
setAttributeKeys={setAttributeKeys}
/> />
</div> </div>
); );

View File

@@ -500,7 +500,10 @@ function QueryAddOns({
} }
value={addOn} value={addOn}
> >
<div className="add-on-tab-title"> <div
className="add-on-tab-title"
data-testid={`query-add-on-${addOn.key}`}
>
{addOn.icon} {addOn.icon}
{addOn.label} {addOn.label}
</div> </div>

View File

@@ -33,7 +33,13 @@ export const QueryV2 = memo(function QueryV2({
showOnlyWhereClause = false, showOnlyWhereClause = false,
signalSource = '', signalSource = '',
isMultiQueryAllowed = false, isMultiQueryAllowed = false,
}: QueryProps & { ref: React.RefObject<HTMLDivElement> }): JSX.Element { onSignalSourceChange,
signalSourceChangeEnabled = false,
}: QueryProps & {
ref: React.RefObject<HTMLDivElement>;
onSignalSourceChange: (value: string) => void;
signalSourceChangeEnabled: boolean;
}): JSX.Element {
const { cloneQuery, panelType } = useQueryBuilder(); const { cloneQuery, panelType } = useQueryBuilder();
const showFunctions = query?.functions?.length > 0; const showFunctions = query?.functions?.length > 0;
@@ -207,12 +213,14 @@ export const QueryV2 = memo(function QueryV2({
<div className="qb-elements-container"> <div className="qb-elements-container">
<div className="qb-search-container"> <div className="qb-search-container">
{dataSource === DataSource.METRICS && ( {dataSource === DataSource.METRICS && (
<div className="metrics-select-container"> <div className="metrics-container">
<MetricsSelect <MetricsSelect
query={query} query={query}
index={index} index={index}
version={ENTITY_VERSION_V5} version={ENTITY_VERSION_V5}
signalSource={signalSource as 'meter' | ''} signalSource={signalSource as 'meter' | ''}
onSignalSourceChange={onSignalSourceChange}
signalSourceChangeEnabled={signalSourceChangeEnabled}
/> />
</div> </div>
)} )}
@@ -258,7 +266,7 @@ export const QueryV2 = memo(function QueryV2({
panelType={panelType} panelType={panelType}
query={query} query={query}
index={index} index={index}
key={`metrics-aggregate-section-${query.queryName}-${query.dataSource}`} key={`metrics-aggregate-section-${query.queryName}-${query.dataSource}-${signalSource}`}
version="v4" version="v4"
signalSource={signalSource as 'meter' | ''} signalSource={signalSource as 'meter' | ''}
/> />

View File

@@ -224,7 +224,7 @@ export const convertFiltersToExpressionWithExistingQuery = (
const visitedPairs: Set<string> = new Set(); // Set to track visited query pairs const visitedPairs: Set<string> = new Set(); // Set to track visited query pairs
// Map extracted query pairs to key-specific pair information for faster access // Map extracted query pairs to key-specific pair information for faster access
let queryPairsMap = getQueryPairsMap(existingQuery.trim()); let queryPairsMap = getQueryPairsMap(existingQuery);
filters?.items?.forEach((filter) => { filters?.items?.forEach((filter) => {
const { key, op, value } = filter; const { key, op, value } = filter;
@@ -309,7 +309,7 @@ export const convertFiltersToExpressionWithExistingQuery = (
)}${OPERATORS.IN} ${formattedValue} ${modifiedQuery.slice( )}${OPERATORS.IN} ${formattedValue} ${modifiedQuery.slice(
notInPair.position.valueEnd + 1, notInPair.position.valueEnd + 1,
)}`; )}`;
queryPairsMap = getQueryPairsMap(modifiedQuery.trim()); queryPairsMap = getQueryPairsMap(modifiedQuery);
} }
shouldAddToNonExisting = false; // Don't add this to non-existing filters shouldAddToNonExisting = false; // Don't add this to non-existing filters
} else if ( } else if (

View File

@@ -45,6 +45,12 @@
flex-direction: column; flex-direction: column;
gap: 8px; gap: 8px;
.filter-separator {
height: 1px;
background-color: var(--bg-slate-400);
margin: 4px 0;
}
.value { .value {
display: flex; display: flex;
align-items: center; align-items: center;
@@ -177,6 +183,12 @@
} }
} }
} }
.values {
.filter-separator {
background-color: var(--bg-vanilla-300);
}
}
} }
} }

View File

@@ -0,0 +1,191 @@
import { FiltersType, QuickFiltersSource } from 'components/QuickFilters/types';
import { useGetAggregateValues } from 'hooks/queryBuilder/useGetAggregateValues';
import { useQueryBuilder } from 'hooks/queryBuilder/useQueryBuilder';
import { useGetQueryKeyValueSuggestions } from 'hooks/querySuggestions/useGetQueryKeyValueSuggestions';
import { quickFiltersAttributeValuesResponse } from 'mocks-server/__mockdata__/customQuickFilters';
import { rest, server } from 'mocks-server/server';
import { UseQueryResult } from 'react-query';
import { render, screen, userEvent, waitFor } from 'tests/test-utils';
import { SuccessResponse } from 'types/api';
import { IAttributeValuesResponse } from 'types/api/queryBuilder/getAttributesValues';
import { DataTypes } from 'types/api/queryBuilder/queryAutocompleteResponse';
import { DataSource } from 'types/common/queryBuilder';
import CheckboxFilter from './Checkbox';
// Mock the query builder hook
jest.mock('hooks/queryBuilder/useQueryBuilder');
const mockUseQueryBuilder = jest.mocked(useQueryBuilder);
// Mock the aggregate values hook
jest.mock('hooks/queryBuilder/useGetAggregateValues');
const mockUseGetAggregateValues = jest.mocked(useGetAggregateValues);
// Mock the key value suggestions hook
jest.mock('hooks/querySuggestions/useGetQueryKeyValueSuggestions');
const mockUseGetQueryKeyValueSuggestions = jest.mocked(
useGetQueryKeyValueSuggestions,
);
interface MockFilterConfig {
title: string;
attributeKey: {
key: string;
dataType: DataTypes;
type: string;
};
dataSource: DataSource;
defaultOpen: boolean;
type: FiltersType;
}
const createMockFilter = (
overrides: Partial<MockFilterConfig> = {},
): MockFilterConfig => ({
// eslint-disable-next-line sonarjs/no-duplicate-string
title: 'Service Name',
attributeKey: {
key: 'service.name',
dataType: DataTypes.String,
type: 'resource',
},
dataSource: DataSource.LOGS,
defaultOpen: false,
type: FiltersType.CHECKBOX,
...overrides,
});
const createMockQueryBuilderData = (hasActiveFilters = false): any => ({
lastUsedQuery: 0,
currentQuery: {
builder: {
queryData: [
{
filters: {
items: hasActiveFilters
? [
{
key: {
key: 'service.name',
dataType: DataTypes.String,
type: 'resource',
},
op: 'in',
value: ['otel-demo', 'sample-flask'],
},
]
: [],
},
},
],
},
},
redirectWithQueryBuilderData: jest.fn(),
});
describe('CheckboxFilter - User Flows', () => {
beforeEach(() => {
// Reset all mocks
jest.clearAllMocks();
// Default mock implementations using the same structure as existing tests
mockUseGetAggregateValues.mockReturnValue({
data: {
payload: {
stringAttributeValues: [
'mq-kafka',
'otel-demo',
'otlp-python',
'sample-flask',
],
},
},
isLoading: false,
} as UseQueryResult<SuccessResponse<IAttributeValuesResponse>>);
mockUseGetQueryKeyValueSuggestions.mockReturnValue({
data: null,
isLoading: false,
} as any);
// Setup MSW server for API calls
server.use(
rest.get('*/api/v3/autocomplete/attribute_values', (_req, res, ctx) =>
res(ctx.status(200), ctx.json(quickFiltersAttributeValuesResponse)),
),
);
});
it('should auto-open filter and prioritize checked items with visual separator when user opens page with active filters', async () => {
// Mock query builder with active filters
mockUseQueryBuilder.mockReturnValue(createMockQueryBuilderData(true) as any);
const mockFilter = createMockFilter({ defaultOpen: false });
render(
<CheckboxFilter
filter={mockFilter}
source={QuickFiltersSource.LOGS_EXPLORER}
/>,
);
// User should see the filter is automatically opened (not collapsed)
expect(screen.getByText('Service Name')).toBeInTheDocument();
await waitFor(() => {
// eslint-disable-next-line sonarjs/no-duplicate-string
expect(screen.getByPlaceholderText('Filter values')).toBeInTheDocument();
});
// User should see visual separator between checked and unchecked items
expect(screen.getByTestId('filter-separator')).toBeInTheDocument();
// User should see checked items at the top
await waitFor(() => {
const checkboxes = screen.getAllByRole('checkbox');
expect(checkboxes).toHaveLength(4); // Ensure we have exactly 4 checkboxes
expect(checkboxes[0]).toBeChecked(); // otel-demo should be first and checked
expect(checkboxes[1]).toBeChecked(); // sample-flask should be second and checked
expect(checkboxes[2]).not.toBeChecked(); // mq-kafka should be unchecked
expect(checkboxes[3]).not.toBeChecked(); // otlp-python should be unchecked
});
});
it('should respect user preference when user manually toggles filter over auto-open behavior', async () => {
const user = userEvent.setup({ pointerEventsCheck: 0 });
// Mock query builder with active filters
mockUseQueryBuilder.mockReturnValue(createMockQueryBuilderData(true) as any);
const mockFilter = createMockFilter({ defaultOpen: false });
render(
<CheckboxFilter
filter={mockFilter}
source={QuickFiltersSource.LOGS_EXPLORER}
/>,
);
// Initially auto-opened due to active filters
await waitFor(() => {
expect(screen.getByPlaceholderText('Filter values')).toBeInTheDocument();
});
// User manually closes the filter
await user.click(screen.getByText('Service Name'));
// User should see filter is now closed (respecting user preference)
expect(
screen.queryByPlaceholderText('Filter values'),
).not.toBeInTheDocument();
// User manually opens the filter again
await user.click(screen.getByText('Service Name'));
// User should see filter is now open (respecting user preference)
await waitFor(() => {
expect(screen.getByPlaceholderText('Filter values')).toBeInTheDocument();
});
});
});

View File

@@ -21,7 +21,7 @@ import { useGetQueryKeyValueSuggestions } from 'hooks/querySuggestions/useGetQue
import useDebouncedFn from 'hooks/useDebouncedFunction'; import useDebouncedFn from 'hooks/useDebouncedFunction';
import { cloneDeep, isArray, isEqual, isFunction } from 'lodash-es'; import { cloneDeep, isArray, isEqual, isFunction } from 'lodash-es';
import { ChevronDown, ChevronRight } from 'lucide-react'; import { ChevronDown, ChevronRight } from 'lucide-react';
import { useMemo, useState } from 'react'; import { Fragment, useMemo, useState } from 'react';
import { DataTypes } from 'types/api/queryBuilder/queryAutocompleteResponse'; import { DataTypes } from 'types/api/queryBuilder/queryAutocompleteResponse';
import { Query, TagFilterItem } from 'types/api/queryBuilder/queryBuilderData'; import { Query, TagFilterItem } from 'types/api/queryBuilder/queryBuilderData';
import { DataSource } from 'types/common/queryBuilder'; import { DataSource } from 'types/common/queryBuilder';
@@ -54,7 +54,8 @@ interface ICheckboxProps {
export default function CheckboxFilter(props: ICheckboxProps): JSX.Element { export default function CheckboxFilter(props: ICheckboxProps): JSX.Element {
const { source, filter, onFilterChange } = props; const { source, filter, onFilterChange } = props;
const [searchText, setSearchText] = useState<string>(''); const [searchText, setSearchText] = useState<string>('');
const [isOpen, setIsOpen] = useState<boolean>(filter.defaultOpen); // null = no user action, true = user opened, false = user closed
const [userToggleState, setUserToggleState] = useState<boolean | null>(null);
const [visibleItemsCount, setVisibleItemsCount] = useState<number>(10); const [visibleItemsCount, setVisibleItemsCount] = useState<number>(10);
const { const {
@@ -63,6 +64,33 @@ export default function CheckboxFilter(props: ICheckboxProps): JSX.Element {
redirectWithQueryBuilderData, redirectWithQueryBuilderData,
} = useQueryBuilder(); } = useQueryBuilder();
// Check if this filter has active filters in the query
const isSomeFilterPresentForCurrentAttribute = useMemo(
() =>
currentQuery.builder.queryData?.[
lastUsedQuery || 0
]?.filters?.items?.some((item) =>
isEqual(item.key?.key, filter.attributeKey.key),
),
[currentQuery.builder.queryData, lastUsedQuery, filter.attributeKey.key],
);
// Derive isOpen from filter state + user action
const isOpen = useMemo(() => {
// If user explicitly toggled, respect that
if (userToggleState !== null) return userToggleState;
// Auto-open if this filter has active filters in the query
if (isSomeFilterPresentForCurrentAttribute) return true;
// Otherwise use default behavior (first 2 filters open)
return filter.defaultOpen;
}, [
userToggleState,
isSomeFilterPresentForCurrentAttribute,
filter.defaultOpen,
]);
const { data, isLoading } = useGetAggregateValues( const { data, isLoading } = useGetAggregateValues(
{ {
aggregateOperator: filter.aggregateOperator || 'noop', aggregateOperator: filter.aggregateOperator || 'noop',
@@ -128,8 +156,6 @@ export default function CheckboxFilter(props: ICheckboxProps): JSX.Element {
); );
}, [data?.payload, filter.attributeKey.dataType, keyValueSuggestions, source]); }, [data?.payload, filter.attributeKey.dataType, keyValueSuggestions, source]);
const currentAttributeKeys = attributeValues.slice(0, visibleItemsCount);
const setSearchTextDebounced = useDebouncedFn((...args) => { const setSearchTextDebounced = useDebouncedFn((...args) => {
setSearchText(args[0] as string); setSearchText(args[0] as string);
}, DEBOUNCE_DELAY); }, DEBOUNCE_DELAY);
@@ -202,6 +228,23 @@ export default function CheckboxFilter(props: ICheckboxProps): JSX.Element {
const isMultipleValuesTrueForTheKey = const isMultipleValuesTrueForTheKey =
Object.values(currentFilterState).filter((val) => val).length > 1; Object.values(currentFilterState).filter((val) => val).length > 1;
// Sort checked items to the top, then unchecked items
const currentAttributeKeys = useMemo(() => {
const checkedValues = attributeValues.filter(
(val) => currentFilterState[val],
);
const uncheckedValues = attributeValues.filter(
(val) => !currentFilterState[val],
);
return [...checkedValues, ...uncheckedValues].slice(0, visibleItemsCount);
}, [attributeValues, currentFilterState, visibleItemsCount]);
// Count of checked values in the currently visible items
const checkedValuesCount = useMemo(
() => currentAttributeKeys.filter((val) => currentFilterState[val]).length,
[currentAttributeKeys, currentFilterState],
);
const handleClearFilterAttribute = (): void => { const handleClearFilterAttribute = (): void => {
const preparedQuery: Query = { const preparedQuery: Query = {
...currentQuery, ...currentQuery,
@@ -235,12 +278,6 @@ export default function CheckboxFilter(props: ICheckboxProps): JSX.Element {
} }
}; };
const isSomeFilterPresentForCurrentAttribute = currentQuery.builder.queryData?.[
lastUsedQuery || 0
]?.filters?.items?.some((item) =>
isEqual(item.key?.key, filter.attributeKey.key),
);
const onChange = ( const onChange = (
value: string, value: string,
checked: boolean, checked: boolean,
@@ -490,10 +527,10 @@ export default function CheckboxFilter(props: ICheckboxProps): JSX.Element {
className="filter-header-checkbox" className="filter-header-checkbox"
onClick={(): void => { onClick={(): void => {
if (isOpen) { if (isOpen) {
setIsOpen(false); setUserToggleState(false);
setVisibleItemsCount(10); setVisibleItemsCount(10);
} else { } else {
setIsOpen(true); setUserToggleState(true);
} }
}} }}
> >
@@ -540,50 +577,59 @@ export default function CheckboxFilter(props: ICheckboxProps): JSX.Element {
)} )}
{attributeValues.length > 0 ? ( {attributeValues.length > 0 ? (
<section className="values"> <section className="values">
{currentAttributeKeys.map((value: string) => ( {currentAttributeKeys.map((value: string, index: number) => (
<div key={value} className="value"> <Fragment key={value}>
<Checkbox {index === checkedValuesCount && checkedValuesCount > 0 && (
onChange={(e): void => onChange(value, e.target.checked, false)} <div
checked={currentFilterState[value]} key="separator"
disabled={isFilterDisabled} className="filter-separator"
rootClassName="check-box" data-testid="filter-separator"
/> />
)}
<div className="value">
<Checkbox
onChange={(e): void => onChange(value, e.target.checked, false)}
checked={currentFilterState[value]}
disabled={isFilterDisabled}
rootClassName="check-box"
/>
<div <div
className={cx( className={cx(
'checkbox-value-section', 'checkbox-value-section',
isFilterDisabled ? 'filter-disabled' : '', isFilterDisabled ? 'filter-disabled' : '',
)} )}
onClick={(): void => { onClick={(): void => {
if (isFilterDisabled) { if (isFilterDisabled) {
return; return;
} }
onChange(value, currentFilterState[value], true); onChange(value, currentFilterState[value], true);
}} }}
> >
<div className={`${filter.title} label-${value}`} /> <div className={`${filter.title} label-${value}`} />
{filter.customRendererForValue ? ( {filter.customRendererForValue ? (
filter.customRendererForValue(value) filter.customRendererForValue(value)
) : ( ) : (
<Typography.Text <Typography.Text
className="value-string" className="value-string"
ellipsis={{ tooltip: { placement: 'right' } }} ellipsis={{ tooltip: { placement: 'right' } }}
> >
{String(value)} {String(value)}
</Typography.Text> </Typography.Text>
)} )}
<Button type="text" className="only-btn"> <Button type="text" className="only-btn">
{isSomeFilterPresentForCurrentAttribute {isSomeFilterPresentForCurrentAttribute
? currentFilterState[value] && !isMultipleValuesTrueForTheKey ? currentFilterState[value] && !isMultipleValuesTrueForTheKey
? 'All' ? 'All'
: 'Only' : 'Only'
: 'Only'} : 'Only'}
</Button> </Button>
<Button type="text" className="toggle-btn"> <Button type="text" className="toggle-btn">
Toggle Toggle
</Button> </Button>
</div>
</div> </div>
</div> </Fragment>
))} ))}
</section> </section>
) : isEmptyStateWithDocsEnabled ? ( ) : isEmptyStateWithDocsEnabled ? (

View File

@@ -18,11 +18,6 @@ import UPlot from 'uplot';
import { dataMatch, optionsUpdateState } from './utils'; import { dataMatch, optionsUpdateState } from './utils';
// Extended uPlot interface with custom properties
interface ExtendedUPlot extends uPlot {
_legendScrollCleanup?: () => void;
}
export interface UplotProps { export interface UplotProps {
options: uPlot.Options; options: uPlot.Options;
data: uPlot.AlignedData; data: uPlot.AlignedData;
@@ -71,12 +66,6 @@ const Uplot = forwardRef<ToggleGraphProps | undefined, UplotProps>(
const destroy = useCallback((chart: uPlot | null) => { const destroy = useCallback((chart: uPlot | null) => {
if (chart) { if (chart) {
// Clean up legend scroll event listener
const extendedChart = chart as ExtendedUPlot;
if (extendedChart._legendScrollCleanup) {
extendedChart._legendScrollCleanup();
}
onDeleteRef.current?.(chart); onDeleteRef.current?.(chart);
chart.destroy(); chart.destroy();
chartRef.current = null; chartRef.current = null;

View File

@@ -12,6 +12,7 @@ function YAxisUnitSelector({
onChange, onChange,
placeholder = 'Please select a unit', placeholder = 'Please select a unit',
loading = false, loading = false,
'data-testid': dataTestId,
}: YAxisUnitSelectorProps): JSX.Element { }: YAxisUnitSelectorProps): JSX.Element {
const universalUnit = mapMetricUnitToUniversalUnit(value); const universalUnit = mapMetricUnitToUniversalUnit(value);
@@ -45,6 +46,7 @@ function YAxisUnitSelector({
placeholder={placeholder} placeholder={placeholder}
filterOption={(input, option): boolean => handleSearch(input, option)} filterOption={(input, option): boolean => handleSearch(input, option)}
loading={loading} loading={loading}
data-testid={dataTestId}
> >
{Y_AXIS_CATEGORIES.map((category) => ( {Y_AXIS_CATEGORIES.map((category) => (
<Select.OptGroup key={category.name} label={category.name}> <Select.OptGroup key={category.name} label={category.name}>

View File

@@ -4,6 +4,7 @@ export interface YAxisUnitSelectorProps {
placeholder?: string; placeholder?: string;
loading?: boolean; loading?: boolean;
disabled?: boolean; disabled?: boolean;
'data-testid'?: string;
} }
export enum UniversalYAxisUnit { export enum UniversalYAxisUnit {

View File

@@ -24,6 +24,7 @@ export const DATE_TIME_FORMATS = {
TIME_SECONDS: 'HH:mm:ss', TIME_SECONDS: 'HH:mm:ss',
TIME_UTC: 'HH:mm:ss (UTC Z)', TIME_UTC: 'HH:mm:ss (UTC Z)',
TIME_UTC_MS: 'HH:mm:ss.SSS (UTC Z)', TIME_UTC_MS: 'HH:mm:ss.SSS (UTC Z)',
TIME_SPAN_PERCENTILE: 'HH:mm:ss MMM DD',
// Short date formats // Short date formats
DATE_SHORT: 'MM/DD', DATE_SHORT: 'MM/DD',

View File

@@ -50,4 +50,5 @@ export enum QueryParams {
tab = 'tab', tab = 'tab',
thresholds = 'thresholds', thresholds = 'thresholds',
selectedExplorerView = 'selectedExplorerView', selectedExplorerView = 'selectedExplorerView',
variables = 'variables',
} }

View File

@@ -86,7 +86,11 @@ export const REACT_QUERY_KEY = {
SPAN_LOGS: 'SPAN_LOGS', SPAN_LOGS: 'SPAN_LOGS',
SPAN_BEFORE_LOGS: 'SPAN_BEFORE_LOGS', SPAN_BEFORE_LOGS: 'SPAN_BEFORE_LOGS',
SPAN_AFTER_LOGS: 'SPAN_AFTER_LOGS', SPAN_AFTER_LOGS: 'SPAN_AFTER_LOGS',
TRACE_ONLY_LOGS: 'TRACE_ONLY_LOGS',
// Routing Policies Query Keys // Routing Policies Query Keys
GET_ROUTING_POLICIES: 'GET_ROUTING_POLICIES', GET_ROUTING_POLICIES: 'GET_ROUTING_POLICIES',
// Span Percentiles Query Keys
GET_SPAN_PERCENTILES: 'GET_SPAN_PERCENTILES',
} as const; } as const;

View File

@@ -3,4 +3,5 @@ export const USER_PREFERENCES = {
NAV_SHORTCUTS: 'nav_shortcuts', NAV_SHORTCUTS: 'nav_shortcuts',
LAST_SEEN_CHANGELOG_VERSION: 'last_seen_changelog_version', LAST_SEEN_CHANGELOG_VERSION: 'last_seen_changelog_version',
SPAN_DETAILS_PINNED_ATTRIBUTES: 'span_details_pinned_attributes', SPAN_DETAILS_PINNED_ATTRIBUTES: 'span_details_pinned_attributes',
SPAN_PERCENTILE_RESOURCE_ATTRIBUTES: 'span_percentile_resource_attributes',
}; };

View File

@@ -1,4 +1,5 @@
import { Select } from 'antd'; import { Select } from 'antd';
import { ENTITY_VERSION_V5 } from 'constants/app';
import { initialQueriesMap } from 'constants/queryBuilder'; import { initialQueriesMap } from 'constants/queryBuilder';
import { import {
getAllEndpointsWidgetData, getAllEndpointsWidgetData,
@@ -264,6 +265,7 @@ function AllEndPoints({
customOnDragSelect={(): void => {}} customOnDragSelect={(): void => {}}
customTimeRange={timeRange} customTimeRange={timeRange}
customOnRowClick={onRowClick} customOnRowClick={onRowClick}
version={ENTITY_VERSION_V5}
/> />
</div> </div>
</div> </div>

View File

@@ -1,5 +1,6 @@
import { ENTITY_VERSION_V4 } from 'constants/app'; import { ENTITY_VERSION_V4, ENTITY_VERSION_V5 } from 'constants/app';
import { initialQueriesMap } from 'constants/queryBuilder'; import { initialQueriesMap } from 'constants/queryBuilder';
import { REACT_QUERY_KEY } from 'constants/reactQueryKeys';
import { useApiMonitoringParams } from 'container/ApiMonitoring/queryParams'; import { useApiMonitoringParams } from 'container/ApiMonitoring/queryParams';
import { import {
END_POINT_DETAILS_QUERY_KEYS_ARRAY, END_POINT_DETAILS_QUERY_KEYS_ARRAY,
@@ -178,18 +179,33 @@ function EndPointDetails({
[domainName, filters, minTime, maxTime], [domainName, filters, minTime, maxTime],
); );
const V5_QUERIES = [
REACT_QUERY_KEY.GET_ENDPOINT_STATUS_CODE_DATA,
REACT_QUERY_KEY.GET_ENDPOINT_STATUS_CODE_BAR_CHARTS_DATA,
REACT_QUERY_KEY.GET_ENDPOINT_STATUS_CODE_LATENCY_BAR_CHARTS_DATA,
REACT_QUERY_KEY.GET_ENDPOINT_METRICS_DATA,
REACT_QUERY_KEY.GET_ENDPOINT_DEPENDENT_SERVICES_DATA,
REACT_QUERY_KEY.GET_ENDPOINT_DROPDOWN_DATA,
] as const;
const endPointDetailsDataQueries = useQueries( const endPointDetailsDataQueries = useQueries(
endPointDetailsQueryPayload.map((payload, index) => ({ endPointDetailsQueryPayload.map((payload, index) => {
queryKey: [ const queryKey = END_POINT_DETAILS_QUERY_KEYS_ARRAY[index];
END_POINT_DETAILS_QUERY_KEYS_ARRAY[index], const version = (V5_QUERIES as readonly string[]).includes(queryKey)
payload, ? ENTITY_VERSION_V5
filters?.items, // Include filters.items in queryKey for better caching : ENTITY_VERSION_V4;
ENTITY_VERSION_V4, return {
], queryKey: [
queryFn: (): Promise<SuccessResponse<MetricRangePayloadProps>> => END_POINT_DETAILS_QUERY_KEYS_ARRAY[index],
GetMetricQueryRange(payload, ENTITY_VERSION_V4), payload,
enabled: !!payload, ...(filters?.items?.length ? filters.items : []), // Include filters.items in queryKey for better caching
})), version,
],
queryFn: (): Promise<SuccessResponse<MetricRangePayloadProps>> =>
GetMetricQueryRange(payload, version),
enabled: !!payload,
};
}),
); );
const [ const [

View File

@@ -1,8 +1,10 @@
import { LoadingOutlined } from '@ant-design/icons'; import { LoadingOutlined } from '@ant-design/icons';
import { Spin, Switch, Table, Tooltip, Typography } from 'antd'; import { Spin, Switch, Table, Tooltip, Typography } from 'antd';
import { getQueryRangeV5 } from 'api/v5/queryRange/getQueryRange';
import { MetricRangePayloadV5, ScalarData } from 'api/v5/v5';
import { useNavigateToExplorer } from 'components/CeleryTask/useNavigateToExplorer'; import { useNavigateToExplorer } from 'components/CeleryTask/useNavigateToExplorer';
import { withErrorBoundary } from 'components/ErrorBoundaryHOC'; import { withErrorBoundary } from 'components/ErrorBoundaryHOC';
import { DEFAULT_ENTITY_VERSION, ENTITY_VERSION_V4 } from 'constants/app'; import { ENTITY_VERSION_V5 } from 'constants/app';
import { REACT_QUERY_KEY } from 'constants/reactQueryKeys'; import { REACT_QUERY_KEY } from 'constants/reactQueryKeys';
import { import {
END_POINT_DETAILS_QUERY_KEYS_ARRAY, END_POINT_DETAILS_QUERY_KEYS_ARRAY,
@@ -11,13 +13,12 @@ import {
getTopErrorsColumnsConfig, getTopErrorsColumnsConfig,
getTopErrorsCoRelationQueryFilters, getTopErrorsCoRelationQueryFilters,
getTopErrorsQueryPayload, getTopErrorsQueryPayload,
TopErrorsResponseRow,
} from 'container/ApiMonitoring/utils'; } from 'container/ApiMonitoring/utils';
import { GetMetricQueryRange } from 'lib/dashboard/getQueryResults'; import { GetMetricQueryRange } from 'lib/dashboard/getQueryResults';
import { Info } from 'lucide-react'; import { Info } from 'lucide-react';
import { useMemo, useState } from 'react'; import { useMemo, useState } from 'react';
import { useQueries } from 'react-query'; import { QueryFunctionContext, useQueries, useQuery } from 'react-query';
import { SuccessResponse } from 'types/api'; import { SuccessResponse, SuccessResponseV2 } from 'types/api';
import { MetricRangePayloadProps } from 'types/api/metrics/getQueryRange'; import { MetricRangePayloadProps } from 'types/api/metrics/getQueryRange';
import { DataTypes } from 'types/api/queryBuilder/queryAutocompleteResponse'; import { DataTypes } from 'types/api/queryBuilder/queryAutocompleteResponse';
import { IBuilderQuery } from 'types/api/queryBuilder/queryBuilderData'; import { IBuilderQuery } from 'types/api/queryBuilder/queryBuilderData';
@@ -46,7 +47,7 @@ function TopErrors({
true, true,
); );
const queryPayloads = useMemo( const queryPayload = useMemo(
() => () =>
getTopErrorsQueryPayload( getTopErrorsQueryPayload(
domainName, domainName,
@@ -55,6 +56,10 @@ function TopErrors({
{ {
items: endPointName items: endPointName
? [ ? [
// Remove any existing http.url filters from initialFilters to avoid duplicates
...(initialFilters?.items?.filter(
(item) => item.key?.key !== SPAN_ATTRIBUTES.URL_PATH,
) || []),
{ {
id: '92b8a1c1', id: '92b8a1c1',
key: { key: {
@@ -65,7 +70,6 @@ function TopErrors({
op: '=', op: '=',
value: endPointName, value: endPointName,
}, },
...(initialFilters?.items || []),
] ]
: [...(initialFilters?.items || [])], : [...(initialFilters?.items || [])],
op: 'AND', op: 'AND',
@@ -82,37 +86,34 @@ function TopErrors({
], ],
); );
const topErrorsDataQueries = useQueries(
queryPayloads.map((payload) => ({
queryKey: [
REACT_QUERY_KEY.GET_TOP_ERRORS_BY_DOMAIN,
payload,
DEFAULT_ENTITY_VERSION,
showStatusCodeErrors,
],
queryFn: (): Promise<SuccessResponse<MetricRangePayloadProps>> =>
GetMetricQueryRange(payload, DEFAULT_ENTITY_VERSION),
enabled: !!payload,
staleTime: 0,
cacheTime: 0,
})),
);
const topErrorsDataQuery = topErrorsDataQueries[0];
const { const {
data: topErrorsData, data: topErrorsData,
isLoading, isLoading,
isRefetching, isRefetching,
isError, isError,
refetch, refetch,
} = topErrorsDataQuery; } = useQuery({
queryKey: [
REACT_QUERY_KEY.GET_TOP_ERRORS_BY_DOMAIN,
queryPayload,
ENTITY_VERSION_V5,
showStatusCodeErrors,
],
queryFn: ({
signal,
}: QueryFunctionContext): Promise<SuccessResponseV2<MetricRangePayloadV5>> =>
getQueryRangeV5(queryPayload, ENTITY_VERSION_V5, signal),
enabled: !!queryPayload,
staleTime: 0,
cacheTime: 0,
});
const topErrorsColumnsConfig = useMemo(() => getTopErrorsColumnsConfig(), []); const topErrorsColumnsConfig = useMemo(() => getTopErrorsColumnsConfig(), []);
const formattedTopErrorsData = useMemo( const formattedTopErrorsData = useMemo(
() => () =>
formatTopErrorsDataForTable( formatTopErrorsDataForTable(
topErrorsData?.payload?.data?.result as TopErrorsResponseRow[], topErrorsData?.data?.data?.data?.results[0] as ScalarData,
), ),
[topErrorsData], [topErrorsData],
); );
@@ -130,12 +131,12 @@ function TopErrors({
const endPointDropDownDataQueries = useQueries( const endPointDropDownDataQueries = useQueries(
endPointDropDownQueryPayload.map((payload) => ({ endPointDropDownQueryPayload.map((payload) => ({
queryKey: [ queryKey: [
END_POINT_DETAILS_QUERY_KEYS_ARRAY[4], END_POINT_DETAILS_QUERY_KEYS_ARRAY[2],
payload, payload,
ENTITY_VERSION_V4, ENTITY_VERSION_V5,
], ],
queryFn: (): Promise<SuccessResponse<MetricRangePayloadProps>> => queryFn: (): Promise<SuccessResponse<MetricRangePayloadProps>> =>
GetMetricQueryRange(payload, ENTITY_VERSION_V4), GetMetricQueryRange(payload, ENTITY_VERSION_V5),
enabled: !!payload, enabled: !!payload,
staleTime: 60 * 1000, staleTime: 60 * 1000,
})), })),

View File

@@ -0,0 +1,337 @@
/* eslint-disable @typescript-eslint/no-explicit-any */
/* eslint-disable react/jsx-props-no-spreading */
/* eslint-disable prefer-destructuring */
/* eslint-disable sonarjs/no-duplicate-string */
import { render, screen, waitFor } from '@testing-library/react';
import { TraceAggregation } from 'api/v5/v5';
import { ENTITY_VERSION_V5 } from 'constants/app';
import { GetMetricQueryRange } from 'lib/dashboard/getQueryResults';
import { QueryClient, QueryClientProvider } from 'react-query';
import { IBuilderQuery } from 'types/api/queryBuilder/queryBuilderData';
import DomainMetrics from './DomainMetrics';
// Mock the API call
jest.mock('lib/dashboard/getQueryResults', () => ({
GetMetricQueryRange: jest.fn(),
}));
// Mock ErrorState component
jest.mock('./ErrorState', () => ({
__esModule: true,
default: jest.fn(({ refetch }) => (
<div data-testid="error-state">
<button type="button" onClick={refetch} data-testid="retry-button">
Retry
</button>
</div>
)),
}));
describe('DomainMetrics - V5 Query Payload Tests', () => {
let queryClient: QueryClient;
const mockProps = {
domainName: '0.0.0.0',
timeRange: {
startTime: 1758259531000,
endTime: 1758261331000,
},
domainListFilters: {
items: [],
op: 'AND' as const,
} as IBuilderQuery['filters'],
};
const mockSuccessResponse = {
statusCode: 200,
error: null,
payload: {
data: {
result: [
{
table: {
rows: [
{
data: {
A: '150',
B: '125000000',
D: '2021-01-01T23:00:00Z',
F1: '5.5',
},
},
],
},
},
],
},
},
};
beforeEach(() => {
queryClient = new QueryClient({
defaultOptions: {
queries: {
retry: false,
cacheTime: 0,
},
},
});
jest.clearAllMocks();
});
afterEach(() => {
queryClient.clear();
});
const renderComponent = (props = mockProps): ReturnType<typeof render> =>
render(
<QueryClientProvider client={queryClient}>
<DomainMetrics {...props} />
</QueryClientProvider>,
);
describe('1. V5 Query Payload with Filters', () => {
it('sends correct V5 payload structure with domain name filters', async () => {
(GetMetricQueryRange as jest.Mock).mockResolvedValue(mockSuccessResponse);
renderComponent();
await waitFor(() => {
expect(GetMetricQueryRange).toHaveBeenCalledTimes(1);
});
const [payload, version] = (GetMetricQueryRange as jest.Mock).mock.calls[0];
// Verify it's using V5
expect(version).toBe(ENTITY_VERSION_V5);
// Verify time range
expect(payload.start).toBe(1758259531000);
expect(payload.end).toBe(1758261331000);
// Verify V3 payload structure (getDomainMetricsQueryPayload returns V3 format)
expect(payload.query).toBeDefined();
expect(payload.query.builder).toBeDefined();
expect(payload.query.builder.queryData).toBeDefined();
const queryData = payload.query.builder.queryData;
// Verify Query A - count with URL filter
const queryA = queryData.find((q: any) => q.queryName === 'A');
expect(queryA).toBeDefined();
expect(queryA.dataSource).toBe('traces');
expect(queryA.aggregations?.[0]).toBeDefined();
expect((queryA.aggregations?.[0] as TraceAggregation)?.expression).toBe(
'count()',
);
// Verify exact domain filter expression structure
expect(queryA.filter.expression).toContain(
"(net.peer.name = '0.0.0.0' OR server.address = '0.0.0.0')",
);
expect(queryA.filter.expression).toContain(
'url.full EXISTS OR http.url EXISTS',
);
// Verify Query B - p99 latency
const queryB = queryData.find((q: any) => q.queryName === 'B');
expect(queryB).toBeDefined();
expect(queryB.aggregateOperator).toBe('p99');
expect(queryB.aggregations?.[0]).toBeDefined();
expect((queryB.aggregations?.[0] as TraceAggregation)?.expression).toBe(
'p99(duration_nano)',
);
// Verify exact domain filter expression structure
expect(queryB.filter.expression).toContain(
"(net.peer.name = '0.0.0.0' OR server.address = '0.0.0.0')",
);
// Verify Query C - error count (disabled)
const queryC = queryData.find((q: any) => q.queryName === 'C');
expect(queryC).toBeDefined();
expect(queryC.disabled).toBe(true);
expect(queryC.filter.expression).toContain(
"(net.peer.name = '0.0.0.0' OR server.address = '0.0.0.0')",
);
expect(queryC.aggregations?.[0]).toBeDefined();
expect((queryC.aggregations?.[0] as TraceAggregation)?.expression).toBe(
'count()',
);
expect(queryC.filter.expression).toContain('has_error = true');
// Verify Query D - max timestamp
const queryD = queryData.find((q: any) => q.queryName === 'D');
expect(queryD).toBeDefined();
expect(queryD.aggregateOperator).toBe('max');
expect(queryD.aggregations?.[0]).toBeDefined();
expect((queryD.aggregations?.[0] as TraceAggregation)?.expression).toBe(
'max(timestamp)',
);
// Verify exact domain filter expression structure
expect(queryD.filter.expression).toContain(
"(net.peer.name = '0.0.0.0' OR server.address = '0.0.0.0')",
);
// Verify Formula F1 - error rate calculation
const formulas = payload.query.builder.queryFormulas;
expect(formulas).toBeDefined();
expect(formulas.length).toBeGreaterThan(0);
const formulaF1 = formulas.find((f: any) => f.queryName === 'F1');
expect(formulaF1).toBeDefined();
expect(formulaF1.expression).toBe('(C/A)*100');
});
it('includes custom filters in filter expressions', async () => {
(GetMetricQueryRange as jest.Mock).mockResolvedValue(mockSuccessResponse);
const customFilters: IBuilderQuery['filters'] = {
items: [
{
id: 'test-1',
key: {
key: 'service.name',
dataType: 'string' as any,
type: 'resource',
},
op: '=',
value: 'my-service',
},
{
id: 'test-2',
key: {
key: 'deployment.environment',
dataType: 'string' as any,
type: 'resource',
},
op: '=',
value: 'production',
},
],
op: 'AND' as const,
};
renderComponent({
...mockProps,
domainListFilters: customFilters,
});
await waitFor(() => {
expect(GetMetricQueryRange).toHaveBeenCalled();
});
const [payload] = (GetMetricQueryRange as jest.Mock).mock.calls[0];
const queryData = payload.query.builder.queryData;
// Verify all queries include the custom filters
queryData.forEach((query: any) => {
if (query.filter && query.filter.expression) {
expect(query.filter.expression).toContain('service.name');
expect(query.filter.expression).toContain('my-service');
expect(query.filter.expression).toContain('deployment.environment');
expect(query.filter.expression).toContain('production');
}
});
});
});
describe('2. Data Display State', () => {
it('displays metrics when data is successfully loaded', async () => {
(GetMetricQueryRange as jest.Mock).mockResolvedValue(mockSuccessResponse);
renderComponent();
// Wait for skeletons to disappear
await waitFor(() => {
const skeletons = document.querySelectorAll('.ant-skeleton-button');
expect(skeletons.length).toBe(0);
});
// Verify all metric labels are displayed
expect(screen.getByText('EXTERNAL API')).toBeInTheDocument();
expect(screen.getByText('AVERAGE LATENCY')).toBeInTheDocument();
expect(screen.getByText('ERROR %')).toBeInTheDocument();
expect(screen.getByText('LAST USED')).toBeInTheDocument();
// Verify metric values are displayed
expect(screen.getByText('150')).toBeInTheDocument();
expect(screen.getByText('0.125s')).toBeInTheDocument();
});
});
describe('3. Empty/Missing Data State', () => {
it('displays "-" for missing data values', async () => {
const emptyResponse = {
statusCode: 200,
error: null,
payload: {
data: {
result: [
{
table: {
rows: [],
},
},
],
},
},
};
(GetMetricQueryRange as jest.Mock).mockResolvedValue(emptyResponse);
renderComponent();
await waitFor(() => {
const skeletons = document.querySelectorAll('.ant-skeleton-button');
expect(skeletons.length).toBe(0);
});
// When no data, all values should show "-"
const dashValues = screen.getAllByText('-');
expect(dashValues.length).toBeGreaterThan(0);
});
});
describe('4. Error State', () => {
it('displays error state when API call fails', async () => {
(GetMetricQueryRange as jest.Mock).mockRejectedValue(new Error('API Error'));
renderComponent();
await waitFor(() => {
expect(screen.getByTestId('error-state')).toBeInTheDocument();
});
expect(screen.getByTestId('retry-button')).toBeInTheDocument();
});
it('retries API call when retry button is clicked', async () => {
let callCount = 0;
(GetMetricQueryRange as jest.Mock).mockImplementation(() => {
callCount += 1;
if (callCount === 1) {
return Promise.reject(new Error('API Error'));
}
return Promise.resolve(mockSuccessResponse);
});
renderComponent();
// Wait for error state
await waitFor(() => {
expect(screen.getByTestId('error-state')).toBeInTheDocument();
});
// Click retry
const retryButton = screen.getByTestId('retry-button');
retryButton.click();
// Wait for successful load
await waitFor(() => {
expect(screen.getByText('150')).toBeInTheDocument();
});
expect(callCount).toBe(2);
});
});
});

View File

@@ -1,6 +1,6 @@
import { Color } from '@signozhq/design-tokens'; import { Color } from '@signozhq/design-tokens';
import { Progress, Skeleton, Tooltip, Typography } from 'antd'; import { Progress, Skeleton, Tooltip, Typography } from 'antd';
import { ENTITY_VERSION_V4 } from 'constants/app'; import { ENTITY_VERSION_V5 } from 'constants/app';
import { REACT_QUERY_KEY } from 'constants/reactQueryKeys'; import { REACT_QUERY_KEY } from 'constants/reactQueryKeys';
import { import {
DomainMetricsResponseRow, DomainMetricsResponseRow,
@@ -44,10 +44,10 @@ function DomainMetrics({
queryKey: [ queryKey: [
REACT_QUERY_KEY.GET_DOMAIN_METRICS_DATA, REACT_QUERY_KEY.GET_DOMAIN_METRICS_DATA,
payload, payload,
ENTITY_VERSION_V4, ENTITY_VERSION_V5,
], ],
queryFn: (): Promise<SuccessResponse<MetricRangePayloadProps>> => queryFn: (): Promise<SuccessResponse<MetricRangePayloadProps>> =>
GetMetricQueryRange(payload, ENTITY_VERSION_V4), GetMetricQueryRange(payload, ENTITY_VERSION_V5),
enabled: !!payload, enabled: !!payload,
staleTime: 60 * 1000, // 1 minute stale time : optimize this part staleTime: 60 * 1000, // 1 minute stale time : optimize this part
})), })),
@@ -132,7 +132,9 @@ function DomainMetrics({
) : ( ) : (
<Tooltip title={formattedDomainMetricsData.latency}> <Tooltip title={formattedDomainMetricsData.latency}>
<span className="round-metric-tag"> <span className="round-metric-tag">
{(Number(formattedDomainMetricsData.latency) / 1000).toFixed(3)}s {formattedDomainMetricsData.latency !== '-'
? `${(Number(formattedDomainMetricsData.latency) / 1000).toFixed(3)}s`
: '-'}
</span> </span>
</Tooltip> </Tooltip>
)} )}
@@ -143,23 +145,27 @@ function DomainMetrics({
<Skeleton.Button active size="small" /> <Skeleton.Button active size="small" />
) : ( ) : (
<Tooltip title={formattedDomainMetricsData.errorRate}> <Tooltip title={formattedDomainMetricsData.errorRate}>
<Progress {formattedDomainMetricsData.errorRate !== '-' ? (
status="active" <Progress
percent={Number( status="active"
Number(formattedDomainMetricsData.errorRate).toFixed(2), percent={Number(
)}
strokeLinecap="butt"
size="small"
strokeColor={((): string => {
const errorRatePercent = Number(
Number(formattedDomainMetricsData.errorRate).toFixed(2), Number(formattedDomainMetricsData.errorRate).toFixed(2),
); )}
if (errorRatePercent >= 90) return Color.BG_SAKURA_500; strokeLinecap="butt"
if (errorRatePercent >= 60) return Color.BG_AMBER_500; size="small"
return Color.BG_FOREST_500; strokeColor={((): string => {
})()} const errorRatePercent = Number(
className="progress-bar" Number(formattedDomainMetricsData.errorRate).toFixed(2),
/> );
if (errorRatePercent >= 90) return Color.BG_SAKURA_500;
if (errorRatePercent >= 60) return Color.BG_AMBER_500;
return Color.BG_FOREST_500;
})()}
className="progress-bar"
/>
) : (
'-'
)}
</Tooltip> </Tooltip>
)} )}
</Typography.Text> </Typography.Text>

View File

@@ -0,0 +1,419 @@
/* eslint-disable @typescript-eslint/no-explicit-any */
/* eslint-disable react/jsx-props-no-spreading */
/* eslint-disable prefer-destructuring */
/* eslint-disable sonarjs/no-duplicate-string */
import { render, screen, waitFor } from '@testing-library/react';
import { getEndPointDetailsQueryPayload } from 'container/ApiMonitoring/utils';
import { GetMetricQueryRange } from 'lib/dashboard/getQueryResults';
import { QueryClient, QueryClientProvider, UseQueryResult } from 'react-query';
import { SuccessResponse } from 'types/api';
import EndPointMetrics from './EndPointMetrics';
// Mock the API call
jest.mock('lib/dashboard/getQueryResults', () => ({
GetMetricQueryRange: jest.fn(),
}));
// Mock ErrorState component
jest.mock('./ErrorState', () => ({
__esModule: true,
default: jest.fn(({ refetch }) => (
<div data-testid="error-state">
<button type="button" onClick={refetch} data-testid="retry-button">
Retry
</button>
</div>
)),
}));
describe('EndPointMetrics - V5 Query Payload Tests', () => {
let queryClient: QueryClient;
const mockSuccessResponse = {
statusCode: 200,
error: null,
payload: {
data: {
result: [
{
table: {
rows: [
{
data: {
A: '85.5',
B: '245000000',
D: '2021-01-01T22:30:00Z',
F1: '3.2',
},
},
],
},
},
],
},
},
};
beforeEach(() => {
queryClient = new QueryClient({
defaultOptions: {
queries: {
retry: false,
cacheTime: 0,
},
},
});
jest.clearAllMocks();
});
afterEach(() => {
queryClient.clear();
});
// Helper to create mock query result
const createMockQueryResult = (
response: any,
overrides?: Partial<UseQueryResult<SuccessResponse<any>, unknown>>,
): UseQueryResult<SuccessResponse<any>, unknown> =>
({
data: response,
error: null,
isError: false,
isIdle: false,
isLoading: false,
isLoadingError: false,
isRefetchError: false,
isRefetching: false,
isStale: true,
isSuccess: true,
status: 'success' as const,
dataUpdatedAt: Date.now(),
errorUpdateCount: 0,
errorUpdatedAt: 0,
failureCount: 0,
isFetched: true,
isFetchedAfterMount: true,
isFetching: false,
isPlaceholderData: false,
isPreviousData: false,
refetch: jest.fn(),
remove: jest.fn(),
...overrides,
} as UseQueryResult<SuccessResponse<any>, unknown>);
const renderComponent = (
endPointMetricsDataQuery: UseQueryResult<SuccessResponse<any>, unknown>,
): ReturnType<typeof render> =>
render(
<QueryClientProvider client={queryClient}>
<EndPointMetrics endPointMetricsDataQuery={endPointMetricsDataQuery} />
</QueryClientProvider>,
);
// eslint-disable-next-line sonarjs/cognitive-complexity
describe('1. V5 Query Payload with Filters', () => {
// eslint-disable-next-line sonarjs/cognitive-complexity
it('sends correct V5 payload structure with domain and endpoint filters', async () => {
(GetMetricQueryRange as jest.Mock).mockResolvedValue(mockSuccessResponse);
const domainName = 'api.example.com';
const startTime = 1758259531000;
const endTime = 1758261331000;
const filters = {
items: [],
op: 'AND' as const,
};
// Get the actual payload that would be generated
const payloads = getEndPointDetailsQueryPayload(
domainName,
startTime,
endTime,
filters,
);
// First payload is for endpoint metrics
const metricsPayload = payloads[0];
// Verify it's using the correct structure (V3 format for V5 API)
expect(metricsPayload.query).toBeDefined();
expect(metricsPayload.query.builder).toBeDefined();
expect(metricsPayload.query.builder.queryData).toBeDefined();
const queryData = metricsPayload.query.builder.queryData;
// Verify Query A - rate with domain and client kind filters
const queryA = queryData.find((q: any) => q.queryName === 'A');
expect(queryA).toBeDefined();
if (queryA) {
expect(queryA.dataSource).toBe('traces');
expect(queryA.aggregateOperator).toBe('rate');
expect(queryA.timeAggregation).toBe('rate');
// Verify exact domain filter expression structure
if (queryA.filter) {
expect(queryA.filter.expression).toContain(
"(net.peer.name = 'api.example.com' OR server.address = 'api.example.com')",
);
expect(queryA.filter.expression).toContain("kind_string = 'Client'");
}
}
// Verify Query B - p99 latency with duration_nano
const queryB = queryData.find((q: any) => q.queryName === 'B');
expect(queryB).toBeDefined();
if (queryB) {
expect(queryB.aggregateOperator).toBe('p99');
if (queryB.aggregateAttribute) {
expect(queryB.aggregateAttribute.key).toBe('duration_nano');
}
expect(queryB.timeAggregation).toBe('p99');
// Verify exact domain filter expression structure
if (queryB.filter) {
expect(queryB.filter.expression).toContain(
"(net.peer.name = 'api.example.com' OR server.address = 'api.example.com')",
);
expect(queryB.filter.expression).toContain("kind_string = 'Client'");
}
}
// Verify Query C - error count (disabled)
const queryC = queryData.find((q: any) => q.queryName === 'C');
expect(queryC).toBeDefined();
if (queryC) {
expect(queryC.disabled).toBe(true);
expect(queryC.aggregateOperator).toBe('count');
if (queryC.filter) {
expect(queryC.filter.expression).toContain(
"(net.peer.name = 'api.example.com' OR server.address = 'api.example.com')",
);
expect(queryC.filter.expression).toContain("kind_string = 'Client'");
expect(queryC.filter.expression).toContain('has_error = true');
}
}
// Verify Query D - max timestamp for last used
const queryD = queryData.find((q: any) => q.queryName === 'D');
expect(queryD).toBeDefined();
if (queryD) {
expect(queryD.aggregateOperator).toBe('max');
if (queryD.aggregateAttribute) {
expect(queryD.aggregateAttribute.key).toBe('timestamp');
}
expect(queryD.timeAggregation).toBe('max');
// Verify exact domain filter expression structure
if (queryD.filter) {
expect(queryD.filter.expression).toContain(
"(net.peer.name = 'api.example.com' OR server.address = 'api.example.com')",
);
expect(queryD.filter.expression).toContain("kind_string = 'Client'");
}
}
// Verify Query E - total count (disabled)
const queryE = queryData.find((q: any) => q.queryName === 'E');
expect(queryE).toBeDefined();
if (queryE) {
expect(queryE.disabled).toBe(true);
expect(queryE.aggregateOperator).toBe('count');
if (queryE.aggregateAttribute) {
expect(queryE.aggregateAttribute.key).toBe('span_id');
}
if (queryE.filter) {
expect(queryE.filter.expression).toContain(
"(net.peer.name = 'api.example.com' OR server.address = 'api.example.com')",
);
expect(queryE.filter.expression).toContain("kind_string = 'Client'");
}
}
// Verify Formula F1 - error rate calculation
const formulas = metricsPayload.query.builder.queryFormulas;
expect(formulas).toBeDefined();
expect(formulas.length).toBeGreaterThan(0);
const formulaF1 = formulas.find((f: any) => f.queryName === 'F1');
expect(formulaF1).toBeDefined();
if (formulaF1) {
expect(formulaF1.expression).toBe('(C/E)*100');
expect(formulaF1.disabled).toBe(false);
expect(formulaF1.legend).toBe('error percentage');
}
});
it('includes custom domainListFilters in all query expressions', async () => {
(GetMetricQueryRange as jest.Mock).mockResolvedValue(mockSuccessResponse);
const customFilters = {
items: [
{
id: 'test-1',
key: {
key: 'service.name',
dataType: 'string' as any,
type: 'resource',
},
op: '=',
value: 'payment-service',
},
{
id: 'test-2',
key: {
key: 'deployment.environment',
dataType: 'string' as any,
type: 'resource',
},
op: '=',
value: 'staging',
},
],
op: 'AND' as const,
};
const payloads = getEndPointDetailsQueryPayload(
'api.internal.com',
1758259531000,
1758261331000,
customFilters,
);
const queryData = payloads[0].query.builder.queryData;
// Verify ALL queries (A, B, C, D, E) include the custom filters
const allQueryNames = ['A', 'B', 'C', 'D', 'E'];
allQueryNames.forEach((queryName) => {
const query = queryData.find((q: any) => q.queryName === queryName);
expect(query).toBeDefined();
if (query && query.filter && query.filter.expression) {
// Check for exact filter inclusion
expect(query.filter.expression).toContain('service.name');
expect(query.filter.expression).toContain('payment-service');
expect(query.filter.expression).toContain('deployment.environment');
expect(query.filter.expression).toContain('staging');
// Also verify domain filter is still present
expect(query.filter.expression).toContain(
"(net.peer.name = 'api.internal.com' OR server.address = 'api.internal.com')",
);
// Verify client kind filter is present
expect(query.filter.expression).toContain("kind_string = 'Client'");
}
});
});
});
describe('2. Data Display State', () => {
it('displays metrics when data is successfully loaded', async () => {
const mockQuery = createMockQueryResult(mockSuccessResponse);
renderComponent(mockQuery);
// Wait for skeletons to disappear
await waitFor(() => {
const skeletons = document.querySelectorAll('.ant-skeleton-button');
expect(skeletons.length).toBe(0);
});
// Verify all metric labels are displayed
expect(screen.getByText('Rate')).toBeInTheDocument();
expect(screen.getByText('AVERAGE LATENCY')).toBeInTheDocument();
expect(screen.getByText('ERROR %')).toBeInTheDocument();
expect(screen.getByText('LAST USED')).toBeInTheDocument();
// Verify metric values are displayed
expect(screen.getByText('85.5 ops/sec')).toBeInTheDocument();
expect(screen.getByText('245ms')).toBeInTheDocument();
});
});
describe('3. Empty/Missing Data State', () => {
it("displays '-' for missing data values", async () => {
const emptyResponse = {
statusCode: 200,
error: null,
payload: {
data: {
result: [
{
table: {
rows: [],
},
},
],
},
},
};
const mockQuery = createMockQueryResult(emptyResponse);
renderComponent(mockQuery);
await waitFor(() => {
const skeletons = document.querySelectorAll('.ant-skeleton-button');
expect(skeletons.length).toBe(0);
});
// When no data, all values should show "-"
const dashValues = screen.getAllByText('-');
// Should have at least 2 dashes (rate and last used - latency shows "-", error % shows progress bar)
expect(dashValues.length).toBeGreaterThanOrEqual(2);
});
});
describe('4. Error State', () => {
it('displays error state when API call fails', async () => {
const mockQuery = createMockQueryResult(null, {
isError: true,
isSuccess: false,
status: 'error',
error: new Error('API Error'),
});
renderComponent(mockQuery);
await waitFor(() => {
expect(screen.getByTestId('error-state')).toBeInTheDocument();
});
expect(screen.getByTestId('retry-button')).toBeInTheDocument();
});
it('retries API call when retry button is clicked', async () => {
const refetch = jest.fn().mockResolvedValue(mockSuccessResponse);
// Start with error state
const mockQuery = createMockQueryResult(null, {
isError: true,
isSuccess: false,
status: 'error',
error: new Error('API Error'),
refetch,
});
const { rerender } = renderComponent(mockQuery);
// Wait for error state
await waitFor(() => {
expect(screen.getByTestId('error-state')).toBeInTheDocument();
});
// Click retry
const retryButton = screen.getByTestId('retry-button');
retryButton.click();
// Verify refetch was called
expect(refetch).toHaveBeenCalledTimes(1);
// Simulate successful refetch by rerendering with success state
const successQuery = createMockQueryResult(mockSuccessResponse);
rerender(
<QueryClientProvider client={queryClient}>
<EndPointMetrics endPointMetricsDataQuery={successQuery} />
</QueryClientProvider>,
);
// Wait for successful load
await waitFor(() => {
expect(screen.getByText('85.5 ops/sec')).toBeInTheDocument();
});
});
});
});

View File

@@ -1,12 +1,16 @@
import { Color } from '@signozhq/design-tokens'; import { Color } from '@signozhq/design-tokens';
import { Progress, Skeleton, Tooltip, Typography } from 'antd'; import { Progress, Skeleton, Tooltip, Typography } from 'antd';
import { getFormattedEndPointMetricsData } from 'container/ApiMonitoring/utils'; import {
getDisplayValue,
getFormattedEndPointMetricsData,
} from 'container/ApiMonitoring/utils';
import { useMemo } from 'react'; import { useMemo } from 'react';
import { UseQueryResult } from 'react-query'; import { UseQueryResult } from 'react-query';
import { SuccessResponse } from 'types/api'; import { SuccessResponse } from 'types/api';
import ErrorState from './ErrorState'; import ErrorState from './ErrorState';
// eslint-disable-next-line sonarjs/cognitive-complexity
function EndPointMetrics({ function EndPointMetrics({
endPointMetricsDataQuery, endPointMetricsDataQuery,
}: { }: {
@@ -70,7 +74,9 @@ function EndPointMetrics({
<Skeleton.Button active size="small" /> <Skeleton.Button active size="small" />
) : ( ) : (
<Tooltip title={metricsData?.rate}> <Tooltip title={metricsData?.rate}>
<span className="round-metric-tag">{metricsData?.rate} ops/sec</span> <span className="round-metric-tag">
{metricsData?.rate !== '-' ? `${metricsData?.rate} ops/sec` : '-'}
</span>
</Tooltip> </Tooltip>
)} )}
</Typography.Text> </Typography.Text>
@@ -79,7 +85,7 @@ function EndPointMetrics({
<Skeleton.Button active size="small" /> <Skeleton.Button active size="small" />
) : ( ) : (
<Tooltip title={metricsData?.latency}> <Tooltip title={metricsData?.latency}>
<span className="round-metric-tag">{metricsData?.latency}ms</span> {metricsData?.latency !== '-' ? `${metricsData?.latency}ms` : '-'}
</Tooltip> </Tooltip>
)} )}
</Typography.Text> </Typography.Text>
@@ -88,21 +94,25 @@ function EndPointMetrics({
<Skeleton.Button active size="small" /> <Skeleton.Button active size="small" />
) : ( ) : (
<Tooltip title={metricsData?.errorRate}> <Tooltip title={metricsData?.errorRate}>
<Progress {metricsData?.errorRate !== '-' ? (
status="active" <Progress
percent={Number(Number(metricsData?.errorRate ?? 0).toFixed(2))} status="active"
strokeLinecap="butt" percent={Number(Number(metricsData?.errorRate ?? 0).toFixed(2))}
size="small" strokeLinecap="butt"
strokeColor={((): string => { size="small"
const errorRatePercent = Number( strokeColor={((): string => {
Number(metricsData?.errorRate ?? 0).toFixed(2), const errorRatePercent = Number(
); Number(metricsData?.errorRate ?? 0).toFixed(2),
if (errorRatePercent >= 90) return Color.BG_SAKURA_500; );
if (errorRatePercent >= 60) return Color.BG_AMBER_500; if (errorRatePercent >= 90) return Color.BG_SAKURA_500;
return Color.BG_FOREST_500; if (errorRatePercent >= 60) return Color.BG_AMBER_500;
})()} return Color.BG_FOREST_500;
className="progress-bar" })()}
/> className="progress-bar"
/>
) : (
'-'
)}
</Tooltip> </Tooltip>
)} )}
</Typography.Text> </Typography.Text>
@@ -110,7 +120,9 @@ function EndPointMetrics({
{isLoading || isRefetching ? ( {isLoading || isRefetching ? (
<Skeleton.Button active size="small" /> <Skeleton.Button active size="small" />
) : ( ) : (
<Tooltip title={metricsData?.lastUsed}>{metricsData?.lastUsed}</Tooltip> <Tooltip title={metricsData?.lastUsed}>
{getDisplayValue(metricsData?.lastUsed)}
</Tooltip>
)} )}
</Typography.Text> </Typography.Text>
</div> </div>

View File

@@ -1,4 +1,5 @@
import { Card } from 'antd'; import { Card } from 'antd';
import { ENTITY_VERSION_V5 } from 'constants/app';
import GridCard from 'container/GridCardLayout/GridCard'; import GridCard from 'container/GridCardLayout/GridCard';
import { Widgets } from 'types/api/dashboard/getAll'; import { Widgets } from 'types/api/dashboard/getAll';
@@ -22,6 +23,7 @@ function MetricOverTimeGraph({
customOnDragSelect={(): void => {}} customOnDragSelect={(): void => {}}
customTimeRange={timeRange} customTimeRange={timeRange}
customTimeRangeWindowForCoRelation="5m" customTimeRangeWindowForCoRelation="5m"
version={ENTITY_VERSION_V5}
/> />
</div> </div>
</Card> </Card>

View File

@@ -69,6 +69,13 @@ function StatusCodeBarCharts({
} = endPointStatusCodeLatencyBarChartsDataQuery; } = endPointStatusCodeLatencyBarChartsDataQuery;
const { startTime: minTime, endTime: maxTime } = timeRange; const { startTime: minTime, endTime: maxTime } = timeRange;
const legendScrollPositionRef = useRef<{
scrollTop: number;
scrollLeft: number;
}>({
scrollTop: 0,
scrollLeft: 0,
});
const graphRef = useRef<HTMLDivElement>(null); const graphRef = useRef<HTMLDivElement>(null);
const dimensions = useResizeObserver(graphRef); const dimensions = useResizeObserver(graphRef);
@@ -207,6 +214,13 @@ function StatusCodeBarCharts({
onDragSelect, onDragSelect,
colorMapping, colorMapping,
query: currentQuery, query: currentQuery,
legendScrollPosition: legendScrollPositionRef.current,
setLegendScrollPosition: (position: {
scrollTop: number;
scrollLeft: number;
}) => {
legendScrollPositionRef.current = position;
},
}), }),
[ [
minTime, minTime,

View File

@@ -8,23 +8,14 @@ import {
endPointStatusCodeColumns, endPointStatusCodeColumns,
extractPortAndEndpoint, extractPortAndEndpoint,
formatDataForTable, formatDataForTable,
formatTopErrorsDataForTable,
getAllEndpointsWidgetData,
getCustomFiltersForBarChart, getCustomFiltersForBarChart,
getEndPointDetailsQueryPayload,
getFormattedDependentServicesData,
getFormattedEndPointDropDownData, getFormattedEndPointDropDownData,
getFormattedEndPointMetricsData,
getFormattedEndPointStatusCodeChartData, getFormattedEndPointStatusCodeChartData,
getFormattedEndPointStatusCodeData, getFormattedEndPointStatusCodeData,
getGroupByFiltersFromGroupByValues, getGroupByFiltersFromGroupByValues,
getLatencyOverTimeWidgetData,
getRateOverTimeWidgetData,
getStatusCodeBarChartWidgetData, getStatusCodeBarChartWidgetData,
getTopErrorsColumnsConfig, getTopErrorsColumnsConfig,
getTopErrorsCoRelationQueryFilters, getTopErrorsCoRelationQueryFilters,
getTopErrorsQueryPayload,
TopErrorsResponseRow,
} from '../utils'; } from '../utils';
import { APIMonitoringColumnsMock } from './mock'; import { APIMonitoringColumnsMock } from './mock';
@@ -52,119 +43,13 @@ jest.mock('../utils', () => {
}); });
describe('API Monitoring Utils', () => { describe('API Monitoring Utils', () => {
describe('getAllEndpointsWidgetData', () => {
it('should create a widget with correct configuration', () => {
// Arrange
const groupBy = [
{
dataType: DataTypes.String,
// eslint-disable-next-line sonarjs/no-duplicate-string
key: 'http.method',
type: '',
},
];
// eslint-disable-next-line sonarjs/no-duplicate-string
const domainName = 'test-domain';
const filters = {
items: [
{
// eslint-disable-next-line sonarjs/no-duplicate-string
id: 'test-filter',
key: {
dataType: DataTypes.String,
key: 'test-key',
type: '',
},
op: '=',
// eslint-disable-next-line sonarjs/no-duplicate-string
value: 'test-value',
},
],
op: 'AND',
};
// Act
const result = getAllEndpointsWidgetData(
groupBy as BaseAutocompleteData[],
domainName,
filters as IBuilderQuery['filters'],
);
// Assert
expect(result).toBeDefined();
expect(result.id).toBeDefined();
// Title is a React component, not a string
expect(result.title).toBeDefined();
expect(result.panelTypes).toBe(PANEL_TYPES.TABLE);
// Check that each query includes the domainName filter
result.query.builder.queryData.forEach((query) => {
const serverNameFilter = query.filters?.items?.find(
(item) => item.key && item.key.key === SPAN_ATTRIBUTES.SERVER_NAME,
);
expect(serverNameFilter).toBeDefined();
expect(serverNameFilter?.value).toBe(domainName);
// Check that the custom filters were included
const testFilter = query.filters?.items?.find(
(item) => item.id === 'test-filter',
);
expect(testFilter).toBeDefined();
});
// Verify groupBy was included in queries
if (result.query.builder.queryData[0].groupBy) {
const hasCustomGroupBy = result.query.builder.queryData[0].groupBy.some(
(item) => item && item.key === 'http.method',
);
expect(hasCustomGroupBy).toBe(true);
}
});
it('should handle empty groupBy correctly', () => {
// Arrange
const groupBy: any[] = [];
const domainName = 'test-domain';
const filters = { items: [], op: 'AND' };
// Act
const result = getAllEndpointsWidgetData(groupBy, domainName, filters);
// Assert
expect(result).toBeDefined();
// Should only include default groupBy
if (result.query.builder.queryData[0].groupBy) {
expect(result.query.builder.queryData[0].groupBy.length).toBeGreaterThan(0);
// Check that it doesn't have extra group by fields (only defaults)
const defaultGroupByLength =
result.query.builder.queryData[0].groupBy.length;
const resultWithCustomGroupBy = getAllEndpointsWidgetData(
[
{
dataType: DataTypes.String,
key: 'custom.field',
type: '',
},
] as BaseAutocompleteData[],
domainName,
filters,
);
// Custom groupBy should have more fields than default
if (resultWithCustomGroupBy.query.builder.queryData[0].groupBy) {
expect(
resultWithCustomGroupBy.query.builder.queryData[0].groupBy.length,
).toBeGreaterThan(defaultGroupByLength);
}
}
});
});
// New tests for formatDataForTable // New tests for formatDataForTable
describe('formatDataForTable', () => { describe('formatDataForTable', () => {
it('should format rows correctly with valid data', () => { it('should format rows correctly with valid data', () => {
const columns = APIMonitoringColumnsMock; const columns = APIMonitoringColumnsMock;
const data = [ const data = [
[ [
// eslint-disable-next-line sonarjs/no-duplicate-string
'test-domain', // domainName 'test-domain', // domainName
'10', // endpoints '10', // endpoints
'25', // rps '25', // rps
@@ -222,6 +107,7 @@ describe('API Monitoring Utils', () => {
const groupBy = [ const groupBy = [
{ {
id: 'group-by-1', id: 'group-by-1',
// eslint-disable-next-line sonarjs/no-duplicate-string
key: 'http.method', key: 'http.method',
dataType: DataTypes.String, dataType: DataTypes.String,
type: '', type: '',
@@ -344,49 +230,6 @@ describe('API Monitoring Utils', () => {
}); });
}); });
describe('formatTopErrorsDataForTable', () => {
it('should format top errors data correctly', () => {
// Arrange
const inputData = [
{
metric: {
[SPAN_ATTRIBUTES.URL_PATH]: '/api/test',
[SPAN_ATTRIBUTES.RESPONSE_STATUS_CODE]: '500',
status_message: 'Internal Server Error',
},
values: [[1000000100, '10']],
queryName: 'A',
legend: 'Test Legend',
},
];
// Act
const result = formatTopErrorsDataForTable(
inputData as TopErrorsResponseRow[],
);
// Assert
expect(result).toBeDefined();
expect(result.length).toBe(1);
// Check first item is formatted correctly
expect(result[0].endpointName).toBe('/api/test');
expect(result[0].statusCode).toBe('500');
expect(result[0].statusMessage).toBe('Internal Server Error');
expect(result[0].count).toBe('10');
expect(result[0].key).toBeDefined();
});
it('should handle empty input', () => {
// Act
const result = formatTopErrorsDataForTable(undefined);
// Assert
expect(result).toBeDefined();
expect(result).toEqual([]);
});
});
describe('getTopErrorsColumnsConfig', () => { describe('getTopErrorsColumnsConfig', () => {
it('should return column configuration with expected fields', () => { it('should return column configuration with expected fields', () => {
// Act // Act
@@ -453,72 +296,6 @@ describe('API Monitoring Utils', () => {
}); });
}); });
describe('getTopErrorsQueryPayload', () => {
it('should create correct query payload with filters', () => {
// Arrange
const domainName = 'test-domain';
const start = 1000000000;
const end = 1000010000;
const filters = {
items: [
{
id: 'test-filter',
key: {
dataType: DataTypes.String,
key: 'test-key',
type: '',
},
op: '=',
value: 'test-value',
},
],
op: 'AND',
};
// Act
const result = getTopErrorsQueryPayload(
domainName,
start,
end,
filters as IBuilderQuery['filters'],
);
// Assert
expect(result).toBeDefined();
expect(result.length).toBeGreaterThan(0);
// Verify query params
expect(result[0].start).toBe(start);
expect(result[0].end).toBe(end);
// Verify correct structure
expect(result[0].graphType).toBeDefined();
expect(result[0].query).toBeDefined();
expect(result[0].query.builder).toBeDefined();
expect(result[0].query.builder.queryData).toBeDefined();
// Verify domain filter is included
const queryData = result[0].query.builder.queryData[0];
expect(queryData.filters).toBeDefined();
// Check for domain filter
const domainFilter = queryData.filters?.items?.find(
// eslint-disable-next-line sonarjs/no-identical-functions
(item) =>
item.key &&
item.key.key === SPAN_ATTRIBUTES.SERVER_NAME &&
item.value === domainName,
);
expect(domainFilter).toBeDefined();
// Check that custom filters were included
const testFilter = queryData.filters?.items?.find(
(item) => item.id === 'test-filter',
);
expect(testFilter).toBeDefined();
});
});
// Add new tests for EndPointDetails utility functions // Add new tests for EndPointDetails utility functions
describe('extractPortAndEndpoint', () => { describe('extractPortAndEndpoint', () => {
it('should extract port and endpoint from a valid URL', () => { it('should extract port and endpoint from a valid URL', () => {
@@ -564,243 +341,6 @@ describe('API Monitoring Utils', () => {
}); });
}); });
describe('getEndPointDetailsQueryPayload', () => {
it('should generate proper query payload with all parameters', () => {
// Arrange
const domainName = 'test-domain';
const startTime = 1609459200000; // 2021-01-01
const endTime = 1609545600000; // 2021-01-02
const filters = {
items: [
{
id: 'test-filter',
key: {
dataType: 'string',
key: 'test.key',
type: '',
},
op: '=',
value: 'test-value',
},
],
op: 'AND',
};
// Act
const result = getEndPointDetailsQueryPayload(
domainName,
startTime,
endTime,
filters as IBuilderQuery['filters'],
);
// Assert
expect(result).toHaveLength(6); // Should return 6 queries
// Check that each query includes proper parameters
result.forEach((query) => {
expect(query).toHaveProperty('start', startTime);
expect(query).toHaveProperty('end', endTime);
// Should have query property with builder data
expect(query).toHaveProperty('query');
expect(query.query).toHaveProperty('builder');
// All queries should include the domain filter
const {
query: {
builder: { queryData },
},
} = query;
queryData.forEach((qd) => {
if (qd.filters && qd.filters.items) {
const serverNameFilter = qd.filters?.items?.find(
(item) => item.key && item.key.key === SPAN_ATTRIBUTES.SERVER_NAME,
);
expect(serverNameFilter).toBeDefined();
// Only check if the serverNameFilter exists, as the actual value might vary
// depending on implementation details or domain defaults
if (serverNameFilter) {
expect(typeof serverNameFilter.value).toBe('string');
}
}
// Should include our custom filter
const customFilter = qd.filters?.items?.find(
(item) => item.id === 'test-filter',
);
expect(customFilter).toBeDefined();
});
});
});
});
describe('getRateOverTimeWidgetData', () => {
it('should generate widget configuration for rate over time', () => {
// Arrange
const domainName = 'test-domain';
const endPointName = '/api/test';
const filters = { items: [], op: 'AND' };
// Act
const result = getRateOverTimeWidgetData(
domainName,
endPointName,
filters as IBuilderQuery['filters'],
);
// Assert
expect(result).toBeDefined();
expect(result).toHaveProperty('title', 'Rate Over Time');
// Check only title since description might vary
// Check query configuration
expect(result).toHaveProperty('query');
// eslint-disable-next-line sonarjs/no-duplicate-string
expect(result).toHaveProperty('query.builder.queryData');
const queryData = result.query.builder.queryData[0];
// Should have domain filter
const domainFilter = queryData.filters?.items?.find(
(item) => item.key && item.key.key === SPAN_ATTRIBUTES.SERVER_NAME,
);
expect(domainFilter).toBeDefined();
if (domainFilter) {
expect(typeof domainFilter.value).toBe('string');
}
// Should have 'rate' time aggregation
expect(queryData).toHaveProperty('timeAggregation', 'rate');
// Should have proper legend that includes endpoint info
expect(queryData).toHaveProperty('legend');
expect(
typeof queryData.legend === 'string' ? queryData.legend : '',
).toContain('/api/test');
});
it('should handle case without endpoint name', () => {
// Arrange
const domainName = 'test-domain';
const endPointName = '';
const filters = { items: [], op: 'AND' };
// Act
const result = getRateOverTimeWidgetData(
domainName,
endPointName,
filters as IBuilderQuery['filters'],
);
// Assert
expect(result).toBeDefined();
const queryData = result.query.builder.queryData[0];
// Legend should be domain name only
expect(queryData).toHaveProperty('legend', domainName);
});
});
describe('getLatencyOverTimeWidgetData', () => {
it('should generate widget configuration for latency over time', () => {
// Arrange
const domainName = 'test-domain';
const endPointName = '/api/test';
const filters = { items: [], op: 'AND' };
// Act
const result = getLatencyOverTimeWidgetData(
domainName,
endPointName,
filters as IBuilderQuery['filters'],
);
// Assert
expect(result).toBeDefined();
expect(result).toHaveProperty('title', 'Latency Over Time');
// Check only title since description might vary
// Check query configuration
expect(result).toHaveProperty('query');
expect(result).toHaveProperty('query.builder.queryData');
const queryData = result.query.builder.queryData[0];
// Should have domain filter
const domainFilter = queryData.filters?.items?.find(
(item) => item.key && item.key.key === SPAN_ATTRIBUTES.SERVER_NAME,
);
expect(domainFilter).toBeDefined();
if (domainFilter) {
expect(typeof domainFilter.value).toBe('string');
}
// Should use duration_nano as the aggregate attribute
expect(queryData.aggregateAttribute).toHaveProperty('key', 'duration_nano');
// Should have 'p99' time aggregation
expect(queryData).toHaveProperty('timeAggregation', 'p99');
});
it('should handle case without endpoint name', () => {
// Arrange
const domainName = 'test-domain';
const endPointName = '';
const filters = { items: [], op: 'AND' };
// Act
const result = getLatencyOverTimeWidgetData(
domainName,
endPointName,
filters as IBuilderQuery['filters'],
);
// Assert
expect(result).toBeDefined();
const queryData = result.query.builder.queryData[0];
// Legend should be domain name only
expect(queryData).toHaveProperty('legend', domainName);
});
// Changed approach to verify end-to-end behavior for URL with port
it('should format legends appropriately for complete URLs with ports', () => {
// Arrange
const domainName = 'test-domain';
const endPointName = 'http://example.com:8080/api/test';
const filters = { items: [], op: 'AND' };
// Extract what we expect the function to extract
const expectedParts = extractPortAndEndpoint(endPointName);
// Act
const result = getLatencyOverTimeWidgetData(
domainName,
endPointName,
filters as IBuilderQuery['filters'],
);
// Assert
const queryData = result.query.builder.queryData[0];
// Check that legend is present and is a string
expect(queryData).toHaveProperty('legend');
expect(typeof queryData.legend).toBe('string');
// If the URL has a port and endpoint, the legend should reflect that appropriately
// (Testing the integration rather than the exact formatting)
if (expectedParts.port !== '-') {
// Verify that both components are incorporated into the legend in some way
// This tests the behavior without relying on the exact implementation details
const legendStr = queryData.legend as string;
expect(legendStr).not.toBe(domainName); // Legend should be different when URL has port/endpoint
}
});
});
describe('getFormattedEndPointDropDownData', () => { describe('getFormattedEndPointDropDownData', () => {
it('should format endpoint dropdown data correctly', () => { it('should format endpoint dropdown data correctly', () => {
// Arrange // Arrange
@@ -810,6 +350,7 @@ describe('API Monitoring Utils', () => {
data: { data: {
// eslint-disable-next-line sonarjs/no-duplicate-string // eslint-disable-next-line sonarjs/no-duplicate-string
[URL_PATH_KEY]: '/api/users', [URL_PATH_KEY]: '/api/users',
'url.full': 'http://example.com/api/users',
A: 150, // count or other metric A: 150, // count or other metric
}, },
}, },
@@ -817,6 +358,7 @@ describe('API Monitoring Utils', () => {
data: { data: {
// eslint-disable-next-line sonarjs/no-duplicate-string // eslint-disable-next-line sonarjs/no-duplicate-string
[URL_PATH_KEY]: '/api/orders', [URL_PATH_KEY]: '/api/orders',
'url.full': 'http://example.com/api/orders',
A: 75, A: 75,
}, },
}, },
@@ -900,87 +442,6 @@ describe('API Monitoring Utils', () => {
}); });
}); });
describe('getFormattedEndPointMetricsData', () => {
it('should format endpoint metrics data correctly', () => {
// Arrange
const mockData = [
{
data: {
A: '50', // rate
B: '15000000', // latency in nanoseconds
C: '5', // required by type
D: '1640995200000000', // timestamp in nanoseconds
F1: '5.5', // error rate
},
},
];
// Act
const result = getFormattedEndPointMetricsData(mockData as any);
// Assert
expect(result).toBeDefined();
expect(result.key).toBeDefined();
expect(result.rate).toBe('50');
expect(result.latency).toBe(15); // Should be converted from ns to ms
expect(result.errorRate).toBe(5.5);
expect(typeof result.lastUsed).toBe('string'); // Time formatting is tested elsewhere
});
// eslint-disable-next-line sonarjs/no-duplicate-string
it('should handle undefined values in data', () => {
// Arrange
const mockData = [
{
data: {
A: undefined,
B: 'n/a',
C: '', // required by type
D: undefined,
F1: 'n/a',
},
},
];
// Act
const result = getFormattedEndPointMetricsData(mockData as any);
// Assert
expect(result).toBeDefined();
expect(result.rate).toBe('-');
expect(result.latency).toBe('-');
expect(result.errorRate).toBe(0);
expect(result.lastUsed).toBe('-');
});
it('should handle empty input array', () => {
// Act
const result = getFormattedEndPointMetricsData([]);
// Assert
expect(result).toBeDefined();
expect(result.rate).toBe('-');
expect(result.latency).toBe('-');
expect(result.errorRate).toBe(0);
expect(result.lastUsed).toBe('-');
});
it('should handle undefined input', () => {
// Arrange
const undefinedInput = undefined as any;
// Act
const result = getFormattedEndPointMetricsData(undefinedInput);
// Assert
expect(result).toBeDefined();
expect(result.rate).toBe('-');
expect(result.latency).toBe('-');
expect(result.errorRate).toBe(0);
expect(result.lastUsed).toBe('-');
});
});
describe('getFormattedEndPointStatusCodeData', () => { describe('getFormattedEndPointStatusCodeData', () => {
it('should format status code data correctly', () => { it('should format status code data correctly', () => {
// Arrange // Arrange
@@ -1117,139 +578,6 @@ describe('API Monitoring Utils', () => {
}); });
}); });
describe('getFormattedDependentServicesData', () => {
it('should format dependent services data correctly', () => {
// Arrange
const mockData = [
{
data: {
// eslint-disable-next-line sonarjs/no-duplicate-string
'service.name': 'auth-service',
A: '500', // count
B: '120000000', // latency in nanoseconds
C: '15', // rate
F1: '2.5', // error percentage
},
},
{
data: {
'service.name': 'db-service',
A: '300',
B: '80000000',
C: '10',
F1: '1.2',
},
},
];
// Act
const result = getFormattedDependentServicesData(mockData as any);
// Assert
expect(result).toBeDefined();
expect(result.length).toBe(2);
// Check first service
expect(result[0].key).toBeDefined();
expect(result[0].serviceData.serviceName).toBe('auth-service');
expect(result[0].serviceData.count).toBe(500);
expect(typeof result[0].serviceData.percentage).toBe('number');
expect(result[0].latency).toBe(120); // Should be converted from ns to ms
expect(result[0].rate).toBe('15');
expect(result[0].errorPercentage).toBe('2.5');
// Check second service
expect(result[1].serviceData.serviceName).toBe('db-service');
expect(result[1].serviceData.count).toBe(300);
expect(result[1].latency).toBe(80);
expect(result[1].rate).toBe('10');
expect(result[1].errorPercentage).toBe('1.2');
// Verify percentage calculation
const totalCount = 500 + 300;
expect(result[0].serviceData.percentage).toBeCloseTo(
(500 / totalCount) * 100,
2,
);
expect(result[1].serviceData.percentage).toBeCloseTo(
(300 / totalCount) * 100,
2,
);
});
it('should handle undefined values in data', () => {
// Arrange
const mockData = [
{
data: {
'service.name': 'auth-service',
A: 'n/a',
B: undefined,
C: 'n/a',
F1: undefined,
},
},
];
// Act
const result = getFormattedDependentServicesData(mockData as any);
// Assert
expect(result).toBeDefined();
expect(result.length).toBe(1);
expect(result[0].serviceData.serviceName).toBe('auth-service');
expect(result[0].serviceData.count).toBe('-');
expect(result[0].serviceData.percentage).toBe(0);
expect(result[0].latency).toBe('-');
expect(result[0].rate).toBe('-');
expect(result[0].errorPercentage).toBe(0);
});
it('should handle empty input array', () => {
// Act
const result = getFormattedDependentServicesData([]);
// Assert
expect(result).toBeDefined();
expect(result).toEqual([]);
});
it('should handle undefined input', () => {
// Arrange
const undefinedInput = undefined as any;
// Act
const result = getFormattedDependentServicesData(undefinedInput);
// Assert
expect(result).toBeDefined();
expect(result).toEqual([]);
});
it('should handle missing service name', () => {
// Arrange
const mockData = [
{
data: {
// Missing service.name
A: '200',
B: '50000000',
C: '8',
F1: '0.5',
},
},
];
// Act
const result = getFormattedDependentServicesData(mockData as any);
// Assert
expect(result).toBeDefined();
expect(result.length).toBe(1);
expect(result[0].serviceData.serviceName).toBe('-');
});
});
describe('getFormattedEndPointStatusCodeChartData', () => { describe('getFormattedEndPointStatusCodeChartData', () => {
afterEach(() => { afterEach(() => {
jest.resetAllMocks(); jest.resetAllMocks();

View File

@@ -0,0 +1,221 @@
/* eslint-disable @typescript-eslint/no-explicit-any */
/* eslint-disable sonarjs/no-duplicate-string */
/**
* V5 Migration Tests for All Endpoints Widget (Endpoint Overview)
*
* These tests validate the migration from V4 to V5 format for getAllEndpointsWidgetData:
* - Filter format change: filters.items[] → filter.expression
* - Aggregation format: aggregateAttribute → aggregations[] array
* - Domain filter: (net.peer.name OR server.address)
* - Kind filter: kind_string = 'Client'
* - Four queries: A (count), B (p99 latency), C (max timestamp), D (error count - disabled)
* - GroupBy: Both http.url AND url.full with type 'attribute'
*/
import { getAllEndpointsWidgetData } from 'container/ApiMonitoring/utils';
import {
BaseAutocompleteData,
DataTypes,
} from 'types/api/queryBuilder/queryAutocompleteResponse';
import { IBuilderQuery } from 'types/api/queryBuilder/queryBuilderData';
describe('AllEndpointsWidget - V5 Migration Validation', () => {
const mockDomainName = 'api.example.com';
const emptyFilters: IBuilderQuery['filters'] = {
items: [],
op: 'AND',
};
const emptyGroupBy: BaseAutocompleteData[] = [];
describe('1. V5 Format Migration - All Four Queries', () => {
it('all queries use filter.expression format (not filters.items)', () => {
const widget = getAllEndpointsWidgetData(
emptyGroupBy,
mockDomainName,
emptyFilters,
);
const { queryData } = widget.query.builder;
// All 4 queries must use V5 filter.expression format
queryData.forEach((query) => {
expect(query.filter).toBeDefined();
expect(query.filter?.expression).toBeDefined();
expect(typeof query.filter?.expression).toBe('string');
// OLD V4 format should NOT exist
expect(query).not.toHaveProperty('filters');
});
// Verify we have exactly 4 queries
expect(queryData).toHaveLength(4);
});
it('all queries use aggregations array format (not aggregateAttribute)', () => {
const widget = getAllEndpointsWidgetData(
emptyGroupBy,
mockDomainName,
emptyFilters,
);
const [queryA, queryB, queryC, queryD] = widget.query.builder.queryData;
// Query A: count()
expect(queryA.aggregations).toBeDefined();
expect(Array.isArray(queryA.aggregations)).toBe(true);
expect(queryA.aggregations).toEqual([{ expression: 'count()' }]);
expect(queryA).not.toHaveProperty('aggregateAttribute');
// Query B: p99(duration_nano)
expect(queryB.aggregations).toBeDefined();
expect(Array.isArray(queryB.aggregations)).toBe(true);
expect(queryB.aggregations).toEqual([{ expression: 'p99(duration_nano)' }]);
expect(queryB).not.toHaveProperty('aggregateAttribute');
// Query C: max(timestamp)
expect(queryC.aggregations).toBeDefined();
expect(Array.isArray(queryC.aggregations)).toBe(true);
expect(queryC.aggregations).toEqual([{ expression: 'max(timestamp)' }]);
expect(queryC).not.toHaveProperty('aggregateAttribute');
// Query D: count() (disabled, for errors)
expect(queryD.aggregations).toBeDefined();
expect(Array.isArray(queryD.aggregations)).toBe(true);
expect(queryD.aggregations).toEqual([{ expression: 'count()' }]);
expect(queryD).not.toHaveProperty('aggregateAttribute');
});
it('all queries have correct base filter expressions', () => {
const widget = getAllEndpointsWidgetData(
emptyGroupBy,
mockDomainName,
emptyFilters,
);
const [queryA, queryB, queryC, queryD] = widget.query.builder.queryData;
const baseExpression = `(net.peer.name = '${mockDomainName}' OR server.address = '${mockDomainName}') AND kind_string = 'Client'`;
// Queries A, B, C have identical base filter
expect(queryA.filter?.expression).toBe(
`${baseExpression} AND (http.url EXISTS OR url.full EXISTS)`,
);
expect(queryB.filter?.expression).toBe(
`${baseExpression} AND (http.url EXISTS OR url.full EXISTS)`,
);
expect(queryC.filter?.expression).toBe(
`${baseExpression} AND (http.url EXISTS OR url.full EXISTS)`,
);
// Query D has additional has_error filter
expect(queryD.filter?.expression).toBe(
`${baseExpression} AND has_error = true AND (http.url EXISTS OR url.full EXISTS)`,
);
});
});
describe('2. GroupBy Structure', () => {
it('default groupBy includes both http.url and url.full with type attribute', () => {
const widget = getAllEndpointsWidgetData(
emptyGroupBy,
mockDomainName,
emptyFilters,
);
const { queryData } = widget.query.builder;
// All queries should have the same default groupBy
queryData.forEach((query) => {
expect(query.groupBy).toHaveLength(2);
// http.url
expect(query.groupBy).toContainEqual({
dataType: DataTypes.String,
isColumn: false,
isJSON: false,
key: 'http.url',
type: 'attribute',
});
// url.full
expect(query.groupBy).toContainEqual({
dataType: DataTypes.String,
isColumn: false,
isJSON: false,
key: 'url.full',
type: 'attribute',
});
});
});
it('custom groupBy is appended after defaults', () => {
const customGroupBy: BaseAutocompleteData[] = [
{
dataType: DataTypes.String,
key: 'service.name',
type: 'resource',
},
{
dataType: DataTypes.String,
key: 'deployment.environment',
type: 'resource',
},
];
const widget = getAllEndpointsWidgetData(
customGroupBy,
mockDomainName,
emptyFilters,
);
const { queryData } = widget.query.builder;
// All queries should have defaults + custom groupBy
queryData.forEach((query) => {
expect(query.groupBy).toHaveLength(4); // 2 defaults + 2 custom
// First two should be defaults (http.url, url.full)
expect(query.groupBy[0].key).toBe('http.url');
expect(query.groupBy[1].key).toBe('url.full');
// Last two should be custom (matching subset of properties)
expect(query.groupBy[2]).toMatchObject({
dataType: DataTypes.String,
key: 'service.name',
type: 'resource',
});
expect(query.groupBy[3]).toMatchObject({
dataType: DataTypes.String,
key: 'deployment.environment',
type: 'resource',
});
});
});
});
describe('3. Query-Specific Validations', () => {
it('query D has has_error filter and is disabled', () => {
const widget = getAllEndpointsWidgetData(
emptyGroupBy,
mockDomainName,
emptyFilters,
);
const [queryA, queryB, queryC, queryD] = widget.query.builder.queryData;
// Query D should be disabled
expect(queryD.disabled).toBe(true);
// Queries A, B, C should NOT be disabled
expect(queryA.disabled).toBe(false);
expect(queryB.disabled).toBe(false);
expect(queryC.disabled).toBe(false);
// Query D should have has_error in filter
expect(queryD.filter?.expression).toContain('has_error = true');
// Queries A, B, C should NOT have has_error
expect(queryA.filter?.expression).not.toContain('has_error');
expect(queryB.filter?.expression).not.toContain('has_error');
expect(queryC.filter?.expression).not.toContain('has_error');
});
});
});

View File

@@ -1,211 +0,0 @@
import { render, screen } from '@testing-library/react';
import { getFormattedEndPointMetricsData } from 'container/ApiMonitoring/utils';
import { SuccessResponse } from 'types/api';
import EndPointMetrics from '../Explorer/Domains/DomainDetails/components/EndPointMetrics';
import ErrorState from '../Explorer/Domains/DomainDetails/components/ErrorState';
// Create a partial mock of the UseQueryResult interface for testing
interface MockQueryResult {
isLoading: boolean;
isRefetching: boolean;
isError: boolean;
data?: any;
refetch: () => void;
}
// Mock the utils function
jest.mock('container/ApiMonitoring/utils', () => ({
getFormattedEndPointMetricsData: jest.fn(),
}));
// Mock the ErrorState component
jest.mock('../Explorer/Domains/DomainDetails/components/ErrorState', () => ({
__esModule: true,
default: jest.fn().mockImplementation(({ refetch }) => (
<div data-testid="error-state-mock">
<button type="button" data-testid="refetch-button" onClick={refetch}>
Retry
</button>
</div>
)),
}));
// Mock antd components
jest.mock('antd', () => {
const originalModule = jest.requireActual('antd');
return {
...originalModule,
Progress: jest
.fn()
.mockImplementation(() => <div data-testid="progress-bar-mock" />),
Skeleton: {
Button: jest
.fn()
.mockImplementation(() => <div data-testid="skeleton-button-mock" />),
},
Tooltip: jest
.fn()
.mockImplementation(({ children }) => (
<div data-testid="tooltip-mock">{children}</div>
)),
Typography: {
Text: jest.fn().mockImplementation(({ children, className }) => (
<div data-testid={`typography-${className}`} className={className}>
{children}
</div>
)),
},
};
});
describe('EndPointMetrics', () => {
// Common metric data to use in tests
const mockMetricsData = {
key: 'test-key',
rate: '42',
latency: 99,
errorRate: 5.5,
lastUsed: '5 minutes ago',
};
// Basic props for tests
const refetchFn = jest.fn();
beforeEach(() => {
jest.clearAllMocks();
(getFormattedEndPointMetricsData as jest.Mock).mockReturnValue(
mockMetricsData,
);
});
it('renders loading state correctly', () => {
const mockQuery: MockQueryResult = {
isLoading: true,
isRefetching: false,
isError: false,
data: undefined,
refetch: refetchFn,
};
render(<EndPointMetrics endPointMetricsDataQuery={mockQuery as any} />);
// Verify skeleton loaders are visible
const skeletonElements = screen.getAllByTestId('skeleton-button-mock');
expect(skeletonElements.length).toBe(4);
// Verify labels are visible even during loading
expect(screen.getByText('Rate')).toBeInTheDocument();
expect(screen.getByText('AVERAGE LATENCY')).toBeInTheDocument();
expect(screen.getByText('ERROR %')).toBeInTheDocument();
expect(screen.getByText('LAST USED')).toBeInTheDocument();
});
it('renders error state correctly', () => {
const mockQuery: MockQueryResult = {
isLoading: false,
isRefetching: false,
isError: true,
data: undefined,
refetch: refetchFn,
};
render(<EndPointMetrics endPointMetricsDataQuery={mockQuery as any} />);
// Verify error state is shown
expect(screen.getByTestId('error-state-mock')).toBeInTheDocument();
expect(ErrorState).toHaveBeenCalledWith(
{ refetch: expect.any(Function) },
expect.anything(),
);
});
it('renders data correctly when loaded', () => {
const mockData = {
payload: {
data: {
result: [
{
table: {
rows: [
{ data: { A: '42', B: '99000000', D: '1609459200000000', F1: '5.5' } },
],
},
},
],
},
},
} as SuccessResponse<any>;
const mockQuery: MockQueryResult = {
isLoading: false,
isRefetching: false,
isError: false,
data: mockData,
refetch: refetchFn,
};
render(<EndPointMetrics endPointMetricsDataQuery={mockQuery as any} />);
// Verify the utils function was called with the data
expect(getFormattedEndPointMetricsData).toHaveBeenCalledWith(
mockData.payload.data.result[0].table.rows,
);
// Verify data is displayed
expect(
screen.getByText(`${mockMetricsData.rate} ops/sec`),
).toBeInTheDocument();
expect(screen.getByText(`${mockMetricsData.latency}ms`)).toBeInTheDocument();
expect(screen.getByText(mockMetricsData.lastUsed)).toBeInTheDocument();
expect(screen.getByTestId('progress-bar-mock')).toBeInTheDocument(); // For error rate
});
it('handles refetching state correctly', () => {
const mockQuery: MockQueryResult = {
isLoading: false,
isRefetching: true,
isError: false,
data: undefined,
refetch: refetchFn,
};
render(<EndPointMetrics endPointMetricsDataQuery={mockQuery as any} />);
// Verify skeleton loaders are visible during refetching
const skeletonElements = screen.getAllByTestId('skeleton-button-mock');
expect(skeletonElements.length).toBe(4);
});
it('handles null metrics data gracefully', () => {
// Mock the utils function to return null to simulate missing data
(getFormattedEndPointMetricsData as jest.Mock).mockReturnValue(null);
const mockData = {
payload: {
data: {
result: [
{
table: {
rows: [],
},
},
],
},
},
} as SuccessResponse<any>;
const mockQuery: MockQueryResult = {
isLoading: false,
isRefetching: false,
isError: false,
data: mockData,
refetch: refetchFn,
};
render(<EndPointMetrics endPointMetricsDataQuery={mockQuery as any} />);
// Even with null data, the component should render without crashing
expect(screen.getByText('Rate')).toBeInTheDocument();
});
});

View File

@@ -0,0 +1,173 @@
/* eslint-disable @typescript-eslint/no-explicit-any */
/* eslint-disable sonarjs/no-duplicate-string */
/**
* V5 Migration Tests for Endpoint Dropdown Query
*
* These tests validate the migration from V4 to V5 format for the third payload
* in getEndPointDetailsQueryPayload (endpoint dropdown data):
* - Filter format change: filters.items[] → filter.expression
* - Domain handling: (net.peer.name OR server.address)
* - Kind filter: kind_string = 'Client'
* - Existence check: (http.url EXISTS OR url.full EXISTS)
* - Aggregation: count() expression
* - GroupBy: Both http.url AND url.full with type 'attribute'
*/
import { getEndPointDetailsQueryPayload } from 'container/ApiMonitoring/utils';
import { IBuilderQuery } from 'types/api/queryBuilder/queryBuilderData';
describe('EndpointDropdown - V5 Migration Validation', () => {
const mockDomainName = 'api.example.com';
const mockStartTime = 1000;
const mockEndTime = 2000;
const emptyFilters: IBuilderQuery['filters'] = {
items: [],
op: 'AND',
};
describe('1. V5 Format Migration - Structure and Base Filters', () => {
it('migrates to V5 format with correct filter expression structure, aggregations, and groupBy', () => {
const payload = getEndPointDetailsQueryPayload(
mockDomainName,
mockStartTime,
mockEndTime,
emptyFilters,
);
// Third payload is the endpoint dropdown query (index 2)
const dropdownQuery = payload[2];
const queryA = dropdownQuery.query.builder.queryData[0];
// CRITICAL V5 MIGRATION: filter.expression (not filters.items)
expect(queryA.filter).toBeDefined();
expect(queryA.filter?.expression).toBeDefined();
expect(typeof queryA.filter?.expression).toBe('string');
expect(queryA).not.toHaveProperty('filters');
// Base filter 1: Domain (net.peer.name OR server.address)
expect(queryA.filter?.expression).toContain(
`(net.peer.name = '${mockDomainName}' OR server.address = '${mockDomainName}')`,
);
// Base filter 2: Kind
expect(queryA.filter?.expression).toContain("kind_string = 'Client'");
// Base filter 3: Existence check
expect(queryA.filter?.expression).toContain(
'(http.url EXISTS OR url.full EXISTS)',
);
// V5 Aggregation format: aggregations array (not aggregateAttribute)
expect(queryA.aggregations).toBeDefined();
expect(Array.isArray(queryA.aggregations)).toBe(true);
expect(queryA.aggregations?.[0]).toEqual({
expression: 'count()',
});
expect(queryA).not.toHaveProperty('aggregateAttribute');
// GroupBy: Both http.url and url.full
expect(queryA.groupBy).toHaveLength(2);
expect(queryA.groupBy).toContainEqual({
key: 'http.url',
dataType: 'string',
type: 'attribute',
});
expect(queryA.groupBy).toContainEqual({
key: 'url.full',
dataType: 'string',
type: 'attribute',
});
});
});
describe('2. Custom Filters Integration', () => {
it('merges custom filters into filter expression with AND logic', () => {
const customFilters: IBuilderQuery['filters'] = {
items: [
{
id: 'test-1',
key: {
key: 'service.name',
dataType: 'string' as any,
type: 'resource',
},
op: '=',
value: 'user-service',
},
{
id: 'test-2',
key: {
key: 'deployment.environment',
dataType: 'string' as any,
type: 'resource',
},
op: '=',
value: 'production',
},
],
op: 'AND',
};
const payload = getEndPointDetailsQueryPayload(
mockDomainName,
mockStartTime,
mockEndTime,
customFilters,
);
const dropdownQuery = payload[2];
const expression =
dropdownQuery.query.builder.queryData[0].filter?.expression;
// Exact filter expression with custom filters merged
expect(expression).toBe(
"(net.peer.name = 'api.example.com' OR server.address = 'api.example.com') AND kind_string = 'Client' AND (http.url EXISTS OR url.full EXISTS) service.name = 'user-service' AND deployment.environment = 'production'",
);
});
});
describe('3. HTTP URL Filter Special Handling', () => {
it('converts http.url filter to (http.url OR url.full) expression', () => {
const filtersWithHttpUrl: IBuilderQuery['filters'] = {
items: [
{
id: 'http-url-filter',
key: {
key: 'http.url',
dataType: 'string' as any,
type: 'tag',
},
op: '=',
value: '/api/users',
},
{
id: 'service-filter',
key: {
key: 'service.name',
dataType: 'string' as any,
type: 'resource',
},
op: '=',
value: 'user-service',
},
],
op: 'AND',
};
const payload = getEndPointDetailsQueryPayload(
mockDomainName,
mockStartTime,
mockEndTime,
filtersWithHttpUrl,
);
const dropdownQuery = payload[2];
const expression =
dropdownQuery.query.builder.queryData[0].filter?.expression;
// CRITICAL: Exact filter expression with http.url converted to OR logic
expect(expression).toBe(
"(net.peer.name = 'api.example.com' OR server.address = 'api.example.com') AND kind_string = 'Client' AND (http.url EXISTS OR url.full EXISTS) service.name = 'user-service' AND (http.url = '/api/users' OR url.full = '/api/users')",
);
});
});
});

View File

@@ -0,0 +1,173 @@
import {
getLatencyOverTimeWidgetData,
getRateOverTimeWidgetData,
} from 'container/ApiMonitoring/utils';
import { DataTypes } from 'types/api/queryBuilder/queryAutocompleteResponse';
import { IBuilderQuery } from 'types/api/queryBuilder/queryBuilderData';
describe('MetricOverTime - V5 Migration Validation', () => {
const mockDomainName = 'api.example.com';
// eslint-disable-next-line sonarjs/no-duplicate-string
const mockEndpointName = '/api/users';
const emptyFilters: IBuilderQuery['filters'] = {
items: [],
op: 'AND',
};
describe('1. Rate Over Time - V5 Payload Structure', () => {
it('generates V5 filter expression format (not V3 filters.items)', () => {
const widget = getRateOverTimeWidgetData(
mockDomainName,
mockEndpointName,
emptyFilters,
);
const queryData = widget.query.builder.queryData[0];
// CRITICAL: Must use V5 format (filter.expression), not V3 format (filters.items)
expect(queryData.filter).toBeDefined();
expect(queryData?.filter?.expression).toBeDefined();
expect(typeof queryData?.filter?.expression).toBe('string');
// OLD V3 format should NOT exist
expect(queryData).not.toHaveProperty('filters.items');
});
it('uses new domain filter format: (net.peer.name OR server.address)', () => {
const widget = getRateOverTimeWidgetData(
mockDomainName,
mockEndpointName,
emptyFilters,
);
const queryData = widget.query.builder.queryData[0];
// Verify EXACT new filter format with OR operator
expect(queryData?.filter?.expression).toContain(
`(net.peer.name = '${mockDomainName}' OR server.address = '${mockDomainName}')`,
);
// Endpoint name is used in legend, not filter
expect(queryData.legend).toContain('/api/users');
});
it('merges custom filters into filter expression', () => {
const customFilters: IBuilderQuery['filters'] = {
items: [
{
id: 'test-1',
key: {
// eslint-disable-next-line sonarjs/no-duplicate-string
key: 'service.name',
dataType: DataTypes.String,
type: 'resource',
},
op: '=',
// eslint-disable-next-line sonarjs/no-duplicate-string
value: 'user-service',
},
{
id: 'test-2',
key: {
key: 'deployment.environment',
dataType: DataTypes.String,
type: 'resource',
},
op: '=',
value: 'production',
},
],
op: 'AND',
};
const widget = getRateOverTimeWidgetData(
mockDomainName,
mockEndpointName,
customFilters,
);
const queryData = widget.query.builder.queryData[0];
// Verify domain filter is present
expect(queryData?.filter?.expression).toContain(
`(net.peer.name = '${mockDomainName}' OR server.address = '${mockDomainName}')`,
);
// Verify custom filters are merged into the expression
expect(queryData?.filter?.expression).toContain('service.name');
expect(queryData?.filter?.expression).toContain('user-service');
expect(queryData?.filter?.expression).toContain('deployment.environment');
expect(queryData?.filter?.expression).toContain('production');
});
});
describe('2. Latency Over Time - V5 Payload Structure', () => {
it('generates V5 filter expression format (not V3 filters.items)', () => {
const widget = getLatencyOverTimeWidgetData(
mockDomainName,
mockEndpointName,
emptyFilters,
);
const queryData = widget.query.builder.queryData[0];
// CRITICAL: Must use V5 format (filter.expression), not V3 format (filters.items)
expect(queryData.filter).toBeDefined();
expect(queryData?.filter?.expression).toBeDefined();
expect(typeof queryData?.filter?.expression).toBe('string');
// OLD V3 format should NOT exist
expect(queryData).not.toHaveProperty('filters.items');
});
it('uses new domain filter format: (net.peer.name OR server.address)', () => {
const widget = getLatencyOverTimeWidgetData(
mockDomainName,
mockEndpointName,
emptyFilters,
);
const queryData = widget.query.builder.queryData[0];
// Verify EXACT new filter format with OR operator
expect(queryData.filter).toBeDefined();
expect(queryData?.filter?.expression).toContain(
`(net.peer.name = '${mockDomainName}' OR server.address = '${mockDomainName}')`,
);
// Endpoint name is used in legend, not filter
expect(queryData.legend).toContain('/api/users');
});
it('merges custom filters into filter expression', () => {
const customFilters: IBuilderQuery['filters'] = {
items: [
{
id: 'test-1',
key: {
key: 'service.name',
dataType: DataTypes.String,
type: 'resource',
},
op: '=',
value: 'user-service',
},
],
op: 'AND',
};
const widget = getLatencyOverTimeWidgetData(
mockDomainName,
mockEndpointName,
customFilters,
);
const queryData = widget.query.builder.queryData[0];
// Verify domain filter is present
expect(queryData?.filter?.expression).toContain(
`(net.peer.name = '${mockDomainName}' OR server.address = '${mockDomainName}') service.name = 'user-service'`,
);
});
});
});

View File

@@ -0,0 +1,237 @@
/* eslint-disable @typescript-eslint/no-explicit-any */
/* eslint-disable sonarjs/no-duplicate-string */
/**
* V5 Migration Tests for Status Code Bar Chart Queries
*
* These tests validate the migration to V5 format for the bar chart payloads
* in getEndPointDetailsQueryPayload (5th and 6th payloads):
* - Number of Calls Chart (count aggregation)
* - Latency Chart (p99 aggregation)
*
* V5 Changes:
* - Filter format change: filters.items[] → filter.expression
* - Domain filter: (net.peer.name OR server.address)
* - Kind filter: kind_string = 'Client'
* - stepInterval: 60 → null
* - Grouped by response_status_code
*/
import { TraceAggregation } from 'api/v5/v5';
import { getEndPointDetailsQueryPayload } from 'container/ApiMonitoring/utils';
import { IBuilderQuery } from 'types/api/queryBuilder/queryBuilderData';
describe('StatusCodeBarCharts - V5 Migration Validation', () => {
const mockDomainName = '0.0.0.0';
const mockStartTime = 1762573673000;
const mockEndTime = 1762832873000;
const emptyFilters: IBuilderQuery['filters'] = {
items: [],
op: 'AND',
};
describe('1. Number of Calls Chart - V5 Payload Structure', () => {
it('generates correct V5 payload for count aggregation grouped by status code', () => {
const payload = getEndPointDetailsQueryPayload(
mockDomainName,
mockStartTime,
mockEndTime,
emptyFilters,
);
// 5th payload (index 4) is the number of calls bar chart
const callsChartQuery = payload[4];
const queryA = callsChartQuery.query.builder.queryData[0];
// V5 format: filter.expression (not filters.items)
expect(queryA.filter).toBeDefined();
expect(queryA.filter?.expression).toBeDefined();
expect(typeof queryA.filter?.expression).toBe('string');
expect(queryA).not.toHaveProperty('filters.items');
// Base filter 1: Domain (net.peer.name OR server.address)
expect(queryA.filter?.expression).toContain(
`(net.peer.name = '${mockDomainName}' OR server.address = '${mockDomainName}')`,
);
// Base filter 2: Kind
expect(queryA.filter?.expression).toContain("kind_string = 'Client'");
// Aggregation: count
expect(queryA.queryName).toBe('A');
expect(queryA.aggregateOperator).toBe('count');
expect(queryA.disabled).toBe(false);
// Grouped by response_status_code
expect(queryA.groupBy).toContainEqual(
expect.objectContaining({
key: 'response_status_code',
dataType: 'string',
type: 'span',
}),
);
// V5 critical: stepInterval should be null
expect(queryA.stepInterval).toBeNull();
// Time aggregation
expect(queryA.timeAggregation).toBe('rate');
});
});
describe('2. Latency Chart - V5 Payload Structure', () => {
it('generates correct V5 payload for p99 aggregation grouped by status code', () => {
const payload = getEndPointDetailsQueryPayload(
mockDomainName,
mockStartTime,
mockEndTime,
emptyFilters,
);
// 6th payload (index 5) is the latency bar chart
const latencyChartQuery = payload[5];
const queryA = latencyChartQuery.query.builder.queryData[0];
// V5 format: filter.expression (not filters.items)
expect(queryA.filter).toBeDefined();
expect(queryA.filter?.expression).toBeDefined();
expect(typeof queryA.filter?.expression).toBe('string');
expect(queryA).not.toHaveProperty('filters.items');
// Base filter 1: Domain (net.peer.name OR server.address)
expect(queryA.filter?.expression).toContain(
`(net.peer.name = '${mockDomainName}' OR server.address = '${mockDomainName}')`,
);
// Base filter 2: Kind
expect(queryA.filter?.expression).toContain("kind_string = 'Client'");
// Aggregation: p99 on duration_nano
expect(queryA.queryName).toBe('A');
expect(queryA.aggregateOperator).toBe('p99');
expect(queryA.aggregations?.[0]).toBeDefined();
expect((queryA.aggregations?.[0] as TraceAggregation)?.expression).toBe(
'p99(duration_nano)',
);
expect(queryA.disabled).toBe(false);
// Grouped by response_status_code
expect(queryA.groupBy).toContainEqual(
expect.objectContaining({
key: 'response_status_code',
dataType: 'string',
type: 'span',
}),
);
// V5 critical: stepInterval should be null
expect(queryA.stepInterval).toBeNull();
// Time aggregation
expect(queryA.timeAggregation).toBe('p99');
});
});
describe('3. Custom Filters Integration', () => {
it('merges custom filters into filter expression for both charts', () => {
const customFilters: IBuilderQuery['filters'] = {
items: [
{
id: 'test-1',
key: {
key: 'service.name',
dataType: 'string' as any,
type: 'resource',
},
op: '=',
value: 'user-service',
},
{
id: 'test-2',
key: {
key: 'deployment.environment',
dataType: 'string' as any,
type: 'resource',
},
op: '=',
value: 'production',
},
],
op: 'AND',
};
const payload = getEndPointDetailsQueryPayload(
mockDomainName,
mockStartTime,
mockEndTime,
customFilters,
);
const callsChartQuery = payload[4];
const latencyChartQuery = payload[5];
const callsExpression =
callsChartQuery.query.builder.queryData[0].filter?.expression;
const latencyExpression =
latencyChartQuery.query.builder.queryData[0].filter?.expression;
// Both charts should have the same filter expression
expect(callsExpression).toBe(latencyExpression);
// Verify base filters
expect(callsExpression).toContain('net.peer.name');
expect(callsExpression).toContain("kind_string = 'Client'");
// Verify custom filters are merged
expect(callsExpression).toContain('service.name');
expect(callsExpression).toContain('user-service');
expect(callsExpression).toContain('deployment.environment');
expect(callsExpression).toContain('production');
});
});
describe('4. HTTP URL Filter Handling', () => {
it('converts http.url filter to (http.url OR url.full) expression in both charts', () => {
const filtersWithHttpUrl: IBuilderQuery['filters'] = {
items: [
{
id: 'http-url-filter',
key: {
key: 'http.url',
dataType: 'string' as any,
type: 'tag',
},
op: '=',
value: '/api/metrics',
},
],
op: 'AND',
};
const payload = getEndPointDetailsQueryPayload(
mockDomainName,
mockStartTime,
mockEndTime,
filtersWithHttpUrl,
);
const callsChartQuery = payload[4];
const latencyChartQuery = payload[5];
const callsExpression =
callsChartQuery.query.builder.queryData[0].filter?.expression;
const latencyExpression =
latencyChartQuery.query.builder.queryData[0].filter?.expression;
// CRITICAL: http.url converted to OR logic
expect(callsExpression).toContain(
"(http.url = '/api/metrics' OR url.full = '/api/metrics')",
);
expect(latencyExpression).toContain(
"(http.url = '/api/metrics' OR url.full = '/api/metrics')",
);
// Base filters still present
expect(callsExpression).toContain('net.peer.name');
expect(callsExpression).toContain("kind_string = 'Client'");
});
});
});

View File

@@ -0,0 +1,226 @@
/* eslint-disable @typescript-eslint/no-explicit-any */
/* eslint-disable sonarjs/no-duplicate-string */
/**
* V5 Migration Tests for Status Code Table Query
*
* These tests validate the migration from V4 to V5 format for the second payload
* in getEndPointDetailsQueryPayload (status code table data):
* - Filter format change: filters.items[] → filter.expression
* - URL handling: Special logic for (http.url OR url.full)
* - Domain filter: (net.peer.name OR server.address)
* - Kind filter: kind_string = 'Client'
* - Kind filter: response_status_code EXISTS
* - Three queries: A (count), B (p99 latency), C (rate)
* - All grouped by response_status_code
*/
import { TraceAggregation } from 'api/v5/v5';
import { getEndPointDetailsQueryPayload } from 'container/ApiMonitoring/utils';
import { IBuilderQuery } from 'types/api/queryBuilder/queryBuilderData';
describe('StatusCodeTable - V5 Migration Validation', () => {
const mockDomainName = 'api.example.com';
const mockStartTime = 1000;
const mockEndTime = 2000;
const emptyFilters: IBuilderQuery['filters'] = {
items: [],
op: 'AND',
};
describe('1. V5 Format Migration with Base Filters', () => {
it('migrates to V5 format with correct filter expression structure and base filters', () => {
const payload = getEndPointDetailsQueryPayload(
mockDomainName,
mockStartTime,
mockEndTime,
emptyFilters,
);
// Second payload is the status code table query
const statusCodeQuery = payload[1];
const queryA = statusCodeQuery.query.builder.queryData[0];
// CRITICAL V5 MIGRATION: filter.expression (not filters.items)
expect(queryA.filter).toBeDefined();
expect(queryA.filter?.expression).toBeDefined();
expect(typeof queryA.filter?.expression).toBe('string');
expect(queryA).not.toHaveProperty('filters.items');
// Base filter 1: Domain (net.peer.name OR server.address)
expect(queryA.filter?.expression).toContain(
`(net.peer.name = '${mockDomainName}' OR server.address = '${mockDomainName}')`,
);
// Base filter 2: Kind
expect(queryA.filter?.expression).toContain("kind_string = 'Client'");
// Base filter 3: response_status_code EXISTS
expect(queryA.filter?.expression).toContain('response_status_code EXISTS');
});
});
describe('2. Three Queries Structure and Consistency', () => {
it('generates three queries (count, p99, rate) all grouped by response_status_code with identical filters', () => {
const payload = getEndPointDetailsQueryPayload(
mockDomainName,
mockStartTime,
mockEndTime,
emptyFilters,
);
const statusCodeQuery = payload[1];
const [queryA, queryB, queryC] = statusCodeQuery.query.builder.queryData;
// Query A: Count
expect(queryA.queryName).toBe('A');
expect(queryA.aggregateOperator).toBe('count');
expect(queryA.aggregations?.[0]).toBeDefined();
expect((queryA.aggregations?.[0] as TraceAggregation)?.expression).toBe(
'count(span_id)',
);
expect(queryA.disabled).toBe(false);
// Query B: P99 Latency
expect(queryB.queryName).toBe('B');
expect(queryB.aggregateOperator).toBe('p99');
expect((queryB.aggregations?.[0] as TraceAggregation)?.expression).toBe(
'p99(duration_nano)',
);
expect(queryB.disabled).toBe(false);
// Query C: Rate
expect(queryC.queryName).toBe('C');
expect(queryC.aggregateOperator).toBe('rate');
expect(queryC.disabled).toBe(false);
// All group by response_status_code
[queryA, queryB, queryC].forEach((query) => {
expect(query.groupBy).toContainEqual(
expect.objectContaining({
key: 'response_status_code',
dataType: 'string',
type: 'span',
}),
);
});
// CRITICAL: All have identical filter expressions
expect(queryA.filter?.expression).toBe(queryB.filter?.expression);
expect(queryB.filter?.expression).toBe(queryC.filter?.expression);
});
});
describe('3. Custom Filters Integration', () => {
it('merges custom filters into filter expression with AND logic', () => {
const customFilters: IBuilderQuery['filters'] = {
items: [
{
id: 'test-1',
key: {
key: 'service.name',
dataType: 'string' as any,
type: 'resource',
},
op: '=',
value: 'user-service',
},
{
id: 'test-2',
key: {
key: 'deployment.environment',
dataType: 'string' as any,
type: 'resource',
},
op: '=',
value: 'production',
},
],
op: 'AND',
};
const payload = getEndPointDetailsQueryPayload(
mockDomainName,
mockStartTime,
mockEndTime,
customFilters,
);
const statusCodeQuery = payload[1];
const expression =
statusCodeQuery.query.builder.queryData[0].filter?.expression;
// Base filters present
expect(expression).toContain('net.peer.name');
expect(expression).toContain("kind_string = 'Client'");
expect(expression).toContain('response_status_code EXISTS');
// Custom filters merged
expect(expression).toContain('service.name');
expect(expression).toContain('user-service');
expect(expression).toContain('deployment.environment');
expect(expression).toContain('production');
// All three queries have the same merged expression
const queries = statusCodeQuery.query.builder.queryData;
expect(queries[0].filter?.expression).toBe(queries[1].filter?.expression);
expect(queries[1].filter?.expression).toBe(queries[2].filter?.expression);
});
});
describe('4. HTTP URL Filter Handling', () => {
it('converts http.url filter to (http.url OR url.full) expression', () => {
const filtersWithHttpUrl: IBuilderQuery['filters'] = {
items: [
{
id: 'http-url-filter',
key: {
key: 'http.url',
dataType: 'string' as any,
type: 'tag',
},
op: '=',
value: '/api/users',
},
{
id: 'service-filter',
key: {
key: 'service.name',
dataType: 'string' as any,
type: 'resource',
},
op: '=',
value: 'user-service',
},
],
op: 'AND',
};
const payload = getEndPointDetailsQueryPayload(
mockDomainName,
mockStartTime,
mockEndTime,
filtersWithHttpUrl,
);
const statusCodeQuery = payload[1];
const expression =
statusCodeQuery.query.builder.queryData[0].filter?.expression;
// CRITICAL: http.url converted to OR logic
expect(expression).toContain(
"(http.url = '/api/users' OR url.full = '/api/users')",
);
// Other filters still present
expect(expression).toContain('service.name');
expect(expression).toContain('user-service');
// Base filters present
expect(expression).toContain('net.peer.name');
expect(expression).toContain("kind_string = 'Client'");
expect(expression).toContain('response_status_code EXISTS');
// All ANDed together (at least 2 ANDs: domain+kind, custom filter, url condition)
expect(expression?.match(/AND/g)?.length).toBeGreaterThanOrEqual(2);
});
});
});

View File

@@ -1,17 +1,11 @@
import { fireEvent, render, screen, within } from '@testing-library/react'; import { BuilderQuery } from 'api/v5/v5';
import { useNavigateToExplorer } from 'components/CeleryTask/useNavigateToExplorer'; import { useNavigateToExplorer } from 'components/CeleryTask/useNavigateToExplorer';
import { REACT_QUERY_KEY } from 'constants/reactQueryKeys'; import { rest, server } from 'mocks-server/server';
import { import { fireEvent, render, screen, waitFor, within } from 'tests/test-utils';
formatTopErrorsDataForTable,
getEndPointDetailsQueryPayload,
getTopErrorsColumnsConfig,
getTopErrorsCoRelationQueryFilters,
getTopErrorsQueryPayload,
} from 'container/ApiMonitoring/utils';
import { useQueries } from 'react-query';
import { DataSource } from 'types/common/queryBuilder'; import { DataSource } from 'types/common/queryBuilder';
import TopErrors from '../Explorer/Domains/DomainDetails/TopErrors'; import TopErrors from '../Explorer/Domains/DomainDetails/TopErrors';
import { getTopErrorsQueryPayload } from '../utils';
// Mock the EndPointsDropDown component to avoid issues // Mock the EndPointsDropDown component to avoid issues
jest.mock( jest.mock(
@@ -35,26 +29,14 @@ jest.mock(
}), }),
); );
// Mock dependencies
jest.mock('react-query', () => ({
...jest.requireActual('react-query'),
useQueries: jest.fn(),
}));
jest.mock('components/CeleryTask/useNavigateToExplorer', () => ({ jest.mock('components/CeleryTask/useNavigateToExplorer', () => ({
useNavigateToExplorer: jest.fn(), useNavigateToExplorer: jest.fn(),
})); }));
jest.mock('container/ApiMonitoring/utils', () => ({
END_POINT_DETAILS_QUERY_KEYS_ARRAY: ['key1', 'key2', 'key3', 'key4', 'key5'],
formatTopErrorsDataForTable: jest.fn(),
getEndPointDetailsQueryPayload: jest.fn(),
getTopErrorsColumnsConfig: jest.fn(),
getTopErrorsCoRelationQueryFilters: jest.fn(),
getTopErrorsQueryPayload: jest.fn(),
}));
describe('TopErrors', () => { describe('TopErrors', () => {
const TABLE_BODY_SELECTOR = '.ant-table-tbody';
const V5_QUERY_RANGE_API_PATH = '*/api/v5/query_range';
const mockProps = { const mockProps = {
// eslint-disable-next-line sonarjs/no-duplicate-string // eslint-disable-next-line sonarjs/no-duplicate-string
domainName: 'test-domain', domainName: 'test-domain',
@@ -68,75 +50,72 @@ describe('TopErrors', () => {
}, },
}; };
// Setup basic mocks // Helper function to wait for table data to load
const waitForTableDataToLoad = async (
container: HTMLElement,
): Promise<void> => {
await waitFor(() => {
const tableBody = container.querySelector(TABLE_BODY_SELECTOR);
expect(tableBody).not.toBeNull();
if (tableBody) {
expect(
within(tableBody as HTMLElement).queryByText('/api/test'),
).toBeInTheDocument();
}
});
};
beforeEach(() => { beforeEach(() => {
jest.clearAllMocks(); jest.clearAllMocks();
// Mock getTopErrorsColumnsConfig // Mock useNavigateToExplorer
(getTopErrorsColumnsConfig as jest.Mock).mockReturnValue([ (useNavigateToExplorer as jest.Mock).mockReturnValue(jest.fn());
{
title: 'Endpoint',
dataIndex: 'endpointName',
key: 'endpointName',
},
{
title: 'Status Code',
dataIndex: 'statusCode',
key: 'statusCode',
},
{
title: 'Status Message',
dataIndex: 'statusMessage',
key: 'statusMessage',
},
{
title: 'Count',
dataIndex: 'count',
key: 'count',
},
]);
// Mock useQueries // Mock V5 API endpoint for top errors
(useQueries as jest.Mock).mockImplementation((queryConfigs) => { server.use(
// For topErrorsDataQueries rest.post(V5_QUERY_RANGE_API_PATH, (_req, res, ctx) =>
if ( res(
queryConfigs.length === 1 && ctx.status(200),
queryConfigs[0].queryKey && ctx.json({
queryConfigs[0].queryKey[0] === REACT_QUERY_KEY.GET_TOP_ERRORS_BY_DOMAIN
) {
return [
{
data: { data: {
payload: { data: {
data: { results: [
result: [ {
{ columns: [
metric: { {
'http.url': '/api/test', name: 'http.url',
status_code: '500', fieldDataType: 'string',
// eslint-disable-next-line sonarjs/no-duplicate-string fieldContext: 'attribute',
status_message: 'Internal Server Error',
}, },
values: [[1000000100, '10']], {
queryName: 'A', name: 'response_status_code',
legend: 'Test Legend', fieldDataType: 'string',
}, fieldContext: 'span',
], },
}, {
name: 'status_message',
fieldDataType: 'string',
fieldContext: 'span',
},
{ name: 'count()', fieldDataType: 'int64', fieldContext: '' },
],
// eslint-disable-next-line sonarjs/no-duplicate-string
data: [['/api/test', '500', 'Internal Server Error', 10]],
},
],
}, },
}, },
isLoading: false, }),
isRefetching: false, ),
isError: false, ),
refetch: jest.fn(), );
},
];
}
// For endPointDropDownDataQueries // Mock V4 API endpoint for dropdown data
return [ server.use(
{ rest.post('*/api/v1/query_range', (_req, res, ctx) =>
data: { res(
ctx.status(200),
ctx.json({
payload: { payload: {
data: { data: {
result: [ result: [
@@ -153,62 +132,13 @@ describe('TopErrors', () => {
], ],
}, },
}, },
}, }),
isLoading: false, ),
isRefetching: false, ),
isError: false, );
},
];
});
// Mock formatTopErrorsDataForTable
(formatTopErrorsDataForTable as jest.Mock).mockReturnValue([
{
key: '1',
endpointName: '/api/test',
statusCode: '500',
statusMessage: 'Internal Server Error',
count: 10,
},
]);
// Mock getTopErrorsQueryPayload
(getTopErrorsQueryPayload as jest.Mock).mockReturnValue([
{
queryName: 'TopErrorsQuery',
start: mockProps.timeRange.startTime,
end: mockProps.timeRange.endTime,
step: 60,
},
]);
// Mock getEndPointDetailsQueryPayload
(getEndPointDetailsQueryPayload as jest.Mock).mockReturnValue([
{},
{},
{
queryName: 'EndpointDropdownQuery',
start: mockProps.timeRange.startTime,
end: mockProps.timeRange.endTime,
step: 60,
},
]);
// Mock useNavigateToExplorer
(useNavigateToExplorer as jest.Mock).mockReturnValue(jest.fn());
// Mock getTopErrorsCoRelationQueryFilters
(getTopErrorsCoRelationQueryFilters as jest.Mock).mockReturnValue({
items: [
{ id: 'test1', key: { key: 'domain' }, op: '=', value: 'test-domain' },
{ id: 'test2', key: { key: 'endpoint' }, op: '=', value: '/api/test' },
{ id: 'test3', key: { key: 'status' }, op: '=', value: '500' },
],
op: 'AND',
});
}); });
it('renders component correctly', () => { it('renders component correctly', async () => {
// eslint-disable-next-line react/jsx-props-no-spreading // eslint-disable-next-line react/jsx-props-no-spreading
const { container } = render(<TopErrors {...mockProps} />); const { container } = render(<TopErrors {...mockProps} />);
@@ -216,10 +146,11 @@ describe('TopErrors', () => {
expect(screen.getByText('Errors with Status Message')).toBeInTheDocument(); expect(screen.getByText('Errors with Status Message')).toBeInTheDocument();
expect(screen.getByText('Status Message Exists')).toBeInTheDocument(); expect(screen.getByText('Status Message Exists')).toBeInTheDocument();
// Find the table row and verify content // Wait for data to load
const tableBody = container.querySelector('.ant-table-tbody'); await waitForTableDataToLoad(container);
expect(tableBody).not.toBeNull();
// Find the table row and verify content
const tableBody = container.querySelector(TABLE_BODY_SELECTOR);
if (tableBody) { if (tableBody) {
const row = within(tableBody as HTMLElement).getByRole('row'); const row = within(tableBody as HTMLElement).getByRole('row');
expect(within(row).getByText('/api/test')).toBeInTheDocument(); expect(within(row).getByText('/api/test')).toBeInTheDocument();
@@ -228,35 +159,40 @@ describe('TopErrors', () => {
} }
}); });
it('renders error state when isError is true', () => { it('renders error state when API fails', async () => {
// Mock useQueries to return isError: true // Mock API to return error
(useQueries as jest.Mock).mockImplementationOnce(() => [ server.use(
{ rest.post(V5_QUERY_RANGE_API_PATH, (_req, res, ctx) =>
isError: true, res(ctx.status(500), ctx.json({ error: 'Internal Server Error' })),
refetch: jest.fn(), ),
}, );
]);
// eslint-disable-next-line react/jsx-props-no-spreading // eslint-disable-next-line react/jsx-props-no-spreading
render(<TopErrors {...mockProps} />); render(<TopErrors {...mockProps} />);
// Error state should be shown with the actual text displayed in the UI // Wait for error state
expect( await waitFor(() => {
screen.getByText('Uh-oh :/ We ran into an error.'), expect(
).toBeInTheDocument(); screen.getByText('Uh-oh :/ We ran into an error.'),
).toBeInTheDocument();
});
expect(screen.getByText('Please refresh this panel.')).toBeInTheDocument(); expect(screen.getByText('Please refresh this panel.')).toBeInTheDocument();
expect(screen.getByText('Refresh this panel')).toBeInTheDocument(); expect(screen.getByText('Refresh this panel')).toBeInTheDocument();
}); });
it('handles row click correctly', () => { it('handles row click correctly', async () => {
const navigateMock = jest.fn(); const navigateMock = jest.fn();
(useNavigateToExplorer as jest.Mock).mockReturnValue(navigateMock); (useNavigateToExplorer as jest.Mock).mockReturnValue(navigateMock);
// eslint-disable-next-line react/jsx-props-no-spreading // eslint-disable-next-line react/jsx-props-no-spreading
const { container } = render(<TopErrors {...mockProps} />); const { container } = render(<TopErrors {...mockProps} />);
// Wait for data to load
await waitForTableDataToLoad(container);
// Find and click on the table cell containing the endpoint // Find and click on the table cell containing the endpoint
const tableBody = container.querySelector('.ant-table-tbody'); const tableBody = container.querySelector(TABLE_BODY_SELECTOR);
expect(tableBody).not.toBeNull(); expect(tableBody).not.toBeNull();
if (tableBody) { if (tableBody) {
@@ -267,11 +203,28 @@ describe('TopErrors', () => {
// Check if navigateToExplorer was called with correct params // Check if navigateToExplorer was called with correct params
expect(navigateMock).toHaveBeenCalledWith({ expect(navigateMock).toHaveBeenCalledWith({
filters: [ filters: expect.arrayContaining([
{ id: 'test1', key: { key: 'domain' }, op: '=', value: 'test-domain' }, expect.objectContaining({
{ id: 'test2', key: { key: 'endpoint' }, op: '=', value: '/api/test' }, key: expect.objectContaining({ key: 'http.url' }),
{ id: 'test3', key: { key: 'status' }, op: '=', value: '500' }, op: '=',
], value: '/api/test',
}),
expect.objectContaining({
key: expect.objectContaining({ key: 'has_error' }),
op: '=',
value: 'true',
}),
expect.objectContaining({
key: expect.objectContaining({ key: 'net.peer.name' }),
op: '=',
value: 'test-domain',
}),
expect.objectContaining({
key: expect.objectContaining({ key: 'response_status_code' }),
op: '=',
value: '500',
}),
]),
dataSource: DataSource.TRACES, dataSource: DataSource.TRACES,
startTime: mockProps.timeRange.startTime, startTime: mockProps.timeRange.startTime,
endTime: mockProps.timeRange.endTime, endTime: mockProps.timeRange.endTime,
@@ -279,24 +232,34 @@ describe('TopErrors', () => {
}); });
}); });
it('updates endpoint filter when dropdown value changes', () => { it('updates endpoint filter when dropdown value changes', async () => {
// eslint-disable-next-line react/jsx-props-no-spreading // eslint-disable-next-line react/jsx-props-no-spreading
render(<TopErrors {...mockProps} />); render(<TopErrors {...mockProps} />);
// Wait for initial load
await waitFor(() => {
expect(screen.getByRole('combobox')).toBeInTheDocument();
});
// Find the dropdown // Find the dropdown
const dropdown = screen.getByRole('combobox'); const dropdown = screen.getByRole('combobox');
// Mock the change // Mock the change
fireEvent.change(dropdown, { target: { value: '/api/new-endpoint' } }); fireEvent.change(dropdown, { target: { value: '/api/new-endpoint' } });
// Check if getTopErrorsQueryPayload was called with updated parameters // Component should re-render with new filter
expect(getTopErrorsQueryPayload).toHaveBeenCalled(); expect(dropdown).toBeInTheDocument();
}); });
it('handles status message toggle correctly', () => { it('handles status message toggle correctly', async () => {
// eslint-disable-next-line react/jsx-props-no-spreading // eslint-disable-next-line react/jsx-props-no-spreading
render(<TopErrors {...mockProps} />); render(<TopErrors {...mockProps} />);
// Wait for initial load
await waitFor(() => {
expect(screen.getByRole('switch')).toBeInTheDocument();
});
// Find the toggle switch // Find the toggle switch
const toggle = screen.getByRole('switch'); const toggle = screen.getByRole('switch');
expect(toggle).toBeInTheDocument(); expect(toggle).toBeInTheDocument();
@@ -307,69 +270,71 @@ describe('TopErrors', () => {
// Click the toggle to turn it off // Click the toggle to turn it off
fireEvent.click(toggle); fireEvent.click(toggle);
// Check if getTopErrorsQueryPayload was called with showStatusCodeErrors=false
expect(getTopErrorsQueryPayload).toHaveBeenCalledWith(
mockProps.domainName,
mockProps.timeRange.startTime,
mockProps.timeRange.endTime,
expect.any(Object),
false,
);
// Title should change // Title should change
expect(screen.getByText('All Errors')).toBeInTheDocument(); await waitFor(() => {
expect(screen.getByText('All Errors')).toBeInTheDocument();
});
// Click the toggle to turn it back on // Click the toggle to turn it back on
fireEvent.click(toggle); fireEvent.click(toggle);
// Check if getTopErrorsQueryPayload was called with showStatusCodeErrors=true
expect(getTopErrorsQueryPayload).toHaveBeenCalledWith(
mockProps.domainName,
mockProps.timeRange.startTime,
mockProps.timeRange.endTime,
expect.any(Object),
true,
);
// Title should change back // Title should change back
expect(screen.getByText('Errors with Status Message')).toBeInTheDocument(); await waitFor(() => {
expect(screen.getByText('Errors with Status Message')).toBeInTheDocument();
});
}); });
it('includes toggle state in query key for cache busting', () => { it('includes toggle state in query key for cache busting', async () => {
// eslint-disable-next-line react/jsx-props-no-spreading // eslint-disable-next-line react/jsx-props-no-spreading
render(<TopErrors {...mockProps} />); render(<TopErrors {...mockProps} />);
const toggle = screen.getByRole('switch'); // Wait for initial load
await waitFor(() => {
expect(screen.getByRole('switch')).toBeInTheDocument();
});
// Initial query should include showStatusCodeErrors=true const toggle = screen.getByRole('switch');
expect(useQueries).toHaveBeenCalledWith(
expect.arrayContaining([
expect.objectContaining({
queryKey: expect.arrayContaining([
REACT_QUERY_KEY.GET_TOP_ERRORS_BY_DOMAIN,
expect.any(Object),
expect.any(String),
true,
]),
}),
]),
);
// Click toggle // Click toggle
fireEvent.click(toggle); fireEvent.click(toggle);
// Query should be called with showStatusCodeErrors=false in key // Wait for title to change, indicating query was refetched with new key
expect(useQueries).toHaveBeenCalledWith( await waitFor(() => {
expect.arrayContaining([ expect(screen.getByText('All Errors')).toBeInTheDocument();
expect.objectContaining({ });
queryKey: expect.arrayContaining([
REACT_QUERY_KEY.GET_TOP_ERRORS_BY_DOMAIN, // The fact that data refetches when toggle changes proves the query key includes the toggle state
expect.any(Object), expect(toggle).toBeInTheDocument();
expect.any(String), });
false,
]), it('sends query_range v5 API call with required filters including has_error', async () => {
}), // let capturedRequest: any;
]),
const topErrorsPayload = getTopErrorsQueryPayload(
'test-domain',
mockProps.timeRange.startTime,
mockProps.timeRange.endTime,
{ items: [], op: 'AND' },
false,
);
// eslint-disable-next-line react/jsx-props-no-spreading
render(<TopErrors {...mockProps} />);
// Wait for the API call to be made
await waitFor(() => {
expect(topErrorsPayload).toBeDefined();
});
// Extract the filter expression from the captured request
// getTopErrorsQueryPayload returns a builder_query with TraceBuilderQuery spec
const builderQuery = topErrorsPayload.compositeQuery.queries[0]
.spec as BuilderQuery;
const filterExpression = builderQuery.filter?.expression;
// Verify all required filters are present
expect(filterExpression).toContain(
`kind_string = 'Client' AND (http.url EXISTS OR url.full EXISTS) AND (net.peer.name = 'test-domain' OR server.address = 'test-domain') AND has_error = true`,
); );
}); });
}); });

File diff suppressed because it is too large Load Diff

View File

@@ -112,6 +112,8 @@ function AppLayout(props: AppLayoutProps): JSX.Element {
setShowPaymentFailedWarning, setShowPaymentFailedWarning,
] = useState<boolean>(false); ] = useState<boolean>(false);
const errorBoundaryRef = useRef<Sentry.ErrorBoundary>(null);
const [showSlowApiWarning, setShowSlowApiWarning] = useState(false); const [showSlowApiWarning, setShowSlowApiWarning] = useState(false);
const [slowApiWarningShown, setSlowApiWarningShown] = useState(false); const [slowApiWarningShown, setSlowApiWarningShown] = useState(false);
@@ -378,6 +380,13 @@ function AppLayout(props: AppLayoutProps): JSX.Element {
getChangelogByVersionResponse.isSuccess, getChangelogByVersionResponse.isSuccess,
]); ]);
// reset error boundary on route change
useEffect(() => {
if (errorBoundaryRef.current) {
errorBoundaryRef.current.resetErrorBoundary();
}
}, [pathname]);
const isToDisplayLayout = isLoggedIn; const isToDisplayLayout = isLoggedIn;
const routeKey = useMemo(() => getRouteKey(pathname), [pathname]); const routeKey = useMemo(() => getRouteKey(pathname), [pathname]);
@@ -836,7 +845,10 @@ function AppLayout(props: AppLayoutProps): JSX.Element {
})} })}
data-overlayscrollbars-initialize data-overlayscrollbars-initialize
> >
<Sentry.ErrorBoundary fallback={<ErrorBoundaryFallback />}> <Sentry.ErrorBoundary
fallback={<ErrorBoundaryFallback />}
ref={errorBoundaryRef}
>
<LayoutContent data-overlayscrollbars-initialize> <LayoutContent data-overlayscrollbars-initialize>
<OverlayScrollbar> <OverlayScrollbar>
<ChildrenContainer> <ChildrenContainer>

View File

@@ -11,12 +11,14 @@ import { v4 } from 'uuid';
import { useCreateAlertState } from '../context'; import { useCreateAlertState } from '../context';
import { import {
INITIAL_EVALUATION_WINDOW_STATE,
INITIAL_INFO_THRESHOLD, INITIAL_INFO_THRESHOLD,
INITIAL_RANDOM_THRESHOLD, INITIAL_RANDOM_THRESHOLD,
INITIAL_WARNING_THRESHOLD, INITIAL_WARNING_THRESHOLD,
THRESHOLD_MATCH_TYPE_OPTIONS, THRESHOLD_MATCH_TYPE_OPTIONS,
THRESHOLD_OPERATOR_OPTIONS, THRESHOLD_OPERATOR_OPTIONS,
} from '../context/constants'; } from '../context/constants';
import { AlertThresholdMatchType } from '../context/types';
import EvaluationSettings from '../EvaluationSettings/EvaluationSettings'; import EvaluationSettings from '../EvaluationSettings/EvaluationSettings';
import ThresholdItem from './ThresholdItem'; import ThresholdItem from './ThresholdItem';
import { AnomalyAndThresholdProps, UpdateThreshold } from './types'; import { AnomalyAndThresholdProps, UpdateThreshold } from './types';
@@ -38,12 +40,12 @@ function AlertThreshold({
alertState, alertState,
thresholdState, thresholdState,
setThresholdState, setThresholdState,
setEvaluationWindow,
notificationSettings, notificationSettings,
setNotificationSettings, setNotificationSettings,
} = useCreateAlertState(); } = useCreateAlertState();
const { currentQuery } = useQueryBuilder(); const { currentQuery } = useQueryBuilder();
const queryNames = getQueryNames(currentQuery); const queryNames = getQueryNames(currentQuery);
useEffect(() => { useEffect(() => {
@@ -160,6 +162,54 @@ function AlertThreshold({
}), }),
); );
const handleSetEvaluationDetailsForMeter = (): void => {
setEvaluationWindow({
type: 'SET_INITIAL_STATE_FOR_METER',
});
setThresholdState({
type: 'SET_MATCH_TYPE',
payload: AlertThresholdMatchType.IN_TOTAL,
});
};
const handleSelectedQueryChange = (value: string): void => {
// loop through currenttQuery and find the query that matches the selected query
const query = currentQuery?.builder?.queryData.find(
(query) => query.queryName === value,
);
const currentSelectedQuery = currentQuery?.builder?.queryData.find(
(query) => query.queryName === thresholdState.selectedQuery,
);
const newSelectedQuerySource = query?.source || '';
const currentSelectedQuerySource = currentSelectedQuery?.source || '';
if (newSelectedQuerySource === currentSelectedQuerySource) {
setThresholdState({
type: 'SET_SELECTED_QUERY',
payload: value,
});
return;
}
if (newSelectedQuerySource === 'meter') {
handleSetEvaluationDetailsForMeter();
} else {
setEvaluationWindow({
type: 'SET_INITIAL_STATE',
payload: INITIAL_EVALUATION_WINDOW_STATE,
});
}
setThresholdState({
type: 'SET_SELECTED_QUERY',
payload: value,
});
};
return ( return (
<div <div
className={classNames( className={classNames(
@@ -175,14 +225,10 @@ function AlertThreshold({
</Typography.Text> </Typography.Text>
<Select <Select
value={thresholdState.selectedQuery} value={thresholdState.selectedQuery}
onChange={(value): void => { onChange={handleSelectedQueryChange}
setThresholdState({
type: 'SET_SELECTED_QUERY',
payload: value,
});
}}
style={{ width: 80 }} style={{ width: 80 }}
options={queryNames} options={queryNames}
data-testid="alert-threshold-query-select"
/> />
<Typography.Text className="sentence-text">is</Typography.Text> <Typography.Text className="sentence-text">is</Typography.Text>
<Select <Select
@@ -195,6 +241,7 @@ function AlertThreshold({
}} }}
style={{ width: 180 }} style={{ width: 180 }}
options={THRESHOLD_OPERATOR_OPTIONS} options={THRESHOLD_OPERATOR_OPTIONS}
data-testid="alert-threshold-operator-select"
/> />
<Typography.Text className="sentence-text"> <Typography.Text className="sentence-text">
the threshold(s) the threshold(s)
@@ -209,6 +256,7 @@ function AlertThreshold({
}} }}
style={{ width: 180 }} style={{ width: 180 }}
options={matchTypeOptionsWithTooltips} options={matchTypeOptionsWithTooltips}
data-testid="alert-threshold-match-type-select"
/> />
<Typography.Text className="sentence-text"> <Typography.Text className="sentence-text">
during the <EvaluationSettings /> during the <EvaluationSettings />
@@ -236,6 +284,7 @@ function AlertThreshold({
icon={<Plus size={16} />} icon={<Plus size={16} />}
onClick={addThreshold} onClick={addThreshold}
className="add-threshold-btn" className="add-threshold-btn"
data-testid="add-threshold-button"
> >
Add Threshold Add Threshold
</Button> </Button>

View File

@@ -32,6 +32,7 @@ function ThresholdItem({
style={{ width: 150 }} style={{ width: 150 }}
options={units} options={units}
disabled={units.length === 0} disabled={units.length === 0}
data-testid="threshold-unit-select"
/> />
); );
if (units.length === 0) { if (units.length === 0) {
@@ -47,6 +48,7 @@ function ThresholdItem({
style={{ width: 150 }} style={{ width: 150 }}
options={units} options={units}
disabled={units.length === 0} disabled={units.length === 0}
data-testid="threshold-unit-select"
/> />
</Tooltip> </Tooltip>
); );
@@ -96,6 +98,7 @@ function ThresholdItem({
updateThreshold(threshold.id, 'label', e.target.value) updateThreshold(threshold.id, 'label', e.target.value)
} }
style={{ width: 200 }} style={{ width: 200 }}
data-testid="threshold-name-input"
/> />
<Typography.Text className="sentence-text">on value</Typography.Text> <Typography.Text className="sentence-text">on value</Typography.Text>
<Typography.Text className="sentence-text highlighted-text"> <Typography.Text className="sentence-text highlighted-text">
@@ -109,6 +112,7 @@ function ThresholdItem({
} }
style={{ width: 100 }} style={{ width: 100 }}
type="number" type="number"
data-testid="threshold-value-input"
/> />
{yAxisUnitSelect} {yAxisUnitSelect}
{!notificationSettings.routingPolicies && ( {!notificationSettings.routingPolicies && (
@@ -119,10 +123,12 @@ function ThresholdItem({
onChange={(value): void => onChange={(value): void =>
updateThreshold(threshold.id, 'channels', value) updateThreshold(threshold.id, 'channels', value)
} }
data-testid="threshold-notification-channel-select"
style={{ width: 350 }} style={{ width: 350 }}
options={channels.map((channel) => ({ options={channels.map((channel) => ({
value: channel.name, value: channel.name,
label: channel.name, label: channel.name,
'data-testid': `threshold-notification-channel-option-${threshold.label}`,
}))} }))}
mode="multiple" mode="multiple"
placeholder="Select notification channels" placeholder="Select notification channels"
@@ -157,6 +163,7 @@ function ThresholdItem({
} }
style={{ width: 100 }} style={{ width: 100 }}
type="number" type="number"
data-testid="recovery-threshold-value-input"
/> />
<Tooltip title="Remove recovery threshold"> <Tooltip title="Remove recovery threshold">
<Button <Button
@@ -164,6 +171,7 @@ function ThresholdItem({
icon={<Trash size={16} />} icon={<Trash size={16} />}
onClick={removeRecoveryThreshold} onClick={removeRecoveryThreshold}
className="icon-btn" className="icon-btn"
data-testid="remove-recovery-threshold-button"
/> />
</Tooltip> </Tooltip>
</> </>
@@ -187,6 +195,7 @@ function ThresholdItem({
icon={<CircleX size={16} />} icon={<CircleX size={16} />}
onClick={(): void => removeThreshold(threshold.id)} onClick={(): void => removeThreshold(threshold.id)}
className="icon-btn" className="icon-btn"
data-testid="remove-threshold-button"
/> />
</Tooltip> </Tooltip>
)} )}

View File

@@ -50,6 +50,7 @@ export function getCategorySelectOptionByName(
(unit) => ({ (unit) => ({
label: unit.name, label: unit.name,
value: unit.id, value: unit.id,
'data-testid': `threshold-unit-select-option-${unit.id}`,
}), }),
) || [] ) || []
); );
@@ -401,6 +402,7 @@ export function RoutingPolicyBanner({
</Typography.Text> </Typography.Text>
<Switch <Switch
checked={notificationSettings.routingPolicies} checked={notificationSettings.routingPolicies}
data-testid="routing-policies-switch"
onChange={(value): void => { onChange={(value): void => {
setNotificationSettings({ setNotificationSettings({
type: 'SET_ROUTING_POLICIES', type: 'SET_ROUTING_POLICIES',

Some files were not shown because too many files have changed in this diff Show More