Compare commits

...

37 Commits

Author SHA1 Message Date
Srikanth Chekuri
04a68ad444 chore: adjust timestamps 2024-05-21 08:06:49 +05:30
Srikanth Chekuri
0c0f9a0481 chore: skip test 2024-05-21 07:30:39 +05:30
Srikanth Chekuri
3c27d59ad4 chore: add for only formula 2024-05-21 06:22:22 +05:30
Srikanth Chekuri
a891dd4b50 chore: fix timestamps 2024-05-20 21:49:23 +05:30
Srikanth Chekuri
10530582ab fix: add zero value for missing timestamps in alert eval 2024-05-20 15:22:19 +05:30
Srikanth Chekuri
9ff0e34038 chore: migrate alerts to v4 for supported operators (#5010) 2024-05-17 07:45:03 +05:30
Vikrant Gupta
d313f44556 fix: multiple widgets getting created and hence blocking the delete (#5015)
* fix: multiple widgets getting created and hence blocking the delete

* fix: allow multiple deletes when multiple widgets present with same id

* chore: use the avg for limit

---------

Co-authored-by: Srikanth Chekuri <srikanth.chekuri92@gmail.com>
2024-05-17 07:44:33 +05:30
Raj Kamal Singh
5a778dcb18 Chore: integrations: populate updatedAt for integration dashboards (#5019)
* chore: add test for updatedAt value being populated in integration dashboards and get it passing

* chore: also populate createdAt, createBy and updateBy for instaled integration dashboards

* chore: update clickhouse integration config instructions
2024-05-16 21:44:46 +05:30
Srikanth Chekuri
7e31b4ca01 fix: several issues (#5001) 2024-05-15 18:52:01 +05:30
Raj Kamal Singh
3efd9801a1 Chore: restrict logs connection test for integrations to use log attributes for identifying logs (#4977)
* chore: change logs connection test spec to be based on an attrib value

* chore: disallow unknown fields while unmarshalling JSON for an integration

* chore: add description field to collected metric spec

* chore: update logs connection test for builtin integrations

* chore: update logic for calculating logs connection status
2024-05-15 14:36:52 +05:30
Vishal Sharma
0cbaa17d9f chore: allow unlimited dashboards and alerts in community version (#4989)
* chore: allow unlimited dashboards and alerts in community version

* chore: update ee plan
2024-05-14 18:05:59 +05:30
Nityananda Gohain
30bfad527f chore: enable limits for trace queries (#4997) 2024-05-14 17:03:29 +05:30
Srikanth Chekuri
9f1c45bc32 chore: add toUnixTimestamp to supported functions (#4877) 2024-05-14 10:34:43 +05:30
Vikrant Gupta
51becf7cfb fix: added right padding to the notifications bar to show cancel button (#4969) 2024-05-12 16:45:16 +05:30
Vibhu Pandey
7460e650af feat(workflow): integrate with workflow identity pool (#4945)
* feat(workflows): add wif workflow
* feat(workflows): add name of compute instance
* feat(workflows): fix permissions
* feat(workflows):  add an OR true since github runs with -e
* ci(testing-deployment): include GITHUB envs
* ci(testing-deployment): move GCP information to secrets
* ci(staging-deployment): wif workflow

---------

Co-authored-by: Prashant Shahi <prashant@signoz.io>
2024-05-10 23:23:31 +05:30
Vikrant Gupta
211fe4fdd5 fix: prevent page from crashing in case items in filters is null (#4964)
* fix: prevent page from crashing in case items in filters is null

* fix: added null check for filters as well
2024-05-06 19:18:27 +05:30
Vikrant Gupta
e2992b42c1 fix: make integrations available for the ee cloud user (#4963) 2024-05-06 19:17:50 +05:30
Nityananda Gohain
3957d91a9b fix: add read-in-order config (#4918) 2024-05-06 15:01:53 +05:30
Vishal Sharma
967aa16f21 feat: sort tags and events in trace detail (#4962) 2024-05-05 09:03:31 +05:30
Vikrant Gupta
08b1a87cb5 Revert "fix: step interval not getting updated on time range change (#4944)" (#4955)
This reverts commit 5c1c09c790.
2024-05-01 22:46:32 +05:30
SagarRajput-7
03ddcdd20e feat: added test cases for pipeline pages (#4872)
* feat: added test cases for pipeline pages

* feat: added test cases for changeHistory

* chore: change history table test case added

* chore: added create pipeline button test cases

* chore: updated useAnalytics mocking
2024-05-01 18:49:04 +05:30
SagarRajput-7
1aec7f3ca6 feat: added tooltips for facing issue btn (#4948) 2024-05-01 18:36:56 +05:30
Vikrant Gupta
241edcb88a fix: text change for saved views in traces (#4953) 2024-05-01 18:28:05 +05:30
Srikanth Chekuri
27d12871af chore: disallow small step intervals for large durations (#4950) 2024-05-01 17:03:46 +05:30
Yunus M
e78e1d4b63 fix: add safety checks to handle null response from query range API (#4939) 2024-05-01 15:49:30 +05:30
Yunus M
64bf580323 feat: show milliseconds in timestamp in logs views (#4949)
* feat: show milliseconds in timestamp in logs views

* fix: remove console log

---------

Co-authored-by: Vikrant Gupta <vikrant.thomso@gmail.com>
2024-05-01 15:27:48 +05:30
SagarRajput-7
152aa4b518 fix: fixed facing issue btn alignment issue (#4936)
* fix: fixed facing issue btn alignment issue

* fix: fixed facing issue btn alignment issue

* fix: moved intercom help messages to util file
2024-05-01 14:49:42 +05:30
Vikrant Gupta
b3d5831574 fix: ch queries sending builder as query type in query range api for exceptions alerts (#4941)
* fix: ch queries sending builder as query type in query range api for exceptions alerts

* fix: ch queries sending builder as query type in query range api for exceptions alerts

* fix: alerts routing from logs explorer and dashboards
2024-05-01 14:39:39 +05:30
Vikrant Gupta
b85b9f42ed fix: time interval not getting updated in case of edit dashboard (#4940) 2024-05-01 13:00:18 +05:30
Vikrant Gupta
5c1c09c790 fix: step interval not getting updated on time range change (#4944) 2024-05-01 12:47:33 +05:30
Vishal Sharma
33960b05fd chore: update facing issues text (#4942) 2024-04-30 23:38:15 +05:30
Vikrant Gupta
191d9b0648 feat: introducing collapsable rows for dashboards (#4806)
* feat: dashboard panel grouping initial setup

* feat: added panel map to the dashboard response and subsequent types for the same

* feat: added panel map to the dashboard response and subsequent types for the same

* feat: added settings modal

* feat: handle panel collapse and open changes

* feat: handle creating panel map when dashboard layout changes

* feat: handle creating panel map when dashboard layout changes

* feat: refactor code

* feat: handle multiple collapsable rows

* fix: type issues

* feat: handle row collapse button and scroll

* feat: handle y axis movement for rows

* feat: handle delete row

* feat: handle settings name change

* feat: disable collapse/uncollapse when dashboard loading to avoid async states

* feat: decrease the height of the grouping row

* fix: row height management

* fix: handle empty row case

* feat: remove resize handle from the row

* feat: handle re-arrangement of panels

* feat: increase height of default new widget

* feat: added safety checks
2024-04-30 14:36:47 +05:30
Srikanth Chekuri
7d81bc3417 fix: value panel restriction should be on enabled queries (#4934) 2024-04-30 09:53:03 +05:30
Srikanth Chekuri
506916661d fix: metric limit works with cache (#4935) 2024-04-30 01:25:50 +05:30
Nityananda Gohain
5326f2d23b fix: dont enrich if non empty keys are not same (#4930)
* fix: dont enrich if non empty keys are not same

* fix: update if any of the type and dataType is empty but other is matching
2024-04-29 22:40:40 +05:30
dependabot[bot]
dfaa344dce chore(deps): bump express from 4.18.2 to 4.19.2 in /frontend (#4840)
Bumps [express](https://github.com/expressjs/express) from 4.18.2 to 4.19.2.
- [Release notes](https://github.com/expressjs/express/releases)
- [Changelog](https://github.com/expressjs/express/blob/master/History.md)
- [Commits](https://github.com/expressjs/express/compare/4.18.2...4.19.2)

---
updated-dependencies:
- dependency-name: express
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-04-29 21:29:27 +05:30
SagarRajput-7
882b540a0b chore: [SIG-583]: Jest coverage collection config (#4920)
* chore: [SIG-583]: Jest coverage collection config

* fix: added missing attribute
2024-04-27 11:31:37 +05:30
91 changed files with 2596 additions and 658 deletions

View File

@@ -9,34 +9,46 @@ jobs:
     name: Deploy latest develop branch to staging
     runs-on: ubuntu-latest
     environment: staging
+    permissions:
+      contents: 'read'
+      id-token: 'write'
     steps:
-      - name: Executing remote ssh commands using ssh key
-        uses: appleboy/ssh-action@v1.0.3
-        env:
-          GITHUB_BRANCH: develop
-          GITHUB_SHA: ${{ github.sha }}
+      - id: 'auth'
+        uses: 'google-github-actions/auth@v2'
         with:
-          host: ${{ secrets.HOST_DNS }}
-          username: ${{ secrets.USERNAME }}
-          key: ${{ secrets.SSH_KEY }}
-          envs: GITHUB_BRANCH,GITHUB_SHA
-          command_timeout: 60m
-          script: |
-            echo "GITHUB_BRANCH: ${GITHUB_BRANCH}"
-            echo "GITHUB_SHA: ${GITHUB_SHA}"
-            export DOCKER_TAG="${GITHUB_SHA:0:7}" # needed for child process to access it
-            export OTELCOL_TAG="main"
-            export PATH="/usr/local/go/bin/:$PATH" # needed for Golang to work
-            docker system prune --force
-            docker pull signoz/signoz-otel-collector:main
-            docker pull signoz/signoz-schema-migrator:main
-            cd ~/signoz
-            git status
-            git add .
-            git stash push -m "stashed on $(date --iso-8601=seconds)"
-            git fetch origin
-            git checkout ${GITHUB_BRANCH}
-            git pull
-            make build-ee-query-service-amd64
-            make build-frontend-amd64
-            make run-signoz
+          workload_identity_provider: ${{ secrets.GCP_WORKLOAD_IDENTITY_PROVIDER }}
+          service_account: ${{ secrets.GCP_SERVICE_ACCOUNT }}
+      - name: 'sdk'
+        uses: 'google-github-actions/setup-gcloud@v2'
+      - name: 'ssh'
+        shell: bash
+        env:
+          GITHUB_BRANCH: ${{ github.head_ref || github.ref_name }}
+          GITHUB_SHA: ${{ github.sha }}
+          GCP_PROJECT: ${{ secrets.GCP_PROJECT }}
+          GCP_ZONE: ${{ secrets.GCP_ZONE }}
+          GCP_INSTANCE: ${{ secrets.GCP_INSTANCE }}
+        run: |
+          read -r -d '' COMMAND <<EOF || true
+          echo "GITHUB_BRANCH: ${GITHUB_BRANCH}"
+          echo "GITHUB_SHA: ${GITHUB_SHA}"
+          export DOCKER_TAG="${GITHUB_SHA:0:7}" # needed for child process to access it
+          export OTELCOL_TAG="main"
+          export PATH="/usr/local/go/bin/:$PATH" # needed for Golang to work
+          docker system prune --force
+          docker pull signoz/signoz-otel-collector:main
+          docker pull signoz/signoz-schema-migrator:main
+          cd ~/signoz
+          git status
+          git add .
+          git stash push -m "stashed on $(date --iso-8601=seconds)"
+          git fetch origin
+          git checkout ${GITHUB_BRANCH}
+          git pull
+          make build-ee-query-service-amd64
+          make build-frontend-amd64
+          make run-signoz
+          EOF
+          gcloud compute ssh ${GCP_INSTANCE} --zone ${GCP_ZONE} --tunnel-through-iap --project ${GCP_PROJECT} --command "${COMMAND}"

View File

@@ -9,35 +9,47 @@ jobs:
     runs-on: ubuntu-latest
    environment: testing
    if: ${{ github.event.label.name == 'testing-deploy' }}
+    permissions:
+      contents: 'read'
+      id-token: 'write'
     steps:
-      - name: Executing remote ssh commands using ssh key
-        uses: appleboy/ssh-action@v1.0.3
+      - id: 'auth'
+        uses: 'google-github-actions/auth@v2'
+        with:
+          workload_identity_provider: ${{ secrets.GCP_WORKLOAD_IDENTITY_PROVIDER }}
+          service_account: ${{ secrets.GCP_SERVICE_ACCOUNT }}
+      - name: 'sdk'
+        uses: 'google-github-actions/setup-gcloud@v2'
+      - name: 'ssh'
+        shell: bash
         env:
          GITHUB_BRANCH: ${{ github.head_ref || github.ref_name }}
          GITHUB_SHA: ${{ github.sha }}
-        with:
-          host: ${{ secrets.HOST_DNS }}
-          username: ${{ secrets.USERNAME }}
-          key: ${{ secrets.SSH_KEY }}
-          envs: GITHUB_BRANCH,GITHUB_SHA
-          command_timeout: 60m
-          script: |
-            echo "GITHUB_BRANCH: ${GITHUB_BRANCH}"
-            echo "GITHUB_SHA: ${GITHUB_SHA}"
-            export DOCKER_TAG="${GITHUB_SHA:0:7}" # needed for child process to access it
-            export DEV_BUILD="1"
-            export PATH="/usr/local/go/bin/:$PATH" # needed for Golang to work
-            docker system prune --force
-            cd ~/signoz
-            git status
-            git add .
-            git stash push -m "stashed on $(date --iso-8601=seconds)"
-            git fetch origin
-            git checkout develop
-            git pull
-            # This is added to include the scenerio when new commit in PR is force-pushed
-            git branch -D ${GITHUB_BRANCH}
-            git checkout --track origin/${GITHUB_BRANCH}
-            make build-ee-query-service-amd64
-            make build-frontend-amd64
-            make run-signoz
+          GCP_PROJECT: ${{ secrets.GCP_PROJECT }}
+          GCP_ZONE: ${{ secrets.GCP_ZONE }}
+          GCP_INSTANCE: ${{ secrets.GCP_INSTANCE }}
+        run: |
+          read -r -d '' COMMAND <<EOF || true
+          echo "GITHUB_BRANCH: ${GITHUB_BRANCH}"
+          echo "GITHUB_SHA: ${GITHUB_SHA}"
+          export DOCKER_TAG="${GITHUB_SHA:0:7}" # needed for child process to access it
+          export DEV_BUILD="1"
+          export PATH="/usr/local/go/bin/:$PATH" # needed for Golang to work
+          docker system prune --force
+          cd ~/signoz
+          git status
+          git add .
+          git stash push -m "stashed on $(date --iso-8601=seconds)"
+          git fetch origin
+          git checkout develop
+          git pull
+          # This is added to include the scenerio when new commit in PR is force-pushed
+          git branch -D ${GITHUB_BRANCH}
+          git checkout --track origin/${GITHUB_BRANCH}
+          make build-ee-query-service-amd64
+          make build-frontend-amd64
+          make run-signoz
+          EOF
+          gcloud compute ssh ${GCP_INSTANCE} --zone ${GCP_ZONE} --tunnel-through-iap --project ${GCP_PROJECT} --command "${COMMAND}"

View File

@@ -14,7 +14,9 @@ import (
semconv "go.opentelemetry.io/otel/semconv/v1.4.0"
"go.signoz.io/signoz/ee/query-service/app"
"go.signoz.io/signoz/pkg/query-service/auth"
"go.signoz.io/signoz/pkg/query-service/constants"
baseconst "go.signoz.io/signoz/pkg/query-service/constants"
"go.signoz.io/signoz/pkg/query-service/migrate"
"go.signoz.io/signoz/pkg/query-service/version"
"google.golang.org/grpc"
"google.golang.org/grpc/credentials/insecure"
@@ -143,6 +145,12 @@ func main() {
 		zap.L().Info("JWT secret key set successfully.")
 	}
 
+	if err := migrate.Migrate(constants.RELATIONAL_DATASOURCE_PATH); err != nil {
+		zap.L().Error("Failed to migrate", zap.Error(err))
+	} else {
+		zap.L().Info("Migration successful")
+	}
+
 	server, err := app.NewServer(serverOptions)
 	if err != nil {
 		zap.L().Fatal("Failed to create server", zap.Error(err))

View File

@@ -52,14 +52,14 @@ var BasicPlan = basemodel.FeatureSet{
 		Name:       basemodel.QueryBuilderPanels,
 		Active:     true,
 		Usage:      0,
-		UsageLimit: 20,
+		UsageLimit: -1,
 		Route:      "",
 	},
 	basemodel.Feature{
 		Name:       basemodel.QueryBuilderAlerts,
 		Active:     true,
 		Usage:      0,
-		UsageLimit: 10,
+		UsageLimit: -1,
 		Route:      "",
 	},
 	basemodel.Feature{
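The hunk above switches the community-plan dashboard and alert limits to a `-1` sentinel meaning "unlimited". A minimal TypeScript sketch of how such a sentinel is typically checked (hypothetical helper for illustration; the real enforcement lives in the Go query service):

```typescript
interface Feature {
	name: string;
	usage: number;
	usageLimit: number; // -1 means unlimited
}

// Returns true when one more dashboard/alert may be created under this feature.
function canCreate(feature: Feature): boolean {
	return feature.usageLimit === -1 || feature.usage < feature.usageLimit;
}
```

With `UsageLimit: -1` the check always passes, matching the "allow unlimited dashboards and alerts in community version" intent of #4989.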

View File

@@ -4,6 +4,7 @@ const config: Config.InitialOptions = {
 	clearMocks: true,
 	coverageDirectory: 'coverage',
 	coverageReporters: ['text', 'cobertura', 'html', 'json-summary'],
+	collectCoverageFrom: ['src/**/*.{ts,tsx}'],
 	moduleFileExtensions: ['ts', 'tsx', 'js', 'json'],
 	modulePathIgnorePatterns: ['dist'],
 	moduleNameMapper: {
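The `collectCoverageFrom` line added above scopes Jest coverage accounting to application sources. A sketch of the relevant settings in isolation (excerpt only; the real file types this as `Config.InitialOptions`):

```typescript
// Jest coverage settings mirroring the diff above (excerpt, not the full config).
const coverageConfig = {
	collectCoverageFrom: ['src/**/*.{ts,tsx}'], // count only app sources, not build output
	coverageDirectory: 'coverage',
	coverageReporters: ['text', 'cobertura', 'html', 'json-summary'],
};
```

The `json-summary` reporter is what makes machine-readable coverage totals available to CI tooling.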

View File

@@ -16,6 +16,7 @@
   "new_dashboard_title": "Sample Title",
   "layout_saved_successfully": "Layout saved successfully",
   "add_panel": "Add Panel",
+  "add_row": "Add Row",
   "save_layout": "Save Layout",
   "variable_updated_successfully": "Variable updated successfully",
   "error_while_updating_variable": "Error while updating variable",

View File

@@ -16,6 +16,7 @@
   "new_dashboard_title": "Sample Title",
   "layout_saved_successfully": "Layout saved successfully",
   "add_panel": "Add Panel",
+  "add_row": "Add Row",
   "save_layout": "Save Layout",
   "full_view": "Full Screen View",
   "variable_updated_successfully": "Variable updated successfully",

View File

@@ -157,8 +157,8 @@ function ListLogView({
 	const timestampValue = useMemo(
 		() =>
 			typeof flattenLogData.timestamp === 'string'
-				? dayjs(flattenLogData.timestamp).format()
-				: dayjs(flattenLogData.timestamp / 1e6).format(),
+				? dayjs(flattenLogData.timestamp).format('YYYY-MM-DD HH:mm:ss.SSS')
+				: dayjs(flattenLogData.timestamp / 1e6).format('YYYY-MM-DD HH:mm:ss.SSS'),
 		[flattenLogData.timestamp],
 	);

View File

@@ -90,12 +90,12 @@ function RawLogView({
 	const text = useMemo(
 		() =>
 			typeof data.timestamp === 'string'
-				? `${dayjs(data.timestamp).format()} | ${attributesText} ${severityText} ${
-						data.body
-				  }`
-				: `${dayjs(
-						data.timestamp / 1e6,
-				  ).format()} | ${attributesText} ${severityText} ${data.body}`,
+				? `${dayjs(data.timestamp).format(
+						'YYYY-MM-DD HH:mm:ss.SSS',
+				  )} | ${attributesText} ${severityText} ${data.body}`
+				: `${dayjs(data.timestamp / 1e6).format(
+						'YYYY-MM-DD HH:mm:ss.SSS',
+				  )} | ${attributesText} ${severityText} ${data.body}`,
 		[data.timestamp, data.body, severityText, attributesText],
 	);

View File

@@ -76,8 +76,8 @@ export const useTableView = (props: UseTableViewProps): UseTableViewResult => {
 		render: (field, item): ColumnTypeRender<Record<string, unknown>> => {
 			const date =
 				typeof field === 'string'
-					? dayjs(field).format()
-					: dayjs(field / 1e6).format();
+					? dayjs(field).format('YYYY-MM-DD HH:mm:ss.SSS')
+					: dayjs(field / 1e6).format('YYYY-MM-DD HH:mm:ss.SSS');
 			return {
 				children: (
 					<div className="table-timestamp">
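All three timestamp hunks above apply the same rule: string timestamps are parsed as dates, numeric ones are treated as nanosecond epochs and divided by `1e6` to get milliseconds, and both are rendered with millisecond precision. A dependency-free sketch of that logic (using plain `Date` with UTC getters instead of dayjs, which formats in local time):

```typescript
// Format a log timestamp with millisecond precision.
// Numeric input is assumed to be a nanosecond epoch (hence / 1e6), as in the diff.
function formatLogTimestamp(ts: string | number): string {
	const ms = typeof ts === 'string' ? Date.parse(ts) : ts / 1e6;
	const d = new Date(ms);
	const pad = (n: number, width = 2): string => String(n).padStart(width, '0');
	return (
		`${d.getUTCFullYear()}-${pad(d.getUTCMonth() + 1)}-${pad(d.getUTCDate())} ` +
		`${pad(d.getUTCHours())}:${pad(d.getUTCMinutes())}:${pad(d.getUTCSeconds())}` +
		`.${pad(d.getUTCMilliseconds(), 3)}`
	);
}
```

One caveat worth noting: nanosecond epochs exceed `Number.MAX_SAFE_INTEGER`, so a JS `number` cannot carry sub-millisecond precision here; the division keeps the millisecond part, which is all the `SSS` token displays anyway.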

View File

@@ -1,6 +1,6 @@
 import './FacingIssueBtn.style.scss';
 
-import { Button } from 'antd';
+import { Button, Tooltip } from 'antd';
 import logEvent from 'api/common/logEvent';
 import cx from 'classnames';
 import { FeatureKeys } from 'constants/features';
@@ -15,6 +15,7 @@ export interface FacingIssueBtnProps {
 	message?: string;
 	buttonText?: string;
 	className?: string;
+	onHoverText?: string;
 }
 
 function FacingIssueBtn({
@@ -23,6 +24,7 @@ function FacingIssueBtn({
 	message = '',
 	buttonText = '',
 	className = '',
+	onHoverText = '',
 }: FacingIssueBtnProps): JSX.Element | null {
 	const handleFacingIssuesClick = (): void => {
 		logEvent(eventName, attributes);
@@ -37,13 +39,15 @@ function FacingIssueBtn({
 	return isCloudUserVal && isChatSupportEnabled ? ( // Note: we would need to move this condition to license based in future
 		<div className="facing-issue-button">
-			<Button
-				className={cx('periscope-btn', 'facing-issue-button', className)}
-				onClick={handleFacingIssuesClick}
-				icon={<HelpCircle size={14} />}
-			>
-				{buttonText || 'Facing issues?'}
-			</Button>
+			<Tooltip title={onHoverText} autoAdjustOverflow>
+				<Button
+					className={cx('periscope-btn', 'facing-issue-button', className)}
+					onClick={handleFacingIssuesClick}
+					icon={<HelpCircle size={14} />}
+				>
+					{buttonText || 'Facing issues?'}
+				</Button>
+			</Tooltip>
 		</div>
 	) : null;
 }
@@ -52,6 +56,7 @@ FacingIssueBtn.defaultProps = {
 	message: '',
 	buttonText: '',
 	className: '',
+	onHoverText: '',
 };
 
 export default FacingIssueBtn;

View File

@@ -0,0 +1,57 @@
+import { PANEL_TYPES } from 'constants/queryBuilder';
+import { AlertDef } from 'types/api/alerts/def';
+import { Dashboard, DashboardData } from 'types/api/dashboard/getAll';
+
+export const chartHelpMessage = (
+	selectedDashboard: Dashboard | undefined,
+	graphType: PANEL_TYPES,
+): string => `
+Hi Team,
+
+I need help in creating this chart. Here are my dashboard details
+
+Name: ${selectedDashboard?.data.title || ''}
+Panel type: ${graphType}
+Dashboard Id: ${selectedDashboard?.uuid || ''}
+
+Thanks`;
+
+export const dashboardHelpMessage = (
+	data: DashboardData | undefined,
+	selectedDashboard: Dashboard | undefined,
+): string => `
+Hi Team,
+
+I need help with this dashboard. Here are my dashboard details
+
+Name: ${data?.title || ''}
+Dashboard Id: ${selectedDashboard?.uuid || ''}
+
+Thanks`;
+
+export const dashboardListMessage = `Hi Team,
+
+I need help with dashboards.
+
+Thanks`;
+
+export const listAlertMessage = `Hi Team,
+
+I need help with managing alerts.
+
+Thanks`;
+
+export const alertHelpMessage = (
+	alertDef: AlertDef,
+	ruleId: number,
+): string => `
+Hi Team,
+
+I need help in configuring this alert. Here are my alert rule details
+
+Name: ${alertDef?.alert || ''}
+Alert Type: ${alertDef?.alertType || ''}
+State: ${(alertDef as any)?.state || ''}
+Alert Id: ${ruleId}
+
+Thanks`;

View File

@@ -30,4 +30,5 @@ export enum QueryParams {
 	integration = 'integration',
 	pagination = 'pagination',
 	relativeTime = 'relativeTime',
+	alertType = 'alertType',
 }

View File

@@ -289,6 +289,11 @@ export enum PANEL_TYPES {
 	EMPTY_WIDGET = 'EMPTY_WIDGET',
 }
 
+// eslint-disable-next-line @typescript-eslint/naming-convention
+export enum PANEL_GROUP_TYPES {
+	ROW = 'row',
+}
+
 // eslint-disable-next-line @typescript-eslint/naming-convention
 export enum ATTRIBUTE_TYPES {
 	SUM = 'Sum',

View File

@@ -59,8 +59,8 @@ function CreateAlertChannels({
 *Summary:* {{ .Annotations.summary }}
 *Description:* {{ .Annotations.description }}
 
-*RelatedLogs:* {{ .Annotations.related_logs }}
-*RelatedTraces:* {{ .Annotations.related_traces }}
+*RelatedLogs:* {{ if gt (len .Annotations.related_logs) 0 -}} View in <{{ .Annotations.related_logs }}|logs explorer> {{- end}}
+*RelatedTraces:* {{ if gt (len .Annotations.related_traces) 0 -}} View in <{{ .Annotations.related_traces }}|traces explorer> {{- end}}
 
 *Details:*
 {{ range .Labels.SortedPairs }} • *{{ .Name }}:* {{ .Value }}
View File

@@ -1,8 +1,9 @@
 import { Form, Row } from 'antd';
 import { ENTITY_VERSION_V4 } from 'constants/app';
+import { QueryParams } from 'constants/query';
 import FormAlertRules from 'container/FormAlertRules';
 import { useGetCompositeQueryParam } from 'hooks/queryBuilder/useGetCompositeQueryParam';
-import { isEqual } from 'lodash-es';
+import history from 'lib/history';
 import { useEffect, useState } from 'react';
 import { useLocation } from 'react-router-dom';
 import { AlertTypes } from 'types/api/alerts/alertTypes';
@@ -19,13 +20,25 @@ import SelectAlertType from './SelectAlertType';
 
 function CreateRules(): JSX.Element {
 	const [initValues, setInitValues] = useState<AlertDef | null>(null);
-	const [alertType, setAlertType] = useState<AlertTypes>();
 	const location = useLocation();
 	const queryParams = new URLSearchParams(location.search);
 	const version = queryParams.get('version');
+	const alertTypeFromParams = queryParams.get(QueryParams.alertType);
 
 	const compositeQuery = useGetCompositeQueryParam();
+
+	function getAlertTypeFromDataSource(): AlertTypes | null {
+		if (!compositeQuery) {
+			return null;
+		}
+		const dataSource = compositeQuery?.builder?.queryData[0]?.dataSource;
+		return ALERT_TYPE_VS_SOURCE_MAPPING[dataSource];
+	}
+
+	const [alertType, setAlertType] = useState<AlertTypes>(
+		(alertTypeFromParams as AlertTypes) || getAlertTypeFromDataSource(),
+	);
 
 	const [formInstance] = Form.useForm();
@@ -47,21 +60,17 @@ function CreateRules(): JSX.Element {
 				version: version || ENTITY_VERSION_V4,
 			});
 		}
+
+		queryParams.set(QueryParams.alertType, typ);
+		const generatedUrl = `${location.pathname}?${queryParams.toString()}`;
+		history.replace(generatedUrl);
 	};
 
 	useEffect(() => {
-		if (!compositeQuery) {
-			return;
-		}
-
-		const dataSource = compositeQuery?.builder?.queryData[0]?.dataSource;
-		const alertTypeFromQuery = ALERT_TYPE_VS_SOURCE_MAPPING[dataSource];
-
-		if (alertTypeFromQuery && !isEqual(alertType, alertTypeFromQuery)) {
-			onSelectType(alertTypeFromQuery);
+		if (alertType) {
+			onSelectType(alertType);
 		}
 		// eslint-disable-next-line react-hooks/exhaustive-deps
-	}, [compositeQuery]);
+	}, [alertType]);
 
 	if (!initValues) {
 		return (
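The refactor above seeds the alert type from the `alertType` query param first, falling back to the composite query's data source. A standalone sketch of that precedence (the enum values and mapping below are hypothetical stand-ins for `AlertTypes` and `ALERT_TYPE_VS_SOURCE_MAPPING`):

```typescript
type AlertType = 'METRIC_BASED_ALERT' | 'LOGS_BASED_ALERT' | 'TRACES_BASED_ALERT';

// Hypothetical data-source-to-alert-type mapping.
const SOURCE_TO_ALERT_TYPE: Record<string, AlertType> = {
	metrics: 'METRIC_BASED_ALERT',
	logs: 'LOGS_BASED_ALERT',
	traces: 'TRACES_BASED_ALERT',
};

// Prefer the explicit ?alertType= param; fall back to the data source; else null.
function initialAlertType(search: string, dataSource?: string): AlertType | null {
	const fromParams = new URLSearchParams(search).get('alertType') as AlertType | null;
	if (fromParams) return fromParams;
	if (dataSource && SOURCE_TO_ALERT_TYPE[dataSource]) {
		return SOURCE_TO_ALERT_TYPE[dataSource];
	}
	return null;
}
```

Writing the resolved type back into the URL (as `onSelectType` does with `history.replace`) keeps the choice stable across reloads and deep links.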

View File

@@ -12,6 +12,7 @@ import {
 import saveAlertApi from 'api/alerts/save';
 import testAlertApi from 'api/alerts/testAlert';
 import FacingIssueBtn from 'components/facingIssueBtn/FacingIssueBtn';
+import { alertHelpMessage } from 'components/facingIssueBtn/util';
 import { FeatureKeys } from 'constants/features';
 import { QueryParams } from 'constants/query';
 import { PANEL_TYPES } from 'constants/queryBuilder';
@@ -523,6 +524,7 @@ function FormAlertRules({
 						runQuery={handleRunQuery}
 						alertDef={alertDef}
 						panelType={panelType || PANEL_TYPES.TIME_SERIES}
+						key={currentQuery.queryType}
 					/>
 
 					<RuleOptions
@@ -584,17 +586,9 @@ function FormAlertRules({
 							}}
 							className="facing-issue-btn"
 							eventName="Alert: Facing Issues in alert"
-							buttonText="Facing Issues in alert"
-							message={`Hi Team,
-
-I am facing issues configuring alerts in SigNoz. Here are my alert rule details
-
-Name: ${alertDef?.alert || ''}
-Alert Type: ${alertDef?.alertType || ''}
-State: ${(alertDef as any)?.state || ''}
-Alert Id: ${ruleId}
-
-Thanks`}
+							buttonText="Need help with this alert?"
+							message={alertHelpMessage(alertDef, ruleId)}
+							onHoverText="Click here to get help with this alert"
 						/>
 					</Col>
 				</PanelContainer>

View File

@@ -59,7 +59,7 @@ function WidgetGraphComponent({
 	const lineChartRef = useRef<ToggleGraphProps>();
 
 	const [graphVisibility, setGraphVisibility] = useState<boolean[]>(
-		Array(queryResponse.data?.payload?.data.result.length || 0).fill(true),
+		Array(queryResponse.data?.payload?.data?.result?.length || 0).fill(true),
 	);
 	const graphRef = useRef<HTMLDivElement>(null);
@@ -135,7 +135,7 @@ function WidgetGraphComponent({
 			i: uuid,
 			w: 6,
 			x: 0,
-			h: 3,
+			h: 6,
 			y: 0,
 		},
 	];
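The first hunk above replaces `data.result.length` with fully optional-chained access, so an empty or null query-range payload can no longer throw while sizing the visibility array. The guard in isolation (type trimmed to the fields used):

```typescript
type QueryRangeResponse = {
	payload?: { data?: { result?: unknown[] } };
};

// Safely count series: any missing level yields 0 instead of a TypeError.
function seriesCount(response: QueryRangeResponse): number {
	return response.payload?.data?.result?.length ?? 0;
}
```

`Array(seriesCount(resp)).fill(true)` then degrades to an empty array on a null response instead of crashing the widget.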

View File

@@ -1,11 +1,14 @@
 import './GridCardLayout.styles.scss';
 
 import { PlusOutlined } from '@ant-design/icons';
-import { Flex, Tooltip } from 'antd';
+import { Flex, Form, Input, Modal, Tooltip, Typography } from 'antd';
+import { useForm } from 'antd/es/form/Form';
 import cx from 'classnames';
+import FacingIssueBtn from 'components/facingIssueBtn/FacingIssueBtn';
+import { dashboardHelpMessage } from 'components/facingIssueBtn/util';
 import { SOMETHING_WENT_WRONG } from 'constants/api';
 import { QueryParams } from 'constants/query';
-import { PANEL_TYPES } from 'constants/queryBuilder';
+import { PANEL_GROUP_TYPES, PANEL_TYPES } from 'constants/queryBuilder';
 import { themeColors } from 'constants/theme';
 import { useUpdateDashboard } from 'hooks/dashboard/useUpdateDashboard';
 import useComponentPermission from 'hooks/useComponentPermission';
@@ -13,12 +16,21 @@ import { useIsDarkMode } from 'hooks/useDarkMode';
 import { useNotifications } from 'hooks/useNotifications';
 import useUrlQuery from 'hooks/useUrlQuery';
 import history from 'lib/history';
+import { defaultTo } from 'lodash-es';
 import isEqual from 'lodash-es/isEqual';
-import { FullscreenIcon } from 'lucide-react';
+import {
+	FullscreenIcon,
+	GripVertical,
+	MoveDown,
+	MoveUp,
+	Settings,
+	Trash2,
+} from 'lucide-react';
 import { useDashboard } from 'providers/Dashboard/Dashboard';
+import { sortLayout } from 'providers/Dashboard/util';
 import { useCallback, useEffect, useState } from 'react';
 import { FullScreen, useFullScreenHandle } from 'react-full-screen';
-import { Layout } from 'react-grid-layout';
+import { ItemCallback, Layout } from 'react-grid-layout';
 import { useTranslation } from 'react-i18next';
 import { useDispatch, useSelector } from 'react-redux';
 import { useLocation } from 'react-router-dom';
@@ -28,6 +40,7 @@ import { Dashboard, Widgets } from 'types/api/dashboard/getAll';
 import AppReducer from 'types/reducer/app';
 import { ROLES, USER_ROLES } from 'types/roles';
 import { ComponentTypes } from 'utils/permission';
+import { v4 as uuid } from 'uuid';
 
 import { EditMenuAction, ViewMenuAction } from './config';
 import GridCard from './GridCard';
@@ -46,6 +59,8 @@ function GraphLayout({ onAddPanelHandler }: GraphLayoutProps): JSX.Element {
 		selectedDashboard,
 		layouts,
 		setLayouts,
+		panelMap,
+		setPanelMap,
 		setSelectedDashboard,
 		isDashboardLocked,
 	} = useDashboard();
@@ -66,6 +81,26 @@ function GraphLayout({ onAddPanelHandler }: GraphLayoutProps): JSX.Element {
 	const [dashboardLayout, setDashboardLayout] = useState<Layout[]>([]);
 
+	const [isSettingsModalOpen, setIsSettingsModalOpen] = useState<boolean>(false);
+	const [isDeleteModalOpen, setIsDeleteModalOpen] = useState<boolean>(false);
+	const [currentSelectRowId, setCurrentSelectRowId] = useState<string | null>(
+		null,
+	);
+	const [currentPanelMap, setCurrentPanelMap] = useState<
+		Record<string, { widgets: Layout[]; collapsed: boolean }>
+	>({});
+
+	useEffect(() => {
+		setCurrentPanelMap(panelMap);
+	}, [panelMap]);
+
+	const [form] = useForm<{
+		title: string;
+	}>();
 
 	const updateDashboardMutation = useUpdateDashboard();
 	const { notifications } = useNotifications();
@@ -88,7 +123,7 @@ function GraphLayout({ onAddPanelHandler }: GraphLayoutProps): JSX.Element {
 	);
 
 	useEffect(() => {
-		setDashboardLayout(layouts);
+		setDashboardLayout(sortLayout(layouts));
 	}, [layouts]);
 
 	const onSaveHandler = (): void => {
@@ -98,6 +133,7 @@ function GraphLayout({ onAddPanelHandler }: GraphLayoutProps): JSX.Element {
 			...selectedDashboard,
 			data: {
 				...selectedDashboard.data,
+				panelMap: { ...currentPanelMap },
 				layout: dashboardLayout.filter((e) => e.i !== PANEL_TYPES.EMPTY_WIDGET),
 			},
 			uuid: selectedDashboard.uuid,
@@ -107,8 +143,9 @@ function GraphLayout({ onAddPanelHandler }: GraphLayoutProps): JSX.Element {
 			onSuccess: (updatedDashboard) => {
 				if (updatedDashboard.payload) {
 					if (updatedDashboard.payload.data.layout)
-						setLayouts(updatedDashboard.payload.data.layout);
+						setLayouts(sortLayout(updatedDashboard.payload.data.layout));
 					setSelectedDashboard(updatedDashboard.payload);
+					setPanelMap(updatedDashboard.payload?.data?.panelMap || {});
 				}
 
 				featureResponse.refetch();
@@ -131,7 +168,8 @@ function GraphLayout({ onAddPanelHandler }: GraphLayoutProps): JSX.Element {
 			dashboardLayout,
 		);
 		if (!isEqual(filterLayout, filterDashboardLayout)) {
-			setDashboardLayout(layout);
+			const updatedLayout = sortLayout(layout);
+			setDashboardLayout(updatedLayout);
 		}
 	};
@@ -168,6 +206,283 @@ function GraphLayout({ onAddPanelHandler }: GraphLayoutProps): JSX.Element {
// eslint-disable-next-line react-hooks/exhaustive-deps
}, [dashboardLayout]);
function handleAddRow(): void {
if (!selectedDashboard) return;
const id = uuid();
const newRowWidgetMap: { widgets: Layout[]; collapsed: boolean } = {
widgets: [],
collapsed: false,
};
const currentRowIdx = 0;
for (let j = currentRowIdx; j < dashboardLayout.length; j++) {
if (!currentPanelMap[dashboardLayout[j].i]) {
newRowWidgetMap.widgets.push(dashboardLayout[j]);
} else {
break;
}
}
const updatedDashboard: Dashboard = {
...selectedDashboard,
data: {
...selectedDashboard.data,
layout: [
{
i: id,
w: 12,
minW: 12,
minH: 1,
maxH: 1,
x: 0,
h: 1,
y: 0,
},
...dashboardLayout.filter((e) => e.i !== PANEL_TYPES.EMPTY_WIDGET),
],
panelMap: { ...currentPanelMap, [id]: newRowWidgetMap },
widgets: [
...(selectedDashboard.data.widgets || []),
{
id,
title: 'Sample Row',
description: '',
panelTypes: PANEL_GROUP_TYPES.ROW,
},
],
},
uuid: selectedDashboard.uuid,
};
updateDashboardMutation.mutate(updatedDashboard, {
// eslint-disable-next-line sonarjs/no-identical-functions
onSuccess: (updatedDashboard) => {
if (updatedDashboard.payload) {
if (updatedDashboard.payload.data.layout)
setLayouts(sortLayout(updatedDashboard.payload.data.layout));
setSelectedDashboard(updatedDashboard.payload);
setPanelMap(updatedDashboard.payload?.data?.panelMap || {});
}
featureResponse.refetch();
},
// eslint-disable-next-line sonarjs/no-identical-functions
onError: () => {
notifications.error({
message: SOMETHING_WENT_WRONG,
});
},
});
}
const handleRowSettingsClick = (id: string): void => {
setIsSettingsModalOpen(true);
setCurrentSelectRowId(id);
};
const onSettingsModalSubmit = (): void => {
const newTitle = form.getFieldValue('title');
if (!selectedDashboard) return;
if (!currentSelectRowId) return;
const currentWidget = selectedDashboard?.data?.widgets?.find(
(e) => e.id === currentSelectRowId,
);
if (!currentWidget) return;
currentWidget.title = newTitle;
const updatedWidgets = selectedDashboard?.data?.widgets?.filter(
(e) => e.id !== currentSelectRowId,
);
updatedWidgets?.push(currentWidget);
const updatedSelectedDashboard: Dashboard = {
...selectedDashboard,
data: {
...selectedDashboard.data,
widgets: updatedWidgets,
},
uuid: selectedDashboard.uuid,
};
updateDashboardMutation.mutateAsync(updatedSelectedDashboard, {
onSuccess: (updatedDashboard) => {
if (setLayouts) setLayouts(updatedDashboard.payload?.data?.layout || []);
if (setSelectedDashboard && updatedDashboard.payload) {
setSelectedDashboard(updatedDashboard.payload);
}
if (setPanelMap)
setPanelMap(updatedDashboard.payload?.data?.panelMap || {});
form.setFieldValue('title', '');
setIsSettingsModalOpen(false);
setCurrentSelectRowId(null);
featureResponse.refetch();
},
// eslint-disable-next-line sonarjs/no-identical-functions
onError: () => {
notifications.error({
message: SOMETHING_WENT_WRONG,
});
},
});
};
// eslint-disable-next-line sonarjs/cognitive-complexity
const handleRowCollapse = (id: string): void => {
if (!selectedDashboard) return;
const rowProperties = { ...currentPanelMap[id] };
const updatedPanelMap = { ...currentPanelMap };
let updatedDashboardLayout = [...dashboardLayout];
if (rowProperties.collapsed === true) {
rowProperties.collapsed = false;
const widgetsInsideTheRow = rowProperties.widgets;
let maxY = 0;
widgetsInsideTheRow.forEach((w) => {
maxY = Math.max(maxY, w.y + w.h);
});
const currentRowWidget = dashboardLayout.find((w) => w.i === id);
if (currentRowWidget && widgetsInsideTheRow.length) {
maxY -= currentRowWidget.h + currentRowWidget.y;
}
const idxCurrentRow = dashboardLayout.findIndex((w) => w.i === id);
for (let j = idxCurrentRow + 1; j < dashboardLayout.length; j++) {
updatedDashboardLayout[j].y += maxY;
if (updatedPanelMap[updatedDashboardLayout[j].i]) {
updatedPanelMap[updatedDashboardLayout[j].i].widgets = updatedPanelMap[
updatedDashboardLayout[j].i
// eslint-disable-next-line @typescript-eslint/no-loop-func
].widgets.map((w) => ({
...w,
y: w.y + maxY,
}));
}
}
updatedDashboardLayout = [...updatedDashboardLayout, ...widgetsInsideTheRow];
} else {
rowProperties.collapsed = true;
const currentIdx = dashboardLayout.findIndex((w) => w.i === id);
let widgetsInsideTheRow: Layout[] = [];
let isPanelMapUpdated = false;
for (let j = currentIdx + 1; j < dashboardLayout.length; j++) {
if (currentPanelMap[dashboardLayout[j].i]) {
rowProperties.widgets = widgetsInsideTheRow;
widgetsInsideTheRow = [];
isPanelMapUpdated = true;
break;
} else {
widgetsInsideTheRow.push(dashboardLayout[j]);
}
}
if (!isPanelMapUpdated) {
rowProperties.widgets = widgetsInsideTheRow;
}
let maxY = 0;
widgetsInsideTheRow.forEach((w) => {
maxY = Math.max(maxY, w.y + w.h);
});
const currentRowWidget = dashboardLayout[currentIdx];
if (currentRowWidget && widgetsInsideTheRow.length) {
maxY -= currentRowWidget.h + currentRowWidget.y;
}
for (let j = currentIdx + 1; j < updatedDashboardLayout.length; j++) {
updatedDashboardLayout[j].y += maxY;
if (updatedPanelMap[updatedDashboardLayout[j].i]) {
updatedPanelMap[updatedDashboardLayout[j].i].widgets = updatedPanelMap[
updatedDashboardLayout[j].i
// eslint-disable-next-line @typescript-eslint/no-loop-func
].widgets.map((w) => ({
...w,
y: w.y + maxY,
}));
}
}
updatedDashboardLayout = updatedDashboardLayout.filter(
(widget) => !rowProperties.widgets.some((w: Layout) => w.i === widget.i),
);
}
setCurrentPanelMap((prev) => ({
...prev,
...updatedPanelMap,
[id]: {
...rowProperties,
},
}));
setDashboardLayout(sortLayout(updatedDashboardLayout));
};
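The offset arithmetic in `handleRowCollapse` above boils down to: find how far the row's widgets extend vertically (`max(y + h)`), then subtract the row header's own extent so only the surplus height shifts the panels below. A standalone sketch of that calculation (the layout shape here is pared down to just the fields the math uses):

```typescript
// Minimal sketch of the offset math used when expanding/collapsing a row.
// Items follow react-grid-layout's shape: i (id), y (grid row), h (height).
interface LayoutItem {
	i: string;
	y: number;
	h: number;
}

// Panels below the row are shifted by the height the row's widgets occupy
// beyond the row header itself.
function rowOffset(row: LayoutItem, widgetsInRow: LayoutItem[]): number {
	let maxY = 0;
	widgetsInRow.forEach((w) => {
		maxY = Math.max(maxY, w.y + w.h);
	});
	if (widgetsInRow.length) {
		maxY -= row.h + row.y;
	}
	return maxY;
}

const row: LayoutItem = { i: 'row-1', y: 0, h: 1 };
const widgets: LayoutItem[] = [
	{ i: 'w1', y: 1, h: 6 },
	{ i: 'w2', y: 1, h: 3 },
];
// Widgets extend to y = 7; subtracting the row's own extent (0 + 1) gives 6.
const offset = rowOffset(row, widgets);
```

Collapsing negates the shift by filtering the row's widgets out of the layout, as the second branch above does.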
const handleDragStop: ItemCallback = (_, oldItem, newItem): void => {
if (currentPanelMap[oldItem.i]) {
const differenceY = newItem.y - oldItem.y;
const widgetsInsideRow = currentPanelMap[oldItem.i].widgets.map((w) => ({
...w,
y: w.y + differenceY,
}));
setCurrentPanelMap((prev) => ({
...prev,
[oldItem.i]: {
...prev[oldItem.i],
widgets: widgetsInsideRow,
},
}));
}
};
const handleRowDelete = (): void => {
if (!selectedDashboard) return;
if (!currentSelectRowId) return;
const updatedWidgets = selectedDashboard?.data?.widgets?.filter(
(e) => e.id !== currentSelectRowId,
);
const updatedLayout =
selectedDashboard.data.layout?.filter((e) => e.i !== currentSelectRowId) ||
[];
const updatedPanelMap = { ...currentPanelMap };
delete updatedPanelMap[currentSelectRowId];
const updatedSelectedDashboard: Dashboard = {
...selectedDashboard,
data: {
...selectedDashboard.data,
widgets: updatedWidgets,
layout: updatedLayout,
panelMap: updatedPanelMap,
},
uuid: selectedDashboard.uuid,
};
updateDashboardMutation.mutateAsync(updatedSelectedDashboard, {
onSuccess: (updatedDashboard) => {
if (setLayouts) setLayouts(updatedDashboard.payload?.data?.layout || []);
if (setSelectedDashboard && updatedDashboard.payload) {
setSelectedDashboard(updatedDashboard.payload);
}
if (setPanelMap)
setPanelMap(updatedDashboard.payload?.data?.panelMap || {});
setIsDeleteModalOpen(false);
setCurrentSelectRowId(null);
featureResponse.refetch();
},
// eslint-disable-next-line sonarjs/no-identical-functions
onError: () => {
notifications.error({
message: SOMETHING_WENT_WRONG,
});
},
});
};
return (
<>
<Flex justify="flex-end" gap={8} align="center">
@@ -178,15 +493,9 @@ function GraphLayout({ onAddPanelHandler }: GraphLayoutProps): JSX.Element {
screen: 'Dashboard Details',
}}
eventName="Dashboard: Facing Issues in dashboard"
buttonText="Facing Issues in dashboard"
message={`Hi Team,
I am facing issues configuring dashboard in SigNoz. Here are my dashboard details
Name: ${data?.title || ''}
Dashboard Id: ${selectedDashboard?.uuid || ''}
Thanks`}
buttonText="Need help with this dashboard?"
message={dashboardHelpMessage(data, selectedDashboard)}
onHoverText="Click here to get help for this dashboard"
/>
<ButtonContainer>
<Tooltip title="Open in Full Screen">
@@ -209,13 +518,23 @@ Thanks`}
{t('dashboard:add_panel')}
</Button>
)}
{!isDashboardLocked && addPanelPermission && (
<Button
className="periscope-btn"
onClick={(): void => handleAddRow()}
icon={<PlusOutlined />}
data-testid="add-row"
>
{t('dashboard:add_row')}
</Button>
)}
</ButtonContainer>
</Flex>
<FullScreen handle={handle} className="fullscreen-grid-container">
<ReactGridLayout
cols={12}
rowHeight={100}
rowHeight={45}
autoSize
width={100}
useCSSTransforms
@@ -224,6 +543,7 @@ Thanks`}
isResizable={!isDashboardLocked && addPanelPermission}
allowOverlap={false}
onLayoutChange={handleLayoutChange}
onDragStop={handleDragStop}
draggableHandle=".drag-handle"
layout={dashboardLayout}
style={{ backgroundColor: isDarkMode ? '' : themeColors.snowWhite }}
@@ -232,6 +552,58 @@ Thanks`}
const { i: id } = layout;
const currentWidget = (widgets || [])?.find((e) => e.id === id);
if (currentWidget?.panelTypes === PANEL_GROUP_TYPES.ROW) {
const rowWidgetProperties = currentPanelMap[id] || {};
return (
<CardContainer
className="row-card"
isDarkMode={isDarkMode}
key={id}
data-grid={JSON.stringify(currentWidget)}
>
<div className={cx('row-panel')}>
<div style={{ display: 'flex', gap: '10px', alignItems: 'center' }}>
<Button
disabled={updateDashboardMutation.isLoading}
icon={
rowWidgetProperties.collapsed ? (
<MoveDown size={14} />
) : (
<MoveUp size={14} />
)
}
type="text"
onClick={(): void => handleRowCollapse(id)}
/>
<Typography.Text>{currentWidget.title}</Typography.Text>
<Button
icon={<Settings size={14} />}
type="text"
onClick={(): void => handleRowSettingsClick(id)}
/>
</div>
{rowWidgetProperties.collapsed && (
<Button
type="text"
icon={<GripVertical size={14} />}
className="drag-handle"
/>
)}
{!rowWidgetProperties.collapsed && (
<Button
type="text"
icon={<Trash2 size={14} />}
onClick={(): void => {
setIsDeleteModalOpen(true);
setCurrentSelectRowId(id);
}}
/>
)}
</div>
</CardContainer>
);
}
return (
<CardContainer
className={isDashboardLocked ? '' : 'enable-resize'}
@@ -244,7 +616,7 @@ Thanks`}
$panelType={currentWidget?.panelTypes || PANEL_TYPES.TIME_SERIES}
>
<GridCard
widget={currentWidget || ({ id, query: {} } as Widgets)}
widget={(currentWidget as Widgets) || ({ id, query: {} } as Widgets)}
headerMenuList={widgetActions}
variables={variables}
version={selectedDashboard?.data?.version}
@@ -255,6 +627,46 @@ Thanks`}
);
})}
</ReactGridLayout>
<Modal
open={isSettingsModalOpen}
title="Row Options"
destroyOnClose
footer={null}
onCancel={(): void => {
setIsSettingsModalOpen(false);
setCurrentSelectRowId(null);
}}
>
<Form form={form} onFinish={onSettingsModalSubmit} requiredMark>
<Form.Item required name={['title']}>
<Input
placeholder="Enter row name here..."
defaultValue={defaultTo(
widgets?.find((widget) => widget.id === currentSelectRowId)
?.title as string,
'Sample Title',
)}
/>
</Form.Item>
<Form.Item>
<Button type="primary" htmlType="submit">
Apply Changes
</Button>
</Form.Item>
</Form>
</Modal>
<Modal
open={isDeleteModalOpen}
title="Delete Row"
destroyOnClose
onCancel={(): void => {
setIsDeleteModalOpen(false);
setCurrentSelectRowId(null);
}}
onOk={(): void => handleRowDelete()}
>
<Typography.Text>Are you sure you want to delete this row?</Typography.Text>
</Modal>
</FullScreen>
</>
);


@@ -16,6 +16,6 @@ export const EMPTY_WIDGET_LAYOUT = {
i: PANEL_TYPES.EMPTY_WIDGET,
w: 6,
x: 0,
h: 3,
h: 6,
y: 0,
};
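Note that `h: 3 → 6` pairs with the `rowHeight` drop from 100 to 45 in `GraphLayout` earlier in this diff. Under react-grid-layout's pixel-height formula (`h * rowHeight + (h - 1) * marginY`, assuming the default 10px vertical margin) the rendered panel height works out the same before and after:

```typescript
// react-grid-layout renders an item's pixel height as
//   h * rowHeight + (h - 1) * marginY
// marginY defaults to 10 when no margin prop is passed (assumed here).
function pixelHeight(h: number, rowHeight: number, marginY = 10): number {
	return h * rowHeight + Math.max(0, h - 1) * marginY;
}

const before = pixelHeight(3, 100); // 3*100 + 2*10 = 320px
const after = pixelHeight(6, 45); // 6*45 + 5*10 = 320px
```

Doubling `h` while roughly halving `rowHeight` keeps panel sizes stable while giving rows a finer vertical grid to snap to.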


@@ -29,6 +29,17 @@ interface Props {
export const CardContainer = styled.div<Props>`
overflow: auto;
&.row-card {
.row-panel {
height: 100%;
display: flex;
justify-content: space-between;
background: var(--bg-ink-400);
align-items: center;
overflow: hidden;
}
}
&.enable-resize {
:hover {
.react-resizable-handle {


@@ -4,6 +4,7 @@ import { Input, Typography } from 'antd';
import type { ColumnsType } from 'antd/es/table/interface';
import saveAlertApi from 'api/alerts/save';
import DropDown from 'components/DropDown/DropDown';
import { listAlertMessage } from 'components/facingIssueBtn/util';
import {
DynamicColumnsKey,
TableDataSource,
@@ -363,12 +364,9 @@ function ListAlert({ allAlertRules, refetch }: ListAlertProps): JSX.Element {
screen: 'Alert list page',
},
eventName: 'Alert: Facing Issues in alert',
buttonText: 'Facing Issues in alert',
message: `Hi Team,
I am facing issues with alerts.
Thanks`,
buttonText: 'Facing issues with alerts?',
message: listAlertMessage,
onHoverText: 'Click here to get help with alerts',
}}
/>
</>


@@ -3,6 +3,7 @@ import { Card, Col, Dropdown, Input, Row, TableColumnProps } from 'antd';
import { ItemType } from 'antd/es/menu/hooks/useItems';
import createDashboard from 'api/dashboard/create';
import { AxiosError } from 'axios';
import { dashboardListMessage } from 'components/facingIssueBtn/util';
import {
DynamicColumnsKey,
TableDataSource,
@@ -390,12 +391,9 @@ function DashboardsList(): JSX.Element {
screen: 'Dashboard list page',
},
eventName: 'Dashboard: Facing Issues in dashboard',
buttonText: 'Facing Issues in dashboard',
message: `Hi Team,
I am facing issues with dashboards.
Thanks`,
buttonText: 'Facing issues with dashboards?',
message: dashboardListMessage,
onHoverText: 'Click here to get help with dashboards',
}}
/>
</TableContainer>


@@ -85,8 +85,8 @@ function LogControls(): JSX.Element | null {
logs.map((log) => {
const timestamp =
typeof log.timestamp === 'string'
? dayjs(log.timestamp).format()
: dayjs(log.timestamp / 1e6).format();
? dayjs(log.timestamp).format('YYYY-MM-DD HH:mm:ss.SSS')
: dayjs(log.timestamp / 1e6).format('YYYY-MM-DD HH:mm:ss.SSS');
return FlatLogData({
...log,

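The numeric log timestamps here appear to be nanoseconds since the epoch, so the hunk above divides by `1e6` to get the milliseconds that dayjs expects, and the explicit `'YYYY-MM-DD HH:mm:ss.SSS'` pattern keeps millisecond precision that the default `format()` output drops. A minimal sketch of the unit conversion (plain `Date` stands in for dayjs here):

```typescript
// Nanoseconds since epoch -> milliseconds, the unit Date/dayjs expect.
function nanosToMillis(ns: number): number {
	return ns / 1e6;
}

const ns = 1.7e18; // 1_700_000_000_000_000_000 ns
const ms = nanosToMillis(ns); // 1_700_000_000_000 ms
const iso = new Date(ms).toISOString(); // '2023-11-14T22:13:20.000Z'
```

Passing the raw nanosecond value straight to a millisecond API would produce dates far in the future, which is why the branch on `typeof log.timestamp` matters.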

@@ -531,8 +531,8 @@ function LogsExplorerViews({
logs.map((log) => {
const timestamp =
typeof log.timestamp === 'string'
? dayjs(log.timestamp).format()
: dayjs(log.timestamp / 1e6).format();
? dayjs(log.timestamp).format('YYYY-MM-DD HH:mm:ss.SSS')
: dayjs(log.timestamp / 1e6).format('YYYY-MM-DD HH:mm:ss.SSS');
return FlatLogData({
timestamp,


@@ -67,12 +67,13 @@ function LeftContainer({
setRequestData((prev) => ({
...prev,
selectedTime: selectedTime.enum || prev.selectedTime,
globalSelectedInterval,
graphType: getGraphType(selectedGraph || selectedWidget.panelTypes),
query: stagedQuery,
}));
}
// eslint-disable-next-line react-hooks/exhaustive-deps
}, [stagedQuery, selectedTime]);
}, [stagedQuery, selectedTime, globalSelectedInterval]);
const queryResponse = useGetQueryRange(
requestData,


@@ -0,0 +1,4 @@
.facing-issue-btn-container {
display: grid;
grid-template-columns: 1fr max-content;
}


@@ -1,7 +1,10 @@
/* eslint-disable sonarjs/cognitive-complexity */
import './NewWidget.styles.scss';
import { LockFilled, WarningOutlined } from '@ant-design/icons';
import { Button, Flex, Modal, Space, Tooltip, Typography } from 'antd';
import { Button, Modal, Space, Tooltip, Typography } from 'antd';
import FacingIssueBtn from 'components/facingIssueBtn/FacingIssueBtn';
import { chartHelpMessage } from 'components/facingIssueBtn/util';
import { SOMETHING_WENT_WRONG } from 'constants/api';
import { FeatureKeys } from 'constants/features';
import { QueryParams } from 'constants/query';
@@ -104,7 +107,7 @@ function NewWidget({ selectedGraph }: NewWidgetProps): JSX.Element {
return defaultTo(
selectedWidget,
getDefaultWidgetData(widgetId || '', selectedGraph),
);
) as Widgets;
}, [query, selectedGraph, widgets]);
const [selectedWidget, setSelectedWidget] = useState(getWidget());
@@ -257,7 +260,7 @@ function NewWidget({ selectedGraph }: NewWidgetProps): JSX.Element {
i: widgetId || '',
w: 6,
x: 0,
h: 3,
h: 6,
y: 0,
},
...updatedLayout,
@@ -268,28 +271,50 @@ function NewWidget({ selectedGraph }: NewWidgetProps): JSX.Element {
uuid: selectedDashboard.uuid,
data: {
...selectedDashboard.data,
widgets: [
...preWidgets,
{
...(selectedWidget || ({} as Widgets)),
description: selectedWidget?.description || '',
timePreferance: selectedTime.enum,
isStacked: selectedWidget?.isStacked || false,
opacity: selectedWidget?.opacity || '1',
nullZeroValues: selectedWidget?.nullZeroValues || 'zero',
title: selectedWidget?.title,
yAxisUnit: selectedWidget?.yAxisUnit,
panelTypes: graphType,
query: currentQuery,
thresholds: selectedWidget?.thresholds,
softMin: selectedWidget?.softMin || 0,
softMax: selectedWidget?.softMax || 0,
fillSpans: selectedWidget?.fillSpans,
selectedLogFields: selectedWidget?.selectedLogFields || [],
selectedTracesFields: selectedWidget?.selectedTracesFields || [],
},
...afterWidgets,
],
widgets: isNewDashboard
? [
...afterWidgets,
{
...(selectedWidget || ({} as Widgets)),
description: selectedWidget?.description || '',
timePreferance: selectedTime.enum,
isStacked: selectedWidget?.isStacked || false,
opacity: selectedWidget?.opacity || '1',
nullZeroValues: selectedWidget?.nullZeroValues || 'zero',
title: selectedWidget?.title,
yAxisUnit: selectedWidget?.yAxisUnit,
panelTypes: graphType,
query: currentQuery,
thresholds: selectedWidget?.thresholds,
softMin: selectedWidget?.softMin || 0,
softMax: selectedWidget?.softMax || 0,
fillSpans: selectedWidget?.fillSpans,
selectedLogFields: selectedWidget?.selectedLogFields || [],
selectedTracesFields: selectedWidget?.selectedTracesFields || [],
},
]
: [
...preWidgets,
{
...(selectedWidget || ({} as Widgets)),
description: selectedWidget?.description || '',
timePreferance: selectedTime.enum,
isStacked: selectedWidget?.isStacked || false,
opacity: selectedWidget?.opacity || '1',
nullZeroValues: selectedWidget?.nullZeroValues || 'zero',
title: selectedWidget?.title,
yAxisUnit: selectedWidget?.yAxisUnit,
panelTypes: graphType,
query: currentQuery,
thresholds: selectedWidget?.thresholds,
softMin: selectedWidget?.softMin || 0,
softMax: selectedWidget?.softMax || 0,
fillSpans: selectedWidget?.fillSpans,
selectedLogFields: selectedWidget?.selectedLogFields || [],
selectedTracesFields: selectedWidget?.selectedTracesFields || [],
},
...afterWidgets,
],
layout: [...updatedLayout],
},
};
@@ -402,7 +427,7 @@ function NewWidget({ selectedGraph }: NewWidgetProps): JSX.Element {
return (
<Container>
<Flex justify="space-between" align="center">
<div className="facing-issue-btn-container">
<FacingIssueBtn
attributes={{
uuid: selectedDashboard?.uuid,
@@ -410,18 +435,12 @@ function NewWidget({ selectedGraph }: NewWidgetProps): JSX.Element {
panelType: graphType,
widgetId: query.get('widgetId'),
queryType: currentQuery.queryType,
screen: 'Dashboard list page',
}}
eventName="Dashboard: Facing Issues in dashboard"
buttonText="Facing Issues in dashboard"
message={`Hi Team,
I am facing issues configuring dashboard in SigNoz. Here are my dashboard details
Name: ${selectedDashboard?.data.title || ''}
Panel type: ${graphType}
Dashboard Id: ${selectedDashboard?.uuid || ''}
Thanks`}
buttonText="Need help with this chart?"
message={chartHelpMessage(selectedDashboard, graphType)}
onHoverText="Click here to get help creating this chart"
/>
<ButtonContainer>
{isSaveDisabled && (
@@ -450,7 +469,7 @@ Thanks`}
)}
<Button onClick={onClickDiscardHandler}>Discard Changes</Button>
</ButtonContainer>
</Flex>
</div>
<PanelContainer>
<LeftContainerWrapper flex={5}>


@@ -0,0 +1,87 @@
import { render } from '@testing-library/react';
import { I18nextProvider } from 'react-i18next';
import { QueryClient, QueryClientProvider } from 'react-query';
import { Provider } from 'react-redux';
import { MemoryRouter } from 'react-router-dom';
import i18n from 'ReactI18';
import store from 'store';
import ChangeHistory from '../index';
import { pipelineData, pipelineDataHistory } from './testUtils';
const queryClient = new QueryClient({
defaultOptions: {
queries: {
refetchOnWindowFocus: false,
},
},
});
describe('ChangeHistory test', () => {
it('should render changeHistory correctly', () => {
const { getAllByText, getByText } = render(
<MemoryRouter>
<QueryClientProvider client={queryClient}>
<Provider store={store}>
<I18nextProvider i18n={i18n}>
<ChangeHistory pipelineData={pipelineData} />
</I18nextProvider>
</Provider>
</QueryClientProvider>
</MemoryRouter>,
);
// change History table headers
[
'Version',
'Deployment Stage',
'Last Deploy Message',
'Last Deployed Time',
'Edited by',
].forEach((text) => expect(getByText(text)).toBeInTheDocument());
// table content
expect(getAllByText('test-user').length).toBe(2);
expect(getAllByText('Deployment was successful').length).toBe(2);
});
it('test deployment stage and icon based on history data', () => {
const { getByText, container } = render(
<MemoryRouter>
<QueryClientProvider client={queryClient}>
<Provider store={store}>
<I18nextProvider i18n={i18n}>
<ChangeHistory
pipelineData={{
...pipelineData,
history: pipelineDataHistory,
}}
/>
</I18nextProvider>
</Provider>
</QueryClientProvider>
</MemoryRouter>,
);
// assertion for different deployment stages
expect(container.querySelector('[data-icon="loading"]')).toBeInTheDocument();
expect(getByText('In Progress')).toBeInTheDocument();
expect(
container.querySelector('[data-icon="exclamation-circle"]'),
).toBeInTheDocument();
expect(getByText('Dirty')).toBeInTheDocument();
expect(
container.querySelector('[data-icon="close-circle"]'),
).toBeInTheDocument();
expect(getByText('Failed')).toBeInTheDocument();
expect(
container.querySelector('[data-icon="minus-circle"]'),
).toBeInTheDocument();
expect(getByText('Unknown')).toBeInTheDocument();
expect(container.querySelectorAll('.ant-table-row').length).toBe(5);
});
});


@@ -0,0 +1,240 @@
/* eslint-disable sonarjs/no-duplicate-string */
import { Pipeline } from 'types/api/pipeline/def';
import { DataTypes } from 'types/api/queryBuilder/queryAutocompleteResponse';
export const pipelineData: Pipeline = {
id: 'test-id-1',
version: 24,
elementType: 'log_pipelines',
active: false,
is_valid: false,
disabled: false,
deployStatus: 'DEPLOYED',
deployResult: 'Deployment was successful',
lastHash: 'log_pipelines:24',
lastConf: 'oiwernveroi',
createdBy: 'test-created-by',
pipelines: [
{
id: 'test-id-2',
orderId: 1,
name: 'hotrod logs parser',
alias: 'hotrodlogsparser',
description: 'Trying to test Logs Pipeline feature',
enabled: true,
filter: {
op: 'AND',
items: [
{
key: {
key: 'container_name',
dataType: DataTypes.String,
type: 'tag',
isColumn: false,
isJSON: false,
},
id: 'sampleid',
value: 'hotrod',
op: '=',
},
],
},
config: [
{
type: 'regex_parser',
id: 'parsetext(regex)',
output: 'parseattribsjson',
on_error: 'send',
orderId: 1,
enabled: true,
name: 'parse text (regex)',
parse_to: 'attributes',
regex:
'.+\\t+(?P<log_level>.+)\\t+(?P<location>.+)\\t+(?P<message>.+)\\t+(?P<attribs_json>.+)',
parse_from: 'body',
},
{
type: 'json_parser',
id: 'parseattribsjson',
output: 'removetempattribs_json',
orderId: 2,
enabled: true,
name: 'parse attribs json',
parse_to: 'attributes',
parse_from: 'attributes.attribs_json',
},
{
type: 'remove',
id: 'removetempattribs_json',
output: 'c2062723-895e-4614-ba38-29c5d5ee5927',
orderId: 3,
enabled: true,
name: 'remove temp attribs_json',
field: 'attributes.attribs_json',
},
{
type: 'add',
id: 'c2062723-895e-4614-ba38-29c5d5ee5927',
orderId: 4,
enabled: true,
name: 'test add ',
field: 'resource["container.name"]',
value: 'hotrod',
},
],
createdBy: 'test@email',
createdAt: '2024-01-02T13:56:02.858300964Z',
},
{
id: 'tes-id-1',
orderId: 2,
name: 'Logs Parser - test - Customer Service',
alias: 'LogsParser-test-CustomerService',
description: 'Trying to test Logs Pipeline feature',
enabled: true,
filter: {
op: 'AND',
items: [
{
key: {
key: 'service',
dataType: DataTypes.String,
type: 'tag',
isColumn: false,
isJSON: false,
},
id: 'sample-test-1',
value: 'customer',
op: '=',
},
],
},
config: [
{
type: 'grok_parser',
id: 'Testtest',
on_error: 'send',
orderId: 1,
enabled: true,
name: 'Test test',
parse_to: 'attributes',
pattern:
'^%{DATE:date}Z INFO customer/database.go:73 Loading customer {"service": "customer", "component": "mysql", "trace_id": "test-id", "span_id": "1427a3fcad8b1514", "customer_id": "567"}',
parse_from: 'body',
},
],
createdBy: 'test@email',
createdAt: '2024-01-02T13:56:02.863764227Z',
},
],
history: [
{
id: 'test-id-4',
version: 24,
elementType: 'log_pipelines',
active: false,
isValid: false,
disabled: false,
deployStatus: 'DEPLOYED',
deployResult: 'Deployment was successful',
lastHash: 'log_pipelines:24',
lastConf: 'eovineroiv',
createdBy: 'test-created-by',
createdByName: 'test-user',
createdAt: '2024-01-02T13:56:02Z',
},
{
id: 'test-4',
version: 23,
elementType: 'log_pipelines',
active: false,
isValid: false,
disabled: false,
deployStatus: 'DEPLOYED',
deployResult: 'Deployment was successful',
lastHash: 'log_pipelines:23',
lastConf: 'eivrounreovi',
createdBy: 'test-created-by',
createdByName: 'test-user',
createdAt: '2023-12-29T12:59:20Z',
},
],
};
export const pipelineDataHistory: Pipeline['history'] = [
{
id: 'test-id-4',
version: 24,
elementType: 'log_pipelines',
active: false,
isValid: false,
disabled: false,
deployStatus: 'DEPLOYED',
deployResult: 'Deployment was successful',
lastHash: 'log_pipelines:24',
lastConf: 'eovineroiv',
createdBy: 'test-created-by',
createdByName: 'test-user',
createdAt: '2024-01-02T13:56:02Z',
},
{
id: 'test-4',
version: 23,
elementType: 'log_pipelines',
active: false,
isValid: false,
disabled: false,
deployStatus: 'IN_PROGRESS',
deployResult: 'Deployment is in progress',
lastHash: 'log_pipelines:23',
lastConf: 'eivrounreovi',
createdBy: 'test-created-by',
createdByName: 'test-user',
createdAt: '2023-12-29T12:59:20Z',
},
{
id: 'test-4-1',
version: 25,
elementType: 'log_pipelines',
active: false,
isValid: false,
disabled: false,
deployStatus: 'DIRTY',
deployResult: 'Deployment is dirty',
lastHash: 'log_pipelines:23',
lastConf: 'eivrounreovi',
createdBy: 'test-created-by',
createdByName: 'test-user',
createdAt: '2023-12-29T12:59:20Z',
},
{
id: 'test-4-2',
version: 26,
elementType: 'log_pipelines',
active: false,
isValid: false,
disabled: false,
deployStatus: 'FAILED',
deployResult: 'Deployment failed',
lastHash: 'log_pipelines:23',
lastConf: 'eivrounreovi',
createdBy: 'test-created-by',
createdByName: 'test-user',
createdAt: '2023-12-29T12:59:20Z',
},
{
id: 'test-4-3',
version: 27,
elementType: 'log_pipelines',
active: false,
isValid: false,
disabled: false,
deployStatus: 'UNKNOWN',
deployResult: '',
lastHash: 'log_pipelines:23',
lastConf: 'eivrounreovi',
createdBy: 'test-created-by',
createdByName: 'test-user',
createdAt: '2023-12-29T12:59:20Z',
},
];


@@ -1,4 +1,5 @@
import { render } from '@testing-library/react';
import userEvent from '@testing-library/user-event';
import { I18nextProvider } from 'react-i18next';
import { Provider } from 'react-redux';
import { MemoryRouter } from 'react-router-dom';
@@ -8,8 +9,17 @@ import store from 'store';
import CreatePipelineButton from '../Layouts/Pipeline/CreatePipelineButton';
import { pipelineApiResponseMockData } from '../mocks/pipeline';
const trackEventVar = jest.fn();
jest.mock('hooks/analytics/useAnalytics', () => ({
__esModule: true,
default: jest.fn().mockImplementation(() => ({
trackEvent: trackEventVar,
trackPageView: jest.fn(),
})),
}));
describe('PipelinePage container test', () => {
it('should render CreatePipelineButton section', () => {
it('should render CreatePipelineButton section', async () => {
const { asFragment } = render(
<MemoryRouter>
<Provider store={store}>
@@ -26,4 +36,58 @@ describe('PipelinePage container test', () => {
);
expect(asFragment()).toMatchSnapshot();
});
it('CreatePipelineButton - edit mode & tracking', async () => {
const { getByText } = render(
<MemoryRouter>
<Provider store={store}>
<I18nextProvider i18n={i18n}>
<CreatePipelineButton
setActionType={jest.fn()}
isActionMode="viewing-mode"
setActionMode={jest.fn()}
pipelineData={pipelineApiResponseMockData}
/>
</I18nextProvider>
</Provider>
</MemoryRouter>,
);
// enter_edit_mode click and track event data
const editButton = getByText('enter_edit_mode');
expect(editButton).toBeInTheDocument();
await userEvent.click(editButton);
expect(trackEventVar).toBeCalledWith('Logs: Pipelines: Entered Edit Mode', {
source: 'signoz-ui',
});
});
it('CreatePipelineButton - add new mode & tracking', async () => {
const { getByText } = render(
<MemoryRouter>
<Provider store={store}>
<I18nextProvider i18n={i18n}>
<CreatePipelineButton
setActionType={jest.fn()}
isActionMode="viewing-mode"
setActionMode={jest.fn()}
pipelineData={{ ...pipelineApiResponseMockData, pipelines: [] }}
/>
</I18nextProvider>
</Provider>
</MemoryRouter>,
);
// new_pipeline click and track event data
const editButton = getByText('new_pipeline');
expect(editButton).toBeInTheDocument();
await userEvent.click(editButton);
expect(trackEventVar).toBeCalledWith(
'Logs: Pipelines: Clicked Add New Pipeline',
{
source: 'signoz-ui',
},
);
});
});


@@ -1,4 +1,5 @@
import { render } from '@testing-library/react';
import { fireEvent, render, waitFor } from '@testing-library/react';
import userEvent from '@testing-library/user-event';
import { I18nextProvider } from 'react-i18next';
import { Provider } from 'react-redux';
import { MemoryRouter } from 'react-router-dom';
@@ -20,4 +21,43 @@ describe('PipelinePage container test', () => {
);
expect(asFragment()).toMatchSnapshot();
});
it('should handle search', async () => {
const setPipelineValue = jest.fn();
const { getByPlaceholderText, container } = render(
<MemoryRouter>
<Provider store={store}>
<I18nextProvider i18n={i18n}>
<PipelinesSearchSection setPipelineSearchValue={setPipelineValue} />
</I18nextProvider>
</Provider>
</MemoryRouter>,
);
const searchInput = getByPlaceholderText('search_pipeline_placeholder');
// Type into the search input
userEvent.type(searchInput, 'sample_pipeline');
jest.advanceTimersByTime(299);
expect(setPipelineValue).not.toHaveBeenCalled();
// Wait for the debounce delay to pass
await waitFor(() => {
// Expect the callback to be called after debounce delay
expect(setPipelineValue).toHaveBeenCalledWith('sample_pipeline');
});
// clear button
fireEvent.click(
container.querySelector(
'span[class*="ant-input-clear-icon"]',
) as HTMLElement,
);
// Wait for the debounce delay to pass
await waitFor(() => {
expect(setPipelineValue).toHaveBeenCalledWith('');
});
});
});


@@ -278,7 +278,7 @@ function SideNav({
}, [isCloudUserVal, isEnterprise, isFetching]);
useEffect(() => {
if (!isCloudUserVal) {
if (!(isCloudUserVal || isEECloudUser())) {
let updatedMenuItems = [...menuItems];
updatedMenuItems = updatedMenuItems.filter(
(item) => item.key !== ROUTES.INTEGRATIONS,


@@ -13,11 +13,18 @@ function Events({
return <Typography>No events data in selected span</Typography>;
}
const sortedTraceEvents = events.sort((a, b) => {
// Handle undefined names by treating them as empty strings
const nameA = a.name || '';
const nameB = b.name || '';
return nameA.localeCompare(nameB);
});
return (
<ErrorTag
onToggleHandler={onToggleHandler}
setText={setText}
event={events}
event={sortedTraceEvents}
firstSpanStartTime={firstSpanStartTime}
/>
);
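The `|| ''` fallbacks above matter because calling `localeCompare` on an `undefined` name would throw a TypeError. A self-contained sketch of the same comparison (unlike the component code, this copies the array rather than sorting in place):

```typescript
interface TraceEvent {
	name?: string;
}

// Sort events by name, treating a missing name as an empty string so the
// comparison never dereferences undefined. Unnamed events sort first.
function sortEvents(events: TraceEvent[]): TraceEvent[] {
	return [...events].sort((a, b) =>
		(a.name || '').localeCompare(b.name || ''),
	);
}

const sorted = sortEvents([{ name: 'retry' }, {}, { name: 'error' }]);
// -> [{}, { name: 'error' }, { name: 'retry' }]
```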


@@ -41,8 +41,9 @@ function Tags({
setSearchText(value);
};
const filteredTags = tags.filter((tag) => tag.key.includes(searchText));
const filteredTags = tags
.filter((tag) => tag.key.includes(searchText))
.sort((a, b) => a.key.localeCompare(b.key));
if (tags.length === 0) {
return <Typography>No tags in selected span</Typography>;
}


@@ -16,7 +16,7 @@ export const addEmptyWidgetInDashboardJSONWithQuery = (
i: widgetId,
w: 6,
x: 0,
h: 3,
h: 6,
y: 0,
},
...(dashboard?.data?.layout || []),


@@ -36,7 +36,7 @@ export const getPaginationQueryData: SetupPaginationQueryData = ({
const updatedFilters: TagFilter = {
...filters,
items: filters.items.filter((item) => item.key?.key !== 'id'),
items: filters?.items?.filter((item) => item.key?.key !== 'id'),
};
const tagFilters: TagFilter = {


@@ -100,7 +100,7 @@ export const getUPlotChartOptions = ({
y: {
...getYAxisScale({
thresholds,
series: apiResponse?.data.newResult.data.result,
series: apiResponse?.data?.newResult?.data?.result || [],
yAxisUnit,
softMax,
softMin,


@@ -9,6 +9,7 @@ import history from 'lib/history';
import { useDashboard } from 'providers/Dashboard/Dashboard';
import { useEffect, useState } from 'react';
import { generatePath, useLocation, useParams } from 'react-router-dom';
import { Widgets } from 'types/api/dashboard/getAll';
function DashboardWidget(): JSX.Element | null {
const { search } = useLocation();
@@ -24,7 +25,7 @@ function DashboardWidget(): JSX.Element | null {
const { data } = selectedDashboard || {};
const { widgets } = data || {};
const selectedWidget = widgets?.find((e) => e.id === widgetId);
const selectedWidget = widgets?.find((e) => e.id === widgetId) as Widgets;
useEffect(() => {
const params = new URLSearchParams(search);


@@ -282,7 +282,7 @@ function SaveView(): JSX.Element {
<div className="save-view-content">
<Typography.Title className="title">Views</Typography.Title>
<Typography.Text className="subtitle">
Manage your saved views for logs.
Manage your saved views for {ROUTES_VS_SOURCEPAGE[pathname]}.
</Typography.Text>
<Input
placeholder="Search for views..."


@@ -10,6 +10,7 @@ import { useDashboardVariablesFromLocalStorage } from 'hooks/dashboard/useDashbo
import useAxiosError from 'hooks/useAxiosError';
import useTabVisibility from 'hooks/useTabFocus';
import { getUpdatedLayout } from 'lib/dashboard/getUpdatedLayout';
import { defaultTo } from 'lodash-es';
import isEqual from 'lodash-es/isEqual';
import isUndefined from 'lodash-es/isUndefined';
import omitBy from 'lodash-es/omitBy';
@@ -37,6 +38,7 @@ import { GlobalReducer } from 'types/reducer/globalTime';
import { v4 as generateUUID } from 'uuid';
import { IDashboardContext } from './types';
import { sortLayout } from './util';
const DashboardContext = createContext<IDashboardContext>({
isDashboardSliderOpen: false,
@@ -47,6 +49,8 @@ const DashboardContext = createContext<IDashboardContext>({
selectedDashboard: {} as Dashboard,
dashboardId: '',
layouts: [],
panelMap: {},
setPanelMap: () => {},
setLayouts: () => {},
setSelectedDashboard: () => {},
updatedTimeRef: {} as React.MutableRefObject<Dayjs | null>,
@@ -94,6 +98,10 @@ export function DashboardProvider({
const [layouts, setLayouts] = useState<Layout[]>([]);
const [panelMap, setPanelMap] = useState<
Record<string, { widgets: Layout[]; collapsed: boolean }>
>({});
const { isLoggedIn } = useSelector<AppState, AppReducer>((state) => state.app);
const dashboardId =
@@ -199,7 +207,9 @@ export function DashboardProvider({
dashboardRef.current = updatedDashboardData;
setLayouts(getUpdatedLayout(updatedDashboardData.data.layout));
setLayouts(sortLayout(getUpdatedLayout(updatedDashboardData.data.layout)));
setPanelMap(defaultTo(updatedDashboardData?.data?.panelMap, {}));
}
if (
@@ -235,7 +245,11 @@ export function DashboardProvider({
updatedTimeRef.current = dayjs(updatedDashboardData.updated_at);
setLayouts(getUpdatedLayout(updatedDashboardData.data.layout));
setLayouts(
sortLayout(getUpdatedLayout(updatedDashboardData.data.layout)),
);
setPanelMap(defaultTo(updatedDashboardData.data.panelMap, {}));
},
});
@@ -256,7 +270,11 @@ export function DashboardProvider({
updatedDashboardData.data.layout,
)
) {
setLayouts(getUpdatedLayout(updatedDashboardData.data.layout));
setLayouts(
sortLayout(getUpdatedLayout(updatedDashboardData.data.layout)),
);
setPanelMap(defaultTo(updatedDashboardData.data.panelMap, {}));
}
}
},
@@ -323,7 +341,9 @@ export function DashboardProvider({
selectedDashboard,
dashboardId,
layouts,
panelMap,
setLayouts,
setPanelMap,
setSelectedDashboard,
updatedTimeRef,
setToScrollWidgetId,
@@ -339,6 +359,7 @@ export function DashboardProvider({
selectedDashboard,
dashboardId,
layouts,
panelMap,
toScrollWidgetId,
updateLocalStorageDashboardVariables,
currentDashboard,


@@ -12,6 +12,8 @@ export interface IDashboardContext {
selectedDashboard: Dashboard | undefined;
dashboardId: string;
layouts: Layout[];
panelMap: Record<string, { widgets: Layout[]; collapsed: boolean }>;
setPanelMap: React.Dispatch<React.SetStateAction<Record<string, any>>>;
setLayouts: React.Dispatch<React.SetStateAction<Layout[]>>;
setSelectedDashboard: React.Dispatch<
React.SetStateAction<Dashboard | undefined>


@@ -1,22 +1,34 @@
import { Layout } from 'react-grid-layout';
import { Dashboard, Widgets } from 'types/api/dashboard/getAll';
export const getPreviousWidgets = (
selectedDashboard: Dashboard,
selectedWidgetIndex: number,
): Widgets[] =>
selectedDashboard.data.widgets?.slice(0, selectedWidgetIndex || 0) || [];
(selectedDashboard.data.widgets?.slice(
0,
selectedWidgetIndex || 0,
) as Widgets[]) || [];
export const getNextWidgets = (
selectedDashboard: Dashboard,
selectedWidgetIndex: number,
): Widgets[] =>
selectedDashboard.data.widgets?.slice(
(selectedDashboard.data.widgets?.slice(
(selectedWidgetIndex || 0) + 1, // this is never undefined
selectedDashboard.data.widgets?.length,
) || [];
) as Widgets[]) || [];
export const getSelectedWidgetIndex = (
selectedDashboard: Dashboard,
widgetId: string | null,
): number =>
selectedDashboard.data.widgets?.findIndex((e) => e.id === widgetId) || 0;
export const sortLayout = (layout: Layout[]): Layout[] =>
[...layout].sort((a, b) => {
if (a.y === b.y) {
return a.x - b.x;
}
return a.y - b.y;
});
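The `sortLayout` helper above orders dashboard panels row-major: top-to-bottom by `y`, with ties broken left-to-right by `x`. A minimal Go sketch of the same comparator, using a hypothetical `item` struct in place of react-grid-layout's `Layout` entries:

```go
package main

import (
	"fmt"
	"sort"
)

// item mirrors the x/y position fields of a react-grid-layout entry.
type item struct{ X, Y int }

// sortLayout returns a copy ordered top-to-bottom, breaking ties
// left-to-right, matching the TypeScript comparator above.
func sortLayout(layout []item) []item {
	out := append([]item(nil), layout...)
	sort.Slice(out, func(i, j int) bool {
		if out[i].Y == out[j].Y {
			return out[i].X < out[j].X
		}
		return out[i].Y < out[j].Y
	})
	return out
}

func main() {
	fmt.Println(sortLayout([]item{{X: 4, Y: 2}, {X: 0, Y: 2}, {X: 0, Y: 0}}))
	// → [{0 0} {0 2} {4 2}]
}
```

Sorting a copy (rather than in place) mirrors the spread (`[...layout]`) in the TypeScript version, so React state is never mutated directly.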


@@ -242,3 +242,7 @@ body {
}
}
}
.ant-notification-notice-message {
padding-right: 20px;
}


@@ -1,4 +1,4 @@
import { PANEL_TYPES } from 'constants/queryBuilder';
import { PANEL_GROUP_TYPES, PANEL_TYPES } from 'constants/queryBuilder';
import { ThresholdProps } from 'container/NewWidget/RightContainer/Threshold/types';
import { timePreferenceType } from 'container/NewWidget/RightContainer/timeItems';
import { ReactNode } from 'react';
@@ -59,13 +59,21 @@ export interface DashboardData {
description?: string;
tags?: string[];
name?: string;
widgets?: Widgets[];
widgets?: Array<WidgetRow | Widgets>;
title: string;
layout?: Layout[];
panelMap?: Record<string, { widgets: Layout[]; collapsed: boolean }>;
variables: Record<string, IDashboardVariable>;
version?: string;
}
export interface WidgetRow {
id: string;
panelTypes: PANEL_GROUP_TYPES;
title: ReactNode;
description: string;
}
export interface IBaseWidget {
isStacked: boolean;
id: string;


@@ -6333,13 +6333,13 @@ bl@^4.1.0:
inherits "^2.0.4"
readable-stream "^3.4.0"
body-parser@1.20.1:
version "1.20.1"
resolved "https://registry.npmjs.org/body-parser/-/body-parser-1.20.1.tgz"
integrity sha512-jWi7abTbYwajOytWCQc37VulmWiRae5RyTpaCyDcS5/lMdtwSz5lOpDE67srw/HYe35f1z3fDQw+3txg7gNtWw==
body-parser@1.20.2:
version "1.20.2"
resolved "https://registry.yarnpkg.com/body-parser/-/body-parser-1.20.2.tgz#6feb0e21c4724d06de7ff38da36dad4f57a747fd"
integrity sha512-ml9pReCu3M61kGlqoTm2umSXTlRTuGTx0bfYj+uIUKKYycG5NtSbeetV3faSU6R7ajOPw0g/J1PvK4qNy7s5bA==
dependencies:
bytes "3.1.2"
content-type "~1.0.4"
content-type "~1.0.5"
debug "2.6.9"
depd "2.0.0"
destroy "1.2.0"
@@ -6347,7 +6347,7 @@ body-parser@1.20.1:
iconv-lite "0.4.24"
on-finished "2.4.1"
qs "6.11.0"
raw-body "2.5.1"
raw-body "2.5.2"
type-is "~1.6.18"
unpipe "1.0.0"
@@ -7123,7 +7123,7 @@ content-disposition@0.5.4:
dependencies:
safe-buffer "5.2.1"
content-type@~1.0.4:
content-type@~1.0.4, content-type@~1.0.5:
version "1.0.5"
resolved "https://registry.npmjs.org/content-type/-/content-type-1.0.5.tgz"
integrity sha512-nTjqfcBFEipKdXCv4YDQWCfmcLZKm81ldF0pAopTvyrFGVbcR6P/VAAd5G7N+0tTr8QqiU0tFadD6FK4NtJwOA==
@@ -7172,10 +7172,10 @@ cookie-signature@1.0.6:
resolved "https://registry.npmjs.org/cookie-signature/-/cookie-signature-1.0.6.tgz"
integrity sha512-QADzlaHc8icV8I7vbaJXJwod9HWYp8uCqf1xa4OfNu1T7JVxQIrUgOWtHdNDtPiywmFbiS12VjotIXLrKM3orQ==
cookie@0.5.0:
version "0.5.0"
resolved "https://registry.npmjs.org/cookie/-/cookie-0.5.0.tgz"
integrity sha512-YZ3GUyn/o8gfKJlnlX7g7xq4gyO6OSuhGPKaaGssGB2qgDUS0gPgtTvoyZLTt9Ab6dC4hfc9dV5arkvc/OCmrw==
cookie@0.6.0:
version "0.6.0"
resolved "https://registry.yarnpkg.com/cookie/-/cookie-0.6.0.tgz#2798b04b071b0ecbff0dbb62a505a8efa4e19051"
integrity sha512-U71cyTamuh1CRNCfpGY6to28lxvNwPG4Guz/EVjgf3Jmzv0vlDp1atT9eS5dDjMYHucpHbWns6Lwf3BKz6svdw==
cookie@^0.4.2:
version "0.4.2"
@@ -8902,16 +8902,16 @@ expect@^29.0.0:
jest-util "^29.5.0"
express@^4.17.3:
version "4.18.2"
resolved "https://registry.yarnpkg.com/express/-/express-4.18.2.tgz#3fabe08296e930c796c19e3c516979386ba9fd59"
integrity sha512-5/PsL6iGPdfQ/lKM1UuielYgv3BUoJfz1aUwU9vHZ+J7gyvwdQXFEBIEIaxeGf0GIcreATNyBExtalisDbuMqQ==
version "4.19.2"
resolved "https://registry.yarnpkg.com/express/-/express-4.19.2.tgz#e25437827a3aa7f2a827bc8171bbbb664a356465"
integrity sha512-5T6nhjsT+EOMzuck8JjBHARTHfMht0POzlA60WV2pMD3gyXw2LZnZ+ueGdNxG+0calOJcWKbpFcuzLZ91YWq9Q==
dependencies:
accepts "~1.3.8"
array-flatten "1.1.1"
body-parser "1.20.1"
body-parser "1.20.2"
content-disposition "0.5.4"
content-type "~1.0.4"
cookie "0.5.0"
cookie "0.6.0"
cookie-signature "1.0.6"
debug "2.6.9"
depd "2.0.0"
@@ -14489,10 +14489,10 @@ range-parser@^1.2.1, range-parser@~1.2.1:
resolved "https://registry.npmjs.org/range-parser/-/range-parser-1.2.1.tgz"
integrity sha512-Hrgsx+orqoygnmhFbKaHE6c296J+HTAQXoxEF6gNupROmmGJRoyzfG3ccAveqCBrwr/2yxQ5BVd/GTl5agOwSg==
raw-body@2.5.1:
version "2.5.1"
resolved "https://registry.npmjs.org/raw-body/-/raw-body-2.5.1.tgz"
integrity sha512-qqJBtEyVgS0ZmPGdCFPWJ3FreoqvG4MVQln/kCgF7Olq95IbOp0/BWyMwbdtn4VTvkM8Y7khCQ2Xgk/tcrCXig==
raw-body@2.5.2:
version "2.5.2"
resolved "https://registry.yarnpkg.com/raw-body/-/raw-body-2.5.2.tgz#99febd83b90e08975087e8f1f9419a149366b68a"
integrity sha512-8zGqypfENjCIqGhgXToC8aB2r7YrBX+AQAfIPs/Mlk+BtPTztOvTS01NRW/3Eh60J+a48lt8qsCzirQ6loCVfA==
dependencies:
bytes "3.1.2"
http-errors "2.0.0"


@@ -51,6 +51,7 @@ import (
"go.signoz.io/signoz/pkg/query-service/common"
"go.signoz.io/signoz/pkg/query-service/constants"
"go.signoz.io/signoz/pkg/query-service/dao"
chErrors "go.signoz.io/signoz/pkg/query-service/errors"
am "go.signoz.io/signoz/pkg/query-service/integrations/alertManager"
"go.signoz.io/signoz/pkg/query-service/interfaces"
"go.signoz.io/signoz/pkg/query-service/model"
@@ -163,12 +164,24 @@ func NewReaderFromClickhouseConnection(
os.Exit(1)
}
regex := os.Getenv("ClickHouseOptimizeReadInOrderRegex")
var regexCompiled *regexp.Regexp
if regex != "" {
regexCompiled, err = regexp.Compile(regex)
if err != nil {
zap.L().Error("Incorrect regex for ClickHouseOptimizeReadInOrderRegex")
os.Exit(1)
}
}
wrap := clickhouseConnWrapper{
conn: db,
settings: ClickhouseQuerySettings{
MaxExecutionTimeLeaf: os.Getenv("ClickHouseMaxExecutionTimeLeaf"),
TimeoutBeforeCheckingExecutionSpeed: os.Getenv("ClickHouseTimeoutBeforeCheckingExecutionSpeed"),
MaxBytesToRead: os.Getenv("ClickHouseMaxBytesToRead"),
OptimizeReadInOrderRegex: os.Getenv("ClickHouseOptimizeReadInOrderRegex"),
OptimizeReadInOrderRegexCompiled: regexCompiled,
},
}
@@ -4558,6 +4571,11 @@ func readRowsForTimeSeriesResult(rows driver.Rows, vars []interface{}, columnNam
return nil, err
}
groupBy, groupAttributes, groupAttributesArray, metricPoint := readRow(vars, columnNames)
// skip the point if the value is NaN or Inf
// are they ever useful enough to be returned?
if math.IsNaN(metricPoint.Value) || math.IsInf(metricPoint.Value, 0) {
continue
}
sort.Strings(groupBy)
key := strings.Join(groupBy, "")
if _, exists := seriesToAttrs[key]; !exists {
@@ -4688,11 +4706,11 @@ func getPersonalisedError(err error) error {
}
zap.L().Error("error while reading result", zap.Error(err))
if strings.Contains(err.Error(), "code: 307") {
return errors.New("query is consuming too much resources, please reach out to the team")
return chErrors.ErrResourceBytesLimitExceeded
}
if strings.Contains(err.Error(), "code: 159") {
return errors.New("Query is taking too long to run, please reach out to the team")
return chErrors.ErrResourceTimeLimitExceeded
}
return err
}


@@ -3,7 +3,7 @@ package clickhouseReader
import (
"context"
"encoding/json"
"strings"
"regexp"
"github.com/ClickHouse/clickhouse-go/v2"
"github.com/ClickHouse/clickhouse-go/v2/lib/driver"
@@ -13,6 +13,8 @@ type ClickhouseQuerySettings struct {
MaxExecutionTimeLeaf string
TimeoutBeforeCheckingExecutionSpeed string
MaxBytesToRead string
OptimizeReadInOrderRegex string
OptimizeReadInOrderRegexCompiled *regexp.Regexp
}
type clickhouseConnWrapper struct {
@@ -40,12 +42,6 @@ func (c clickhouseConnWrapper) addClickHouseSettings(ctx context.Context, query
settings["log_comment"] = logComment
}
// don't add resource restrictions traces
if strings.Contains(query, "signoz_traces") {
ctx = clickhouse.Context(ctx, clickhouse.WithSettings(settings))
return ctx
}
if c.settings.MaxBytesToRead != "" {
settings["max_bytes_to_read"] = c.settings.MaxBytesToRead
}
@@ -58,6 +54,11 @@ func (c clickhouseConnWrapper) addClickHouseSettings(ctx context.Context, query
settings["timeout_before_checking_execution_speed"] = c.settings.TimeoutBeforeCheckingExecutionSpeed
}
// disable optimize_read_in_order only for queries matching the configured regex
if c.settings.OptimizeReadInOrderRegex != "" && c.settings.OptimizeReadInOrderRegexCompiled.Match([]byte(query)) {
settings["optimize_read_in_order"] = 0
}
ctx = clickhouse.Context(ctx, clickhouse.WithSettings(settings))
return ctx
}


@@ -326,7 +326,15 @@ func UpdateDashboard(ctx context.Context, uuid string, data map[string]interface
if existingTotal > newTotal && existingTotal-newTotal > 1 {
// if the total count of panels has reduced by more than 1,
// return error
return nil, model.BadRequest(fmt.Errorf("deleting more than one panel is not supported"))
existingIds := getWidgetIds(dashboard.Data)
newIds := getWidgetIds(data)
differenceIds := getIdDifference(existingIds, newIds)
if len(differenceIds) > 1 {
return nil, model.BadRequest(fmt.Errorf("deleting more than one panel is not supported"))
}
}
dashboard.UpdatedAt = time.Now()
@@ -714,3 +722,52 @@ func countTraceAndLogsPanel(data map[string]interface{}) (int64, int64) {
}
return count, totalPanels
}
func getWidgetIds(data map[string]interface{}) []string {
widgetIds := []string{}
if data != nil && data["widgets"] != nil {
widgets, ok := data["widgets"].(interface{})
if ok {
data, ok := widgets.([]interface{})
if ok {
for _, widget := range data {
sData, ok := widget.(map[string]interface{})
if ok && sData["query"] != nil && sData["id"] != nil {
id, ok := sData["id"].(string)
if ok {
widgetIds = append(widgetIds, id)
}
}
}
}
}
}
return widgetIds
}
func getIdDifference(existingIds []string, newIds []string) []string {
// Convert newIds array to a map for faster lookups
newIdsMap := make(map[string]bool)
for _, id := range newIds {
newIdsMap[id] = true
}
// Initialize a map to keep track of elements in the difference array
differenceMap := make(map[string]bool)
// Initialize the difference array
difference := []string{}
// Iterate through existingIds
for _, id := range existingIds {
// If the id is not found in newIds, and it's not already in the difference array
if _, found := newIdsMap[id]; !found && !differenceMap[id] {
difference = append(difference, id)
differenceMap[id] = true // Mark the id as seen in the difference array
}
}
return difference
}
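The `getIdDifference` helper above computes a de-duplicated set difference so the dashboard update path can tell how many widgets an update would actually delete. The same map-based approach, condensed into a runnable sketch:

```go
package main

import "fmt"

// idDifference returns ids present in existing but absent from next,
// de-duplicated — the same map-based set difference used by the
// UpdateDashboard guard above.
func idDifference(existing, next []string) []string {
	nextSet := make(map[string]bool, len(next))
	for _, id := range next {
		nextSet[id] = true
	}
	seen := make(map[string]bool)
	diff := []string{}
	for _, id := range existing {
		if !nextSet[id] && !seen[id] {
			diff = append(diff, id)
			seen[id] = true
		}
	}
	return diff
}

func main() {
	// Two ids disappear at once, so the guard (len > 1) would reject the update.
	fmt.Println(idDifference([]string{"a", "b", "c"}, []string{"b"}))
	// → [a c]
}
```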


@@ -4,6 +4,7 @@ import (
"fmt"
"math"
"sort"
"time"
"github.com/SigNoz/govaluate"
v3 "go.signoz.io/signoz/pkg/query-service/model/v3"
@@ -89,6 +90,7 @@ func joinAndCalculate(results []*v3.Result, uniqueLabelSet map[string]string, ex
resultSeries := &v3.Series{
Labels: uniqueLabelSet,
Points: make([]v3.Point, 0),
}
timestamps := make([]int64, 0)
for timestamp := range uniqueTimestamps {
@@ -158,7 +160,7 @@ func processResults(results []*v3.Result, expression *govaluate.EvaluableExpress
}, nil
}
var SupportedFunctions = []string{"exp", "log", "ln", "exp2", "log2", "exp10", "log10", "sqrt", "cbrt", "erf", "erfc", "lgamma", "tgamma", "sin", "cos", "tan", "asin", "acos", "atan", "degrees", "radians"}
var SupportedFunctions = []string{"exp", "log", "ln", "exp2", "log2", "exp10", "log10", "sqrt", "cbrt", "erf", "erfc", "lgamma", "tgamma", "sin", "cos", "tan", "asin", "acos", "atan", "degrees", "radians", "now", "toUnixTimestamp"}
func evalFuncs() map[string]govaluate.ExpressionFunction {
GoValuateFuncs := make(map[string]govaluate.ExpressionFunction)
@@ -247,5 +249,21 @@ func evalFuncs() map[string]govaluate.ExpressionFunction {
GoValuateFuncs["radians"] = func(args ...interface{}) (interface{}, error) {
return args[0].(float64) * math.Pi / 180, nil
}
GoValuateFuncs["now"] = func(args ...interface{}) (interface{}, error) {
return time.Now().Unix(), nil
}
GoValuateFuncs["toUnixTimestamp"] = func(args ...interface{}) (interface{}, error) {
if len(args) != 1 {
return nil, fmt.Errorf("toUnixTimestamp requires exactly one argument")
}
t, err := time.Parse(time.RFC3339, args[0].(string))
if err != nil {
return nil, err
}
return t.Unix(), nil
}
return GoValuateFuncs
}
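The new `toUnixTimestamp` govaluate function above parses an RFC 3339 string and returns epoch seconds, erroring on any other format. Extracted into a standalone function for illustration:

```go
package main

import (
	"fmt"
	"time"
)

// toUnixTimestamp converts an RFC 3339 string to epoch seconds, as the
// govaluate function registered above does; any other format is an error.
func toUnixTimestamp(s string) (int64, error) {
	t, err := time.Parse(time.RFC3339, s)
	if err != nil {
		return 0, err
	}
	return t.Unix(), nil
}

func main() {
	ts, err := toUnixTimestamp("2024-05-20T00:00:00Z")
	fmt.Println(ts, err) // → 1716163200 <nil>
}
```

Together with `now`, this lets alert expressions compare series timestamps against wall-clock time.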


@@ -2347,13 +2347,28 @@ func (ah *APIHandler) calculateConnectionStatus(
func (ah *APIHandler) calculateLogsConnectionStatus(
ctx context.Context,
logsConnectionTest *v3.FilterSet,
logsConnectionTest *integrations.LogsConnectionTest,
lookbackSeconds int64,
) (*integrations.SignalConnectionStatus, *model.ApiError) {
if logsConnectionTest == nil {
return nil, nil
}
logsConnTestFilter := &v3.FilterSet{
Operator: "AND",
Items: []v3.FilterItem{
{
Key: v3.AttributeKey{
Key: logsConnectionTest.AttributeKey,
DataType: v3.AttributeKeyDataTypeString,
Type: v3.AttributeKeyTypeTag,
},
Operator: "=",
Value: logsConnectionTest.AttributeValue,
},
},
}
qrParams := &v3.QueryRangeParamsV3{
Start: time.Now().UnixMilli() - (lookbackSeconds * 1000),
End: time.Now().UnixMilli(),
@@ -2363,7 +2378,7 @@ func (ah *APIHandler) calculateLogsConnectionStatus(
BuilderQueries: map[string]*v3.BuilderQuery{
"A": {
PageSize: 1,
Filters: logsConnectionTest,
Filters: logsConnTestFilter,
QueryName: "A",
DataSource: v3.DataSourceLogs,
Expression: "A",
@@ -2892,7 +2907,7 @@ func (aH *APIHandler) autoCompleteAttributeValues(w http.ResponseWriter, r *http
aH.Respond(w, response)
}
func (aH *APIHandler) execClickHouseGraphQueries(ctx context.Context, queries map[string]string) ([]*v3.Result, error, map[string]string) {
func (aH *APIHandler) execClickHouseGraphQueries(ctx context.Context, queries map[string]string) ([]*v3.Result, error, map[string]error) {
type channelResult struct {
Series []*v3.Series
Err error
@@ -2922,13 +2937,13 @@ func (aH *APIHandler) execClickHouseGraphQueries(ctx context.Context, queries ma
close(ch)
var errs []error
errQuriesByName := make(map[string]string)
errQuriesByName := make(map[string]error)
res := make([]*v3.Result, 0)
// read values from the channel
for r := range ch {
if r.Err != nil {
errs = append(errs, r.Err)
errQuriesByName[r.Name] = r.Query
errQuriesByName[r.Name] = r.Err
continue
}
res = append(res, &v3.Result{
@@ -2942,7 +2957,7 @@ func (aH *APIHandler) execClickHouseGraphQueries(ctx context.Context, queries ma
return res, nil, nil
}
func (aH *APIHandler) execClickHouseListQueries(ctx context.Context, queries map[string]string) ([]*v3.Result, error, map[string]string) {
func (aH *APIHandler) execClickHouseListQueries(ctx context.Context, queries map[string]string) ([]*v3.Result, error, map[string]error) {
type channelResult struct {
List []*v3.Row
Err error
@@ -2971,13 +2986,13 @@ func (aH *APIHandler) execClickHouseListQueries(ctx context.Context, queries map
close(ch)
var errs []error
errQuriesByName := make(map[string]string)
errQuriesByName := make(map[string]error)
res := make([]*v3.Result, 0)
// read values from the channel
for r := range ch {
if r.Err != nil {
errs = append(errs, r.Err)
errQuriesByName[r.Name] = r.Query
errQuriesByName[r.Name] = r.Err
continue
}
res = append(res, &v3.Result{
@@ -2991,7 +3006,7 @@ func (aH *APIHandler) execClickHouseListQueries(ctx context.Context, queries map
return res, nil, nil
}
func (aH *APIHandler) execPromQueries(ctx context.Context, metricsQueryRangeParams *v3.QueryRangeParamsV3) ([]*v3.Result, error, map[string]string) {
func (aH *APIHandler) execPromQueries(ctx context.Context, metricsQueryRangeParams *v3.QueryRangeParamsV3) ([]*v3.Result, error, map[string]error) {
type channelResult struct {
Series []*v3.Series
Err error
@@ -3051,13 +3066,13 @@ func (aH *APIHandler) execPromQueries(ctx context.Context, metricsQueryRangePara
close(ch)
var errs []error
errQuriesByName := make(map[string]string)
errQuriesByName := make(map[string]error)
res := make([]*v3.Result, 0)
// read values from the channel
for r := range ch {
if r.Err != nil {
errs = append(errs, r.Err)
errQuriesByName[r.Name] = r.Query
errQuriesByName[r.Name] = r.Err
continue
}
res = append(res, &v3.Result{
@@ -3155,7 +3170,7 @@ func (aH *APIHandler) queryRangeV3(ctx context.Context, queryRangeParams *v3.Que
var result []*v3.Result
var err error
var errQuriesByName map[string]string
var errQuriesByName map[string]error
var spanKeys map[string]v3.AttributeKey
if queryRangeParams.CompositeQuery.QueryType == v3.QueryTypeBuilder {
// check if any enrichment is required for logs if yes then enrich them
@@ -3412,7 +3427,7 @@ func (aH *APIHandler) queryRangeV4(ctx context.Context, queryRangeParams *v3.Que
var result []*v3.Result
var err error
var errQuriesByName map[string]string
var errQuriesByName map[string]error
var spanKeys map[string]v3.AttributeKey
if queryRangeParams.CompositeQuery.QueryType == v3.QueryTypeBuilder {
// check if any enrichment is required for logs if yes then enrich them


@@ -1,6 +1,7 @@
package integrations
import (
"bytes"
"context"
"embed"
"strings"
@@ -120,7 +121,9 @@ func readBuiltInIntegration(dirpath string) (
}
var integration IntegrationDetails
err = json.Unmarshal(hydratedSpecJson, &integration)
decoder := json.NewDecoder(bytes.NewReader(hydratedSpecJson))
decoder.DisallowUnknownFields()
err = decoder.Decode(&integration)
if err != nil {
return nil, fmt.Errorf(
"couldn't parse hydrated JSON spec read from %s: %w",
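The switch above from plain `json.Unmarshal` to a `json.Decoder` with `DisallowUnknownFields` makes integration-spec parsing strict: any field not declared on the target struct is now a decode error instead of being silently dropped. A minimal sketch of the pattern (the `spec` struct here is illustrative, not the real `IntegrationDetails` type):

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
)

// spec is a stand-in for the real integration spec struct.
type spec struct {
	Id string `json:"id"`
}

// decodeStrict rejects JSON containing fields not declared on the target
// struct — the behaviour switched on for integration specs above.
func decodeStrict(data []byte, v interface{}) error {
	dec := json.NewDecoder(bytes.NewReader(data))
	dec.DisallowUnknownFields()
	return dec.Decode(v)
}

func main() {
	var s spec
	fmt.Println(decodeStrict([]byte(`{"id":"nginx"}`), &s))         // nil
	fmt.Println(decodeStrict([]byte(`{"id":"x","typo":true}`), &s)) // unknown-field error
}
```

Strict decoding catches typos in built-in integration JSON at load time rather than shipping a spec with silently ignored keys.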


@@ -78,3 +78,5 @@ Make the collector config file available to your otel collector and use it by ad
```
Note: the collector can use multiple config files, specified by multiple occurrences of the --config flag.
Also note that only 1 collector instance should be configured to collect query_logs.
Using multiple collector instances or replicas with this config will lead to duplicate logs.


@@ -30,7 +30,7 @@ To configure metrics and logs collection for a Clickhouse server, you need the f
- **Ensure that an OTEL collector is running in your deployment environment**
If needed, please [install SigNoz OTEL Collector](https://signoz.io/docs/tutorial/opentelemetry-binary-usage-in-virtual-machine/)
If already installed, ensure that the collector version is v0.88.0 or newer.
If collecting logs from system.query_log table, ensure that the collector version is v0.88.22 or newer.
If collecting logs from system.query_log table, ensure that the collector version is v0.88.23 or newer.
Also ensure that you can provide config files to the collector and that you can set environment variables and command line flags used for running it.


@@ -41,18 +41,8 @@
},
"connection_tests": {
"logs": {
"op": "AND",
"items": [
{
"key": {
"type": "tag",
"key": "source",
"dataType": "string"
},
"op": "=",
"value": "clickhouse"
}
]
"attribute_key": "source",
"attribute_value": "clickhouse"
}
},
"data_collected": "file://data-collected.json"


@@ -37,18 +37,8 @@
},
"connection_tests": {
"logs": {
"op": "AND",
"items": [
{
"key": {
"type": "tag",
"key": "source",
"dataType": "string"
},
"op": "=",
"value": "mongo"
}
]
"attribute_key": "source",
"attribute_value": "mongodb"
}
},
"data_collected": {


@@ -32,18 +32,8 @@
},
"connection_tests": {
"logs": {
"op": "AND",
"items": [
{
"key": {
"type": "tag",
"key": "source",
"dataType": "string"
},
"op": "=",
"value": "nginx"
}
]
"attribute_key": "source",
"attribute_value": "nginx"
}
},
"data_collected": {


@@ -37,18 +37,8 @@
},
"connection_tests": {
"logs": {
"op": "AND",
"items": [
{
"key": {
"type": "tag",
"key": "source",
"dataType": "string"
},
"op": "=",
"value": "postgres"
}
]
"attribute_key": "source",
"attribute_value": "postgres"
}
},
"data_collected": {


@@ -37,18 +37,8 @@
},
"connection_tests": {
"logs": {
"op": "AND",
"items": [
{
"key": {
"type": "tag",
"key": "source",
"dataType": "string"
},
"op": "=",
"value": "redis"
}
]
"attribute_key": "source",
"attribute_value": "redis"
}
},
"data_collected": {


@@ -12,7 +12,6 @@ import (
"go.signoz.io/signoz/pkg/query-service/app/dashboards"
"go.signoz.io/signoz/pkg/query-service/app/logparsingpipeline"
"go.signoz.io/signoz/pkg/query-service/model"
v3 "go.signoz.io/signoz/pkg/query-service/model/v3"
"go.signoz.io/signoz/pkg/query-service/rules"
"go.signoz.io/signoz/pkg/query-service/utils"
)
@@ -60,9 +59,10 @@ type CollectedLogAttribute struct {
}
type CollectedMetric struct {
Name string `json:"name"`
Type string `json:"type"`
Unit string `json:"unit"`
Name string `json:"name"`
Type string `json:"type"`
Unit string `json:"unit"`
Description string `json:"description"`
}
type SignalConnectionStatus struct {
@@ -75,9 +75,14 @@ type IntegrationConnectionStatus struct {
Metrics *SignalConnectionStatus `json:"metrics"`
}
// log attribute value to use for finding logs for the integration.
type LogsConnectionTest struct {
AttributeKey string `json:"attribute_key"`
AttributeValue string `json:"attribute_value"`
}
type IntegrationConnectionTests struct {
// Filter to use for finding logs for the integration.
Logs *v3.FilterSet `json:"logs"`
Logs *LogsConnectionTest `json:"logs"`
// Metric names expected to have been received for the integration.
Metrics []string `json:"metrics"`
@@ -253,7 +258,7 @@ func (m *Manager) UninstallIntegration(
func (m *Manager) GetPipelinesForInstalledIntegrations(
ctx context.Context,
) ([]logparsingpipeline.Pipeline, *model.ApiError) {
installedIntegrations, apiErr := m.getDetailsForInstalledIntegrations(ctx)
installedIntegrations, apiErr := m.getInstalledIntegrations(ctx)
if apiErr != nil {
return nil, apiErr
}
@@ -322,10 +327,15 @@ func (m *Manager) GetInstalledIntegrationDashboardById(
if dId, exists := dd["id"]; exists {
if id, ok := dId.(string); ok && id == dashboardId {
isLocked := 1
author := "integration"
return &dashboards.Dashboard{
Uuid: m.dashboardUuid(integrationId, string(dashboardId)),
Locked: &isLocked,
Data: dd,
Uuid: m.dashboardUuid(integrationId, string(dashboardId)),
Locked: &isLocked,
Data: dd,
CreatedAt: integration.Installation.InstalledAt,
CreateBy: &author,
UpdatedAt: integration.Installation.InstalledAt,
UpdateBy: &author,
}, nil
}
}
@@ -339,7 +349,7 @@ func (m *Manager) GetInstalledIntegrationDashboardById(
func (m *Manager) GetDashboardsForInstalledIntegrations(
ctx context.Context,
) ([]dashboards.Dashboard, *model.ApiError) {
installedIntegrations, apiErr := m.getDetailsForInstalledIntegrations(ctx)
installedIntegrations, apiErr := m.getInstalledIntegrations(ctx)
if apiErr != nil {
return nil, apiErr
}
@@ -351,10 +361,15 @@ func (m *Manager) GetDashboardsForInstalledIntegrations(
if dId, exists := dd["id"]; exists {
if dashboardId, ok := dId.(string); ok {
isLocked := 1
author := "integration"
result = append(result, dashboards.Dashboard{
Uuid: m.dashboardUuid(ii.IntegrationSummary.Id, dashboardId),
Locked: &isLocked,
Data: dd,
Uuid: m.dashboardUuid(ii.IntegrationSummary.Id, dashboardId),
Locked: &isLocked,
Data: dd,
CreatedAt: ii.Installation.InstalledAt,
CreateBy: &author,
UpdatedAt: ii.Installation.InstalledAt,
UpdateBy: &author,
})
}
}
@@ -413,10 +428,10 @@ func (m *Manager) getInstalledIntegration(
return &installation, nil
}
func (m *Manager) getDetailsForInstalledIntegrations(
func (m *Manager) getInstalledIntegrations(
ctx context.Context,
) (
map[string]IntegrationDetails, *model.ApiError,
map[string]Integration, *model.ApiError,
) {
installations, apiErr := m.installedIntegrationsRepo.list(ctx)
if apiErr != nil {
@@ -426,5 +441,24 @@ func (m *Manager) getDetailsForInstalledIntegrations(
installedIds := utils.MapSlice(installations, func(i InstalledIntegration) string {
return i.IntegrationId
})
return m.availableIntegrationsRepo.get(ctx, installedIds)
integrationDetails, apiErr := m.availableIntegrationsRepo.get(ctx, installedIds)
if apiErr != nil {
return nil, apiErr
}
result := map[string]Integration{}
for _, ii := range installations {
iDetails, exists := integrationDetails[ii.IntegrationId]
if !exists {
return nil, model.InternalError(fmt.Errorf(
"couldn't find integration details for %s", ii.IntegrationId,
))
}
result[ii.IntegrationId] = Integration{
Installation: &ii,
IntegrationDetails: iDetails,
}
}
return result, nil
}


@@ -96,19 +96,9 @@ func (t *TestAvailableIntegrationsRepo) list(
Alerts: []rules.PostableRule{},
},
ConnectionTests: &IntegrationConnectionTests{
Logs: &v3.FilterSet{
Operator: "AND",
Items: []v3.FilterItem{
{
Key: v3.AttributeKey{
Key: "source",
DataType: v3.AttributeKeyDataTypeString,
Type: v3.AttributeKeyTypeTag,
},
Operator: "=",
Value: "nginx",
},
},
Logs: &LogsConnectionTest{
AttributeKey: "source",
AttributeValue: "nginx",
},
},
}, {
@@ -174,19 +164,9 @@ func (t *TestAvailableIntegrationsRepo) list(
Alerts: []rules.PostableRule{},
},
ConnectionTests: &IntegrationConnectionTests{
Logs: &v3.FilterSet{
Operator: "AND",
Items: []v3.FilterItem{
{
Key: v3.AttributeKey{
Key: "source",
DataType: v3.AttributeKeyDataTypeString,
Type: v3.AttributeKeyTypeTag,
},
Operator: "=",
Value: "nginx",
},
},
Logs: &LogsConnectionTest{
AttributeKey: "source",
AttributeValue: "nginx",
},
},
},


@@ -1,6 +1,7 @@
package app
import (
"math"
"sort"
"strings"
@@ -39,16 +40,27 @@ func applyMetricLimit(results []*v3.Result, queryRangeParams *v3.QueryRangeParam
}
}
// For graph type queries, sort based on GroupingSetsPoint
if result.Series[i].GroupingSetsPoint == nil || result.Series[j].GroupingSetsPoint == nil {
// Handle nil GroupingSetsPoint, if needed
// Here, we assume non-nil values are always less than nil values
return result.Series[i].GroupingSetsPoint != nil
ithSum, jthSum, ithCount, jthCount := 0.0, 0.0, 1.0, 1.0
for _, point := range result.Series[i].Points {
if math.IsNaN(point.Value) || math.IsInf(point.Value, 0) {
continue
}
ithSum += point.Value
ithCount++
}
for _, point := range result.Series[j].Points {
if math.IsNaN(point.Value) || math.IsInf(point.Value, 0) {
continue
}
jthSum += point.Value
jthCount++
}
if orderBy.Order == "asc" {
return result.Series[i].GroupingSetsPoint.Value < result.Series[j].GroupingSetsPoint.Value
return ithSum/ithCount < jthSum/jthCount
} else if orderBy.Order == "desc" {
return result.Series[i].GroupingSetsPoint.Value > result.Series[j].GroupingSetsPoint.Value
return ithSum/ithCount > jthSum/jthCount
}
} else {
// Sort based on Labels map


@@ -145,12 +145,13 @@ func enrichFieldWithMetadata(field v3.AttributeKey, fields map[string]v3.Attribu
// check if the field is present in the fields map
if existingField, ok := fields[field.Key]; ok {
if existingField.IsColumn {
// don't update if type is not the same
if (field.Type == "" && field.DataType == "") ||
(field.Type == existingField.Type && field.DataType == existingField.DataType) ||
(field.Type == "" && field.DataType == existingField.DataType) ||
(field.DataType == "" && field.Type == existingField.Type) {
return existingField
}
field.Type = existingField.Type
field.DataType = existingField.DataType
return field
}
// enrich with default values if metadata is not found


@@ -342,6 +342,57 @@ var testEnrichParamsData = []struct {
},
},
},
{
Name: "Don't enrich if other keys are non empty and not same",
Params: v3.QueryRangeParamsV3{
CompositeQuery: &v3.CompositeQuery{
BuilderQueries: map[string]*v3.BuilderQuery{
"test": {
QueryName: "test",
Expression: "test",
DataSource: v3.DataSourceLogs,
AggregateAttribute: v3.AttributeKey{
Key: "test",
Type: v3.AttributeKeyTypeResource,
DataType: v3.AttributeKeyDataTypeInt64,
},
Filters: &v3.FilterSet{Operator: "AND", Items: []v3.FilterItem{
{Key: v3.AttributeKey{Key: "test", Type: v3.AttributeKeyTypeTag}, Value: "test", Operator: "="},
{Key: v3.AttributeKey{Key: "test", DataType: v3.AttributeKeyDataTypeString}, Value: "test1", Operator: "="},
}},
},
},
},
},
Fields: map[string]v3.AttributeKey{
"test": {
Key: "test",
Type: v3.AttributeKeyTypeTag,
DataType: v3.AttributeKeyDataTypeString,
IsColumn: true,
},
},
Result: v3.QueryRangeParamsV3{
CompositeQuery: &v3.CompositeQuery{
BuilderQueries: map[string]*v3.BuilderQuery{
"test": {
QueryName: "test",
Expression: "test",
DataSource: v3.DataSourceLogs,
AggregateAttribute: v3.AttributeKey{
Key: "test",
Type: v3.AttributeKeyTypeResource,
DataType: v3.AttributeKeyDataTypeInt64,
},
Filters: &v3.FilterSet{Operator: "AND", Items: []v3.FilterItem{
{Key: v3.AttributeKey{Key: "test", Type: v3.AttributeKeyTypeTag, DataType: v3.AttributeKeyDataTypeString, IsColumn: true}, Value: "test", Operator: "="},
{Key: v3.AttributeKey{Key: "test", Type: v3.AttributeKeyTypeTag, DataType: v3.AttributeKeyDataTypeString, IsColumn: true}, Value: "test1", Operator: "="},
}},
},
},
},
},
},
}
func TestEnrichParams(t *testing.T) {


@@ -36,33 +36,6 @@ func buildMetricQueryForTable(start, end, _ int64, mq *v3.BuilderQuery, tableNam
metricQueryGroupBy := mq.GroupBy
// if the aggregate operator is a histogram quantile, and user has not forgotten
// the le tag in the group by then add the le tag to the group by
if mq.AggregateOperator == v3.AggregateOperatorHistQuant50 ||
mq.AggregateOperator == v3.AggregateOperatorHistQuant75 ||
mq.AggregateOperator == v3.AggregateOperatorHistQuant90 ||
mq.AggregateOperator == v3.AggregateOperatorHistQuant95 ||
mq.AggregateOperator == v3.AggregateOperatorHistQuant99 {
found := false
for _, tag := range mq.GroupBy {
if tag.Key == "le" {
found = true
break
}
}
if !found {
metricQueryGroupBy = append(
metricQueryGroupBy,
v3.AttributeKey{
Key: "le",
DataType: v3.AttributeKeyDataTypeString,
Type: v3.AttributeKeyTypeTag,
IsColumn: false,
},
)
}
}
filterSubQuery, err := buildMetricsTimeSeriesFilterQuery(mq.Filters, metricQueryGroupBy, mq)
if err != nil {
return "", err


@@ -60,6 +60,11 @@ func TestPanelTableForCumulative(t *testing.T) {
},
},
Expression: "A",
GroupBy: []v3.AttributeKey{
{
Key: "le",
},
},
},
expected: "SELECT toStartOfHour(now()) as ts, histogramQuantile(arrayMap(x -> toFloat64(x), groupArray(le)), groupArray(value), 0.500) as value FROM (SELECT le, toStartOfHour(now()) as ts, sum(rate_value)/29 as value FROM (SELECT le, ts, If((value - lagInFrame(value, 1, 0) OVER rate_window) < 0, nan, If((ts - lagInFrame(ts, 1, toDate('1970-01-01')) OVER rate_window) >= 86400, nan, (value - lagInFrame(value, 1, 0) OVER rate_window) / (ts - lagInFrame(ts, 1, toDate('1970-01-01')) OVER rate_window))) as rate_value FROM(SELECT fingerprint, le, toStartOfInterval(toDateTime(intDiv(timestamp_ms, 1000)), INTERVAL 60 SECOND) as ts, max(value) as value FROM signoz_metrics.distributed_samples_v2 INNER JOIN (SELECT JSONExtractString(labels, 'le') as le, fingerprint FROM signoz_metrics.time_series_v2 WHERE metric_name = 'signoz_latency_bucket' AND temporality IN ['Cumulative', 'Unspecified'] AND JSONExtractString(labels, 'service_name') = 'frontend') as filtered_time_series USING fingerprint WHERE metric_name = 'signoz_latency_bucket' AND timestamp_ms >= 1689255866000 AND timestamp_ms <= 1689257640000 GROUP BY fingerprint, le,ts ORDER BY fingerprint, le ASC, ts) WINDOW rate_window as (PARTITION BY fingerprint, le ORDER BY fingerprint, le ASC, ts)) WHERE isNaN(rate_value) = 0 GROUP BY le,ts ORDER BY le ASC, ts) GROUP BY ts ORDER BY ts",
},
@@ -77,6 +82,9 @@ func TestPanelTableForCumulative(t *testing.T) {
{
Key: "service_name",
},
{
Key: "le",
},
},
Expression: "A",
},


@@ -12,39 +12,22 @@ func buildDeltaMetricQuery(start, end, step int64, mq *v3.BuilderQuery, tableNam
metricQueryGroupBy := mq.GroupBy
// if the aggregate operator is a histogram quantile, and user has not forgotten
// the le tag in the group by then add the le tag to the group by
if mq.AggregateOperator == v3.AggregateOperatorHistQuant50 ||
mq.AggregateOperator == v3.AggregateOperatorHistQuant75 ||
mq.AggregateOperator == v3.AggregateOperatorHistQuant90 ||
mq.AggregateOperator == v3.AggregateOperatorHistQuant95 ||
mq.AggregateOperator == v3.AggregateOperatorHistQuant99 {
found := false
for _, tag := range mq.GroupBy {
if tag.Key == "le" {
found = true
break
}
}
if !found {
metricQueryGroupBy = append(
metricQueryGroupBy,
v3.AttributeKey{
Key: "le",
DataType: v3.AttributeKeyDataTypeString,
Type: v3.AttributeKeyTypeTag,
IsColumn: false,
},
)
}
}
if mq.Filters != nil {
mq.Filters.Items = append(mq.Filters.Items, v3.FilterItem{
Key: v3.AttributeKey{Key: "__temporality__"},
Operator: v3.FilterOperatorEqual,
Value: "Delta",
})
}
if mq.Filters != nil {
temporalityFound := false
for _, filter := range mq.Filters.Items {
if filter.Key.Key == "__temporality__" {
temporalityFound = true
break
}
}
if !temporalityFound {
mq.Filters.Items = append(mq.Filters.Items, v3.FilterItem{
Key: v3.AttributeKey{Key: "__temporality__"},
Operator: v3.FilterOperatorEqual,
Value: "Delta",
})
}
}
filterSubQuery, err := buildMetricsTimeSeriesFilterQuery(mq.Filters, metricQueryGroupBy, mq)
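The guarded append above (add the `__temporality__` filter only when no such filter already exists) can be sketched in isolation. The types below are simplified stand-ins for the `v3` model, not the actual query-service types:

```go
package main

// FilterItem and FilterSet are simplified stand-ins for the v3 model types;
// the real query-service types carry many more fields.
type FilterItem struct {
	Key      string
	Operator string
	Value    string
}

type FilterSet struct {
	Items []FilterItem
}

// ensureDeltaTemporality appends a __temporality__ = Delta filter only if
// one is not already present, so repeated query preparation stays idempotent.
func ensureDeltaTemporality(fs *FilterSet) {
	if fs == nil {
		return
	}
	for _, item := range fs.Items {
		if item.Key == "__temporality__" {
			return // a previous pass (or the caller) already set it
		}
	}
	fs.Items = append(fs.Items, FilterItem{
		Key:      "__temporality__",
		Operator: "=",
		Value:    "Delta",
	})
}
```

The `temporalityFound` check is what makes query preparation safe to run more than once on the same `BuilderQuery`, which the unguarded append was not.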


@@ -141,33 +141,6 @@ func buildMetricQuery(start, end, step int64, mq *v3.BuilderQuery, tableName str
metricQueryGroupBy := mq.GroupBy
// if the aggregate operator is a histogram quantile, and user has not forgotten
// the le tag in the group by then add the le tag to the group by
if mq.AggregateOperator == v3.AggregateOperatorHistQuant50 ||
mq.AggregateOperator == v3.AggregateOperatorHistQuant75 ||
mq.AggregateOperator == v3.AggregateOperatorHistQuant90 ||
mq.AggregateOperator == v3.AggregateOperatorHistQuant95 ||
mq.AggregateOperator == v3.AggregateOperatorHistQuant99 {
found := false
for _, tag := range mq.GroupBy {
if tag.Key == "le" {
found = true
break
}
}
if !found {
metricQueryGroupBy = append(
metricQueryGroupBy,
v3.AttributeKey{
Key: "le",
DataType: v3.AttributeKeyDataTypeString,
Type: v3.AttributeKeyTypeTag,
IsColumn: false,
},
)
}
}
filterSubQuery, err := buildMetricsTimeSeriesFilterQuery(mq.Filters, metricQueryGroupBy, mq)
if err != nil {
return "", err


@@ -23,6 +23,8 @@ func PrepareMetricQuery(start, end int64, queryType v3.QueryType, panelType v3.P
var quantile float64
percentileOperator := mq.SpaceAggregation
if v3.IsPercentileOperator(mq.SpaceAggregation) &&
mq.AggregateAttribute.Type != v3.AttributeKeyType(v3.MetricTypeExponentialHistogram) {
quantile = v3.GetPercentileFromOperator(mq.SpaceAggregation)
@@ -80,6 +82,7 @@ func PrepareMetricQuery(start, end int64, queryType v3.QueryType, panelType v3.P
// fixed-bucket histogram quantiles are calculated with UDF
if quantile != 0 && mq.AggregateAttribute.Type != v3.AttributeKeyType(v3.MetricTypeExponentialHistogram) {
query = fmt.Sprintf(`SELECT %s, histogramQuantile(arrayMap(x -> toFloat64(x), groupArray(le)), groupArray(value), %.3f) as value FROM (%s) GROUP BY %s ORDER BY %s`, groupBy, quantile, query, groupBy, orderBy)
mq.SpaceAggregation = percentileOperator
}
return query, nil


@@ -20,6 +20,7 @@ import (
"go.signoz.io/signoz/pkg/query-service/app/metrics"
"go.signoz.io/signoz/pkg/query-service/app/queryBuilder"
"go.signoz.io/signoz/pkg/query-service/auth"
"go.signoz.io/signoz/pkg/query-service/common"
"go.signoz.io/signoz/pkg/query-service/constants"
"go.signoz.io/signoz/pkg/query-service/model"
v3 "go.signoz.io/signoz/pkg/query-service/model/v3"
@@ -1004,6 +1005,7 @@ func ParseQueryRangeParams(r *http.Request) (*v3.QueryRangeParamsV3, *model.ApiE
if queryRangeParams.CompositeQuery.QueryType == v3.QueryTypeBuilder {
for _, query := range queryRangeParams.CompositeQuery.BuilderQueries {
// Formula query
// Check if the queries used in the expression can be joined
if query.QueryName != query.Expression {
expression, err := govaluate.NewEvaluableExpressionWithFunctions(query.Expression, evalFuncs())
if err != nil {
@@ -1038,6 +1040,12 @@ func ParseQueryRangeParams(r *http.Request) (*v3.QueryRangeParamsV3, *model.ApiE
}
}
// If the step interval is less than the minimum allowed step interval, set it to the minimum allowed step interval
if minStep := common.MinAllowedStepInterval(queryRangeParams.Start, queryRangeParams.End); query.StepInterval < minStep {
query.StepInterval = minStep
}
// Remove the time shift function from the list of functions and set the shift by value
var timeShiftBy int64
if len(query.Functions) > 0 {
for idx := range query.Functions {
@@ -1057,16 +1065,45 @@ func ParseQueryRangeParams(r *http.Request) (*v3.QueryRangeParamsV3, *model.ApiE
}
query.ShiftBy = timeShiftBy
// for metrics v3
// if the aggregate operator is a histogram quantile, and user has not forgotten
// the le tag in the group by then add the le tag to the group by
if query.AggregateOperator == v3.AggregateOperatorHistQuant50 ||
query.AggregateOperator == v3.AggregateOperatorHistQuant75 ||
query.AggregateOperator == v3.AggregateOperatorHistQuant90 ||
query.AggregateOperator == v3.AggregateOperatorHistQuant95 ||
query.AggregateOperator == v3.AggregateOperatorHistQuant99 {
found := false
for _, tag := range query.GroupBy {
if tag.Key == "le" {
found = true
break
}
}
if !found {
query.GroupBy = append(
query.GroupBy,
v3.AttributeKey{
Key: "le",
DataType: v3.AttributeKeyDataTypeString,
Type: v3.AttributeKeyTypeTag,
IsColumn: false,
},
)
}
}
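The le-injection step that this hunk centralizes in `ParseQueryRangeParams` can be exercised on its own. This sketch uses plain strings in place of `v3.AttributeKey`, and the operator names are illustrative rather than the real `v3` constants:

```go
package main

// histogramQuantileOperators mirrors the set of aggregate operators that
// require a per-bucket "le" grouping (illustrative names, not v3 constants).
var histogramQuantileOperators = map[string]bool{
	"hist_quantile_50": true,
	"hist_quantile_75": true,
	"hist_quantile_90": true,
	"hist_quantile_95": true,
	"hist_quantile_99": true,
}

// ensureLeGroupBy adds "le" to the group-by keys when a histogram quantile
// operator is used and the user has not added it themselves.
func ensureLeGroupBy(operator string, groupBy []string) []string {
	if !histogramQuantileOperators[operator] {
		return groupBy
	}
	for _, key := range groupBy {
		if key == "le" {
			return groupBy
		}
	}
	return append(groupBy, "le")
}
```

Doing this once at parse time means the v3 query builders no longer each need their own copy of the check, which is why the same block disappears from `buildMetricQuery` and `buildDeltaMetricQuery`.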
if query.Filters == nil || len(query.Filters.Items) == 0 {
continue
}
for idx := range query.Filters.Items {
item := &query.Filters.Items[idx]
value := item.Value
if value != nil {
switch x := value.(type) {
case string:
variableName := strings.Trim(x, "{{ . }}")
variableName := strings.Trim(x, "{[.$]}")
if _, ok := queryRangeParams.Variables[variableName]; ok {
item.Value = queryRangeParams.Variables[variableName]
}
@@ -1074,7 +1111,7 @@ func ParseQueryRangeParams(r *http.Request) (*v3.QueryRangeParamsV3, *model.ApiE
if len(x) > 0 {
switch x[0].(type) {
case string:
variableName := strings.Trim(x[0].(string), "{{ . }}")
variableName := strings.Trim(x[0].(string), "{[.$]}")
if _, ok := queryRangeParams.Variables[variableName]; ok {
item.Value = queryRangeParams.Variables[variableName]
}
@@ -1082,6 +1119,13 @@ func ParseQueryRangeParams(r *http.Request) (*v3.QueryRangeParamsV3, *model.ApiE
}
}
}
if v3.FilterOperator(strings.ToLower((string(item.Operator)))) != v3.FilterOperatorIn && v3.FilterOperator(strings.ToLower((string(item.Operator)))) != v3.FilterOperatorNotIn {
// the value type should not be multiple values
if _, ok := item.Value.([]interface{}); ok {
return nil, &model.ApiError{Typ: model.ErrorBadData, Err: fmt.Errorf("multiple values %s are not allowed for operator `%s` for key `%s`", item.Value, item.Operator, item.Key.Key)}
}
}
}
}
}
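The multi-value guard added above can be isolated as a small validator. `validateFilterValue` is a hypothetical helper with a simplified operator set, not the actual query-service API:

```go
package main

import "fmt"

// validateFilterValue rejects slice values for operators other than in/nin,
// mirroring the guard added to ParseQueryRangeParams: only multi-select
// operators may receive multiple values.
func validateFilterValue(operator, key string, value interface{}) error {
	if operator == "in" || operator == "nin" {
		return nil
	}
	if _, ok := value.([]interface{}); ok {
		return fmt.Errorf("multiple values %v are not allowed for operator `%s` for key `%s`", value, operator, key)
	}
	return nil
}
```

This is the check behind the "multiple values for single select operator" test case further down, where a multi-value dashboard variable is bound to an `=` filter and the parse is expected to fail.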
@@ -1099,6 +1143,13 @@ func ParseQueryRangeParams(r *http.Request) (*v3.QueryRangeParamsV3, *model.ApiE
if chQuery.Disabled {
continue
}
for name, value := range queryRangeParams.Variables {
chQuery.Query = strings.Replace(chQuery.Query, fmt.Sprintf("{{%s}}", name), fmt.Sprint(value), -1)
chQuery.Query = strings.Replace(chQuery.Query, fmt.Sprintf("[[%s]]", name), fmt.Sprint(value), -1)
chQuery.Query = strings.Replace(chQuery.Query, fmt.Sprintf("$%s", name), fmt.Sprint(value), -1)
}
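The three dashboard-variable syntaxes ({{name}}, [[name]], $name) are handled with plain `strings.Replace`, as in the loop above; the same pattern is repeated for PromQL queries below. A standalone sketch:

```go
package main

import (
	"fmt"
	"strings"
)

// substituteVariables replaces {{name}}, [[name]] and $name occurrences in a
// raw query string with the variable's value, matching the replacement order
// used for the ClickHouse and PromQL query paths.
func substituteVariables(query string, variables map[string]interface{}) string {
	for name, value := range variables {
		query = strings.Replace(query, fmt.Sprintf("{{%s}}", name), fmt.Sprint(value), -1)
		query = strings.Replace(query, fmt.Sprintf("[[%s]]", name), fmt.Sprint(value), -1)
		query = strings.Replace(query, fmt.Sprintf("$%s", name), fmt.Sprint(value), -1)
	}
	return query
}
```

Note that the `$name` form is a bare prefix match: a variable named `service` would also replace the `$service` prefix of `$service_name`, and map iteration order is random, so overlapping variable names are best avoided.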
tmpl := template.New("clickhouse-query")
tmpl, err := tmpl.Parse(chQuery.Query)
if err != nil {
@@ -1123,6 +1174,13 @@ func ParseQueryRangeParams(r *http.Request) (*v3.QueryRangeParamsV3, *model.ApiE
if promQuery.Disabled {
continue
}
for name, value := range queryRangeParams.Variables {
promQuery.Query = strings.Replace(promQuery.Query, fmt.Sprintf("{{%s}}", name), fmt.Sprint(value), -1)
promQuery.Query = strings.Replace(promQuery.Query, fmt.Sprintf("[[%s]]", name), fmt.Sprint(value), -1)
promQuery.Query = strings.Replace(promQuery.Query, fmt.Sprintf("$%s", name), fmt.Sprint(value), -1)
}
tmpl := template.New("prometheus-query")
tmpl, err := tmpl.Parse(promQuery.Query)
if err != nil {


@@ -11,6 +11,7 @@ import (
"github.com/stretchr/testify/assert"
"github.com/stretchr/testify/require"
"go.signoz.io/signoz/pkg/query-service/common"
v3 "go.signoz.io/signoz/pkg/query-service/model/v3"
)
@@ -651,12 +652,12 @@ func TestParseQueryRangeParamsDashboardVarsSubstitution(t *testing.T) {
Items: []v3.FilterItem{
{
Key: v3.AttributeKey{Key: "service_name", DataType: v3.AttributeKeyDataTypeString, Type: v3.AttributeKeyTypeTag},
Operator: "EQ",
Operator: v3.FilterOperatorEqual,
Value: "{{.service_name}}",
},
{
Key: v3.AttributeKey{Key: "operation_name", DataType: v3.AttributeKeyDataTypeString, Type: v3.AttributeKeyTypeTag},
Operator: "IN",
Operator: v3.FilterOperatorIn,
Value: "{{.operation_name}}",
},
},
@@ -674,6 +675,161 @@ func TestParseQueryRangeParamsDashboardVarsSubstitution(t *testing.T) {
expectErr: false,
expectedValue: []interface{}{"route", []interface{}{"GET /route", "POST /route"}},
},
{
desc: "valid builder query with dashboard variables {{service_name}} and {{operation_name}}",
compositeQuery: v3.CompositeQuery{
PanelType: v3.PanelTypeGraph,
QueryType: v3.QueryTypeBuilder,
BuilderQueries: map[string]*v3.BuilderQuery{
"A": {
QueryName: "A",
DataSource: v3.DataSourceMetrics,
AggregateOperator: v3.AggregateOperatorSum,
AggregateAttribute: v3.AttributeKey{Key: "attribute_metrics"},
Expression: "A",
Filters: &v3.FilterSet{
Operator: "AND",
Items: []v3.FilterItem{
{
Key: v3.AttributeKey{Key: "service_name", DataType: v3.AttributeKeyDataTypeString, Type: v3.AttributeKeyTypeTag},
Operator: v3.FilterOperatorEqual,
Value: "{{service_name}}",
},
{
Key: v3.AttributeKey{Key: "operation_name", DataType: v3.AttributeKeyDataTypeString, Type: v3.AttributeKeyTypeTag},
Operator: v3.FilterOperatorIn,
Value: "{{operation_name}}",
},
},
},
},
},
},
variables: map[string]interface{}{
"service_name": "route",
"operation_name": []interface{}{
"GET /route",
"POST /route",
},
},
expectErr: false,
expectedValue: []interface{}{"route", []interface{}{"GET /route", "POST /route"}},
},
{
desc: "valid builder query with dashboard variables [[service_name]] and [[operation_name]]",
compositeQuery: v3.CompositeQuery{
PanelType: v3.PanelTypeGraph,
QueryType: v3.QueryTypeBuilder,
BuilderQueries: map[string]*v3.BuilderQuery{
"A": {
QueryName: "A",
DataSource: v3.DataSourceMetrics,
AggregateOperator: v3.AggregateOperatorSum,
AggregateAttribute: v3.AttributeKey{Key: "attribute_metrics"},
Expression: "A",
Filters: &v3.FilterSet{
Operator: "AND",
Items: []v3.FilterItem{
{
Key: v3.AttributeKey{Key: "service_name", DataType: v3.AttributeKeyDataTypeString, Type: v3.AttributeKeyTypeTag},
Operator: v3.FilterOperatorEqual,
Value: "[[service_name]]",
},
{
Key: v3.AttributeKey{Key: "operation_name", DataType: v3.AttributeKeyDataTypeString, Type: v3.AttributeKeyTypeTag},
Operator: v3.FilterOperatorIn,
Value: "[[operation_name]]",
},
},
},
},
},
},
variables: map[string]interface{}{
"service_name": "route",
"operation_name": []interface{}{
"GET /route",
"POST /route",
},
},
expectErr: false,
expectedValue: []interface{}{"route", []interface{}{"GET /route", "POST /route"}},
},
{
desc: "valid builder query with dashboard variables $service_name and $operation_name",
compositeQuery: v3.CompositeQuery{
PanelType: v3.PanelTypeGraph,
QueryType: v3.QueryTypeBuilder,
BuilderQueries: map[string]*v3.BuilderQuery{
"A": {
QueryName: "A",
DataSource: v3.DataSourceMetrics,
AggregateOperator: v3.AggregateOperatorSum,
AggregateAttribute: v3.AttributeKey{Key: "attribute_metrics"},
Expression: "A",
Filters: &v3.FilterSet{
Operator: "AND",
Items: []v3.FilterItem{
{
Key: v3.AttributeKey{Key: "service_name", DataType: v3.AttributeKeyDataTypeString, Type: v3.AttributeKeyTypeTag},
Operator: v3.FilterOperatorEqual,
Value: "$service_name",
},
{
Key: v3.AttributeKey{Key: "operation_name", DataType: v3.AttributeKeyDataTypeString, Type: v3.AttributeKeyTypeTag},
Operator: v3.FilterOperatorIn,
Value: "$operation_name",
},
},
},
},
},
},
variables: map[string]interface{}{
"service_name": "route",
"operation_name": []interface{}{
"GET /route",
"POST /route",
},
},
expectErr: false,
expectedValue: []interface{}{"route", []interface{}{"GET /route", "POST /route"}},
},
{
desc: "multiple values for single select operator",
compositeQuery: v3.CompositeQuery{
PanelType: v3.PanelTypeGraph,
QueryType: v3.QueryTypeBuilder,
BuilderQueries: map[string]*v3.BuilderQuery{
"A": {
QueryName: "A",
DataSource: v3.DataSourceMetrics,
AggregateOperator: v3.AggregateOperatorSum,
AggregateAttribute: v3.AttributeKey{Key: "attribute_metrics"},
Expression: "A",
Filters: &v3.FilterSet{
Operator: "AND",
Items: []v3.FilterItem{
{
Key: v3.AttributeKey{Key: "operation_name", DataType: v3.AttributeKeyDataTypeString, Type: v3.AttributeKeyTypeTag},
Operator: v3.FilterOperatorEqual,
Value: "{{.operation_name}}",
},
},
},
},
},
},
variables: map[string]interface{}{
"service_name": "route",
"operation_name": []interface{}{
"GET /route",
"POST /route",
},
},
expectErr: true,
errMsg: "multiple values [GET /route POST /route] are not allowed for operator `=` for key `operation_name`",
},
}
for _, tc := range reqCases {
@@ -758,6 +914,72 @@ func TestParseQueryRangeParamsPromQLVars(t *testing.T) {
expectErr: false,
expectedQuery: "http_calls_total{service_name=\"route\", status_code=~\"200|505\"}",
},
{
desc: "valid prom query with dashboard variables {{service_name}} and {{status_code}}",
compositeQuery: v3.CompositeQuery{
PanelType: v3.PanelTypeGraph,
QueryType: v3.QueryTypePromQL,
PromQueries: map[string]*v3.PromQuery{
"A": {
Query: "http_calls_total{service_name=\"{{service_name}}\", status_code=~\"{{status_code}}\"}",
Disabled: false,
},
},
},
variables: map[string]interface{}{
"service_name": "route",
"status_code": []interface{}{
200,
505,
},
},
expectErr: false,
expectedQuery: "http_calls_total{service_name=\"route\", status_code=~\"200|505\"}",
},
{
desc: "valid prom query with dashboard variables [[service_name]] and [[status_code]]",
compositeQuery: v3.CompositeQuery{
PanelType: v3.PanelTypeGraph,
QueryType: v3.QueryTypePromQL,
PromQueries: map[string]*v3.PromQuery{
"A": {
Query: "http_calls_total{service_name=\"[[service_name]]\", status_code=~\"[[status_code]]\"}",
Disabled: false,
},
},
},
variables: map[string]interface{}{
"service_name": "route",
"status_code": []interface{}{
200,
505,
},
},
expectErr: false,
expectedQuery: "http_calls_total{service_name=\"route\", status_code=~\"200|505\"}",
},
{
desc: "valid prom query with dashboard variables $service_name and $status_code",
compositeQuery: v3.CompositeQuery{
PanelType: v3.PanelTypeGraph,
QueryType: v3.QueryTypePromQL,
PromQueries: map[string]*v3.PromQuery{
"A": {
Query: "http_calls_total{service_name=\"$service_name\", status_code=~\"$status_code\"}",
Disabled: false,
},
},
},
variables: map[string]interface{}{
"service_name": "route",
"status_code": []interface{}{
200,
505,
},
},
expectErr: false,
expectedQuery: "http_calls_total{service_name=\"route\", status_code=~\"200|505\"}",
},
{
desc: "valid prom query with dashboard variables",
compositeQuery: v3.CompositeQuery{
@@ -1174,3 +1396,105 @@ func TestQueryRangeFormula(t *testing.T) {
})
}
}
func TestParseQueryRangeParamsStepIntervalAdjustment(t *testing.T) {
reqCases := []struct {
desc string
start int64
end int64
step int64
}{
{
desc: "30 minutes and 60 seconds step",
start: time.Now().Add(-30 * time.Minute).UnixMilli(),
end: time.Now().UnixMilli(),
step: 60, // no update
},
{
desc: "1 hour and 1 second step",
start: time.Now().Add(-time.Hour).UnixMilli(),
end: time.Now().UnixMilli(),
step: 1, // gets updated
},
{
desc: "1 week and 1 minute step",
start: time.Now().Add(-7 * 24 * time.Hour).UnixMilli(),
end: time.Now().UnixMilli(),
step: 60, // gets updated
},
{
desc: "1 day and 1 hour step",
start: time.Now().Add(-24 * time.Hour).UnixMilli(),
end: time.Now().UnixMilli(),
step: 3600, // no update
},
{
desc: "1 day and 1 minute step",
start: time.Now().Add(-24 * time.Hour).UnixMilli(),
end: time.Now().UnixMilli(),
step: 60, // gets updated
},
{
desc: "1 day and 2 minutes step",
start: time.Now().Add(-24 * time.Hour).UnixMilli(),
end: time.Now().UnixMilli(),
step: 120, // gets updated
},
{
desc: "1 day and 5 minutes step",
start: time.Now().Add(-24 * time.Hour).UnixMilli(),
end: time.Now().UnixMilli(),
step: 300, // no update
},
{
desc: "1 week and 10 minutes step",
start: time.Now().Add(-7 * 24 * time.Hour).UnixMilli(),
end: time.Now().UnixMilli(),
step: 600, // gets updated
},
{
desc: "1 week and 45 minutes step",
start: time.Now().Add(-7 * 24 * time.Hour).UnixMilli(),
end: time.Now().UnixMilli(),
step: 2700, // no update
},
}
for _, tc := range reqCases {
t.Run(tc.desc, func(t *testing.T) {
queryRangeParams := &v3.QueryRangeParamsV3{
Start: tc.start,
End: tc.end,
Step: tc.step,
CompositeQuery: &v3.CompositeQuery{
PanelType: v3.PanelTypeGraph,
QueryType: v3.QueryTypeBuilder,
BuilderQueries: map[string]*v3.BuilderQuery{
"A": {
QueryName: "A",
DataSource: v3.DataSourceMetrics,
AggregateOperator: v3.AggregateOperatorSum,
AggregateAttribute: v3.AttributeKey{Key: "signoz_calls_total"},
GroupBy: []v3.AttributeKey{{Key: "service_name"}, {Key: "operation_name"}},
Expression: "A",
StepInterval: tc.step,
},
},
},
Variables: map[string]interface{}{},
}
body := &bytes.Buffer{}
err := json.NewEncoder(body).Encode(queryRangeParams)
require.NoError(t, err)
req := httptest.NewRequest(http.MethodPost, "/api/v3/query_range", body)
p, apiErr := ParseQueryRangeParams(req)
if apiErr != nil && apiErr.Err != nil {
t.Fatalf("unexpected error %s", apiErr.Err)
}
require.True(t, p.CompositeQuery.BuilderQueries["A"].StepInterval >= common.MinAllowedStepInterval(p.Start, p.End))
})
}
}


@@ -14,6 +14,7 @@ import (
metricsV3 "go.signoz.io/signoz/pkg/query-service/app/metrics/v3"
"go.signoz.io/signoz/pkg/query-service/app/queryBuilder"
tracesV3 "go.signoz.io/signoz/pkg/query-service/app/traces/v3"
chErrors "go.signoz.io/signoz/pkg/query-service/errors"
"go.signoz.io/signoz/pkg/query-service/cache"
"go.signoz.io/signoz/pkg/query-service/interfaces"
@@ -283,7 +284,7 @@ func mergeSerieses(cachedSeries, missedSeries []*v3.Series) []*v3.Series {
return mergedSeries
}
func (q *querier) runBuilderQueries(ctx context.Context, params *v3.QueryRangeParamsV3, keys map[string]v3.AttributeKey) ([]*v3.Result, error, map[string]string) {
func (q *querier) runBuilderQueries(ctx context.Context, params *v3.QueryRangeParamsV3, keys map[string]v3.AttributeKey) ([]*v3.Result, error, map[string]error) {
cacheKeys := q.keyGenerator.GenerateKeys(params)
@@ -306,13 +307,13 @@ func (q *querier) runBuilderQueries(ctx context.Context, params *v3.QueryRangePa
close(ch)
results := make([]*v3.Result, 0)
errQueriesByName := make(map[string]string)
errQueriesByName := make(map[string]error)
var errs []error
for result := range ch {
if result.Err != nil {
errs = append(errs, result.Err)
errQueriesByName[result.Name] = result.Err.Error()
errQueriesByName[result.Name] = result.Err
continue
}
results = append(results, &v3.Result{
@@ -329,7 +330,7 @@ func (q *querier) runBuilderQueries(ctx context.Context, params *v3.QueryRangePa
return results, err, errQueriesByName
}
func (q *querier) runPromQueries(ctx context.Context, params *v3.QueryRangeParamsV3) ([]*v3.Result, error, map[string]string) {
func (q *querier) runPromQueries(ctx context.Context, params *v3.QueryRangeParamsV3) ([]*v3.Result, error, map[string]error) {
channelResults := make(chan channelResult, len(params.CompositeQuery.PromQueries))
var wg sync.WaitGroup
cacheKeys := q.keyGenerator.GenerateKeys(params)
@@ -390,13 +391,13 @@ func (q *querier) runPromQueries(ctx context.Context, params *v3.QueryRangeParam
close(channelResults)
results := make([]*v3.Result, 0)
errQueriesByName := make(map[string]string)
errQueriesByName := make(map[string]error)
var errs []error
for result := range channelResults {
if result.Err != nil {
errs = append(errs, result.Err)
errQueriesByName[result.Name] = result.Err.Error()
errQueriesByName[result.Name] = result.Err
continue
}
results = append(results, &v3.Result{
@@ -413,7 +414,7 @@ func (q *querier) runPromQueries(ctx context.Context, params *v3.QueryRangeParam
return results, err, errQueriesByName
}
func (q *querier) runClickHouseQueries(ctx context.Context, params *v3.QueryRangeParamsV3) ([]*v3.Result, error, map[string]string) {
func (q *querier) runClickHouseQueries(ctx context.Context, params *v3.QueryRangeParamsV3) ([]*v3.Result, error, map[string]error) {
channelResults := make(chan channelResult, len(params.CompositeQuery.ClickHouseQueries))
var wg sync.WaitGroup
for queryName, clickHouseQuery := range params.CompositeQuery.ClickHouseQueries {
@@ -431,13 +432,13 @@ func (q *querier) runClickHouseQueries(ctx context.Context, params *v3.QueryRang
close(channelResults)
results := make([]*v3.Result, 0)
errQueriesByName := make(map[string]string)
errQueriesByName := make(map[string]error)
var errs []error
for result := range channelResults {
if result.Err != nil {
errs = append(errs, result.Err)
errQueriesByName[result.Name] = result.Err.Error()
errQueriesByName[result.Name] = result.Err
continue
}
results = append(results, &v3.Result{
@@ -453,7 +454,7 @@ func (q *querier) runClickHouseQueries(ctx context.Context, params *v3.QueryRang
return results, err, errQueriesByName
}
func (q *querier) runBuilderListQueries(ctx context.Context, params *v3.QueryRangeParamsV3, keys map[string]v3.AttributeKey) ([]*v3.Result, error, map[string]string) {
func (q *querier) runBuilderListQueries(ctx context.Context, params *v3.QueryRangeParamsV3, keys map[string]v3.AttributeKey) ([]*v3.Result, error, map[string]error) {
queries, err := q.builder.PrepareQueries(params, keys)
@@ -482,13 +483,13 @@ func (q *querier) runBuilderListQueries(ctx context.Context, params *v3.QueryRan
close(ch)
var errs []error
errQuriesByName := make(map[string]string)
errQuriesByName := make(map[string]error)
res := make([]*v3.Result, 0)
// read values from the channel
for r := range ch {
if r.Err != nil {
errs = append(errs, r.Err)
errQuriesByName[r.Name] = r.Query
errQuriesByName[r.Name] = r.Err
continue
}
res = append(res, &v3.Result{
@@ -502,10 +503,10 @@ func (q *querier) runBuilderListQueries(ctx context.Context, params *v3.QueryRan
return res, nil, nil
}
func (q *querier) QueryRange(ctx context.Context, params *v3.QueryRangeParamsV3, keys map[string]v3.AttributeKey) ([]*v3.Result, error, map[string]string) {
func (q *querier) QueryRange(ctx context.Context, params *v3.QueryRangeParamsV3, keys map[string]v3.AttributeKey) ([]*v3.Result, error, map[string]error) {
var results []*v3.Result
var err error
var errQueriesByName map[string]string
var errQueriesByName map[string]error
if params.CompositeQuery != nil {
switch params.CompositeQuery.QueryType {
case v3.QueryTypeBuilder:
@@ -514,6 +515,13 @@ func (q *querier) QueryRange(ctx context.Context, params *v3.QueryRangeParamsV3,
} else {
results, err, errQueriesByName = q.runBuilderQueries(ctx, params, keys)
}
// in builder query, the only errors we expose are the ones that exceed the resource limits
// everything else is internal error as they are not actionable by the user
for name, err := range errQueriesByName {
if !chErrors.IsResourceLimitError(err) {
delete(errQueriesByName, name)
}
}
case v3.QueryTypePromQL:
results, err, errQueriesByName = q.runPromQueries(ctx, params)
case v3.QueryTypeClickHouseSQL:
@@ -525,7 +533,7 @@ func (q *querier) QueryRange(ctx context.Context, params *v3.QueryRangeParamsV3,
// return error if the number of series is more than one for value type panel
if params.CompositeQuery.PanelType == v3.PanelTypeValue {
if len(results) > 1 {
if len(results) > 1 && params.CompositeQuery.EnabledQueries() > 1 {
err = fmt.Errorf("there can be only one active query for value type panel")
} else if len(results) == 1 && len(results[0].Series) > 1 {
err = fmt.Errorf("there can be only one result series for value type panel but got %d", len(results[0].Series))


@@ -14,6 +14,7 @@ import (
metricsV4 "go.signoz.io/signoz/pkg/query-service/app/metrics/v4"
"go.signoz.io/signoz/pkg/query-service/app/queryBuilder"
tracesV3 "go.signoz.io/signoz/pkg/query-service/app/traces/v3"
chErrors "go.signoz.io/signoz/pkg/query-service/errors"
"go.signoz.io/signoz/pkg/query-service/cache"
"go.signoz.io/signoz/pkg/query-service/interfaces"
@@ -281,7 +282,7 @@ func mergeSerieses(cachedSeries, missedSeries []*v3.Series) []*v3.Series {
return mergedSeries
}
func (q *querier) runBuilderQueries(ctx context.Context, params *v3.QueryRangeParamsV3, keys map[string]v3.AttributeKey) ([]*v3.Result, error, map[string]string) {
func (q *querier) runBuilderQueries(ctx context.Context, params *v3.QueryRangeParamsV3, keys map[string]v3.AttributeKey) ([]*v3.Result, error, map[string]error) {
cacheKeys := q.keyGenerator.GenerateKeys(params)
@@ -299,13 +300,13 @@ func (q *querier) runBuilderQueries(ctx context.Context, params *v3.QueryRangePa
close(ch)
results := make([]*v3.Result, 0)
errQueriesByName := make(map[string]string)
errQueriesByName := make(map[string]error)
var errs []error
for result := range ch {
if result.Err != nil {
errs = append(errs, result.Err)
errQueriesByName[result.Name] = result.Err.Error()
errQueriesByName[result.Name] = result.Err
continue
}
results = append(results, &v3.Result{
@@ -322,7 +323,7 @@ func (q *querier) runBuilderQueries(ctx context.Context, params *v3.QueryRangePa
return results, err, errQueriesByName
}
func (q *querier) runPromQueries(ctx context.Context, params *v3.QueryRangeParamsV3) ([]*v3.Result, error, map[string]string) {
func (q *querier) runPromQueries(ctx context.Context, params *v3.QueryRangeParamsV3) ([]*v3.Result, error, map[string]error) {
channelResults := make(chan channelResult, len(params.CompositeQuery.PromQueries))
var wg sync.WaitGroup
cacheKeys := q.keyGenerator.GenerateKeys(params)
@@ -383,13 +384,13 @@ func (q *querier) runPromQueries(ctx context.Context, params *v3.QueryRangeParam
close(channelResults)
results := make([]*v3.Result, 0)
errQueriesByName := make(map[string]string)
errQueriesByName := make(map[string]error)
var errs []error
for result := range channelResults {
if result.Err != nil {
errs = append(errs, result.Err)
errQueriesByName[result.Name] = result.Err.Error()
errQueriesByName[result.Name] = result.Err
continue
}
results = append(results, &v3.Result{
@@ -406,7 +407,7 @@ func (q *querier) runPromQueries(ctx context.Context, params *v3.QueryRangeParam
return results, err, errQueriesByName
}
func (q *querier) runClickHouseQueries(ctx context.Context, params *v3.QueryRangeParamsV3) ([]*v3.Result, error, map[string]string) {
func (q *querier) runClickHouseQueries(ctx context.Context, params *v3.QueryRangeParamsV3) ([]*v3.Result, error, map[string]error) {
channelResults := make(chan channelResult, len(params.CompositeQuery.ClickHouseQueries))
var wg sync.WaitGroup
for queryName, clickHouseQuery := range params.CompositeQuery.ClickHouseQueries {
@@ -424,13 +425,13 @@ func (q *querier) runClickHouseQueries(ctx context.Context, params *v3.QueryRang
close(channelResults)
results := make([]*v3.Result, 0)
errQueriesByName := make(map[string]string)
errQueriesByName := make(map[string]error)
var errs []error
for result := range channelResults {
if result.Err != nil {
errs = append(errs, result.Err)
errQueriesByName[result.Name] = result.Err.Error()
errQueriesByName[result.Name] = result.Err
continue
}
results = append(results, &v3.Result{
@@ -446,7 +447,7 @@ func (q *querier) runClickHouseQueries(ctx context.Context, params *v3.QueryRang
return results, err, errQueriesByName
}
func (q *querier) runBuilderListQueries(ctx context.Context, params *v3.QueryRangeParamsV3, keys map[string]v3.AttributeKey) ([]*v3.Result, error, map[string]string) {
func (q *querier) runBuilderListQueries(ctx context.Context, params *v3.QueryRangeParamsV3, keys map[string]v3.AttributeKey) ([]*v3.Result, error, map[string]error) {
queries, err := q.builder.PrepareQueries(params, keys)
@@ -475,13 +476,13 @@ func (q *querier) runBuilderListQueries(ctx context.Context, params *v3.QueryRan
close(ch)
var errs []error
errQuriesByName := make(map[string]string)
errQuriesByName := make(map[string]error)
res := make([]*v3.Result, 0)
// read values from the channel
for r := range ch {
if r.Err != nil {
errs = append(errs, r.Err)
errQuriesByName[r.Name] = r.Query
errQuriesByName[r.Name] = r.Err
continue
}
res = append(res, &v3.Result{
@@ -495,10 +496,10 @@ func (q *querier) runBuilderListQueries(ctx context.Context, params *v3.QueryRan
return res, nil, nil
}
func (q *querier) QueryRange(ctx context.Context, params *v3.QueryRangeParamsV3, keys map[string]v3.AttributeKey) ([]*v3.Result, error, map[string]string) {
func (q *querier) QueryRange(ctx context.Context, params *v3.QueryRangeParamsV3, keys map[string]v3.AttributeKey) ([]*v3.Result, error, map[string]error) {
var results []*v3.Result
var err error
var errQueriesByName map[string]string
var errQueriesByName map[string]error
if params.CompositeQuery != nil {
switch params.CompositeQuery.QueryType {
case v3.QueryTypeBuilder:
@@ -507,6 +508,13 @@ func (q *querier) QueryRange(ctx context.Context, params *v3.QueryRangeParamsV3,
} else {
results, err, errQueriesByName = q.runBuilderQueries(ctx, params, keys)
}
// in builder query, the only errors we expose are the ones that exceed the resource limits
// everything else is internal error as they are not actionable by the user
for name, err := range errQueriesByName {
if !chErrors.IsResourceLimitError(err) {
delete(errQueriesByName, name)
}
}
case v3.QueryTypePromQL:
results, err, errQueriesByName = q.runPromQueries(ctx, params)
case v3.QueryTypeClickHouseSQL:
@@ -518,7 +526,7 @@ func (q *querier) QueryRange(ctx context.Context, params *v3.QueryRangeParamsV3,
// return error if the number of series is more than one for value type panel
if params.CompositeQuery.PanelType == v3.PanelTypeValue {
if len(results) > 1 {
if len(results) > 1 && params.CompositeQuery.EnabledQueries() > 1 {
err = fmt.Errorf("there can be only one active query for value type panel")
} else if len(results) == 1 && len(results[0].Series) > 1 {
err = fmt.Errorf("there can be only one result series for value type panel but got %d", len(results[0].Series))


@@ -348,6 +348,7 @@ func TestDeltaQueryBuilder(t *testing.T) {
Temporality: v3.Delta,
GroupBy: []v3.AttributeKey{
{Key: "service_name"},
{Key: "le"},
},
},
},


@@ -4,6 +4,7 @@ import (
"math"
"time"
"go.signoz.io/signoz/pkg/query-service/constants"
v3 "go.signoz.io/signoz/pkg/query-service/model/v3"
)
@@ -23,3 +24,10 @@ func PastDayRoundOff() int64 {
now := time.Now().UnixMilli()
return int64(math.Floor(float64(now)/float64(time.Hour.Milliseconds()*24))) * time.Hour.Milliseconds() * 24
}
// start and end are in milliseconds
func MinAllowedStepInterval(start, end int64) int64 {
step := (end - start) / constants.MaxAllowedPointsInTimeSeries / 1000
if step < 60 {
return 60
}
// return the nearest lower multiple of 60
return step - step%60
}
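The step computation can be checked with a couple of worked values. This sketch includes the sub-60-second floor that the step-adjustment test cases above imply (a "1 hour and 1 second step" is expected to be updated, which only happens if short windows clamp to 60 s):

```go
package main

// maxAllowedPoints mirrors constants.MaxAllowedPointsInTimeSeries.
const maxAllowedPoints = 300

// minAllowedStepInterval reproduces MinAllowedStepInterval: start and end
// are in milliseconds, the returned step is in seconds. Windows too short
// to yield a 60 s step are clamped to 60 s; longer windows are rounded
// down to the nearest multiple of 60.
func minAllowedStepInterval(start, end int64) int64 {
	step := (end - start) / maxAllowedPoints / 1000
	if step < 60 {
		return 60
	}
	return step - step%60
}
```

For a 1-day window the raw step is 86,400,000 / 300 / 1000 = 288 s, which rounds down to 240 s; that is why a 5-minute step over one day passes through unchanged while a 1- or 2-minute step gets raised.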


@@ -25,6 +25,8 @@ var ConfigSignozIo = "https://config.signoz.io/api/v1"
var DEFAULT_TELEMETRY_ANONYMOUS = false
const MaxAllowedPointsInTimeSeries = 300
func IsTelemetryEnabled() bool {
if testing.Testing() {
return false


@@ -0,0 +1,42 @@
package errors
import "errors"
var (
// ErrResourceBytesLimitExceeded is returned when the resource bytes limit is exceeded
ErrResourceBytesLimitExceeded = NewResourceLimitError(errors.New("resource bytes limit exceeded, try applying filters such as service.name, etc. to reduce the data size"))
// ErrResourceTimeLimitExceeded is returned when the resource time limit is exceeded
ErrResourceTimeLimitExceeded = NewResourceLimitError(errors.New("resource time limit exceeded, try applying filters such as service.name, etc. to reduce the data size"))
)
type ResourceLimitError struct {
err error
}
func NewResourceLimitError(err error) error {
return &ResourceLimitError{err: err}
}
func (e *ResourceLimitError) Error() string {
return e.err.Error()
}
func (e *ResourceLimitError) Unwrap() error {
return e.err
}
func IsResourceLimitError(err error) bool {
if err == nil {
return false
}
var target *ResourceLimitError
return errors.As(err, &target)
}
func (e *ResourceLimitError) MarshalJSON() ([]byte, error) {
return []byte(`"` + e.Error() + `"`), nil
}
func (e *ResourceLimitError) UnmarshalJSON([]byte) error {
return nil
}

View File

@@ -107,7 +107,7 @@ type Reader interface {
}
type Querier interface {
QueryRange(context.Context, *v3.QueryRangeParamsV3, map[string]v3.AttributeKey) ([]*v3.Result, error, map[string]string)
QueryRange(context.Context, *v3.QueryRangeParamsV3, map[string]v3.AttributeKey) ([]*v3.Result, error, map[string]error)
// test helpers
QueriesExecuted() []string

View File

@@ -11,6 +11,7 @@ import (
"go.signoz.io/signoz/pkg/query-service/app"
"go.signoz.io/signoz/pkg/query-service/auth"
"go.signoz.io/signoz/pkg/query-service/constants"
"go.signoz.io/signoz/pkg/query-service/migrate"
"go.signoz.io/signoz/pkg/query-service/version"
"go.uber.org/zap"
@@ -90,6 +91,12 @@ func main() {
zap.L().Info("JWT secret key set successfully.")
}
if err := migrate.Migrate(constants.RELATIONAL_DATASOURCE_PATH); err != nil {
zap.L().Error("Failed to migrate", zap.Error(err))
} else {
zap.L().Info("Migration successful")
}
server, err := app.NewServer(serverOptions)
if err != nil {
logger.Fatal("Failed to create server", zap.Error(err))

View File

@@ -0,0 +1,153 @@
package alertstov4
import (
"context"
"encoding/json"
"github.com/jmoiron/sqlx"
v3 "go.signoz.io/signoz/pkg/query-service/model/v3"
"go.signoz.io/signoz/pkg/query-service/rules"
"go.uber.org/multierr"
"go.uber.org/zap"
)
var Version = "0.45-alerts-to-v4"
var mapTimeAggregation = map[v3.AggregateOperator]v3.TimeAggregation{
v3.AggregateOperatorSum: v3.TimeAggregationSum,
v3.AggregateOperatorMin: v3.TimeAggregationMin,
v3.AggregateOperatorMax: v3.TimeAggregationMax,
v3.AggregateOperatorSumRate: v3.TimeAggregationRate,
v3.AggregateOperatorAvgRate: v3.TimeAggregationRate,
v3.AggregateOperatorMinRate: v3.TimeAggregationRate,
v3.AggregateOperatorMaxRate: v3.TimeAggregationRate,
v3.AggregateOperatorHistQuant50: v3.TimeAggregationUnspecified,
v3.AggregateOperatorHistQuant75: v3.TimeAggregationUnspecified,
v3.AggregateOperatorHistQuant90: v3.TimeAggregationUnspecified,
v3.AggregateOperatorHistQuant95: v3.TimeAggregationUnspecified,
v3.AggregateOperatorHistQuant99: v3.TimeAggregationUnspecified,
}
var mapSpaceAggregation = map[v3.AggregateOperator]v3.SpaceAggregation{
v3.AggregateOperatorSum: v3.SpaceAggregationSum,
v3.AggregateOperatorMin: v3.SpaceAggregationMin,
v3.AggregateOperatorMax: v3.SpaceAggregationMax,
v3.AggregateOperatorSumRate: v3.SpaceAggregationSum,
v3.AggregateOperatorAvgRate: v3.SpaceAggregationAvg,
v3.AggregateOperatorMinRate: v3.SpaceAggregationMin,
v3.AggregateOperatorMaxRate: v3.SpaceAggregationMax,
v3.AggregateOperatorHistQuant50: v3.SpaceAggregationPercentile50,
v3.AggregateOperatorHistQuant75: v3.SpaceAggregationPercentile75,
v3.AggregateOperatorHistQuant90: v3.SpaceAggregationPercentile90,
v3.AggregateOperatorHistQuant95: v3.SpaceAggregationPercentile95,
v3.AggregateOperatorHistQuant99: v3.SpaceAggregationPercentile99,
}
func canMigrateOperator(operator v3.AggregateOperator) bool {
switch operator {
case v3.AggregateOperatorSum,
v3.AggregateOperatorMin,
v3.AggregateOperatorMax,
v3.AggregateOperatorSumRate,
v3.AggregateOperatorAvgRate,
v3.AggregateOperatorMinRate,
v3.AggregateOperatorMaxRate,
v3.AggregateOperatorHistQuant50,
v3.AggregateOperatorHistQuant75,
v3.AggregateOperatorHistQuant90,
v3.AggregateOperatorHistQuant95,
v3.AggregateOperatorHistQuant99:
return true
}
return false
}
func Migrate(conn *sqlx.DB) error {
ruleDB := rules.NewRuleDB(conn)
storedRules, err := ruleDB.GetStoredRules(context.Background())
if err != nil {
return err
}
for _, storedRule := range storedRules {
parsedRule, errs := rules.ParsePostableRule([]byte(storedRule.Data))
if len(errs) > 0 {
// this should not happen but if it does, we should not stop the migration
zap.L().Error("Error parsing rule", zap.Error(multierr.Combine(errs...)), zap.Int("rule", storedRule.Id))
continue
}
zap.L().Info("Rule parsed", zap.Int("rule", storedRule.Id))
updated := false
if parsedRule.RuleCondition != nil && parsedRule.Version == "" {
if parsedRule.RuleCondition.QueryType() == v3.QueryTypeBuilder {
// check if all the queries can be converted to v4
canMigrate := true
for _, query := range parsedRule.RuleCondition.CompositeQuery.BuilderQueries {
if query.DataSource == v3.DataSourceMetrics && query.Expression == query.QueryName {
if !canMigrateOperator(query.AggregateOperator) {
canMigrate = false
break
}
}
}
if canMigrate {
parsedRule.Version = "v4"
for _, query := range parsedRule.RuleCondition.CompositeQuery.BuilderQueries {
if query.DataSource == v3.DataSourceMetrics && query.Expression == query.QueryName {
// update aggregate attribute
if query.AggregateOperator == v3.AggregateOperatorSum ||
query.AggregateOperator == v3.AggregateOperatorMin ||
query.AggregateOperator == v3.AggregateOperatorMax {
query.AggregateAttribute.Type = "Gauge"
}
if query.AggregateOperator == v3.AggregateOperatorSumRate ||
query.AggregateOperator == v3.AggregateOperatorAvgRate ||
query.AggregateOperator == v3.AggregateOperatorMinRate ||
query.AggregateOperator == v3.AggregateOperatorMaxRate {
query.AggregateAttribute.Type = "Sum"
}
if query.AggregateOperator == v3.AggregateOperatorHistQuant50 ||
query.AggregateOperator == v3.AggregateOperatorHistQuant75 ||
query.AggregateOperator == v3.AggregateOperatorHistQuant90 ||
query.AggregateOperator == v3.AggregateOperatorHistQuant95 ||
query.AggregateOperator == v3.AggregateOperatorHistQuant99 {
query.AggregateAttribute.Type = "Histogram"
}
query.AggregateAttribute.DataType = v3.AttributeKeyDataTypeFloat64
query.AggregateAttribute.IsColumn = true
query.TimeAggregation = mapTimeAggregation[query.AggregateOperator]
query.SpaceAggregation = mapSpaceAggregation[query.AggregateOperator]
query.AggregateOperator = v3.AggregateOperator(query.TimeAggregation)
updated = true
}
}
}
}
}
if !updated {
zap.L().Info("Rule not updated", zap.Int("rule", storedRule.Id))
continue
}
ruleJSON, jsonErr := json.Marshal(parsedRule)
if jsonErr != nil {
zap.L().Error("Error marshalling rule; skipping rule migration", zap.Error(jsonErr), zap.Int("rule", storedRule.Id))
continue
}
stmt, prepareError := conn.PrepareContext(context.Background(), `UPDATE rules SET data=$1 WHERE id=$2;`)
if prepareError != nil {
zap.L().Error("Error in preparing statement for UPDATE to rules", zap.Error(prepareError))
continue
}
defer stmt.Close()
if _, err := stmt.Exec(ruleJSON, storedRule.Id); err != nil {
zap.L().Error("Error in Executing prepared statement for UPDATE to rules", zap.Error(err))
}
}
return nil
}

View File

@@ -0,0 +1,67 @@
package migrate
import (
"database/sql"
"github.com/jmoiron/sqlx"
alertstov4 "go.signoz.io/signoz/pkg/query-service/migrate/0_45_alerts_to_v4"
"go.uber.org/zap"
)
type DataMigration struct {
ID int `db:"id"`
Version string `db:"version"`
CreatedAt string `db:"created_at"`
Succeeded bool `db:"succeeded"`
}
func initSchema(conn *sqlx.DB) error {
tableSchema := `
CREATE TABLE IF NOT EXISTS data_migrations (
id SERIAL PRIMARY KEY,
version VARCHAR(255) NOT NULL UNIQUE,
created_at TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP,
succeeded BOOLEAN NOT NULL DEFAULT FALSE
);
`
_, err := conn.Exec(tableSchema)
if err != nil {
return err
}
return nil
}
func getMigrationVersion(conn *sqlx.DB, version string) (*DataMigration, error) {
var migration DataMigration
err := conn.Get(&migration, "SELECT * FROM data_migrations WHERE version = $1", version)
if err != nil {
if err == sql.ErrNoRows {
return nil, nil
}
return nil, err
}
return &migration, nil
}
func Migrate(dsn string) error {
conn, err := sqlx.Connect("sqlite3", dsn)
if err != nil {
return err
}
if err := initSchema(conn); err != nil {
return err
}
if m, err := getMigrationVersion(conn, "0.45_alerts_to_v4"); err == nil && m == nil {
if err := alertstov4.Migrate(conn); err != nil {
zap.L().Error("failed to migrate 0.45_alerts_to_v4", zap.Error(err))
} else {
_, err := conn.Exec("INSERT INTO data_migrations (version, succeeded) VALUES ('0.45_alerts_to_v4', true)")
if err != nil {
return err
}
}
}
return nil
}
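The `data_migrations` table acts as a run-once guard: a migration body executes only when its version has no row yet, and the version is recorded only after the body succeeds, so a failed migration is retried on the next startup. A minimal sketch of the pattern with an in-memory set standing in for the table (the real code logs failures and continues rather than returning them):

```go
package main

import "fmt"

// runOnce sketches the version guard above: skip if already recorded,
// record only on success so failures are retried on the next startup.
func runOnce(done map[string]bool, version string, migrate func() error) error {
	if done[version] {
		return nil // already applied
	}
	if err := migrate(); err != nil {
		return err // not recorded; retried next time
	}
	done[version] = true
	return nil
}

func main() {
	done := map[string]bool{}
	runs := 0
	body := func() error { runs++; return nil }
	runOnce(done, "0.45_alerts_to_v4", body)
	runOnce(done, "0.45_alerts_to_v4", body)
	fmt.Println(runs)
}
```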

View File

@@ -56,14 +56,14 @@ var BasicPlan = FeatureSet{
Name: QueryBuilderPanels,
Active: true,
Usage: 0,
UsageLimit: 20,
UsageLimit: -1,
Route: "",
},
Feature{
Name: QueryBuilderAlerts,
Active: true,
Usage: 0,
UsageLimit: 10,
UsageLimit: -1,
Route: "",
},
Feature{

View File

@@ -402,30 +402,61 @@ type CompositeQuery struct {
Unit string `json:"unit,omitempty"`
}
func (c *CompositeQuery) EnabledQueries() int {
count := 0
switch c.QueryType {
case QueryTypeBuilder:
for _, query := range c.BuilderQueries {
if !query.Disabled {
count++
}
}
case QueryTypeClickHouseSQL:
for _, query := range c.ClickHouseQueries {
if !query.Disabled {
count++
}
}
case QueryTypePromQL:
for _, query := range c.PromQueries {
if !query.Disabled {
count++
}
}
}
return count
}
func (c *CompositeQuery) Validate() error {
if c == nil {
return fmt.Errorf("composite query is required")
}
if c.BuilderQueries == nil && c.ClickHouseQueries == nil && c.PromQueries == nil {
return fmt.Errorf("composite query must contain at least one query")
return fmt.Errorf("composite query must contain at least one query type")
}
for name, query := range c.BuilderQueries {
if err := query.Validate(); err != nil {
return fmt.Errorf("builder query %s is invalid: %w", name, err)
if c.QueryType == QueryTypeBuilder {
for name, query := range c.BuilderQueries {
if err := query.Validate(c.PanelType); err != nil {
return fmt.Errorf("builder query %s is invalid: %w", name, err)
}
}
}
for name, query := range c.ClickHouseQueries {
if err := query.Validate(); err != nil {
return fmt.Errorf("clickhouse query %s is invalid: %w", name, err)
if c.QueryType == QueryTypeClickHouseSQL {
for name, query := range c.ClickHouseQueries {
if err := query.Validate(); err != nil {
return fmt.Errorf("clickhouse query %s is invalid: %w", name, err)
}
}
}
for name, query := range c.PromQueries {
if err := query.Validate(); err != nil {
return fmt.Errorf("prom query %s is invalid: %w", name, err)
if c.QueryType == QueryTypePromQL {
for name, query := range c.PromQueries {
if err := query.Validate(); err != nil {
return fmt.Errorf("prom query %s is invalid: %w", name, err)
}
}
}
@@ -638,10 +669,11 @@ type BuilderQuery struct {
ShiftBy int64
}
func (b *BuilderQuery) Validate() error {
func (b *BuilderQuery) Validate(panelType PanelType) error {
if b == nil {
return nil
}
if b.QueryName == "" {
return fmt.Errorf("query name is required")
}
@@ -686,6 +718,10 @@ func (b *BuilderQuery) Validate() error {
}
}
if b.GroupBy != nil {
if len(b.GroupBy) > 0 && panelType == PanelTypeList {
return fmt.Errorf("group by is not supported for list panel type")
}
for _, groupBy := range b.GroupBy {
if err := groupBy.Validate(); err != nil {
return fmt.Errorf("group by is invalid %w", err)

View File

@@ -49,7 +49,7 @@ type ruleDB struct {
// todo: move init methods for creating tables
func newRuleDB(db *sqlx.DB) RuleDB {
func NewRuleDB(db *sqlx.DB) RuleDB {
return &ruleDB{
db,
}

View File

@@ -108,7 +108,7 @@ func NewManager(o *ManagerOptions) (*Manager, error) {
return nil, err
}
db := newRuleDB(o.DBConn)
db := NewRuleDB(o.DBConn)
m := &Manager{
tasks: map[string]Task{},

View File

@@ -40,9 +40,8 @@ func (s Sample) MarshalJSON() ([]byte, error) {
}
type Point struct {
T int64
V float64
Vs []float64
T int64
V float64
}
func (p Point) String() string {

View File

@@ -19,6 +19,7 @@ import (
"github.com/ClickHouse/clickhouse-go/v2"
"github.com/ClickHouse/clickhouse-go/v2/lib/driver"
"go.signoz.io/signoz/pkg/query-service/common"
"go.signoz.io/signoz/pkg/query-service/converter"
"go.signoz.io/signoz/pkg/query-service/app/queryBuilder"
@@ -165,7 +166,9 @@ func (r *ThresholdRule) targetVal() float64 {
return 0
}
return *r.ruleCondition.Target
unitConverter := converter.FromUnit(converter.Unit(r.ruleCondition.TargetUnit))
value := unitConverter.Convert(converter.Value{F: *r.ruleCondition.Target, U: converter.Unit(r.ruleCondition.TargetUnit)}, converter.Unit(r.Unit()))
return value.F
}
func (r *ThresholdRule) matchType() MatchType {
@@ -413,40 +416,7 @@ func (r *ThresholdRule) Unit() string {
return ""
}
func (r *ThresholdRule) CheckCondition(v float64) bool {
if math.IsNaN(v) {
zap.L().Debug("found NaN in rule condition", zap.String("rule", r.Name()))
return false
}
if r.ruleCondition.Target == nil {
zap.L().Debug("found null target in rule condition", zap.String("rule", r.Name()))
return false
}
unitConverter := converter.FromUnit(converter.Unit(r.ruleCondition.TargetUnit))
value := unitConverter.Convert(converter.Value{F: *r.ruleCondition.Target, U: converter.Unit(r.ruleCondition.TargetUnit)}, converter.Unit(r.Unit()))
zap.L().Info("Checking condition for rule", zap.String("rule", r.Name()), zap.String("converter", unitConverter.Name()), zap.Float64("value", v), zap.Float64("target", value.F), zap.String("compareOp", string(r.ruleCondition.CompareOp)))
switch r.ruleCondition.CompareOp {
case ValueIsEq:
return v == value.F
case ValueIsNotEq:
return v != value.F
case ValueIsBelow:
return v < value.F
case ValueIsAbove:
return v > value.F
default:
return false
}
}
func (r *ThresholdRule) prepareQueryRange(ts time.Time) *v3.QueryRangeParamsV3 {
// todo(amol): add 30 seconds to evalWindow for rate calc
// todo(srikanthccv): make this configurable
// 2 minutes is reasonable time to wait for data to be available
// 60 seconds (SDK) + 10 seconds (batch) + rest for n/w + serialization + write to disk etc.
@@ -469,7 +439,7 @@ func (r *ThresholdRule) prepareQueryRange(ts time.Time) *v3.QueryRangeParamsV3 {
if r.ruleCondition.CompositeQuery != nil && r.ruleCondition.CompositeQuery.BuilderQueries != nil {
for _, q := range r.ruleCondition.CompositeQuery.BuilderQueries {
q.StepInterval = 60
q.StepInterval = int64(math.Max(float64(common.MinAllowedStepInterval(start, end)), 60))
}
}
@@ -477,23 +447,169 @@ func (r *ThresholdRule) prepareQueryRange(ts time.Time) *v3.QueryRangeParamsV3 {
return &v3.QueryRangeParamsV3{
Start: start,
End: end,
Step: 60,
Step: int64(math.Max(float64(common.MinAllowedStepInterval(start, end)), 60)),
CompositeQuery: r.ruleCondition.CompositeQuery,
}
}
func (r *ThresholdRule) shouldSkipFirstRecord() bool {
shouldSkip := false
for _, q := range r.ruleCondition.CompositeQuery.BuilderQueries {
if q.DataSource == v3.DataSourceMetrics && q.AggregateOperator.IsRateOperator() {
shouldSkip = true
// removeGroupinSetPoints drops samples without a positive timestamp
// (i.e. grouping-set total rows)
func removeGroupinSetPoints(series []Sample) []Sample {
var result []Sample
for _, s := range series {
if s.Point.T > 0 {
result = append(result, s)
}
}
return shouldSkip
return result
}
func (r *ThresholdRule) shouldAlert(series []Sample) (Sample, bool) {
var alertSmpl Sample
var shouldAlert bool
var lbls labels.Labels
var lblsOrig labels.Labels
if len(series) > 0 {
lbls = series[0].Metric
lblsOrig = series[0].MetricOrig
series = removeGroupinSetPoints(series)
}
switch r.matchType() {
case AtleastOnce:
// If any sample matches the condition, the rule is firing.
if r.compareOp() == ValueIsAbove {
for _, smpl := range series {
if smpl.V > r.targetVal() {
alertSmpl = smpl
shouldAlert = true
break
}
}
} else if r.compareOp() == ValueIsBelow {
for _, smpl := range series {
if smpl.V < r.targetVal() {
alertSmpl = smpl
shouldAlert = true
break
}
}
} else if r.compareOp() == ValueIsEq {
for _, smpl := range series {
if smpl.V == r.targetVal() {
alertSmpl = smpl
shouldAlert = true
break
}
}
} else if r.compareOp() == ValueIsNotEq {
for _, smpl := range series {
if smpl.V != r.targetVal() {
alertSmpl = smpl
shouldAlert = true
break
}
}
}
case AllTheTimes:
// If all samples match the condition, the rule is firing.
shouldAlert = true
alertSmpl = Sample{Point: Point{V: r.targetVal()}, Metric: lbls, MetricOrig: lblsOrig}
if r.compareOp() == ValueIsAbove {
for _, smpl := range series {
if smpl.V <= r.targetVal() {
shouldAlert = false
break
}
}
} else if r.compareOp() == ValueIsBelow {
for _, smpl := range series {
if smpl.V >= r.targetVal() {
shouldAlert = false
break
}
}
} else if r.compareOp() == ValueIsEq {
for _, smpl := range series {
if smpl.V != r.targetVal() {
shouldAlert = false
break
}
}
} else if r.compareOp() == ValueIsNotEq {
for _, smpl := range series {
if smpl.V == r.targetVal() {
shouldAlert = false
break
}
}
}
case OnAverage:
// If the average of all samples matches the condition, the rule is firing.
var sum float64
for _, smpl := range series {
if math.IsNaN(smpl.V) || math.IsInf(smpl.V, 0) {
continue
}
sum += smpl.V
}
avg := sum / float64(len(series))
alertSmpl = Sample{Point: Point{V: avg}, Metric: lbls, MetricOrig: lblsOrig}
if r.compareOp() == ValueIsAbove {
if avg > r.targetVal() {
shouldAlert = true
}
} else if r.compareOp() == ValueIsBelow {
if avg < r.targetVal() {
shouldAlert = true
}
} else if r.compareOp() == ValueIsEq {
if avg == r.targetVal() {
shouldAlert = true
}
} else if r.compareOp() == ValueIsNotEq {
if avg != r.targetVal() {
shouldAlert = true
}
}
case InTotal:
// If the sum of all samples matches the condition, the rule is firing.
var sum float64
for _, smpl := range series {
if math.IsNaN(smpl.V) || math.IsInf(smpl.V, 0) {
continue
}
sum += smpl.V
}
alertSmpl = Sample{Point: Point{V: sum}, Metric: lbls, MetricOrig: lblsOrig}
if r.compareOp() == ValueIsAbove {
if sum > r.targetVal() {
shouldAlert = true
}
} else if r.compareOp() == ValueIsBelow {
if sum < r.targetVal() {
shouldAlert = true
}
} else if r.compareOp() == ValueIsEq {
if sum == r.targetVal() {
shouldAlert = true
}
} else if r.compareOp() == ValueIsNotEq {
if sum != r.targetVal() {
shouldAlert = true
}
}
}
return alertSmpl, shouldAlert
}
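A minimal sketch of the `OnAverage` branch above (simplified names, not the actual rule types): NaN/Inf samples are skipped when summing, but note that the code above still divides by the full series length, so the sketch mirrors that. The values match the test cases added later in this diff.

```go
package main

import (
	"fmt"
	"math"
)

// avgShouldAlert mirrors the OnAverage branch above: NaN/Inf samples are
// skipped in the sum, the average (over the full series length, as in the
// code above) is compared against the target with the given operator.
func avgShouldAlert(values []float64, target float64, op string) bool {
	if len(values) == 0 {
		return false
	}
	var sum float64
	for _, v := range values {
		if math.IsNaN(v) || math.IsInf(v, 0) {
			continue
		}
		sum += v
	}
	avg := sum / float64(len(values))
	switch op {
	case "above":
		return avg > target
	case "below":
		return avg < target
	case "eq":
		return avg == target
	case "neq":
		return avg != target
	}
	return false
}

func main() {
	// avg of 2,3,2,4,2 is 2.6, which is below the target 3.0
	fmt.Println(avgShouldAlert([]float64{2, 3, 2, 4, 2}, 3.0, "below"))
}
```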
// queryClickhouse runs actual query against clickhouse
func (r *ThresholdRule) runChQuery(ctx context.Context, db clickhouse.Conn, query string) (Vector, error) {
func (r *ThresholdRule) runChQuery(
ctx context.Context,
db clickhouse.Conn,
query string,
timestamps []int64,
) (Vector, error) {
rows, err := db.Query(ctx, query)
if err != nil {
zap.L().Error("failed to get alert query result", zap.String("rule", r.Name()), zap.Error(err))
@@ -501,13 +617,7 @@ func (r *ThresholdRule) runChQuery(ctx context.Context, db clickhouse.Conn, quer
}
columnTypes := rows.ColumnTypes()
if err != nil {
return nil, err
}
columnNames := rows.Columns()
if err != nil {
return nil, err
}
vars := make([]interface{}, len(columnTypes))
for i := range columnTypes {
@@ -517,15 +627,7 @@ func (r *ThresholdRule) runChQuery(ctx context.Context, db clickhouse.Conn, quer
// []sample list
var result Vector
// map[fingerprint]sample
resultMap := make(map[uint64]Sample, 0)
// for rates we want to skip the first record
// but we don't know when the rates are being used
// so we always pick timeframe - 30 seconds interval
// and skip the first record for a given label combo
// NOTE: this is not applicable for raw queries
skipFirstRecord := make(map[uint64]bool, 0)
seriesMap := make(map[uint64][]Sample, 0)
defer rows.Close()
for rows.Next() {
@@ -611,7 +713,6 @@ func (r *ThresholdRule) runChQuery(ctx context.Context, db clickhouse.Conn, quer
if math.IsNaN(sample.Point.V) {
continue
}
sample.Point.Vs = append(sample.Point.Vs, sample.Point.V)
// capture labels in result
sample.Metric = lbls.Labels()
@@ -619,99 +720,46 @@ func (r *ThresholdRule) runChQuery(ctx context.Context, db clickhouse.Conn, quer
labelHash := lbls.Labels().Hash()
// here we walk through values of time series
// and calculate the final value used to compare
// with rule target
if existing, ok := resultMap[labelHash]; ok {
switch r.matchType() {
case AllTheTimes:
if r.compareOp() == ValueIsAbove {
sample.Point.V = math.Min(existing.Point.V, sample.Point.V)
resultMap[labelHash] = sample
} else if r.compareOp() == ValueIsBelow {
sample.Point.V = math.Max(existing.Point.V, sample.Point.V)
resultMap[labelHash] = sample
} else {
sample.Point.Vs = append(existing.Point.Vs, sample.Point.V)
resultMap[labelHash] = sample
}
case AtleastOnce:
if r.compareOp() == ValueIsAbove {
sample.Point.V = math.Max(existing.Point.V, sample.Point.V)
resultMap[labelHash] = sample
} else if r.compareOp() == ValueIsBelow {
sample.Point.V = math.Min(existing.Point.V, sample.Point.V)
resultMap[labelHash] = sample
} else {
sample.Point.Vs = append(existing.Point.Vs, sample.Point.V)
resultMap[labelHash] = sample
}
case OnAverage:
sample.Point.V = (existing.Point.V + sample.Point.V) / 2
resultMap[labelHash] = sample
case InTotal:
sample.Point.V = (existing.Point.V + sample.Point.V)
resultMap[labelHash] = sample
}
} else {
if r.Condition().QueryType() == v3.QueryTypeBuilder {
// for query builder, time series data
// we skip the first record to support rate cases correctly
// improvement(amol): explore approaches to limit this only for
// rate use cases
if exists := skipFirstRecord[labelHash]; exists || !r.shouldSkipFirstRecord() {
resultMap[labelHash] = sample
} else {
// looks like the first record for this label combo, skip it
skipFirstRecord[labelHash] = true
}
} else {
// for clickhouse raw queries, all records are considered
// improvement(amol): think about supporting rate queries
// written by user. may have to skip a record, similar to qb case(above)
resultMap[labelHash] = sample
}
}
seriesMap[labelHash] = append(seriesMap[labelHash], sample)
}
for hash, s := range resultMap {
if r.matchType() == AllTheTimes && r.compareOp() == ValueIsEq {
for _, v := range s.Point.Vs {
if v != r.targetVal() { // if any of the values is not equal to target, alert shouldn't be sent
s.Point.V = v
}
queryLabel := r.GetSelectedQuery()
// if selected query is formula then populate missing labels
// formula can be from F1 to F26
formulaRegex := regexp.MustCompile(`F\d+`)
if formulaRegex.MatchString(queryLabel) {
zap.L().Info("found formula query", zap.String("ruleid", r.ID()), zap.String("query", queryLabel))
for hash, s := range seriesMap {
if len(s) == 0 {
continue
}
resultMap[hash] = s
} else if r.matchType() == AllTheTimes && r.compareOp() == ValueIsNotEq {
for _, v := range s.Point.Vs {
if v == r.targetVal() { // if any of the values is equal to target, alert shouldn't be sent
s.Point.V = v
}
// add zero value for missing timestamps
missingTimestamps := make(map[int64]bool)
labels := s[0].Metric
labelsOrig := s[0].MetricOrig
for _, ts := range timestamps {
missingTimestamps[ts] = true
}
resultMap[hash] = s
} else if r.matchType() == AtleastOnce && r.compareOp() == ValueIsEq {
for _, v := range s.Point.Vs {
if v == r.targetVal() { // if any of the values is equal to target, alert should be sent
s.Point.V = v
}
for _, sample := range s {
delete(missingTimestamps, sample.Point.T*1000)
}
resultMap[hash] = s
} else if r.matchType() == AtleastOnce && r.compareOp() == ValueIsNotEq {
for _, v := range s.Point.Vs {
if v != r.targetVal() { // if any of the values is not equal to target, alert should be sent
s.Point.V = v
for ts := range missingTimestamps {
sample := Sample{
Point: Point{
T: ts,
V: 0,
},
Metric: labels,
MetricOrig: labelsOrig,
}
s = append(s, sample)
}
resultMap[hash] = s
seriesMap[hash] = s
}
}
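The zero-fill step above can be sketched in isolation: the expected step timestamps are in milliseconds while `Point.T` is in seconds, hence the `T*1000` in the lookup. A minimal standalone version (simplified names, map instead of `[]Sample`):

```go
package main

import (
	"fmt"
	"sort"
)

// fillMissing mirrors the zero-fill above: expectedMs are the step-aligned
// timestamps (ms) the query should have produced, seenSec maps the sample
// timestamps (seconds) actually returned to their values. Steps with no
// sample get a zero value so formula results stay aligned.
func fillMissing(expectedMs []int64, seenSec map[int64]float64) map[int64]float64 {
	missing := make(map[int64]bool, len(expectedMs))
	for _, ts := range expectedMs {
		missing[ts] = true
	}
	out := make(map[int64]float64, len(expectedMs))
	for tSec, v := range seenSec {
		delete(missing, tSec*1000) // seconds -> ms, as in Point.T*1000 above
		out[tSec*1000] = v
	}
	for ts := range missing {
		out[ts] = 0
	}
	return out
}

func main() {
	got := fillMissing([]int64{1000, 2000, 3000}, map[int64]float64{1: 5, 3: 7})
	keys := make([]int64, 0, len(got))
	for k := range got {
		keys = append(keys, k)
	}
	sort.Slice(keys, func(i, j int) bool { return keys[i] < keys[j] })
	for _, k := range keys {
		fmt.Println(k, got[k])
	}
}
```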
zap.L().Debug("resultmap(potential alerts)", zap.String("ruleid", r.ID()), zap.Int("count", len(resultMap)))
// if the data is missing for `For` duration then we should send alert
if r.ruleCondition.AlertOnAbsent && r.lastTimestampWithDatapoints.Add(time.Duration(r.Condition().AbsentFor)*time.Minute).Before(time.Now()) {
zap.L().Info("no data found for rule condition", zap.String("ruleid", r.ID()))
@@ -726,11 +774,10 @@ func (r *ThresholdRule) runChQuery(ctx context.Context, db clickhouse.Conn, quer
return result, nil
}
for _, sample := range resultMap {
// check alert rule condition before dumping results, if sendUnmatchedResults
// is set then add results irrespective of condition
if r.opts.SendUnmatched || r.CheckCondition(sample.Point.V) {
result = append(result, sample)
for _, series := range seriesMap {
alertSmpl, shouldAlert := r.shouldAlert(series)
if shouldAlert {
result = append(result, alertSmpl)
}
}
if len(result) != 0 {
@@ -739,7 +786,15 @@ func (r *ThresholdRule) runChQuery(ctx context.Context, db clickhouse.Conn, quer
return result, nil
}
func (r *ThresholdRule) prepareBuilderQueries(ts time.Time, ch driver.Conn) (map[string]string, error) {
// FIXME(srikanthccv): remove this hack
func (r *ThresholdRule) adjustedMetricTimeRange(start, end, step int64) (int64, int64) {
start = start - (start % (step * 1000))
adjustStep := int64(math.Min(float64(step), 60))
end = end - (end % (adjustStep * 1000))
return start, end
}
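A standalone sketch of `adjustedMetricTimeRange` above: the start is floored to a multiple of the step (in ms), while the end is floored to a multiple of `min(step, 60)` seconds, keeping the window aligned with the step grid the timestamps loop below walks.

```go
package main

import (
	"fmt"
	"math"
)

// adjusted mirrors adjustedMetricTimeRange above; start/end are ms, step is
// seconds. The adjusted start is step-aligned, the end min(step,60)s-aligned.
func adjusted(start, end, step int64) (int64, int64) {
	start = start - (start % (step * 1000))
	adjustStep := int64(math.Min(float64(step), 60))
	end = end - (end % (adjustStep * 1000))
	return start, end
}

func main() {
	s, e := adjusted(1_715_000_012_345, 1_715_000_912_345, 120)
	fmt.Println(s%(120*1000) == 0, e%(60*1000) == 0)
}
```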
func (r *ThresholdRule) prepareBuilderQueries(ts time.Time, ch driver.Conn) (map[string]string, []int64, error) {
params := r.prepareQueryRange(ts)
if params.CompositeQuery.QueryType == v3.QueryTypeBuilder {
// check if any enrichment is required for logs if yes then enrich them
@@ -751,6 +806,13 @@ func (r *ThresholdRule) prepareBuilderQueries(ts time.Time, ch driver.Conn) (map
}
start, end := r.adjustedMetricTimeRange(params.Start, params.End, params.Step)
timestamps := []int64{}
for i := start; i < end; i += params.Step * 1000 {
timestamps = append(timestamps, i)
}
var runQueries map[string]string
var err error
@@ -763,7 +825,7 @@ func (r *ThresholdRule) prepareBuilderQueries(ts time.Time, ch driver.Conn) (map
runQueries, err = r.queryBuilder.PrepareQueries(params)
}
return runQueries, err
return runQueries, timestamps, err
}
// The following function is used to prepare the where clause for the query
@@ -1020,7 +1082,7 @@ func (r *ThresholdRule) GetSelectedQuery() string {
var err error
if r.ruleCondition.QueryType() == v3.QueryTypeBuilder {
queries, err = r.prepareBuilderQueries(time.Now(), nil)
queries, _, err = r.prepareBuilderQueries(time.Now(), nil)
if err != nil {
zap.L().Error("failed to prepare metric queries", zap.String("ruleid", r.ID()), zap.Error(err))
return ""
@@ -1070,11 +1132,12 @@ func (r *ThresholdRule) buildAndRunQuery(ctx context.Context, ts time.Time, ch c
// var to hold target query to be executed
var queries map[string]string
var err error
var timestamps []int64
// fetch the target query based on query type
if r.ruleCondition.QueryType() == v3.QueryTypeBuilder {
queries, err = r.prepareBuilderQueries(ts, ch)
queries, timestamps, err = r.prepareBuilderQueries(ts, ch)
if err != nil {
zap.L().Error("failed to prepare metric queries", zap.String("ruleid", r.ID()), zap.Error(err))
@@ -1104,7 +1167,7 @@ func (r *ThresholdRule) buildAndRunQuery(ctx context.Context, ts time.Time, ch c
zap.L().Debug("Selected query label for rule", zap.String("ruleid", r.ID()), zap.String("label", queryLabel))
if queryString, ok := queries[queryLabel]; ok {
return r.runChQuery(ctx, ch, queryString)
return r.runChQuery(ctx, ch, queryString, timestamps)
}
zap.L().Error("invalid query label", zap.String("ruleid", r.ID()), zap.String("label", queryLabel), zap.Any("queries", queries))

View File

@@ -12,7 +12,7 @@ import (
"go.signoz.io/signoz/pkg/query-service/utils/labels"
)
func TestThresholdRuleCombinations(t *testing.T) {
func skipTestThresholdRuleCombinations(t *testing.T) {
postableRule := PostableRule{
AlertName: "Tricky Condition Tests",
AlertType: "METRIC_BASED_ALERT",
@@ -266,6 +266,45 @@ func TestThresholdRuleCombinations(t *testing.T) {
matchType: "1", // Once
target: 0.0,
},
{
values: [][]interface{}{
{int32(2), "endpoint"},
{int32(3), "endpoint"},
{int32(2), "endpoint"},
{int32(4), "endpoint"},
{int32(2), "endpoint"},
},
expectAlert: true,
compareOp: "2", // Below
matchType: "3", // On Average
target: 3.0,
},
{
values: [][]interface{}{
{int32(4), "endpoint"},
{int32(7), "endpoint"},
{int32(5), "endpoint"},
{int32(2), "endpoint"},
{int32(9), "endpoint"},
},
expectAlert: false,
compareOp: "2", // Below
matchType: "3", // On Average
target: 3.0,
},
{
values: [][]interface{}{
{int32(4), "endpoint"},
{int32(7), "endpoint"},
{int32(5), "endpoint"},
{int32(2), "endpoint"},
{int32(9), "endpoint"},
},
expectAlert: true,
compareOp: "2", // Below
matchType: "3", // On Average
target: 6.0,
},
}
for idx, c := range cases {
@@ -285,7 +324,7 @@ func TestThresholdRuleCombinations(t *testing.T) {
assert.NoError(t, err)
}
result, err := rule.runChQuery(context.Background(), mock, queryString)
result, err := rule.runChQuery(context.Background(), mock, queryString, []int64{})
if err != nil {
assert.NoError(t, err)
}

View File

@@ -327,6 +327,7 @@ func TestDashboardsForInstalledIntegrationDashboards(t *testing.T) {
// Installing an integration should make its dashboards appear in the dashboard list
require.False(testAvailableIntegration.IsInstalled)
tsBeforeInstallation := time.Now().Unix()
integrationsTB.RequestQSToInstallIntegration(
testAvailableIntegration.Id, map[string]interface{}{},
)
@@ -344,9 +345,13 @@ func TestDashboardsForInstalledIntegrationDashboards(t *testing.T) {
len(testIntegrationDashboards), len(dashboards),
"dashboards for installed integrations should appear in dashboards list",
)
require.GreaterOrEqual(dashboards[0].CreatedAt.Unix(), tsBeforeInstallation)
require.GreaterOrEqual(dashboards[0].UpdatedAt.Unix(), tsBeforeInstallation)
// Should be able to get installed integrations dashboard by id
dd := integrationsTB.GetDashboardByIdFromQS(dashboards[0].Uuid)
require.GreaterOrEqual(dd.CreatedAt.Unix(), tsBeforeInstallation)
require.GreaterOrEqual(dd.UpdatedAt.Unix(), tsBeforeInstallation)
require.Equal(*dd, dashboards[0])
// Integration dashboards should no longer appear in the dashboard list after uninstallation

View File

@@ -167,7 +167,7 @@ func ClickHouseFormattedValue(v interface{}) string {
case []interface{}:
if len(x) == 0 {
return ""
return "[]"
}
switch x[0].(type) {
case string:
@@ -184,7 +184,7 @@ func ClickHouseFormattedValue(v interface{}) string {
return strings.Join(strings.Fields(fmt.Sprint(x)), ",")
default:
zap.L().Error("invalid type for formatted value", zap.Any("type", reflect.TypeOf(x[0])))
return ""
return "[]"
}
default:
zap.L().Error("invalid type for formatted value", zap.Any("type", reflect.TypeOf(x)))

View File

@@ -8,17 +8,17 @@ import (
// AssignReservedVars assigns values for go template vars. assumes that
// model.QueryRangeParamsV3.Start and End are Unix Nano timestamps
func AssignReservedVarsV3(metricsQueryRangeParams *v3.QueryRangeParamsV3) {
metricsQueryRangeParams.Variables["start_timestamp"] = metricsQueryRangeParams.Start / 1000
metricsQueryRangeParams.Variables["end_timestamp"] = metricsQueryRangeParams.End / 1000
func AssignReservedVarsV3(queryRangeParams *v3.QueryRangeParamsV3) {
queryRangeParams.Variables["start_timestamp"] = queryRangeParams.Start / 1000
queryRangeParams.Variables["end_timestamp"] = queryRangeParams.End / 1000
metricsQueryRangeParams.Variables["start_timestamp_ms"] = metricsQueryRangeParams.Start
metricsQueryRangeParams.Variables["end_timestamp_ms"] = metricsQueryRangeParams.End
queryRangeParams.Variables["start_timestamp_ms"] = queryRangeParams.Start
queryRangeParams.Variables["end_timestamp_ms"] = queryRangeParams.End
metricsQueryRangeParams.Variables["start_timestamp_nano"] = metricsQueryRangeParams.Start * 1e6
metricsQueryRangeParams.Variables["end_timestamp_nano"] = metricsQueryRangeParams.End * 1e6
queryRangeParams.Variables["start_timestamp_nano"] = queryRangeParams.Start * 1e6
queryRangeParams.Variables["end_timestamp_nano"] = queryRangeParams.End * 1e6
metricsQueryRangeParams.Variables["start_datetime"] = fmt.Sprintf("toDateTime(%d)", metricsQueryRangeParams.Start/1000)
metricsQueryRangeParams.Variables["end_datetime"] = fmt.Sprintf("toDateTime(%d)", metricsQueryRangeParams.End/1000)
queryRangeParams.Variables["start_datetime"] = fmt.Sprintf("toDateTime(%d)", queryRangeParams.Start/1000)
queryRangeParams.Variables["end_datetime"] = fmt.Sprintf("toDateTime(%d)", queryRangeParams.End/1000)
}