Compare commits

...

29 Commits

Author SHA1 Message Date
Prashant Shahi
765153caa8 Merge branch 'develop' into release/v0.24.0 2023-07-20 12:21:10 +05:30
Rajat Dabade
6d7081a4bd USE_SPAN_METRICS for latency graph on Feature flag (#3172) 2023-07-20 12:02:21 +05:30
Prashant Shahi
f1818235dc chore(release): 📌 pin versions: SigNoz OtelCollector 0.79.4
Signed-off-by: Prashant Shahi <prashant@signoz.io>
2023-07-20 11:04:15 +05:30
Prashant Shahi
c321a1741f Merge branch 'main' into release/v0.24.0 2023-07-20 09:50:13 +05:30
Prashant Shahi
a235beae36 chore(release): 📌 pin versions: SigNoz 0.24.0
Signed-off-by: Prashant Shahi <prashant@signoz.io>
2023-07-20 09:47:55 +05:30
Yevhen Shevchenko
98a2ef4080 feat: add suggestion to order by filter (#3162)
* feat: add suggestion to order by filter

* fix: column name for order by

* fix: mapper for order by

* fix: render order by for different panels

* fix: order by timestamp and aggregate value

---------

Co-authored-by: Palash Gupta <palashgdev@gmail.com>
Co-authored-by: Vishal Sharma <makeavish786@gmail.com>
2023-07-19 11:17:21 +05:30
Vishal Sharma
5a2a987a9b feat: enable limit on ts (traces) (#3157)
* feat: enable limit on ts
2023-07-19 09:54:27 +05:30
Vishal Sharma
7822b4efee fix: encode email in loginPrecheck API (#3171) 2023-07-18 18:12:56 +05:30
dnazarenkoo
8f1451e154 feat: add the ability to drag columns (#3100)
* feat: add the ability to drag columns

* feat: add the ability to drag columns in the logs explorer

* feat: update drag logic

* fix: resolve comments

* feat: resolve comment regarding error handling

---------

Co-authored-by: Vishal Sharma <makeavish786@gmail.com>
Co-authored-by: Palash Gupta <palashgdev@gmail.com>
2023-07-18 17:18:34 +05:30
dnazarenkoo
07833b9859 feat: add the table view for the traces explorer (#2964)
* feat: add dynamic table based on query

* feat: add the list view for the traces explorer

* feat: add the list view for the traces explorer

* feat: add the list view for the traces explorer

* feat: add the table view for the traces explorer

* feat: remove unnecessary part of code for the table view in the traces explorer

* fix: resolve comments

---------

Co-authored-by: Yevhen Shevchenko <y.shevchenko@seedium.io>
Co-authored-by: Nazarenko19 <danil.nazarenko2000@gmail.com>
Co-authored-by: Vishal Sharma <makeavish786@gmail.com>
Co-authored-by: Palash Gupta <palashgdev@gmail.com>
2023-07-18 15:47:32 +05:30
GitStart
0de40a889d feat: legend should be hidden for the graph with no data (#3168)
Co-authored-by: gitstart <gitstart@users.noreply.github.com>
Co-authored-by: Nitesh Singh <nitesh.singh@gitstart.dev>
Co-authored-by: RubensRafael <rubensrafael2@live.com>
Co-authored-by: Rubens Rafael <70234898+RubensRafael@users.noreply.github.com>
2023-07-18 15:37:27 +05:30
dnazarenkoo
b3a6deb71b fix: resets the state of adding a new panel (#3122)
Co-authored-by: Vishal Sharma <makeavish786@gmail.com>
Co-authored-by: Palash Gupta <palashgdev@gmail.com>
2023-07-18 12:42:05 +05:30
Vishal Sharma
b5af238374 fix: use GLOBAL inner join instead of regular join (#3164) 2023-07-18 12:17:00 +05:30
Vishal Sharma
49afc2549f fix: ordering of ts and table panel (#3163)
* fix: ordering of ts and table panel

* chore: refactor
2023-07-18 11:01:51 +05:30
Prashant Shahi
0ed6594e48 Merge pull request #3165 from SigNoz/release/v0.23.1
Release/v0.23.1
2023-07-18 09:23:06 +05:30
Rajat Dabade
50142321f7 Shifting of graph from Dashboard to Service layer (#3107)
* refactor: initial setup for full view done

* feat: done with shifting of chart to services

* refactor: removed the dependency dashboard action

* refactor: make ondelete and onclone optional in widgetheader

* refactor: optimised the allowdelete, allowclone and allowEdit

* fix: build pipeline error

* refactor: moved contant to contant.js

* refactor: create a utils and types and separated it from component

* refactor: merge the latest overview from develop

* refactor: review comments changes

* refactor: magic string to constants

* refactor: handle the isloading for topLevelOperations

* refactor: apply loading check for other APIs

* refactor: separated the component

* refactor: removed the graphwithoutdashboard component

* fix: the type of variable from dashboard

* refactor: created utils and updated types

* refactor: changed the name of variable and fixed typos

* fix: the menu option dropdown for services widget header

* chore: ts config is updated for the isTwidgetoptions

* chore: removed the unwanted file

* fix: css on hover of widget header and default value

* refactor: renamed the file to index in fullView

* refactor: disable the edit delete clone option

* fix: typos

* chore: types are updated in metrics application

* chore: type is updated

* fix: build is fixed

* refactor: changes the yaxisunit to ns of serviceoverview

---------

Co-authored-by: Palash Gupta <palashgdev@gmail.com>
Co-authored-by: Vishal Sharma <makeavish786@gmail.com>
2023-07-18 08:55:01 +05:30
Prashant Shahi
29df1188c7 chore(release): 📌 pin versions: SigNoz 0.23.1
Signed-off-by: Prashant Shahi <prashant@signoz.io>
2023-07-18 01:08:10 +05:30
Palash Gupta
f3817d7335 chore: limit is enabled (#3159)
* chore: limit is enabled

* chore: tooltip is updated

---------

Co-authored-by: Vishal Sharma <makeavish786@gmail.com>
2023-07-17 22:31:30 +05:30
Srikanth Chekuri
fea8a71f51 fix: order by selection decides the result series (#3138) 2023-07-17 21:26:39 +05:30
Srikanth Chekuri
5e89211f53 feat: table view support for cumulative & delta metrics (#3110) 2023-07-17 21:08:54 +05:30
Palash Gupta
0beffb50ca fix: table view on click is now taking raw logs (#3153) 2023-07-17 18:25:55 +05:30
Yevhen Shevchenko
6efa1011aa fix: table column names with attribute and legend (#3142)
Co-authored-by: Vishal Sharma <makeavish786@gmail.com>
2023-07-17 18:16:41 +05:30
Palash Gupta
c68b611ad9 fix: not found is not visible when loading is visible (#3155) 2023-07-17 15:27:37 +05:30
Srikanth Chekuri
69828548b1 fix: skip grouping set points for value type reducer (#3147) 2023-07-17 14:17:54 +05:30
Palash Gupta
22d0aa951c feat: added goto top feature in list logs view (#3146) 2023-07-17 13:42:05 +05:30
Nityananda Gohain
7f9ba6c43a feat: add support for multiquery in ts with limit (#2970)
* feat: add support for multiquery in ts with limit

* feat: multiple groupby support

* feat: variables renamed

* feat: cleanup

* feat: clickhouse formatted value updated to support pointers

* fix: filter creation logic updated

* fix: minor fixes and tests

* fix: autcomplete top level keys

* Revert "fix: autcomplete top level keys"

This reverts commit 8d5e1e480f.

* fix: minor fixes

* feat: formula support for timeseries query with limit

* feat: implementation updated for limit queries

* feat: cleanup

* feat: order by logic updated

* feat: order by logic updated for both ts and table view

---------

Co-authored-by: Srikanth Chekuri <srikanth.chekuri92@gmail.com>
Co-authored-by: Vishal Sharma <makeavish786@gmail.com>
2023-07-16 23:07:45 +05:30
Palash Gupta
7a177e18e4 chore: table view for logs is updated (#3135) 2023-07-14 12:31:25 +05:30
Srikanth Chekuri
433f930956 feat: add flag to switch to span metrics for service level latency metrics (#3134) 2023-07-14 11:31:44 +05:30
Ankit Nayan
206e8b8dc3 Merge pull request #3130 from SigNoz/release/v0.23.0
Release/v0.23.0
2023-07-13 23:06:22 +05:30
103 changed files with 4080 additions and 1606 deletions

View File

@@ -137,7 +137,7 @@ services:
condition: on-failure
query-service:
image: signoz/query-service:0.23.0
image: signoz/query-service:0.24.0
command: ["-config=/root/config/prometheus.yml"]
# ports:
# - "6060:6060" # pprof port
@@ -166,7 +166,7 @@ services:
<<: *clickhouse-depend
frontend:
image: signoz/frontend:0.23.0
image: signoz/frontend:0.24.0
deploy:
restart_policy:
condition: on-failure
@@ -179,7 +179,7 @@ services:
- ../common/nginx-config.conf:/etc/nginx/conf.d/default.conf
otel-collector:
image: signoz/signoz-otel-collector:0.79.3
image: signoz/signoz-otel-collector:0.79.4
command: ["--config=/etc/otel-collector-config.yaml", "--feature-gates=-pkg.translator.prometheus.NormalizeName"]
user: root # required for reading docker container logs
volumes:
@@ -208,7 +208,7 @@ services:
<<: *clickhouse-depend
otel-collector-metrics:
image: signoz/signoz-otel-collector:0.79.3
image: signoz/signoz-otel-collector:0.79.4
command: ["--config=/etc/otel-collector-metrics-config.yaml", "--feature-gates=-pkg.translator.prometheus.NormalizeName"]
volumes:
- ./otel-collector-metrics-config.yaml:/etc/otel-collector-metrics-config.yaml

View File

@@ -41,7 +41,7 @@ services:
# Notes for Maintainers/Contributors who will change Line Numbers of Frontend & Query-Section. Please Update Line Numbers in `./scripts/commentLinesForSetup.sh` & `./CONTRIBUTING.md`
otel-collector:
container_name: otel-collector
image: signoz/signoz-otel-collector:0.79.3
image: signoz/signoz-otel-collector:0.79.4
command: ["--config=/etc/otel-collector-config.yaml", "--feature-gates=-pkg.translator.prometheus.NormalizeName"]
# user: root # required for reading docker container logs
volumes:
@@ -67,7 +67,7 @@ services:
otel-collector-metrics:
container_name: otel-collector-metrics
image: signoz/signoz-otel-collector:0.79.3
image: signoz/signoz-otel-collector:0.79.4
command: ["--config=/etc/otel-collector-metrics-config.yaml", "--feature-gates=-pkg.translator.prometheus.NormalizeName"]
volumes:
- ./otel-collector-metrics-config.yaml:/etc/otel-collector-metrics-config.yaml

View File

@@ -153,7 +153,7 @@ services:
# Notes for Maintainers/Contributors who will change Line Numbers of Frontend & Query-Section. Please Update Line Numbers in `./scripts/commentLinesForSetup.sh` & `./CONTRIBUTING.md`
query-service:
image: signoz/query-service:${DOCKER_TAG:-0.23.0}
image: signoz/query-service:${DOCKER_TAG:-0.24.0}
container_name: query-service
command: ["-config=/root/config/prometheus.yml"]
# ports:
@@ -181,7 +181,7 @@ services:
<<: *clickhouse-depend
frontend:
image: signoz/frontend:${DOCKER_TAG:-0.23.0}
image: signoz/frontend:${DOCKER_TAG:-0.24.0}
container_name: frontend
restart: on-failure
depends_on:
@@ -193,7 +193,7 @@ services:
- ../common/nginx-config.conf:/etc/nginx/conf.d/default.conf
otel-collector:
image: signoz/signoz-otel-collector:${OTELCOL_TAG:-0.79.3}
image: signoz/signoz-otel-collector:${OTELCOL_TAG:-0.79.4}
command: ["--config=/etc/otel-collector-config.yaml", "--feature-gates=-pkg.translator.prometheus.NormalizeName"]
user: root # required for reading docker container logs
volumes:
@@ -219,7 +219,7 @@ services:
<<: *clickhouse-depend
otel-collector-metrics:
image: signoz/signoz-otel-collector:${OTELCOL_TAG:-0.79.3}
image: signoz/signoz-otel-collector:${OTELCOL_TAG:-0.79.4}
command: ["--config=/etc/otel-collector-metrics-config.yaml", "--feature-gates=-pkg.translator.prometheus.NormalizeName"]
volumes:
- ./otel-collector-metrics-config.yaml:/etc/otel-collector-metrics-config.yaml

View File

@@ -15,13 +15,14 @@ import (
)
type APIHandlerOptions struct {
DataConnector interfaces.DataConnector
SkipConfig *basemodel.SkipConfig
PreferDelta bool
AppDao dao.ModelDao
RulesManager *rules.Manager
FeatureFlags baseint.FeatureLookup
LicenseManager *license.Manager
DataConnector interfaces.DataConnector
SkipConfig *basemodel.SkipConfig
PreferDelta bool
PreferSpanMetrics bool
AppDao dao.ModelDao
RulesManager *rules.Manager
FeatureFlags baseint.FeatureLookup
LicenseManager *license.Manager
}
type APIHandler struct {
@@ -33,12 +34,13 @@ type APIHandler struct {
func NewAPIHandler(opts APIHandlerOptions) (*APIHandler, error) {
baseHandler, err := baseapp.NewAPIHandler(baseapp.APIHandlerOpts{
Reader: opts.DataConnector,
SkipConfig: opts.SkipConfig,
PerferDelta: opts.PreferDelta,
AppDao: opts.AppDao,
RuleManager: opts.RulesManager,
FeatureFlags: opts.FeatureFlags})
Reader: opts.DataConnector,
SkipConfig: opts.SkipConfig,
PerferDelta: opts.PreferDelta,
PreferSpanMetrics: opts.PreferSpanMetrics,
AppDao: opts.AppDao,
RuleManager: opts.RulesManager,
FeatureFlags: opts.FeatureFlags})
if err != nil {
return nil, err

View File

@@ -2,6 +2,8 @@ package api
import (
"net/http"
basemodel "go.signoz.io/signoz/pkg/query-service/model"
)
func (ah *APIHandler) getFeatureFlags(w http.ResponseWriter, r *http.Request) {
@@ -10,5 +12,13 @@ func (ah *APIHandler) getFeatureFlags(w http.ResponseWriter, r *http.Request) {
ah.HandleError(w, err, http.StatusInternalServerError)
return
}
if ah.opts.PreferSpanMetrics {
for idx := range featureSet {
feature := &featureSet[idx]
if feature.Name == basemodel.UseSpanMetrics {
featureSet[idx].Active = true
}
}
}
ah.Respond(w, featureSet)
}

View File

@@ -54,9 +54,10 @@ type ServerOptions struct {
HTTPHostPort string
PrivateHostPort string
// alert specific params
DisableRules bool
RuleRepoURL string
PreferDelta bool
DisableRules bool
RuleRepoURL string
PreferDelta bool
PreferSpanMetrics bool
}
// Server runs HTTP api service
@@ -169,13 +170,14 @@ func NewServer(serverOptions *ServerOptions) (*Server, error) {
telemetry.GetInstance().SetReader(reader)
apiOpts := api.APIHandlerOptions{
DataConnector: reader,
SkipConfig: skipConfig,
PreferDelta: serverOptions.PreferDelta,
AppDao: modelDao,
RulesManager: rm,
FeatureFlags: lm,
LicenseManager: lm,
DataConnector: reader,
SkipConfig: skipConfig,
PreferDelta: serverOptions.PreferDelta,
PreferSpanMetrics: serverOptions.PreferSpanMetrics,
AppDao: modelDao,
RulesManager: rm,
FeatureFlags: lm,
LicenseManager: lm,
}
apiHandler, err := api.NewAPIHandler(apiOpts)

View File

@@ -84,11 +84,13 @@ func main() {
var enableQueryServiceLogOTLPExport bool
var preferDelta bool
var preferSpanMetrics bool
flag.StringVar(&promConfigPath, "config", "./config/prometheus.yml", "(prometheus config to read metrics)")
flag.StringVar(&skipTopLvlOpsPath, "skip-top-level-ops", "", "(config file to skip top level operations)")
flag.BoolVar(&disableRules, "rules.disable", false, "(disable rule evaluation)")
flag.BoolVar(&preferDelta, "prefer-delta", false, "(prefer delta over raw metrics)")
flag.BoolVar(&preferDelta, "prefer-delta", false, "(prefer delta over cumulative metrics)")
flag.BoolVar(&preferSpanMetrics, "prefer-span-metrics", false, "(prefer span metrics for service level metrics)")
flag.StringVar(&ruleRepoURL, "rules.repo-url", baseconst.AlertHelpPage, "(host address used to build rule link in alert messages)")
flag.BoolVar(&enableQueryServiceLogOTLPExport, "enable.query.service.log.otlp.export", false, "(enable query service log otlp export)")
flag.Parse()
@@ -105,6 +107,7 @@ func main() {
PromConfigPath: promConfigPath,
SkipTopLvlOpsPath: skipTopLvlOpsPath,
PreferDelta: preferDelta,
PreferSpanMetrics: preferSpanMetrics,
PrivateHostPort: baseconst.PrivateHostPort,
DisableRules: disableRules,
RuleRepoURL: ruleRepoURL,

View File

@@ -60,6 +60,13 @@ var BasicPlan = basemodel.FeatureSet{
UsageLimit: 5,
Route: "",
},
basemodel.Feature{
Name: basemodel.UseSpanMetrics,
Active: false,
Usage: 0,
UsageLimit: -1,
Route: "",
},
}
var ProPlan = basemodel.FeatureSet{
@@ -105,6 +112,13 @@ var ProPlan = basemodel.FeatureSet{
UsageLimit: -1,
Route: "",
},
basemodel.Feature{
Name: basemodel.UseSpanMetrics,
Active: false,
Usage: 0,
UsageLimit: -1,
Route: "",
},
}
var EnterprisePlan = basemodel.FeatureSet{
@@ -150,4 +164,11 @@ var EnterprisePlan = basemodel.FeatureSet{
UsageLimit: -1,
Route: "",
},
basemodel.Feature{
Name: basemodel.UseSpanMetrics,
Active: false,
Usage: 0,
UsageLimit: -1,
Route: "",
},
}

View File

@@ -69,6 +69,7 @@
"papaparse": "5.4.1",
"react": "18.2.0",
"react-dom": "18.2.0",
"react-drag-listview": "2.0.0",
"react-force-graph": "^1.41.0",
"react-grid-layout": "^1.3.4",
"react-i18next": "^11.16.1",

View File

@@ -9,9 +9,9 @@ const loginPrecheck = async (
): Promise<SuccessResponse<PayloadProps> | ErrorResponse> => {
try {
const response = await axios.get(
`/loginPrecheck?email=${props.email}&ref=${encodeURIComponent(
window.location.href,
)}`,
`/loginPrecheck?email=${encodeURIComponent(
props.email,
)}&ref=${encodeURIComponent(window.location.href)}`,
);
return {
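
For context on the loginPrecheck change above: an email left raw in the query string breaks when it contains reserved characters such as '+'. A minimal illustration (the address is made up; only encodeURIComponent and the URL shape come from the diff):

const email = 'user+test@example.com';

// Before the fix: '+' is left raw, and servers typically decode it as a space.
const rawUrl = `/loginPrecheck?email=${email}&ref=${encodeURIComponent(window.location.href)}`;
// -> /loginPrecheck?email=user+test@example.com&ref=...

// After the fix: the email round-trips intact.
const encodedUrl = `/loginPrecheck?email=${encodeURIComponent(email)}&ref=${encodeURIComponent(window.location.href)}`;
// -> /loginPrecheck?email=user%2Btest%40example.com&ref=...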

View File

@@ -288,12 +288,11 @@ function Graph({
if (chartHasData) {
chartPlugins.push(createIntersectionCursorPlugin());
chartPlugins.push(createDragSelectPlugin());
chartPlugins.push(legend(name, data.datasets.length > 3));
} else {
chartPlugins.push(emptyGraph);
}
chartPlugins.push(legend(name, data.datasets.length > 3));
lineChartRef.current = new Chart(chartRef.current, {
type,
data,

View File

@@ -1,6 +1,7 @@
import { Drawer, Tabs } from 'antd';
import JSONView from 'container/LogDetailedView/JsonView';
import TableView from 'container/LogDetailedView/TableView';
import { useMemo } from 'react';
import { LogDetailProps } from './LogDetail.interfaces';
@@ -14,24 +15,27 @@ function LogDetail({
onClose();
};
const items = [
{
label: 'Table',
key: '1',
children: log && (
<TableView
logData={log}
onAddToQuery={onAddToQuery}
onClickActionItem={onClickActionItem}
/>
),
},
{
label: 'JSON',
key: '2',
children: log && <JSONView logData={log} />,
},
];
const items = useMemo(
() => [
{
label: 'Table',
key: '1',
children: log && (
<TableView
logData={log}
onAddToQuery={onAddToQuery}
onClickActionItem={onClickActionItem}
/>
),
},
{
label: 'JSON',
key: '2',
children: log && <JSONView logData={log} />,
},
],
[log, onAddToQuery, onClickActionItem],
);
return (
<Drawer

View File

@@ -6,7 +6,6 @@ import dayjs from 'dayjs';
import dompurify from 'dompurify';
import { FlatLogData } from 'lib/logs/flatLogData';
import { useMemo } from 'react';
import { ILog } from 'types/api/logs/log';
import { ExpandIconWrapper } from '../RawLogView/styles';
import { defaultCellStyle, defaultTableStyle } from './config';
@@ -57,14 +56,14 @@ export const useTableView = (props: UseTableViewProps): UseTableViewResult => {
dataIndex: 'id',
key: 'expand',
// https://github.com/ant-design/ant-design/discussions/36886
render: (_, item): ColumnTypeRender<Record<string, unknown>> => ({
render: (_, item, index): ColumnTypeRender<Record<string, unknown>> => ({
props: {
style: defaultCellStyle,
},
children: (
<ExpandIconWrapper
onClick={(): void => {
onClickExpand((item as unknown) as ILog);
onClickExpand(logs[index]);
}}
>
<ExpandAltOutlined />
@@ -108,7 +107,7 @@ export const useTableView = (props: UseTableViewProps): UseTableViewResult => {
},
...(appendTo === 'end' ? fieldColumns : []),
];
}, [fields, linesPerRow, appendTo, onClickExpand]);
}, [fields, appendTo, linesPerRow, onClickExpand, logs]);
return { columns, dataSource: flattenLogData };
};

View File

@@ -1,6 +1,8 @@
/* eslint-disable react/jsx-props-no-spreading */
import { Table } from 'antd';
import type { TableProps } from 'antd/es/table';
import { ColumnsType } from 'antd/lib/table';
import { dragColumnParams } from 'hooks/useDragColumns/configs';
import {
SyntheticEvent,
useCallback,
@@ -8,12 +10,18 @@ import {
useMemo,
useState,
} from 'react';
import ReactDragListView from 'react-drag-listview';
import { ResizeCallbackData } from 'react-resizable';
import ResizableHeader from './ResizableHeader';
import { DragSpanStyle } from './styles';
import { ResizeTableProps } from './types';
// eslint-disable-next-line @typescript-eslint/no-explicit-any
function ResizeTable({ columns, ...restprops }: TableProps<any>): JSX.Element {
function ResizeTable({
columns,
onDragColumn,
...restProps
}: ResizeTableProps): JSX.Element {
const [columnsData, setColumns] = useState<ColumnsType>([]);
const handleResize = useCallback(
@@ -31,16 +39,32 @@ function ResizeTable({ columns, ...restprops }: TableProps<any>): JSX.Element {
[columnsData],
);
const mergeColumns = useMemo(
const mergedColumns = useMemo(
() =>
columnsData.map((col, index) => ({
...col,
...(onDragColumn && {
title: (
<DragSpanStyle className="dragHandler">
{col?.title?.toString() || ''}
</DragSpanStyle>
),
}),
onHeaderCell: (column: ColumnsType<unknown>[number]): unknown => ({
width: column.width,
onResize: handleResize(index),
}),
})),
[columnsData, handleResize],
})) as ColumnsType<any>,
[columnsData, onDragColumn, handleResize],
);
const tableParams = useMemo(
() => ({
...restProps,
components: { header: { cell: ResizableHeader } },
columns: mergedColumns,
}),
[mergedColumns, restProps],
);
useEffect(() => {
@@ -49,15 +73,17 @@ function ResizeTable({ columns, ...restprops }: TableProps<any>): JSX.Element {
}
}, [columns]);
return (
<Table
// eslint-disable-next-line react/jsx-props-no-spreading
{...restprops}
components={{ header: { cell: ResizableHeader } }}
// eslint-disable-next-line @typescript-eslint/no-explicit-any
columns={mergeColumns as ColumnsType<any>}
/>
return onDragColumn ? (
<ReactDragListView.DragColumn {...dragColumnParams} onDragEnd={onDragColumn}>
<Table {...tableParams} />
</ReactDragListView.DragColumn>
) : (
<Table {...tableParams} />
);
}
ResizeTable.defaultProps = {
onDragColumn: undefined,
};
export default ResizeTable;

View File

@@ -2,10 +2,16 @@ import styled from 'styled-components';
export const SpanStyle = styled.span`
position: absolute;
right: -5px;
right: -0.313rem;
bottom: 0;
z-index: 1;
width: 10px;
width: 0.625rem;
height: 100%;
cursor: col-resize;
`;
export const DragSpanStyle = styled.span`
display: flex;
margin: -1rem;
padding: 1rem;
`;

View File

@@ -0,0 +1,6 @@
import { TableProps } from 'antd';
// eslint-disable-next-line @typescript-eslint/no-explicit-any
export interface ResizeTableProps extends TableProps<any> {
onDragColumn?: (fromIndex: number, toIndex: number) => void;
}
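
A hypothetical usage sketch of the new onDragColumn prop (the import path, column set, and local reorder logic are assumptions, not taken from the diff): passing onDragColumn wraps the table in ReactDragListView.DragColumn as shown above, while omitting it keeps the plain resizable-header table.

import { ColumnsType } from 'antd/lib/table';
import { useState } from 'react';

import ResizeTable from 'components/ResizeTable'; // assumed path

function DraggableColumnsDemo(): JSX.Element {
  const [columns, setColumns] = useState<ColumnsType<Record<string, unknown>>>([
    { title: 'Service', dataIndex: 'serviceName', key: 'serviceName' },
    { title: 'Duration', dataIndex: 'durationNano', key: 'durationNano' },
  ]);

  // Reorder columns locally; the logs/traces explorers persist the order via useDragColumns instead.
  const onDragColumn = (fromIndex: number, toIndex: number): void => {
    setColumns((prev) => {
      const next = [...prev];
      const [moved] = next.splice(fromIndex, 1);
      next.splice(toIndex, 0, moved);
      return next;
    });
  };

  return <ResizeTable columns={columns} dataSource={[]} onDragColumn={onDragColumn} />;
}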

View File

@@ -8,4 +8,5 @@ export enum FeatureKeys {
QUERY_BUILDER_PANELS = 'QUERY_BUILDER_PANELS',
QUERY_BUILDER_ALERTS = 'QUERY_BUILDER_ALERTS',
DISABLE_UPSELL = 'DISABLE_UPSELL',
USE_SPAN_METRICS = 'USE_SPAN_METRICS',
}
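
A rough frontend-side illustration of how the span-metrics pieces connect (the import path and the simplified flag shape are assumptions; only the USE_SPAN_METRICS key and the active flag follow from the diffs above): when query-service runs with prefer-span-metrics, getFeatureFlags marks the feature active, and the UI can branch on it for the service latency graph.

import { FeatureKeys } from 'constants/features'; // assumed path

// Simplified shape; the real feature entries also carry usage, usage limit, and route fields.
interface FeatureFlag {
  name: string;
  active: boolean;
}

function isSpanMetricsEnabled(featureSet: FeatureFlag[]): boolean {
  return featureSet.some(
    (feature) => feature.name === FeatureKeys.USE_SPAN_METRICS && feature.active,
  );
}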

View File

@@ -8,4 +8,6 @@ export enum LOCALSTORAGE {
LOGS_LINES_PER_ROW = 'LOGS_LINES_PER_ROW',
LOGS_LIST_OPTIONS = 'LOGS_LIST_OPTIONS',
TRACES_LIST_OPTIONS = 'TRACES_LIST_OPTIONS',
TRACES_LIST_COLUMNS = 'TRACES_LIST_COLUMNS',
LOGS_LIST_COLUMNS = 'LOGS_LIST_COLUMNS',
}

View File

@@ -0,0 +1,72 @@
import { Select, Spin } from 'antd';
import { OrderByFilterProps } from 'container/QueryBuilder/filters/OrderByFilter/OrderByFilter.interfaces';
import { useOrderByFilter } from 'container/QueryBuilder/filters/OrderByFilter/useOrderByFilter';
import { selectStyle } from 'container/QueryBuilder/filters/QueryBuilderSearch/config';
import { useGetAggregateKeys } from 'hooks/queryBuilder/useGetAggregateKeys';
import { memo, useMemo } from 'react';
import { StringOperators } from 'types/common/queryBuilder';
function ExplorerOrderBy({ query, onChange }: OrderByFilterProps): JSX.Element {
const {
debouncedSearchText,
selectedValue,
aggregationOptions,
generateOptions,
createOptions,
handleChange,
handleSearchKeys,
} = useOrderByFilter({ query, onChange });
const { data, isFetching } = useGetAggregateKeys(
{
aggregateAttribute: query.aggregateAttribute.key,
dataSource: query.dataSource,
aggregateOperator: query.aggregateOperator,
searchText: debouncedSearchText,
},
{
keepPreviousData: true,
},
);
const options = useMemo(() => {
const keysOptions = createOptions(data?.payload?.attributeKeys || []);
const customOptions = createOptions([
{ key: 'timestamp', isColumn: true, type: null, dataType: null },
]);
const baseOptions = [
...customOptions,
...(query.aggregateOperator === StringOperators.NOOP
? []
: aggregationOptions),
...keysOptions,
];
return generateOptions(baseOptions);
}, [
aggregationOptions,
createOptions,
data?.payload?.attributeKeys,
generateOptions,
query.aggregateOperator,
]);
return (
<Select
mode="tags"
style={selectStyle}
onSearch={handleSearchKeys}
showSearch
showArrow={false}
value={selectedValue}
labelInValue
options={options}
notFoundContent={isFetching ? <Spin size="small" /> : null}
onChange={handleChange}
/>
);
}
export default memo(ExplorerOrderBy);

View File

@@ -0,0 +1,29 @@
import { ArrowUpOutlined } from '@ant-design/icons';
import { FloatButton } from 'antd';
import { PANEL_TYPES } from 'constants/queryBuilder';
// hooks
import { useQueryBuilder } from 'hooks/queryBuilder/useQueryBuilder';
import useScrollToTop from 'hooks/useScrollToTop';
function GoToTop(): JSX.Element | null {
const { isVisible, scrollToTop } = useScrollToTop();
const { panelType } = useQueryBuilder();
if (!isVisible) return null;
if (panelType === PANEL_TYPES.LIST) {
return (
<FloatButton
onClick={scrollToTop}
shape="circle"
type="primary"
icon={<ArrowUpOutlined />}
/>
);
}
return null;
}
export default GoToTop;
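
GoToTop relies on a useScrollToTop hook that is not part of this diff. A minimal sketch of a hook with the same shape (isVisible plus scrollToTop), assuming a simple window-scroll threshold; the actual hooks/useScrollToTop implementation may differ:

import { useCallback, useEffect, useState } from 'react';

interface UseScrollToTop {
  isVisible: boolean;
  scrollToTop: VoidFunction;
}

function useScrollToTopSketch(threshold = 200): UseScrollToTop {
  const [isVisible, setIsVisible] = useState<boolean>(false);

  useEffect(() => {
    const onScroll = (): void => setIsVisible(window.scrollY > threshold);
    window.addEventListener('scroll', onScroll);
    return (): void => window.removeEventListener('scroll', onScroll);
  }, [threshold]);

  const scrollToTop = useCallback((): void => {
    window.scrollTo({ top: 0, behavior: 'smooth' });
  }, []);

  return { isVisible, scrollToTop };
}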

View File

@@ -1,137 +0,0 @@
import { Button } from 'antd';
import { GraphOnClickHandler } from 'components/Graph';
import Spinner from 'components/Spinner';
import TimePreference from 'components/TimePreferenceDropDown';
import GridGraphComponent from 'container/GridGraphComponent';
import {
timeItems,
timePreferance,
} from 'container/NewWidget/RightContainer/timeItems';
import { useGetQueryRange } from 'hooks/queryBuilder/useGetQueryRange';
import { useStepInterval } from 'hooks/queryBuilder/useStepInterval';
import { getDashboardVariables } from 'lib/dashbaordVariables/getDashboardVariables';
import getChartData from 'lib/getChartData';
import { useCallback, useMemo, useState } from 'react';
import { useSelector } from 'react-redux';
import { AppState } from 'store/reducers';
import { Widgets } from 'types/api/dashboard/getAll';
import { GlobalReducer } from 'types/reducer/globalTime';
import { TimeContainer } from './styles';
function FullView({
widget,
fullViewOptions = true,
onClickHandler,
name,
yAxisUnit,
onDragSelect,
isDependedDataLoaded = false,
}: FullViewProps): JSX.Element {
const { selectedTime: globalSelectedTime } = useSelector<
AppState,
GlobalReducer
>((state) => state.globalTime);
const getSelectedTime = useCallback(
() =>
timeItems.find((e) => e.enum === (widget?.timePreferance || 'GLOBAL_TIME')),
[widget],
);
const [selectedTime, setSelectedTime] = useState<timePreferance>({
name: getSelectedTime()?.name || '',
enum: widget?.timePreferance || 'GLOBAL_TIME',
});
const queryKey = useMemo(
() =>
`FullViewGetMetricsQueryRange-${selectedTime.enum}-${globalSelectedTime}-${widget.id}`,
[selectedTime, globalSelectedTime, widget],
);
const updatedQuery = useStepInterval(widget?.query);
const response = useGetQueryRange(
{
selectedTime: selectedTime.enum,
graphType: widget.panelTypes,
query: updatedQuery,
globalSelectedInterval: globalSelectedTime,
variables: getDashboardVariables(),
},
{
queryKey,
enabled: !isDependedDataLoaded,
},
);
const chartDataSet = useMemo(
() =>
getChartData({
queryData: [
{
queryData: response?.data?.payload?.data?.result || [],
},
],
}),
[response],
);
if (response.status === 'idle' || response.status === 'loading') {
return <Spinner height="100%" size="large" tip="Loading..." />;
}
return (
<>
{fullViewOptions && (
<TimeContainer>
<TimePreference
selectedTime={selectedTime}
setSelectedTime={setSelectedTime}
/>
<Button
onClick={(): void => {
response.refetch();
}}
type="primary"
>
Refresh
</Button>
</TimeContainer>
)}
<GridGraphComponent
GRAPH_TYPES={widget.panelTypes}
data={chartDataSet}
isStacked={widget.isStacked}
opacity={widget.opacity}
title={widget.title}
onClickHandler={onClickHandler}
name={name}
yAxisUnit={yAxisUnit}
onDragSelect={onDragSelect}
/>
</>
);
}
interface FullViewProps {
widget: Widgets;
fullViewOptions?: boolean;
onClickHandler?: GraphOnClickHandler;
name: string;
yAxisUnit?: string;
onDragSelect?: (start: number, end: number) => void;
isDependedDataLoaded?: boolean;
}
FullView.defaultProps = {
fullViewOptions: undefined,
onClickHandler: undefined,
yAxisUnit: undefined,
onDragSelect: undefined,
isDependedDataLoaded: undefined,
};
export default FullView;

View File

@@ -1,5 +1,4 @@
import { Button, Typography } from 'antd';
import getQueryResult from 'api/widgets/getQuery';
import { Button } from 'antd';
import { GraphOnClickHandler } from 'components/Graph';
import Spinner from 'components/Spinner';
import TimePreference from 'components/TimePreferenceDropDown';
@@ -7,22 +6,18 @@ import GridGraphComponent from 'container/GridGraphComponent';
import {
timeItems,
timePreferance,
timePreferenceType,
} from 'container/NewWidget/RightContainer/timeItems';
import convertToNanoSecondsToSecond from 'lib/convertToNanoSecondsToSecond';
import { useGetQueryRange } from 'hooks/queryBuilder/useGetQueryRange';
import { useStepInterval } from 'hooks/queryBuilder/useStepInterval';
import { getDashboardVariables } from 'lib/dashbaordVariables/getDashboardVariables';
import getChartData from 'lib/getChartData';
import GetMaxMinTime from 'lib/getMaxMinTime';
import GetMinMax from 'lib/getMinMax';
import getStartAndEndTime from 'lib/getStartAndEndTime';
import getStep from 'lib/getStep';
import { useCallback, useMemo, useState } from 'react';
import { useQueries } from 'react-query';
import { useSelector } from 'react-redux';
import { AppState } from 'store/reducers';
import { PromQLWidgets } from 'types/api/dashboard/getAll';
import { Widgets } from 'types/api/dashboard/getAll';
import { GlobalReducer } from 'types/reducer/globalTime';
import { NotFoundContainer, TimeContainer } from './styles';
import { TimeContainer } from './styles';
function FullView({
widget,
@@ -31,8 +26,9 @@ function FullView({
name,
yAxisUnit,
onDragSelect,
isDependedDataLoaded = false,
}: FullViewProps): JSX.Element {
const { minTime, maxTime, selectedTime: globalSelectedTime } = useSelector<
const { selectedTime: globalSelectedTime } = useSelector<
AppState,
GlobalReducer
>((state) => state.globalTime);
@@ -48,110 +44,55 @@ function FullView({
enum: widget?.timePreferance || 'GLOBAL_TIME',
});
const maxMinTime = GetMaxMinTime({
graphType: widget.panelTypes,
maxTime,
minTime,
});
const getMinMax = (
time: timePreferenceType,
): { min: string | number; max: string | number } => {
if (time === 'GLOBAL_TIME') {
const minMax = GetMinMax(globalSelectedTime, [
minTime / 1000000,
maxTime / 1000000,
]);
return {
min: convertToNanoSecondsToSecond(minMax.minTime / 1000),
max: convertToNanoSecondsToSecond(minMax.maxTime / 1000),
};
}
const minMax = getStartAndEndTime({
type: selectedTime.enum,
maxTime: maxMinTime.maxTime,
minTime: maxMinTime.minTime,
});
return { min: parseInt(minMax.start, 10), max: parseInt(minMax.end, 10) };
};
const queryMinMax = getMinMax(selectedTime.enum);
const queryLength = widget.query.filter((e) => e.query.length !== 0);
const response = useQueries(
queryLength.map((query) => ({
// eslint-disable-next-line @typescript-eslint/explicit-function-return-type
queryFn: () =>
getQueryResult({
end: queryMinMax.max.toString(),
query: query.query,
start: queryMinMax.min.toString(),
step: `${getStep({
start: queryMinMax.min,
end: queryMinMax.max,
inputFormat: 's',
})}`,
}),
queryHash: `${query.query}-${query.legend}-${selectedTime.enum}`,
retryOnMount: false,
})),
const queryKey = useMemo(
() =>
`FullViewGetMetricsQueryRange-${selectedTime.enum}-${globalSelectedTime}-${widget.id}`,
[selectedTime, globalSelectedTime, widget],
);
const isError =
response.find((e) => e?.data?.statusCode !== 200) !== undefined ||
response.some((e) => e.isError === true);
const updatedQuery = useStepInterval(widget?.query);
const isLoading = response.some((e) => e.isLoading === true);
const errorMessage = response.find((e) => e.data?.error !== null)?.data?.error;
const data = response.map((responseOfQuery) =>
responseOfQuery?.data?.payload?.result.map((e, index) => ({
query: queryLength[index]?.query,
queryData: e,
legend: queryLength[index]?.legend,
})),
const response = useGetQueryRange(
{
selectedTime: selectedTime.enum,
graphType: widget.panelTypes,
query: updatedQuery,
globalSelectedInterval: globalSelectedTime,
variables: getDashboardVariables(),
},
{
queryKey,
enabled: !isDependedDataLoaded,
},
);
const chartDataSet = useMemo(
() =>
getChartData({
queryData: data.map((e) => ({
query: e?.map((e) => e.query).join(' ') || '',
queryData: e?.map((e) => e.queryData) || [],
legend: e?.map((e) => e.legend).join('') || '',
})),
queryData: [
{
queryData: response?.data?.payload?.data?.result || [],
},
],
}),
[data],
[response],
);
if (isLoading) {
if (response.status === 'idle' || response.status === 'loading') {
return <Spinner height="100%" size="large" tip="Loading..." />;
}
if (isError || data === undefined || data[0] === undefined) {
return (
<NotFoundContainer>
<Typography>{errorMessage}</Typography>
</NotFoundContainer>
);
}
return (
<>
{fullViewOptions && (
<TimeContainer>
<TimePreference
{...{
selectedTime,
setSelectedTime,
}}
selectedTime={selectedTime}
setSelectedTime={setSelectedTime}
/>
<Button
onClick={(): void => {
response.forEach((e) => e.refetch());
response.refetch();
}}
type="primary"
>
@@ -176,12 +117,13 @@ function FullView({
}
interface FullViewProps {
widget: PromQLWidgets;
widget: Widgets;
fullViewOptions?: boolean;
onClickHandler?: GraphOnClickHandler;
name: string;
yAxisUnit?: string;
onDragSelect?: (start: number, end: number) => void;
isDependedDataLoaded?: boolean;
}
FullView.defaultProps = {
@@ -189,6 +131,7 @@ FullView.defaultProps = {
onClickHandler: undefined,
yAxisUnit: undefined,
onDragSelect: undefined,
isDependedDataLoaded: undefined,
};
export default FullView;

View File

@@ -1,5 +1,6 @@
import { Typography } from 'antd';
import { ChartData } from 'chart.js';
import { GraphOnClickHandler } from 'components/Graph';
import Spinner from 'components/Spinner';
import GridGraphComponent from 'container/GridGraphComponent';
import { UpdateDashboard } from 'container/GridGraphLayout/utils';
@@ -35,12 +36,16 @@ import { Widgets } from 'types/api/dashboard/getAll';
import AppReducer from 'types/reducer/app';
import DashboardReducer from 'types/reducer/dashboards';
import { GlobalReducer } from 'types/reducer/globalTime';
import {
getSelectedDashboard,
getSelectedDashboardVariable,
} from 'utils/dashboard/selectedDashboard';
import { v4 } from 'uuid';
import { LayoutProps } from '..';
import EmptyWidget from '../EmptyWidget';
import WidgetHeader from '../WidgetHeader';
import FullView from './FullView/index.metricsBuilder';
import FullView from './FullView';
import { FullViewContainer, Modal } from './styles';
function GridCardGraph({
@@ -51,6 +56,10 @@ function GridCardGraph({
layout = [],
setLayout,
onDragSelect,
onClickHandler,
allowDelete,
allowClone,
allowEdit,
}: GridCardGraphProps): JSX.Element {
const { ref: graphRef, inView: isGraphVisible } = useInView({
threshold: 0,
@@ -77,9 +86,9 @@ function GridCardGraph({
const { dashboards } = useSelector<AppState, DashboardReducer>(
(state) => state.dashboards,
);
const [selectedDashboard] = dashboards;
const selectedData = selectedDashboard?.data;
const { variables } = selectedData;
const selectedDashboard = getSelectedDashboard(dashboards);
const variables = getSelectedDashboardVariable(dashboards);
const updatedQuery = useStepInterval(widget?.query);
@@ -172,10 +181,10 @@ function GridCardGraph({
h: 2,
y: 0,
},
...(selectedDashboard.data.layout || []),
...(selectedDashboard?.data.layout || []),
];
if (widget) {
if (widget && selectedDashboard) {
await UpdateDashboard(
{
data: selectedDashboard.data,
@@ -257,6 +266,9 @@ function GridCardGraph({
onClone={onCloneHandler}
queryResponse={queryResponse}
errorMessage={errorMessage}
allowClone={allowClone}
allowDelete={allowDelete}
allowEdit={allowEdit}
/>
</div>
<GridGraphComponent
@@ -267,6 +279,7 @@ function GridCardGraph({
title={' '}
name={name}
yAxisUnit={yAxisUnit}
onClickHandler={onClickHandler}
/>
</>
)}
@@ -289,6 +302,9 @@ function GridCardGraph({
onClone={onCloneHandler}
queryResponse={queryResponse}
errorMessage={errorMessage}
allowClone={allowClone}
allowDelete={allowDelete}
allowEdit={allowEdit}
/>
</div>
<GridGraphComponent
@@ -299,6 +315,7 @@ function GridCardGraph({
title={' '}
name={name}
yAxisUnit={yAxisUnit}
onClickHandler={onClickHandler}
/>
</>
) : (
@@ -335,6 +352,9 @@ function GridCardGraph({
onClone={onCloneHandler}
queryResponse={queryResponse}
errorMessage={errorMessage}
allowClone={allowClone}
allowDelete={allowDelete}
allowEdit={allowEdit}
/>
</div>
)}
@@ -351,6 +371,7 @@ function GridCardGraph({
name={name}
yAxisUnit={yAxisUnit}
onDragSelect={onDragSelect}
onClickHandler={onClickHandler}
/>
)}
@@ -374,10 +395,18 @@ interface GridCardGraphProps extends DispatchProps {
// eslint-disable-next-line react/require-default-props
setLayout?: Dispatch<SetStateAction<LayoutProps[]>>;
onDragSelect?: (start: number, end: number) => void;
onClickHandler?: GraphOnClickHandler;
allowDelete?: boolean;
allowClone?: boolean;
allowEdit?: boolean;
}
GridCardGraph.defaultProps = {
onDragSelect: undefined,
onClickHandler: undefined,
allowDelete: true,
allowClone: true,
allowEdit: true,
};
const mapDispatchToProps = (

View File

@@ -0,0 +1,13 @@
export enum MenuItemKeys {
View = 'view',
Edit = 'edit',
Delete = 'delete',
Clone = 'clone',
}
export const MENUITEM_KEYS_VS_LABELS = {
[MenuItemKeys.View]: 'View',
[MenuItemKeys.Edit]: 'Edit',
[MenuItemKeys.Delete]: 'Delete',
[MenuItemKeys.Clone]: 'Clone',
};

View File

@@ -27,24 +27,29 @@ import {
spinnerStyles,
tooltipStyles,
} from './config';
import { MENUITEM_KEYS_VS_LABELS, MenuItemKeys } from './contants';
import {
ArrowContainer,
HeaderContainer,
HeaderContentContainer,
} from './styles';
import { KeyMethodMappingProps, MenuItem, TWidgetOptions } from './types';
import { generateMenuList, isTWidgetOptions } from './utils';
type TWidgetOptions = 'view' | 'edit' | 'delete' | string;
interface IWidgetHeaderProps {
title: string;
widget: Widgets;
onView: VoidFunction;
onDelete: VoidFunction;
onClone: VoidFunction;
onDelete?: VoidFunction;
onClone?: VoidFunction;
parentHover: boolean;
queryResponse: UseQueryResult<
SuccessResponse<MetricRangePayloadProps> | ErrorResponse
>;
errorMessage: string | undefined;
allowDelete?: boolean;
allowClone?: boolean;
allowEdit?: boolean;
}
function WidgetHeader({
title,
@@ -55,6 +60,9 @@ function WidgetHeader({
parentHover,
queryResponse,
errorMessage,
allowClone = true,
allowDelete = true,
allowEdit = true,
}: IWidgetHeaderProps): JSX.Element {
const [localHover, setLocalHover] = useState(false);
const [isOpen, setIsOpen] = useState<boolean>(false);
@@ -70,24 +78,22 @@ function WidgetHeader({
);
}, [widget.id, widget.panelTypes, widget.query]);
const keyMethodMapping: {
[K in TWidgetOptions]: { key: TWidgetOptions; method: VoidFunction };
} = useMemo(
const keyMethodMapping: KeyMethodMappingProps<TWidgetOptions> = useMemo(
() => ({
view: {
key: 'view',
key: MenuItemKeys.View,
method: onView,
},
edit: {
key: 'edit',
key: MenuItemKeys.Edit,
method: onEditHandler,
},
delete: {
key: 'delete',
key: MenuItemKeys.Delete,
method: onDelete,
},
clone: {
key: 'clone',
key: MenuItemKeys.Clone,
method: onClone,
},
}),
@@ -95,11 +101,13 @@ function WidgetHeader({
);
const onMenuItemSelectHandler: MenuProps['onClick'] = useCallback(
({ key }: { key: TWidgetOptions }): void => {
const functionToCall = keyMethodMapping[key]?.method;
if (functionToCall) {
functionToCall();
setIsOpen(false);
({ key }: { key: string }): void => {
if (isTWidgetOptions(key)) {
const functionToCall = keyMethodMapping[key]?.method;
if (functionToCall) {
functionToCall();
setIsOpen(false);
}
}
},
[keyMethodMapping],
@@ -111,45 +119,53 @@ function WidgetHeader({
role,
);
const menuList: MenuItemType[] = useMemo(
() => [
const actions = useMemo(
(): MenuItem[] => [
{
key: keyMethodMapping.view.key,
key: MenuItemKeys.View,
icon: <FullscreenOutlined />,
label: MENUITEM_KEYS_VS_LABELS[MenuItemKeys.View],
isVisible: true,
disabled: queryResponse.isLoading,
label: 'View',
},
{
key: keyMethodMapping.edit.key,
key: MenuItemKeys.Edit,
icon: <EditFilled />,
label: MENUITEM_KEYS_VS_LABELS[MenuItemKeys.Edit],
isVisible: allowEdit,
disabled: !editWidget,
label: 'Edit',
},
{
key: keyMethodMapping.clone.key,
key: MenuItemKeys.Clone,
icon: <CopyOutlined />,
label: MENUITEM_KEYS_VS_LABELS[MenuItemKeys.Clone],
isVisible: allowClone,
disabled: !editWidget,
label: 'Clone',
},
{
key: keyMethodMapping.delete.key,
key: MenuItemKeys.Delete,
icon: <DeleteOutlined />,
label: MENUITEM_KEYS_VS_LABELS[MenuItemKeys.Delete],
isVisible: allowDelete,
disabled: !deleteWidget,
danger: true,
label: 'Delete',
},
],
[
allowEdit,
allowClone,
allowDelete,
queryResponse.isLoading,
deleteWidget,
editWidget,
keyMethodMapping.delete.key,
keyMethodMapping.edit.key,
keyMethodMapping.view.key,
keyMethodMapping.clone.key,
queryResponse.isLoading,
],
);
const menuList: MenuItemType[] = useMemo(
(): MenuItemType[] => generateMenuList(actions, keyMethodMapping),
[actions, keyMethodMapping],
);
const onClickHandler = useCallback(() => {
setIsOpen((open) => !open);
}, []);
@@ -200,4 +216,12 @@ function WidgetHeader({
);
}
WidgetHeader.defaultProps = {
onDelete: undefined,
onClone: undefined,
allowDelete: true,
allowClone: true,
allowEdit: true,
};
export default WidgetHeader;

View File

@@ -9,6 +9,8 @@ export const HeaderContainer = styled.div<{ hover: boolean }>`
font-size: 0.8rem;
cursor: all-scroll;
position: absolute;
top: 0;
left: 0;
`;
export const HeaderContentContainer = styled.span`

View File

@@ -0,0 +1,25 @@
import { ReactNode } from 'react';
import { MenuItemKeys } from './contants';
export interface MenuItem {
key: TWidgetOptions;
icon: ReactNode;
label: string;
isVisible: boolean;
disabled: boolean;
danger?: boolean;
}
export type TWidgetOptions =
| MenuItemKeys.View
| MenuItemKeys.Edit
| MenuItemKeys.Delete
| MenuItemKeys.Clone;
export type KeyMethodMappingProps<T extends TWidgetOptions> = {
[K in T]: {
key: TWidgetOptions;
method?: VoidFunction;
};
};

View File

@@ -0,0 +1,24 @@
import { MenuItemType } from 'antd/es/menu/hooks/useItems';
import { MenuItemKeys } from './contants';
import { KeyMethodMappingProps, MenuItem, TWidgetOptions } from './types';
export const generateMenuList = (
actions: MenuItem[],
keyMethodMapping: KeyMethodMappingProps<TWidgetOptions>,
): MenuItemType[] =>
actions
.filter((action: MenuItem) => action.isVisible)
.map(({ key, icon: Icon, label, disabled, ...rest }) => ({
key: keyMethodMapping[key].key,
icon: Icon,
label,
disabled,
...rest,
}));
export const isTWidgetOptions = (value: string): value is TWidgetOptions =>
value === MenuItemKeys.View ||
value === MenuItemKeys.Edit ||
value === MenuItemKeys.Delete ||
value === MenuItemKeys.Clone;
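
A small illustration of what the isTWidgetOptions guard buys (the handlers map and import paths are invented for the example): antd's menu onClick hands back a plain string key, and the guard narrows it before it is used to index the key/method mapping, mirroring onMenuItemSelectHandler in the WidgetHeader diff above.

import { MenuItemKeys } from 'container/GridGraphLayout/WidgetHeader/contants'; // assumed path
import { isTWidgetOptions } from 'container/GridGraphLayout/WidgetHeader/utils'; // assumed path

const handlers: Partial<Record<MenuItemKeys, VoidFunction>> = {
  [MenuItemKeys.View]: (): void => console.log('open full view'),
  [MenuItemKeys.Delete]: (): void => console.log('delete widget'),
};

function onMenuClick(key: string): void {
  if (isTWidgetOptions(key)) {
    // Inside this branch `key` is typed as TWidgetOptions, so no cast is needed.
    handlers[key]?.();
  }
}

onMenuClick('view'); // runs the view handler
onMenuClick('unknown'); // ignored: fails the guard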

View File

@@ -326,6 +326,13 @@ function GridGraph(props: Props): JSX.Element {
errorMessage,
]);
useEffect(
() => (): void => {
toggleAddWidget(false);
},
[toggleAddWidget],
);
return (
<GraphLayoutContainer
addPanelLoading={addPanelLoading}

View File

@@ -1,13 +1,15 @@
import { Button } from 'antd';
import { initialQueriesMap, PANEL_TYPES } from 'constants/queryBuilder';
import ExplorerOrderBy from 'container/ExplorerOrderBy';
import { QueryBuilder } from 'container/QueryBuilder';
import { OrderByFilterProps } from 'container/QueryBuilder/filters/OrderByFilter/OrderByFilter.interfaces';
import { QueryBuilderProps } from 'container/QueryBuilder/QueryBuilder.interfaces';
import { useGetPanelTypesQueryParam } from 'hooks/queryBuilder/useGetPanelTypesQueryParam';
import { useQueryBuilder } from 'hooks/queryBuilder/useQueryBuilder';
import { useShareBuilderUrl } from 'hooks/queryBuilder/useShareBuilderUrl';
import { ButtonWrapperStyled } from 'pages/LogsExplorer/styles';
import { prepareQueryWithDefaultTimestamp } from 'pages/LogsExplorer/utils';
import { memo, useMemo } from 'react';
import { memo, useCallback, useMemo } from 'react';
import { DataSource } from 'types/common/queryBuilder';
function LogExplorerQuerySection(): JSX.Element {
@@ -23,6 +25,7 @@ function LogExplorerQuerySection(): JSX.Element {
}, [updateAllQueriesOperators]);
useShareBuilderUrl(defaultValue);
const filterConfigs: QueryBuilderProps['filterConfigs'] = useMemo(() => {
const isTable = panelTypes === PANEL_TYPES.TABLE;
const config: QueryBuilderProps['filterConfigs'] = {
@@ -32,11 +35,26 @@ function LogExplorerQuerySection(): JSX.Element {
return config;
}, [panelTypes]);
const renderOrderBy = useCallback(
({ query, onChange }: OrderByFilterProps): JSX.Element => (
<ExplorerOrderBy query={query} onChange={onChange} />
),
[],
);
const queryComponents = useMemo(
(): QueryBuilderProps['queryComponents'] => ({
...(panelTypes === PANEL_TYPES.LIST ? { renderOrderBy } : {}),
}),
[panelTypes, renderOrderBy],
);
return (
<QueryBuilder
panelType={panelTypes}
config={{ initialDataSource: DataSource.LOGS, queryVariant: 'static' }}
filterConfigs={filterConfigs}
queryComponents={queryComponents}
actions={
<ButtonWrapperStyled>
<Button type="primary" onClick={handleRunQuery}>

View File

@@ -0,0 +1,24 @@
import { dragColumnParams } from 'hooks/useDragColumns/configs';
import ReactDragListView from 'react-drag-listview';
import { TableComponents } from 'react-virtuoso';
import { TableStyled } from './styles';
interface LogsCustomTableProps {
handleDragEnd: (fromIndex: number, toIndex: number) => void;
}
export const LogsCustomTable = ({
handleDragEnd,
}: LogsCustomTableProps): TableComponents['Table'] =>
function CustomTable({ style, children }): JSX.Element {
return (
<ReactDragListView.DragColumn
// eslint-disable-next-line react/jsx-props-no-spreading
{...dragColumnParams}
onDragEnd={handleDragEnd}
>
<TableStyled style={style}>{children}</TableStyled>
</ReactDragListView.DragColumn>
);
};

View File

@@ -1,22 +1,26 @@
import { ColumnTypeRender } from 'components/Logs/TableView/types';
import { useTableView } from 'components/Logs/TableView/useTableView';
import { cloneElement, ReactElement, ReactNode, useCallback } from 'react';
import { LOCALSTORAGE } from 'constants/localStorage';
import useDragColumns from 'hooks/useDragColumns';
import { getDraggedColumns } from 'hooks/useDragColumns/utils';
import {
cloneElement,
ReactElement,
ReactNode,
useCallback,
useMemo,
} from 'react';
import { TableComponents, TableVirtuoso } from 'react-virtuoso';
import { infinityDefaultStyles } from './config';
import { LogsCustomTable } from './LogsCustomTable';
import {
TableCellStyled,
TableHeaderCellStyled,
TableRowStyled,
TableStyled,
} from './styles';
import { InfinityTableProps } from './types';
// eslint-disable-next-line react/function-component-definition
const CustomTable: TableComponents['Table'] = ({ style, children }) => (
<TableStyled style={style}>{children}</TableStyled>
);
// eslint-disable-next-line react/function-component-definition
const CustomTableRow: TableComponents['TableRow'] = ({
children,
@@ -31,11 +35,25 @@ function InfinityTable({
}: InfinityTableProps): JSX.Element | null {
const { onEndReached } = infitiyTableProps;
const { dataSource, columns } = useTableView(tableViewProps);
const { draggedColumns, onDragColumns } = useDragColumns<
Record<string, unknown>
>(LOCALSTORAGE.LOGS_LIST_COLUMNS);
const tableColumns = useMemo(
() => getDraggedColumns<Record<string, unknown>>(columns, draggedColumns),
[columns, draggedColumns],
);
const handleDragEnd = useCallback(
(fromIndex: number, toIndex: number) =>
onDragColumns(tableColumns, fromIndex, toIndex),
[tableColumns, onDragColumns],
);
const itemContent = useCallback(
(index: number, log: Record<string, unknown>): JSX.Element => (
<>
{columns.map((column) => {
{tableColumns.map((column) => {
if (!column.render) return <td>Empty</td>;
const element: ColumnTypeRender<Record<string, unknown>> = column.render(
@@ -60,20 +78,29 @@ function InfinityTable({
})}
</>
),
[columns],
[tableColumns],
);
const tableHeader = useCallback(
() => (
<tr>
{columns.map((column) => (
<TableHeaderCellStyled key={column.key}>
{column.title as string}
</TableHeaderCellStyled>
))}
{tableColumns.map((column) => {
const isDragColumn = column.key !== 'expand';
return (
<TableHeaderCellStyled
isDragColumn={isDragColumn}
key={column.key}
// eslint-disable-next-line react/jsx-props-no-spreading
{...(isDragColumn && { className: 'dragHandler' })}
>
{column.title as string}
</TableHeaderCellStyled>
);
})}
</tr>
),
[columns],
[tableColumns],
);
return (
@@ -81,7 +108,8 @@ function InfinityTable({
style={infinityDefaultStyles}
data={dataSource}
components={{
Table: CustomTable,
// eslint-disable-next-line react/jsx-props-no-spreading
Table: LogsCustomTable({ handleDragEnd }),
// TODO: fix it in the future
// eslint-disable-next-line @typescript-eslint/ban-ts-comment
// @ts-ignore

View File

@@ -1,6 +1,10 @@
import { themeColors } from 'constants/theme';
import styled from 'styled-components';
interface TableHeaderCellStyledProps {
isDragColumn: boolean;
}
export const TableStyled = styled.table`
width: 100%;
border-top: 1px solid rgba(253, 253, 253, 0.12);
@@ -26,10 +30,12 @@ export const TableRowStyled = styled.tr`
}
`;
export const TableHeaderCellStyled = styled.th`
export const TableHeaderCellStyled = styled.th<TableHeaderCellStyledProps>`
padding: 0.5rem;
border-inline-end: 1px solid rgba(253, 253, 253, 0.12);
background-color: #1d1d1d;
${({ isDragColumn }): string => (isDragColumn ? 'cursor: col-resize;' : '')}
&:first-child {
border-start-start-radius: 2px;
}

View File

@@ -146,12 +146,17 @@ function LogsExplorerList({
isShowPageSize={false}
optionsMenuConfig={config}
/>
{options.format !== 'table' && (
<Heading>
<Typography.Text>Event</Typography.Text>
</Heading>
)}
{logs.length === 0 && <Typography>No logs lines found</Typography>}
{!isLoading && logs.length === 0 && (
<Typography>No logs lines found</Typography>
)}
<InfinityWrapperStyled>{renderContent}</InfinityWrapperStyled>
</>
);

View File

@@ -1,9 +1,9 @@
import { TabsProps } from 'antd';
import axios from 'axios';
import LogDetail from 'components/LogDetail';
import TabLabel from 'components/TabLabel';
import { QueryParams } from 'constants/query';
import {
initialAutocompleteData,
initialQueriesMap,
OPERATORS,
PANEL_TYPES,
@@ -13,16 +13,18 @@ import { queryParamNamesMap } from 'constants/queryBuilderQueryNames';
import ROUTES from 'constants/routes';
import { DEFAULT_PER_PAGE_VALUE } from 'container/Controls/config';
import ExportPanel from 'container/ExportPanel';
import GoToTop from 'container/GoToTop';
import LogsExplorerChart from 'container/LogsExplorerChart';
import LogsExplorerList from 'container/LogsExplorerList';
// TODO: temporary hide table view
// import LogsExplorerTable from 'container/LogsExplorerTable';
import LogsExplorerTable from 'container/LogsExplorerTable';
import { GRAPH_TYPES } from 'container/NewDashboard/ComponentsSlider';
import { SIGNOZ_VALUE } from 'container/QueryBuilder/filters/OrderByFilter/constants';
import TimeSeriesView from 'container/TimeSeriesView/TimeSeriesView';
import { useUpdateDashboard } from 'hooks/dashboard/useUpdateDashboard';
import { addEmptyWidgetInDashboardJSONWithQuery } from 'hooks/dashboard/utils';
import { useGetExplorerQueryRange } from 'hooks/queryBuilder/useGetExplorerQueryRange';
import { useQueryBuilder } from 'hooks/queryBuilder/useQueryBuilder';
import useAxiosError from 'hooks/useAxiosError';
import { useNotifications } from 'hooks/useNotifications';
import useUrlQueryData from 'hooks/useUrlQueryData';
import { chooseAutocompleteFromCustomValue } from 'lib/newQueryBuilder/chooseAutocompleteFromCustomValue';
@@ -73,6 +75,7 @@ function LogsExplorerViews(): JSX.Element {
stagedQuery,
panelType,
updateAllQueriesOperators,
updateQueriesData,
redirectWithQueryBuilderData,
} = useQueryBuilder();
@@ -82,6 +85,8 @@ function LogsExplorerViews(): JSX.Element {
const [logs, setLogs] = useState<ILog[]>([]);
const [requestData, setRequestData] = useState<Query | null>(null);
const handleAxisError = useAxiosError();
const currentStagedQueryData = useMemo(() => {
if (!stagedQuery || stagedQuery.builder.queryData.length !== 1) return null;
@@ -173,26 +178,40 @@ function LogsExplorerViews(): JSX.Element {
setActiveLog(null);
}, []);
const getUpdateQuery = useCallback(
(newPanelType: GRAPH_TYPES): Query => {
let query = updateAllQueriesOperators(
currentQuery,
newPanelType,
DataSource.TRACES,
);
if (newPanelType === PANEL_TYPES.LIST) {
query = updateQueriesData(query, 'queryData', (item) => ({
...item,
orderBy: item.orderBy.filter((item) => item.columnName !== SIGNOZ_VALUE),
aggregateAttribute: initialAutocompleteData,
}));
}
return query;
},
[currentQuery, updateAllQueriesOperators, updateQueriesData],
);
const handleChangeView = useCallback(
(newPanelType: string) => {
(type: string) => {
const newPanelType = type as GRAPH_TYPES;
if (newPanelType === panelType) return;
const query = updateAllQueriesOperators(
currentQuery,
newPanelType as GRAPH_TYPES,
DataSource.LOGS,
);
const query = getUpdateQuery(newPanelType);
redirectWithQueryBuilderData(query, {
[queryParamNamesMap.panelTypes]: newPanelType,
});
},
[
currentQuery,
panelType,
updateAllQueriesOperators,
redirectWithQueryBuilderData,
],
[panelType, getUpdateQuery, redirectWithQueryBuilderData],
);
const getRequestData = useCallback(
@@ -358,16 +377,16 @@ function LogsExplorerViews(): JSX.Element {
history.push(dashboardEditView);
},
onError: (error) => {
if (axios.isAxiosError(error)) {
notifications.error({
message: error.message,
});
}
},
onError: handleAxisError,
});
},
[exportDefaultQuery, history, notifications, updateDashboard],
[
exportDefaultQuery,
history,
notifications,
updateDashboard,
handleAxisError,
],
);
useEffect(() => {
@@ -437,17 +456,16 @@ function LogsExplorerViews(): JSX.Element {
<TimeSeriesView isLoading={isFetching} data={data} isError={isError} />
),
},
// TODO: temporary hide table view
// {
// label: 'Table',
// key: PANEL_TYPES.TABLE,
// children: (
// <LogsExplorerTable
// data={data?.payload.data.newResult.data.result || []}
// isLoading={isFetching}
// />
// ),
// },
{
label: 'Table',
key: PANEL_TYPES.TABLE,
children: (
<LogsExplorerTable
data={data?.payload.data.newResult.data.result || []}
isLoading={isFetching}
/>
),
},
],
[
isMultipleQueries,
@@ -513,6 +531,8 @@ function LogsExplorerViews(): JSX.Element {
onAddToQuery={handleAddToQuery}
onClickActionItem={handleAddToQuery}
/>
<GoToTop />
</>
);
}

View File

@@ -2,7 +2,10 @@ import { PANEL_TYPES } from 'constants/queryBuilder';
import { Widgets } from 'types/api/dashboard/getAll';
import { v4 } from 'uuid';
export const getWidgetQueryBuilder = (query: Widgets['query']): Widgets => ({
export const getWidgetQueryBuilder = (
query: Widgets['query'],
title = '',
): Widgets => ({
description: '',
id: v4(),
isStacked: false,
@@ -11,5 +14,5 @@ export const getWidgetQueryBuilder = (query: Widgets['query']): Widgets => ({
panelTypes: PANEL_TYPES.TIME_SERIES,
query,
timePreferance: 'GLOBAL_TIME',
title: '',
title,
});

View File

@@ -1,7 +1,10 @@
import { OPERATORS } from 'constants/queryBuilder';
import { BaseAutocompleteData } from 'types/api/queryBuilder/queryAutocompleteResponse';
import { TagFilterItem } from 'types/api/queryBuilder/queryBuilderData';
import { QueryBuilderData } from 'types/common/queryBuilder';
import { DataSource, QueryBuilderData } from 'types/common/queryBuilder';
import { DataType, FORMULA, MetricsType, WidgetKeys } from '../constant';
import { IServiceName } from '../Tabs/types';
import {
getQueryBuilderQueries,
getQueryBuilderQuerieswithFormula,
@@ -12,35 +15,42 @@ export const databaseCallsRPS = ({
legend,
tagFilterItems,
}: DatabaseCallsRPSProps): QueryBuilderData => {
const metricName: BaseAutocompleteData = {
dataType: 'float64',
isColumn: true,
key: 'signoz_db_latency_count',
type: null,
};
const groupBy: BaseAutocompleteData[] = [
{ dataType: 'string', isColumn: false, key: 'db_system', type: 'tag' },
];
const itemsA: TagFilterItem[] = [
const autocompleteData: BaseAutocompleteData[] = [
{
id: '',
key: {
dataType: 'string',
isColumn: false,
key: 'service_name',
type: 'resource',
},
op: 'IN',
value: [`${servicename}`],
key: WidgetKeys.SignozDBLatencyCount,
dataType: DataType.FLOAT64,
isColumn: true,
type: null,
},
...tagFilterItems,
];
const groupBy: BaseAutocompleteData[] = [
{ dataType: DataType.STRING, isColumn: false, key: 'db_system', type: 'tag' },
];
const filterItems: TagFilterItem[][] = [
[
{
id: '',
key: {
key: WidgetKeys.Service_name,
dataType: DataType.STRING,
isColumn: false,
type: MetricsType.Resource,
},
op: OPERATORS.IN,
value: [`${servicename}`],
},
...tagFilterItems,
],
];
const legends = [legend];
return getQueryBuilderQueries({
metricName,
autocompleteData,
groupBy,
legend,
itemsA,
legends,
filterItems,
dataSource: DataSource.METRICS,
});
};
@@ -48,32 +58,29 @@ export const databaseCallsAvgDuration = ({
servicename,
tagFilterItems,
}: DatabaseCallProps): QueryBuilderData => {
const metricNameA: BaseAutocompleteData = {
dataType: 'float64',
const autocompleteDataA: BaseAutocompleteData = {
key: WidgetKeys.SignozDbLatencySum,
dataType: DataType.FLOAT64,
isColumn: true,
key: 'signoz_db_latency_sum',
type: null,
};
const metricNameB: BaseAutocompleteData = {
dataType: 'float64',
const autocompleteDataB: BaseAutocompleteData = {
key: WidgetKeys.SignozDBLatencyCount,
dataType: DataType.FLOAT64,
isColumn: true,
key: 'signoz_db_latency_count',
type: null,
};
const expression = 'A/B';
const legendFormula = 'Average Duration';
const legend = '';
const disabled = true;
const additionalItemsA: TagFilterItem[] = [
{
id: '',
key: {
dataType: 'string',
key: WidgetKeys.Service_name,
dataType: DataType.STRING,
isColumn: false,
key: 'service_name',
type: 'resource',
type: MetricsType.Resource,
},
op: 'IN',
op: OPERATORS.IN,
value: [`${servicename}`],
},
...tagFilterItems,
@@ -81,14 +88,14 @@ export const databaseCallsAvgDuration = ({
const additionalItemsB = additionalItemsA;
return getQueryBuilderQuerieswithFormula({
metricNameA,
metricNameB,
autocompleteDataA,
autocompleteDataB,
additionalItemsA,
additionalItemsB,
legend,
disabled,
expression,
legendFormula,
legend: '',
disabled: true,
expression: FORMULA.DATABASE_CALLS_AVG_DURATION,
legendFormula: 'Average Duration',
});
};
@@ -97,6 +104,6 @@ interface DatabaseCallsRPSProps extends DatabaseCallProps {
}
interface DatabaseCallProps {
servicename: string | undefined;
servicename: IServiceName['servicename'];
tagFilterItems: TagFilterItem[];
}
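Note the shape these helpers now hand to getQueryBuilderQueries: index-aligned arrays (one autocomplete attribute, one legend, and one TagFilterItem[] per generated query) plus an explicit data source. A single-query sketch with a hypothetical legend and no extra filters, reusing the same import paths as the helpers in this file:

import { BaseAutocompleteData } from 'types/api/queryBuilder/queryAutocompleteResponse';
import { TagFilterItem } from 'types/api/queryBuilder/queryBuilderData';
import { DataSource } from 'types/common/queryBuilder';
import { DataType, WidgetKeys } from '../constant';
import { getQueryBuilderQueries } from './MetricsPageQueriesFactory';

// Arrays are index-aligned: autocompleteData[0], legends[0] and filterItems[0]
// together describe the single query this sketch builds.
const dbCallCount: BaseAutocompleteData = {
	key: WidgetKeys.SignozDBLatencyCount,
	dataType: DataType.FLOAT64,
	isColumn: true,
	type: null,
};
const filterItems: TagFilterItem[][] = [[]]; // no extra filters in this sketch

const singleQuery = getQueryBuilderQueries({
	autocompleteData: [dbCallCount],
	groupBy: [],
	legends: ['Calls'], // hypothetical legend
	filterItems,
	dataSource: DataSource.METRICS,
});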

View File

@@ -1,14 +1,22 @@
import { OPERATORS } from 'constants/queryBuilder';
import { BaseAutocompleteData } from 'types/api/queryBuilder/queryAutocompleteResponse';
import { TagFilterItem } from 'types/api/queryBuilder/queryBuilderData';
import { QueryBuilderData } from 'types/common/queryBuilder';
import { DataSource, QueryBuilderData } from 'types/common/queryBuilder';
import { DataType, FORMULA, MetricsType, WidgetKeys } from '../constant';
import { IServiceName } from '../Tabs/types';
import {
getQueryBuilderQueries,
getQueryBuilderQuerieswithFormula,
} from './MetricsPageQueriesFactory';
const groupBy: BaseAutocompleteData[] = [
{ dataType: 'string', isColumn: false, key: 'address', type: 'tag' },
{
dataType: DataType.STRING,
isColumn: false,
key: WidgetKeys.Address,
type: MetricsType.Tag,
},
];
export const externalCallErrorPercent = ({
@@ -16,39 +24,39 @@ export const externalCallErrorPercent = ({
legend,
tagFilterItems,
}: ExternalCallDurationByAddressProps): QueryBuilderData => {
const metricNameA: BaseAutocompleteData = {
dataType: 'float64',
const autocompleteDataA: BaseAutocompleteData = {
key: WidgetKeys.SignozExternalCallLatencyCount,
dataType: DataType.FLOAT64,
isColumn: true,
key: 'signoz_external_call_latency_count',
type: null,
};
const metricNameB: BaseAutocompleteData = {
dataType: 'float64',
const autocompleteDataB: BaseAutocompleteData = {
key: WidgetKeys.SignozExternalCallLatencyCount,
dataType: DataType.FLOAT64,
isColumn: true,
key: 'signoz_external_call_latency_count',
type: null,
};
const additionalItemsA: TagFilterItem[] = [
{
id: '',
key: {
dataType: 'string',
key: WidgetKeys.Service_name,
dataType: DataType.STRING,
isColumn: false,
key: 'service_name',
type: 'resource',
type: MetricsType.Resource,
},
op: 'IN',
op: OPERATORS.IN,
value: [`${servicename}`],
},
{
id: '',
key: {
dataType: 'int64',
key: WidgetKeys.StatusCode,
dataType: DataType.INT64,
isColumn: false,
key: 'status_code',
type: 'tag',
type: MetricsType.Tag,
},
op: 'IN',
op: OPERATORS.IN,
value: ['STATUS_CODE_ERROR'],
},
...tagFilterItems,
@@ -57,22 +65,22 @@ export const externalCallErrorPercent = ({
{
id: '',
key: {
dataType: 'string',
key: WidgetKeys.Service_name,
dataType: DataType.STRING,
isColumn: false,
key: 'service_name',
type: 'resource',
type: MetricsType.Resource,
},
op: 'IN',
op: OPERATORS.IN,
value: [`${servicename}`],
},
...tagFilterItems,
];
const legendFormula = legend;
const expression = 'A*100/B';
const expression = FORMULA.ERROR_PERCENTAGE;
const disabled = true;
return getQueryBuilderQuerieswithFormula({
metricNameA,
metricNameB,
autocompleteDataA,
autocompleteDataB,
additionalItemsA,
additionalItemsB,
legend,
@@ -87,19 +95,19 @@ export const externalCallDuration = ({
servicename,
tagFilterItems,
}: ExternalCallProps): QueryBuilderData => {
const metricNameA: BaseAutocompleteData = {
dataType: 'float64',
const autocompleteDataA: BaseAutocompleteData = {
dataType: DataType.FLOAT64,
isColumn: true,
key: 'signoz_external_call_latency_sum',
key: WidgetKeys.SignozExternalCallLatencySum,
type: null,
};
const metricNameB: BaseAutocompleteData = {
dataType: 'float64',
const autocompleteDataB: BaseAutocompleteData = {
dataType: DataType.FLOAT64,
isColumn: true,
key: 'signoz_external_call_latency_count',
key: WidgetKeys.SignozExternalCallLatencyCount,
type: null,
};
const expression = 'A/B';
const expression = FORMULA.DATABASE_CALLS_AVG_DURATION;
const legendFormula = 'Average Duration';
const legend = '';
const disabled = true;
@@ -107,12 +115,12 @@ export const externalCallDuration = ({
{
id: '',
key: {
dataType: 'string',
dataType: DataType.STRING,
isColumn: false,
key: 'service_name',
type: 'resource',
key: WidgetKeys.Service_name,
type: MetricsType.Resource,
},
op: 'IN',
op: OPERATORS.IN,
value: [`${servicename}`],
},
...tagFilterItems,
@@ -120,8 +128,8 @@ export const externalCallDuration = ({
const additionalItemsB = additionalItemsA;
return getQueryBuilderQuerieswithFormula({
metricNameA,
metricNameB,
autocompleteDataA,
autocompleteDataB,
additionalItemsA,
additionalItemsB,
legend,
@@ -136,31 +144,38 @@ export const externalCallRpsByAddress = ({
legend,
tagFilterItems,
}: ExternalCallDurationByAddressProps): QueryBuilderData => {
const metricName: BaseAutocompleteData = {
dataType: 'float64',
isColumn: true,
key: 'signoz_external_call_latency_count',
type: null,
};
const itemsA: TagFilterItem[] = [
const autocompleteData: BaseAutocompleteData[] = [
{
id: '',
key: {
dataType: 'string',
isColumn: false,
key: 'service_name',
type: 'resource',
},
op: 'IN',
value: [`${servicename}`],
dataType: DataType.FLOAT64,
isColumn: true,
key: WidgetKeys.SignozExternalCallLatencyCount,
type: null,
},
...tagFilterItems,
];
const filterItems: TagFilterItem[][] = [
[
{
id: '',
key: {
dataType: DataType.STRING,
isColumn: false,
key: WidgetKeys.Service_name,
type: MetricsType.Resource,
},
op: OPERATORS.IN,
value: [`${servicename}`],
},
...tagFilterItems,
],
];
const legends: string[] = [legend];
return getQueryBuilderQueries({
metricName,
autocompleteData,
groupBy,
legend,
itemsA,
legends,
filterItems,
dataSource: DataSource.METRICS,
});
};
@@ -169,31 +184,31 @@ export const externalCallDurationByAddress = ({
legend,
tagFilterItems,
}: ExternalCallDurationByAddressProps): QueryBuilderData => {
const metricNameA: BaseAutocompleteData = {
dataType: 'float64',
const autocompleteDataA: BaseAutocompleteData = {
dataType: DataType.FLOAT64,
isColumn: true,
key: 'signoz_external_call_latency_sum',
key: WidgetKeys.SignozExternalCallLatencySum,
type: null,
};
const metricNameB: BaseAutocompleteData = {
dataType: 'float64',
const autocompleteDataB: BaseAutocompleteData = {
dataType: DataType.FLOAT64,
isColumn: true,
key: 'signoz_external_call_latency_count',
key: WidgetKeys.SignozExternalCallLatencyCount,
type: null,
};
const expression = 'A/B';
const expression = FORMULA.DATABASE_CALLS_AVG_DURATION;
const legendFormula = legend;
const disabled = true;
const additionalItemsA: TagFilterItem[] = [
{
id: '',
key: {
dataType: 'string',
dataType: DataType.STRING,
isColumn: false,
key: 'service_name',
type: 'resource',
key: WidgetKeys.Service_name,
type: MetricsType.Resource,
},
op: 'IN',
op: OPERATORS.IN,
value: [`${servicename}`],
},
...tagFilterItems,
@@ -201,8 +216,8 @@ export const externalCallDurationByAddress = ({
const additionalItemsB = additionalItemsA;
return getQueryBuilderQuerieswithFormula({
metricNameA,
metricNameB,
autocompleteDataA,
autocompleteDataB,
additionalItemsA,
additionalItemsB,
legend,
@@ -218,6 +233,6 @@ interface ExternalCallDurationByAddressProps extends ExternalCallProps {
}
export interface ExternalCallProps {
servicename: string | undefined;
servicename: IServiceName['servicename'];
tagFilterItems: TagFilterItem[];
}

View File

@@ -5,44 +5,64 @@ import {
import getStep from 'lib/getStep';
import store from 'store';
import { BaseAutocompleteData } from 'types/api/queryBuilder/queryAutocompleteResponse';
import { TagFilterItem } from 'types/api/queryBuilder/queryBuilderData';
import {
IBuilderQuery,
TagFilterItem,
} from 'types/api/queryBuilder/queryBuilderData';
import {
DataSource,
MetricAggregateOperator,
QueryBuilderData,
} from 'types/common/queryBuilder';
export const getQueryBuilderQueries = ({
metricName,
autocompleteData,
groupBy = [],
legend,
itemsA,
legends,
filterItems,
aggregateOperator,
dataSource,
queryNameAndExpression,
}: BuilderQueriesProps): QueryBuilderData => ({
queryFormulas: [],
queryData: [
{
queryData: autocompleteData.map((item, index) => {
const newQueryData: IBuilderQuery = {
...initialQueryBuilderFormValuesMap.metrics,
aggregateOperator: MetricAggregateOperator.SUM_RATE,
aggregateOperator: ((): string => {
if (aggregateOperator) {
return aggregateOperator[index];
}
return MetricAggregateOperator.SUM_RATE;
})(),
disabled: false,
groupBy,
aggregateAttribute: metricName,
legend,
aggregateAttribute: item,
legend: legends[index],
stepInterval: getStep({
end: store.getState().globalTime.maxTime,
inputFormat: 'ns',
start: store.getState().globalTime.minTime,
}),
reduceTo: 'sum',
filters: {
items: itemsA,
items: filterItems[index],
op: 'AND',
},
},
],
reduceTo: 'sum',
dataSource,
};
if (queryNameAndExpression) {
newQueryData.queryName = queryNameAndExpression[index];
newQueryData.expression = queryNameAndExpression[index];
}
return newQueryData;
}),
});
export const getQueryBuilderQuerieswithFormula = ({
metricNameA,
metricNameB,
autocompleteDataA,
autocompleteDataB,
additionalItemsA,
additionalItemsB,
legend,
@@ -65,7 +85,7 @@ export const getQueryBuilderQuerieswithFormula = ({
disabled,
groupBy,
legend,
aggregateAttribute: metricNameA,
aggregateAttribute: autocompleteDataA,
reduceTo: 'sum',
filters: {
items: additionalItemsA,
@@ -83,7 +103,7 @@ export const getQueryBuilderQuerieswithFormula = ({
disabled,
groupBy,
legend,
aggregateAttribute: metricNameB,
aggregateAttribute: autocompleteDataB,
queryName: 'B',
expression: 'B',
reduceTo: 'sum',
@@ -101,15 +121,18 @@ export const getQueryBuilderQuerieswithFormula = ({
});
interface BuilderQueriesProps {
metricName: BaseAutocompleteData;
autocompleteData: BaseAutocompleteData[];
groupBy?: BaseAutocompleteData[];
legend: string;
itemsA: TagFilterItem[];
legends: string[];
filterItems: TagFilterItem[][];
aggregateOperator?: string[];
dataSource: DataSource;
queryNameAndExpression?: string[];
}
interface BuilderQuerieswithFormulaProps {
metricNameA: BaseAutocompleteData;
metricNameB: BaseAutocompleteData;
autocompleteDataA: BaseAutocompleteData;
autocompleteDataB: BaseAutocompleteData;
legend: string;
disabled: boolean;
groupBy?: BaseAutocompleteData[];
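One detail of the refactored factory worth calling out: aggregateOperator and queryNameAndExpression are optional arrays aligned with autocompleteData. When supplied, entry i overrides the operator and the query name/expression of query i; when omitted, every query keeps SUM_RATE and the defaults spread in from initialQueryBuilderFormValuesMap.metrics. A two-query sketch under those assumptions (the attribute, legends and operators are placeholders modelled on the latency constants added below):

import { BaseAutocompleteData } from 'types/api/queryBuilder/queryAutocompleteResponse';
import { TagFilterItem } from 'types/api/queryBuilder/queryBuilderData';
import { DataSource } from 'types/common/queryBuilder';
import { DataType, WidgetKeys } from '../constant';
import { getQueryBuilderQueries } from './MetricsPageQueriesFactory';

const attribute: BaseAutocompleteData = {
	key: WidgetKeys.Signoz_latency_bucket,
	dataType: DataType.FLOAT64,
	isColumn: true,
	type: null,
};
const filterItems: TagFilterItem[][] = [[], []]; // one (empty) filter list per query

const quantiles = getQueryBuilderQueries({
	autocompleteData: [attribute, attribute],
	legends: ['p50', 'p99'], // placeholder legends
	filterItems,
	aggregateOperator: ['p50', 'p99'], // per-query operators; omit to fall back to SUM_RATE
	queryNameAndExpression: ['A', 'B'], // per-query names/expressions; omit to keep the form defaults
	dataSource: DataSource.METRICS,
});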

View File

@@ -1,55 +1,131 @@
import { OPERATORS } from 'constants/queryBuilder';
import { BaseAutocompleteData } from 'types/api/queryBuilder/queryAutocompleteResponse';
import { TagFilterItem } from 'types/api/queryBuilder/queryBuilderData';
import { QueryBuilderData } from 'types/common/queryBuilder';
import { DataSource, QueryBuilderData } from 'types/common/queryBuilder';
import {
DataType,
FORMULA,
GraphTitle,
LATENCY_AGGREGATEOPERATOR,
LATENCY_AGGREGATEOPERATOR_SPAN_METRICS,
MetricsType,
OPERATION_LEGENDS,
QUERYNAME_AND_EXPRESSION,
WidgetKeys,
} from '../constant';
import { IServiceName } from '../Tabs/types';
import {
getQueryBuilderQueries,
getQueryBuilderQuerieswithFormula,
} from './MetricsPageQueriesFactory';
export const latency = ({
servicename,
tagFilterItems,
isSpanMetricEnable = false,
topLevelOperationsRoute,
}: LatencyProps): QueryBuilderData => {
const newAutoCompleteData: BaseAutocompleteData = {
key: isSpanMetricEnable
? WidgetKeys.Signoz_latency_bucket
: WidgetKeys.DurationNano,
dataType: DataType.FLOAT64,
isColumn: true,
type: isSpanMetricEnable ? null : MetricsType.Tag,
};
const autocompleteData: BaseAutocompleteData[] = Array(3).fill(
newAutoCompleteData,
);
const filterItem: TagFilterItem[] = [
{
id: '',
key: {
key: isSpanMetricEnable ? WidgetKeys.Service_name : WidgetKeys.ServiceName,
dataType: DataType.STRING,
type: isSpanMetricEnable ? MetricsType.Resource : MetricsType.Tag,
isColumn: !isSpanMetricEnable,
},
op: isSpanMetricEnable ? OPERATORS.IN : OPERATORS['='],
value: isSpanMetricEnable ? [servicename] : servicename,
},
{
id: '',
key: {
dataType: DataType.STRING,
isColumn: !isSpanMetricEnable,
key: isSpanMetricEnable ? WidgetKeys.Operation : WidgetKeys.Name,
type: MetricsType.Tag,
},
op: OPERATORS.IN.toLowerCase(), // TODO: remove toLowerCase() once the backend is changed
value: [...topLevelOperationsRoute],
},
...tagFilterItems,
];
const filterItems: TagFilterItem[][] = Array(3).fill([...filterItem]);
return getQueryBuilderQueries({
autocompleteData,
legends: LATENCY_AGGREGATEOPERATOR,
filterItems,
aggregateOperator: isSpanMetricEnable
? LATENCY_AGGREGATEOPERATOR_SPAN_METRICS
: LATENCY_AGGREGATEOPERATOR,
dataSource: isSpanMetricEnable ? DataSource.METRICS : DataSource.TRACES,
queryNameAndExpression: QUERYNAME_AND_EXPRESSION,
});
};
export const operationPerSec = ({
servicename,
tagFilterItems,
topLevelOperations,
}: OperationPerSecProps): QueryBuilderData => {
const metricName: BaseAutocompleteData = {
dataType: 'float64',
isColumn: true,
key: 'signoz_latency_count',
type: null,
};
const legend = 'Operations';
const autocompleteData: BaseAutocompleteData[] = [
{
key: WidgetKeys.SignozLatencyCount,
dataType: DataType.FLOAT64,
isColumn: true,
type: null,
},
];
const itemsA: TagFilterItem[] = [
{
id: '',
key: {
dataType: 'string',
isColumn: false,
key: 'service_name',
type: 'resource',
const filterItems: TagFilterItem[][] = [
[
{
id: '',
key: {
key: WidgetKeys.Service_name,
dataType: DataType.STRING,
isColumn: false,
type: MetricsType.Resource,
},
op: OPERATORS.IN,
value: [`${servicename}`],
},
op: 'IN',
value: [`${servicename}`],
},
{
id: '',
key: {
dataType: 'string',
isColumn: false,
key: 'operation',
type: 'tag',
{
id: '',
key: {
key: WidgetKeys.Operation,
dataType: DataType.STRING,
isColumn: false,
type: MetricsType.Tag,
},
op: OPERATORS.IN,
value: topLevelOperations,
},
op: 'IN',
value: topLevelOperations,
},
...tagFilterItems,
...tagFilterItems,
],
];
return getQueryBuilderQueries({
metricName,
legend,
itemsA,
autocompleteData,
legends: OPERATION_LEGENDS,
filterItems,
dataSource: DataSource.METRICS,
});
};
@@ -58,50 +134,50 @@ export const errorPercentage = ({
tagFilterItems,
topLevelOperations,
}: OperationPerSecProps): QueryBuilderData => {
const metricNameA: BaseAutocompleteData = {
dataType: 'float64',
const autocompleteDataA: BaseAutocompleteData = {
key: WidgetKeys.SignozCallsTotal,
dataType: DataType.FLOAT64,
isColumn: true,
key: 'signoz_calls_total',
type: null,
};
const metricNameB: BaseAutocompleteData = {
dataType: 'float64',
const autocompleteDataB: BaseAutocompleteData = {
key: WidgetKeys.SignozCallsTotal,
dataType: DataType.FLOAT64,
isColumn: true,
key: 'signoz_calls_total',
type: null,
};
const additionalItemsA: TagFilterItem[] = [
{
id: '',
key: {
dataType: 'string',
key: WidgetKeys.Service_name,
dataType: DataType.STRING,
isColumn: false,
key: 'service_name',
type: 'resource',
type: MetricsType.Resource,
},
op: 'IN',
op: OPERATORS.IN,
value: [`${servicename}`],
},
{
id: '',
key: {
dataType: 'string',
key: WidgetKeys.Operation,
dataType: DataType.STRING,
isColumn: false,
key: 'operation',
type: 'tag',
type: MetricsType.Tag,
},
op: 'IN',
op: OPERATORS.IN,
value: topLevelOperations,
},
{
id: '',
key: {
dataType: 'int64',
key: WidgetKeys.StatusCode,
dataType: DataType.INT64,
isColumn: false,
key: 'status_code',
type: 'tag',
type: MetricsType.Tag,
},
op: 'IN',
op: OPERATORS.IN,
value: ['STATUS_CODE_ERROR'],
},
...tagFilterItems,
@@ -111,46 +187,49 @@ export const errorPercentage = ({
{
id: '',
key: {
dataType: 'string',
key: WidgetKeys.Service_name,
dataType: DataType.STRING,
isColumn: false,
key: 'service_name',
type: 'resource',
type: MetricsType.Resource,
},
op: 'IN',
op: OPERATORS.IN,
value: [`${servicename}`],
},
{
id: '',
key: {
dataType: 'string',
key: WidgetKeys.Operation,
dataType: DataType.STRING,
isColumn: false,
key: 'operation',
type: 'tag',
type: MetricsType.Tag,
},
op: 'IN',
op: OPERATORS.IN,
value: topLevelOperations,
},
...tagFilterItems,
];
const legendFormula = 'Error Percentage';
const legend = legendFormula;
const expression = 'A*100/B';
const disabled = true;
return getQueryBuilderQuerieswithFormula({
metricNameA,
metricNameB,
autocompleteDataA,
autocompleteDataB,
additionalItemsA,
additionalItemsB,
legend,
disabled,
expression,
legendFormula,
legend: GraphTitle.ERROR_PERCENTAGE,
disabled: true,
expression: FORMULA.ERROR_PERCENTAGE,
legendFormula: GraphTitle.ERROR_PERCENTAGE,
});
};
export interface OperationPerSecProps {
servicename: string | undefined;
servicename: IServiceName['servicename'];
tagFilterItems: TagFilterItem[];
topLevelOperations: string[];
}
export interface LatencyProps {
servicename: IServiceName['servicename'];
tagFilterItems: TagFilterItem[];
isSpanMetricEnable?: boolean;
topLevelOperationsRoute: string[];
}
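A consumer-side sketch of the new latency() helper, modelled on the ServiceOverview component added later in this diff. The USE_SPAN_METRICS feature flag decides whether the three generated queries aggregate hist_quantile_* over signoz_latency_bucket (metrics) or p50/p90/p99 over durationNano (traces); the service name and operations below are placeholders.

import { GraphTitle } from 'container/MetricsApplication/constant';
import { getWidgetQueryBuilder } from 'container/MetricsApplication/MetricsApplication.factory';
import { latency } from 'container/MetricsApplication/MetricsPageQueries/OverviewQueries';
import { EQueryType } from 'types/common/dashboard';
import { v4 as uuid } from 'uuid';

const latencyWidget = getWidgetQueryBuilder(
	{
		queryType: EQueryType.QUERY_BUILDER,
		promql: [],
		builder: latency({
			servicename: 'frontend', // placeholder service name
			tagFilterItems: [],
			isSpanMetricEnable: false, // false => traces data source with p50/p90/p99
			topLevelOperationsRoute: ['HTTP GET /dispatch'], // placeholder operations
		}),
		clickhouse_sql: [],
		id: uuid(),
	},
	GraphTitle.LATENCY,
);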

View File

@@ -1,5 +1,5 @@
import { Col } from 'antd';
import FullView from 'container/GridGraphLayout/Graph/FullView/index.metricsBuilder';
import Graph from 'container/GridGraphLayout/Graph/';
import {
databaseCallsAvgDuration,
databaseCallsRPS,
@@ -15,9 +15,11 @@ import { TagFilterItem } from 'types/api/queryBuilder/queryBuilderData';
import { EQueryType } from 'types/common/dashboard';
import { v4 as uuid } from 'uuid';
import { GraphTitle } from '../constant';
import { getWidgetQueryBuilder } from '../MetricsApplication.factory';
import { Card, GraphContainer, GraphTitle, Row } from '../styles';
import { Card, GraphContainer, Row } from '../styles';
import { Button } from './styles';
import { IServiceName } from './types';
import {
dbSystemTags,
handleNonInQueryRange,
@@ -26,7 +28,7 @@ import {
} from './util';
function DBCall(): JSX.Element {
const { servicename } = useParams<{ servicename?: string }>();
const { servicename } = useParams<IServiceName>();
const [selectedTimeStamp, setSelectedTimeStamp] = useState<number>(0);
const { queries } = useResourceAttribute();
@@ -48,31 +50,37 @@ function DBCall(): JSX.Element {
const databaseCallsRPSWidget = useMemo(
() =>
getWidgetQueryBuilder({
queryType: EQueryType.QUERY_BUILDER,
promql: [],
builder: databaseCallsRPS({
servicename,
legend,
tagFilterItems,
}),
clickhouse_sql: [],
id: uuid(),
}),
getWidgetQueryBuilder(
{
queryType: EQueryType.QUERY_BUILDER,
promql: [],
builder: databaseCallsRPS({
servicename,
legend,
tagFilterItems,
}),
clickhouse_sql: [],
id: uuid(),
},
GraphTitle.DATABASE_CALLS_RPS,
),
[servicename, tagFilterItems],
);
const databaseCallsAverageDurationWidget = useMemo(
() =>
getWidgetQueryBuilder({
queryType: EQueryType.QUERY_BUILDER,
promql: [],
builder: databaseCallsAvgDuration({
servicename,
tagFilterItems,
}),
clickhouse_sql: [],
id: uuid(),
}),
getWidgetQueryBuilder(
{
queryType: EQueryType.QUERY_BUILDER,
promql: [],
builder: databaseCallsAvgDuration({
servicename,
tagFilterItems,
}),
clickhouse_sql: [],
id: uuid(),
},
GraphTitle.DATABASE_CALLS_AVG_DURATION,
),
[servicename, tagFilterItems],
);
@@ -92,11 +100,9 @@ function DBCall(): JSX.Element {
View Traces
</Button>
<Card>
<GraphTitle>Database Calls RPS</GraphTitle>
<GraphContainer>
<FullView
<Graph
name="database_call_rps"
fullViewOptions={false}
widget={databaseCallsRPSWidget}
yAxisUnit="reqps"
onClickHandler={(ChartEvent, activeElements, chart, data): void => {
@@ -108,6 +114,9 @@ function DBCall(): JSX.Element {
'database_call_rps',
);
}}
allowClone={false}
allowDelete={false}
allowEdit={false}
/>
</GraphContainer>
</Card>
@@ -127,11 +136,9 @@ function DBCall(): JSX.Element {
View Traces
</Button>
<Card>
<GraphTitle>Database Calls Avg Duration</GraphTitle>
<GraphContainer>
<FullView
<Graph
name="database_call_avg_duration"
fullViewOptions={false}
widget={databaseCallsAverageDurationWidget}
yAxisUnit="ms"
onClickHandler={(ChartEvent, activeElements, chart, data): void => {
@@ -143,6 +150,9 @@ function DBCall(): JSX.Element {
'database_call_avg_duration',
);
}}
allowClone={false}
allowDelete={false}
allowEdit={false}
/>
</GraphContainer>
</Card>

View File

@@ -1,5 +1,5 @@
import { Col } from 'antd';
import FullView from 'container/GridGraphLayout/Graph/FullView/index.metricsBuilder';
import Graph from 'container/GridGraphLayout/Graph/';
import {
externalCallDuration,
externalCallDurationByAddress,
@@ -16,10 +16,11 @@ import { useParams } from 'react-router-dom';
import { EQueryType } from 'types/common/dashboard';
import { v4 as uuid } from 'uuid';
import { GraphTitle, legend } from '../constant';
import { getWidgetQueryBuilder } from '../MetricsApplication.factory';
import { Card, GraphContainer, GraphTitle, Row } from '../styles';
import { legend } from './constant';
import { Card, GraphContainer, Row } from '../styles';
import { Button } from './styles';
import { IServiceName } from './types';
import {
handleNonInQueryRange,
onGraphClickHandler,
@@ -29,7 +30,7 @@ import {
function External(): JSX.Element {
const [selectedTimeStamp, setSelectedTimeStamp] = useState<number>(0);
const { servicename } = useParams<{ servicename?: string }>();
const { servicename } = useParams<IServiceName>();
const { queries } = useResourceAttribute();
const tagFilterItems = useMemo(
@@ -40,17 +41,20 @@ function External(): JSX.Element {
const externalCallErrorWidget = useMemo(
() =>
getWidgetQueryBuilder({
queryType: EQueryType.QUERY_BUILDER,
promql: [],
builder: externalCallErrorPercent({
servicename,
legend: legend.address,
tagFilterItems,
}),
clickhouse_sql: [],
id: uuid(),
}),
getWidgetQueryBuilder(
{
queryType: EQueryType.QUERY_BUILDER,
promql: [],
builder: externalCallErrorPercent({
servicename,
legend: legend.address,
tagFilterItems,
}),
clickhouse_sql: [],
id: uuid(),
},
GraphTitle.EXTERNAL_CALL_ERROR_PERCENTAGE,
),
[servicename, tagFilterItems],
);
@@ -61,48 +65,57 @@ function External(): JSX.Element {
const externalCallDurationWidget = useMemo(
() =>
getWidgetQueryBuilder({
queryType: EQueryType.QUERY_BUILDER,
promql: [],
builder: externalCallDuration({
servicename,
tagFilterItems,
}),
clickhouse_sql: [],
id: uuid(),
}),
getWidgetQueryBuilder(
{
queryType: EQueryType.QUERY_BUILDER,
promql: [],
builder: externalCallDuration({
servicename,
tagFilterItems,
}),
clickhouse_sql: [],
id: uuid(),
},
GraphTitle.EXTERNAL_CALL_DURATION,
),
[servicename, tagFilterItems],
);
const externalCallRPSWidget = useMemo(
() =>
getWidgetQueryBuilder({
queryType: EQueryType.QUERY_BUILDER,
promql: [],
builder: externalCallRpsByAddress({
servicename,
legend: legend.address,
tagFilterItems,
}),
clickhouse_sql: [],
id: uuid(),
}),
getWidgetQueryBuilder(
{
queryType: EQueryType.QUERY_BUILDER,
promql: [],
builder: externalCallRpsByAddress({
servicename,
legend: legend.address,
tagFilterItems,
}),
clickhouse_sql: [],
id: uuid(),
},
GraphTitle.EXTERNAL_CALL_RPS_BY_ADDRESS,
),
[servicename, tagFilterItems],
);
const externalCallDurationAddressWidget = useMemo(
() =>
getWidgetQueryBuilder({
queryType: EQueryType.QUERY_BUILDER,
promql: [],
builder: externalCallDurationByAddress({
servicename,
legend: legend.address,
tagFilterItems,
}),
clickhouse_sql: [],
id: uuid(),
}),
getWidgetQueryBuilder(
{
queryType: EQueryType.QUERY_BUILDER,
promql: [],
builder: externalCallDurationByAddress({
servicename,
legend: legend.address,
tagFilterItems,
}),
clickhouse_sql: [],
id: uuid(),
},
GraphTitle.EXTERNAL_CALL_DURATION_BY_ADDRESS,
),
[servicename, tagFilterItems],
);
@@ -124,11 +137,9 @@ function External(): JSX.Element {
View Traces
</Button>
<Card>
<GraphTitle>External Call Error Percentage</GraphTitle>
<GraphContainer>
<FullView
<Graph
name="external_call_error_percentage"
fullViewOptions={false}
widget={externalCallErrorWidget}
yAxisUnit="%"
onClickHandler={(ChartEvent, activeElements, chart, data): void => {
@@ -140,6 +151,9 @@ function External(): JSX.Element {
'external_call_error_percentage',
);
}}
allowClone={false}
allowDelete={false}
allowEdit={false}
/>
</GraphContainer>
</Card>
@@ -161,11 +175,9 @@ function External(): JSX.Element {
</Button>
<Card>
<GraphTitle>External Call duration</GraphTitle>
<GraphContainer>
<FullView
<Graph
name="external_call_duration"
fullViewOptions={false}
widget={externalCallDurationWidget}
yAxisUnit="ms"
onClickHandler={(ChartEvent, activeElements, chart, data): void => {
@@ -177,6 +189,9 @@ function External(): JSX.Element {
'external_call_duration',
);
}}
allowClone={false}
allowDelete={false}
allowEdit={false}
/>
</GraphContainer>
</Card>
@@ -199,11 +214,9 @@ function External(): JSX.Element {
View Traces
</Button>
<Card>
<GraphTitle>External Call RPS(by Address)</GraphTitle>
<GraphContainer>
<FullView
<Graph
name="external_call_rps_by_address"
fullViewOptions={false}
widget={externalCallRPSWidget}
yAxisUnit="reqps"
onClickHandler={(ChartEvent, activeElements, chart, data): void => {
@@ -215,6 +228,9 @@ function External(): JSX.Element {
'external_call_rps_by_address',
);
}}
allowClone={false}
allowDelete={false}
allowEdit={false}
/>
</GraphContainer>
</Card>
@@ -236,11 +252,9 @@ function External(): JSX.Element {
</Button>
<Card>
<GraphTitle>External Call duration(by Address)</GraphTitle>
<GraphContainer>
<FullView
<Graph
name="external_call_duration_by_address"
fullViewOptions={false}
widget={externalCallDurationAddressWidget}
yAxisUnit="ms"
onClickHandler={(ChartEvent, activeElements, chart, data): void => {
@@ -252,6 +266,9 @@ function External(): JSX.Element {
'external_call_duration_by_address',
);
}}
allowClone={false}
allowDelete={false}
allowEdit={false}
/>
</GraphContainer>
</Card>

View File

@@ -1,16 +1,9 @@
import { Typography } from 'antd';
import getServiceOverview from 'api/metrics/getServiceOverview';
import getTopLevelOperations, {
ServiceDataProps,
} from 'api/metrics/getTopLevelOperations';
import getTopOperations from 'api/metrics/getTopOperations';
import axios from 'axios';
import { ActiveElement, Chart, ChartData, ChartEvent } from 'chart.js';
import Graph from 'components/Graph';
import Spinner from 'components/Spinner';
import { QueryParams } from 'constants/query';
import ROUTES from 'constants/routes';
import FullView from 'container/GridGraphLayout/Graph/FullView/index.metricsBuilder';
import { routeConfig } from 'container/SideNav/config';
import { getQueryString } from 'container/SideNav/helper';
import useResourceAttribute from 'hooks/useResourceAttribute';
@@ -18,32 +11,30 @@ import {
convertRawQueriesToTraceSelectedTags,
resourceAttributesToTagFilterItems,
} from 'hooks/useResourceAttribute/utils';
import convertToNanoSecondsToSecond from 'lib/convertToNanoSecondsToSecond';
import { colors } from 'lib/getRandomColor';
import getStep from 'lib/getStep';
import history from 'lib/history';
import { useCallback, useMemo, useState } from 'react';
import { useQueries, UseQueryResult } from 'react-query';
import { useQuery } from 'react-query';
import { useDispatch, useSelector } from 'react-redux';
import { useLocation, useParams } from 'react-router-dom';
import { UpdateTimeInterval } from 'store/actions';
import { AppState } from 'store/reducers';
import { PayloadProps } from 'types/api/metrics/getServiceOverview';
import { PayloadProps as PayloadPropsTopOpertions } from 'types/api/metrics/getTopOperations';
import { EQueryType } from 'types/common/dashboard';
import { GlobalReducer } from 'types/reducer/globalTime';
import { Tags } from 'types/reducer/trace';
import { v4 as uuid } from 'uuid';
import { SOMETHING_WENT_WRONG } from '../../../constants/api';
import { GraphTitle } from '../constant';
import { getWidgetQueryBuilder } from '../MetricsApplication.factory';
import {
errorPercentage,
operationPerSec,
} from '../MetricsPageQueries/OverviewQueries';
import { Card, Col, GraphContainer, GraphTitle, Row } from '../styles';
import TopOperationsTable from '../TopOperationsTable';
import { Col, Row } from '../styles';
import ServiceOverview from './Overview/ServiceOverview';
import TopLevelOperation from './Overview/TopLevelOperations';
import TopOperation from './Overview/TopOperation';
import { Button } from './styles';
import { IServiceName } from './types';
import {
handleNonInQueryRange,
onGraphClickHandler,
@@ -54,7 +45,7 @@ function Application(): JSX.Element {
const { maxTime, minTime } = useSelector<AppState, GlobalReducer>(
(state) => state.globalTime,
);
const { servicename } = useParams<{ servicename?: string }>();
const { servicename } = useParams<IServiceName>();
const [selectedTimeStamp, setSelectedTimeStamp] = useState<number>(0);
const { search } = useLocation();
const { queries } = useResourceAttribute();
@@ -86,53 +77,15 @@ function Application(): JSX.Element {
[handleSetTimeStamp],
);
const queryResult = useQueries<
[
UseQueryResult<PayloadProps>,
UseQueryResult<PayloadPropsTopOpertions>,
UseQueryResult<ServiceDataProps>,
]
>([
{
queryKey: [servicename, selectedTags, minTime, maxTime],
queryFn: (): Promise<PayloadProps> =>
getServiceOverview({
service: servicename || '',
start: minTime,
end: maxTime,
step: getStep({
start: minTime,
end: maxTime,
inputFormat: 'ns',
}),
selectedTags,
}),
},
{
queryKey: [minTime, maxTime, servicename, selectedTags],
queryFn: (): Promise<PayloadPropsTopOpertions> =>
getTopOperations({
service: servicename || '',
start: minTime,
end: maxTime,
selectedTags,
}),
},
{
queryKey: [servicename, minTime, maxTime, selectedTags],
queryFn: (): Promise<ServiceDataProps> => getTopLevelOperations(),
},
]);
const serviceOverview = queryResult[0].data;
const serviceOverviewError = queryResult[0].error;
const serviceOverviewIsError = queryResult[0].isError;
const serviceOverviewIsLoading = queryResult[0].isLoading;
const topOperations = queryResult[1].data;
const topLevelOperations = queryResult[2].data;
const topLevelOperationsError = queryResult[2].error;
const topLevelOperationsIsError = queryResult[2].isError;
const topLevelOperationsIsLoading = queryResult[2].isLoading;
const {
data: topLevelOperations,
isLoading: topLevelOperationsLoading,
error: topLevelOperationsError,
isError: topLevelOperationsIsError,
} = useQuery<ServiceDataProps>({
queryKey: [servicename, minTime, maxTime, selectedTags],
queryFn: getTopLevelOperations,
});
const selectedTraceTags: string = JSON.stringify(
convertRawQueriesToTraceSelectedTags(queries) || [],
@@ -144,40 +97,47 @@ function Application(): JSX.Element {
[queries],
);
const topLevelOperationsRoute = useMemo(
() => (topLevelOperations ? topLevelOperations[servicename || ''] : []),
[servicename, topLevelOperations],
);
const operationPerSecWidget = useMemo(
() =>
getWidgetQueryBuilder({
queryType: EQueryType.QUERY_BUILDER,
promql: [],
builder: operationPerSec({
servicename,
tagFilterItems,
topLevelOperations: topLevelOperations
? topLevelOperations[servicename || '']
: [],
}),
clickhouse_sql: [],
id: uuid(),
}),
[servicename, topLevelOperations, tagFilterItems],
getWidgetQueryBuilder(
{
queryType: EQueryType.QUERY_BUILDER,
promql: [],
builder: operationPerSec({
servicename,
tagFilterItems,
topLevelOperations: topLevelOperationsRoute,
}),
clickhouse_sql: [],
id: uuid(),
},
GraphTitle.RATE_PER_OPS,
),
[servicename, tagFilterItems, topLevelOperationsRoute],
);
const errorPercentageWidget = useMemo(
() =>
getWidgetQueryBuilder({
queryType: EQueryType.QUERY_BUILDER,
promql: [],
builder: errorPercentage({
servicename,
tagFilterItems,
topLevelOperations: topLevelOperations
? topLevelOperations[servicename || '']
: [],
}),
clickhouse_sql: [],
id: uuid(),
}),
[servicename, topLevelOperations, tagFilterItems],
getWidgetQueryBuilder(
{
queryType: EQueryType.QUERY_BUILDER,
promql: [],
builder: errorPercentage({
servicename,
tagFilterItems,
topLevelOperations: topLevelOperationsRoute,
}),
clickhouse_sql: [],
id: uuid(),
},
GraphTitle.ERROR_PERCENTAGE,
),
[servicename, tagFilterItems, topLevelOperationsRoute],
);
const onDragSelect = useCallback(
@@ -212,107 +172,18 @@ function Application(): JSX.Element {
);
};
const generalChartDataProperties = useCallback(
(title: string, colorIndex: number) => ({
borderColor: colors[colorIndex],
label: title,
showLine: true,
borderWidth: 1.5,
spanGaps: true,
pointRadius: 2,
pointHoverRadius: 4,
}),
[],
);
const dataSets = useMemo(() => {
if (!serviceOverview) {
return [];
}
return [
{
data: serviceOverview.map((e) =>
parseFloat(convertToNanoSecondsToSecond(e.p99)),
),
...generalChartDataProperties('p99 Latency', 0),
},
{
data: serviceOverview.map((e) =>
parseFloat(convertToNanoSecondsToSecond(e.p95)),
),
...generalChartDataProperties('p95 Latency', 1),
},
{
data: serviceOverview.map((e) =>
parseFloat(convertToNanoSecondsToSecond(e.p50)),
),
...generalChartDataProperties('p50 Latency', 2),
},
];
}, [generalChartDataProperties, serviceOverview]);
const data = useMemo(() => {
if (!serviceOverview) {
return {
datasets: [],
labels: [],
};
}
return {
datasets: dataSets,
labels: serviceOverview.map(
(e) => new Date(parseFloat(convertToNanoSecondsToSecond(e.timestamp))),
),
};
}, [serviceOverview, dataSets]);
return (
<>
<Row gutter={24}>
<Col span={12}>
<Button
type="default"
size="small"
id="Service_button"
onClick={onViewTracePopupClick({
servicename,
selectedTraceTags,
timestamp: selectedTimeStamp,
})}
>
View Traces
</Button>
<Card>
{serviceOverviewIsError ? (
<Typography>
{axios.isAxiosError(serviceOverviewError)
? serviceOverviewError.response?.data
: SOMETHING_WENT_WRONG}
</Typography>
) : (
<>
<GraphTitle>Latency</GraphTitle>
{serviceOverviewIsLoading && (
<Spinner size="large" tip="Loading..." height="40vh" />
)}
{!serviceOverviewIsLoading && (
<GraphContainer>
<Graph
animate={false}
onClickHandler={handleGraphClick('Service')}
name="service_latency"
type="line"
data={data}
yAxisUnit="ms"
onDragSelect={onDragSelect}
/>
</GraphContainer>
)}
</>
)}
</Card>
<ServiceOverview
onDragSelect={onDragSelect}
handleGraphClick={handleGraphClick}
selectedTimeStamp={selectedTimeStamp}
selectedTraceTags={selectedTraceTags}
tagFilterItems={tagFilterItems}
topLevelOperationsRoute={topLevelOperationsRoute}
/>
</Col>
<Col span={12}>
@@ -328,30 +199,17 @@ function Application(): JSX.Element {
>
View Traces
</Button>
<Card>
{topLevelOperationsIsError ? (
<Typography>
{axios.isAxiosError(topLevelOperationsError)
? topLevelOperationsError.response?.data
: SOMETHING_WENT_WRONG}
</Typography>
) : (
<>
<GraphTitle>Rate (ops/s)</GraphTitle>
<GraphContainer>
<FullView
name="operations_per_sec"
fullViewOptions={false}
onClickHandler={handleGraphClick('Rate')}
widget={operationPerSecWidget}
yAxisUnit="ops"
onDragSelect={onDragSelect}
isDependedDataLoaded={topLevelOperationsIsLoading}
/>
</GraphContainer>
</>
)}
</Card>
<TopLevelOperation
handleGraphClick={handleGraphClick}
onDragSelect={onDragSelect}
topLevelOperationsError={topLevelOperationsError}
topLevelOperationsLoading={topLevelOperationsLoading}
topLevelOperationsIsError={topLevelOperationsIsError}
name="operations_per_sec"
widget={operationPerSecWidget}
yAxisUnit="ops"
opName="Rate"
/>
</Col>
</Row>
<Row gutter={24}>
@@ -367,43 +225,28 @@ function Application(): JSX.Element {
View Traces
</Button>
<Card>
{topLevelOperationsIsError ? (
<Typography>
{axios.isAxiosError(topLevelOperationsError)
? topLevelOperationsError.response?.data
: SOMETHING_WENT_WRONG}
</Typography>
) : (
<>
<GraphTitle>Error Percentage</GraphTitle>
<GraphContainer>
<FullView
name="error_percentage_%"
fullViewOptions={false}
onClickHandler={handleGraphClick('Error')}
widget={errorPercentageWidget}
yAxisUnit="%"
onDragSelect={onDragSelect}
isDependedDataLoaded={topLevelOperationsIsLoading}
/>
</GraphContainer>
</>
)}
</Card>
<TopLevelOperation
handleGraphClick={handleGraphClick}
onDragSelect={onDragSelect}
topLevelOperationsError={topLevelOperationsError}
topLevelOperationsLoading={topLevelOperationsLoading}
topLevelOperationsIsError={topLevelOperationsIsError}
name="error_percentage_%"
widget={errorPercentageWidget}
yAxisUnit="%"
opName="Error"
/>
</Col>
<Col span={12}>
<Card>
<TopOperationsTable data={topOperations || []} />
</Card>
<TopOperation />
</Col>
</Row>
</>
);
}
type ClickHandlerType = (
export type ClickHandlerType = (
ChartEvent: ChartEvent,
activeElements: ActiveElement[],
chart: Chart,

View File

@@ -0,0 +1,93 @@
import { FeatureKeys } from 'constants/features';
import Graph from 'container/GridGraphLayout/Graph/';
import { GraphTitle } from 'container/MetricsApplication/constant';
import { getWidgetQueryBuilder } from 'container/MetricsApplication/MetricsApplication.factory';
import { latency } from 'container/MetricsApplication/MetricsPageQueries/OverviewQueries';
import { Card, GraphContainer } from 'container/MetricsApplication/styles';
import useFeatureFlag from 'hooks/useFeatureFlag';
import { useMemo } from 'react';
import { useParams } from 'react-router-dom';
import { TagFilterItem } from 'types/api/queryBuilder/queryBuilderData';
import { EQueryType } from 'types/common/dashboard';
import { v4 as uuid } from 'uuid';
import { ClickHandlerType } from '../Overview';
import { Button } from '../styles';
import { IServiceName } from '../types';
import { onViewTracePopupClick } from '../util';
function ServiceOverview({
onDragSelect,
handleGraphClick,
selectedTraceTags,
selectedTimeStamp,
tagFilterItems,
topLevelOperationsRoute,
}: ServiceOverviewProps): JSX.Element {
const { servicename } = useParams<IServiceName>();
const isSpanMetricEnable = useFeatureFlag(FeatureKeys.USE_SPAN_METRICS)
?.active;
const latencyWidget = useMemo(
() =>
getWidgetQueryBuilder(
{
queryType: EQueryType.QUERY_BUILDER,
promql: [],
builder: latency({
servicename,
tagFilterItems,
isSpanMetricEnable,
topLevelOperationsRoute,
}),
clickhouse_sql: [],
id: uuid(),
},
GraphTitle.LATENCY,
),
[servicename, tagFilterItems, isSpanMetricEnable, topLevelOperationsRoute],
);
return (
<>
<Button
type="default"
size="small"
id="Service_button"
onClick={onViewTracePopupClick({
servicename,
selectedTraceTags,
timestamp: selectedTimeStamp,
})}
>
View Traces
</Button>
<Card>
<GraphContainer>
<Graph
name="service_latency"
onDragSelect={onDragSelect}
widget={latencyWidget}
yAxisUnit="ns"
onClickHandler={handleGraphClick('Service')}
allowClone={false}
allowDelete={false}
allowEdit={false}
/>
</GraphContainer>
</Card>
</>
);
}
interface ServiceOverviewProps {
selectedTimeStamp: number;
selectedTraceTags: string;
onDragSelect: (start: number, end: number) => void;
handleGraphClick: (type: string) => ClickHandlerType;
tagFilterItems: TagFilterItem[];
topLevelOperationsRoute: string[];
}
export default ServiceOverview;

View File

@@ -0,0 +1,65 @@
import { Typography } from 'antd';
import axios from 'axios';
import Spinner from 'components/Spinner';
import { SOMETHING_WENT_WRONG } from 'constants/api';
import Graph from 'container/GridGraphLayout/Graph/';
import { Card, GraphContainer } from 'container/MetricsApplication/styles';
import { Widgets } from 'types/api/dashboard/getAll';
import { ClickHandlerType } from '../Overview';
function TopLevelOperation({
name,
opName,
topLevelOperationsIsError,
topLevelOperationsError,
topLevelOperationsLoading,
onDragSelect,
handleGraphClick,
widget,
yAxisUnit,
}: TopLevelOperationProps): JSX.Element {
return (
<Card>
{topLevelOperationsIsError ? (
<Typography>
{axios.isAxiosError(topLevelOperationsError)
? topLevelOperationsError.response?.data
: SOMETHING_WENT_WRONG}
</Typography>
) : (
<GraphContainer>
{topLevelOperationsLoading && (
<Spinner size="large" tip="Loading..." height="40vh" />
)}
{!topLevelOperationsLoading && (
<Graph
name={name}
widget={widget}
onClickHandler={handleGraphClick(opName)}
yAxisUnit={yAxisUnit}
onDragSelect={onDragSelect}
allowClone={false}
allowDelete={false}
allowEdit={false}
/>
)}
</GraphContainer>
)}
</Card>
);
}
interface TopLevelOperationProps {
name: string;
opName: string;
topLevelOperationsIsError: boolean;
topLevelOperationsError: unknown;
topLevelOperationsLoading: boolean;
onDragSelect: (start: number, end: number) => void;
handleGraphClick: (type: string) => ClickHandlerType;
widget: Widgets;
yAxisUnit: string;
}
export default TopLevelOperation;

View File

@@ -0,0 +1,46 @@
import getTopOperations from 'api/metrics/getTopOperations';
import Spinner from 'components/Spinner';
import { Card } from 'container/MetricsApplication/styles';
import TopOperationsTable from 'container/MetricsApplication/TopOperationsTable';
import useResourceAttribute from 'hooks/useResourceAttribute';
import { convertRawQueriesToTraceSelectedTags } from 'hooks/useResourceAttribute/utils';
import { useMemo } from 'react';
import { useQuery } from 'react-query';
import { useSelector } from 'react-redux';
import { useParams } from 'react-router-dom';
import { AppState } from 'store/reducers';
import { PayloadProps } from 'types/api/metrics/getTopOperations';
import { GlobalReducer } from 'types/reducer/globalTime';
import { Tags } from 'types/reducer/trace';
function TopOperation(): JSX.Element {
const { maxTime, minTime } = useSelector<AppState, GlobalReducer>(
(state) => state.globalTime,
);
const { servicename } = useParams<{ servicename?: string }>();
const { queries } = useResourceAttribute();
const selectedTags = useMemo(
() => (convertRawQueriesToTraceSelectedTags(queries) as Tags[]) || [],
[queries],
);
const { data, isLoading } = useQuery<PayloadProps>({
queryKey: [minTime, maxTime, servicename, selectedTags],
queryFn: (): Promise<PayloadProps> =>
getTopOperations({
service: servicename || '',
start: minTime,
end: maxTime,
selectedTags,
}),
});
return (
<Card>
{isLoading && <Spinner size="large" tip="Loading..." height="40vh" />}
{!isLoading && <TopOperationsTable data={data || []} />}
</Card>
);
}
export default TopOperation;

View File

@@ -1,3 +0,0 @@
export const legend = {
address: '{{address}}',
};

View File

@@ -0,0 +1,3 @@
export interface IServiceName {
servicename: string;
}

View File

@@ -0,0 +1,60 @@
export const legend = {
address: '{{address}}',
};
export const QUERYNAME_AND_EXPRESSION = ['A', 'B', 'C'];
export const LATENCY_AGGREGATEOPERATOR = ['p50', 'p90', 'p99'];
export const LATENCY_AGGREGATEOPERATOR_SPAN_METRICS = [
'hist_quantile_50',
'hist_quantile_90',
'hist_quantile_99',
];
export const OPERATION_LEGENDS = ['Operations'];
export enum FORMULA {
ERROR_PERCENTAGE = 'A*100/B',
DATABASE_CALLS_AVG_DURATION = 'A/B',
}
export enum GraphTitle {
LATENCY = 'Latency',
RATE_PER_OPS = 'Rate (ops/s)',
ERROR_PERCENTAGE = 'Error Percentage',
DATABASE_CALLS_RPS = 'Database Calls RPS',
DATABASE_CALLS_AVG_DURATION = 'Database Calls Avg Duration',
EXTERNAL_CALL_ERROR_PERCENTAGE = 'External Call Error Percentage',
EXTERNAL_CALL_DURATION = 'External Call duration',
EXTERNAL_CALL_RPS_BY_ADDRESS = 'External Call RPS(by Address)',
EXTERNAL_CALL_DURATION_BY_ADDRESS = 'External Call duration(by Address)',
}
export enum DataType {
STRING = 'string',
FLOAT64 = 'float64',
INT64 = 'int64',
}
export enum MetricsType {
Tag = 'tag',
Resource = 'resource',
}
export enum WidgetKeys {
Name = 'name',
Address = 'address',
DurationNano = 'durationNano',
StatusCode = 'status_code',
Operation = 'operation',
OperationName = 'operationName',
Service_name = 'service_name',
ServiceName = 'serviceName',
SignozLatencyCount = 'signoz_latency_count',
SignozDBLatencyCount = 'signoz_db_latency_count',
DatabaseCallCount = 'signoz_database_call_count',
DatabaseCallLatencySum = 'signoz_database_call_latency_sum',
SignozDbLatencySum = 'signoz_db_latency_sum',
SignozCallsTotal = 'signoz_calls_total',
SignozExternalCallLatencyCount = 'signoz_external_call_latency_count',
SignozExternalCallLatencySum = 'signoz_external_call_latency_sum',
Signoz_latency_bucket = 'signoz_latency_bucket',
}

View File

@@ -3,12 +3,12 @@ import getFromLocalstorage from 'api/browser/localstorage/get';
import setToLocalstorage from 'api/browser/localstorage/set';
import { getAggregateKeys } from 'api/queryBuilder/getAttributeKeys';
import { LOCALSTORAGE } from 'constants/localStorage';
import { QueryBuilderKeys } from 'constants/queryBuilder';
import { useGetAggregateKeys } from 'hooks/queryBuilder/useGetAggregateKeys';
import useDebounce from 'hooks/useDebounce';
import { useNotifications } from 'hooks/useNotifications';
import useUrlQueryData from 'hooks/useUrlQueryData';
import { useCallback, useEffect, useMemo, useState } from 'react';
import { useQueries, useQuery } from 'react-query';
import { useQueries } from 'react-query';
import { ErrorResponse, SuccessResponse } from 'types/api';
import {
BaseAutocompleteData,
@@ -30,6 +30,7 @@ interface UseOptionsMenuProps {
interface UseOptionsMenu {
options: OptionsQuery;
config: OptionsMenuConfig;
handleOptionsChange: (newQueryData: OptionsQuery) => void;
}
const useOptionsMenu = ({
@@ -115,16 +116,12 @@ const useOptionsMenu = ({
const {
data: searchedAttributesData,
isFetching: isSearchedAttributesFetching,
} = useQuery(
[QueryBuilderKeys.GET_AGGREGATE_KEYS, debouncedSearchText, isFocused],
async () =>
getAggregateKeys({
...initialQueryParams,
searchText: debouncedSearchText,
}),
} = useGetAggregateKeys(
{
enabled: isFocused,
...initialQueryParams,
searchText: debouncedSearchText,
},
{ queryKey: [debouncedSearchText, isFocused], enabled: isFocused },
);
const searchedAttributeKeys = useMemo(
@@ -306,6 +303,7 @@ const useOptionsMenu = ({
return {
options: optionsQueryData,
config: optionsMenuConfig,
handleOptionsChange: handleRedirectWithOptionsData,
};
};

View File

@@ -3,6 +3,8 @@ import { ReactNode } from 'react';
import { IBuilderQuery } from 'types/api/queryBuilder/queryBuilderData';
import { DataSource } from 'types/common/queryBuilder';
import { OrderByFilterProps } from './filters/OrderByFilter/OrderByFilter.interfaces';
export type QueryBuilderConfig =
| {
queryVariant: 'static';
@@ -17,4 +19,5 @@ export type QueryBuilderProps = {
filterConfigs?: Partial<
Record<keyof IBuilderQuery, { isHidden: boolean; isDisabled: boolean }>
>;
queryComponents?: { renderOrderBy?: (props: OrderByFilterProps) => ReactNode };
};

View File

@@ -16,6 +16,7 @@ export const QueryBuilder = memo(function QueryBuilder({
panelType: newPanelType,
actions,
filterConfigs = {},
queryComponents,
}: QueryBuilderProps): JSX.Element {
const {
currentQuery,
@@ -74,6 +75,7 @@ export const QueryBuilder = memo(function QueryBuilder({
queryVariant={config?.queryVariant || 'dropdown'}
query={query}
filterConfigs={filterConfigs}
queryComponents={queryComponents}
/>
</Col>
))}

View File

@@ -6,4 +6,4 @@ export type QueryProps = {
isAvailableToDisable: boolean;
query: IBuilderQuery;
queryVariant: 'static' | 'dropdown';
} & Pick<QueryBuilderProps, 'filterConfigs'>;
} & Pick<QueryBuilderProps, 'filterConfigs' | 'queryComponents'>;

View File

@@ -36,6 +36,7 @@ export const Query = memo(function Query({
queryVariant,
query,
filterConfigs,
queryComponents,
}: QueryProps): JSX.Element {
const { panelType } = useQueryBuilder();
const {
@@ -110,6 +111,17 @@ export const Query = memo(function Query({
[handleChangeQueryData],
);
const renderOrderByFilter = useCallback((): ReactNode => {
if (queryComponents?.renderOrderBy) {
return queryComponents.renderOrderBy({
query,
onChange: handleChangeOrderByKeys,
});
}
return <OrderByFilter query={query} onChange={handleChangeOrderByKeys} />;
}, [queryComponents, query, handleChangeOrderByKeys]);
const renderAggregateEveryFilter = useCallback(
(): JSX.Element | null =>
!filterConfigs?.stepInterval?.isHidden ? (
@@ -167,9 +179,7 @@ export const Query = memo(function Query({
<Col flex="5.93rem">
<FilterLabel label="Order by" />
</Col>
<Col flex="1 1 12.5rem">
<OrderByFilter query={query} onChange={handleChangeOrderByKeys} />
</Col>
<Col flex="1 1 12.5rem">{renderOrderByFilter()}</Col>
</Row>
</Col>
)}
@@ -225,9 +235,7 @@ export const Query = memo(function Query({
<Col flex="5.93rem">
<FilterLabel label="Order by" />
</Col>
<Col flex="1 1 12.5rem">
<OrderByFilter query={query} onChange={handleChangeOrderByKeys} />
</Col>
<Col flex="1 1 12.5rem">{renderOrderByFilter()}</Col>
</Row>
</Col>
@@ -238,11 +246,11 @@ export const Query = memo(function Query({
}
}, [
panelType,
query,
isMetricsDataSource,
handleChangeHavingFilter,
query,
handleChangeLimit,
handleChangeOrderByKeys,
handleChangeHavingFilter,
renderOrderByFilter,
renderAggregateEveryFilter,
]);
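The order-by control is now injectable: when queryComponents.renderOrderBy is provided it replaces the default OrderByFilter inside Query; otherwise the fallback stays unchanged. A hedged sketch of a custom renderer follows; the props type is assumed from how Query invokes it above (query plus an onChange taking OrderByPayload[]), and the returned node is plain text only to keep the sketch self-contained.

import { ReactNode } from 'react';
import {
	IBuilderQuery,
	OrderByPayload,
} from 'types/api/queryBuilder/queryBuilderData';

// Assumed props shape, mirroring renderOrderBy({ query, onChange: handleChangeOrderByKeys }) above.
type OrderByRenderProps = {
	query: IBuilderQuery;
	onChange: (orderBy: OrderByPayload[]) => void;
};

export function renderExplorerOrderBy({
	query,
}: OrderByRenderProps): ReactNode {
	// A real consumer would return its own control wired to `onChange`
	// (for example an explorer-specific select); plain text keeps this compilable.
	return `custom order by for query ${query.queryName}`;
}

// Wired up roughly as:
// <QueryBuilder panelType={panelType} queryComponents={{ renderOrderBy: renderExplorerOrderBy }} />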

View File

@@ -7,6 +7,7 @@ import {
selectValueDivider,
} from 'constants/queryBuilder';
import { DEBOUNCE_DELAY } from 'constants/queryBuilderFilterConfig';
import { useGetAggregateKeys } from 'hooks/queryBuilder/useGetAggregateKeys';
import useDebounce from 'hooks/useDebounce';
import { chooseAutocompleteFromCustomValue } from 'lib/newQueryBuilder/chooseAutocompleteFromCustomValue';
// ** Components
@@ -14,7 +15,7 @@ import { chooseAutocompleteFromCustomValue } from 'lib/newQueryBuilder/chooseAut
import { transformStringWithPrefix } from 'lib/query/transformStringWithPrefix';
import { isEqual, uniqWith } from 'lodash-es';
import { memo, useCallback, useEffect, useState } from 'react';
import { useQuery, useQueryClient } from 'react-query';
import { useQueryClient } from 'react-query';
import { BaseAutocompleteData } from 'types/api/queryBuilder/queryAutocompleteResponse';
import { SelectOption } from 'types/common/select';
@@ -38,16 +39,15 @@ export const GroupByFilter = memo(function GroupByFilter({
const debouncedValue = useDebounce(searchText, DEBOUNCE_DELAY);
const { isFetching } = useQuery(
[QueryBuilderKeys.GET_AGGREGATE_KEYS, debouncedValue, isFocused],
async () =>
getAggregateKeys({
aggregateAttribute: query.aggregateAttribute.key,
dataSource: query.dataSource,
aggregateOperator: query.aggregateOperator,
searchText: debouncedValue,
}),
const { isFetching } = useGetAggregateKeys(
{
aggregateAttribute: query.aggregateAttribute.key,
dataSource: query.dataSource,
aggregateOperator: query.aggregateOperator,
searchText: debouncedValue,
},
{
queryKey: [debouncedValue, isFocused],
enabled: !disabled && isFocused,
onSuccess: (data) => {
const keys = query.groupBy.reduce<string[]>((acc, item) => {

View File

@@ -1,8 +1,8 @@
import { InputNumber, Tooltip } from 'antd';
// import { useMemo } from 'react';
import { InputNumber } from 'antd';
import { useMemo } from 'react';
import { IBuilderQuery } from 'types/api/queryBuilder/queryBuilderData';
import { DataSource } from 'types/common/queryBuilder';
// import { DataSource } from 'types/common/queryBuilder';
import { selectStyle } from '../QueryBuilderSearch/config';
function LimitFilter({ onChange, query }: LimitFilterProps): JSX.Element {
@@ -21,25 +21,23 @@ function LimitFilter({ onChange, query }: LimitFilterProps): JSX.Element {
}
};
// const isMetricsDataSource = useMemo(
// () => query.dataSource === DataSource.METRICS,
// [query.dataSource],
// );
const isMetricsDataSource = useMemo(
() => query.dataSource === DataSource.METRICS,
[query.dataSource],
);
// const isDisabled = isMetricsDataSource && !query.aggregateAttribute.key;
const isDisabled = isMetricsDataSource && !query.aggregateAttribute.key;
return (
<Tooltip placement="top" title="coming soon">
<InputNumber
min={1}
type="number"
readOnly
value={query.limit}
style={selectStyle}
onChange={onChange}
onKeyDown={handleKeyDown}
/>
</Tooltip>
<InputNumber
min={1}
type="number"
value={query.limit}
style={selectStyle}
disabled={isDisabled}
onChange={onChange}
onKeyDown={handleKeyDown}
/>
);
}

View File

@@ -1,208 +1,57 @@
import { Select, Spin } from 'antd';
import { getAggregateKeys } from 'api/queryBuilder/getAttributeKeys';
import { QueryBuilderKeys } from 'constants/queryBuilder';
import { IOption } from 'hooks/useResourceAttribute/types';
import { uniqWith } from 'lodash-es';
import * as Papa from 'papaparse';
import { useCallback, useMemo, useState } from 'react';
import { useQuery } from 'react-query';
import { OrderByPayload } from 'types/api/queryBuilder/queryBuilderData';
import { useGetAggregateKeys } from 'hooks/queryBuilder/useGetAggregateKeys';
import { useMemo } from 'react';
import { DataSource, MetricAggregateOperator } from 'types/common/queryBuilder';
import { selectStyle } from '../QueryBuilderSearch/config';
import { getRemoveOrderFromValue } from '../QueryBuilderSearch/utils';
import { FILTERS } from './config';
import { OrderByFilterProps } from './OrderByFilter.interfaces';
import {
checkIfKeyPresent,
getLabelFromValue,
mapLabelValuePairs,
orderByValueDelimiter,
splitOrderByFromString,
transformToOrderByStringValues,
} from './utils';
import { useOrderByFilter } from './useOrderByFilter';
export function OrderByFilter({
query,
onChange,
}: OrderByFilterProps): JSX.Element {
const [searchText, setSearchText] = useState<string>('');
const [selectedValue, setSelectedValue] = useState<IOption[]>(
transformToOrderByStringValues(query.orderBy),
const {
debouncedSearchText,
selectedValue,
aggregationOptions,
generateOptions,
createOptions,
handleChange,
handleSearchKeys,
} = useOrderByFilter({ query, onChange });
const { data, isFetching } = useGetAggregateKeys(
{
aggregateAttribute: query.aggregateAttribute.key,
dataSource: query.dataSource,
aggregateOperator: query.aggregateOperator,
searchText: debouncedSearchText,
},
{
enabled: !!query.aggregateAttribute.key,
keepPreviousData: true,
},
);
const { data, isFetching } = useQuery(
[QueryBuilderKeys.GET_AGGREGATE_KEYS, searchText],
async () =>
getAggregateKeys({
aggregateAttribute: query.aggregateAttribute.key,
dataSource: query.dataSource,
aggregateOperator: query.aggregateOperator,
searchText,
}),
{ enabled: !!query.aggregateAttribute.key, keepPreviousData: true },
);
const handleSearchKeys = useCallback(
(searchText: string): void => setSearchText(searchText),
[],
);
const noAggregationOptions = useMemo(
() =>
data?.payload?.attributeKeys
? mapLabelValuePairs(data?.payload?.attributeKeys).flat()
: [],
[data?.payload?.attributeKeys],
);
const aggregationOptions = useMemo(
() =>
mapLabelValuePairs(query.groupBy)
.flat()
.concat([
{
label: `${query.aggregateOperator}(${query.aggregateAttribute.key}) ${FILTERS.ASC}`,
value: `${query.aggregateOperator}(${query.aggregateAttribute.key})${orderByValueDelimiter}${FILTERS.ASC}`,
},
{
label: `${query.aggregateOperator}(${query.aggregateAttribute.key}) ${FILTERS.DESC}`,
value: `${query.aggregateOperator}(${query.aggregateAttribute.key})${orderByValueDelimiter}${FILTERS.DESC}`,
},
]),
[query.aggregateAttribute.key, query.aggregateOperator, query.groupBy],
);
const customValue: IOption[] = useMemo(() => {
if (!searchText) return [];
return [
{
label: `${searchText} ${FILTERS.ASC}`,
value: `${searchText}${orderByValueDelimiter}${FILTERS.ASC}`,
},
{
label: `${searchText} ${FILTERS.DESC}`,
value: `${searchText}${orderByValueDelimiter}${FILTERS.DESC}`,
},
];
}, [searchText]);
const optionsData = useMemo(() => {
const keyOptions = createOptions(data?.payload?.attributeKeys || []);
const groupByOptions = createOptions(query.groupBy);
const options =
query.aggregateOperator === MetricAggregateOperator.NOOP
? noAggregationOptions
: aggregationOptions;
? keyOptions
: [...groupByOptions, ...aggregationOptions];
const resultOption = [...customValue, ...options];
return resultOption.filter(
(option) =>
!getLabelFromValue(selectedValue).includes(
getRemoveOrderFromValue(option.value),
),
);
return generateOptions(options);
}, [
aggregationOptions,
customValue,
noAggregationOptions,
createOptions,
data?.payload?.attributeKeys,
generateOptions,
query.aggregateOperator,
selectedValue,
query.groupBy,
]);
const getUniqValues = useCallback((values: IOption[]): IOption[] => {
const modifiedValues = values.map((item) => {
const match = Papa.parse(item.value, { delimiter: orderByValueDelimiter });
if (!match) return { label: item.label, value: item.value };
// eslint-disable-next-line @typescript-eslint/naming-convention, @typescript-eslint/no-unused-vars
const [_, order] = match.data.flat() as string[];
if (order)
return {
label: item.label,
value: item.value,
};
return {
label: `${item.value} ${FILTERS.ASC}`,
value: `${item.value}${orderByValueDelimiter}${FILTERS.ASC}`,
};
});
return uniqWith(
modifiedValues,
(current, next) =>
getRemoveOrderFromValue(current.value) ===
getRemoveOrderFromValue(next.value),
);
}, []);
const getValidResult = useCallback(
(result: IOption[]): IOption[] =>
result.reduce<IOption[]>((acc, item) => {
if (item.value === FILTERS.ASC || item.value === FILTERS.DESC) return acc;
if (item.value.includes(FILTERS.ASC) || item.value.includes(FILTERS.DESC)) {
const splittedOrderBy = splitOrderByFromString(item.value);
if (splittedOrderBy) {
acc.push({
label: `${splittedOrderBy.columnName} ${splittedOrderBy.order}`,
value: `${splittedOrderBy.columnName}${orderByValueDelimiter}${splittedOrderBy.order}`,
});
return acc;
}
}
acc.push(item);
return acc;
}, []),
[],
);
const handleChange = (values: IOption[]): void => {
const validResult = getValidResult(values);
const result = getUniqValues(validResult);
const orderByValues: OrderByPayload[] = result.map((item) => {
const match = Papa.parse(item.value, { delimiter: orderByValueDelimiter });
if (!match) {
return {
columnName: item.value,
order: 'asc',
};
}
const [columnName, order] = match.data.flat() as string[];
const columnNameValue = checkIfKeyPresent(
columnName,
query.aggregateAttribute.key,
)
? '#SIGNOZ_VALUE'
: columnName;
const orderValue = order ?? 'asc';
return {
columnName: columnNameValue,
order: orderValue,
};
});
const selectedValue: IOption[] = orderByValues.map((item) => ({
label: `${item.columnName} ${item.order}`,
value: `${item.columnName} ${item.order}`,
}));
setSelectedValue(selectedValue);
setSearchText('');
onChange(orderByValues);
};
const isDisabledSelect = useMemo(
() =>
!query.aggregateAttribute.key ||

View File

@@ -0,0 +1 @@
export const SIGNOZ_VALUE = '#SIGNOZ_VALUE';

View File

@@ -0,0 +1,199 @@
import { DEBOUNCE_DELAY } from 'constants/queryBuilderFilterConfig';
import useDebounce from 'hooks/useDebounce';
import { IOption } from 'hooks/useResourceAttribute/types';
import { isEqual, uniqWith } from 'lodash-es';
import * as Papa from 'papaparse';
import { useCallback, useMemo, useState } from 'react';
import { BaseAutocompleteData } from 'types/api/queryBuilder/queryAutocompleteResponse';
import { OrderByPayload } from 'types/api/queryBuilder/queryBuilderData';
import { getRemoveOrderFromValue } from '../QueryBuilderSearch/utils';
import { FILTERS } from './config';
import { SIGNOZ_VALUE } from './constants';
import { OrderByFilterProps } from './OrderByFilter.interfaces';
import {
getLabelFromValue,
mapLabelValuePairs,
orderByValueDelimiter,
splitOrderByFromString,
transformToOrderByStringValues,
} from './utils';
type UseOrderByFilterResult = {
searchText: string;
debouncedSearchText: string;
selectedValue: IOption[];
aggregationOptions: IOption[];
generateOptions: (options: IOption[]) => IOption[];
createOptions: (data: BaseAutocompleteData[]) => IOption[];
handleChange: (values: IOption[]) => void;
handleSearchKeys: (search: string) => void;
};
export const useOrderByFilter = ({
query,
onChange,
}: OrderByFilterProps): UseOrderByFilterResult => {
const [searchText, setSearchText] = useState<string>('');
const debouncedSearchText = useDebounce(searchText, DEBOUNCE_DELAY);
const handleSearchKeys = useCallback(
(searchText: string): void => setSearchText(searchText),
[],
);
const getUniqValues = useCallback((values: IOption[]): IOption[] => {
const modifiedValues = values.map((item) => {
const match = Papa.parse(item.value, { delimiter: orderByValueDelimiter });
if (!match) return { label: item.label, value: item.value };
// eslint-disable-next-line @typescript-eslint/naming-convention, @typescript-eslint/no-unused-vars
const [_, order] = match.data.flat() as string[];
if (order)
return {
label: item.label,
value: item.value,
};
return {
label: `${item.value} ${FILTERS.ASC}`,
value: `${item.value}${orderByValueDelimiter}${FILTERS.ASC}`,
};
});
return uniqWith(
modifiedValues,
(current, next) =>
getRemoveOrderFromValue(current.value) ===
getRemoveOrderFromValue(next.value),
);
}, []);
const customValue: IOption[] = useMemo(() => {
if (!searchText) return [];
return [
{
label: `${searchText} ${FILTERS.ASC}`,
value: `${searchText}${orderByValueDelimiter}${FILTERS.ASC}`,
},
{
label: `${searchText} ${FILTERS.DESC}`,
value: `${searchText}${orderByValueDelimiter}${FILTERS.DESC}`,
},
];
}, [searchText]);
const selectedValue = useMemo(() => transformToOrderByStringValues(query), [
query,
]);
const generateOptions = useCallback(
(options: IOption[]): IOption[] => {
const currentCustomValue = options.find(
(keyOption) =>
getRemoveOrderFromValue(keyOption.value) === debouncedSearchText,
)
? []
: customValue;
const result = [...currentCustomValue, ...options];
const uniqResult = uniqWith(result, isEqual);
return uniqResult.filter(
(option) =>
!getLabelFromValue(selectedValue).includes(
getRemoveOrderFromValue(option.value),
),
);
},
[customValue, debouncedSearchText, selectedValue],
);
const getValidResult = useCallback(
(result: IOption[]): IOption[] =>
result.reduce<IOption[]>((acc, item) => {
if (item.value === FILTERS.ASC || item.value === FILTERS.DESC) return acc;
if (item.value.includes(FILTERS.ASC) || item.value.includes(FILTERS.DESC)) {
const splittedOrderBy = splitOrderByFromString(item.value);
if (splittedOrderBy) {
acc.push({
label: `${splittedOrderBy.columnName} ${splittedOrderBy.order}`,
value: `${splittedOrderBy.columnName}${orderByValueDelimiter}${splittedOrderBy.order}`,
});
return acc;
}
}
acc.push(item);
return acc;
}, []),
[],
);
const handleChange = (values: IOption[]): void => {
const validResult = getValidResult(values);
const result = getUniqValues(validResult);
const orderByValues: OrderByPayload[] = result.map((item) => {
const match = Papa.parse(item.value, { delimiter: orderByValueDelimiter });
if (!match) {
return {
columnName: item.value,
order: 'asc',
};
}
const [columnName, order] = match.data.flat() as string[];
const columnNameValue =
columnName === SIGNOZ_VALUE ? SIGNOZ_VALUE : columnName;
const orderValue = order ?? 'asc';
return {
columnName: columnNameValue,
order: orderValue,
};
});
setSearchText('');
onChange(orderByValues);
};
const createOptions = useCallback(
(data: BaseAutocompleteData[]): IOption[] => mapLabelValuePairs(data).flat(),
[],
);
const aggregationOptions = useMemo(
() => [
{
label: `${query.aggregateOperator}(${query.aggregateAttribute.key}) ${FILTERS.ASC}`,
value: `${SIGNOZ_VALUE}${orderByValueDelimiter}${FILTERS.ASC}`,
},
{
label: `${query.aggregateOperator}(${query.aggregateAttribute.key}) ${FILTERS.DESC}`,
value: `${SIGNOZ_VALUE}${orderByValueDelimiter}${FILTERS.DESC}`,
},
],
[query],
);
return {
searchText,
debouncedSearchText,
selectedValue,
aggregationOptions,
createOptions,
handleChange,
handleSearchKeys,
generateOptions,
};
};
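The hook above only returns memoized options and change handlers; the component rendering the order-by filter is expected to wire them into an antd Select. A minimal, hedged sketch of such a consumer — the component name, import path, and exact Select wiring are illustrative, not part of this change:

```tsx
import { Select } from 'antd';
import { IOption } from 'hooks/useResourceAttribute/types';

import { OrderByFilterProps } from './OrderByFilter.interfaces';
// Assumed path for the hook defined above.
import { useOrderByFilter } from './useOrderByFilter';

function OrderBySelectSketch({ query, onChange }: OrderByFilterProps): JSX.Element {
	const {
		selectedValue,
		createOptions,
		generateOptions,
		handleChange,
		handleSearchKeys,
	} = useOrderByFilter({ query, onChange });

	// Illustrative options: built from the current groupBy keys only; the real
	// component also merges aggregate-key suggestions fetched from the API.
	const options: IOption[] = generateOptions(createOptions(query.groupBy));

	return (
		<Select
			mode="tags"
			labelInValue
			value={selectedValue}
			options={options}
			onSearch={handleSearchKeys}
			onChange={(values): void => handleChange(values as IOption[])}
		/>
	);
}

export default OrderBySelectSketch;
```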

View File

@@ -1,31 +1,32 @@
import { IOption } from 'hooks/useResourceAttribute/types';
import { transformStringWithPrefix } from 'lib/query/transformStringWithPrefix';
import * as Papa from 'papaparse';
import { BaseAutocompleteData } from 'types/api/queryBuilder/queryAutocompleteResponse';
import { OrderByPayload } from 'types/api/queryBuilder/queryBuilderData';
import {
IBuilderQuery,
OrderByPayload,
} from 'types/api/queryBuilder/queryBuilderData';
import { FILTERS } from './config';
import { SIGNOZ_VALUE } from './constants';
export const orderByValueDelimiter = '|';
export const transformToOrderByStringValues = (
query: IBuilderQuery,
): IOption[] => {
const prepareSelectedValue: IOption[] = query.orderBy.map((item) => {
if (item.columnName === SIGNOZ_VALUE) {
return {
label: `${query.aggregateOperator}(${query.aggregateAttribute.key}) ${item.order}`,
value: `${item.columnName}${orderByValueDelimiter}${item.order}`,
};
}
return {
label: `${item.columnName} ${item.order}`,
value: `${item.columnName}${orderByValueDelimiter}${item.order}`,
};
});
return prepareSelectedValue;
};
@@ -34,20 +35,15 @@ export function mapLabelValuePairs(
arr: BaseAutocompleteData[],
): Array<IOption>[] {
return arr.map((item) => {
const value = item.key;
return [
{
label: `${value} ${FILTERS.ASC}`,
value: `${value}${orderByValueDelimiter}${FILTERS.ASC}`,
},
{
label: `${value} ${FILTERS.DESC}`,
value: `${value}${orderByValueDelimiter}${FILTERS.DESC}`,
},
];
});
@@ -58,6 +54,7 @@ export function getLabelFromValue(arr: IOption[]): string[] {
const match = Papa.parse(item.value, { delimiter: orderByValueDelimiter });
if (match) {
const [key] = match.data as string[];
return key[0];
}

View File

@@ -6,6 +6,8 @@ import { useOptionsMenu } from 'container/OptionsMenu';
import { useGetQueryRange } from 'hooks/queryBuilder/useGetQueryRange';
import { useQueryBuilder } from 'hooks/queryBuilder/useQueryBuilder';
import { Pagination, URL_PAGINATION } from 'hooks/queryPagination';
import useDragColumns from 'hooks/useDragColumns';
import { getDraggedColumns } from 'hooks/useDragColumns/utils';
import useUrlQueryData from 'hooks/useUrlQueryData';
import history from 'lib/history';
import { RowData } from 'lib/query/createTableColumnsFromQuery';
@@ -37,6 +39,10 @@ function ListView(): JSX.Element {
},
});
const { draggedColumns, onDragColumns } = useDragColumns<RowData>(
LOCALSTORAGE.TRACES_LIST_COLUMNS,
);
const { queryData: paginationQueryData } = useUrlQueryData<Pagination>(
URL_PAGINATION,
);
@@ -82,9 +88,10 @@ function ListView(): JSX.Element {
queryTableDataResult,
]);
const columns = useMemo(() => {
const updatedColumns = getListColumns(options?.selectColumns || []);
return getDraggedColumns(updatedColumns, draggedColumns);
}, [options?.selectColumns, draggedColumns]);
const transformedQueryTableData = useMemo(
() => transformDataWithDate(queryTableData) || [],
@@ -106,6 +113,12 @@ function ListView(): JSX.Element {
[],
);
const handleDragColumn = useCallback(
(fromIndex: number, toIndex: number) =>
onDragColumns(columns, fromIndex, toIndex),
[columns, onDragColumns],
);
return (
<Container>
<TraceExplorerControls
@@ -127,6 +140,7 @@ function ListView(): JSX.Element {
dataSource={transformedQueryTableData}
columns={columns}
onRow={handleRow}
onDragColumn={handleDragColumn}
/>
)}
</Container>

View File

@@ -1,10 +1,12 @@
import { Button } from 'antd';
import { PANEL_TYPES } from 'constants/queryBuilder';
import ExplorerOrderBy from 'container/ExplorerOrderBy';
import { QueryBuilder } from 'container/QueryBuilder';
import { OrderByFilterProps } from 'container/QueryBuilder/filters/OrderByFilter/OrderByFilter.interfaces';
import { QueryBuilderProps } from 'container/QueryBuilder/QueryBuilder.interfaces';
import { useGetPanelTypesQueryParam } from 'hooks/queryBuilder/useGetPanelTypesQueryParam';
import { useQueryBuilder } from 'hooks/queryBuilder/useQueryBuilder';
import { memo, useCallback, useMemo } from 'react';
import { DataSource } from 'types/common/queryBuilder';
import { ButtonWrapper, Container } from './styles';
@@ -22,6 +24,22 @@ function QuerySection(): JSX.Element {
return config;
}, []);
const renderOrderBy = useCallback(
({ query, onChange }: OrderByFilterProps) => (
<ExplorerOrderBy query={query} onChange={onChange} />
),
[],
);
const queryComponents = useMemo((): QueryBuilderProps['queryComponents'] => {
const shouldRenderCustomOrderBy =
panelTypes === PANEL_TYPES.LIST || panelTypes === PANEL_TYPES.TRACE;
return {
...(shouldRenderCustomOrderBy ? { renderOrderBy } : {}),
};
}, [panelTypes, renderOrderBy]);
return (
<Container>
<QueryBuilder
@@ -31,6 +49,7 @@ function QuerySection(): JSX.Element {
initialDataSource: DataSource.TRACES,
}}
filterConfigs={filterConfigs}
queryComponents={queryComponents}
actions={
<ButtonWrapper>
<Button onClick={handleRunQuery} type="primary">

View File

@@ -0,0 +1,53 @@
import { Space } from 'antd';
import { initialQueriesMap, PANEL_TYPES } from 'constants/queryBuilder';
import { REACT_QUERY_KEY } from 'constants/reactQueryKeys';
import { QueryTable } from 'container/QueryTable';
import { useGetQueryRange } from 'hooks/queryBuilder/useGetQueryRange';
import { useQueryBuilder } from 'hooks/queryBuilder/useQueryBuilder';
import { memo } from 'react';
import { useSelector } from 'react-redux';
import { AppState } from 'store/reducers';
import { GlobalReducer } from 'types/reducer/globalTime';
function TableView(): JSX.Element {
const { stagedQuery, panelType } = useQueryBuilder();
const { selectedTime: globalSelectedTime, maxTime, minTime } = useSelector<
AppState,
GlobalReducer
>((state) => state.globalTime);
const { data, isLoading } = useGetQueryRange(
{
query: stagedQuery || initialQueriesMap.traces,
graphType: panelType || PANEL_TYPES.TABLE,
selectedTime: 'GLOBAL_TIME',
globalSelectedInterval: globalSelectedTime,
params: {
dataSource: 'traces',
},
},
{
queryKey: [
REACT_QUERY_KEY.GET_QUERY_RANGE,
globalSelectedTime,
maxTime,
minTime,
stagedQuery,
],
enabled: !!stagedQuery && panelType === PANEL_TYPES.TABLE,
},
);
return (
<Space.Compact block direction="vertical">
<QueryTable
query={stagedQuery || initialQueriesMap.traces}
queryTableData={data?.payload.data.newResult.data.result || []}
loading={isLoading}
/>
</Space.Compact>
);
}
export default memo(TableView);

View File

@@ -1,6 +1,4 @@
import { getAggregateKeys } from 'api/queryBuilder/getAttributeKeys';
import { getAttributesValues } from 'api/queryBuilder/getAttributesValues';
import { QueryBuilderKeys } from 'constants/queryBuilder';
import { DEBOUNCE_DELAY } from 'constants/queryBuilderFilterConfig';
import {
getRemovePrefixFromKey,
@@ -10,12 +8,13 @@ import {
import useDebounceValue from 'hooks/useDebounce';
import { isEqual, uniqWith } from 'lodash-es';
import { useCallback, useEffect, useMemo, useRef, useState } from 'react';
import { useQuery } from 'react-query';
import { useDebounce } from 'react-use';
import { BaseAutocompleteData } from 'types/api/queryBuilder/queryAutocompleteResponse';
import { IBuilderQuery } from 'types/api/queryBuilder/queryBuilderData';
import { DataSource } from 'types/common/queryBuilder';
import { useGetAggregateKeys } from './useGetAggregateKeys';
type IuseFetchKeysAndValues = {
keys: BaseAutocompleteData[];
results: string[];
@@ -71,19 +70,15 @@ export const useFetchKeysAndValues = (
],
);
const { data, isFetching, status } = useGetAggregateKeys(
{
searchText: searchKey,
dataSource: query.dataSource,
aggregateOperator: query.aggregateOperator,
aggregateAttribute: query.aggregateAttribute.key,
tagType: query.aggregateAttribute.type ?? null,
},
{ queryKey: [searchParams], enabled: isQueryEnabled },
);
/**

View File

@@ -0,0 +1,34 @@
import { getAggregateKeys } from 'api/queryBuilder/getAttributeKeys';
import { QueryBuilderKeys } from 'constants/queryBuilder';
import { useMemo } from 'react';
import { useQuery, UseQueryOptions, UseQueryResult } from 'react-query';
import { ErrorResponse, SuccessResponse } from 'types/api';
import { IGetAttributeKeysPayload } from 'types/api/queryBuilder/getAttributeKeys';
import { IQueryAutocompleteResponse } from 'types/api/queryBuilder/queryAutocompleteResponse';
type UseGetAttributeKeys = (
requestData: IGetAttributeKeysPayload,
options?: UseQueryOptions<
SuccessResponse<IQueryAutocompleteResponse> | ErrorResponse
>,
) => UseQueryResult<
SuccessResponse<IQueryAutocompleteResponse> | ErrorResponse
>;
export const useGetAggregateKeys: UseGetAttributeKeys = (
requestData,
options,
) => {
const queryKey = useMemo(() => {
if (options?.queryKey && Array.isArray(options.queryKey)) {
return [QueryBuilderKeys.GET_AGGREGATE_KEYS, ...options.queryKey];
}
return [QueryBuilderKeys.GET_AGGREGATE_KEYS, requestData];
}, [options?.queryKey, requestData]);
return useQuery<SuccessResponse<IQueryAutocompleteResponse> | ErrorResponse>({
queryKey,
queryFn: () => getAggregateKeys(requestData),
...options,
});
};
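A hedged usage sketch for the new hook: the request payload mirrors the fields previously passed to getAggregateKeys, and any queryKey entries supplied in options are appended after QueryBuilderKeys.GET_AGGREGATE_KEYS. The wrapper name and example values below are illustrative only.

```tsx
import { useGetAggregateKeys } from 'hooks/queryBuilder/useGetAggregateKeys';
import { DataSource } from 'types/common/queryBuilder';

// Illustrative wrapper: fetch attribute-key suggestions for a traces query
// while the user types in a search box.
export function useTraceAttributeKeys(searchText: string) {
	return useGetAggregateKeys(
		{
			searchText,
			dataSource: DataSource.TRACES,
			aggregateOperator: 'count', // example value
			aggregateAttribute: '', // example value
			tagType: null,
		},
		// Resulting query key: [GET_AGGREGATE_KEYS, searchText]
		{ queryKey: [searchText], enabled: searchText.length > 0 },
	);
}
```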

View File

@@ -0,0 +1,7 @@
export const COLUMNS = 'columns';
export const dragColumnParams = {
ignoreSelector: '.react-resizable-handle',
nodeSelector: 'th',
handleSelector: '.dragHandler',
};

View File

@@ -0,0 +1,75 @@
import { ColumnsType } from 'antd/es/table';
import getFromLocalstorage from 'api/browser/localstorage/get';
import setToLocalstorage from 'api/browser/localstorage/set';
import { LOCALSTORAGE } from 'constants/localStorage';
import useUrlQueryData from 'hooks/useUrlQueryData';
import { useCallback, useEffect, useMemo } from 'react';
import { COLUMNS } from './configs';
import { UseDragColumns } from './types';
const useDragColumns = <T>(storageKey: LOCALSTORAGE): UseDragColumns<T> => {
const {
query: draggedColumnsQuery,
queryData: draggedColumns,
redirectWithQuery: redirectWithDraggedColumns,
} = useUrlQueryData<ColumnsType<T>>(COLUMNS, []);
const localStorageDraggedColumns = useMemo(
() => getFromLocalstorage(storageKey),
[storageKey],
);
const handleRedirectWithDraggedColumns = useCallback(
(columns: ColumnsType<T>) => {
redirectWithDraggedColumns(columns);
setToLocalstorage(storageKey, JSON.stringify(columns));
},
[storageKey, redirectWithDraggedColumns],
);
const onDragColumns = useCallback(
(columns: ColumnsType<T>, fromIndex: number, toIndex: number): void => {
const columnsData = [...columns];
const item = columnsData.splice(fromIndex, 1)[0];
columnsData.splice(toIndex, 0, item);
handleRedirectWithDraggedColumns(columnsData);
},
[handleRedirectWithDraggedColumns],
);
const redirectWithNewDraggedColumns = useCallback(
async (localStorageColumns: string) => {
let nextDraggedColumns: ColumnsType<T> = [];
try {
const parsedDraggedColumns = await JSON.parse(localStorageColumns);
nextDraggedColumns = parsedDraggedColumns;
} catch (e) {
console.log('error while parsing json');
} finally {
redirectWithDraggedColumns(nextDraggedColumns);
}
},
[redirectWithDraggedColumns],
);
useEffect(() => {
if (draggedColumnsQuery || !localStorageDraggedColumns) return;
redirectWithNewDraggedColumns(localStorageDraggedColumns);
}, [
draggedColumnsQuery,
localStorageDraggedColumns,
redirectWithNewDraggedColumns,
]);
return {
draggedColumns,
onDragColumns,
};
};
export default useDragColumns;
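useDragColumns persists the dragged order in the URL query (via useUrlQueryData) and mirrors it to localStorage. A hedged sketch of how a table container can combine it with getDraggedColumns, mirroring the ListView wiring earlier in this diff; the row type and base columns are made up for illustration:

```tsx
import { ColumnsType } from 'antd/es/table';
import { LOCALSTORAGE } from 'constants/localStorage';
import useDragColumns from 'hooks/useDragColumns';
import { getDraggedColumns } from 'hooks/useDragColumns/utils';
import { useCallback, useMemo } from 'react';

// Illustrative row shape and static base columns.
type Row = { serviceName: string; durationNano: number };

const baseColumns: ColumnsType<Row> = [
	{ title: 'Service', dataIndex: 'serviceName', key: 'serviceName' },
	{ title: 'Duration', dataIndex: 'durationNano', key: 'durationNano' },
];

export function useOrderedColumns(): {
	columns: ColumnsType<Row>;
	onDragColumn: (fromIndex: number, toIndex: number) => void;
} {
	const { draggedColumns, onDragColumns } = useDragColumns<Row>(
		LOCALSTORAGE.TRACES_LIST_COLUMNS,
	);

	// Reorder the base columns according to the persisted drag order.
	const columns = useMemo(() => getDraggedColumns(baseColumns, draggedColumns), [
		draggedColumns,
	]);

	const onDragColumn = useCallback(
		(fromIndex: number, toIndex: number) =>
			onDragColumns(columns, fromIndex, toIndex),
		[columns, onDragColumns],
	);

	return { columns, onDragColumn };
}
```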

View File

@@ -0,0 +1,10 @@
import { ColumnsType } from 'antd/es/table';
export type UseDragColumns<T> = {
draggedColumns: ColumnsType<T>;
onDragColumns: (
columns: ColumnsType<T>,
fromIndex: number,
toIndex: number,
) => void;
};

View File

@@ -0,0 +1,37 @@
import { ColumnsType } from 'antd/es/table';
const filterColumns = <T>(
initialColumns: ColumnsType<T>,
findColumns: ColumnsType<T>,
isColumnExist = true,
): ColumnsType<T> =>
initialColumns.filter(({ title: columnTitle }) => {
const column = findColumns.find(({ title }) => title === columnTitle);
return isColumnExist ? !!column : !column;
});
export const getDraggedColumns = <T>(
currentColumns: ColumnsType<T>,
draggedColumns: ColumnsType<T>,
): ColumnsType<T> => {
if (draggedColumns.length) {
const actualDraggedColumns = filterColumns<T>(draggedColumns, currentColumns);
const newColumns = filterColumns<T>(
currentColumns,
actualDraggedColumns,
false,
);
return [...actualDraggedColumns, ...newColumns].reduce((acc, { title }) => {
const column = currentColumns.find(
({ title: columnTitle }) => title === columnTitle,
);
if (column) return [...acc, column];
return acc;
}, [] as ColumnsType<T>);
}
return currentColumns;
};

View File

@@ -0,0 +1,29 @@
import throttle from 'lodash-es/throttle';
import { useEffect, useState } from 'react';
import { UseScrollToTop } from './types';
function useScrollToTop(visibleOffset = 200): UseScrollToTop {
const [isVisible, setIsVisible] = useState<boolean>(false);
const scrollToTop = (): void => {
window.scrollTo({
top: 0,
behavior: 'smooth',
});
};
useEffect(() => {
const toggleVisibility = throttle(() => {
setIsVisible(window.pageYOffset > visibleOffset);
}, 300);
window.addEventListener('scroll', toggleVisibility);
return (): void => window.removeEventListener('scroll', toggleVisibility);
}, [visibleOffset]);
return { isVisible, scrollToTop };
}
export default useScrollToTop;
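A hedged sketch of a consumer for useScrollToTop: a floating button that appears once the page is scrolled past the hook's default 200px offset and smooth-scrolls back on click. The component itself is illustrative and not part of this diff.

```tsx
import { Button } from 'antd';
import useScrollToTop from 'hooks/useScrollToTop';

function ScrollToTopButton(): JSX.Element | null {
	// isVisible flips to true once window.pageYOffset exceeds the offset (200px by default).
	const { isVisible, scrollToTop } = useScrollToTop();

	if (!isVisible) return null;

	return (
		<Button type="primary" shape="circle" onClick={scrollToTop}>
			Top
		</Button>
	);
}

export default ScrollToTopButton;
```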

View File

@@ -0,0 +1,4 @@
export interface UseScrollToTop {
isVisible: boolean;
scrollToTop: VoidFunction;
}

View File

@@ -0,0 +1,58 @@
import { act, renderHook } from '@testing-library/react';
import useScrollToTop from './index';
// Mocking window.scrollTo method
global.scrollTo = jest.fn();
describe('useScrollToTop hook', () => {
beforeAll(() => {
jest.useFakeTimers();
});
it('should change visibility and scroll to top on call', () => {
const { result } = renderHook(() => useScrollToTop(100));
// Simulate scrolling 150px down
act(() => {
global.pageYOffset = 150;
global.dispatchEvent(new Event('scroll'));
jest.advanceTimersByTime(300);
});
expect(result.current.isVisible).toBe(true);
// Simulate scrolling to top
act(() => {
result.current.scrollToTop();
});
expect(global.scrollTo).toHaveBeenCalledWith({ top: 0, behavior: 'smooth' });
});
it('should be invisible when scrolled less than offset', () => {
const { result } = renderHook(() => useScrollToTop(100));
// Simulate scrolling 50px down
act(() => {
global.pageYOffset = 50;
global.dispatchEvent(new Event('scroll'));
jest.advanceTimersByTime(300);
});
expect(result.current.isVisible).toBe(false);
});
it('should be visible when scrolled more than offset', () => {
const { result } = renderHook(() => useScrollToTop(100));
// Simulate scrolling 200px down
act(() => {
global.pageYOffset = 200;
global.dispatchEvent(new Event('scroll'));
jest.advanceTimersByTime(300);
});
expect(result.current.isVisible).toBe(true);
});
});

View File

@@ -4,8 +4,13 @@ import { FORMULA_REGEXP } from 'constants/regExp';
import { QueryTableProps } from 'container/QueryTable/QueryTable.intefaces';
import { toCapitalize } from 'lib/toCapitalize';
import { ReactNode } from 'react';
import {
IBuilderFormula,
IBuilderQuery,
Query,
} from 'types/api/queryBuilder/queryBuilderData';
import { ListItem, QueryDataV3, SeriesItem } from 'types/api/widgets/getQuery';
import { QueryBuilderData } from 'types/common/queryBuilder';
import { v4 as uuid } from 'uuid';
type CreateTableDataFromQueryParams = Pick<
@@ -21,8 +26,10 @@ export type RowData = {
type DynamicColumn = {
key: keyof RowData;
title: string;
sourceLabel: string;
data: (string | number)[];
type: 'field' | 'operator' | 'formula';
// sortable: boolean;
};
@@ -39,7 +46,6 @@ type CreateTableDataFromQuery = (
type FillColumnData = (
queryTableData: QueryDataV3[],
dynamicColumns: DynamicColumns,
query: Query,
) => { filledDynamicColumns: DynamicColumns; rowsLength: number };
type GetDynamicColumns = (
@@ -54,43 +60,37 @@ type SeriesItemLabels = SeriesItem['labels'];
const isFormula = (queryName: string): boolean =>
FORMULA_REGEXP.test(queryName);
const isValueExist = (
field: keyof DynamicColumn,
value: string,
columns: DynamicColumns,
): boolean => {
const existColumns = columns.find((item) => item[field] === value);
return !!existColumns;
};
const prepareColumnTitle = (title: string): string => {
const haveUnderscore = title.includes('_');
if (haveUnderscore) {
return title
.split('_')
.map((str) => toCapitalize(str))
.join(' ');
}
return toCapitalize(title);
};
const getQueryByName = <T extends keyof QueryBuilderData>(
builder: QueryBuilderData,
currentQueryName: string,
type: T,
): (T extends 'queryData' ? IBuilderQuery : IBuilderFormula) | null => {
const queryArray = builder[type];
const currentQuery =
queryArray.find((q) => q.queryName === currentQueryName) || null;
if (!currentQuery) return null;
return currentQuery as T extends 'queryData' ? IBuilderQuery : IBuilderFormula;
};
const createLabels = <T extends ListItemData | SeriesItemLabels>(
// labels: T,
label: keyof T,
dynamicColumns: DynamicColumns,
): void => {
if (isValueExist('key', label as string, dynamicColumns)) return;
// const labelValue = labels[label];
@@ -98,6 +98,8 @@ const createLabels = <T extends ListItemData | SeriesItemLabels>(
const fieldObj: DynamicColumn = {
key: label as string,
title: label as string,
sourceLabel: label as string,
data: [],
type: 'field',
// sortable: isNumber,
@@ -106,6 +108,68 @@ const createLabels = <T extends ListItemData | SeriesItemLabels>(
dynamicColumns.push(fieldObj);
};
const appendOperatorFormulaColumns = (
builder: QueryBuilderData,
currentQueryName: string,
dynamicColumns: DynamicColumns,
): void => {
const currentFormula = getQueryByName(
builder,
currentQueryName,
'queryFormulas',
);
if (currentFormula) {
let formulaLabel = `${currentFormula.queryName}(${currentFormula.expression})`;
if (currentFormula.legend) {
formulaLabel += ` - ${currentFormula.legend}`;
}
const formulaColumn: DynamicColumn = {
key: currentQueryName,
title: formulaLabel,
sourceLabel: formulaLabel,
data: [],
type: 'formula',
// sortable: isNumber,
};
dynamicColumns.push(formulaColumn);
}
const currentQueryData = getQueryByName(
builder,
currentQueryName,
'queryData',
);
if (!currentQueryData) return;
let operatorLabel = `${currentQueryData.aggregateOperator}`;
if (currentQueryData.aggregateAttribute.key) {
operatorLabel += `(${currentQueryData.aggregateAttribute.key})`;
}
if (currentQueryData.legend) {
operatorLabel += ` - ${currentQueryData.legend}`;
} else {
operatorLabel += ` - ${currentQueryData.queryName}`;
}
const resultValue = `${toCapitalize(operatorLabel)}`;
const operatorColumn: DynamicColumn = {
key: currentQueryName,
title: resultValue,
sourceLabel: resultValue,
data: [],
type: 'operator',
// sortable: isNumber,
};
dynamicColumns.push(operatorColumn);
};
const getDynamicColumns: GetDynamicColumns = (queryTableData, query) => {
const dynamicColumns: DynamicColumns = [];
@@ -113,49 +177,52 @@ const getDynamicColumns: GetDynamicColumns = (queryTableData, query) => {
if (currentQuery.list) {
currentQuery.list.forEach((listItem) => {
Object.keys(listItem.data).forEach((label) => {
createLabels<ListItemData>(label as ListItemKey, dynamicColumns);
});
});
}
if (currentQuery.series) {
if (!isValueExist('key', 'timestamp', dynamicColumns)) {
dynamicColumns.push({
key: 'timestamp',
title: 'Timestamp',
sourceLabel: 'Timestamp',
data: [],
type: 'field',
// sortable: true,
});
}
appendOperatorFormulaColumns(
query.builder,
currentQuery.queryName,
dynamicColumns,
);
currentQuery.series.forEach((seria) => {
Object.keys(seria.labels).forEach((label) => {
createLabels<SeriesItemLabels>(label, dynamicColumns);
});
});
}
});
return dynamicColumns.map((item) => {
if (isFormula(item.key as string)) {
return item;
}
const sameValues = dynamicColumns.filter(
(column) => column.sourceLabel === item.sourceLabel,
);
if (sameValues.length > 1) {
return { ...item, title: `${item.title} - ${item.key}` };
}
return item;
});
};
const fillEmptyRowCells = (
@@ -179,7 +246,6 @@ const fillDataFromSeria = (
seria: SeriesItem,
columns: DynamicColumns,
queryName: string,
operator: string,
): void => {
const labelEntries = Object.entries(seria.labels);
@@ -195,13 +261,7 @@ const fillDataFromSeria = (
return;
}
if (queryName === column.key) {
column.data.push(parseFloat(value.value).toFixed(2));
unusedColumnsKeys.delete(column.key);
return;
@@ -238,25 +298,16 @@ const fillDataFromList = (
});
};
const fillColumnsData: FillColumnData = (queryTableData, cols) => {
const fields = cols.filter((item) => item.type === 'field');
const operators = cols.filter((item) => item.type === 'operator');
const formulas = cols.filter((item) => item.type === 'formula');
const resultColumns = [...fields, ...operators, ...formulas];
queryTableData.forEach((currentQuery) => {
if (currentQuery.series) {
currentQuery.series.forEach((seria) => {
fillDataFromSeria(seria, resultColumns, currentQuery.queryName);
});
}
@@ -303,7 +354,7 @@ const generateTableColumns = (
const column: ColumnType<RowData> = {
dataIndex: item.key,
key: item.key,
title: item.title,
// sorter: item.sortable
// ? (a: RowData, b: RowData): number =>
// (a[item.key] as number) - (b[item.key] as number)
@@ -326,7 +377,6 @@ export const createTableColumnsFromQuery: CreateTableDataFromQuery = ({
const { filledDynamicColumns, rowsLength } = fillColumnsData(
queryTableData,
dynamicColumns,
);
const dataSource = generateData(filledDynamicColumns, rowsLength);

View File

@@ -2,11 +2,16 @@ import { Tabs } from 'antd';
import axios from 'axios';
import ExplorerCard from 'components/ExplorerCard';
import { QueryParams } from 'constants/query';
import {
initialAutocompleteData,
initialQueriesMap,
PANEL_TYPES,
} from 'constants/queryBuilder';
import { queryParamNamesMap } from 'constants/queryBuilderQueryNames';
import ROUTES from 'constants/routes';
import ExportPanel from 'container/ExportPanel';
import { GRAPH_TYPES } from 'container/NewDashboard/ComponentsSlider';
import { SIGNOZ_VALUE } from 'container/QueryBuilder/filters/OrderByFilter/constants';
import QuerySection from 'container/TracesExplorer/QuerySection';
import { useUpdateDashboard } from 'hooks/dashboard/useUpdateDashboard';
import { addEmptyWidgetInDashboardJSONWithQuery } from 'hooks/dashboard/utils';
@@ -17,6 +22,7 @@ import history from 'lib/history';
import { useCallback, useEffect, useMemo } from 'react';
import { generatePath } from 'react-router-dom';
import { Dashboard } from 'types/api/dashboard/getAll';
import { Query } from 'types/api/queryBuilder/queryBuilderData';
import { DataSource } from 'types/common/queryBuilder';
import { ActionsWrapper, Container } from './styles';
@@ -29,6 +35,7 @@ function TracesExplorer(): JSX.Element {
currentQuery,
panelType,
updateAllQueriesOperators,
updateQueriesData,
redirectWithQueryBuilderData,
} = useQueryBuilder();
@@ -141,26 +148,42 @@ function TracesExplorer(): JSX.Element {
[exportDefaultQuery, notifications, updateDashboard],
);
const getUpdateQuery = useCallback(
(newPanelType: GRAPH_TYPES): Query => {
let query = updateAllQueriesOperators(
currentQuery,
newPanelType,
DataSource.TRACES,
);
if (
newPanelType === PANEL_TYPES.LIST ||
newPanelType === PANEL_TYPES.TRACE
) {
query = updateQueriesData(query, 'queryData', (item) => ({
...item,
orderBy: item.orderBy.filter((item) => item.columnName !== SIGNOZ_VALUE),
aggregateAttribute: initialAutocompleteData,
}));
}
return query;
},
[currentQuery, updateAllQueriesOperators, updateQueriesData],
);
const handleTabChange = useCallback(
(type: string): void => {
const newPanelType = type as GRAPH_TYPES;
if (panelType === newPanelType) return;
const query = getUpdateQuery(newPanelType);
redirectWithQueryBuilderData(query, {
[queryParamNamesMap.panelTypes]: newPanelType,
});
},
[getUpdateQuery, panelType, redirectWithQueryBuilderData],
);
useShareBuilderUrl(defaultQuery);

View File

@@ -3,6 +3,7 @@ import TabLabel from 'components/TabLabel';
import { PANEL_TYPES } from 'constants/queryBuilder';
import TimeSeriesView from 'container/TimeSeriesView';
import ListView from 'container/TracesExplorer/ListView';
import TableView from 'container/TracesExplorer/TableView';
import TracesView from 'container/TracesExplorer/TracesView';
import { DataSource } from 'types/common/queryBuilder';
@@ -42,4 +43,9 @@ export const getTabsItems = ({
key: PANEL_TYPES.TIME_SERIES,
children: <TimeSeriesView dataSource={DataSource.TRACES} />,
},
{
label: 'Table View',
key: PANEL_TYPES.TABLE,
children: <TableView />,
},
];

View File

@@ -70,6 +70,7 @@ export const QueryBuilderContext = createContext<QueryBuilderContextType>({
handleRunQuery: () => {},
resetStagedQuery: () => {},
updateAllQueriesOperators: () => initialQueriesMap.metrics,
updateQueriesData: () => initialQueriesMap.metrics,
initQueryBuilderData: () => {},
});
@@ -222,6 +223,22 @@ export function QueryBuilderProvider({
[getElementWithActualOperator],
);
const updateQueriesData = useCallback(
<T extends keyof QueryBuilderData>(
query: Query,
type: T,
updateCallback: (
item: QueryBuilderData[T][number],
index: number,
) => QueryBuilderData[T][number],
): Query => {
const result = query.builder[type].map(updateCallback);
return { ...query, builder: { ...query.builder, [type]: result } };
},
[],
);
const removeQueryBuilderEntityByIndex = useCallback(
(type: keyof QueryBuilderData, index: number) => {
setCurrentQuery((prevState) => {
@@ -567,6 +584,7 @@ export function QueryBuilderProvider({
handleRunQuery,
resetStagedQuery,
updateAllQueriesOperators,
updateQueriesData,
initQueryBuilderData,
}),
[
@@ -588,6 +606,7 @@ export function QueryBuilderProvider({
handleRunQuery,
resetStagedQuery,
updateAllQueriesOperators,
updateQueriesData,
initQueryBuilderData,
],
);

View File

@@ -6,6 +6,6 @@ export interface PayloadProps {
}
export interface Props {
email: string;
path?: string;
}

View File

@@ -5,10 +5,7 @@ export interface PayloadProps {
result: QueryData[];
}
export type ListItem = { timestamp: string; data: Omit<ILog, 'timestamp'> };
export interface QueryData {
metric: {

View File

@@ -192,6 +192,14 @@ export type QueryBuilderContextType = {
panelType: GRAPH_TYPES,
dataSource: DataSource,
) => Query;
updateQueriesData: <T extends keyof QueryBuilderData>(
query: Query,
type: T,
updateCallback: (
item: QueryBuilderData[T][number],
index: number,
) => QueryBuilderData[T][number],
) => Query;
initQueryBuilderData: (query: Query) => void;
};

View File

@@ -0,0 +1,18 @@
import { Dashboard, IDashboardVariable } from 'types/api/dashboard/getAll';
export const getSelectedDashboard = (dashboard: Dashboard[]): Dashboard => {
if (dashboard.length > 0) {
return dashboard[0];
}
return {} as Dashboard;
};
export const getSelectedDashboardVariable = (
dashboard: Dashboard[],
): Record<string, IDashboardVariable> => {
if (dashboard.length > 0) {
const { variables } = dashboard[0].data;
return variables;
}
return {};
};
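Both helpers treat the first dashboard in the list as the selected one and fall back to empty objects when the list is empty. A small hedged usage sketch; the import path and wrapper name are assumptions for illustration:

```ts
import { Dashboard, IDashboardVariable } from 'types/api/dashboard/getAll';

// Assumed import path for the helpers introduced above.
import {
	getSelectedDashboard,
	getSelectedDashboardVariable,
} from 'providers/Dashboard/util';

type SelectedDashboard = {
	dashboard: Dashboard;
	variables: Record<string, IDashboardVariable>;
};

// Pick the dashboard (and its variables) that the UI should treat as selected.
export function pickSelected(dashboards: Dashboard[]): SelectedDashboard {
	return {
		dashboard: getSelectedDashboard(dashboards), // `{} as Dashboard` when empty
		variables: getSelectedDashboardVariable(dashboards), // `{}` when empty
	};
}
```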

View File

@@ -3719,6 +3719,14 @@ babel-preset-react-app@^10.0.0:
babel-plugin-macros "^3.1.0"
babel-plugin-transform-react-remove-prop-types "^0.4.24"
babel-runtime@^6.26.0:
version "6.26.0"
resolved "https://registry.yarnpkg.com/babel-runtime/-/babel-runtime-6.26.0.tgz#965c7058668e82b55d7bfe04ff2337bc8b5647fe"
integrity sha512-ITKNuq2wKlW1fJg9sSW52eepoYgZBggvOAHC0u/CYu/qxQ9EVzThCgR69BnSXLHjy2f7SY5zaQ4yt7H9ZVxY2g==
dependencies:
core-js "^2.4.0"
regenerator-runtime "^0.11.0"
balanced-match@^1.0.0:
version "1.0.2"
resolved "https://registry.npmjs.org/balanced-match/-/balanced-match-1.0.2.tgz"
@@ -4474,6 +4482,11 @@ core-js-compat@^3.25.1:
dependencies:
browserslist "^4.21.5"
core-js@^2.4.0:
version "2.6.12"
resolved "https://registry.yarnpkg.com/core-js/-/core-js-2.6.12.tgz#d9333dfa7b065e347cc5682219d6f690859cc2ec"
integrity sha512-Kb2wC0fvsWfQrgk8HU5lW6U/Lcs8+9aaYcy4ZFc6DDlo4nZ7n70dEgE5rtR0oG6ufKDUnrwfWL1mXR5ljDatrQ==
core-util-is@~1.0.0:
version "1.0.3"
resolved "https://registry.npmjs.org/core-util-is/-/core-util-is-1.0.3.tgz"
@@ -9999,7 +10012,7 @@ prompts@^2.0.1, prompts@^2.4.1:
kleur "^3.0.3"
sisteransi "^1.0.5"
prop-types@15, prop-types@15.x, prop-types@^15.5.8, prop-types@^15.6.1, prop-types@^15.6.2, prop-types@^15.7.2, prop-types@^15.8.1:
version "15.8.1"
resolved "https://registry.npmjs.org/prop-types/-/prop-types-15.8.1.tgz"
integrity sha512-oj87CgZICdulUohogVAR7AjlC0327U4el4L6eAvOqCeudMDVU0NThNaV+b9Df4dXgSP1gXMTnPdhfe/2qDH5cg==
@@ -10513,6 +10526,14 @@ react-dom@18.2.0:
loose-envify "^1.1.0"
scheduler "^0.23.0"
react-drag-listview@2.0.0:
version "2.0.0"
resolved "https://registry.yarnpkg.com/react-drag-listview/-/react-drag-listview-2.0.0.tgz#b8e7ec5f980ecbbf3abb85f50db0b03cd764edbf"
integrity sha512-7Apx/1Xt4qu+JHHP0rH6aLgZgS7c2MX8ocHVGCi03KfeIWEu0t14MhT3boQKM33l5eJrE/IWfExFTvoYq22fsg==
dependencies:
babel-runtime "^6.26.0"
prop-types "^15.5.8"
react-draggable@^4.0.0, react-draggable@^4.0.3:
version "4.4.5"
resolved "https://registry.npmjs.org/react-draggable/-/react-draggable-4.4.5.tgz"
@@ -10808,6 +10829,11 @@ regenerator-runtime@0.13.9:
resolved "https://registry.npmjs.org/regenerator-runtime/-/regenerator-runtime-0.13.9.tgz"
integrity sha512-p3VT+cOEgxFsRRA9X4lkI1E+k2/CtnKtU4gcxyaCUreilL/vqI6CdZ3wxVUx3UOUg+gnUOQQcRI7BmSI656MYA==
regenerator-runtime@^0.11.0:
version "0.11.1"
resolved "https://registry.yarnpkg.com/regenerator-runtime/-/regenerator-runtime-0.11.1.tgz#be05ad7f9bf7d22e056f9726cee5017fbf19e2e9"
integrity sha512-MguG95oij0fC3QV3URf4V2SDYGJhJnJGqvIIgdECeODCT98wSWDAJ94SSuVpYQUoTcGUIL6L4yNB7j1DFFHSBg==
regenerator-runtime@^0.13.11:
version "0.13.11"
resolved "https://registry.npmjs.org/regenerator-runtime/-/regenerator-runtime-0.13.11.tgz"

View File

@@ -13,7 +13,7 @@ https://github.com/SigNoz/signoz/blob/main/CONTRIBUTING.md#to-run-clickhouse-set
- Change the alertmanager section in `signoz/deploy/docker/clickhouse-setup/docker-compose.yaml` as follows:
```console
alertmanager:
image: signoz/alertmanager:0.23.1
volumes:
- ./data/alertmanager:/data
expose:

View File

@@ -4161,7 +4161,7 @@ func readRowsForTimeSeriesResult(rows driver.Rows, vars []interface{}, columnNam
// ("order", "/fetch/{Id}")
// ("order", "/order")
seriesToPoints := make(map[string][]v3.Point)
var keys []string
// seriesToAttrs is a mapping of key to a map of attribute key to attribute value
// for each series. This is used to populate the series' attributes
// For instance, for the above example, the seriesToAttrs will be
@@ -4182,12 +4182,15 @@ func readRowsForTimeSeriesResult(rows driver.Rows, vars []interface{}, columnNam
groupBy, groupAttributes, metricPoint := readRow(vars, columnNames)
sort.Strings(groupBy)
key := strings.Join(groupBy, "")
if _, exists := seriesToAttrs[key]; !exists {
keys = append(keys, key)
}
seriesToAttrs[key] = groupAttributes
seriesToPoints[key] = append(seriesToPoints[key], metricPoint)
}
var seriesList []*v3.Series
for _, key := range keys {
points := seriesToPoints[key]
// find the grouping sets point for the series

View File

@@ -61,17 +61,18 @@ func NewRouter() *mux.Router {
type APIHandler struct {
// queryService *querysvc.QueryService
// queryParser queryParser
basePath string
apiPrefix string
reader interfaces.Reader
skipConfig *model.SkipConfig
appDao dao.ModelDao
alertManager am.Manager
ruleManager *rules.Manager
featureFlags interfaces.FeatureLookup
ready func(http.HandlerFunc) http.HandlerFunc
queryBuilder *queryBuilder.QueryBuilder
preferDelta bool
preferSpanMetrics bool
// SetupCompleted indicates if SigNoz is ready for general use.
// at the moment, we mark the app ready when the first user
@@ -86,7 +87,8 @@ type APIHandlerOpts struct {
SkipConfig *model.SkipConfig
PerferDelta bool
PreferSpanMetrics bool
// dao layer to perform crud on app objects like dashboard, alerts etc
AppDao dao.ModelDao
@@ -106,13 +108,14 @@ func NewAPIHandler(opts APIHandlerOpts) (*APIHandler, error) {
}
aH := &APIHandler{
reader: opts.Reader,
appDao: opts.AppDao,
skipConfig: opts.SkipConfig,
preferDelta: opts.PerferDelta,
preferSpanMetrics: opts.PreferSpanMetrics,
alertManager: alertManager,
ruleManager: opts.RuleManager,
featureFlags: opts.FeatureFlags,
}
builderOpts := queryBuilder.QueryBuilderOptions{
@@ -1668,6 +1671,14 @@ func (aH *APIHandler) getFeatureFlags(w http.ResponseWriter, r *http.Request) {
aH.HandleError(w, err, http.StatusInternalServerError)
return
}
if aH.preferSpanMetrics {
for idx := range featureSet {
feature := &featureSet[idx]
if feature.Name == model.UseSpanMetrics {
featureSet[idx].Active = true
}
}
}
aH.Respond(w, featureSet)
}
@@ -2511,6 +2522,7 @@ func (aH *APIHandler) execClickHouseGraphQueries(ctx context.Context, queries ma
wg.Add(1)
go func(name, query string) {
defer wg.Done()
seriesList, err := aH.reader.GetTimeSeriesResultV3(ctx, query)
if err != nil {
@@ -2842,20 +2854,48 @@ func applyMetricLimit(results []*v3.Result, queryRangeParams *v3.QueryRangeParam
builderQueries := queryRangeParams.CompositeQuery.BuilderQueries
if builderQueries != nil && builderQueries[result.QueryName].DataSource == v3.DataSourceMetrics {
limit := builderQueries[result.QueryName].Limit
orderByList := builderQueries[result.QueryName].OrderBy
if limit != 0 {
if len(orderByList) == 0 {
// If no orderBy is specified, sort by value in descending order
orderByList = []v3.OrderBy{{ColumnName: constants.SigNozOrderByValue, Order: "desc"}}
}
sort.SliceStable(result.Series, func(i, j int) bool {
for _, orderBy := range orderByList {
if orderBy.ColumnName == constants.SigNozOrderByValue {
if result.Series[i].GroupingSetsPoint == nil || result.Series[j].GroupingSetsPoint == nil {
// Handle nil GroupingSetsPoint, if needed
// Here, we assume non-nil values are always less than nil values
return result.Series[i].GroupingSetsPoint != nil
}
if orderBy.Order == "asc" {
return result.Series[i].GroupingSetsPoint.Value < result.Series[j].GroupingSetsPoint.Value
} else if orderBy.Order == "desc" {
return result.Series[i].GroupingSetsPoint.Value > result.Series[j].GroupingSetsPoint.Value
}
} else {
// Sort based on Labels map
labelI, existsI := result.Series[i].Labels[orderBy.ColumnName]
labelJ, existsJ := result.Series[j].Labels[orderBy.ColumnName]
if !existsI || !existsJ {
// Handle missing labels, if needed
// Here, we assume non-existent labels are always less than existing ones
return existsI
}
if orderBy.Order == "asc" {
return strings.Compare(labelI, labelJ) < 0
} else if orderBy.Order == "desc" {
return strings.Compare(labelI, labelJ) > 0
}
}
}
// Preserve original order if no matching orderBy is found
return i < j
})
if len(result.Series) > int(limit) {
result.Series = result.Series[:limit]
}

View File

@@ -443,6 +443,278 @@ func TestApplyLimitOnMetricResult(t *testing.T) {
},
},
},
{
// ["GET /api/v1/health", "DELETE /api/v1/health"] so result should be ["DELETE /api/v1/health"] although it has lower value
name: "test limit with operation asc",
inputResult: []*v3.Result{
{
QueryName: "A",
Series: []*v3.Series{
{
Labels: map[string]string{
"service_name": "frontend",
"operation": "GET /api/v1/health",
},
Points: []v3.Point{
{
Timestamp: 1689220036000,
Value: 19.2,
},
{
Timestamp: 1689220096000,
Value: 19.5,
},
},
GroupingSetsPoint: &v3.Point{
Timestamp: 0,
Value: 19.3,
},
},
{
Labels: map[string]string{
"service_name": "route",
"operation": "DELETE /api/v1/health",
},
Points: []v3.Point{
{
Timestamp: 1689220036000,
Value: 8.83,
},
{
Timestamp: 1689220096000,
Value: 8.83,
},
},
GroupingSetsPoint: &v3.Point{
Timestamp: 0,
Value: 8.83,
},
},
},
},
},
params: &v3.QueryRangeParamsV3{
Start: 1689220036000,
End: 1689220096000,
Step: 60,
CompositeQuery: &v3.CompositeQuery{
BuilderQueries: map[string]*v3.BuilderQuery{
"A": {
QueryName: "A",
AggregateAttribute: v3.AttributeKey{Key: "signo_calls_total"},
DataSource: v3.DataSourceMetrics,
AggregateOperator: v3.AggregateOperatorSumRate,
Expression: "A",
GroupBy: []v3.AttributeKey{{Key: "service_name"}},
Limit: 1,
OrderBy: []v3.OrderBy{{ColumnName: "operation", Order: "asc"}},
},
},
QueryType: v3.QueryTypeBuilder,
PanelType: v3.PanelTypeGraph,
},
},
expectedResult: []*v3.Result{
{
QueryName: "A",
Series: []*v3.Series{
{
Labels: map[string]string{
"service_name": "route",
"operation": "DELETE /api/v1/health",
},
Points: []v3.Point{
{
Timestamp: 1689220036000,
Value: 8.83,
},
{
Timestamp: 1689220096000,
Value: 8.83,
},
},
GroupingSetsPoint: &v3.Point{
Timestamp: 0,
Value: 8.83,
},
},
},
},
},
},
{
name: "test limit with multiple order by labels",
inputResult: []*v3.Result{
{
QueryName: "A",
Series: []*v3.Series{
{
Labels: map[string]string{
"service_name": "frontend",
"operation": "GET /api/v1/health",
"status_code": "200",
"priority": "P0",
},
Points: []v3.Point{
{
Timestamp: 1689220036000,
Value: 19.2,
},
{
Timestamp: 1689220096000,
Value: 19.5,
},
},
GroupingSetsPoint: &v3.Point{
Timestamp: 0,
Value: 19.3,
},
},
{
Labels: map[string]string{
"service_name": "route",
"operation": "DELETE /api/v1/health",
"status_code": "301",
"priority": "P1",
},
Points: []v3.Point{
{
Timestamp: 1689220036000,
Value: 8.83,
},
{
Timestamp: 1689220096000,
Value: 8.83,
},
},
GroupingSetsPoint: &v3.Point{
Timestamp: 0,
Value: 8.83,
},
},
{
Labels: map[string]string{
"service_name": "route",
"operation": "DELETE /api/v1/health",
"status_code": "400",
"priority": "P0",
},
Points: []v3.Point{
{
Timestamp: 1689220036000,
Value: 8.83,
},
{
Timestamp: 1689220096000,
Value: 8.83,
},
},
GroupingSetsPoint: &v3.Point{
Timestamp: 0,
Value: 8.83,
},
},
{
Labels: map[string]string{
"service_name": "route",
"operation": "DELETE /api/v1/health",
"status_code": "200",
"priority": "P1",
},
Points: []v3.Point{
{
Timestamp: 1689220036000,
Value: 8.83,
},
{
Timestamp: 1689220096000,
Value: 8.83,
},
},
GroupingSetsPoint: &v3.Point{
Timestamp: 0,
Value: 8.83,
},
},
},
},
},
params: &v3.QueryRangeParamsV3{
Start: 1689220036000,
End: 1689220096000,
Step: 60,
CompositeQuery: &v3.CompositeQuery{
BuilderQueries: map[string]*v3.BuilderQuery{
"A": {
QueryName: "A",
AggregateAttribute: v3.AttributeKey{Key: "signo_calls_total"},
DataSource: v3.DataSourceMetrics,
AggregateOperator: v3.AggregateOperatorSumRate,
Expression: "A",
GroupBy: []v3.AttributeKey{{Key: "service_name"}, {Key: "operation"}, {Key: "status_code"}, {Key: "priority"}},
Limit: 2,
OrderBy: []v3.OrderBy{
{ColumnName: "priority", Order: "asc"},
{ColumnName: "status_code", Order: "desc"},
},
},
},
QueryType: v3.QueryTypeBuilder,
PanelType: v3.PanelTypeGraph,
},
},
expectedResult: []*v3.Result{
{
QueryName: "A",
Series: []*v3.Series{
{
Labels: map[string]string{
"service_name": "frontend",
"operation": "GET /api/v1/health",
"status_code": "200",
"priority": "P0",
},
Points: []v3.Point{
{
Timestamp: 1689220036000,
Value: 19.2,
},
{
Timestamp: 1689220096000,
Value: 19.5,
},
},
GroupingSetsPoint: &v3.Point{
Timestamp: 0,
Value: 19.3,
},
},
{
Labels: map[string]string{
"service_name": "route",
"operation": "DELETE /api/v1/health",
"status_code": "400",
"priority": "P0",
},
Points: []v3.Point{
{
Timestamp: 1689220036000,
Value: 8.83,
},
{
Timestamp: 1689220096000,
Value: 8.83,
},
},
GroupingSetsPoint: &v3.Point{
Timestamp: 0,
Value: 8.83,
},
},
},
},
},
},
}
for _, c := range cases {

View File

@@ -89,17 +89,29 @@ func getClickhouseColumnName(key v3.AttributeKey) string {
}
// getSelectLabels returns the select labels for the query based on groupBy and aggregateOperator
func getSelectLabels(aggregatorOperator v3.AggregateOperator, groupBy []v3.AttributeKey) string {
var selectLabels string
if aggregatorOperator == v3.AggregateOperatorNoOp {
selectLabels = ""
} else {
for _, tag := range groupBy {
columnName := getClickhouseColumnName(tag)
selectLabels += fmt.Sprintf(", %s as %s", columnName, tag.Key)
selectLabels += fmt.Sprintf(" %s as %s,", columnName, tag.Key)
}
}
return selectLabels
}
func getSelectKeys(aggregatorOperator v3.AggregateOperator, groupBy []v3.AttributeKey) string {
var selectLabels []string
if aggregatorOperator == v3.AggregateOperatorNoOp {
return ""
} else {
for _, tag := range groupBy {
selectLabels = append(selectLabels, tag.Key)
}
}
return strings.Join(selectLabels, ",")
}
func buildLogsTimeSeriesFilterQuery(fs *v3.FilterSet, groupBy []v3.AttributeKey) (string, error) {
@@ -163,7 +175,7 @@ func getZerosForEpochNano(epoch int64) int64 {
return int64(math.Pow(10, float64(19-count)))
}
func buildLogsQuery(panelType v3.PanelType, start, end, step int64, mq *v3.BuilderQuery, graphLimitQtype string) (string, error) {
filterSubQuery, err := buildLogsTimeSeriesFilterQuery(mq.Filters, mq.GroupBy)
if err != nil {
@@ -173,10 +185,7 @@ func buildLogsQuery(panelType v3.PanelType, start, end, step int64, mq *v3.Build
// timerange will be sent in epoch millisecond
timeFilter := fmt.Sprintf("(timestamp >= %d AND timestamp <= %d)", start*getZerosForEpochNano(start), end*getZerosForEpochNano(end))
selectLabels := getSelectLabels(mq.AggregateOperator, mq.GroupBy)
having := having(mq.Having)
if having != "" {
@@ -184,35 +193,44 @@ func buildLogsQuery(panelType v3.PanelType, start, end, step int64, mq *v3.Build
}
var queryTmpl string
if graphLimitQtype == constants.FirstQueryGraphLimit {
queryTmpl = "SELECT"
} else if panelType == v3.PanelTypeTable {
queryTmpl =
"SELECT now() as ts" + selectLabels +
", %s as value " +
"from signoz_logs.distributed_logs " +
"where " + timeFilter + "%s" +
"%s%s" +
"%s"
"SELECT now() as ts,"
} else if panelType == v3.PanelTypeGraph || panelType == v3.PanelTypeValue {
// Select the aggregate value for interval
queryTmpl =
fmt.Sprintf("SELECT toStartOfInterval(fromUnixTimestamp64Nano(timestamp), INTERVAL %d SECOND) AS ts", step) + selectLabels +
", %s as value " +
"from signoz_logs.distributed_logs " +
"where " + timeFilter + "%s" +
"%s%s" +
"%s"
fmt.Sprintf("SELECT toStartOfInterval(fromUnixTimestamp64Nano(timestamp), INTERVAL %d SECOND) AS ts,", step)
}
queryTmpl =
queryTmpl + selectLabels +
" %s as value " +
"from signoz_logs.distributed_logs " +
"where " + timeFilter + "%s" +
"%s%s" +
"%s"
// we don't need the aggregated value for the first query;
// going with this route for a cleaner implementation
if graphLimitQtype == constants.FirstQueryGraphLimit {
queryTmpl = "SELECT " + getSelectKeys(mq.AggregateOperator, mq.GroupBy) + " from (" + queryTmpl + ")"
}
groupBy := groupByAttributeKeyTags(panelType, graphLimitQtype, mq.GroupBy...)
if panelType != v3.PanelTypeList && groupBy != "" {
groupBy = " group by " + groupBy
}
orderBy := orderByAttributeKeyTags(panelType, mq.OrderBy, mq.GroupBy)
if panelType != v3.PanelTypeList && orderBy != "" {
orderBy = " order by " + orderBy
}
if graphLimitQtype == constants.SecondQueryGraphLimit {
filterSubQuery = filterSubQuery + " AND " + fmt.Sprintf("(%s) IN (", getSelectKeys(mq.AggregateOperator, mq.GroupBy)) + "%s)"
}
aggregationKey := ""
if mq.AggregateAttribute.Key != "" {
aggregationKey = getClickhouseColumnName(mq.AggregateAttribute)
@@ -273,82 +291,56 @@ func buildLogsQuery(panelType v3.PanelType, start, end, step int64, mq *v3.Build
// groupBy returns a string of comma separated tags for group by clause
// `ts` is always added to the group by clause
func groupBy(panelType v3.PanelType, graphLimitQtype string, tags ...string) string {
if (graphLimitQtype != constants.FirstQueryGraphLimit) && (panelType == v3.PanelTypeGraph || panelType == v3.PanelTypeValue) {
tags = append(tags, "ts")
}
return strings.Join(tags, ",")
}
func groupByAttributeKeyTags(panelType v3.PanelType, graphLimitQtype string, tags ...v3.AttributeKey) string {
groupTags := []string{}
for _, tag := range tags {
groupTags = append(groupTags, tag.Key)
}
return groupBy(panelType, graphLimitQtype, groupTags...)
}
// orderBy returns a string of comma separated tags for order by clause
// if there are remaining items which are not present in tags they are also added
// if the order is not specified, it defaults to ASC
func orderBy(panelType v3.PanelType, items []v3.OrderBy, tagLookup map[string]struct{}) []string {
var orderBy []string
// users might want to order by value of aggregation
for _, item := range items {
if item.ColumnName == constants.SigNozOrderByValue {
orderBy = append(orderBy, fmt.Sprintf("value %s", item.Order))
} else if _, ok := tagLookup[item.ColumnName]; ok {
orderBy = append(orderBy, fmt.Sprintf("%s %s", item.ColumnName, item.Order))
} else if panelType == v3.PanelTypeList {
// since these are not present in tags we will have to select them correctly
attr := v3.AttributeKey{Key: item.ColumnName, DataType: item.DataType, Type: item.Type, IsColumn: item.IsColumn}
name := getClickhouseColumnName(attr)
orderBy = append(orderBy, fmt.Sprintf("%s %s", name, item.Order))
}
}
return orderBy
}
func orderByAttributeKeyTags(panelType v3.PanelType, items []v3.OrderBy, tags []v3.AttributeKey) string {
tagLookup := map[string]struct{}{}
for _, v := range tags {
tagLookup[v.Key] = struct{}{}
}
orderByArray := orderBy(panelType, items, tagLookup)
if len(orderByArray) == 0 {
if panelType == v3.PanelTypeList {
orderByArray = append(orderByArray, constants.TIMESTAMP+" DESC")
} else {
orderByArray = append(orderByArray, "value DESC")
}
}
str := strings.Join(orderByArray, ",")
@@ -392,8 +384,26 @@ func addOffsetToQuery(query string, offset uint64) string {
return fmt.Sprintf("%s OFFSET %d", query, offset)
}
func PrepareLogsQuery(start, end int64, queryType v3.QueryType, panelType v3.PanelType, mq *v3.BuilderQuery) (string, error) {
query, err := buildLogsQuery(panelType, start, end, mq.StepInterval, mq)
func PrepareLogsQuery(start, end int64, queryType v3.QueryType, panelType v3.PanelType, mq *v3.BuilderQuery, graphLimitQtype string) (string, error) {
if graphLimitQtype == constants.FirstQueryGraphLimit {
// return only the group by names
query, err := buildLogsQuery(panelType, start, end, mq.StepInterval, mq, graphLimitQtype)
if err != nil {
return "", err
}
query = addLimitToQuery(query, mq.Limit)
return query, nil
} else if graphLimitQtype == constants.SecondQueryGraphLimit {
query, err := buildLogsQuery(panelType, start, end, mq.StepInterval, mq, graphLimitQtype)
if err != nil {
return "", err
}
return query, nil
}
query, err := buildLogsQuery(panelType, start, end, mq.StepInterval, mq, graphLimitQtype)
if err != nil {
return "", err
}
@@ -401,7 +411,7 @@ func PrepareLogsQuery(start, end int64, queryType v3.QueryType, panelType v3.Pan
query, err = reduceQuery(query, mq.ReduceTo, mq.AggregateOperator)
}
if panelType == v3.PanelTypeList {
if panelType == v3.PanelTypeList || panelType == v3.PanelTypeTable {
if mq.PageSize > 0 {
if mq.Limit > 0 && mq.Offset > mq.Limit {
return "", fmt.Errorf("max limit exceeded")
@@ -414,4 +424,5 @@ func PrepareLogsQuery(start, end int64, queryType v3.QueryType, panelType v3.Pan
}
return query, err
}
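With the graphLimitQtype parameter, PrepareLogsQuery can emit two cooperating queries for graph panels that combine a limit with a group by: the first returns only the top group-by values, the second is the full time-series query containing an IN (%s) placeholder. A rough sketch of how the two strings are combined by the query builder later in this diff (the SQL literals here are illustrative, not the exact generated queries):
package main

import "fmt"

func main() {
	// First query (FirstQueryGraphLimit): just the top-N group-by values.
	limitQuery := "SELECT method from (SELECT method, toFloat64(count(*)) as value FROM logs GROUP BY method ORDER BY value DESC) LIMIT 10"
	// Second query (SecondQueryGraphLimit): the time-series query with an IN (%s) placeholder.
	placeholderQuery := "SELECT ts, method, toFloat64(count(*)) as value FROM logs WHERE (method) IN (%s) GROUP BY method, ts"
	// The query builder substitutes the first query into the placeholder,
	// so only the top-N series are charted.
	fmt.Println(fmt.Sprintf(placeholderQuery, limitQuery))
}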


@@ -1,6 +1,7 @@
package v3
import (
"fmt"
"testing"
. "github.com/smartystreets/goconvey/convey"
@@ -59,13 +60,13 @@ var testGetSelectLabelsData = []struct {
Name: "select fields for groupBy attribute",
AggregateOperator: v3.AggregateOperatorCount,
GroupByTags: []v3.AttributeKey{{Key: "user_name", DataType: v3.AttributeKeyDataTypeString, Type: v3.AttributeKeyTypeTag}},
SelectLabels: ", attributes_string_value[indexOf(attributes_string_key, 'user_name')] as user_name",
SelectLabels: " attributes_string_value[indexOf(attributes_string_key, 'user_name')] as user_name,",
},
{
Name: "select fields for groupBy resource",
AggregateOperator: v3.AggregateOperatorCount,
GroupByTags: []v3.AttributeKey{{Key: "user_name", DataType: v3.AttributeKeyDataTypeString, Type: v3.AttributeKeyTypeResource}},
SelectLabels: ", resources_string_value[indexOf(resources_string_key, 'user_name')] as user_name",
SelectLabels: " resources_string_value[indexOf(resources_string_key, 'user_name')] as user_name,",
},
{
Name: "select fields for groupBy attribute and resource",
@@ -74,27 +75,26 @@ var testGetSelectLabelsData = []struct {
{Key: "user_name", DataType: v3.AttributeKeyDataTypeString, Type: v3.AttributeKeyTypeResource},
{Key: "host", DataType: v3.AttributeKeyDataTypeString, Type: v3.AttributeKeyTypeTag},
},
SelectLabels: ", resources_string_value[indexOf(resources_string_key, 'user_name')] as user_name, attributes_string_value[indexOf(attributes_string_key, 'host')] as host",
SelectLabels: " resources_string_value[indexOf(resources_string_key, 'user_name')] as user_name, attributes_string_value[indexOf(attributes_string_key, 'host')] as host,",
},
{
Name: "select fields for groupBy materialized columns",
AggregateOperator: v3.AggregateOperatorCount,
GroupByTags: []v3.AttributeKey{{Key: "host", IsColumn: true}},
SelectLabels: ", host as host",
SelectLabels: " host as host,",
},
{
Name: "trace_id field as an attribute",
AggregateOperator: v3.AggregateOperatorCount,
GroupByTags: []v3.AttributeKey{{Key: "trace_id", DataType: v3.AttributeKeyDataTypeString, Type: v3.AttributeKeyTypeTag}},
SelectLabels: ", attributes_string_value[indexOf(attributes_string_key, 'trace_id')] as trace_id",
SelectLabels: " attributes_string_value[indexOf(attributes_string_key, 'trace_id')] as trace_id,",
},
}
func TestGetSelectLabels(t *testing.T) {
for _, tt := range testGetSelectLabelsData {
Convey("testGetSelectLabelsData", t, func() {
selectLabels, err := getSelectLabels(tt.AggregateOperator, tt.GroupByTags)
So(err, ShouldBeNil)
selectLabels := getSelectLabels(tt.AggregateOperator, tt.GroupByTags)
So(selectLabels, ShouldEqual, tt.SelectLabels)
})
}
@@ -238,6 +238,7 @@ var testBuildLogsQueryData = []struct {
TableName string
AggregateOperator v3.AggregateOperator
ExpectedQuery string
Type int
}{
{
Name: "Test aggregate count on select field",
@@ -251,7 +252,7 @@ var testBuildLogsQueryData = []struct {
Expression: "A",
},
TableName: "logs",
ExpectedQuery: "SELECT toStartOfInterval(fromUnixTimestamp64Nano(timestamp), INTERVAL 60 SECOND) AS ts, toFloat64(count(*)) as value from signoz_logs.distributed_logs where (timestamp >= 1680066360726210000 AND timestamp <= 1680066458000000000) group by ts order by ts",
ExpectedQuery: "SELECT toStartOfInterval(fromUnixTimestamp64Nano(timestamp), INTERVAL 60 SECOND) AS ts, toFloat64(count(*)) as value from signoz_logs.distributed_logs where (timestamp >= 1680066360726210000 AND timestamp <= 1680066458000000000) group by ts order by value DESC",
},
{
Name: "Test aggregate count on a attribute",
@@ -266,7 +267,7 @@ var testBuildLogsQueryData = []struct {
Expression: "A",
},
TableName: "logs",
ExpectedQuery: "SELECT toStartOfInterval(fromUnixTimestamp64Nano(timestamp), INTERVAL 60 SECOND) AS ts, toFloat64(count(*)) as value from signoz_logs.distributed_logs where (timestamp >= 1680066360726210000 AND timestamp <= 1680066458000000000) AND has(attributes_string_key, 'user_name') group by ts order by ts",
ExpectedQuery: "SELECT toStartOfInterval(fromUnixTimestamp64Nano(timestamp), INTERVAL 60 SECOND) AS ts, toFloat64(count(*)) as value from signoz_logs.distributed_logs where (timestamp >= 1680066360726210000 AND timestamp <= 1680066458000000000) AND has(attributes_string_key, 'user_name') group by ts order by value DESC",
},
{
Name: "Test aggregate count on a with filter",
@@ -284,7 +285,7 @@ var testBuildLogsQueryData = []struct {
Expression: "A",
},
TableName: "logs",
ExpectedQuery: "SELECT toStartOfInterval(fromUnixTimestamp64Nano(timestamp), INTERVAL 60 SECOND) AS ts, toFloat64(count(*)) as value from signoz_logs.distributed_logs where (timestamp >= 1680066360726210000 AND timestamp <= 1680066458000000000) AND attributes_float64_value[indexOf(attributes_float64_key, 'bytes')] > 100.000000 AND has(attributes_string_key, 'user_name') group by ts order by ts",
ExpectedQuery: "SELECT toStartOfInterval(fromUnixTimestamp64Nano(timestamp), INTERVAL 60 SECOND) AS ts, toFloat64(count(*)) as value from signoz_logs.distributed_logs where (timestamp >= 1680066360726210000 AND timestamp <= 1680066458000000000) AND attributes_float64_value[indexOf(attributes_float64_key, 'bytes')] > 100.000000 AND has(attributes_string_key, 'user_name') group by ts order by value DESC",
},
{
Name: "Test aggregate count distinct and order by value",
@@ -300,7 +301,7 @@ var testBuildLogsQueryData = []struct {
OrderBy: []v3.OrderBy{{ColumnName: "#SIGNOZ_VALUE", Order: "ASC"}},
},
TableName: "logs",
ExpectedQuery: "SELECT toStartOfInterval(fromUnixTimestamp64Nano(timestamp), INTERVAL 60 SECOND) AS ts, toFloat64(count(distinct(name))) as value from signoz_logs.distributed_logs where (timestamp >= 1680066360726210000 AND timestamp <= 1680066458000000000) group by ts order by value ASC,ts",
ExpectedQuery: "SELECT toStartOfInterval(fromUnixTimestamp64Nano(timestamp), INTERVAL 60 SECOND) AS ts, toFloat64(count(distinct(name))) as value from signoz_logs.distributed_logs where (timestamp >= 1680066360726210000 AND timestamp <= 1680066458000000000) group by ts order by value ASC",
},
{
Name: "Test aggregate count distinct on non selected field",
@@ -315,7 +316,7 @@ var testBuildLogsQueryData = []struct {
Expression: "A",
},
TableName: "logs",
ExpectedQuery: "SELECT toStartOfInterval(fromUnixTimestamp64Nano(timestamp), INTERVAL 60 SECOND) AS ts, toFloat64(count(distinct(attributes_string_value[indexOf(attributes_string_key, 'name')]))) as value from signoz_logs.distributed_logs where (timestamp >= 1680066360726210000 AND timestamp <= 1680066458000000000) group by ts order by ts",
ExpectedQuery: "SELECT toStartOfInterval(fromUnixTimestamp64Nano(timestamp), INTERVAL 60 SECOND) AS ts, toFloat64(count(distinct(attributes_string_value[indexOf(attributes_string_key, 'name')]))) as value from signoz_logs.distributed_logs where (timestamp >= 1680066360726210000 AND timestamp <= 1680066458000000000) group by ts order by value DESC",
},
{
Name: "Test aggregate count distinct with filter and groupBy",
@@ -344,7 +345,7 @@ var testBuildLogsQueryData = []struct {
"AND attributes_string_value[indexOf(attributes_string_key, 'method')] = 'GET' AND resources_string_value[indexOf(resources_string_key, 'x')] != 'abc' " +
"AND indexOf(attributes_string_key, 'method') > 0 " +
"group by method,ts " +
"order by method ASC,ts",
"order by method ASC",
},
{
Name: "Test aggregate count with multiple filter,groupBy and orderBy",
@@ -375,7 +376,7 @@ var testBuildLogsQueryData = []struct {
"AND indexOf(attributes_string_key, 'method') > 0 " +
"AND indexOf(resources_string_key, 'x') > 0 " +
"group by method,x,ts " +
"order by method ASC,x ASC,ts",
"order by method ASC,x ASC",
},
{
Name: "Test aggregate avg",
@@ -404,7 +405,7 @@ var testBuildLogsQueryData = []struct {
"AND attributes_string_value[indexOf(attributes_string_key, 'method')] = 'GET' " +
"AND indexOf(attributes_string_key, 'method') > 0 " +
"group by method,ts " +
"order by method ASC,ts",
"order by method ASC",
},
{
Name: "Test aggregate sum",
@@ -433,7 +434,7 @@ var testBuildLogsQueryData = []struct {
"AND attributes_string_value[indexOf(attributes_string_key, 'method')] = 'GET' " +
"AND indexOf(attributes_string_key, 'method') > 0 " +
"group by method,ts " +
"order by method ASC,ts",
"order by method ASC",
},
{
Name: "Test aggregate min",
@@ -462,7 +463,7 @@ var testBuildLogsQueryData = []struct {
"AND attributes_string_value[indexOf(attributes_string_key, 'method')] = 'GET' " +
"AND indexOf(attributes_string_key, 'method') > 0 " +
"group by method,ts " +
"order by method ASC,ts",
"order by method ASC",
},
{
Name: "Test aggregate max",
@@ -491,7 +492,7 @@ var testBuildLogsQueryData = []struct {
"AND attributes_string_value[indexOf(attributes_string_key, 'method')] = 'GET' " +
"AND indexOf(attributes_string_key, 'method') > 0 " +
"group by method,ts " +
"order by method ASC,ts",
"order by method ASC",
},
{
Name: "Test aggregate PXX",
@@ -516,7 +517,7 @@ var testBuildLogsQueryData = []struct {
"where (timestamp >= 1680066360726210000 AND timestamp <= 1680066458000000000) " +
"AND indexOf(attributes_string_key, 'method') > 0 " +
"group by method,ts " +
"order by method ASC,ts",
"order by method ASC",
},
{
Name: "Test aggregate RateSum",
@@ -538,7 +539,7 @@ var testBuildLogsQueryData = []struct {
", sum(bytes)/60 as value from signoz_logs.distributed_logs " +
"where (timestamp >= 1680066360726210000 AND timestamp <= 1680066458000000000) " +
"AND indexOf(attributes_string_key, 'method') > 0 " +
"group by method,ts order by method ASC,ts",
"group by method,ts order by method ASC",
},
{
Name: "Test aggregate rate",
@@ -561,7 +562,7 @@ var testBuildLogsQueryData = []struct {
"from signoz_logs.distributed_logs where (timestamp >= 1680066360726210000 AND timestamp <= 1680066458000000000) " +
"AND indexOf(attributes_string_key, 'method') > 0 " +
"group by method,ts " +
"order by method ASC,ts",
"order by method ASC",
},
{
Name: "Test aggregate RateSum without materialized column",
@@ -585,7 +586,7 @@ var testBuildLogsQueryData = []struct {
"from signoz_logs.distributed_logs where (timestamp >= 1680066360726210000 AND timestamp <= 1680066458000000000) " +
"AND indexOf(attributes_string_key, 'method') > 0 " +
"group by method,ts " +
"order by method ASC,ts",
"order by method ASC",
},
{
Name: "Test Noop",
@@ -603,7 +604,7 @@ var testBuildLogsQueryData = []struct {
ExpectedQuery: "SELECT timestamp, id, trace_id, span_id, trace_flags, severity_text, severity_number, body,CAST((attributes_string_key, attributes_string_value), 'Map(String, String)') as attributes_string," +
"CAST((attributes_int64_key, attributes_int64_value), 'Map(String, Int64)') as attributes_int64,CAST((attributes_float64_key, attributes_float64_value), 'Map(String, Float64)') as attributes_float64," +
"CAST((resources_string_key, resources_string_value), 'Map(String, String)') as resources_string " +
"from signoz_logs.distributed_logs where (timestamp >= 1680066360726210000 AND timestamp <= 1680066458000000000) order by timestamp",
"from signoz_logs.distributed_logs where (timestamp >= 1680066360726210000 AND timestamp <= 1680066458000000000) order by timestamp DESC",
},
{
Name: "Test Noop order by custom",
@@ -642,7 +643,7 @@ var testBuildLogsQueryData = []struct {
ExpectedQuery: "SELECT timestamp, id, trace_id, span_id, trace_flags, severity_text, severity_number, body,CAST((attributes_string_key, attributes_string_value), 'Map(String, String)') as attributes_string," +
"CAST((attributes_int64_key, attributes_int64_value), 'Map(String, Int64)') as attributes_int64,CAST((attributes_float64_key, attributes_float64_value), 'Map(String, Float64)') as attributes_float64," +
"CAST((resources_string_key, resources_string_value), 'Map(String, String)') as resources_string " +
"from signoz_logs.distributed_logs where (timestamp >= 1680066360726210000 AND timestamp <= 1680066458000000000) AND severity_number != 0 order by timestamp",
"from signoz_logs.distributed_logs where (timestamp >= 1680066360726210000 AND timestamp <= 1680066458000000000) AND severity_number != 0 order by timestamp DESC",
},
{
Name: "Test aggregate with having clause",
@@ -664,7 +665,7 @@ var testBuildLogsQueryData = []struct {
},
},
TableName: "logs",
ExpectedQuery: "SELECT toStartOfInterval(fromUnixTimestamp64Nano(timestamp), INTERVAL 60 SECOND) AS ts, toFloat64(count(distinct(attributes_string_value[indexOf(attributes_string_key, 'name')]))) as value from signoz_logs.distributed_logs where (timestamp >= 1680066360726210000 AND timestamp <= 1680066458000000000) group by ts having value > 10 order by ts",
ExpectedQuery: "SELECT toStartOfInterval(fromUnixTimestamp64Nano(timestamp), INTERVAL 60 SECOND) AS ts, toFloat64(count(distinct(attributes_string_value[indexOf(attributes_string_key, 'name')]))) as value from signoz_logs.distributed_logs where (timestamp >= 1680066360726210000 AND timestamp <= 1680066458000000000) group by ts having value > 10 order by value DESC",
},
{
Name: "Test aggregate with having clause and filters",
@@ -690,7 +691,7 @@ var testBuildLogsQueryData = []struct {
},
},
TableName: "logs",
ExpectedQuery: "SELECT toStartOfInterval(fromUnixTimestamp64Nano(timestamp), INTERVAL 60 SECOND) AS ts, toFloat64(count(distinct(attributes_string_value[indexOf(attributes_string_key, 'name')]))) as value from signoz_logs.distributed_logs where (timestamp >= 1680066360726210000 AND timestamp <= 1680066458000000000) AND attributes_string_value[indexOf(attributes_string_key, 'method')] = 'GET' group by ts having value > 10 order by ts",
ExpectedQuery: "SELECT toStartOfInterval(fromUnixTimestamp64Nano(timestamp), INTERVAL 60 SECOND) AS ts, toFloat64(count(distinct(attributes_string_value[indexOf(attributes_string_key, 'name')]))) as value from signoz_logs.distributed_logs where (timestamp >= 1680066360726210000 AND timestamp <= 1680066458000000000) AND attributes_string_value[indexOf(attributes_string_key, 'method')] = 'GET' group by ts having value > 10 order by value DESC",
},
{
Name: "Test top level key",
@@ -716,7 +717,7 @@ var testBuildLogsQueryData = []struct {
},
},
TableName: "logs",
ExpectedQuery: "SELECT toStartOfInterval(fromUnixTimestamp64Nano(timestamp), INTERVAL 60 SECOND) AS ts, toFloat64(count(distinct(attributes_string_value[indexOf(attributes_string_key, 'name')]))) as value from signoz_logs.distributed_logs where (timestamp >= 1680066360726210000 AND timestamp <= 1680066458000000000) AND body ILIKE '%test%' group by ts having value > 10 order by ts",
ExpectedQuery: "SELECT toStartOfInterval(fromUnixTimestamp64Nano(timestamp), INTERVAL 60 SECOND) AS ts, toFloat64(count(distinct(attributes_string_value[indexOf(attributes_string_key, 'name')]))) as value from signoz_logs.distributed_logs where (timestamp >= 1680066360726210000 AND timestamp <= 1680066458000000000) AND body ILIKE '%test%' group by ts having value > 10 order by value DESC",
},
{
Name: "Test attribute with same name as top level key",
@@ -742,7 +743,7 @@ var testBuildLogsQueryData = []struct {
},
},
TableName: "logs",
ExpectedQuery: "SELECT toStartOfInterval(fromUnixTimestamp64Nano(timestamp), INTERVAL 60 SECOND) AS ts, toFloat64(count(distinct(attributes_string_value[indexOf(attributes_string_key, 'name')]))) as value from signoz_logs.distributed_logs where (timestamp >= 1680066360726210000 AND timestamp <= 1680066458000000000) AND attributes_string_value[indexOf(attributes_string_key, 'body')] ILIKE '%test%' group by ts having value > 10 order by ts",
ExpectedQuery: "SELECT toStartOfInterval(fromUnixTimestamp64Nano(timestamp), INTERVAL 60 SECOND) AS ts, toFloat64(count(distinct(attributes_string_value[indexOf(attributes_string_key, 'name')]))) as value from signoz_logs.distributed_logs where (timestamp >= 1680066360726210000 AND timestamp <= 1680066458000000000) AND attributes_string_value[indexOf(attributes_string_key, 'body')] ILIKE '%test%' group by ts having value > 10 order by value DESC",
},
// Tests for table panel type
@@ -758,7 +759,7 @@ var testBuildLogsQueryData = []struct {
Expression: "A",
},
TableName: "logs",
ExpectedQuery: "SELECT now() as ts, toFloat64(count(*)) as value from signoz_logs.distributed_logs where (timestamp >= 1680066360726210000 AND timestamp <= 1680066458000000000)",
ExpectedQuery: "SELECT now() as ts, toFloat64(count(*)) as value from signoz_logs.distributed_logs where (timestamp >= 1680066360726210000 AND timestamp <= 1680066458000000000) order by value DESC",
},
{
Name: "TABLE: Test count with groupBy",
@@ -775,7 +776,7 @@ var testBuildLogsQueryData = []struct {
},
},
TableName: "logs",
ExpectedQuery: "SELECT now() as ts, attributes_string_value[indexOf(attributes_string_key, 'name')] as name, toFloat64(count(*)) as value from signoz_logs.distributed_logs where (timestamp >= 1680066360726210000 AND timestamp <= 1680066458000000000) AND indexOf(attributes_string_key, 'name') > 0 group by name order by name ASC",
ExpectedQuery: "SELECT now() as ts, attributes_string_value[indexOf(attributes_string_key, 'name')] as name, toFloat64(count(*)) as value from signoz_logs.distributed_logs where (timestamp >= 1680066360726210000 AND timestamp <= 1680066458000000000) AND indexOf(attributes_string_key, 'name') > 0 group by name order by value DESC",
},
{
Name: "TABLE: Test count with groupBy, orderBy",
@@ -802,7 +803,8 @@ var testBuildLogsQueryData = []struct {
func TestBuildLogsQuery(t *testing.T) {
for _, tt := range testBuildLogsQueryData {
Convey("TestBuildLogsQuery", t, func() {
query, err := buildLogsQuery(tt.PanelType, tt.Start, tt.End, tt.Step, tt.BuilderQuery)
query, err := buildLogsQuery(tt.PanelType, tt.Start, tt.End, tt.Step, tt.BuilderQuery, "")
fmt.Println(query)
So(err, ShouldBeNil)
So(query, ShouldEqual, tt.ExpectedQuery)
@@ -844,8 +846,8 @@ var testOrderBy = []struct {
Name string
PanelType v3.PanelType
Items []v3.OrderBy
Tags []string
Result []string
Tags []v3.AttributeKey
Result string
}{
{
Name: "Test 1",
@@ -860,8 +862,10 @@ var testOrderBy = []struct {
Order: "desc",
},
},
Tags: []string{"name"},
Result: []string{"name asc", "value desc"},
Tags: []v3.AttributeKey{
{Key: "name"},
},
Result: "name asc,value desc",
},
{
Name: "Test 2",
@@ -876,8 +880,34 @@ var testOrderBy = []struct {
Order: "asc",
},
},
Tags: []string{"name", "bytes"},
Result: []string{"name asc", "bytes asc"},
Tags: []v3.AttributeKey{
{Key: "name"},
{Key: "bytes"},
},
Result: "name asc,bytes asc",
},
{
Name: "Test Graph item not present in tag",
PanelType: v3.PanelTypeGraph,
Items: []v3.OrderBy{
{
ColumnName: "name",
Order: "asc",
},
{
ColumnName: "bytes",
Order: "asc",
},
{
ColumnName: "method",
Order: "asc",
},
},
Tags: []v3.AttributeKey{
{Key: "name"},
{Key: "bytes"},
},
Result: "name asc,bytes asc",
},
{
Name: "Test 3",
@@ -896,8 +926,11 @@ var testOrderBy = []struct {
Order: "asc",
},
},
Tags: []string{"name", "bytes"},
Result: []string{"name asc", "bytes asc", "value asc"},
Tags: []v3.AttributeKey{
{Key: "name"},
{Key: "bytes"},
},
Result: "name asc,value asc,bytes asc",
},
{
Name: "Test 4",
@@ -923,16 +956,163 @@ var testOrderBy = []struct {
DataType: v3.AttributeKeyDataTypeString,
},
},
Tags: []string{"name", "bytes"},
Result: []string{"name asc", "bytes asc", "value asc", "attributes_string_value[indexOf(attributes_string_key, 'response_time')] desc"},
Tags: []v3.AttributeKey{
{Key: "name"},
{Key: "bytes"},
},
Result: "name asc,value asc,bytes asc,attributes_string_value[indexOf(attributes_string_key, 'response_time')] desc",
},
}
func TestOrderBy(t *testing.T) {
for _, tt := range testOrderBy {
Convey("testOrderBy", t, func() {
res := orderBy(tt.PanelType, tt.Items, tt.Tags)
res := orderByAttributeKeyTags(tt.PanelType, tt.Items, tt.Tags)
So(res, ShouldResemble, tt.Result)
})
}
}
// if there is no group by, then a limit serves no purpose for ts and table queries
// since the above will result in a single series
// handle it only when there is a group by
var testPrepLogsQueryData = []struct {
Name string
PanelType v3.PanelType
Start int64
End int64
Step int64
BuilderQuery *v3.BuilderQuery
GroupByTags []v3.AttributeKey
TableName string
AggregateOperator v3.AggregateOperator
ExpectedQuery string
Type string
}{
{
Name: "Test TS with limit- first",
PanelType: v3.PanelTypeGraph,
Start: 1680066360726210000,
End: 1680066458000000000,
Step: 60,
BuilderQuery: &v3.BuilderQuery{
QueryName: "A",
AggregateAttribute: v3.AttributeKey{Key: "name", DataType: v3.AttributeKeyDataTypeString, Type: v3.AttributeKeyTypeTag},
AggregateOperator: v3.AggregateOperatorCountDistinct,
Expression: "A",
Filters: &v3.FilterSet{Operator: "AND", Items: []v3.FilterItem{
{Key: v3.AttributeKey{Key: "method", DataType: v3.AttributeKeyDataTypeString, Type: v3.AttributeKeyTypeTag}, Value: "GET", Operator: "="},
},
},
Limit: 10,
GroupBy: []v3.AttributeKey{{Key: "method", DataType: v3.AttributeKeyDataTypeString, Type: v3.AttributeKeyTypeTag}},
},
TableName: "logs",
ExpectedQuery: "SELECT method from (SELECT attributes_string_value[indexOf(attributes_string_key, 'method')] as method, toFloat64(count(distinct(attributes_string_value[indexOf(attributes_string_key, 'name')]))) as value from signoz_logs.distributed_logs where (timestamp >= 1680066360726210000 AND timestamp <= 1680066458000000000) AND attributes_string_value[indexOf(attributes_string_key, 'method')] = 'GET' AND indexOf(attributes_string_key, 'method') > 0 group by method order by value DESC) LIMIT 10",
Type: constants.FirstQueryGraphLimit,
},
{
Name: "Test TS with limit- first - with order by value",
PanelType: v3.PanelTypeGraph,
Start: 1680066360726210000,
End: 1680066458000000000,
Step: 60,
BuilderQuery: &v3.BuilderQuery{
QueryName: "A",
AggregateAttribute: v3.AttributeKey{Key: "name", DataType: v3.AttributeKeyDataTypeString, Type: v3.AttributeKeyTypeTag},
AggregateOperator: v3.AggregateOperatorCountDistinct,
Expression: "A",
Filters: &v3.FilterSet{Operator: "AND", Items: []v3.FilterItem{
{Key: v3.AttributeKey{Key: "method", DataType: v3.AttributeKeyDataTypeString, Type: v3.AttributeKeyTypeTag}, Value: "GET", Operator: "="},
},
},
Limit: 10,
GroupBy: []v3.AttributeKey{{Key: "method", DataType: v3.AttributeKeyDataTypeString, Type: v3.AttributeKeyTypeTag}},
OrderBy: []v3.OrderBy{{ColumnName: constants.SigNozOrderByValue, Order: "ASC"}},
},
TableName: "logs",
ExpectedQuery: "SELECT method from (SELECT attributes_string_value[indexOf(attributes_string_key, 'method')] as method, toFloat64(count(distinct(attributes_string_value[indexOf(attributes_string_key, 'name')]))) as value from signoz_logs.distributed_logs where (timestamp >= 1680066360726210000 AND timestamp <= 1680066458000000000) AND attributes_string_value[indexOf(attributes_string_key, 'method')] = 'GET' AND indexOf(attributes_string_key, 'method') > 0 group by method order by value ASC) LIMIT 10",
Type: constants.FirstQueryGraphLimit,
},
{
Name: "Test TS with limit- first - with order by attribute",
PanelType: v3.PanelTypeGraph,
Start: 1680066360726210000,
End: 1680066458000000000,
Step: 60,
BuilderQuery: &v3.BuilderQuery{
QueryName: "A",
AggregateAttribute: v3.AttributeKey{Key: "name", DataType: v3.AttributeKeyDataTypeString, Type: v3.AttributeKeyTypeTag},
AggregateOperator: v3.AggregateOperatorCountDistinct,
Expression: "A",
Filters: &v3.FilterSet{Operator: "AND", Items: []v3.FilterItem{
{Key: v3.AttributeKey{Key: "method", DataType: v3.AttributeKeyDataTypeString, Type: v3.AttributeKeyTypeTag}, Value: "GET", Operator: "="},
},
},
Limit: 10,
GroupBy: []v3.AttributeKey{{Key: "method", DataType: v3.AttributeKeyDataTypeString, Type: v3.AttributeKeyTypeTag}},
OrderBy: []v3.OrderBy{{ColumnName: "method", Order: "ASC"}},
},
TableName: "logs",
ExpectedQuery: "SELECT method from (SELECT attributes_string_value[indexOf(attributes_string_key, 'method')] as method, toFloat64(count(distinct(attributes_string_value[indexOf(attributes_string_key, 'name')]))) as value from signoz_logs.distributed_logs where (timestamp >= 1680066360726210000 AND timestamp <= 1680066458000000000) AND attributes_string_value[indexOf(attributes_string_key, 'method')] = 'GET' AND indexOf(attributes_string_key, 'method') > 0 group by method order by method ASC) LIMIT 10",
Type: constants.FirstQueryGraphLimit,
},
{
Name: "Test TS with limit- second",
PanelType: v3.PanelTypeGraph,
Start: 1680066360726210000,
End: 1680066458000000000,
Step: 60,
BuilderQuery: &v3.BuilderQuery{
QueryName: "A",
AggregateAttribute: v3.AttributeKey{Key: "name", DataType: v3.AttributeKeyDataTypeString, Type: v3.AttributeKeyTypeTag},
AggregateOperator: v3.AggregateOperatorCountDistinct,
Expression: "A",
Filters: &v3.FilterSet{Operator: "AND", Items: []v3.FilterItem{
{Key: v3.AttributeKey{Key: "method", DataType: v3.AttributeKeyDataTypeString, Type: v3.AttributeKeyTypeTag}, Value: "GET", Operator: "="},
},
},
GroupBy: []v3.AttributeKey{{Key: "method", DataType: v3.AttributeKeyDataTypeString, Type: v3.AttributeKeyTypeTag}},
Limit: 2,
},
TableName: "logs",
ExpectedQuery: "SELECT toStartOfInterval(fromUnixTimestamp64Nano(timestamp), INTERVAL 0 SECOND) AS ts, attributes_string_value[indexOf(attributes_string_key, 'method')] as method, toFloat64(count(distinct(attributes_string_value[indexOf(attributes_string_key, 'name')]))) as value from signoz_logs.distributed_logs where (timestamp >= 1680066360726210000 AND timestamp <= 1680066458000000000) AND attributes_string_value[indexOf(attributes_string_key, 'method')] = 'GET' AND indexOf(attributes_string_key, 'method') > 0 AND (method) IN (%s) group by method,ts order by value DESC",
Type: constants.SecondQueryGraphLimit,
},
{
Name: "Test TS with limit- second - with order by",
PanelType: v3.PanelTypeGraph,
Start: 1680066360726210000,
End: 1680066458000000000,
Step: 60,
BuilderQuery: &v3.BuilderQuery{
QueryName: "A",
AggregateAttribute: v3.AttributeKey{Key: "name", DataType: v3.AttributeKeyDataTypeString, Type: v3.AttributeKeyTypeTag},
AggregateOperator: v3.AggregateOperatorCountDistinct,
Expression: "A",
Filters: &v3.FilterSet{Operator: "AND", Items: []v3.FilterItem{
{Key: v3.AttributeKey{Key: "method", DataType: v3.AttributeKeyDataTypeString, Type: v3.AttributeKeyTypeTag}, Value: "GET", Operator: "="},
},
},
GroupBy: []v3.AttributeKey{{Key: "method", DataType: v3.AttributeKeyDataTypeString, Type: v3.AttributeKeyTypeTag}},
OrderBy: []v3.OrderBy{{ColumnName: "method", Order: "ASC"}},
Limit: 2,
},
TableName: "logs",
ExpectedQuery: "SELECT toStartOfInterval(fromUnixTimestamp64Nano(timestamp), INTERVAL 0 SECOND) AS ts, attributes_string_value[indexOf(attributes_string_key, 'method')] as method, toFloat64(count(distinct(attributes_string_value[indexOf(attributes_string_key, 'name')]))) as value from signoz_logs.distributed_logs where (timestamp >= 1680066360726210000 AND timestamp <= 1680066458000000000) AND attributes_string_value[indexOf(attributes_string_key, 'method')] = 'GET' AND indexOf(attributes_string_key, 'method') > 0 AND (method) IN (%s) group by method,ts order by method ASC",
Type: constants.SecondQueryGraphLimit,
},
}
func TestPrepareLogsQuery(t *testing.T) {
for _, tt := range testPrepLogsQueryData {
Convey("TestBuildLogsQuery", t, func() {
query, err := PrepareLogsQuery(tt.Start, tt.End, "", tt.PanelType, tt.BuilderQuery, tt.Type)
So(err, ShouldBeNil)
So(query, ShouldEqual, tt.ExpectedQuery)
})
}
}


@@ -0,0 +1,196 @@
package v3
import (
"fmt"
"math"
"go.signoz.io/signoz/pkg/query-service/constants"
v3 "go.signoz.io/signoz/pkg/query-service/model/v3"
"go.signoz.io/signoz/pkg/query-service/utils"
)
// This logic is a little convoluted for a reason.
// When we work with cumulative metrics, the table view needs to show the data for the entire time range.
// In some cases, we could take the points at the start and end of the time range and divide by the duration.
// But the problem is there is no guarantee that the trend will be linear between the start and end.
// We can sum the rate of change over some interval X, where this interval is the step size of the time series.
// However, the query speed depends on the number of timestamps, so we bump up the step size for longer ranges.
// This should be a good balance between speed and accuracy.
// TODO: find a better way to do this
func stepForTableCumulative(start, end int64) int64 {
// aim for at most ~120 points, with a minimum step of 60 seconds
duration := (end - start + 1) / 1000
step := math.Max(math.Floor(float64(duration)/120), 60) // assuming 120 max points
if duration > 1800 { // bump for longer duration
step = step * 5
}
return int64(step)
}
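To make the comment above concrete, here is the same arithmetic for a hypothetical one-hour window, where the longer-duration bump kicks in:
package main

import (
	"fmt"
	"math"
)

func main() {
	var start, end int64 = 0, 3600*1000 - 1                 // one hour, in milliseconds
	duration := (end - start + 1) / 1000                    // 3600 seconds
	step := math.Max(math.Floor(float64(duration)/120), 60) // max(30, 60) = 60
	if duration > 1800 {                                    // longer window, bump the step
		step = step * 5 // 300 second buckets
	}
	fmt.Println(int64(step)) // prints 300
}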
func buildMetricQueryForTable(start, end, _ int64, mq *v3.BuilderQuery, tableName string) (string, error) {
step := stepForTableCumulative(start, end)
points := ((end - start + 1) / 1000) / step
metricQueryGroupBy := mq.GroupBy
// if the aggregate operator is a histogram quantile and the user has not
// included the le tag in the group by, add the le tag to the group by
if mq.AggregateOperator == v3.AggregateOperatorHistQuant50 ||
mq.AggregateOperator == v3.AggregateOperatorHistQuant75 ||
mq.AggregateOperator == v3.AggregateOperatorHistQuant90 ||
mq.AggregateOperator == v3.AggregateOperatorHistQuant95 ||
mq.AggregateOperator == v3.AggregateOperatorHistQuant99 {
found := false
for _, tag := range mq.GroupBy {
if tag.Key == "le" {
found = true
break
}
}
if !found {
metricQueryGroupBy = append(
metricQueryGroupBy,
v3.AttributeKey{
Key: "le",
DataType: v3.AttributeKeyDataTypeString,
Type: v3.AttributeKeyTypeTag,
IsColumn: false,
},
)
}
}
filterSubQuery, err := buildMetricsTimeSeriesFilterQuery(mq.Filters, metricQueryGroupBy, mq)
if err != nil {
return "", err
}
samplesTableTimeFilter := fmt.Sprintf("metric_name = %s AND timestamp_ms >= %d AND timestamp_ms <= %d", utils.ClickHouseFormattedValue(mq.AggregateAttribute.Key), start, end)
// Select the aggregate value for interval
queryTmplCounterInner :=
"SELECT %s" +
" toStartOfInterval(toDateTime(intDiv(timestamp_ms, 1000)), INTERVAL %d SECOND) as ts," +
" %s as value" +
" FROM " + constants.SIGNOZ_METRIC_DBNAME + "." + constants.SIGNOZ_SAMPLES_TABLENAME +
" GLOBAL INNER JOIN" +
" (%s) as filtered_time_series" +
" USING fingerprint" +
" WHERE " + samplesTableTimeFilter +
" GROUP BY %s" +
" ORDER BY %s ts"
// Select the aggregate value for interval
queryTmpl :=
"SELECT %s" +
" toStartOfHour(now()) as ts," + // now() has no menaing & used as a placeholder for ts
" %s as value" +
" FROM " + constants.SIGNOZ_METRIC_DBNAME + "." + constants.SIGNOZ_SAMPLES_TABLENAME +
" GLOBAL INNER JOIN" +
" (%s) as filtered_time_series" +
" USING fingerprint" +
" WHERE " + samplesTableTimeFilter +
" GROUP BY %s" +
" ORDER BY %s ts"
// tagsWithoutLe is used to group by all tags except le
// This is done because we want to group by le only when we are calculating quantile
// Otherwise, we want to group by all tags except le
tagsWithoutLe := []string{}
for _, tag := range mq.GroupBy {
if tag.Key != "le" {
tagsWithoutLe = append(tagsWithoutLe, tag.Key)
}
}
// orderWithoutLe := orderBy(mq.OrderBy, tagsWithoutLe)
groupByWithoutLe := groupBy(tagsWithoutLe...)
groupTagsWithoutLe := groupSelect(tagsWithoutLe...)
orderWithoutLe := orderBy(mq.OrderBy, tagsWithoutLe)
groupBy := groupByAttributeKeyTags(metricQueryGroupBy...)
groupTags := groupSelectAttributeKeyTags(metricQueryGroupBy...)
orderBy := orderByAttributeKeyTags(mq.OrderBy, metricQueryGroupBy)
if len(orderBy) != 0 {
orderBy += ","
}
if len(orderWithoutLe) != 0 {
orderWithoutLe += ","
}
switch mq.AggregateOperator {
case v3.AggregateOperatorRate:
return "", fmt.Errorf("rate is not supported for table view")
case v3.AggregateOperatorSumRate, v3.AggregateOperatorAvgRate, v3.AggregateOperatorMaxRate, v3.AggregateOperatorMinRate:
rateGroupBy := "fingerprint, " + groupBy
rateGroupTags := "fingerprint, " + groupTags
rateOrderBy := "fingerprint, " + orderBy
op := "max(value)"
subQuery := fmt.Sprintf(
queryTmplCounterInner, rateGroupTags, step, op, filterSubQuery, rateGroupBy, rateOrderBy,
) // labels will be the same, so any should be fine
query := `SELECT %s ts, ` + rateWithoutNegative + `as value FROM(%s) WHERE isNaN(value) = 0`
query = fmt.Sprintf(query, groupTags, subQuery)
query = fmt.Sprintf(`SELECT %s toStartOfHour(now()) as ts, %s(value)/%d as value FROM (%s) GROUP BY %s ORDER BY %s ts`, groupTags, aggregateOperatorToSQLFunc[mq.AggregateOperator], points, query, groupBy, orderBy)
return query, nil
case
v3.AggregateOperatorRateSum,
v3.AggregateOperatorRateMax,
v3.AggregateOperatorRateAvg,
v3.AggregateOperatorRateMin:
step = ((end - start + 1) / 1000) / 2
op := fmt.Sprintf("%s(value)", aggregateOperatorToSQLFunc[mq.AggregateOperator])
subQuery := fmt.Sprintf(queryTmplCounterInner, groupTags, step, op, filterSubQuery, groupBy, orderBy)
query := `SELECT %s toStartOfHour(now()) as ts, ` + rateWithoutNegative + `as value FROM(%s) WHERE isNaN(value) = 0`
query = fmt.Sprintf(query, groupTags, subQuery)
return query, nil
case
v3.AggregateOperatorP05,
v3.AggregateOperatorP10,
v3.AggregateOperatorP20,
v3.AggregateOperatorP25,
v3.AggregateOperatorP50,
v3.AggregateOperatorP75,
v3.AggregateOperatorP90,
v3.AggregateOperatorP95,
v3.AggregateOperatorP99:
op := fmt.Sprintf("quantile(%v)(value)", aggregateOperatorToPercentile[mq.AggregateOperator])
query := fmt.Sprintf(queryTmpl, groupTags, op, filterSubQuery, groupBy, orderBy)
return query, nil
case v3.AggregateOperatorHistQuant50, v3.AggregateOperatorHistQuant75, v3.AggregateOperatorHistQuant90, v3.AggregateOperatorHistQuant95, v3.AggregateOperatorHistQuant99:
rateGroupBy := "fingerprint, " + groupBy
rateGroupTags := "fingerprint, " + groupTags
rateOrderBy := "fingerprint, " + orderBy
op := "max(value)"
subQuery := fmt.Sprintf(
queryTmplCounterInner, rateGroupTags, step, op, filterSubQuery, rateGroupBy, rateOrderBy,
) // labels will be the same, so any should be fine
query := `SELECT %s ts, ` + rateWithoutNegative + ` as value FROM(%s) WHERE isNaN(value) = 0`
query = fmt.Sprintf(query, groupTags, subQuery)
query = fmt.Sprintf(`SELECT %s toStartOfHour(now()) as ts, sum(value)/%d as value FROM (%s) GROUP BY %s HAVING isNaN(value) = 0 ORDER BY %s ts`, groupTags, points, query, groupBy, orderBy)
value := aggregateOperatorToPercentile[mq.AggregateOperator]
query = fmt.Sprintf(`SELECT %s toStartOfHour(now()) as ts, histogramQuantile(arrayMap(x -> toFloat64(x), groupArray(le)), groupArray(value), %.3f) as value FROM (%s) GROUP BY %s ORDER BY %s ts`, groupTagsWithoutLe, value, query, groupByWithoutLe, orderWithoutLe)
return query, nil
case v3.AggregateOperatorAvg, v3.AggregateOperatorSum, v3.AggregateOperatorMin, v3.AggregateOperatorMax:
op := fmt.Sprintf("%s(value)", aggregateOperatorToSQLFunc[mq.AggregateOperator])
query := fmt.Sprintf(queryTmpl, groupTags, op, filterSubQuery, groupBy, orderBy)
return query, nil
case v3.AggregateOperatorCount:
op := "toFloat64(count(*))"
query := fmt.Sprintf(queryTmpl, groupTags, op, filterSubQuery, groupBy, orderBy)
return query, nil
case v3.AggregateOperatorCountDistinct:
op := "toFloat64(count(distinct(value)))"
query := fmt.Sprintf(queryTmpl, groupTags, op, filterSubQuery, groupBy, orderBy)
return query, nil
case v3.AggregateOperatorNoOp:
return "", fmt.Errorf("noop is not supported for table view")
default:
return "", fmt.Errorf("unsupported aggregate operator")
}
}


@@ -0,0 +1,99 @@
package v3
import (
"testing"
v3 "go.signoz.io/signoz/pkg/query-service/model/v3"
)
func TestPanelTableForCumulative(t *testing.T) {
cases := []struct {
name string
query *v3.BuilderQuery
expected string
}{
{
name: "request rate",
query: &v3.BuilderQuery{
QueryName: "A",
DataSource: v3.DataSourceMetrics,
AggregateOperator: v3.AggregateOperatorSumRate,
AggregateAttribute: v3.AttributeKey{
Key: "signoz_latency_count",
},
Temporality: v3.Cumulative,
Filters: &v3.FilterSet{
Items: []v3.FilterItem{
{
Key: v3.AttributeKey{Key: "service_name"},
Operator: v3.FilterOperatorIn,
Value: []interface{}{"frontend"},
},
{
Key: v3.AttributeKey{Key: "operation"},
Operator: v3.FilterOperatorIn,
Value: []interface{}{"HTTP GET /dispatch"},
},
},
},
Expression: "A",
},
expected: "SELECT toStartOfHour(now()) as ts, sum(value)/29 as value FROM (SELECT ts, if(runningDifference(ts) <= 0, nan, if(runningDifference(value) < 0, (value) / runningDifference(ts), runningDifference(value) / runningDifference(ts))) as value FROM(SELECT fingerprint, toStartOfInterval(toDateTime(intDiv(timestamp_ms, 1000)), INTERVAL 60 SECOND) as ts, max(value) as value FROM signoz_metrics.distributed_samples_v2 GLOBAL INNER JOIN (SELECT fingerprint FROM signoz_metrics.distributed_time_series_v2 WHERE metric_name = 'signoz_latency_count' AND temporality IN ['Cumulative', 'Unspecified'] AND JSONExtractString(labels, 'service_name') IN ['frontend'] AND JSONExtractString(labels, 'operation') IN ['HTTP GET /dispatch']) as filtered_time_series USING fingerprint WHERE metric_name = 'signoz_latency_count' AND timestamp_ms >= 1689255866000 AND timestamp_ms <= 1689257640000 GROUP BY fingerprint, ts ORDER BY fingerprint, ts) WHERE isNaN(value) = 0) GROUP BY ts ORDER BY ts",
},
{
name: "latency p50",
query: &v3.BuilderQuery{
QueryName: "A",
DataSource: v3.DataSourceMetrics,
AggregateOperator: v3.AggregateOperatorHistQuant50,
AggregateAttribute: v3.AttributeKey{
Key: "signoz_latency_bucket",
},
Temporality: v3.Cumulative,
Filters: &v3.FilterSet{
Items: []v3.FilterItem{
{
Key: v3.AttributeKey{Key: "service_name"},
Operator: v3.FilterOperatorEqual,
Value: "frontend",
},
},
},
Expression: "A",
},
expected: "SELECT toStartOfHour(now()) as ts, histogramQuantile(arrayMap(x -> toFloat64(x), groupArray(le)), groupArray(value), 0.500) as value FROM (SELECT le, toStartOfHour(now()) as ts, sum(value)/29 as value FROM (SELECT le, ts, if(runningDifference(ts) <= 0, nan, if(runningDifference(value) < 0, (value) / runningDifference(ts), runningDifference(value) / runningDifference(ts))) as value FROM(SELECT fingerprint, le, toStartOfInterval(toDateTime(intDiv(timestamp_ms, 1000)), INTERVAL 60 SECOND) as ts, max(value) as value FROM signoz_metrics.distributed_samples_v2 GLOBAL INNER JOIN (SELECT JSONExtractString(labels, 'le') as le, fingerprint FROM signoz_metrics.distributed_time_series_v2 WHERE metric_name = 'signoz_latency_bucket' AND temporality IN ['Cumulative', 'Unspecified'] AND JSONExtractString(labels, 'service_name') = 'frontend') as filtered_time_series USING fingerprint WHERE metric_name = 'signoz_latency_bucket' AND timestamp_ms >= 1689255866000 AND timestamp_ms <= 1689257640000 GROUP BY fingerprint, le,ts ORDER BY fingerprint, le ASC, ts) WHERE isNaN(value) = 0) GROUP BY le,ts HAVING isNaN(value) = 0 ORDER BY le ASC, ts) GROUP BY ts ORDER BY ts",
},
{
name: "latency p99 with group by",
query: &v3.BuilderQuery{
QueryName: "A",
DataSource: v3.DataSourceMetrics,
AggregateOperator: v3.AggregateOperatorHistQuant99,
AggregateAttribute: v3.AttributeKey{
Key: "signoz_latency_bucket",
},
Temporality: v3.Cumulative,
GroupBy: []v3.AttributeKey{
{
Key: "service_name",
},
},
Expression: "A",
},
expected: "SELECT service_name, toStartOfHour(now()) as ts, histogramQuantile(arrayMap(x -> toFloat64(x), groupArray(le)), groupArray(value), 0.990) as value FROM (SELECT service_name,le, toStartOfHour(now()) as ts, sum(value)/29 as value FROM (SELECT service_name,le, ts, if(runningDifference(ts) <= 0, nan, if(runningDifference(value) < 0, (value) / runningDifference(ts), runningDifference(value) / runningDifference(ts))) as value FROM(SELECT fingerprint, service_name,le, toStartOfInterval(toDateTime(intDiv(timestamp_ms, 1000)), INTERVAL 60 SECOND) as ts, max(value) as value FROM signoz_metrics.distributed_samples_v2 GLOBAL INNER JOIN (SELECT JSONExtractString(labels, 'service_name') as service_name, JSONExtractString(labels, 'le') as le, fingerprint FROM signoz_metrics.distributed_time_series_v2 WHERE metric_name = 'signoz_latency_bucket' AND temporality IN ['Cumulative', 'Unspecified']) as filtered_time_series USING fingerprint WHERE metric_name = 'signoz_latency_bucket' AND timestamp_ms >= 1689255866000 AND timestamp_ms <= 1689257640000 GROUP BY fingerprint, service_name,le,ts ORDER BY fingerprint, service_name ASC,le ASC, ts) WHERE isNaN(value) = 0) GROUP BY service_name,le,ts HAVING isNaN(value) = 0 ORDER BY service_name ASC,le ASC, ts) GROUP BY service_name,ts ORDER BY service_name ASC, ts",
},
}
for _, c := range cases {
t.Run(c.name, func(t *testing.T) {
query, err := buildMetricQueryForTable(1689255866000, 1689257640000, 1800, c.query, "distributed_time_series_v2")
if err != nil {
t.Fatalf("unexpected error: %v", err)
}
if query != c.expected {
t.Fatalf("expected: %s, got: %s", c.expected, query)
}
})
}
}
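The sum(value)/29 factor in the expected queries above follows from the step and points arithmetic in buildMetricQueryForTable for this test's time range; a quick check:
package main

import (
	"fmt"
	"math"
)

func main() {
	// Time range used by the cumulative table tests, in milliseconds.
	var start, end int64 = 1689255866000, 1689257640000
	duration := (end - start + 1) / 1000                    // 1774 seconds
	step := math.Max(math.Floor(float64(duration)/120), 60) // max(14, 60) = 60
	if duration > 1800 {                                    // 1774 <= 1800, so no bump
		step = step * 5
	}
	points := duration / int64(step)
	fmt.Println(points) // 29, matching the sum(value)/29 in the expected SQL
}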


@@ -0,0 +1,148 @@
package v3
import (
"fmt"
"math"
"go.signoz.io/signoz/pkg/query-service/constants"
v3 "go.signoz.io/signoz/pkg/query-service/model/v3"
"go.signoz.io/signoz/pkg/query-service/utils"
)
func buildDeltaMetricQueryForTable(start, end, _ int64, mq *v3.BuilderQuery, tableName string) (string, error) {
// round up to the nearest multiple of 60
step := int64(math.Ceil(float64(end-start+1)/1000/60) * 60)
metricQueryGroupBy := mq.GroupBy
// if the aggregate operator is a histogram quantile and the user has not
// included the le tag in the group by, add the le tag to the group by
if mq.AggregateOperator == v3.AggregateOperatorHistQuant50 ||
mq.AggregateOperator == v3.AggregateOperatorHistQuant75 ||
mq.AggregateOperator == v3.AggregateOperatorHistQuant90 ||
mq.AggregateOperator == v3.AggregateOperatorHistQuant95 ||
mq.AggregateOperator == v3.AggregateOperatorHistQuant99 {
found := false
for _, tag := range mq.GroupBy {
if tag.Key == "le" {
found = true
break
}
}
if !found {
metricQueryGroupBy = append(
metricQueryGroupBy,
v3.AttributeKey{
Key: "le",
DataType: v3.AttributeKeyDataTypeString,
Type: v3.AttributeKeyTypeTag,
IsColumn: false,
},
)
}
}
filterSubQuery, err := buildMetricsTimeSeriesFilterQuery(mq.Filters, metricQueryGroupBy, mq)
if err != nil {
return "", err
}
samplesTableTimeFilter := fmt.Sprintf("metric_name = %s AND timestamp_ms >= %d AND timestamp_ms <= %d", utils.ClickHouseFormattedValue(mq.AggregateAttribute.Key), start, end)
queryTmpl :=
"SELECT %s toStartOfHour(now()) as ts," + // now() has no menaing & used as a placeholder for ts
" %s as value" +
" FROM " + constants.SIGNOZ_METRIC_DBNAME + "." + constants.SIGNOZ_SAMPLES_TABLENAME +
" GLOBAL INNER JOIN" +
" (%s) as filtered_time_series" +
" USING fingerprint" +
" WHERE " + samplesTableTimeFilter +
" GROUP BY %s" +
" ORDER BY %s ts"
// tagsWithoutLe is used to group by all tags except le
// This is done because we want to group by le only when we are calculating quantile
// Otherwise, we want to group by all tags except le
tagsWithoutLe := []string{}
for _, tag := range mq.GroupBy {
if tag.Key != "le" {
tagsWithoutLe = append(tagsWithoutLe, tag.Key)
}
}
groupByWithoutLeTable := groupBy(tagsWithoutLe...)
groupTagsWithoutLeTable := groupSelect(tagsWithoutLe...)
orderWithoutLeTable := orderBy(mq.OrderBy, tagsWithoutLe)
groupBy := groupByAttributeKeyTags(metricQueryGroupBy...)
groupTags := groupSelectAttributeKeyTags(metricQueryGroupBy...)
orderBy := orderByAttributeKeyTags(mq.OrderBy, metricQueryGroupBy)
if len(orderBy) != 0 {
orderBy += ","
}
if len(orderWithoutLeTable) != 0 {
orderWithoutLeTable += ","
}
switch mq.AggregateOperator {
case v3.AggregateOperatorRate:
// TODO(srikanthccv): what should be the expected behavior here for metrics?
return "", fmt.Errorf("rate is not supported for table view")
case v3.AggregateOperatorSumRate, v3.AggregateOperatorAvgRate, v3.AggregateOperatorMaxRate, v3.AggregateOperatorMinRate:
op := fmt.Sprintf("%s(value)/%d", aggregateOperatorToSQLFunc[mq.AggregateOperator], step)
query := fmt.Sprintf(
queryTmpl, groupTags, op, filterSubQuery, groupBy, orderBy,
)
return query, nil
case
v3.AggregateOperatorRateSum,
v3.AggregateOperatorRateMax,
v3.AggregateOperatorRateAvg,
v3.AggregateOperatorRateMin:
op := fmt.Sprintf("%s(value)/%d", aggregateOperatorToSQLFunc[mq.AggregateOperator], step)
query := fmt.Sprintf(
queryTmpl, groupTags, op, filterSubQuery, groupBy, orderBy,
)
return query, nil
case
v3.AggregateOperatorP05,
v3.AggregateOperatorP10,
v3.AggregateOperatorP20,
v3.AggregateOperatorP25,
v3.AggregateOperatorP50,
v3.AggregateOperatorP75,
v3.AggregateOperatorP90,
v3.AggregateOperatorP95,
v3.AggregateOperatorP99:
op := fmt.Sprintf("quantile(%v)(value)", aggregateOperatorToPercentile[mq.AggregateOperator])
query := fmt.Sprintf(queryTmpl, groupTags, op, filterSubQuery, groupBy, orderBy)
return query, nil
case v3.AggregateOperatorHistQuant50, v3.AggregateOperatorHistQuant75, v3.AggregateOperatorHistQuant90, v3.AggregateOperatorHistQuant95, v3.AggregateOperatorHistQuant99:
op := fmt.Sprintf("sum(value)/%d", step)
query := fmt.Sprintf(
queryTmpl, groupTags, op, filterSubQuery, groupBy, orderBy,
) // labels will be the same, so any should be fine
value := aggregateOperatorToPercentile[mq.AggregateOperator]
query = fmt.Sprintf(`SELECT %s ts, histogramQuantile(arrayMap(x -> toFloat64(x), groupArray(le)), groupArray(value), %.3f) as value FROM (%s) GROUP BY %s ORDER BY %s ts`, groupTagsWithoutLeTable, value, query, groupByWithoutLeTable, orderWithoutLeTable)
return query, nil
case v3.AggregateOperatorAvg, v3.AggregateOperatorSum, v3.AggregateOperatorMin, v3.AggregateOperatorMax:
op := fmt.Sprintf("%s(value)", aggregateOperatorToSQLFunc[mq.AggregateOperator])
query := fmt.Sprintf(queryTmpl, groupTags, op, filterSubQuery, groupBy, orderBy)
return query, nil
case v3.AggregateOperatorCount:
op := "toFloat64(count(*))"
query := fmt.Sprintf(queryTmpl, groupTags, op, filterSubQuery, groupBy, orderBy)
return query, nil
case v3.AggregateOperatorCountDistinct:
op := "toFloat64(count(distinct(value)))"
query := fmt.Sprintf(queryTmpl, groupTags, op, filterSubQuery, groupBy, orderBy)
return query, nil
case v3.AggregateOperatorNoOp:
return "", fmt.Errorf("noop is not supported for table view")
default:
return "", fmt.Errorf("unsupported aggregate operator")
}
}
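The delta table builder rounds the whole window up to a multiple of 60 seconds and divides by that, which is where the sum(value)/1800 in the test cases that follow comes from:
package main

import (
	"fmt"
	"math"
)

func main() {
	// Same time range as the delta table tests below, in milliseconds.
	var start, end int64 = 1689255866000, 1689257640000
	// Whole window rounded up to the nearest multiple of 60 seconds.
	step := int64(math.Ceil(float64(end-start+1)/1000/60) * 60)
	fmt.Println(step) // 1800, matching sum(value)/1800 in the expected SQL
}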


@@ -0,0 +1,99 @@
package v3
import (
"testing"
v3 "go.signoz.io/signoz/pkg/query-service/model/v3"
)
func TestPanelTableForDelta(t *testing.T) {
cases := []struct {
name string
query *v3.BuilderQuery
expected string
}{
{
name: "request rate",
query: &v3.BuilderQuery{
QueryName: "A",
DataSource: v3.DataSourceMetrics,
AggregateOperator: v3.AggregateOperatorSumRate,
AggregateAttribute: v3.AttributeKey{
Key: "signoz_latency_count",
},
Temporality: v3.Delta,
Filters: &v3.FilterSet{
Items: []v3.FilterItem{
{
Key: v3.AttributeKey{Key: "service_name"},
Operator: v3.FilterOperatorIn,
Value: []interface{}{"frontend"},
},
{
Key: v3.AttributeKey{Key: "operation"},
Operator: v3.FilterOperatorIn,
Value: []interface{}{"HTTP GET /dispatch"},
},
},
},
Expression: "A",
},
expected: "SELECT toStartOfHour(now()) as ts, sum(value)/1800 as value FROM signoz_metrics.distributed_samples_v2 GLOBAL INNER JOIN (SELECT fingerprint FROM signoz_metrics.distributed_time_series_v2 WHERE metric_name = 'signoz_latency_count' AND temporality = 'Delta' AND JSONExtractString(labels, 'service_name') IN ['frontend'] AND JSONExtractString(labels, 'operation') IN ['HTTP GET /dispatch']) as filtered_time_series USING fingerprint WHERE metric_name = 'signoz_latency_count' AND timestamp_ms >= 1689255866000 AND timestamp_ms <= 1689257640000 GROUP BY ts ORDER BY ts",
},
{
name: "latency p50",
query: &v3.BuilderQuery{
QueryName: "A",
DataSource: v3.DataSourceMetrics,
AggregateOperator: v3.AggregateOperatorHistQuant50,
AggregateAttribute: v3.AttributeKey{
Key: "signoz_latency_bucket",
},
Temporality: v3.Delta,
Filters: &v3.FilterSet{
Items: []v3.FilterItem{
{
Key: v3.AttributeKey{Key: "service_name"},
Operator: v3.FilterOperatorEqual,
Value: "frontend",
},
},
},
Expression: "A",
},
expected: "SELECT ts, histogramQuantile(arrayMap(x -> toFloat64(x), groupArray(le)), groupArray(value), 0.500) as value FROM (SELECT le, toStartOfHour(now()) as ts, sum(value)/1800 as value FROM signoz_metrics.distributed_samples_v2 GLOBAL INNER JOIN (SELECT JSONExtractString(labels, 'le') as le, fingerprint FROM signoz_metrics.distributed_time_series_v2 WHERE metric_name = 'signoz_latency_bucket' AND temporality = 'Delta' AND JSONExtractString(labels, 'service_name') = 'frontend') as filtered_time_series USING fingerprint WHERE metric_name = 'signoz_latency_bucket' AND timestamp_ms >= 1689255866000 AND timestamp_ms <= 1689257640000 GROUP BY le,ts ORDER BY le ASC, ts) GROUP BY ts ORDER BY ts",
},
{
name: "latency p99 with group by",
query: &v3.BuilderQuery{
QueryName: "A",
DataSource: v3.DataSourceMetrics,
AggregateOperator: v3.AggregateOperatorHistQuant99,
AggregateAttribute: v3.AttributeKey{
Key: "signoz_latency_bucket",
},
Temporality: v3.Delta,
GroupBy: []v3.AttributeKey{
{
Key: "service_name",
},
},
Expression: "A",
},
expected: "SELECT service_name, ts, histogramQuantile(arrayMap(x -> toFloat64(x), groupArray(le)), groupArray(value), 0.990) as value FROM (SELECT service_name,le, toStartOfHour(now()) as ts, sum(value)/1800 as value FROM signoz_metrics.distributed_samples_v2 GLOBAL INNER JOIN (SELECT JSONExtractString(labels, 'service_name') as service_name, JSONExtractString(labels, 'le') as le, fingerprint FROM signoz_metrics.distributed_time_series_v2 WHERE metric_name = 'signoz_latency_bucket' AND temporality = 'Delta' ) as filtered_time_series USING fingerprint WHERE metric_name = 'signoz_latency_bucket' AND timestamp_ms >= 1689255866000 AND timestamp_ms <= 1689257640000 GROUP BY service_name,le,ts ORDER BY service_name ASC,le ASC, ts) GROUP BY service_name,ts ORDER BY service_name ASC, ts",
},
}
for _, c := range cases {
t.Run(c.name, func(t *testing.T) {
query, err := buildDeltaMetricQueryForTable(1689255866000, 1689257640000, 1800, c.query, "distributed_time_series_v2")
if err != nil {
t.Fatalf("unexpected error: %v", err)
}
if query != c.expected {
t.Fatalf("expected: %s, got: %s", c.expected, query)
}
})
}
}


@@ -403,15 +403,15 @@ func reduceQuery(query string, reduceTo v3.ReduceToOperator, aggregateOperator v
// chart with just the query value. For the quer
switch reduceTo {
case v3.ReduceToOperatorLast:
query = fmt.Sprintf("SELECT anyLast(value) as value, any(ts) as ts %s FROM (%s) %s", selectLabels, query, groupBy)
query = fmt.Sprintf("SELECT anyLastIf(value, toUnixTimestamp(ts) != 0) as value, anyIf(ts, toUnixTimestamp(ts) != 0) AS timestamp %s FROM (%s) %s", selectLabels, query, groupBy)
case v3.ReduceToOperatorSum:
query = fmt.Sprintf("SELECT sum(value) as value, any(ts) as ts %s FROM (%s) %s", selectLabels, query, groupBy)
query = fmt.Sprintf("SELECT sumIf(value, toUnixTimestamp(ts) != 0) as value, anyIf(ts, toUnixTimestamp(ts) != 0) AS timestamp %s FROM (%s) %s", selectLabels, query, groupBy)
case v3.ReduceToOperatorAvg:
query = fmt.Sprintf("SELECT avg(value) as value, any(ts) as ts %s FROM (%s) %s", selectLabels, query, groupBy)
query = fmt.Sprintf("SELECT avgIf(value, toUnixTimestamp(ts) != 0) as value, anyIf(ts, toUnixTimestamp(ts) != 0) AS timestamp %s FROM (%s) %s", selectLabels, query, groupBy)
case v3.ReduceToOperatorMax:
query = fmt.Sprintf("SELECT max(value) as value, any(ts) as ts %s FROM (%s) %s", selectLabels, query, groupBy)
query = fmt.Sprintf("SELECT maxIf(value, toUnixTimestamp(ts) != 0) as value, anyIf(ts, toUnixTimestamp(ts) != 0) AS timestamp %s FROM (%s) %s", selectLabels, query, groupBy)
case v3.ReduceToOperatorMin:
query = fmt.Sprintf("SELECT min(value) as value, any(ts) as ts %s FROM (%s) %s", selectLabels, query, groupBy)
query = fmt.Sprintf("SELECT minIf(value, toUnixTimestamp(ts) != 0) as value, anyIf(ts, toUnixTimestamp(ts) != 0) AS timestamp %s FROM (%s) %s", selectLabels, query, groupBy)
default:
return "", fmt.Errorf("unsupported reduce operator")
}
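The reduce operators now use the *If combinators, presumably so that rows carrying a zero timestamp placeholder do not skew the single reduced value. A small sketch of how one branch assembles its query (the inner SQL is illustrative, not the exact generated query):
package main

import "fmt"

func main() {
	// Inner time-series query produced earlier in the pipeline (illustrative).
	inner := "SELECT ts, toFloat64(count(*)) as value FROM logs GROUP BY ts"
	selectLabels, groupBy := "", "" // empty when there is no group by
	// ReduceToOperatorSum branch: sum the values, skipping zero-filled timestamps.
	query := fmt.Sprintf(
		"SELECT sumIf(value, toUnixTimestamp(ts) != 0) as value, anyIf(ts, toUnixTimestamp(ts) != 0) AS timestamp %s FROM (%s) %s",
		selectLabels, inner, groupBy)
	fmt.Println(query)
}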
@@ -422,9 +422,17 @@ func PrepareMetricQuery(start, end int64, queryType v3.QueryType, panelType v3.P
var query string
var err error
if mq.Temporality == v3.Delta {
query, err = buildDeltaMetricQuery(start, end, mq.StepInterval, mq, constants.SIGNOZ_TIMESERIES_TABLENAME)
if panelType == v3.PanelTypeTable {
query, err = buildDeltaMetricQueryForTable(start, end, mq.StepInterval, mq, constants.SIGNOZ_TIMESERIES_TABLENAME)
} else {
query, err = buildDeltaMetricQuery(start, end, mq.StepInterval, mq, constants.SIGNOZ_TIMESERIES_TABLENAME)
}
} else {
query, err = buildMetricQuery(start, end, mq.StepInterval, mq, constants.SIGNOZ_TIMESERIES_TABLENAME)
if panelType == v3.PanelTypeTable {
query, err = buildMetricQueryForTable(start, end, mq.StepInterval, mq, constants.SIGNOZ_TIMESERIES_TABLENAME)
} else {
query, err = buildMetricQuery(start, end, mq.StepInterval, mq, constants.SIGNOZ_TIMESERIES_TABLENAME)
}
}
if err != nil {
return "", err


@@ -235,7 +235,7 @@ func (q *querier) runBuilderQueries(ctx context.Context, params *v3.QueryRangePa
// TODO: add support for logs and traces
if builderQuery.DataSource == v3.DataSourceLogs {
query, err := logsV3.PrepareLogsQuery(params.Start, params.End, params.CompositeQuery.QueryType, params.CompositeQuery.PanelType, builderQuery)
query, err := logsV3.PrepareLogsQuery(params.Start, params.End, params.CompositeQuery.QueryType, params.CompositeQuery.PanelType, builderQuery, "")
if err != nil {
errQueriesByName[queryName] = err.Error()
continue
@@ -250,7 +250,7 @@ func (q *querier) runBuilderQueries(ctx context.Context, params *v3.QueryRangePa
}
if builderQuery.DataSource == v3.DataSourceTraces {
query, err := tracesV3.PrepareTracesQuery(params.Start, params.End, params.CompositeQuery.QueryType, params.CompositeQuery.PanelType, builderQuery, keys)
query, err := tracesV3.PrepareTracesQuery(params.Start, params.End, params.CompositeQuery.PanelType, builderQuery, keys, "")
if err != nil {
errQueriesByName[queryName] = err.Error()
continue


@@ -6,6 +6,7 @@ import (
"github.com/SigNoz/govaluate"
"go.signoz.io/signoz/pkg/query-service/cache"
"go.signoz.io/signoz/pkg/query-service/constants"
v3 "go.signoz.io/signoz/pkg/query-service/model/v3"
"go.uber.org/zap"
)
@@ -38,8 +39,8 @@ var SupportedFunctions = []string{
var EvalFuncs = map[string]govaluate.ExpressionFunction{}
type prepareTracesQueryFunc func(start, end int64, queryType v3.QueryType, panelType v3.PanelType, bq *v3.BuilderQuery, keys map[string]v3.AttributeKey) (string, error)
type prepareLogsQueryFunc func(start, end int64, queryType v3.QueryType, panelType v3.PanelType, bq *v3.BuilderQuery) (string, error)
type prepareTracesQueryFunc func(start, end int64, panelType v3.PanelType, bq *v3.BuilderQuery, keys map[string]v3.AttributeKey, graphLimitQtype string) (string, error)
type prepareLogsQueryFunc func(start, end int64, queryType v3.QueryType, panelType v3.PanelType, bq *v3.BuilderQuery, graphLimitQtype string) (string, error)
type prepareMetricQueryFunc func(start, end int64, queryType v3.QueryType, panelType v3.PanelType, bq *v3.BuilderQuery) (string, error)
type QueryBuilder struct {
@@ -146,17 +147,45 @@ func (qb *QueryBuilder) PrepareQueries(params *v3.QueryRangeParamsV3, args ...in
if len(args) > 0 {
keys = args[0].(map[string]v3.AttributeKey)
}
queryString, err := qb.options.BuildTraceQuery(params.Start, params.End, compositeQuery.QueryType, compositeQuery.PanelType, query, keys)
if err != nil {
return nil, err
// for ts query with group by and limit form two queries
if compositeQuery.PanelType == v3.PanelTypeGraph && query.Limit > 0 && len(query.GroupBy) > 0 {
limitQuery, err := qb.options.BuildTraceQuery(params.Start, params.End, compositeQuery.PanelType, query, keys, constants.FirstQueryGraphLimit)
if err != nil {
return nil, err
}
placeholderQuery, err := qb.options.BuildTraceQuery(params.Start, params.End, compositeQuery.PanelType, query, keys, constants.SecondQueryGraphLimit)
if err != nil {
return nil, err
}
query := fmt.Sprintf(placeholderQuery, limitQuery)
queries[queryName] = query
} else {
queryString, err := qb.options.BuildTraceQuery(params.Start, params.End, compositeQuery.PanelType, query, keys, "")
if err != nil {
return nil, err
}
queries[queryName] = queryString
}
queries[queryName] = queryString
case v3.DataSourceLogs:
queryString, err := qb.options.BuildLogQuery(params.Start, params.End, compositeQuery.QueryType, compositeQuery.PanelType, query)
if err != nil {
return nil, err
// for ts query with limit replace it as it is already formed
if compositeQuery.PanelType == v3.PanelTypeGraph && query.Limit > 0 && len(query.GroupBy) > 0 {
limitQuery, err := qb.options.BuildLogQuery(params.Start, params.End, compositeQuery.QueryType, compositeQuery.PanelType, query, constants.FirstQueryGraphLimit)
if err != nil {
return nil, err
}
placeholderQuery, err := qb.options.BuildLogQuery(params.Start, params.End, compositeQuery.QueryType, compositeQuery.PanelType, query, constants.SecondQueryGraphLimit)
if err != nil {
return nil, err
}
query := fmt.Sprintf(placeholderQuery, limitQuery)
queries[queryName] = query
} else {
queryString, err := qb.options.BuildLogQuery(params.Start, params.End, compositeQuery.QueryType, compositeQuery.PanelType, query, "")
if err != nil {
return nil, err
}
queries[queryName] = queryString
}
queries[queryName] = queryString
case v3.DataSourceMetrics:
queryString, err := qb.options.BuildMetricQuery(params.Start, params.End, compositeQuery.QueryType, compositeQuery.PanelType, query)
if err != nil {

View File
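Note on the hunk above: for graph panels that have both a group by and a limit, PrepareQueries now asks the trace (and log) builder for two statements: a first query that returns only the top group-by values, capped by the limit, and a placeholder query whose GLOBAL IN (...) clause carries a %s. The final statement is formed by splicing the first query into the second with fmt.Sprintf. A minimal, runnable sketch of that splicing; the SQL literals are heavily abbreviated stand-ins, not the exact strings the builder emits:

package main

import "fmt"

func main() {
	// constants.FirstQueryGraphLimit result: top-N group-by values only.
	limitQuery := "SELECT `method` from (...) LIMIT 10"
	// constants.SecondQueryGraphLimit result: the per-interval aggregation,
	// restricted to those values through a GLOBAL IN placeholder.
	placeholderQuery := "SELECT ... where ... AND (`method`) GLOBAL IN (%s) group by `method`,ts ..."
	// PrepareQueries stores the spliced result under the query name.
	final := fmt.Sprintf(placeholderQuery, limitQuery)
	fmt.Println(final)
}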

@@ -46,9 +46,10 @@ type ServerOptions struct {
HTTPHostPort string
PrivateHostPort string
// alert specific params
DisableRules bool
RuleRepoURL string
PreferDelta bool
DisableRules bool
RuleRepoURL string
PreferDelta bool
PreferSpanMetrics bool
}
// Server runs HTTP, Mux and a grpc server
@@ -124,12 +125,13 @@ func NewServer(serverOptions *ServerOptions) (*Server, error) {
telemetry.GetInstance().SetReader(reader)
apiHandler, err := NewAPIHandler(APIHandlerOpts{
Reader: reader,
SkipConfig: skipConfig,
PerferDelta: serverOptions.PreferDelta,
AppDao: dao.DB(),
RuleManager: rm,
FeatureFlags: fm,
Reader: reader,
SkipConfig: skipConfig,
PerferDelta: serverOptions.PreferDelta,
PreferSpanMetrics: serverOptions.PreferSpanMetrics,
AppDao: dao.DB(),
RuleManager: rm,
FeatureFlags: fm,
})
if err != nil {
return nil, err

View File

@@ -108,6 +108,18 @@ func getSelectLabels(aggregatorOperator v3.AggregateOperator, groupBy []v3.Attri
return selectLabels
}
func getSelectKeys(aggregatorOperator v3.AggregateOperator, groupBy []v3.AttributeKey) string {
var selectLabels []string
if aggregatorOperator == v3.AggregateOperatorNoOp {
return ""
} else {
for _, tag := range groupBy {
selectLabels = append(selectLabels, fmt.Sprintf("`%s`", tag.Key))
}
}
return strings.Join(selectLabels, ",")
}
func getSelectColumns(sc []v3.AttributeKey, keys map[string]v3.AttributeKey) string {
var columns []string
for _, tag := range sc {
@@ -219,7 +231,7 @@ func handleEmptyValuesInGroupBy(keys map[string]v3.AttributeKey, groupBy []v3.At
return "", nil
}
func buildTracesQuery(start, end, step int64, mq *v3.BuilderQuery, tableName string, keys map[string]v3.AttributeKey, panelType v3.PanelType) (string, error) {
func buildTracesQuery(start, end, step int64, mq *v3.BuilderQuery, tableName string, keys map[string]v3.AttributeKey, panelType v3.PanelType, graphLimitQtype string) (string, error) {
filterSubQuery, err := buildTracesFilterQuery(mq.Filters, keys)
if err != nil {
@@ -236,24 +248,27 @@ func buildTracesQuery(start, end, step int64, mq *v3.BuilderQuery, tableName str
}
var queryTmpl string
if panelType == v3.PanelTypeTable {
if graphLimitQtype == constants.FirstQueryGraphLimit {
queryTmpl = "SELECT"
} else if panelType == v3.PanelTypeTable {
queryTmpl =
"SELECT now() as ts," + selectLabels +
" %s as value " +
"from " + constants.SIGNOZ_TRACE_DBNAME + "." + constants.SIGNOZ_SPAN_INDEX_TABLENAME +
" where " + spanIndexTableTimeFilter + "%s" +
"%s%s" +
"%s"
"SELECT now() as ts,"
} else if panelType == v3.PanelTypeGraph || panelType == v3.PanelTypeValue {
// Select the aggregate value for interval
queryTmpl =
fmt.Sprintf("SELECT toStartOfInterval(timestamp, INTERVAL %d SECOND) AS ts,", step) + selectLabels +
" %s as value " +
"from " + constants.SIGNOZ_TRACE_DBNAME + "." + constants.SIGNOZ_SPAN_INDEX_TABLENAME +
" where " + spanIndexTableTimeFilter + "%s" +
"%s%s" +
"%s"
fmt.Sprintf("SELECT toStartOfInterval(timestamp, INTERVAL %d SECOND) AS ts,", step)
}
queryTmpl = queryTmpl + selectLabels +
" %s as value " +
"from " + constants.SIGNOZ_TRACE_DBNAME + "." + constants.SIGNOZ_SPAN_INDEX_TABLENAME +
" where " + spanIndexTableTimeFilter + "%s" +
"%s%s" +
"%s"
// we don't need value for first query
if graphLimitQtype == constants.FirstQueryGraphLimit {
queryTmpl = "SELECT " + getSelectKeys(mq.AggregateOperator, mq.GroupBy) + " from (" + queryTmpl + ")"
}
emptyValuesInGroupByFilter, err := handleEmptyValuesInGroupBy(keys, mq.GroupBy)
@@ -262,7 +277,7 @@ func buildTracesQuery(start, end, step int64, mq *v3.BuilderQuery, tableName str
}
filterSubQuery += emptyValuesInGroupByFilter
groupBy := groupByAttributeKeyTags(panelType, mq.GroupBy...)
groupBy := groupByAttributeKeyTags(panelType, graphLimitQtype, mq.GroupBy...)
if groupBy != "" {
groupBy = " group by " + groupBy
}
@@ -271,6 +286,11 @@ func buildTracesQuery(start, end, step int64, mq *v3.BuilderQuery, tableName str
if orderBy != "" {
orderBy = " order by " + orderBy
}
if graphLimitQtype == constants.SecondQueryGraphLimit {
filterSubQuery = filterSubQuery + " AND " + fmt.Sprintf("(%s) GLOBAL IN (", getSelectKeys(mq.AggregateOperator, mq.GroupBy)) + "%s)"
}
aggregationKey := ""
if mq.AggregateAttribute.Key != "" {
aggregationKey = getColumnName(mq.AggregateAttribute, keys)
@@ -326,7 +346,7 @@ func buildTracesQuery(start, end, step int64, mq *v3.BuilderQuery, tableName str
var query string
if panelType == v3.PanelTypeTrace {
withSubQuery := fmt.Sprintf(constants.TracesExplorerViewSQLSelectWithSubQuery, constants.SIGNOZ_TRACE_DBNAME, constants.SIGNOZ_SPAN_INDEX_TABLENAME, spanIndexTableTimeFilter, filterSubQuery)
withSubQuery = addLimitToQuery(withSubQuery, mq.Limit, panelType)
withSubQuery = addLimitToQuery(withSubQuery, mq.Limit)
if mq.Offset != 0 {
withSubQuery = addOffsetToQuery(withSubQuery, mq.Offset)
}
@@ -367,84 +387,60 @@ func enrichOrderBy(items []v3.OrderBy, keys map[string]v3.AttributeKey) []v3.Ord
// groupBy returns a string of comma separated tags for group by clause
// `ts` is always added to the group by clause
func groupBy(panelType v3.PanelType, tags ...string) string {
if panelType == v3.PanelTypeGraph || panelType == v3.PanelTypeValue {
func groupBy(panelType v3.PanelType, graphLimitQtype string, tags ...string) string {
if (graphLimitQtype != constants.FirstQueryGraphLimit) && (panelType == v3.PanelTypeGraph || panelType == v3.PanelTypeValue) {
tags = append(tags, "ts")
}
return strings.Join(tags, ",")
}
func groupByAttributeKeyTags(panelType v3.PanelType, tags ...v3.AttributeKey) string {
func groupByAttributeKeyTags(panelType v3.PanelType, graphLimitQtype string, tags ...v3.AttributeKey) string {
groupTags := []string{}
for _, tag := range tags {
groupTags = append(groupTags, fmt.Sprintf("`%s`", tag.Key))
}
return groupBy(panelType, groupTags...)
return groupBy(panelType, graphLimitQtype, groupTags...)
}
// orderBy returns a string of comma separated tags for order by clause
// if there are remaining items which are not present in tags they are also added
// if the order is not specified, it defaults to ASC
func orderBy(panelType v3.PanelType, items []v3.OrderBy, tags []string, keys map[string]v3.AttributeKey) []string {
func orderBy(panelType v3.PanelType, items []v3.OrderBy, tagLookup map[string]struct{}, keys map[string]v3.AttributeKey) []string {
var orderBy []string
// create a lookup
addedToOrderBy := map[string]bool{}
itemsLookup := map[string]v3.OrderBy{}
for i := 0; i < len(items); i++ {
addedToOrderBy[items[i].ColumnName] = false
itemsLookup[items[i].ColumnName] = items[i]
}
for _, tag := range tags {
if item, ok := itemsLookup[tag]; ok {
orderBy = append(orderBy, fmt.Sprintf("`%s` %s", item.ColumnName, item.Order))
addedToOrderBy[item.ColumnName] = true
} else {
orderBy = append(orderBy, fmt.Sprintf("`%s` ASC", tag))
}
}
// users might want to order by value of aggregation
for _, item := range items {
if item.ColumnName == constants.SigNozOrderByValue {
orderBy = append(orderBy, fmt.Sprintf("value %s", item.Order))
addedToOrderBy[item.ColumnName] = true
}
}
// add the remaining items
if panelType == v3.PanelTypeList {
for _, item := range items {
// since these are not present in tags we will have to select them correctly
// for list view there is no need to check if it was added since they wont be added yet but this is just for safety
if !addedToOrderBy[item.ColumnName] {
attr := v3.AttributeKey{Key: item.ColumnName, DataType: item.DataType, Type: item.Type, IsColumn: item.IsColumn}
name := getColumnName(attr, keys)
if item.IsColumn {
orderBy = append(orderBy, fmt.Sprintf("`%s` %s", name, item.Order))
} else {
orderBy = append(orderBy, fmt.Sprintf("%s %s", name, item.Order))
}
} else if _, ok := tagLookup[item.ColumnName]; ok {
orderBy = append(orderBy, fmt.Sprintf("`%s` %s", item.ColumnName, item.Order))
} else if panelType == v3.PanelTypeList {
attr := v3.AttributeKey{Key: item.ColumnName, DataType: item.DataType, Type: item.Type, IsColumn: item.IsColumn}
name := getColumnName(attr, keys)
if item.IsColumn {
orderBy = append(orderBy, fmt.Sprintf("`%s` %s", name, item.Order))
} else {
orderBy = append(orderBy, fmt.Sprintf("%s %s", name, item.Order))
}
}
}
return orderBy
}
func orderByAttributeKeyTags(panelType v3.PanelType, items []v3.OrderBy, tags []v3.AttributeKey, keys map[string]v3.AttributeKey) string {
var groupTags []string
for _, tag := range tags {
groupTags = append(groupTags, tag.Key)
tagLookup := map[string]struct{}{}
for _, v := range tags {
tagLookup[v.Key] = struct{}{}
}
orderByArray := orderBy(panelType, items, groupTags, keys)
if panelType == v3.PanelTypeList && len(orderByArray) == 0 {
orderByArray = append(orderByArray, constants.TIMESTAMP+" DESC")
} else if panelType == v3.PanelTypeGraph || panelType == v3.PanelTypeValue {
orderByArray = append(orderByArray, "ts")
orderByArray := orderBy(panelType, items, tagLookup, keys)
if len(orderByArray) == 0 {
if panelType == v3.PanelTypeList {
orderByArray = append(orderByArray, constants.TIMESTAMP+" DESC")
} else if panelType == v3.PanelTypeGraph {
orderByArray = append(orderByArray, "value DESC")
}
}
str := strings.Join(orderByArray, ",")
@@ -480,7 +476,7 @@ func reduceToQuery(query string, reduceTo v3.ReduceToOperator, aggregateOperator
return query, nil
}
func addLimitToQuery(query string, limit uint64, panelType v3.PanelType) string {
func addLimitToQuery(query string, limit uint64) string {
if limit == 0 {
limit = 100
}
@@ -491,16 +487,33 @@ func addOffsetToQuery(query string, offset uint64) string {
return fmt.Sprintf("%s OFFSET %d", query, offset)
}
func PrepareTracesQuery(start, end int64, queryType v3.QueryType, panelType v3.PanelType, mq *v3.BuilderQuery, keys map[string]v3.AttributeKey) (string, error) {
query, err := buildTracesQuery(start, end, mq.StepInterval, mq, constants.SIGNOZ_SPAN_INDEX_TABLENAME, keys, panelType)
func PrepareTracesQuery(start, end int64, panelType v3.PanelType, mq *v3.BuilderQuery, keys map[string]v3.AttributeKey, graphLimitQtype string) (string, error) {
if graphLimitQtype == constants.FirstQueryGraphLimit {
// give me just the group by names
query, err := buildTracesQuery(start, end, mq.StepInterval, mq, constants.SIGNOZ_SPAN_INDEX_TABLENAME, keys, panelType, graphLimitQtype)
if err != nil {
return "", err
}
query = addLimitToQuery(query, mq.Limit)
return query, nil
} else if graphLimitQtype == constants.SecondQueryGraphLimit {
query, err := buildTracesQuery(start, end, mq.StepInterval, mq, constants.SIGNOZ_SPAN_INDEX_TABLENAME, keys, panelType, graphLimitQtype)
if err != nil {
return "", err
}
return query, nil
}
query, err := buildTracesQuery(start, end, mq.StepInterval, mq, constants.SIGNOZ_SPAN_INDEX_TABLENAME, keys, panelType, graphLimitQtype)
if err != nil {
return "", err
}
if panelType == v3.PanelTypeValue {
query, err = reduceToQuery(query, mq.ReduceTo, mq.AggregateOperator)
}
if panelType == v3.PanelTypeList {
query = addLimitToQuery(query, mq.Limit, panelType)
if panelType == v3.PanelTypeList || panelType == v3.PanelTypeTable {
query = addLimitToQuery(query, mq.Limit)
if mq.Offset != 0 {
query = addOffsetToQuery(query, mq.Offset)

View File
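Note on the file above: two behaviour changes matter for reading the tests that follow. First, when no explicit order by is given, graph panels now default to order by value DESC instead of appending ts, while list panels keep the timestamp DESC default. Second, PrepareTracesQuery returns one of two shapes when graphLimitQtype is set. Both shapes below are abbreviated from the test expectations further down; <time filter> stands for the usual timestamp range predicate:

// constants.FirstQueryGraphLimit: only the top group-by values, capped by addLimitToQuery.
firstQuery := "SELECT `method` from (" +
	"SELECT stringTagMap['method'] as `method`, toFloat64(count(distinct(stringTagMap['name']))) as value " +
	"from signoz_traces.distributed_signoz_index_v2 where <time filter> AND stringTagMap['method'] = 'GET' " +
	"AND has(stringTagMap, 'method') group by `method` order by value DESC) LIMIT 10"

// constants.SecondQueryGraphLimit: the per-interval aggregation, restricted by a GLOBAL IN
// placeholder that the query builder later fills with firstQuery.
secondQuery := "SELECT toStartOfInterval(timestamp, INTERVAL 60 SECOND) AS ts, " +
	"stringTagMap['method'] as `method`, toFloat64(count(distinct(stringTagMap['name']))) as value " +
	"from signoz_traces.distributed_signoz_index_v2 where <time filter> AND stringTagMap['method'] = 'GET' " +
	"AND has(stringTagMap, 'method') AND (`method`) GLOBAL IN (%s) group by `method`,ts order by value DESC"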

@@ -323,8 +323,8 @@ var testOrderBy = []struct {
Name string
PanelType v3.PanelType
Items []v3.OrderBy
Tags []string
Result []string
Tags []v3.AttributeKey
Result string
}{
{
Name: "Test 1",
@@ -339,8 +339,10 @@ var testOrderBy = []struct {
Order: "desc",
},
},
Tags: []string{"name"},
Result: []string{"`name` asc", "value desc"},
Tags: []v3.AttributeKey{
{Key: "name"},
},
Result: "`name` asc,value desc",
},
{
Name: "Test 2",
@@ -355,8 +357,11 @@ var testOrderBy = []struct {
Order: "asc",
},
},
Tags: []string{"name", "bytes"},
Result: []string{"`name` asc", "`bytes` asc"},
Tags: []v3.AttributeKey{
{Key: "name"},
{Key: "bytes"},
},
Result: "`name` asc,`bytes` asc",
},
{
Name: "Test 3",
@@ -375,8 +380,11 @@ var testOrderBy = []struct {
Order: "asc",
},
},
Tags: []string{"name", "bytes"},
Result: []string{"`name` asc", "`bytes` asc", "value asc"},
Tags: []v3.AttributeKey{
{Key: "name"},
{Key: "bytes"},
},
Result: "`name` asc,value asc,`bytes` asc",
},
{
Name: "Test 4",
@@ -398,8 +406,11 @@ var testOrderBy = []struct {
DataType: v3.AttributeKeyDataTypeString,
},
},
Tags: []string{"name", "bytes"},
Result: []string{"`name` asc", "`bytes` asc", "stringTagMap['response_time'] desc"},
Tags: []v3.AttributeKey{
{Key: "name"},
{Key: "bytes"},
},
Result: "`name` asc,`bytes` asc,stringTagMap['response_time'] desc",
},
{
Name: "Test 5",
@@ -426,15 +437,15 @@ var testOrderBy = []struct {
Order: "desc",
},
},
Tags: []string{},
Result: []string{"`name` asc", "`bytes` asc", "stringTagMap['response_time'] desc"},
Tags: []v3.AttributeKey{},
Result: "`name` asc,`bytes` asc,stringTagMap['response_time'] desc",
},
}
func TestOrderBy(t *testing.T) {
for _, tt := range testOrderBy {
Convey("testOrderBy", t, func() {
res := orderBy(tt.PanelType, tt.Items, tt.Tags, map[string]v3.AttributeKey{
res := orderByAttributeKeyTags(tt.PanelType, tt.Items, tt.Tags, map[string]v3.AttributeKey{
"name": {Key: "name", DataType: v3.AttributeKeyDataTypeString, Type: v3.AttributeKeyTypeTag, IsColumn: true},
"bytes": {Key: "bytes", DataType: v3.AttributeKeyDataTypeString, Type: v3.AttributeKeyTypeTag, IsColumn: true},
"response_time": {Key: "response_time", DataType: v3.AttributeKeyDataTypeString, Type: v3.AttributeKeyTypeTag, IsColumn: false},
@@ -470,7 +481,7 @@ var testBuildTracesQueryData = []struct {
TableName: "signoz_traces.distributed_signoz_index_v2",
ExpectedQuery: "SELECT toStartOfInterval(timestamp, INTERVAL 60 SECOND) AS ts, toFloat64(count()) as value" +
" from signoz_traces.distributed_signoz_index_v2 where (timestamp >= '1680066360726210000' AND timestamp <= '1680066458000000000')" +
" group by ts order by ts",
" group by ts order by value DESC",
PanelType: v3.PanelTypeGraph,
},
{
@@ -486,7 +497,7 @@ var testBuildTracesQueryData = []struct {
TableName: "signoz_traces.distributed_signoz_index_v2",
ExpectedQuery: "SELECT toStartOfInterval(timestamp, INTERVAL 60 SECOND) AS ts, count()/60 as value from" +
" signoz_traces.distributed_signoz_index_v2 where (timestamp >= '1680066360726210000' AND timestamp <=" +
" '1680066458000000000') group by ts order by ts",
" '1680066458000000000') group by ts order by value DESC",
PanelType: v3.PanelTypeGraph,
},
{
@@ -505,7 +516,7 @@ var testBuildTracesQueryData = []struct {
ExpectedQuery: "SELECT toStartOfInterval(timestamp, INTERVAL 60 SECOND) AS ts," +
" toFloat64(count()) as value from signoz_traces.distributed_signoz_index_v2" +
" where (timestamp >= '1680066360726210000' AND timestamp <= '1680066458000000000')" +
" AND stringTagMap['customer_id'] = '10001' group by ts order by ts",
" AND stringTagMap['customer_id'] = '10001' group by ts order by value DESC",
PanelType: v3.PanelTypeGraph,
},
{
@@ -522,7 +533,7 @@ var testBuildTracesQueryData = []struct {
TableName: "signoz_traces.distributed_signoz_index_v2",
ExpectedQuery: "SELECT toStartOfInterval(timestamp, INTERVAL 60 SECOND) AS ts, toFloat64(count()) as value" +
" from signoz_traces.distributed_signoz_index_v2 where (timestamp >= '1680066360726210000' AND timestamp <= '1680066458000000000')" +
" group by ts order by ts",
" group by ts order by value DESC",
PanelType: v3.PanelTypeGraph,
},
{
@@ -539,7 +550,7 @@ var testBuildTracesQueryData = []struct {
TableName: "signoz_traces.distributed_signoz_index_v2",
ExpectedQuery: "SELECT toStartOfInterval(timestamp, INTERVAL 60 SECOND) AS ts, toFloat64(count()) as value" +
" from signoz_traces.distributed_signoz_index_v2 where (timestamp >= '1680066360726210000' AND timestamp <= '1680066458000000000')" +
" AND has(stringTagMap, 'user_name') group by ts order by ts",
" AND has(stringTagMap, 'user_name') group by ts order by value DESC",
PanelType: v3.PanelTypeGraph,
},
{
@@ -556,7 +567,7 @@ var testBuildTracesQueryData = []struct {
TableName: "signoz_traces.distributed_signoz_index_v2",
ExpectedQuery: "SELECT toStartOfInterval(timestamp, INTERVAL 60 SECOND) AS ts, toFloat64(count()) as value" +
" from signoz_traces.distributed_signoz_index_v2 where (timestamp >= '1680066360726210000' AND timestamp <= '1680066458000000000')" +
" AND name != '' group by ts order by ts",
" AND name != '' group by ts order by value DESC",
PanelType: v3.PanelTypeGraph,
},
{
@@ -576,7 +587,7 @@ var testBuildTracesQueryData = []struct {
TableName: "signoz_traces.distributed_signoz_index_v2",
ExpectedQuery: "SELECT toStartOfInterval(timestamp, INTERVAL 60 SECOND) AS ts, toFloat64(count()) as value" +
" from signoz_traces.distributed_signoz_index_v2 where (timestamp >= '1680066360726210000' AND timestamp <= '1680066458000000000')" +
" AND numberTagMap['bytes'] > 100.000000 AND has(stringTagMap, 'user_name') group by ts order by ts",
" AND numberTagMap['bytes'] > 100.000000 AND has(stringTagMap, 'user_name') group by ts order by value DESC",
PanelType: v3.PanelTypeGraph,
},
{
@@ -594,7 +605,7 @@ var testBuildTracesQueryData = []struct {
TableName: "signoz_traces.distributed_signoz_index_v2",
ExpectedQuery: "SELECT toStartOfInterval(timestamp, INTERVAL 60 SECOND) AS ts, toFloat64(count(distinct(name))) as value" +
" from signoz_traces.distributed_signoz_index_v2 where (timestamp >= '1680066360726210000' AND timestamp <= '1680066458000000000')" +
" group by ts order by value ASC,ts",
" group by ts order by value ASC",
PanelType: v3.PanelTypeGraph,
},
{
@@ -611,7 +622,7 @@ var testBuildTracesQueryData = []struct {
TableName: "signoz_traces.distributed_signoz_index_v2",
ExpectedQuery: "SELECT toStartOfInterval(timestamp, INTERVAL 60 SECOND) AS ts, toFloat64(count(distinct(stringTagMap['name'])))" +
" as value from signoz_traces.distributed_signoz_index_v2 where" +
" (timestamp >= '1680066360726210000' AND timestamp <= '1680066458000000000') group by ts order by ts",
" (timestamp >= '1680066360726210000' AND timestamp <= '1680066458000000000') group by ts order by value DESC",
PanelType: v3.PanelTypeGraph,
},
{
@@ -630,7 +641,7 @@ var testBuildTracesQueryData = []struct {
},
},
GroupBy: []v3.AttributeKey{{Key: "http.method", DataType: v3.AttributeKeyDataTypeString, Type: v3.AttributeKeyTypeTag}},
OrderBy: []v3.OrderBy{{ColumnName: "http.method", Order: "ASC"}, {ColumnName: "ts", Order: "ASC"}},
OrderBy: []v3.OrderBy{{ColumnName: "http.method", Order: "ASC"}},
},
TableName: "signoz_traces.distributed_signoz_index_v2",
ExpectedQuery: "SELECT toStartOfInterval(timestamp, INTERVAL 60 SECOND) AS ts," +
@@ -639,7 +650,7 @@ var testBuildTracesQueryData = []struct {
"where (timestamp >= '1680066360726210000' AND timestamp <= '1680066458000000000') " +
"AND stringTagMap['http.method'] = 'GET' AND resourceTagsMap['x'] != 'abc' " +
"AND has(stringTagMap, 'http.method') group by `http.method`,ts " +
"order by `http.method` ASC,ts",
"order by `http.method` ASC",
PanelType: v3.PanelTypeGraph,
},
{
@@ -671,7 +682,7 @@ var testBuildTracesQueryData = []struct {
"where (timestamp >= '1680066360726210000' AND timestamp <= '1680066458000000000') " +
"AND stringTagMap['method'] = 'GET' AND resourceTagsMap['x'] != 'abc' " +
"AND has(stringTagMap, 'method') AND has(resourceTagsMap, 'x') group by `method`,`x`,ts " +
"order by `method` ASC,`x` ASC,ts",
"order by `method` ASC,`x` ASC",
PanelType: v3.PanelTypeGraph,
},
{
@@ -699,7 +710,7 @@ var testBuildTracesQueryData = []struct {
"where (timestamp >= '1680066360726210000' AND timestamp <= '1680066458000000000') " +
"AND stringTagMap['method'] = 'GET' " +
"AND has(stringTagMap, 'method') group by `method`,ts " +
"order by `method` ASC,ts",
"order by `method` ASC",
PanelType: v3.PanelTypeGraph,
},
{
@@ -727,7 +738,7 @@ var testBuildTracesQueryData = []struct {
"where (timestamp >= '1680066360726210000' AND timestamp <= '1680066458000000000') " +
"AND stringTagMap['method'] = 'GET' " +
"AND has(stringTagMap, 'method') group by `method`,ts " +
"order by `method` ASC,ts",
"order by `method` ASC",
PanelType: v3.PanelTypeGraph,
},
{
@@ -755,7 +766,7 @@ var testBuildTracesQueryData = []struct {
"where (timestamp >= '1680066360726210000' AND timestamp <= '1680066458000000000') " +
"AND stringTagMap['method'] = 'GET' " +
"AND has(stringTagMap, 'method') group by `method`,ts " +
"order by `method` ASC,ts",
"order by `method` ASC",
PanelType: v3.PanelTypeGraph,
},
{
@@ -783,7 +794,7 @@ var testBuildTracesQueryData = []struct {
"where (timestamp >= '1680066360726210000' AND timestamp <= '1680066458000000000') " +
"AND stringTagMap['method'] = 'GET' " +
"AND has(stringTagMap, 'method') group by `method`,ts " +
"order by `method` ASC,ts",
"order by `method` ASC",
PanelType: v3.PanelTypeGraph,
},
{
@@ -807,7 +818,7 @@ var testBuildTracesQueryData = []struct {
"from signoz_traces.distributed_signoz_index_v2 " +
"where (timestamp >= '1680066360726210000' AND timestamp <= '1680066458000000000') " +
"AND has(stringTagMap, 'method') group by `method`,ts " +
"order by `method` ASC,ts",
"order by `method` ASC",
PanelType: v3.PanelTypeGraph,
},
{
@@ -828,7 +839,7 @@ var testBuildTracesQueryData = []struct {
ExpectedQuery: "SELECT toStartOfInterval(timestamp, INTERVAL 60 SECOND) AS ts, stringTagMap['method'] as `method`" +
", sum(bytes)/60 as value from signoz_traces.distributed_signoz_index_v2 " +
"where (timestamp >= '1680066360726210000' AND timestamp <= '1680066458000000000')" +
" AND has(stringTagMap, 'method') group by `method`,ts order by `method` ASC,ts",
" AND has(stringTagMap, 'method') group by `method`,ts order by `method` ASC",
PanelType: v3.PanelTypeGraph,
},
{
@@ -850,7 +861,7 @@ var testBuildTracesQueryData = []struct {
", count(numberTagMap['bytes'])/60 as value " +
"from signoz_traces.distributed_signoz_index_v2 where (timestamp >= '1680066360726210000' AND timestamp <= '1680066458000000000') " +
"AND has(stringTagMap, 'method') group by `method`,ts " +
"order by `method` ASC,ts",
"order by `method` ASC",
PanelType: v3.PanelTypeGraph,
},
{
@@ -873,7 +884,7 @@ var testBuildTracesQueryData = []struct {
"sum(numberTagMap['bytes'])/60 as value " +
"from signoz_traces.distributed_signoz_index_v2 where (timestamp >= '1680066360726210000' AND timestamp <= '1680066458000000000') " +
"AND has(stringTagMap, 'method') group by `method`,ts " +
"order by `method` ASC,ts",
"order by `method` ASC",
PanelType: v3.PanelTypeGraph,
},
{
@@ -897,7 +908,7 @@ var testBuildTracesQueryData = []struct {
TableName: "signoz_traces.distributed_signoz_index_v2",
ExpectedQuery: "SELECT toStartOfInterval(timestamp, INTERVAL 60 SECOND) AS ts, toFloat64(count(distinct(stringTagMap['name']))) as value" +
" from signoz_traces.distributed_signoz_index_v2 where (timestamp >= '1680066360726210000' AND timestamp <= '1680066458000000000')" +
" group by ts having value > 10 order by ts",
" group by ts having value > 10 order by value DESC",
PanelType: v3.PanelTypeGraph,
},
{
@@ -925,7 +936,7 @@ var testBuildTracesQueryData = []struct {
TableName: "signoz_traces.distributed_signoz_index_v2",
ExpectedQuery: "SELECT toStartOfInterval(timestamp, INTERVAL 60 SECOND) AS ts, toFloat64(count()) as value from " +
"signoz_traces.distributed_signoz_index_v2 where (timestamp >= '1680066360726210000' AND timestamp <= '1680066458000000000') " +
"AND stringTagMap['method'] = 'GET' AND has(stringTagMap, 'name') group by ts having value > 10 order by ts",
"AND stringTagMap['method'] = 'GET' AND has(stringTagMap, 'name') group by ts having value > 10 order by value DESC",
PanelType: v3.PanelTypeGraph,
},
{
@@ -953,7 +964,7 @@ var testBuildTracesQueryData = []struct {
TableName: "signoz_traces.distributed_signoz_index_v2",
ExpectedQuery: "SELECT toStartOfInterval(timestamp, INTERVAL 60 SECOND) AS ts, toFloat64(count(distinct(stringTagMap['name']))) as value" +
" from signoz_traces.distributed_signoz_index_v2 where (timestamp >= '1680066360726210000' AND timestamp <= '1680066458000000000') " +
"AND stringTagMap['method'] = 'GET' group by ts having value > 10 order by ts",
"AND stringTagMap['method'] = 'GET' group by ts having value > 10 order by value DESC",
PanelType: v3.PanelTypeGraph,
},
{
@@ -981,7 +992,7 @@ var testBuildTracesQueryData = []struct {
TableName: "signoz_traces.distributed_signoz_index_v2",
ExpectedQuery: "SELECT toStartOfInterval(timestamp, INTERVAL 60 SECOND) AS ts, toFloat64(count()) as value" +
" from signoz_traces.distributed_signoz_index_v2 where (timestamp >= '1680066360726210000' AND timestamp <= '1680066458000000000') " +
"AND stringTagMap['method'] = 'GET' AND has(stringTagMap, 'name') group by ts having value > 10 order by ts",
"AND stringTagMap['method'] = 'GET' AND has(stringTagMap, 'name') group by ts having value > 10",
PanelType: v3.PanelTypeValue,
},
{
@@ -1108,7 +1119,7 @@ var testBuildTracesQueryData = []struct {
" name FROM signoz_traces.distributed_signoz_index_v2 WHERE parentSpanID = '' AND (timestamp >= '1680066360726210000' AND " +
"timestamp <= '1680066458000000000') AND stringTagMap['method'] = 'GET' ORDER BY durationNano DESC LIMIT 100)" +
" SELECT subQuery.serviceName, subQuery.name, count() AS span_count, subQuery.durationNano, traceID" +
" FROM signoz_traces.distributed_signoz_index_v2 INNER JOIN subQuery ON distributed_signoz_index_v2.traceID" +
" FROM signoz_traces.distributed_signoz_index_v2 GLOBAL INNER JOIN subQuery ON distributed_signoz_index_v2.traceID" +
" = subQuery.traceID GROUP BY traceID, subQuery.durationNano, subQuery.name, subQuery.serviceName " +
"ORDER BY subQuery.durationNano desc;",
PanelType: v3.PanelTypeTrace,
@@ -1120,7 +1131,218 @@ func TestBuildTracesQuery(t *testing.T) {
Convey("TestBuildTracesQuery", t, func() {
query, err := buildTracesQuery(tt.Start, tt.End, tt.Step, tt.BuilderQuery, tt.TableName, map[string]v3.AttributeKey{
"name": {Key: "name", DataType: v3.AttributeKeyDataTypeString, Type: v3.AttributeKeyTypeTag, IsColumn: true},
}, tt.PanelType)
}, tt.PanelType, "")
So(err, ShouldBeNil)
So(query, ShouldEqual, tt.ExpectedQuery)
})
}
}
var testPrepTracesQueryData = []struct {
Name string
PanelType v3.PanelType
Start int64
End int64
BuilderQuery *v3.BuilderQuery
ExpectedQuery string
Keys map[string]v3.AttributeKey
Type string
}{
{
Name: "Test TS with limit- first",
PanelType: v3.PanelTypeGraph,
Start: 1680066360726210000,
End: 1680066458000000000,
BuilderQuery: &v3.BuilderQuery{
QueryName: "A",
AggregateAttribute: v3.AttributeKey{Key: "name", DataType: v3.AttributeKeyDataTypeString, Type: v3.AttributeKeyTypeTag},
AggregateOperator: v3.AggregateOperatorCountDistinct,
Expression: "A",
Filters: &v3.FilterSet{Operator: "AND", Items: []v3.FilterItem{
{Key: v3.AttributeKey{Key: "method", DataType: v3.AttributeKeyDataTypeString, Type: v3.AttributeKeyTypeTag}, Value: "GET", Operator: "="},
},
},
Limit: 10,
StepInterval: 60,
GroupBy: []v3.AttributeKey{{Key: "method", DataType: v3.AttributeKeyDataTypeString, Type: v3.AttributeKeyTypeTag}},
},
ExpectedQuery: "SELECT `method` from (SELECT stringTagMap['method'] as `method`," +
" toFloat64(count(distinct(stringTagMap['name']))) as value from signoz_traces.distributed_signoz_index_v2" +
" where (timestamp >= '1680066360726210000' AND timestamp <= '1680066458000000000') AND" +
" stringTagMap['method'] = 'GET' AND has(stringTagMap, 'method') group by `method` order by value DESC) LIMIT 10",
Keys: map[string]v3.AttributeKey{"name": {Key: "name", DataType: v3.AttributeKeyDataTypeString, Type: v3.AttributeKeyTypeTag, IsColumn: true}},
Type: constants.FirstQueryGraphLimit,
},
{
Name: "Test TS with limit- first - with order by value",
PanelType: v3.PanelTypeGraph,
Start: 1680066360726210000,
End: 1680066458000000000,
BuilderQuery: &v3.BuilderQuery{
QueryName: "A",
AggregateAttribute: v3.AttributeKey{Key: "name", DataType: v3.AttributeKeyDataTypeString, Type: v3.AttributeKeyTypeTag},
AggregateOperator: v3.AggregateOperatorCountDistinct,
Expression: "A",
Filters: &v3.FilterSet{Operator: "AND", Items: []v3.FilterItem{
{Key: v3.AttributeKey{Key: "method", DataType: v3.AttributeKeyDataTypeString, Type: v3.AttributeKeyTypeTag}, Value: "GET", Operator: "="},
},
},
Limit: 10,
StepInterval: 60,
GroupBy: []v3.AttributeKey{{Key: "method", DataType: v3.AttributeKeyDataTypeString, Type: v3.AttributeKeyTypeTag}},
OrderBy: []v3.OrderBy{{ColumnName: constants.SigNozOrderByValue, Order: "ASC"}},
},
ExpectedQuery: "SELECT `method` from (SELECT stringTagMap['method'] as `method`," +
" toFloat64(count(distinct(stringTagMap['name']))) as value from " +
"signoz_traces.distributed_signoz_index_v2 where (timestamp >= '1680066360726210000'" +
" AND timestamp <= '1680066458000000000') AND stringTagMap['method'] = 'GET' AND" +
" has(stringTagMap, 'method') group by `method` order by value ASC) LIMIT 10",
Keys: map[string]v3.AttributeKey{},
Type: constants.FirstQueryGraphLimit,
},
{
Name: "Test TS with limit- first - with order by attribute",
PanelType: v3.PanelTypeGraph,
Start: 1680066360726210000,
End: 1680066458000000000,
BuilderQuery: &v3.BuilderQuery{
QueryName: "A",
AggregateAttribute: v3.AttributeKey{Key: "serviceName", DataType: v3.AttributeKeyDataTypeString, Type: v3.AttributeKeyTypeTag, IsColumn: true},
AggregateOperator: v3.AggregateOperatorCountDistinct,
Expression: "A",
Filters: &v3.FilterSet{},
Limit: 10,
StepInterval: 60,
GroupBy: []v3.AttributeKey{{Key: "serviceName", DataType: v3.AttributeKeyDataTypeString, Type: v3.AttributeKeyTypeTag, IsColumn: true}},
OrderBy: []v3.OrderBy{{ColumnName: "serviceName", Order: "ASC"}},
},
ExpectedQuery: "SELECT `serviceName` from (SELECT serviceName as `serviceName`," +
" toFloat64(count(distinct(serviceName))) as value from " +
"signoz_traces.distributed_signoz_index_v2 where (timestamp >= '1680066360726210000'" +
" AND timestamp <= '1680066458000000000') " +
"group by `serviceName` order by `serviceName` ASC) LIMIT 10",
Keys: map[string]v3.AttributeKey{},
Type: constants.FirstQueryGraphLimit,
},
{
Name: "Test TS with limit- first - with 2 group by and 2 order by",
PanelType: v3.PanelTypeGraph,
Start: 1680066360726210000,
End: 1680066458000000000,
BuilderQuery: &v3.BuilderQuery{
QueryName: "A",
AggregateAttribute: v3.AttributeKey{Key: "serviceName", DataType: v3.AttributeKeyDataTypeString, Type: v3.AttributeKeyTypeTag, IsColumn: true},
AggregateOperator: v3.AggregateOperatorCountDistinct,
Expression: "A",
Filters: &v3.FilterSet{},
Limit: 10,
StepInterval: 60,
GroupBy: []v3.AttributeKey{
{Key: "serviceName", DataType: v3.AttributeKeyDataTypeString, Type: v3.AttributeKeyTypeTag, IsColumn: true},
{Key: "http.method", DataType: v3.AttributeKeyDataTypeString, Type: v3.AttributeKeyTypeTag},
},
OrderBy: []v3.OrderBy{{ColumnName: "serviceName", Order: "ASC"}, {ColumnName: constants.SigNozOrderByValue, Order: "ASC"}},
},
ExpectedQuery: "SELECT `serviceName`,`http.method` from (SELECT serviceName as `serviceName`," +
" stringTagMap['http.method'] as `http.method`," +
" toFloat64(count(distinct(serviceName))) as value from " +
"signoz_traces.distributed_signoz_index_v2 where (timestamp >= '1680066360726210000'" +
" AND timestamp <= '1680066458000000000') AND has(stringTagMap, 'http.method') " +
"group by `serviceName`,`http.method` order by `serviceName` ASC,value ASC) LIMIT 10",
Keys: map[string]v3.AttributeKey{},
Type: constants.FirstQueryGraphLimit,
},
{
Name: "Test TS with limit- second",
PanelType: v3.PanelTypeGraph,
Start: 1680066360726210000,
End: 1680066458000000000,
BuilderQuery: &v3.BuilderQuery{
QueryName: "A",
AggregateAttribute: v3.AttributeKey{Key: "name", DataType: v3.AttributeKeyDataTypeString, Type: v3.AttributeKeyTypeTag},
AggregateOperator: v3.AggregateOperatorCountDistinct,
Expression: "A",
Filters: &v3.FilterSet{Operator: "AND", Items: []v3.FilterItem{
{Key: v3.AttributeKey{Key: "method", DataType: v3.AttributeKeyDataTypeString, Type: v3.AttributeKeyTypeTag}, Value: "GET", Operator: "="},
},
},
GroupBy: []v3.AttributeKey{{Key: "method", DataType: v3.AttributeKeyDataTypeString, Type: v3.AttributeKeyTypeTag}},
Limit: 2,
StepInterval: 60,
},
ExpectedQuery: "SELECT toStartOfInterval(timestamp, INTERVAL 60 SECOND) AS ts, " +
"stringTagMap['method'] as `method`, toFloat64(count(distinct(stringTagMap['name'])))" +
" as value from signoz_traces.distributed_signoz_index_v2 where (timestamp >= '1680066360726210000'" +
" AND timestamp <= '1680066458000000000') AND stringTagMap['method'] = 'GET' AND" +
" has(stringTagMap, 'method') AND (`method`) GLOBAL IN (%s) group by `method`,ts order by value DESC",
Keys: map[string]v3.AttributeKey{},
Type: constants.SecondQueryGraphLimit,
},
{
Name: "Test TS with limit- second - with order by",
PanelType: v3.PanelTypeGraph,
Start: 1680066360726210000,
End: 1680066458000000000,
BuilderQuery: &v3.BuilderQuery{
QueryName: "A",
AggregateAttribute: v3.AttributeKey{Key: "name", DataType: v3.AttributeKeyDataTypeString, Type: v3.AttributeKeyTypeTag},
AggregateOperator: v3.AggregateOperatorCountDistinct,
Expression: "A",
Filters: &v3.FilterSet{Operator: "AND", Items: []v3.FilterItem{
{Key: v3.AttributeKey{Key: "method", DataType: v3.AttributeKeyDataTypeString, Type: v3.AttributeKeyTypeTag}, Value: "GET", Operator: "="},
},
},
GroupBy: []v3.AttributeKey{{Key: "method", DataType: v3.AttributeKeyDataTypeString, Type: v3.AttributeKeyTypeTag}},
OrderBy: []v3.OrderBy{{ColumnName: "method", Order: "ASC"}},
Limit: 2,
StepInterval: 60,
},
ExpectedQuery: "SELECT toStartOfInterval(timestamp, INTERVAL 60 SECOND) AS ts, " +
"stringTagMap['method'] as `method`, toFloat64(count(distinct(stringTagMap['name'])))" +
" as value from signoz_traces.distributed_signoz_index_v2 where (timestamp >= '1680066360726210000'" +
" AND timestamp <= '1680066458000000000') AND stringTagMap['method'] = 'GET' AND" +
" has(stringTagMap, 'method') AND (`method`) GLOBAL IN (%s) group by `method`,ts order by `method` ASC", Keys: map[string]v3.AttributeKey{},
Type: constants.SecondQueryGraphLimit,
},
{
Name: "Test TS with limit - second - with two group by and two order by",
PanelType: v3.PanelTypeGraph,
Start: 1680066360726210000,
End: 1680066458000000000,
BuilderQuery: &v3.BuilderQuery{
QueryName: "A",
AggregateAttribute: v3.AttributeKey{Key: "name", DataType: v3.AttributeKeyDataTypeString, Type: v3.AttributeKeyTypeTag},
AggregateOperator: v3.AggregateOperatorCountDistinct,
Expression: "A",
Filters: &v3.FilterSet{Operator: "AND", Items: []v3.FilterItem{
{Key: v3.AttributeKey{Key: "method", DataType: v3.AttributeKeyDataTypeString, Type: v3.AttributeKeyTypeTag}, Value: "GET", Operator: "="},
},
},
GroupBy: []v3.AttributeKey{
{Key: "method", DataType: v3.AttributeKeyDataTypeString, Type: v3.AttributeKeyTypeTag},
{Key: "name", DataType: v3.AttributeKeyDataTypeString, Type: v3.AttributeKeyTypeTag},
},
OrderBy: []v3.OrderBy{{ColumnName: "method", Order: "ASC"}, {ColumnName: "name", Order: "ASC"}},
Limit: 2,
StepInterval: 60,
},
ExpectedQuery: "SELECT toStartOfInterval(timestamp, INTERVAL 60 SECOND) AS ts, " +
"stringTagMap['method'] as `method`, stringTagMap['name'] as `name`," +
" toFloat64(count(distinct(stringTagMap['name'])))" +
" as value from signoz_traces.distributed_signoz_index_v2 where (timestamp >= '1680066360726210000'" +
" AND timestamp <= '1680066458000000000') AND stringTagMap['method'] = 'GET' AND" +
" has(stringTagMap, 'method') AND has(stringTagMap, 'name') " +
"AND (`method`,`name`) GLOBAL IN (%s) group by `method`,`name`,ts " +
"order by `method` ASC,`name` ASC",
Keys: map[string]v3.AttributeKey{},
Type: constants.SecondQueryGraphLimit,
},
}
func TestPrepareTracesQuery(t *testing.T) {
for _, tt := range testPrepTracesQueryData {
Convey("TestPrepareTracesQuery", t, func() {
query, err := PrepareTracesQuery(tt.Start, tt.End, tt.PanelType, tt.BuilderQuery, tt.Keys, tt.Type)
So(err, ShouldBeNil)
So(query, ShouldEqual, tt.ExpectedQuery)
})

View File

@@ -87,6 +87,13 @@ var DEFAULT_FEATURE_SET = model.FeatureSet{
UsageLimit: -1,
Route: "",
},
model.Feature{
Name: model.UseSpanMetrics,
Active: false,
Usage: 0,
UsageLimit: -1,
Route: "",
},
}
func GetContextTimeout() time.Duration {
@@ -239,7 +246,7 @@ const (
TracesExplorerViewSQLSelectWithSubQuery = "WITH subQuery AS (SELECT distinct on (traceID) traceID, durationNano, " +
"serviceName, name FROM %s.%s WHERE parentSpanID = '' AND %s %s ORDER BY durationNano DESC "
TracesExplorerViewSQLSelectQuery = "SELECT subQuery.serviceName, subQuery.name, count() AS " +
"span_count, subQuery.durationNano, traceID FROM %s.%s INNER JOIN subQuery ON %s.traceID = subQuery.traceID GROUP " +
"span_count, subQuery.durationNano, traceID FROM %s.%s GLOBAL INNER JOIN subQuery ON %s.traceID = subQuery.traceID GROUP " +
"BY traceID, subQuery.durationNano, subQuery.name, subQuery.serviceName ORDER BY subQuery.durationNano desc;"
)
@@ -301,3 +308,6 @@ var StaticFieldsLogsV3 = map[string]v3.AttributeKey{
const SigNozOrderByValue = "#SIGNOZ_VALUE"
const TIMESTAMP = "timestamp"
const FirstQueryGraphLimit = "first_query_graph_limit"
const SecondQueryGraphLimit = "second_query_graph_limit"

View File
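Note on the constants above: besides the FirstQueryGraphLimit / SecondQueryGraphLimit markers used throughout the graph-limit flow and the default-off USE_SPAN_METRICS feature entry, the traces-explorer view query switches to a GLOBAL INNER JOIN. On ClickHouse distributed tables a plain join makes every shard re-execute the right-hand subquery, which itself reads the distributed table, whereas GLOBAL runs it once on the initiating node and broadcasts the temporary result to the shards. The rendered explorer statement, abbreviated from the test expectation earlier in this diff (<filters> stands for the span filters):

explorerQuery := "WITH subQuery AS (SELECT distinct on (traceID) traceID, durationNano, serviceName, name " +
	"FROM signoz_traces.distributed_signoz_index_v2 WHERE parentSpanID = '' AND <filters> " +
	"ORDER BY durationNano DESC LIMIT 100) " +
	"SELECT subQuery.serviceName, subQuery.name, count() AS span_count, subQuery.durationNano, traceID " +
	"FROM signoz_traces.distributed_signoz_index_v2 GLOBAL INNER JOIN subQuery " +
	"ON distributed_signoz_index_v2.traceID = subQuery.traceID " +
	"GROUP BY traceID, subQuery.durationNano, subQuery.name, subQuery.serviceName " +
	"ORDER BY subQuery.durationNano desc;"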

@@ -35,11 +35,13 @@ func main() {
var ruleRepoURL string
var preferDelta bool
var preferSpanMetrics bool
flag.StringVar(&promConfigPath, "config", "./config/prometheus.yml", "(prometheus config to read metrics)")
flag.StringVar(&skipTopLvlOpsPath, "skip-top-level-ops", "", "(config file to skip top level operations)")
flag.BoolVar(&disableRules, "rules.disable", false, "(disable rule evaluation)")
flag.BoolVar(&preferDelta, "prefer-delta", false, "(prefer delta over gauge)")
flag.BoolVar(&preferDelta, "prefer-delta", false, "(prefer delta over cumulative metrics)")
flag.BoolVar(&preferSpanMetrics, "prefer-span-metrics", false, "(prefer span metrics for service level metrics)")
flag.StringVar(&ruleRepoURL, "rules.repo-url", constants.AlertHelpPage, "(host address used to build rule link in alert messages)")
flag.Parse()
@@ -55,6 +57,7 @@ func main() {
PromConfigPath: promConfigPath,
SkipTopLvlOpsPath: skipTopLvlOpsPath,
PreferDelta: preferDelta,
PreferSpanMetrics: preferSpanMetrics,
PrivateHostPort: constants.PrivateHostPort,
DisableRules: disableRules,
RuleRepoURL: ruleRepoURL,

View File
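Note on the hunk above: main.go registers the new --prefer-span-metrics flag (and rewords the --prefer-delta help text) and forwards the value into the server options, from where an earlier hunk passes it on to the API handler as PreferSpanMetrics. A condensed sketch of the plumbing; the ServerOptions package name is assumed and the other fields are elided:

var preferSpanMetrics bool
flag.BoolVar(&preferSpanMetrics, "prefer-span-metrics", false, "(prefer span metrics for service level metrics)")
flag.Parse()

serverOptions := &app.ServerOptions{ // "app" package name assumed, not shown in the hunk
	PreferSpanMetrics: preferSpanMetrics,
	// ...remaining options as in the hunk above...
}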

@@ -13,4 +13,5 @@ const SmartTraceDetail = "SMART_TRACE_DETAIL"
const CustomMetricsFunction = "CUSTOM_METRICS_FUNCTION"
const OSS = "OSS"
const QueryBuilderPanels = "QUERY_BUILDER_PANELS"
const QueryBuilderAlerts = "QUERY_BUILDER_ALERTS"
const UseSpanMetrics = "USE_SPAN_METRICS"

Some files were not shown because too many files have changed in this diff.