Compare commits

...

16 Commits

Author SHA1 Message Date
Shivanshu Raj Shrivastava
aa24227a99 chore: tf testing
Signed-off-by: Shivanshu Raj Shrivastava <shivanshu1333@gmail.com>
2025-04-29 15:57:41 +05:30
Shivanshu Raj Shrivastava
bd3794b7d4 chore: tf testing
Signed-off-by: Shivanshu Raj Shrivastava <shivanshu1333@gmail.com>
2025-04-29 15:49:35 +05:30
Gabber235
ef4e3a30fb fix: black gap on cloud in sidebar (#7383) (#7427)
Co-authored-by: Vikrant Gupta <vikrant@signoz.io>
2025-04-28 20:32:32 +00:00
Vikrant Gupta
39532d5da0 fix(zeus): build pipelines LD flags (#7754) 2025-04-28 19:40:22 +00:00
Vishal Sharma
4d216bae4d feat: init userpilot (#7579) 2025-04-28 18:42:14 +00:00
Vikrant Gupta
21563914c7 fix(ruler): telemetry for rules (#7751)
### Summary

- fix the telemetry for rules and notification channels 
- not adding bun as this needs to go away soon
2025-04-28 23:07:28 +05:30
Vibhu Pandey
accb77f227 chore(use-*): remove use-new-traces-schema and use-new-logs-schema flags (#7741)
### Summary

remove use-new-traces-schema and use-new-logs-schema flags
2025-04-28 21:01:35 +05:30
Vibhu Pandey
e73e1bd078 feat(zeus): add zeus package (#7745)
* feat(zeus): add zeus package

* feat(signoz): add DI for zeus

* feat(zeus): integrate with the codebase

* ci(make): change makefile

* ci: change workflows to point to the new zeus url

* Update ee/query-service/usage/manager.go

Co-authored-by: ellipsis-dev[bot] <65095814+ellipsis-dev[bot]@users.noreply.github.com>

* Update ee/query-service/license/manager.go

Co-authored-by: ellipsis-dev[bot] <65095814+ellipsis-dev[bot]@users.noreply.github.com>

* fix: fix nil retriable

* fix: fix zeus DI

* fix: fix path of ldflag

* feat(zeus): added zeus integration tests

* feat(zeus): added zeus integration tests

* feat(zeus): format the pytest

---------

Co-authored-by: ellipsis-dev[bot] <65095814+ellipsis-dev[bot]@users.noreply.github.com>
Co-authored-by: vikrantgupta25 <vikrant.thomso@gmail.com>
Co-authored-by: Vikrant Gupta <vikrant@signoz.io>
2025-04-28 14:20:47 +00:00
Vikrant Gupta
940313d28b fix(organization): return display name instead of name for organization (#7747)
### Summary

- return display name instead of name for organization
2025-04-28 10:54:53 +00:00
Vibhu Pandey
9815ec7d81 chore: remove references to unused flags (#7739)
### Summary

remove references to unused flags
2025-04-28 09:27:26 +00:00
Vibhu Pandey
a7cad0f1a5 chore(conf): add clickhouse settings (#7743) 2025-04-27 14:26:36 +00:00
SagarRajput-7
a624b4758d chore: fix failing typecheck (#7742) 2025-04-27 19:51:05 +05:30
SagarRajput-7
ee5684b130 feat: added permission restriction for viewer for planned Maintenance (#7736)
* feat: added permission restriction for viewer for planned Maintenance

* feat: added test cases
2025-04-27 17:24:56 +05:30
SagarRajput-7
2f8da5957b feat: added custom single and multiselect components (#7497)
* feat: added new Select component for multi and single select

* feat: refactored code and added keyboard navigations in single select

* feat: different state handling in single select

* feat: updated the playground page

* feat: multi-select updates

* feat: fixed multiselect selection issues

* feat: multiselect cleanup

* feat: multiselect key navigation cleanup

* feat: added tokenization in multiselect

* feat: add on enter and handle duplicates

* feat: design update to the components

* feat: design update to the components

* feat: design update to the components

* feat: updated the playground page

* feat: edited playground data

* feat: edited styles

* feat: code cleanup

* feat: added shift + keys navigation and selection

* feat: improved styles and added darkmode styles

* feat: removed scroll bar hover style

* feat: added scroll bar on hover

* feat: added regex wrapper support

* feat: fixed right arrow navigation across chips

* feat: addressed all the single select feedbacks

* feat: addressed all the single select feedbacks

* feat: added only-all-toggle feat with ALL selection tag

* feat: remove clear, update footer info content and style and misc fixes

* feat: misc style fixes

* feat: added quotes exception to the multiselect tagging

* feat: removing demo page, and cleanup PR for reviews

* feat: resolved comments and refactoring

* feat: added test cases
2025-04-27 16:55:53 +05:30
dependabot[bot]
3f6f77d0e2 chore(deps): bump axios from 1.7.7 to 1.8.2 in /frontend (#7249)
Bumps [axios](https://github.com/axios/axios) from 1.7.7 to 1.8.2.
- [Release notes](https://github.com/axios/axios/releases)
- [Changelog](https://github.com/axios/axios/blob/v1.x/CHANGELOG.md)
- [Commits](https://github.com/axios/axios/compare/v1.7.7...v1.8.2)

---
updated-dependencies:
- dependency-name: axios
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-27 11:16:16 +00:00
Vibhu Pandey
5bceffbeaa fix: fix modules and handler (#7737)
* fix: fix modules and handler

* fix: fix sqlmigration package

* fix: fix other fmt issues

* fix: fix tests

* fix: fix tests
2025-04-27 16:38:34 +05:30
119 changed files with 7656 additions and 2308 deletions

View File

@@ -1,42 +0,0 @@
# Github actions
## Testing the UI manually on each PR
First we need to make sure the UI is ready
* Check the `Start tunnel` step in `e2e-k8s/deploy-on-k3s-cluster` job and make sure you see `your url is: https://pull-<number>-signoz.loca.lt`
* This job will run until the PR is merged or closed to keep the local tunneling alive
- github will cancel this job if the PR wasn't merged after 6h
- if the job was cancel, go to the action and press `Re-run all jobs`
Now you can open your browser at https://pull-<number>-signoz.loca.lt and check the UI.
## Environment Variables
To run GitHub workflow, a few environment variables needs to add in GitHub secrets
<table>
<tr>
<th> Variables </th>
<th> Description </th>
<th> Example </th>
</tr>
<tr>
<td> REPONAME </td>
<td> Provide the DockerHub user/organisation name of the image. </td>
<td> signoz</td>
</tr>
<tr>
<td> DOCKERHUB_USERNAME </td>
<td> Docker hub username </td>
<td> signoz</td>
</tr>
<tr>
<td> DOCKERHUB_TOKEN </td>
<td> Docker hub password/token with push permission </td>
<td> **** </td>
</tr>
<tr>
<td> SONAR_TOKEN </td>
<td> <a href="https://sonarcloud.io">SonarCloud</a> token </td>
<td> **** </td>
</tr>

View File

@@ -104,6 +104,8 @@ jobs:
-X github.com/SigNoz/signoz/pkg/version.hash=${{ needs.prepare.outputs.hash }}
-X github.com/SigNoz/signoz/pkg/version.time=${{ needs.prepare.outputs.time }}
-X github.com/SigNoz/signoz/pkg/version.branch=${{ needs.prepare.outputs.branch }}
-X github.com/SigNoz/signoz/ee/zeus.url=https://api.signoz.cloud
-X github.com/SigNoz/signoz/ee/zeus.deprecatedURL=https://license.signoz.io
-X github.com/SigNoz/signoz/ee/query-service/constants.ZeusURL=https://api.signoz.cloud
-X github.com/SigNoz/signoz/ee/query-service/constants.LicenseSignozIo=https://license.signoz.io/api/v1'
GO_CGO_ENABLED: 1

View File

@@ -101,6 +101,8 @@ jobs:
-X github.com/SigNoz/signoz/pkg/version.hash=${{ needs.prepare.outputs.hash }}
-X github.com/SigNoz/signoz/pkg/version.time=${{ needs.prepare.outputs.time }}
-X github.com/SigNoz/signoz/pkg/version.branch=${{ needs.prepare.outputs.branch }}
-X github.com/SigNoz/signoz/ee/zeus.url=https://api.staging.signoz.cloud
-X github.com/SigNoz/signoz/ee/zeus.deprecatedURL=https://license.staging.signoz.cloud
-X github.com/SigNoz/signoz/ee/query-service/constants.ZeusURL=https://api.staging.signoz.cloud
-X github.com/SigNoz/signoz/ee/query-service/constants.LicenseSignozIo=https://license.staging.signoz.cloud/api/v1'
GO_CGO_ENABLED: 1

View File

@@ -1,16 +0,0 @@
name: remove-label
on:
pull_request_target:
types: [synchronize]
jobs:
remove:
runs-on: ubuntu-latest
steps:
- name: Remove label testing-deploy from PR
uses: buildsville/add-remove-label@v2.0.0
with:
label: testing-deploy
type: remove
token: ${{ secrets.GITHUB_TOKEN }}

View File

@@ -14,9 +14,9 @@ ARCHS ?= amd64 arm64
TARGET_DIR ?= $(shell pwd)/target
ZEUS_URL ?= https://api.signoz.cloud
GO_BUILD_LDFLAG_ZEUS_URL = -X github.com/SigNoz/signoz/ee/query-service/constants.ZeusURL=$(ZEUS_URL)
LICENSE_URL ?= https://license.signoz.io/api/v1
GO_BUILD_LDFLAG_LICENSE_SIGNOZ_IO = -X github.com/SigNoz/signoz/ee/query-service/constants.LicenseSignozIo=$(LICENSE_URL)
GO_BUILD_LDFLAG_ZEUS_URL = -X github.com/SigNoz/signoz/ee/zeus.url=$(ZEUS_URL)
LICENSE_URL ?= https://license.signoz.io
GO_BUILD_LDFLAG_LICENSE_SIGNOZ_IO = -X github.com/SigNoz/signoz/ee/zeus.deprecatedURL=$(LICENSE_URL)
GO_BUILD_VERSION_LDFLAGS = -X github.com/SigNoz/signoz/pkg/version.version=$(VERSION) -X github.com/SigNoz/signoz/pkg/version.hash=$(COMMIT_SHORT_SHA) -X github.com/SigNoz/signoz/pkg/version.time=$(TIMESTAMP) -X github.com/SigNoz/signoz/pkg/version.branch=$(BRANCH_NAME)
GO_BUILD_ARCHS_COMMUNITY = $(addprefix go-build-community-,$(ARCHS))

View File

@@ -103,6 +103,13 @@ telemetrystore:
clickhouse:
# The DSN to use for clickhouse.
dsn: tcp://localhost:9000
# The query settings for clickhouse.
settings:
max_execution_time: 0
max_execution_time_leaf: 0
timeout_before_checking_execution_speed: 0
max_bytes_to_read: 0
max_result_rows_for_ch_query: 0
##################### Prometheus #####################
prometheus:
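
Note: the settings block above maps to ClickHouse query-level settings. As a rough sketch only (not the actual telemetrystore wiring, which this diff does not show), such settings are typically attached per query with clickhouse-go v2. The connection address below is a placeholder, and max_result_rows_for_ch_query is omitted because it looks like a SigNoz-internal key rather than a native ClickHouse setting.

package main

import (
	"context"
	"log"

	"github.com/ClickHouse/clickhouse-go/v2"
)

func main() {
	conn, err := clickhouse.Open(&clickhouse.Options{Addr: []string{"localhost:9000"}})
	if err != nil {
		log.Fatal(err)
	}
	// Attach per-query settings to the context; 0 keeps ClickHouse's default behaviour.
	ctx := clickhouse.Context(context.Background(), clickhouse.WithSettings(clickhouse.Settings{
		"max_execution_time":                      0,
		"timeout_before_checking_execution_speed": 0,
		"max_bytes_to_read":                       0,
	}))
	if err := conn.Exec(ctx, "SELECT 1"); err != nil {
		log.Fatal(err)
	}
}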

View File

@@ -35,6 +35,8 @@ builds:
- -X github.com/SigNoz/signoz/pkg/version.hash={{ .ShortCommit }}
- -X github.com/SigNoz/signoz/pkg/version.time={{ .CommitTimestamp }}
- -X github.com/SigNoz/signoz/pkg/version.branch={{ .Branch }}
- -X github.com/SigNoz/signoz/ee/zeus.url=https://api.signoz.cloud
- -X github.com/SigNoz/signoz/ee/zeus.deprecatedURL=https://license.signoz.io
- -X github.com/SigNoz/signoz/ee/query-service/constants.ZeusURL=https://api.signoz.cloud
- -X github.com/SigNoz/signoz/ee/query-service/constants.LicenseSignozIo=https://license.signoz.io/api/v1
- >-

View File

@@ -13,9 +13,6 @@ import (
"github.com/SigNoz/signoz/pkg/alertmanager"
"github.com/SigNoz/signoz/pkg/apis/fields"
"github.com/SigNoz/signoz/pkg/http/middleware"
"github.com/SigNoz/signoz/pkg/modules/organization/implorganization"
"github.com/SigNoz/signoz/pkg/modules/preference"
preferencecore "github.com/SigNoz/signoz/pkg/modules/preference/core"
baseapp "github.com/SigNoz/signoz/pkg/query-service/app"
"github.com/SigNoz/signoz/pkg/query-service/app/cloudintegrations"
"github.com/SigNoz/signoz/pkg/query-service/app/integrations"
@@ -26,14 +23,12 @@ import (
rules "github.com/SigNoz/signoz/pkg/query-service/rules"
"github.com/SigNoz/signoz/pkg/signoz"
"github.com/SigNoz/signoz/pkg/types/authtypes"
"github.com/SigNoz/signoz/pkg/types/preferencetypes"
"github.com/SigNoz/signoz/pkg/version"
"github.com/gorilla/mux"
)
type APIHandlerOptions struct {
DataConnector interfaces.DataConnector
SkipConfig *basemodel.SkipConfig
PreferSpanMetrics bool
AppDao dao.ModelDao
RulesManager *rules.Manager
@@ -60,13 +55,8 @@ type APIHandler struct {
// NewAPIHandler returns an APIHandler
func NewAPIHandler(opts APIHandlerOptions, signoz *signoz.SigNoz) (*APIHandler, error) {
preference := preference.NewAPI(preferencecore.NewPreference(preferencecore.NewStore(signoz.SQLStore), preferencetypes.NewDefaultPreferenceMap()))
organizationAPI := implorganization.NewAPI(implorganization.NewModule(implorganization.NewStore(signoz.SQLStore)))
organizationModule := implorganization.NewModule(implorganization.NewStore(signoz.SQLStore))
baseHandler, err := baseapp.NewAPIHandler(baseapp.APIHandlerOpts{
Reader: opts.DataConnector,
SkipConfig: opts.SkipConfig,
PreferSpanMetrics: opts.PreferSpanMetrics,
AppDao: opts.AppDao,
RuleManager: opts.RulesManager,
@@ -76,14 +66,9 @@ func NewAPIHandler(opts APIHandlerOptions, signoz *signoz.SigNoz) (*APIHandler,
LogsParsingPipelineController: opts.LogsParsingPipelineController,
Cache: opts.Cache,
FluxInterval: opts.FluxInterval,
UseLogsNewSchema: opts.UseLogsNewSchema,
UseTraceNewSchema: opts.UseTraceNewSchema,
AlertmanagerAPI: alertmanager.NewAPI(signoz.Alertmanager),
FieldsAPI: fields.NewAPI(signoz.TelemetryStore),
Signoz: signoz,
Preference: preference,
OrganizationAPI: organizationAPI,
OrganizationModule: organizationModule,
})
if err != nil {

View File

@@ -134,7 +134,7 @@ func (ah *APIHandler) registerUser(w http.ResponseWriter, r *http.Request) {
return
}
_, registerError := baseauth.Register(ctx, req, ah.Signoz.Alertmanager, ah.OrganizationModule)
_, registerError := baseauth.Register(ctx, req, ah.Signoz.Alertmanager, ah.Signoz.Modules.Organization)
if !registerError.IsNil() {
RespondError(w, apierr, nil)
return
@@ -152,7 +152,7 @@ func (ah *APIHandler) getInvite(w http.ResponseWriter, r *http.Request) {
token := mux.Vars(r)["token"]
sourceUrl := r.URL.Query().Get("ref")
inviteObject, err := baseauth.GetInvite(r.Context(), token, ah.OrganizationModule)
inviteObject, err := baseauth.GetInvite(r.Context(), token, ah.Signoz.Modules.Organization)
if err != nil {
RespondError(w, model.BadRequest(err), nil)
return

View File

@@ -9,6 +9,8 @@ import (
"github.com/SigNoz/signoz/ee/query-service/integrations/signozio"
"github.com/SigNoz/signoz/ee/query-service/model"
"github.com/SigNoz/signoz/pkg/http/render"
"github.com/SigNoz/signoz/pkg/query-service/telemetry"
"github.com/SigNoz/signoz/pkg/types/authtypes"
)
type DayWiseBreakdown struct {
@@ -90,8 +92,13 @@ func (ah *APIHandler) getActiveLicenseV3(w http.ResponseWriter, r *http.Request)
// this function is called by zeus when inserting licenses in the query-service
func (ah *APIHandler) applyLicenseV3(w http.ResponseWriter, r *http.Request) {
var licenseKey ApplyLicenseRequest
claims, err := authtypes.ClaimsFromContext(r.Context())
if err != nil {
render.Error(w, err)
return
}
var licenseKey ApplyLicenseRequest
if err := json.NewDecoder(r.Body).Decode(&licenseKey); err != nil {
RespondError(w, model.BadRequest(err), nil)
return
@@ -102,9 +109,10 @@ func (ah *APIHandler) applyLicenseV3(w http.ResponseWriter, r *http.Request) {
return
}
_, apiError := ah.LM().ActivateV3(r.Context(), licenseKey.LicenseKey)
if apiError != nil {
RespondError(w, apiError, nil)
_, err = ah.LM().ActivateV3(r.Context(), licenseKey.LicenseKey)
if err != nil {
telemetry.GetInstance().SendEvent(telemetry.TELEMETRY_LICENSE_ACT_FAILED, map[string]interface{}{"err": err.Error()}, claims.Email, true, false)
render.Error(w, err)
return
}
@@ -112,10 +120,9 @@ func (ah *APIHandler) applyLicenseV3(w http.ResponseWriter, r *http.Request) {
}
func (ah *APIHandler) refreshLicensesV3(w http.ResponseWriter, r *http.Request) {
apiError := ah.LM().RefreshLicense(r.Context())
if apiError != nil {
RespondError(w, apiError, nil)
err := ah.LM().RefreshLicense(r.Context())
if err != nil {
render.Error(w, err)
return
}
@@ -127,7 +134,6 @@ func getCheckoutPortalResponse(redirectURL string) *Redirect {
}
func (ah *APIHandler) checkout(w http.ResponseWriter, r *http.Request) {
checkoutRequest := &model.CheckoutRequest{}
if err := json.NewDecoder(r.Body).Decode(checkoutRequest); err != nil {
RespondError(w, model.BadRequest(err), nil)
@@ -140,9 +146,9 @@ func (ah *APIHandler) checkout(w http.ResponseWriter, r *http.Request) {
return
}
redirectUrl, err := signozio.CheckoutSession(r.Context(), checkoutRequest, license.Key)
redirectUrl, err := signozio.CheckoutSession(r.Context(), checkoutRequest, license.Key, ah.Signoz.Zeus)
if err != nil {
RespondError(w, err, nil)
render.Error(w, err)
return
}
@@ -230,7 +236,6 @@ func (ah *APIHandler) listLicensesV2(w http.ResponseWriter, r *http.Request) {
}
func (ah *APIHandler) portalSession(w http.ResponseWriter, r *http.Request) {
portalRequest := &model.PortalRequest{}
if err := json.NewDecoder(r.Body).Decode(portalRequest); err != nil {
RespondError(w, model.BadRequest(err), nil)
@@ -243,9 +248,9 @@ func (ah *APIHandler) portalSession(w http.ResponseWriter, r *http.Request) {
return
}
redirectUrl, err := signozio.PortalSession(r.Context(), portalRequest, license.Key)
redirectUrl, err := signozio.PortalSession(r.Context(), portalRequest, license.Key, ah.Signoz.Zeus)
if err != nil {
RespondError(w, err, nil)
render.Error(w, err)
return
}

View File

@@ -119,7 +119,7 @@ func (ah *APIHandler) updatePAT(w http.ResponseWriter, r *http.Request) {
req.UpdatedByUserID = claims.UserID
req.UpdatedAt = time.Now()
zap.L().Info("Got Update PAT request", zap.Any("pat", req))
zap.L().Info("Got UpdateSteps PAT request", zap.Any("pat", req))
var apierr basemodel.BaseApiError
if apierr = ah.AppDao().UpdatePAT(r.Context(), claims.OrgID, req, id); apierr != nil {
RespondError(w, apierr, nil)

View File

@@ -23,12 +23,10 @@ func NewDataConnector(
telemetryStore telemetrystore.TelemetryStore,
prometheus prometheus.Prometheus,
cluster string,
useLogsNewSchema bool,
useTraceNewSchema bool,
fluxIntervalForTraceDetail time.Duration,
cache cache.Cache,
) *ClickhouseReader {
chReader := basechr.NewReader(sqlDB, telemetryStore, prometheus, cluster, useLogsNewSchema, useTraceNewSchema, fluxIntervalForTraceDetail, cache)
chReader := basechr.NewReader(sqlDB, telemetryStore, prometheus, cluster, fluxIntervalForTraceDetail, cache)
return &ClickhouseReader{
conn: telemetryStore.ClickhouseDB(),
appdb: sqlDB,

View File

@@ -45,33 +45,23 @@ import (
baseconst "github.com/SigNoz/signoz/pkg/query-service/constants"
"github.com/SigNoz/signoz/pkg/query-service/healthcheck"
baseint "github.com/SigNoz/signoz/pkg/query-service/interfaces"
basemodel "github.com/SigNoz/signoz/pkg/query-service/model"
baserules "github.com/SigNoz/signoz/pkg/query-service/rules"
"github.com/SigNoz/signoz/pkg/query-service/telemetry"
"github.com/SigNoz/signoz/pkg/query-service/utils"
"go.uber.org/zap"
)
const AppDbEngine = "sqlite"
type ServerOptions struct {
Config signoz.Config
SigNoz *signoz.SigNoz
PromConfigPath string
SkipTopLvlOpsPath string
HTTPHostPort string
PrivateHostPort string
// alert specific params
DisableRules bool
RuleRepoURL string
Config signoz.Config
SigNoz *signoz.SigNoz
HTTPHostPort string
PrivateHostPort string
PreferSpanMetrics bool
CacheConfigPath string
FluxInterval string
FluxIntervalForTraceDetail string
Cluster string
GatewayUrl string
UseLogsNewSchema bool
UseTraceNewSchema bool
Jwt *authtypes.JWT
}
@@ -122,7 +112,7 @@ func NewServer(serverOptions *ServerOptions) (*Server, error) {
}
// initiate license manager
lm, err := licensepkg.StartManager(serverOptions.SigNoz.SQLStore.SQLxDB(), serverOptions.SigNoz.SQLStore)
lm, err := licensepkg.StartManager(serverOptions.SigNoz.SQLStore.SQLxDB(), serverOptions.SigNoz.SQLStore, serverOptions.SigNoz.Zeus)
if err != nil {
return nil, err
}
@@ -140,20 +130,10 @@ func NewServer(serverOptions *ServerOptions) (*Server, error) {
serverOptions.SigNoz.TelemetryStore,
serverOptions.SigNoz.Prometheus,
serverOptions.Cluster,
serverOptions.UseLogsNewSchema,
serverOptions.UseTraceNewSchema,
fluxIntervalForTraceDetail,
serverOptions.SigNoz.Cache,
)
skipConfig := &basemodel.SkipConfig{}
if serverOptions.SkipTopLvlOpsPath != "" {
// read skip config
skipConfig, err = basemodel.ReadSkipConfig(serverOptions.SkipTopLvlOpsPath)
if err != nil {
return nil, err
}
}
var c cache.Cache
if serverOptions.CacheConfigPath != "" {
cacheOpts, err := cache.LoadFromYAMLCacheConfigFile(serverOptions.CacheConfigPath)
@@ -164,13 +144,9 @@ func NewServer(serverOptions *ServerOptions) (*Server, error) {
}
rm, err := makeRulesManager(
serverOptions.RuleRepoURL,
serverOptions.SigNoz.SQLStore.SQLxDB(),
reader,
c,
serverOptions.DisableRules,
serverOptions.UseLogsNewSchema,
serverOptions.UseTraceNewSchema,
serverOptions.SigNoz.Alertmanager,
serverOptions.SigNoz.SQLStore,
serverOptions.SigNoz.TelemetryStore,
@@ -219,7 +195,7 @@ func NewServer(serverOptions *ServerOptions) (*Server, error) {
}
// start the usagemanager
usageManager, err := usage.New(modelDao, lm.GetRepo(), serverOptions.SigNoz.TelemetryStore.ClickhouseDB(), serverOptions.Config.TelemetryStore.Clickhouse.DSN)
usageManager, err := usage.New(modelDao, lm.GetRepo(), serverOptions.SigNoz.TelemetryStore.ClickhouseDB(), serverOptions.SigNoz.Zeus)
if err != nil {
return nil, err
}
@@ -238,7 +214,6 @@ func NewServer(serverOptions *ServerOptions) (*Server, error) {
apiOpts := api.APIHandlerOptions{
DataConnector: reader,
SkipConfig: skipConfig,
PreferSpanMetrics: serverOptions.PreferSpanMetrics,
AppDao: modelDao,
RulesManager: rm,
@@ -252,8 +227,6 @@ func NewServer(serverOptions *ServerOptions) (*Server, error) {
FluxInterval: fluxInterval,
Gateway: gatewayProxy,
GatewayUrl: serverOptions.GatewayUrl,
UseLogsNewSchema: serverOptions.UseLogsNewSchema,
UseTraceNewSchema: serverOptions.UseTraceNewSchema,
JWT: serverOptions.Jwt,
}
@@ -263,8 +236,6 @@ func NewServer(serverOptions *ServerOptions) (*Server, error) {
}
s := &Server{
// logger: logger,
// tracer: tracer,
ruleManager: rm,
serverOptions: serverOptions,
unavailableChannel: make(chan healthcheck.Status),
@@ -356,6 +327,7 @@ func (s *Server) createPublicServer(apiHandler *api.APIHandler, web web.Web) (*h
apiHandler.RegisterMessagingQueuesRoutes(r, am)
apiHandler.RegisterThirdPartyApiRoutes(r, am)
apiHandler.MetricExplorerRoutes(r, am)
apiHandler.RegisterTraceFunnelsRoutes(r, am)
c := cors.New(cors.Options{
AllowedOrigins: []string{"*"},
@@ -411,13 +383,7 @@ func (s *Server) initListeners() error {
// Start listening on http and private http port concurrently
func (s *Server) Start(ctx context.Context) error {
// initiate rule manager first
if !s.serverOptions.DisableRules {
s.ruleManager.Start(ctx)
} else {
zap.L().Info("msg: Rules disabled as rules.disable is set to TRUE")
}
s.ruleManager.Start(ctx)
err := s.initListeners()
if err != nil {
@@ -508,13 +474,9 @@ func (s *Server) Stop() error {
}
func makeRulesManager(
ruleRepoURL string,
db *sqlx.DB,
ch baseint.Reader,
cache cache.Cache,
disableRules bool,
useLogsNewSchema bool,
useTraceNewSchema bool,
alertmanager alertmanager.Alertmanager,
sqlstore sqlstore.SQLStore,
telemetryStore telemetrystore.TelemetryStore,
@@ -524,17 +486,13 @@ func makeRulesManager(
managerOpts := &baserules.ManagerOptions{
TelemetryStore: telemetryStore,
Prometheus: prometheus,
RepoURL: ruleRepoURL,
DBConn: db,
Context: context.Background(),
Logger: zap.L(),
DisableRules: disableRules,
Reader: ch,
Cache: cache,
EvalDelay: baseconst.GetEvalDelay(),
PrepareTaskFunc: rules.PrepareTaskFunc,
UseLogsNewSchema: useLogsNewSchema,
UseTraceNewSchema: useTraceNewSchema,
PrepareTestRuleFunc: rules.TestNotification,
Alertmanager: alertmanager,
SQLStore: sqlstore,

View File

@@ -1,16 +0,0 @@
package signozio
type status string
type ValidateLicenseResponse struct {
Status status `json:"status"`
Data map[string]interface{} `json:"data"`
}
type CheckoutSessionRedirect struct {
RedirectURL string `json:"url"`
}
type CheckoutResponse struct {
Status status `json:"status"`
Data CheckoutSessionRedirect `json:"data"`
}

View File

@@ -1,222 +1,67 @@
package signozio
import (
"bytes"
"context"
"encoding/json"
"fmt"
"io"
"net/http"
"time"
"github.com/SigNoz/signoz/ee/query-service/constants"
"github.com/SigNoz/signoz/ee/query-service/model"
"github.com/pkg/errors"
"github.com/SigNoz/signoz/pkg/zeus"
"github.com/tidwall/gjson"
)
var C *Client
const (
POST = "POST"
APPLICATION_JSON = "application/json"
)
type Client struct {
Prefix string
GatewayUrl string
}
func New() *Client {
return &Client{
Prefix: constants.LicenseSignozIo,
GatewayUrl: constants.ZeusURL,
}
}
func init() {
C = New()
}
func ValidateLicenseV3(licenseKey string) (*model.LicenseV3, *model.ApiError) {
// Creating an HTTP client with a timeout for better control
client := &http.Client{
Timeout: 10 * time.Second,
}
req, err := http.NewRequest("GET", C.GatewayUrl+"/v2/licenses/me", nil)
if err != nil {
return nil, model.BadRequest(errors.Wrap(err, "failed to create request"))
}
// Setting the custom header
req.Header.Set("X-Signoz-Cloud-Api-Key", licenseKey)
response, err := client.Do(req)
if err != nil {
return nil, model.BadRequest(errors.Wrap(err, "failed to make post request"))
}
body, err := io.ReadAll(response.Body)
if err != nil {
return nil, model.BadRequest(errors.Wrap(err, fmt.Sprintf("failed to read validation response from %v", C.GatewayUrl)))
}
defer response.Body.Close()
switch response.StatusCode {
case 200:
a := ValidateLicenseResponse{}
err = json.Unmarshal(body, &a)
if err != nil {
return nil, model.BadRequest(errors.Wrap(err, "failed to marshal license validation response"))
}
license, err := model.NewLicenseV3(a.Data)
if err != nil {
return nil, model.BadRequest(errors.Wrap(err, "failed to generate new license v3"))
}
return license, nil
case 400:
return nil, model.BadRequest(errors.Wrap(fmt.Errorf(string(body)),
fmt.Sprintf("bad request error received from %v", C.GatewayUrl)))
case 401:
return nil, model.Unauthorized(errors.Wrap(fmt.Errorf(string(body)),
fmt.Sprintf("unauthorized request error received from %v", C.GatewayUrl)))
default:
return nil, model.InternalError(errors.Wrap(fmt.Errorf(string(body)),
fmt.Sprintf("internal request error received from %v", C.GatewayUrl)))
}
}
func NewPostRequestWithCtx(ctx context.Context, url string, contentType string, body io.Reader) (*http.Request, error) {
req, err := http.NewRequestWithContext(ctx, POST, url, body)
func ValidateLicenseV3(ctx context.Context, licenseKey string, zeus zeus.Zeus) (*model.LicenseV3, error) {
data, err := zeus.GetLicense(ctx, licenseKey)
if err != nil {
return nil, err
}
req.Header.Add("Content-Type", contentType)
return req, err
var m map[string]any
if err = json.Unmarshal(data, &m); err != nil {
return nil, err
}
license, err := model.NewLicenseV3(m)
if err != nil {
return nil, err
}
return license, nil
}
// SendUsage reports the usage of signoz to license server
func SendUsage(ctx context.Context, usage model.UsagePayload) *model.ApiError {
reqString, _ := json.Marshal(usage)
req, err := NewPostRequestWithCtx(ctx, C.Prefix+"/usage", APPLICATION_JSON, bytes.NewBuffer(reqString))
func SendUsage(ctx context.Context, usage model.UsagePayload, zeus zeus.Zeus) error {
body, err := json.Marshal(usage)
if err != nil {
return model.BadRequest(errors.Wrap(err, "unable to create http request"))
return err
}
res, err := http.DefaultClient.Do(req)
if err != nil {
return model.BadRequest(errors.Wrap(err, "unable to connect with license.signoz.io, please check your network connection"))
}
body, err := io.ReadAll(res.Body)
if err != nil {
return model.BadRequest(errors.Wrap(err, "failed to read usage response from license.signoz.io"))
}
defer res.Body.Close()
switch res.StatusCode {
case 200, 201:
return nil
case 400, 401:
return model.BadRequest(errors.Wrap(errors.New(string(body)),
"bad request error received from license.signoz.io"))
default:
return model.InternalError(errors.Wrap(errors.New(string(body)),
"internal error received from license.signoz.io"))
}
return zeus.PutMeters(ctx, usage.LicenseKey.String(), body)
}
func CheckoutSession(ctx context.Context, checkoutRequest *model.CheckoutRequest, licenseKey string) (string, *model.ApiError) {
hClient := &http.Client{}
reqString, err := json.Marshal(checkoutRequest)
func CheckoutSession(ctx context.Context, checkoutRequest *model.CheckoutRequest, licenseKey string, zeus zeus.Zeus) (string, error) {
body, err := json.Marshal(checkoutRequest)
if err != nil {
return "", model.BadRequest(err)
return "", err
}
req, err := http.NewRequestWithContext(ctx, "POST", C.GatewayUrl+"/v2/subscriptions/me/sessions/checkout", bytes.NewBuffer(reqString))
response, err := zeus.GetCheckoutURL(ctx, licenseKey, body)
if err != nil {
return "", model.BadRequest(err)
return "", err
}
req.Header.Set("X-Signoz-Cloud-Api-Key", licenseKey)
response, err := hClient.Do(req)
if err != nil {
return "", model.BadRequest(err)
}
body, err := io.ReadAll(response.Body)
if err != nil {
return "", model.BadRequest(errors.Wrap(err, fmt.Sprintf("failed to read checkout response from %v", C.GatewayUrl)))
}
defer response.Body.Close()
switch response.StatusCode {
case 201:
a := CheckoutResponse{}
err = json.Unmarshal(body, &a)
if err != nil {
return "", model.BadRequest(errors.Wrap(err, "failed to unmarshal zeus checkout response"))
}
return a.Data.RedirectURL, nil
case 400:
return "", model.BadRequest(errors.Wrap(errors.New(string(body)),
fmt.Sprintf("bad request error received from %v", C.GatewayUrl)))
case 401:
return "", model.Unauthorized(errors.Wrap(errors.New(string(body)),
fmt.Sprintf("unauthorized request error received from %v", C.GatewayUrl)))
default:
return "", model.InternalError(errors.Wrap(errors.New(string(body)),
fmt.Sprintf("internal request error received from %v", C.GatewayUrl)))
}
return gjson.GetBytes(response, "url").String(), nil
}
func PortalSession(ctx context.Context, checkoutRequest *model.PortalRequest, licenseKey string) (string, *model.ApiError) {
hClient := &http.Client{}
reqString, err := json.Marshal(checkoutRequest)
func PortalSession(ctx context.Context, portalRequest *model.PortalRequest, licenseKey string, zeus zeus.Zeus) (string, error) {
body, err := json.Marshal(portalRequest)
if err != nil {
return "", model.BadRequest(err)
return "", err
}
req, err := http.NewRequestWithContext(ctx, "POST", C.GatewayUrl+"/v2/subscriptions/me/sessions/portal", bytes.NewBuffer(reqString))
response, err := zeus.GetPortalURL(ctx, licenseKey, body)
if err != nil {
return "", model.BadRequest(err)
return "", err
}
req.Header.Set("X-Signoz-Cloud-Api-Key", licenseKey)
response, err := hClient.Do(req)
if err != nil {
return "", model.BadRequest(err)
}
body, err := io.ReadAll(response.Body)
if err != nil {
return "", model.BadRequest(errors.Wrap(err, fmt.Sprintf("failed to read portal response from %v", C.GatewayUrl)))
}
defer response.Body.Close()
switch response.StatusCode {
case 201:
a := CheckoutResponse{}
err = json.Unmarshal(body, &a)
if err != nil {
return "", model.BadRequest(errors.Wrap(err, "failed to unmarshal zeus portal response"))
}
return a.Data.RedirectURL, nil
case 400:
return "", model.BadRequest(errors.Wrap(errors.New(string(body)),
fmt.Sprintf("bad request error received from %v", C.GatewayUrl)))
case 401:
return "", model.Unauthorized(errors.Wrap(errors.New(string(body)),
fmt.Sprintf("unauthorized request error received from %v", C.GatewayUrl)))
default:
return "", model.InternalError(errors.Wrap(errors.New(string(body)),
fmt.Sprintf("internal request error received from %v", C.GatewayUrl)))
}
return gjson.GetBytes(response, "url").String(), nil
}

View File

@@ -6,14 +6,13 @@ import (
"time"
"github.com/jmoiron/sqlx"
"github.com/pkg/errors"
"sync"
baseconstants "github.com/SigNoz/signoz/pkg/query-service/constants"
"github.com/SigNoz/signoz/pkg/sqlstore"
"github.com/SigNoz/signoz/pkg/types"
"github.com/SigNoz/signoz/pkg/types/authtypes"
"github.com/SigNoz/signoz/pkg/zeus"
validate "github.com/SigNoz/signoz/ee/query-service/integrations/signozio"
"github.com/SigNoz/signoz/ee/query-service/model"
@@ -29,6 +28,7 @@ var validationFrequency = 24 * 60 * time.Minute
type Manager struct {
repo *Repo
zeus zeus.Zeus
mutex sync.Mutex
validatorRunning bool
// end the license validation, this is important to gracefully
@@ -45,7 +45,7 @@ type Manager struct {
activeFeatures basemodel.FeatureSet
}
func StartManager(db *sqlx.DB, store sqlstore.SQLStore, features ...basemodel.Feature) (*Manager, error) {
func StartManager(db *sqlx.DB, store sqlstore.SQLStore, zeus zeus.Zeus, features ...basemodel.Feature) (*Manager, error) {
if LM != nil {
return LM, nil
}
@@ -53,6 +53,7 @@ func StartManager(db *sqlx.DB, store sqlstore.SQLStore, features ...basemodel.Fe
repo := NewLicenseRepo(db, store)
m := &Manager{
repo: &repo,
zeus: zeus,
}
if err := m.start(features...); err != nil {
return m, err
@@ -172,17 +173,15 @@ func (lm *Manager) ValidatorV3(ctx context.Context) {
}
}
func (lm *Manager) RefreshLicense(ctx context.Context) *model.ApiError {
license, apiError := validate.ValidateLicenseV3(lm.activeLicenseV3.Key)
if apiError != nil {
zap.L().Error("failed to validate license", zap.Error(apiError.Err))
return apiError
func (lm *Manager) RefreshLicense(ctx context.Context) error {
license, err := validate.ValidateLicenseV3(ctx, lm.activeLicenseV3.Key, lm.zeus)
if err != nil {
return err
}
err := lm.repo.UpdateLicenseV3(ctx, license)
err = lm.repo.UpdateLicenseV3(ctx, license)
if err != nil {
return model.BadRequest(errors.Wrap(err, "failed to update the new license"))
return err
}
lm.SetActiveV3(license)
@@ -190,7 +189,6 @@ func (lm *Manager) RefreshLicense(ctx context.Context) *model.ApiError {
}
func (lm *Manager) ValidateV3(ctx context.Context) (reterr error) {
zap.L().Info("License validation started")
if lm.activeLicenseV3 == nil {
return nil
}
@@ -236,28 +234,17 @@ func (lm *Manager) ValidateV3(ctx context.Context) (reterr error) {
return nil
}
func (lm *Manager) ActivateV3(ctx context.Context, licenseKey string) (licenseResponse *model.LicenseV3, errResponse *model.ApiError) {
defer func() {
if errResponse != nil {
claims, err := authtypes.ClaimsFromContext(ctx)
if err != nil {
telemetry.GetInstance().SendEvent(telemetry.TELEMETRY_LICENSE_ACT_FAILED,
map[string]interface{}{"err": errResponse.Err.Error()}, claims.Email, true, false)
}
}
}()
license, apiError := validate.ValidateLicenseV3(licenseKey)
if apiError != nil {
zap.L().Error("failed to get the license", zap.Error(apiError.Err))
return nil, apiError
func (lm *Manager) ActivateV3(ctx context.Context, licenseKey string) (*model.LicenseV3, error) {
license, err := validate.ValidateLicenseV3(ctx, licenseKey, lm.zeus)
if err != nil {
return nil, err
}
// insert the new license to the sqlite db
err := lm.repo.InsertLicenseV3(ctx, license)
if err != nil {
zap.L().Error("failed to activate license", zap.Error(err))
return nil, err
modelErr := lm.repo.InsertLicenseV3(ctx, license)
if modelErr != nil {
zap.L().Error("failed to activate license", zap.Error(modelErr))
return nil, modelErr
}
// license is valid, activate it

View File

@@ -8,6 +8,8 @@ import (
"github.com/SigNoz/signoz/ee/query-service/app"
"github.com/SigNoz/signoz/ee/sqlstore/postgressqlstore"
"github.com/SigNoz/signoz/ee/zeus"
"github.com/SigNoz/signoz/ee/zeus/httpzeus"
"github.com/SigNoz/signoz/pkg/config"
"github.com/SigNoz/signoz/pkg/config/envprovider"
"github.com/SigNoz/signoz/pkg/config/fileprovider"
@@ -21,6 +23,7 @@ import (
"go.uber.org/zap/zapcore"
)
// Deprecated: Please use the logger from pkg/instrumentation.
func initZapLog() *zap.Logger {
config := zap.NewProductionConfig()
config.EncoderConfig.TimeKey = "timestamp"
@@ -50,21 +53,31 @@ func main() {
var gatewayUrl string
var useLicensesV3 bool
// Deprecated
flag.BoolVar(&useLogsNewSchema, "use-logs-new-schema", false, "use logs_v2 schema for logs")
// Deprecated
flag.BoolVar(&useTraceNewSchema, "use-trace-new-schema", false, "use new schema for traces")
// Deprecated
flag.StringVar(&promConfigPath, "config", "./config/prometheus.yml", "(prometheus config to read metrics)")
// Deprecated
flag.StringVar(&skipTopLvlOpsPath, "skip-top-level-ops", "", "(config file to skip top level operations)")
// Deprecated
flag.BoolVar(&disableRules, "rules.disable", false, "(disable rule evaluation)")
flag.BoolVar(&preferSpanMetrics, "prefer-span-metrics", false, "(prefer span metrics for service level metrics)")
// Deprecated
flag.IntVar(&maxIdleConns, "max-idle-conns", 50, "(number of connections to maintain in the pool.)")
// Deprecated
flag.IntVar(&maxOpenConns, "max-open-conns", 100, "(max connections for use at any time.)")
// Deprecated
flag.DurationVar(&dialTimeout, "dial-timeout", 5*time.Second, "(the maximum time to establish a connection.)")
// Deprecated
flag.StringVar(&ruleRepoURL, "rules.repo-url", baseconst.AlertHelpPage, "(host address used to build rule link in alert messages)")
flag.StringVar(&cacheConfigPath, "experimental.cache-config", "", "(cache config to use)")
flag.StringVar(&fluxInterval, "flux-interval", "5m", "(the interval to exclude data from being cached to avoid incorrect cache for data in motion)")
flag.StringVar(&fluxIntervalForTraceDetail, "flux-interval-trace-detail", "2m", "(the interval to exclude data from being cached to avoid incorrect cache for trace data in motion)")
flag.StringVar(&cluster, "cluster", "cluster", "(cluster name - defaults to 'cluster')")
flag.StringVar(&gatewayUrl, "gateway-url", "", "(url to the gateway)")
// Deprecated
flag.BoolVar(&useLicensesV3, "use-licenses-v3", false, "use licenses_v3 schema for licenses")
flag.Parse()
@@ -98,6 +111,8 @@ func main() {
signoz, err := signoz.New(
context.Background(),
config,
zeus.Config(),
httpzeus.NewProviderFactory(),
signoz.NewCacheProviderFactories(),
signoz.NewWebProviderFactories(),
sqlStoreFactories,
@@ -121,19 +136,13 @@ func main() {
Config: config,
SigNoz: signoz,
HTTPHostPort: baseconst.HTTPHostPort,
PromConfigPath: promConfigPath,
SkipTopLvlOpsPath: skipTopLvlOpsPath,
PreferSpanMetrics: preferSpanMetrics,
PrivateHostPort: baseconst.PrivateHostPort,
DisableRules: disableRules,
RuleRepoURL: ruleRepoURL,
CacheConfigPath: cacheConfigPath,
FluxInterval: fluxInterval,
FluxIntervalForTraceDetail: fluxIntervalForTraceDetail,
Cluster: cluster,
GatewayUrl: gatewayUrl,
UseLogsNewSchema: useLogsNewSchema,
UseTraceNewSchema: useTraceNewSchema,
Jwt: jwt,
}

View File

@@ -297,7 +297,7 @@ func (r *AnomalyRule) Eval(ctx context.Context, ts time.Time) (interface{}, erro
// alerts[h] is ready, add or update active list now
for h, a := range alerts {
// Check whether we already have alerting state for the identifying label set.
// Update the last value and annotations if so, create a new alert entry otherwise.
// UpdateSteps the last value and annotations if so, create a new alert entry otherwise.
if alert, ok := r.Active[h]; ok && alert.State != model.StateInactive {
alert.Value = a.Value

View File

@@ -25,8 +25,6 @@ func PrepareTaskFunc(opts baserules.PrepareTaskOptions) (baserules.Task, error)
ruleId,
opts.Rule,
opts.Reader,
opts.UseLogsNewSchema,
opts.UseTraceNewSchema,
baserules.WithEvalDelay(opts.ManagerOpts.EvalDelay),
baserules.WithSQLStore(opts.SQLStore),
)
@@ -123,8 +121,6 @@ func TestNotification(opts baserules.PrepareTestRuleOptions) (int, *basemodel.Ap
alertname,
parsedRule,
opts.Reader,
opts.UseLogsNewSchema,
opts.UseTraceNewSchema,
baserules.WithSendAlways(),
baserules.WithSendUnmatched(),
baserules.WithSQLStore(opts.SQLStore),

View File

@@ -4,7 +4,6 @@ import (
"context"
"encoding/json"
"fmt"
"regexp"
"strings"
"sync/atomic"
"time"
@@ -16,10 +15,10 @@ import (
"go.uber.org/zap"
"github.com/SigNoz/signoz/ee/query-service/dao"
licenseserver "github.com/SigNoz/signoz/ee/query-service/integrations/signozio"
"github.com/SigNoz/signoz/ee/query-service/license"
"github.com/SigNoz/signoz/ee/query-service/model"
"github.com/SigNoz/signoz/pkg/query-service/utils/encryption"
"github.com/SigNoz/signoz/pkg/zeus"
)
const (
@@ -42,26 +41,16 @@ type Manager struct {
modelDao dao.ModelDao
tenantID string
zeus zeus.Zeus
}
func New(modelDao dao.ModelDao, licenseRepo *license.Repo, clickhouseConn clickhouse.Conn, chUrl string) (*Manager, error) {
hostNameRegex := regexp.MustCompile(`tcp://(?P<hostname>.*):`)
hostNameRegexMatches := hostNameRegex.FindStringSubmatch(chUrl)
tenantID := ""
if len(hostNameRegexMatches) == 2 {
tenantID = hostNameRegexMatches[1]
tenantID = strings.TrimSuffix(tenantID, "-clickhouse")
}
func New(modelDao dao.ModelDao, licenseRepo *license.Repo, clickhouseConn clickhouse.Conn, zeus zeus.Zeus) (*Manager, error) {
m := &Manager{
// repository: repo,
clickhouseConn: clickhouseConn,
licenseRepo: licenseRepo,
scheduler: gocron.NewScheduler(time.UTC).Every(1).Day().At("00:00"), // send usage every at 00:00 UTC
modelDao: modelDao,
tenantID: tenantID,
zeus: zeus,
}
return m, nil
}
@@ -158,7 +147,7 @@ func (lm *Manager) UploadUsage() {
usageData.Type = usage.Type
usageData.Tenant = "default"
usageData.OrgName = "default"
usageData.TenantId = lm.tenantID
usageData.TenantId = "default"
usagesPayload = append(usagesPayload, usageData)
}
@@ -167,24 +156,18 @@ func (lm *Manager) UploadUsage() {
LicenseKey: key,
Usage: usagesPayload,
}
lm.UploadUsageWithExponentalBackOff(ctx, payload)
}
func (lm *Manager) UploadUsageWithExponentalBackOff(ctx context.Context, payload model.UsagePayload) {
for i := 1; i <= MaxRetries; i++ {
apiErr := licenseserver.SendUsage(ctx, payload)
if apiErr != nil && i == MaxRetries {
zap.L().Error("retries stopped : %v", zap.Error(apiErr))
// not returning error here since it is captured in the failed count
return
} else if apiErr != nil {
// sleeping for exponential backoff
sleepDuration := RetryInterval * time.Duration(i)
zap.L().Error("failed to upload snapshot retrying after %v secs : %v", zap.Duration("sleepDuration", sleepDuration), zap.Error(apiErr.Err))
time.Sleep(sleepDuration)
} else {
break
}
body, errv2 := json.Marshal(payload)
if errv2 != nil {
zap.L().Error("error while marshalling usage payload: %v", zap.Error(errv2))
return
}
errv2 = lm.zeus.PutMeters(ctx, payload.LicenseKey.String(), body)
if errv2 != nil {
zap.L().Error("failed to upload usage: %v", zap.Error(errv2))
// not returning error here since it is captured in the failed count
return
}
}

View File

@@ -194,7 +194,7 @@ func (dialect *dialect) RenameColumn(ctx context.Context, bun bun.IDB, table str
}
if !oldColumnExists {
return false, errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, fmt.Sprintf("old column: %s doesn't exist", oldColumnName))
return false, errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, "old column: %s doesn't exist", oldColumnName)
}
_, err = bun.

View File

@@ -4,8 +4,10 @@ import (
"context"
"database/sql"
"github.com/SigNoz/signoz/pkg/errors"
"github.com/SigNoz/signoz/pkg/factory"
"github.com/SigNoz/signoz/pkg/sqlstore"
"github.com/jackc/pgx/v5/pgconn"
"github.com/jackc/pgx/v5/pgxpool"
"github.com/jackc/pgx/v5/stdlib"
"github.com/jmoiron/sqlx"
@@ -87,3 +89,20 @@ func (provider *provider) BunDBCtx(ctx context.Context) bun.IDB {
func (provider *provider) RunInTxCtx(ctx context.Context, opts *sql.TxOptions, cb func(ctx context.Context) error) error {
return provider.bundb.RunInTxCtx(ctx, opts, cb)
}
func (provider *provider) WrapNotFoundErrf(err error, code errors.Code, format string, args ...any) error {
if err == sql.ErrNoRows {
return errors.Wrapf(err, errors.TypeNotFound, code, format, args...)
}
return err
}
func (provider *provider) WrapAlreadyExistsErrf(err error, code errors.Code, format string, args ...any) error {
var pgErr *pgconn.PgError
if errors.As(err, &pgErr) && pgErr.Code == "23505" {
return errors.Wrapf(err, errors.TypeAlreadyExists, code, format, args...)
}
return err
}
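
A hypothetical call site for these wrappers (the table, column, and function names below are illustrative, and it is assumed the wrappers are exposed on the sqlstore.SQLStore interface):

package main

import (
	"context"

	"github.com/SigNoz/signoz/pkg/errors"
	"github.com/SigNoz/signoz/pkg/sqlstore"
)

// getOrgDisplayName converts sql.ErrNoRows from bun into a typed not-found error via the new wrapper.
func getOrgDisplayName(ctx context.Context, sqlStore sqlstore.SQLStore, id string) (string, error) {
	var name string
	err := sqlStore.BunDBCtx(ctx).NewSelect().Table("organizations").Column("display_name").Where("id = ?", id).Scan(ctx, &name)
	if err != nil {
		return "", sqlStore.WrapNotFoundErrf(err, errors.CodeNotFound, "organization %s does not exist", id)
	}
	return name, nil
}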

ee/zeus/config.go (new file)
View File

@@ -0,0 +1,42 @@
package zeus
import (
"fmt"
neturl "net/url"
"sync"
"github.com/SigNoz/signoz/pkg/zeus"
)
// This will be set via ldflags at build time.
var (
url string = "<unset>"
deprecatedURL string = "<unset>"
)
var (
config zeus.Config
once sync.Once
)
// initializes the Zeus configuration
func Config() zeus.Config {
once.Do(func() {
parsedURL, err := neturl.Parse(url)
if err != nil {
panic(fmt.Errorf("invalid zeus URL: %w", err))
}
deprecatedParsedURL, err := neturl.Parse(deprecatedURL)
if err != nil {
panic(fmt.Errorf("invalid zeus deprecated URL: %w", err))
}
config = zeus.Config{URL: parsedURL, DeprecatedURL: deprecatedParsedURL}
if err := config.Validate(); err != nil {
panic(fmt.Errorf("invalid zeus config: %w", err))
}
})
return config
}
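
For reference, a minimal usage sketch tying this file to the LD-flag changes in the workflows, Makefile, and goreleaser config above. The build command in the comment shows the general shape, not the exact CI invocation.

// The package-level url and deprecatedURL strings are injected at build time, e.g.:
//   go build -ldflags "-X github.com/SigNoz/signoz/ee/zeus.url=https://api.signoz.cloud \
//     -X github.com/SigNoz/signoz/ee/zeus.deprecatedURL=https://license.signoz.io" ./...
// Config() parses and validates them once and panics if either URL is missing or malformed.
package main

import (
	"fmt"

	"github.com/SigNoz/signoz/ee/zeus"
)

func main() {
	cfg := zeus.Config()
	fmt.Println(cfg.URL.String(), cfg.DeprecatedURL.String())
}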

View File

@@ -0,0 +1,189 @@
package httpzeus
import (
"bytes"
"context"
"io"
"net/http"
"net/url"
"github.com/SigNoz/signoz/pkg/errors"
"github.com/SigNoz/signoz/pkg/factory"
"github.com/SigNoz/signoz/pkg/http/client"
"github.com/SigNoz/signoz/pkg/zeus"
"github.com/tidwall/gjson"
)
type Provider struct {
settings factory.ScopedProviderSettings
config zeus.Config
httpClient *client.Client
}
func NewProviderFactory() factory.ProviderFactory[zeus.Zeus, zeus.Config] {
return factory.NewProviderFactory(factory.MustNewName("http"), func(ctx context.Context, providerSettings factory.ProviderSettings, config zeus.Config) (zeus.Zeus, error) {
return New(ctx, providerSettings, config)
})
}
func New(ctx context.Context, providerSettings factory.ProviderSettings, config zeus.Config) (zeus.Zeus, error) {
settings := factory.NewScopedProviderSettings(providerSettings, "github.com/SigNoz/signoz/ee/zeus/httpzeus")
httpClient, err := client.New(
settings.Logger(),
providerSettings.TracerProvider,
providerSettings.MeterProvider,
client.WithRequestResponseLog(true),
client.WithRetryCount(3),
)
if err != nil {
return nil, err
}
return &Provider{
settings: settings,
config: config,
httpClient: httpClient,
}, nil
}
func (provider *Provider) GetLicense(ctx context.Context, key string) ([]byte, error) {
response, err := provider.do(
ctx,
provider.config.URL.JoinPath("/v2/licenses/me"),
http.MethodGet,
key,
nil,
)
if err != nil {
return nil, err
}
return []byte(gjson.GetBytes(response, "data").String()), nil
}
func (provider *Provider) GetCheckoutURL(ctx context.Context, key string, body []byte) ([]byte, error) {
response, err := provider.do(
ctx,
provider.config.URL.JoinPath("/v2/subscriptions/me/sessions/checkout"),
http.MethodPost,
key,
body,
)
if err != nil {
return nil, err
}
return []byte(gjson.GetBytes(response, "data").String()), nil
}
func (provider *Provider) GetPortalURL(ctx context.Context, key string, body []byte) ([]byte, error) {
response, err := provider.do(
ctx,
provider.config.URL.JoinPath("/v2/subscriptions/me/sessions/portal"),
http.MethodPost,
key,
body,
)
if err != nil {
return nil, err
}
return []byte(gjson.GetBytes(response, "data").String()), nil
}
func (provider *Provider) GetDeployment(ctx context.Context, key string) ([]byte, error) {
response, err := provider.do(
ctx,
provider.config.URL.JoinPath("/v2/deployments/me"),
http.MethodGet,
key,
nil,
)
if err != nil {
return nil, err
}
return []byte(gjson.GetBytes(response, "data").String()), nil
}
func (provider *Provider) PutMeters(ctx context.Context, key string, data []byte) error {
_, err := provider.do(
ctx,
provider.config.DeprecatedURL.JoinPath("/api/v1/usage"),
http.MethodPost,
key,
data,
)
return err
}
func (provider *Provider) PutProfile(ctx context.Context, key string, body []byte) error {
_, err := provider.do(
ctx,
provider.config.URL.JoinPath("/v2/profiles/me"),
http.MethodPut,
key,
body,
)
return err
}
func (provider *Provider) PutHost(ctx context.Context, key string, body []byte) error {
_, err := provider.do(
ctx,
provider.config.URL.JoinPath("/v2/deployments/me/hosts"),
http.MethodPut,
key,
body,
)
return err
}
func (provider *Provider) do(ctx context.Context, url *url.URL, method string, key string, requestBody []byte) ([]byte, error) {
request, err := http.NewRequestWithContext(ctx, method, url.String(), bytes.NewBuffer(requestBody))
if err != nil {
return nil, err
}
request.Header.Set("X-Signoz-Cloud-Api-Key", key)
request.Header.Set("Content-Type", "application/json")
response, err := provider.httpClient.Do(request)
if err != nil {
return nil, err
}
defer func() {
_ = response.Body.Close()
}()
body, err := io.ReadAll(response.Body)
if err != nil {
return nil, err
}
if response.StatusCode/100 == 2 {
return body, nil
}
return nil, provider.errFromStatusCode(response.StatusCode)
}
// This can be taken down to the client package
func (provider *Provider) errFromStatusCode(statusCode int) error {
switch statusCode {
case http.StatusBadRequest:
return errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, "bad request")
case http.StatusUnauthorized:
return errors.Newf(errors.TypeUnauthenticated, errors.CodeUnauthenticated, "unauthenticated")
case http.StatusForbidden:
return errors.Newf(errors.TypeForbidden, errors.CodeForbidden, "forbidden")
case http.StatusNotFound:
return errors.Newf(errors.TypeNotFound, errors.CodeNotFound, "not found")
}
return errors.Newf(errors.TypeInternal, errors.CodeInternal, "internal")
}
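
A caller sketch against the zeus.Zeus interface implemented above, mirroring the refactored signozio.ValidateLicenseV3 earlier in this diff (it assumes a zeus.Zeus instance has already been wired in through the DI shown in main.go):

package main

import (
	"context"
	"encoding/json"

	"github.com/SigNoz/signoz/pkg/zeus"
)

// fetchLicense retrieves the raw "data" payload for a license key and unmarshals it,
// leaving interpretation of the fields to the caller.
func fetchLicense(ctx context.Context, z zeus.Zeus, licenseKey string) (map[string]any, error) {
	data, err := z.GetLicense(ctx, licenseKey)
	if err != nil {
		return nil, err
	}
	var license map[string]any
	if err := json.Unmarshal(data, &license); err != nil {
		return nil, err
	}
	return license, nil
}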

View File

@@ -55,7 +55,7 @@
"ansi-to-html": "0.7.2",
"antd": "5.11.0",
"antd-table-saveas-excel": "2.2.1",
"axios": "1.7.7",
"axios": "1.8.2",
"babel-eslint": "^10.1.0",
"babel-jest": "^29.6.4",
"babel-loader": "9.1.3",
@@ -132,6 +132,7 @@
"tsconfig-paths-webpack-plugin": "^3.5.1",
"typescript": "^4.0.5",
"uplot": "1.6.31",
"userpilot": "1.3.9",
"uuid": "^8.3.2",
"web-vitals": "^0.2.4",
"webpack": "5.94.0",

View File

@@ -26,6 +26,7 @@ import { QueryBuilderProvider } from 'providers/QueryBuilder';
import { Suspense, useCallback, useEffect, useState } from 'react';
import { Route, Router, Switch } from 'react-router-dom';
import { CompatRouter } from 'react-router-dom-v5-compat';
import { Userpilot } from 'userpilot';
import { extractDomain } from 'utils/app';
import { Home } from './pageComponents';
@@ -100,6 +101,18 @@ function App(): JSX.Element {
logEvent('Domain Identified', groupTraits, 'group');
}
Userpilot.identify(email, {
email,
name,
orgName,
tenant_id: hostNameParts[0],
data_region: hostNameParts[1],
tenant_url: hostname,
company_domain: domain,
source: 'signoz-ui',
isPaidUser: !!trialInfo?.trialConvertedToSubscription,
});
posthog?.identify(email, {
email,
name,
@@ -276,6 +289,10 @@ function App(): JSX.Element {
});
}
if (process.env.USERPILOT_KEY) {
Userpilot.initialize(process.env.USERPILOT_KEY);
}
Sentry.init({
dsn: process.env.SENTRY_DSN,
tunnel: process.env.TUNNEL_URL,

View File

@@ -0,0 +1,13 @@
.custom-multiselect-dropdown {
.divider {
height: 1px;
background-color: #e8e8e8;
margin: 4px 0;
}
.all-option {
font-weight: 500;
border-bottom: 1px solid #f0f0f0;
margin-bottom: 8px;
}
}

File diff suppressed because it is too large.

View File

@@ -0,0 +1,606 @@
/* eslint-disable no-nested-ternary */
/* eslint-disable sonarjs/cognitive-complexity */
/* eslint-disable react/jsx-props-no-spreading */
/* eslint-disable react/function-component-definition */
import './styles.scss';
import {
CloseOutlined,
DownOutlined,
LoadingOutlined,
ReloadOutlined,
} from '@ant-design/icons';
import { Color } from '@signozhq/design-tokens';
import { Select } from 'antd';
import cx from 'classnames';
import { SOMETHING_WENT_WRONG } from 'constants/api';
import { capitalize, isEmpty } from 'lodash-es';
import { ArrowDown, ArrowUp } from 'lucide-react';
import type { BaseSelectRef } from 'rc-select';
import React, {
useCallback,
useEffect,
useMemo,
useRef,
useState,
} from 'react';
import { popupContainer } from 'utils/selectPopupContainer';
import { CustomSelectProps, OptionData } from './types';
import {
filterOptionsBySearch,
prioritizeOrAddOptionForSingleSelect,
SPACEKEY,
} from './utils';
/**
* CustomSelect Component
*
*/
const CustomSelect: React.FC<CustomSelectProps> = ({
placeholder = 'Search...',
className,
loading = false,
onSearch,
options = [],
value,
onChange,
defaultActiveFirstOption = true,
noDataMessage,
onClear,
getPopupContainer,
dropdownRender,
highlightSearch = true,
placement = 'bottomLeft',
popupMatchSelectWidth = true,
popupClassName,
errorMessage,
allowClear = false,
onRetry,
...rest
}) => {
// ===== State & Refs =====
const [isOpen, setIsOpen] = useState(false);
const [searchText, setSearchText] = useState('');
const [activeOptionIndex, setActiveOptionIndex] = useState<number>(-1);
// Refs for element access and scroll behavior
const selectRef = useRef<BaseSelectRef>(null);
const dropdownRef = useRef<HTMLDivElement>(null);
const optionRefs = useRef<Record<number, HTMLDivElement | null>>({});
// ===== Option Filtering & Processing Utilities =====
/**
* Checks if a label exists in the provided options
*/
const isLabelPresent = useCallback(
(options: OptionData[], label: string): boolean =>
options.some((option) => {
const lowerLabel = label.toLowerCase();
// Check in nested options if they exist
if ('options' in option && Array.isArray(option.options)) {
return option.options.some(
(subOption) => subOption.label.toLowerCase() === lowerLabel,
);
}
// Check top-level option
return option.label.toLowerCase() === lowerLabel;
}),
[],
);
/**
* Separates section and non-section options
*/
const splitOptions = useCallback((options: OptionData[]): {
sectionOptions: OptionData[];
nonSectionOptions: OptionData[];
} => {
const sectionOptions: OptionData[] = [];
const nonSectionOptions: OptionData[] = [];
options.forEach((option) => {
if ('options' in option && Array.isArray(option.options)) {
sectionOptions.push(option);
} else {
nonSectionOptions.push(option);
}
});
return { sectionOptions, nonSectionOptions };
}, []);
/**
* Apply search filtering to options
*/
const filteredOptions = useMemo(
(): OptionData[] => filterOptionsBySearch(options, searchText),
[options, searchText],
);
// ===== UI & Rendering Functions =====
/**
* Highlights matched text in search results
*/
const highlightMatchedText = useCallback(
(text: string, searchQuery: string): React.ReactNode => {
if (!searchQuery || !highlightSearch) return text;
const parts = text.split(new RegExp(`(${searchQuery})`, 'gi'));
return (
<>
{parts.map((part, i) => {
// Create a deterministic but unique key
const uniqueKey = `${text.substring(0, 3)}-${part.substring(0, 3)}-${i}`;
return part.toLowerCase() === searchQuery.toLowerCase() ? (
<span key={uniqueKey} className="highlight-text">
{part}
</span>
) : (
part
);
})}
</>
);
},
[highlightSearch],
);
/**
* Renders an individual option with proper keyboard navigation support
*/
const renderOptionItem = useCallback(
(
option: OptionData,
isSelected: boolean,
index?: number,
): React.ReactElement => {
const handleSelection = (): void => {
if (onChange) {
onChange(option.value, option);
setIsOpen(false);
}
};
const isActive = index === activeOptionIndex;
const optionId = `option-${index}`;
return (
<div
key={option.value}
id={optionId}
ref={(el): void => {
if (index !== undefined) {
optionRefs.current[index] = el;
}
}}
className={cx('option-item', {
selected: isSelected,
active: isActive,
})}
onClick={(e): void => {
e.stopPropagation();
handleSelection();
}}
onKeyDown={(e): void => {
if (e.key === 'Enter' || e.key === SPACEKEY) {
e.preventDefault();
handleSelection();
}
}}
onMouseEnter={(): void => setActiveOptionIndex(index || -1)}
role="option"
aria-selected={isSelected}
aria-disabled={option.disabled}
tabIndex={isActive ? 0 : -1}
>
<div className="option-content">
<div>{highlightMatchedText(String(option.label || ''), searchText)}</div>
{option.type === 'custom' && (
<div className="option-badge">{capitalize(option.type)}</div>
)}
</div>
</div>
);
},
[highlightMatchedText, searchText, onChange, activeOptionIndex],
);
/**
* Helper function to render option with index tracking
*/
const renderOptionWithIndex = useCallback(
(option: OptionData, isSelected: boolean, idx: number) =>
renderOptionItem(option, isSelected, idx),
[renderOptionItem],
);
/**
* Custom clear button renderer
*/
const clearIcon = useCallback(
() => (
<CloseOutlined
onClick={(e): void => {
e.stopPropagation();
if (onChange) onChange(undefined, []);
if (onClear) onClear();
}}
/>
),
[onChange, onClear],
);
// ===== Event Handlers =====
/**
* Handles search input changes
*/
const handleSearch = useCallback(
(value: string): void => {
const trimmedValue = value.trim();
setSearchText(trimmedValue);
if (onSearch) onSearch(trimmedValue);
},
[onSearch],
);
/**
* Prevents event propagation for dropdown clicks
*/
const handleDropdownClick = useCallback((e: React.MouseEvent): void => {
e.stopPropagation();
}, []);
/**
* Comprehensive keyboard navigation handler
*/
const handleKeyDown = useCallback(
(e: React.KeyboardEvent): void => {
// Handle keyboard navigation when dropdown is open
if (isOpen) {
// Get flattened list of all selectable options
const getFlatOptions = (): OptionData[] => {
if (!filteredOptions) return [];
const flatList: OptionData[] = [];
// Process options
const { sectionOptions, nonSectionOptions } = splitOptions(
isEmpty(value)
? filteredOptions
: prioritizeOrAddOptionForSingleSelect(filteredOptions, value),
);
// Add custom option if needed
if (!isEmpty(searchText) && !isLabelPresent(filteredOptions, searchText)) {
flatList.push({
label: searchText,
value: searchText,
type: 'custom',
});
}
// Add all options to flat list
flatList.push(...nonSectionOptions);
sectionOptions.forEach((section) => {
if (section.options) {
flatList.push(...section.options);
}
});
return flatList;
};
const options = getFlatOptions();
switch (e.key) {
case 'ArrowDown':
e.preventDefault();
setActiveOptionIndex((prev) =>
prev < options.length - 1 ? prev + 1 : 0,
);
break;
case 'ArrowUp':
e.preventDefault();
setActiveOptionIndex((prev) =>
prev > 0 ? prev - 1 : options.length - 1,
);
break;
case 'Tab':
// Tab navigation with Shift key support
if (e.shiftKey) {
e.preventDefault();
setActiveOptionIndex((prev) =>
prev > 0 ? prev - 1 : options.length - 1,
);
} else {
e.preventDefault();
setActiveOptionIndex((prev) =>
prev < options.length - 1 ? prev + 1 : 0,
);
}
break;
case 'Enter':
e.preventDefault();
if (activeOptionIndex >= 0 && activeOptionIndex < options.length) {
// Select the focused option
const selectedOption = options[activeOptionIndex];
if (onChange) {
onChange(selectedOption.value, selectedOption);
setIsOpen(false);
setActiveOptionIndex(-1);
}
} else if (!isEmpty(searchText)) {
// Add custom value when no option is focused
const customOption = {
label: searchText,
value: searchText,
type: 'custom',
};
if (onChange) {
onChange(customOption.value, customOption);
setIsOpen(false);
setActiveOptionIndex(-1);
}
}
break;
case 'Escape':
e.preventDefault();
setIsOpen(false);
setActiveOptionIndex(-1);
break;
case ' ': // Space key
if (activeOptionIndex >= 0 && activeOptionIndex < options.length) {
e.preventDefault();
const selectedOption = options[activeOptionIndex];
if (onChange) {
onChange(selectedOption.value, selectedOption);
setIsOpen(false);
setActiveOptionIndex(-1);
}
}
break;
default:
break;
}
} else if (e.key === 'ArrowDown' || e.key === 'Tab') {
// Open dropdown when Down or Tab is pressed while closed
e.preventDefault();
setIsOpen(true);
setActiveOptionIndex(0);
}
},
[
isOpen,
activeOptionIndex,
filteredOptions,
searchText,
onChange,
splitOptions,
value,
isLabelPresent,
],
);
// ===== Dropdown Rendering =====
/**
* Renders the custom dropdown with sections and keyboard navigation
*/
const customDropdownRender = useCallback((): React.ReactElement => {
// Process options based on current value
let processedOptions = isEmpty(value)
? filteredOptions
: prioritizeOrAddOptionForSingleSelect(filteredOptions, value);
if (!isEmpty(searchText)) {
processedOptions = filterOptionsBySearch(processedOptions, searchText);
}
const { sectionOptions, nonSectionOptions } = splitOptions(processedOptions);
// Check if we need to add a custom option based on search text
const isSearchTextNotPresent =
!isEmpty(searchText) && !isLabelPresent(processedOptions, searchText);
let optionIndex = 0;
// Add custom option if needed
if (isSearchTextNotPresent) {
nonSectionOptions.unshift({
label: searchText,
value: searchText,
type: 'custom',
});
}
// Helper function to map options with index tracking
const mapOptions = (options: OptionData[]): React.ReactNode =>
options.map((option) => {
const result = renderOptionWithIndex(
option,
option.value === value,
optionIndex,
);
optionIndex += 1;
return result;
});
const customMenu = (
<div
ref={dropdownRef}
className="custom-select-dropdown"
onClick={handleDropdownClick}
onKeyDown={handleKeyDown}
role="listbox"
tabIndex={-1}
aria-activedescendant={
activeOptionIndex >= 0 ? `option-${activeOptionIndex}` : undefined
}
>
{/* Non-section options */}
<div className="no-section-options">
{nonSectionOptions.length > 0 && mapOptions(nonSectionOptions)}
</div>
{/* Section options */}
{sectionOptions.length > 0 &&
sectionOptions.map((section) =>
!isEmpty(section.options) ? (
<div className="select-group" key={section.label}>
<div className="group-label" role="heading" aria-level={2}>
{section.label}
</div>
<div role="group" aria-label={`${section.label} options`}>
{section.options && mapOptions(section.options)}
</div>
</div>
) : null,
)}
{/* Navigation help footer */}
<div className="navigation-footer" role="note">
{!loading && !errorMessage && !noDataMessage && (
<section className="navigate">
<ArrowDown size={8} className="icons" />
<ArrowUp size={8} className="icons" />
<span className="keyboard-text">to navigate</span>
</section>
)}
{loading && (
<div className="navigation-loading">
<div className="navigation-icons">
<LoadingOutlined />
</div>
<div className="navigation-text">We are updating the values...</div>
</div>
)}
{errorMessage && !loading && (
<div className="navigation-error">
<div className="navigation-text">
{errorMessage || SOMETHING_WENT_WRONG}
</div>
<div className="navigation-icons">
<ReloadOutlined
twoToneColor={Color.BG_CHERRY_400}
onClick={(e): void => {
e.stopPropagation();
if (onRetry) onRetry();
}}
/>
</div>
</div>
)}
{noDataMessage && !loading && (
<div className="navigation-text">{noDataMessage}</div>
)}
</div>
</div>
);
return dropdownRender ? dropdownRender(customMenu) : customMenu;
}, [
value,
filteredOptions,
searchText,
splitOptions,
isLabelPresent,
handleDropdownClick,
handleKeyDown,
activeOptionIndex,
loading,
errorMessage,
noDataMessage,
dropdownRender,
renderOptionWithIndex,
onRetry,
]);
// ===== Side Effects =====
// Clear search text when dropdown closes
useEffect(() => {
if (!isOpen) {
setSearchText('');
setActiveOptionIndex(-1);
}
}, [isOpen]);
// Auto-scroll to active option for keyboard navigation
useEffect(() => {
if (
isOpen &&
activeOptionIndex >= 0 &&
optionRefs.current[activeOptionIndex]
) {
optionRefs.current[activeOptionIndex]?.scrollIntoView({
behavior: 'smooth',
block: 'nearest',
});
}
}, [isOpen, activeOptionIndex]);
// ===== Final Processing =====
// Apply highlight to matched text in options
const optionsWithHighlight = useMemo(
() =>
options
?.filter((option) =>
String(option.label || '')
.toLowerCase()
.includes(searchText.toLowerCase()),
)
?.map((option) => ({
...option,
label: highlightMatchedText(String(option.label || ''), searchText),
})),
[options, searchText, highlightMatchedText],
);
// ===== Component Rendering =====
return (
<Select
ref={selectRef}
className={cx('custom-select', className)}
placeholder={placeholder}
showSearch
filterOption={false}
onSearch={handleSearch}
value={value}
onChange={onChange}
onDropdownVisibleChange={setIsOpen}
open={isOpen}
options={optionsWithHighlight}
defaultActiveFirstOption={defaultActiveFirstOption}
popupMatchSelectWidth={popupMatchSelectWidth}
allowClear={allowClear ? { clearIcon } : false}
getPopupContainer={getPopupContainer ?? popupContainer}
suffixIcon={<DownOutlined style={{ cursor: 'default' }} />}
dropdownRender={customDropdownRender}
menuItemSelectedIcon={null}
popupClassName={cx('custom-select-dropdown-container', popupClassName)}
listHeight={300}
placement={placement}
optionFilterProp="label"
notFoundContent={<div className="empty-message">{noDataMessage}</div>}
onKeyDown={handleKeyDown}
{...rest}
/>
);
};
export default CustomSelect;
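For reference, a minimal usage sketch of CustomSelect as a controlled component. This is not part of the change itself; the wrapper component, import path, and option values are illustrative assumptions based on the props shown above.

import { useState } from 'react';
import CustomSelect from './CustomSelect';

function ServicePicker(): JSX.Element {
	// Hypothetical wrapper: CustomSelect is driven as a controlled component.
	const [service, setService] = useState<string | undefined>(undefined);

	return (
		<CustomSelect
			placeholder="Select a service"
			value={service}
			onChange={(value): void => setService(value as string | undefined)}
			options={[
				{ label: 'frontend', value: 'frontend' },
				{
					label: 'Workers',
					options: [{ label: 'billing-worker', value: 'billing-worker' }],
				},
			]}
		/>
	);
}

export default ServicePicker;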


@@ -0,0 +1,263 @@
import { fireEvent, render, screen, waitFor } from '@testing-library/react';
import CustomMultiSelect from '../CustomMultiSelect';
// Mock scrollIntoView which isn't available in JSDOM
window.HTMLElement.prototype.scrollIntoView = jest.fn();
// Mock options data
const mockOptions = [
{ label: 'Option 1', value: 'option1' },
{ label: 'Option 2', value: 'option2' },
{ label: 'Option 3', value: 'option3' },
];
const mockGroupedOptions = [
{
label: 'Group 1',
options: [
{ label: 'Group 1 - Option 1', value: 'g1-option1' },
{ label: 'Group 1 - Option 2', value: 'g1-option2' },
],
},
{
label: 'Group 2',
options: [
{ label: 'Group 2 - Option 1', value: 'g2-option1' },
{ label: 'Group 2 - Option 2', value: 'g2-option2' },
],
},
];
describe('CustomMultiSelect Component', () => {
it('renders with placeholder', () => {
const handleChange = jest.fn();
render(
<CustomMultiSelect
placeholder="Select multiple options"
options={mockOptions}
onChange={handleChange}
/>,
);
// Check placeholder exists
const placeholderElement = screen.getByText('Select multiple options');
expect(placeholderElement).toBeInTheDocument();
});
it('opens dropdown when clicked', async () => {
const handleChange = jest.fn();
render(<CustomMultiSelect options={mockOptions} onChange={handleChange} />);
// Click to open the dropdown
const selectElement = screen.getByRole('combobox');
fireEvent.mouseDown(selectElement);
// Wait for dropdown to appear
await waitFor(() => {
expect(screen.getByText('ALL')).toBeInTheDocument(); // The ALL option
expect(screen.getByText('Option 1')).toBeInTheDocument();
expect(screen.getByText('Option 2')).toBeInTheDocument();
expect(screen.getByText('Option 3')).toBeInTheDocument();
});
});
it('selects multiple options', async () => {
const handleChange = jest.fn();
// Start with option1 already selected
render(
<CustomMultiSelect
options={mockOptions}
onChange={handleChange}
value={['option1']}
/>,
);
// Open dropdown
const selectElement = screen.getByRole('combobox');
fireEvent.mouseDown(selectElement);
// Wait for dropdown to appear
await waitFor(() => {
expect(screen.getByText('Option 3')).toBeInTheDocument();
});
// Click on Option 3
const option3 = screen.getByText('Option 3');
fireEvent.click(option3);
// Verify onChange was called
expect(handleChange).toHaveBeenCalled();
});
it('selects ALL options when ALL is clicked', async () => {
const handleChange = jest.fn();
render(
<CustomMultiSelect
options={mockOptions}
onChange={handleChange}
enableAllSelection
/>,
);
// Open dropdown
const selectElement = screen.getByRole('combobox');
fireEvent.mouseDown(selectElement);
// Wait for dropdown to appear
await waitFor(() => {
expect(screen.getByText('ALL')).toBeInTheDocument();
});
// Click on ALL option
const allOption = screen.getByText('ALL');
fireEvent.click(allOption);
// Verify onChange was called with all option values
expect(handleChange).toHaveBeenCalledWith(
['option1', 'option2', 'option3'],
expect.arrayContaining([
expect.objectContaining({ value: 'option1' }),
expect.objectContaining({ value: 'option2' }),
expect.objectContaining({ value: 'option3' }),
]),
);
});
it('displays selected options as tags', async () => {
render(
<CustomMultiSelect options={mockOptions} value={['option1', 'option2']} />,
);
// Check that option values are shown as tags (not labels)
expect(screen.getByText('option1')).toBeInTheDocument();
expect(screen.getByText('option2')).toBeInTheDocument();
});
it('removes a tag when clicked', async () => {
const handleChange = jest.fn();
render(
<CustomMultiSelect
options={mockOptions}
value={['option1', 'option2']}
onChange={handleChange}
/>,
);
// Find close button on Option 1 tag and click it
const closeButtons = document.querySelectorAll(
'.ant-select-selection-item-remove',
);
fireEvent.click(closeButtons[0]);
// Verify onChange was called with remaining option
expect(handleChange).toHaveBeenCalledWith(
['option2'],
expect.arrayContaining([expect.objectContaining({ value: 'option2' })]),
);
});
it('filters options when searching', async () => {
render(<CustomMultiSelect options={mockOptions} />);
// Open dropdown
const selectElement = screen.getByRole('combobox');
fireEvent.mouseDown(selectElement);
// Type into search box - get input directly
const inputElement = selectElement.querySelector('input');
if (inputElement) {
fireEvent.change(inputElement, { target: { value: '2' } });
}
// Wait for the dropdown filtering to happen
await waitFor(() => {
// Check that the dropdown is present
const dropdownElement = document.querySelector(
'.custom-multiselect-dropdown',
);
expect(dropdownElement).toBeInTheDocument();
// Verify Option 2 is visible in the dropdown
const options = document.querySelectorAll('.option-label-text');
let foundOption2 = false;
options.forEach((option) => {
const text = option.textContent || '';
if (text.includes('Option 2')) foundOption2 = true;
});
expect(foundOption2).toBe(true);
});
});
it('renders grouped options correctly', async () => {
render(<CustomMultiSelect options={mockGroupedOptions} />);
// Open dropdown
const selectElement = screen.getByRole('combobox');
fireEvent.mouseDown(selectElement);
// Check group headers and options
await waitFor(() => {
expect(screen.getByText('Group 1')).toBeInTheDocument();
expect(screen.getByText('Group 2')).toBeInTheDocument();
expect(screen.getByText('Group 1 - Option 1')).toBeInTheDocument();
expect(screen.getByText('Group 1 - Option 2')).toBeInTheDocument();
expect(screen.getByText('Group 2 - Option 1')).toBeInTheDocument();
expect(screen.getByText('Group 2 - Option 2')).toBeInTheDocument();
});
});
it('shows loading state', () => {
render(<CustomMultiSelect options={mockOptions} loading />);
// Open dropdown
const selectElement = screen.getByRole('combobox');
fireEvent.mouseDown(selectElement);
// Check loading text is displayed
expect(screen.getByText('We are updating the values...')).toBeInTheDocument();
});
it('shows error message', () => {
render(
<CustomMultiSelect
options={mockOptions}
errorMessage="Test error message"
/>,
);
// Open dropdown
const selectElement = screen.getByRole('combobox');
fireEvent.mouseDown(selectElement);
// Check error message is displayed
expect(screen.getByText('Test error message')).toBeInTheDocument();
});
it('shows no data message', () => {
render(<CustomMultiSelect options={[]} noDataMessage="No data available" />);
// Open dropdown
const selectElement = screen.getByRole('combobox');
fireEvent.mouseDown(selectElement);
// Check no data message is displayed
expect(screen.getByText('No data available')).toBeInTheDocument();
});
it('shows "ALL" tag when all options are selected', () => {
render(
<CustomMultiSelect
options={mockOptions}
value={['option1', 'option2', 'option3']}
maxTagCount={2}
/>,
);
// When all options are selected, the component shows the ALL tag instead
expect(screen.getByText('ALL')).toBeInTheDocument();
});
});


@@ -0,0 +1,206 @@
import { fireEvent, render, screen, waitFor } from '@testing-library/react';
import CustomSelect from '../CustomSelect';
// Mock scrollIntoView which isn't available in JSDOM
window.HTMLElement.prototype.scrollIntoView = jest.fn();
// Mock options data
const mockOptions = [
{ label: 'Option 1', value: 'option1' },
{ label: 'Option 2', value: 'option2' },
{ label: 'Option 3', value: 'option3' },
];
const mockGroupedOptions = [
{
label: 'Group 1',
options: [
{ label: 'Group 1 - Option 1', value: 'g1-option1' },
{ label: 'Group 1 - Option 2', value: 'g1-option2' },
],
},
{
label: 'Group 2',
options: [
{ label: 'Group 2 - Option 1', value: 'g2-option1' },
{ label: 'Group 2 - Option 2', value: 'g2-option2' },
],
},
];
describe('CustomSelect Component', () => {
it('renders with placeholder and options', () => {
const handleChange = jest.fn();
render(
<CustomSelect
placeholder="Test placeholder"
options={mockOptions}
onChange={handleChange}
/>,
);
// Check placeholder exists in the DOM (not using getByPlaceholderText)
const placeholderElement = screen.getByText('Test placeholder');
expect(placeholderElement).toBeInTheDocument();
});
it('opens dropdown when clicked', async () => {
const handleChange = jest.fn();
render(<CustomSelect options={mockOptions} onChange={handleChange} />);
// Click to open the dropdown
const selectElement = screen.getByRole('combobox');
fireEvent.mouseDown(selectElement);
// Wait for dropdown to appear
await waitFor(() => {
expect(screen.getByText('Option 1')).toBeInTheDocument();
expect(screen.getByText('Option 2')).toBeInTheDocument();
expect(screen.getByText('Option 3')).toBeInTheDocument();
});
});
it('calls onChange when option is selected', async () => {
const handleChange = jest.fn();
render(<CustomSelect options={mockOptions} onChange={handleChange} />);
// Open dropdown
const selectElement = screen.getByRole('combobox');
fireEvent.mouseDown(selectElement);
// Click on an option
await waitFor(() => {
const option = screen.getByText('Option 2');
fireEvent.click(option);
});
// Check onChange was called with correct value
expect(handleChange).toHaveBeenCalledWith('option2', expect.anything());
});
it('filters options when searching', async () => {
render(<CustomSelect options={mockOptions} />);
// Open dropdown
const selectElement = screen.getByRole('combobox');
fireEvent.mouseDown(selectElement);
// Type into search box
fireEvent.change(selectElement, { target: { value: '2' } });
// Dropdown should only show Option 2
await waitFor(() => {
// Check that the dropdown is present
const dropdownElement = document.querySelector('.custom-select-dropdown');
expect(dropdownElement).toBeInTheDocument();
// Use a simple approach to verify filtering
const allOptionsInDropdown = document.querySelectorAll('.option-item');
let foundOption2 = false;
allOptionsInDropdown.forEach((option) => {
if (option.textContent?.includes('Option 2')) {
foundOption2 = true;
}
// Should not show Options 1 or 3
expect(option.textContent).not.toContain('Option 1');
expect(option.textContent).not.toContain('Option 3');
});
expect(foundOption2).toBe(true);
});
});
it('renders grouped options correctly', async () => {
const handleChange = jest.fn();
render(<CustomSelect options={mockGroupedOptions} onChange={handleChange} />);
// Open dropdown
const selectElement = screen.getByRole('combobox');
fireEvent.mouseDown(selectElement);
// Check group headers and options
await waitFor(() => {
expect(screen.getByText('Group 1')).toBeInTheDocument();
expect(screen.getByText('Group 2')).toBeInTheDocument();
expect(screen.getByText('Group 1 - Option 1')).toBeInTheDocument();
expect(screen.getByText('Group 1 - Option 2')).toBeInTheDocument();
expect(screen.getByText('Group 2 - Option 1')).toBeInTheDocument();
expect(screen.getByText('Group 2 - Option 2')).toBeInTheDocument();
});
});
it('shows loading state', () => {
render(<CustomSelect options={mockOptions} loading />);
// Open dropdown
const selectElement = screen.getByRole('combobox');
fireEvent.mouseDown(selectElement);
// Check loading text is displayed
expect(screen.getByText('We are updating the values...')).toBeInTheDocument();
});
it('shows error message', () => {
render(
<CustomSelect options={mockOptions} errorMessage="Test error message" />,
);
// Open dropdown
const selectElement = screen.getByRole('combobox');
fireEvent.mouseDown(selectElement);
// Check error message is displayed
expect(screen.getByText('Test error message')).toBeInTheDocument();
});
it('shows no data message', () => {
render(<CustomSelect options={[]} noDataMessage="No data available" />);
// Open dropdown
const selectElement = screen.getByRole('combobox');
fireEvent.mouseDown(selectElement);
// Check no data message is displayed
expect(screen.getByText('No data available')).toBeInTheDocument();
});
it('supports keyboard navigation', async () => {
const handleChange = jest.fn();
render(<CustomSelect options={mockOptions} onChange={handleChange} />);
// Open dropdown using keyboard
const selectElement = screen.getByRole('combobox');
fireEvent.focus(selectElement);
// Press down arrow to open dropdown
fireEvent.keyDown(selectElement, { key: 'ArrowDown' });
// Wait for dropdown to appear
await waitFor(() => {
expect(screen.getByText('Option 1')).toBeInTheDocument();
});
});
it('handles selection via keyboard', async () => {
const handleChange = jest.fn();
render(<CustomSelect options={mockOptions} onChange={handleChange} />);
// Open dropdown
const selectElement = screen.getByRole('combobox');
fireEvent.mouseDown(selectElement);
// Wait for dropdown to appear then press Enter
await waitFor(() => {
expect(screen.getByText('Option 1')).toBeInTheDocument();
// Press Enter to select first option
fireEvent.keyDown(screen.getByText('Option 1'), { key: 'Enter' });
});
// Check onChange was called
expect(handleChange).toHaveBeenCalled();
});
});


@@ -0,0 +1,4 @@
import CustomMultiSelect from './CustomMultiSelect';
import CustomSelect from './CustomSelect';
export { CustomMultiSelect, CustomSelect };


@@ -0,0 +1,838 @@
// Main container styles
// Shared border color used throughout the select styles
$custom-border-color: #2c3044;
.custom-select {
width: 100%;
position: relative;
&.ant-select-focused {
.ant-select-selector {
border-color: var(--bg-robin-500);
box-shadow: 0 0 0 2px rgba(78, 116, 248, 0.2);
}
}
.ant-select-selection-placeholder {
color: rgba(192, 193, 195, 0.45);
}
// Base styles are for dark mode
.ant-select-selector {
background-color: var(--bg-ink-400);
border-color: var(--bg-slate-400);
}
.ant-select-clear {
background-color: var(--bg-ink-400);
color: rgba(192, 193, 195, 0.7);
}
}
// Keep chip styles ONLY in the multi-select
.custom-multiselect {
width: 100%;
position: relative;
.ant-select-selector {
max-height: 200px;
overflow: auto;
scrollbar-width: thin;
background-color: var(--bg-ink-400);
border-color: var(--bg-slate-400);
&::-webkit-scrollbar {
width: 6px;
}
&::-webkit-scrollbar-thumb {
background-color: $custom-border-color;
border-radius: 3px;
}
&::-webkit-scrollbar-track {
background-color: var(--bg-slate-400);
}
}
&.ant-select-focused {
.ant-select-selector {
border-color: var(--bg-robin-500);
box-shadow: 0 0 0 2px rgba(78, 116, 248, 0.2);
}
}
.ant-select-selection-placeholder {
color: rgba(192, 193, 195, 0.45);
}
// Customize tags in multiselect (dark mode by default)
.ant-select-selection-item {
background-color: var(--bg-slate-400);
border-radius: 4px;
border: 1px solid $custom-border-color;
margin-right: 4px;
transition: all 0.2s;
color: var(--bg-vanilla-400);
// Style for active tag (keyboard navigation)
&-active {
border-color: var(--bg-robin-500) !important;
background-color: rgba(78, 116, 248, 0.15) !important;
outline: 2px solid rgba(78, 116, 248, 0.2);
}
// Style for selected tags (via keyboard or mouse selection)
&-selected {
border-color: var(--bg-robin-500) !important;
background-color: rgba(78, 116, 248, 0.15) !important;
box-shadow: 0 0 0 2px rgba(78, 116, 248, 0.2);
}
.ant-select-selection-item-content {
color: var(--bg-vanilla-400);
}
.ant-select-selection-item-remove {
color: rgba(192, 193, 195, 0.7);
&:hover {
color: rgba(192, 193, 195, 1);
}
}
}
// Class applied when in selection mode
&.has-selection {
.ant-select-selection-item-selected {
cursor: move; // Indicate draggable
}
// Change cursor for selection
.ant-select-selector {
cursor: text;
}
}
}
// Dropdown styles
.custom-select-dropdown-container,
.custom-multiselect-dropdown-container {
z-index: 1050 !important;
padding: 0;
box-shadow: 0 3px 6px -4px rgba(0, 0, 0, 0.5), 0 6px 16px 0 rgba(0, 0, 0, 0.4),
0 9px 28px 8px rgba(0, 0, 0, 0.3);
background-color: var(--bg-ink-400);
border: 1px solid var(--bg-slate-400);
.ant-select-item {
padding: 8px 12px;
color: var(--bg-vanilla-400);
// Make keyboard navigation visible
&-option-active {
background-color: var(--bg-slate-400) !important;
}
&-option-selected {
background-color: rgba(78, 116, 248, 0.15) !important;
}
}
}
.custom-select-dropdown-container,
.custom-multiselect-dropdown-container {
width: 100%;
overflow-x: auto;
overflow-y: hidden;
resize: horizontal;
min-width: 300px !important;
.empty-message {
padding: 12px;
text-align: center;
color: rgba(192, 193, 195, 0.45);
}
}
// Custom dropdown styles for single select
.custom-select-dropdown {
padding: 8px 0 0 0;
max-height: 500px;
overflow-y: auto;
overflow-x: hidden;
scrollbar-width: thin;
border-radius: 4px;
border: 1px solid var(--bg-slate-400);
width: 100%;
background-color: var(--bg-ink-400);
&::-webkit-scrollbar {
width: 6px;
}
&::-webkit-scrollbar-thumb {
background-color: $custom-border-color;
border-radius: 3px;
}
&::-webkit-scrollbar-track {
background-color: var(--bg-slate-400);
}
.no-section-options {
margin-bottom: 8px;
}
.select-group {
margin-bottom: 16px;
border-radius: 4px;
overflow: hidden;
.group-label {
font-weight: 500;
padding: 4px 12px;
font-size: 13px;
color: var(--bg-vanilla-400);
background-color: var(--bg-slate-400);
border-bottom: 1px solid $custom-border-color;
border-top: 1px solid $custom-border-color;
position: relative;
z-index: 1;
margin-bottom: 4px;
}
}
.option-item {
padding: 8px 12px;
cursor: pointer;
display: flex;
align-items: center;
white-space: nowrap;
overflow: hidden;
text-overflow: ellipsis;
color: var(--bg-vanilla-400);
&:hover {
background-color: var(--bg-slate-400);
}
&.selected {
background-color: rgba(78, 116, 248, 0.15);
font-weight: 500;
}
&.active {
background-color: rgba(78, 116, 248, 0.15);
border-color: var(--bg-robin-500);
}
.option-content {
display: flex;
justify-content: space-between;
align-items: center;
width: 100%;
.option-label-text {
margin-bottom: 0;
}
.option-badge {
font-size: 12px;
padding: 2px 6px;
border-radius: 4px;
background-color: $custom-border-color;
color: var(--bg-vanilla-400);
margin-left: 8px;
}
}
}
.loading-container {
display: flex;
justify-content: center;
padding: 12px;
}
}
.navigation-footer {
display: flex;
align-items: center;
padding: 8px 12px;
border-top: 1px solid var(--bg-slate-400);
position: sticky;
bottom: 0;
background-color: var(--bg-ink-400);
z-index: 1;
.navigation-icons {
display: flex;
margin-right: 8px;
color: var(--bg-vanilla-400);
}
.navigation-text {
color: var(--bg-vanilla-400);
font-size: 12px;
}
.navigation-error {
	display: flex;
	align-items: center;
	justify-content: space-between;
	width: 100%;
	gap: 4px;
	.navigation-text,
	.navigation-icons {
		color: var(--bg-cherry-500) !important;
	}
}
.navigation-loading {
display: flex;
align-items: center;
gap: 8px;
.navigation-text,
.navigation-icons {
color: var(--bg-robin-600) !important;
}
}
.navigate {
display: flex;
align-items: center;
padding-right: 12px;
gap: 6px;
.icons {
width: 14px;
height: 14px;
flex-shrink: 0;
border-radius: 2.286px;
border-top: 1.143px solid var(--bg-ink-200);
border-right: 1.143px solid var(--bg-ink-200);
border-bottom: 2.286px solid var(--bg-ink-200);
border-left: 1.143px solid var(--bg-ink-200);
background: var(--Ink-400, var(--bg-ink-400));
}
}
}
// Custom dropdown styles for multi-select
.custom-multiselect-dropdown {
padding: 8px 0 0 0;
max-height: 500px;
overflow-y: auto;
overflow-x: hidden;
scrollbar-width: thin;
border-radius: 4px;
border: 1px solid var(--bg-slate-400);
width: 100%;
background-color: var(--bg-ink-400);
.select-all-option,
.custom-value-option {
padding: 8px 12px;
border-bottom: 1px solid $custom-border-color;
margin-bottom: 8px;
background-color: var(--bg-slate-400);
position: sticky;
top: 0;
z-index: 1;
}
.selected-values-section {
padding: 0 0 8px 0;
border-bottom: 1px solid $custom-border-color;
margin-bottom: 8px;
.selected-option {
padding: 4px 12px;
}
}
.select-group {
margin-bottom: 12px;
overflow: hidden;
.group-label {
font-weight: 500;
padding: 4px 12px;
font-size: 13px;
color: var(--bg-vanilla-400);
background-color: var(--bg-slate-400);
border-bottom: 1px solid $custom-border-color;
border-top: 1px solid $custom-border-color;
position: relative;
z-index: 1;
}
}
.option-item {
padding: 8px 12px;
cursor: pointer;
white-space: nowrap;
overflow: hidden;
text-overflow: ellipsis;
color: var(--bg-vanilla-400);
&.active {
background-color: rgba(78, 116, 248, 0.15);
border-color: var(--bg-robin-500);
}
&:hover {
background-color: var(--bg-slate-400);
}
&.selected {
background-color: rgba(78, 116, 248, 0.15);
font-weight: 500;
}
&.all-option {
font-weight: 500;
border-bottom: 1px solid $custom-border-color;
margin-bottom: 8px;
}
.option-checkbox {
width: 100%;
> span:not(.ant-checkbox) {
width: 100%;
}
.option-content {
display: flex;
justify-content: space-between;
align-items: center;
width: 100%;
.option-label-text {
margin-bottom: 0;
}
.option-badge {
font-size: 12px;
padding: 2px 6px;
border-radius: 4px;
background-color: $custom-border-color;
color: var(--bg-vanilla-400);
margin-left: 8px;
}
}
.only-btn {
display: none;
}
.toggle-btn {
display: none;
}
.only-btn:hover {
background-color: unset;
}
.toggle-btn:hover {
background-color: unset;
}
.option-content:hover {
.only-btn {
display: flex;
align-items: center;
justify-content: center;
height: 21px;
}
.toggle-btn {
display: none;
}
.option-badge {
display: none;
}
}
}
.option-checkbox:hover {
.toggle-btn {
display: flex;
align-items: center;
justify-content: center;
height: 21px;
}
.option-badge {
display: none;
}
}
}
.loading-container {
display: flex;
justify-content: center;
padding: 12px;
}
.empty-message {
padding: 12px;
text-align: center;
color: rgba(192, 193, 195, 0.45);
}
.status-message {
padding: 8px 12px;
text-align: center;
font-style: italic;
color: rgba(192, 193, 195, 0.65);
border-top: 1px dashed $custom-border-color;
}
}
// Custom styles for highlight text
.highlight-text {
background-color: rgba(78, 116, 248, 0.2);
padding: 0 1px;
border-radius: 2px;
font-weight: 500;
}
// Custom option styles for keyboard navigation
.custom-option {
&.focused,
&.ant-select-item-option-active {
background-color: var(--bg-slate-400) !important;
}
}
// Improve the sticky headers appearance
.custom-select-dropdown-container {
.group-label,
.ant-select-item-group {
position: sticky;
top: 0;
z-index: 2;
background-color: var(--bg-slate-400);
border-bottom: 1px solid $custom-border-color;
padding: 4px 12px;
margin: 0;
width: 100%; // Ensure the header spans full width
box-shadow: 0 1px 2px rgba(0, 0, 0, 0.2); // Add subtle shadow for separation
}
// Ensure proper spacing between sections
.select-group {
margin-bottom: 8px;
position: relative; // Create a positioning context
}
}
// Custom scrollbar styling (shared between components)
@mixin custom-scrollbar {
scrollbar-width: thin;
scrollbar-color: rgba(192, 193, 195, 0.3) rgba(29, 33, 45, 0.6);
&::-webkit-scrollbar {
width: 6px;
height: 6px;
}
&::-webkit-scrollbar-track {
background-color: rgba(29, 33, 45, 0.6);
border-radius: 10px;
}
&::-webkit-scrollbar-thumb {
background-color: rgba(192, 193, 195, 0.3);
border-radius: 10px;
transition: background-color 0.2s ease;
&:hover {
background-color: rgba(192, 193, 195, 0.5);
}
}
}
// Subtle nested scrollbar styling
@mixin nested-scrollbar {
scrollbar-width: thin;
scrollbar-color: rgba(192, 193, 195, 0.2) rgba(29, 33, 45, 0.6);
&::-webkit-scrollbar {
width: 4px;
height: 4px;
}
&::-webkit-scrollbar-track {
background-color: rgba(29, 33, 45, 0.6);
border-radius: 10px;
}
&::-webkit-scrollbar-thumb {
background-color: rgba(192, 193, 195, 0.2);
border-radius: 10px;
&:hover {
background-color: rgba(192, 193, 195, 0.3);
}
}
}
// Apply to main dropdown containers
.custom-select-dropdown,
.custom-multiselect-dropdown {
@include custom-scrollbar;
// Main content area
.options-container {
@include custom-scrollbar;
padding-right: 2px; // Add slight padding to prevent content touching scrollbar
}
// Non-sectioned options
.no-section-options {
@include nested-scrollbar;
margin-right: 2px;
padding-right: 2px;
}
}
// Apply to dropdown container wrappers
.custom-select-dropdown-container,
.custom-multiselect-dropdown-container {
@include custom-scrollbar;
// Add subtle shadow inside to indicate scrollable area
&.has-overflow {
box-shadow: inset 0 -10px 10px -10px rgba(0, 0, 0, 0.2);
}
}
// Light Mode Overrides
.lightMode {
.custom-select {
.ant-select-selector {
background-color: var(--bg-vanilla-100);
border-color: #e9e9e9;
}
.ant-select-selection-placeholder {
color: rgba(0, 0, 0, 0.45);
}
.ant-select-clear {
background-color: var(--bg-vanilla-100);
color: rgba(0, 0, 0, 0.45);
}
&.ant-select-focused {
.ant-select-selector {
border-color: #1890ff;
box-shadow: 0 0 0 2px rgba(24, 144, 255, 0.2);
}
}
}
.custom-multiselect {
.ant-select-selector {
background-color: var(--bg-vanilla-100);
border-color: #e9e9e9;
&::-webkit-scrollbar-thumb {
background-color: #ccc;
}
&::-webkit-scrollbar-track {
background-color: #f0f0f0;
}
}
.ant-select-selection-placeholder {
color: rgba(0, 0, 0, 0.45);
}
.ant-select-selection-item {
background-color: #f5f5f5;
border: 1px solid #e8e8e8;
color: rgba(0, 0, 0, 0.85);
.ant-select-selection-item-content {
color: rgba(0, 0, 0, 0.85);
}
.ant-select-selection-item-remove {
color: rgba(0, 0, 0, 0.45);
&:hover {
color: rgba(0, 0, 0, 0.85);
}
}
&-active {
border-color: var(--bg-robin-500) !important;
background-color: var(--bg-vanilla-300) !important;
}
&-selected {
border-color: #1890ff !important;
background-color: var(--bg-vanilla-300) !important;
}
}
}
.custom-select-dropdown-container,
.custom-multiselect-dropdown-container {
background-color: var(--bg-vanilla-100);
border: 1px solid #f0f0f0;
box-shadow: 0 3px 6px -4px rgba(0, 0, 0, 0.12),
0 6px 16px 0 rgba(0, 0, 0, 0.08), 0 9px 28px 8px rgba(0, 0, 0, 0.05);
.empty-message {
color: rgba(0, 0, 0, 0.45);
}
.ant-select-item {
color: rgba(0, 0, 0, 0.85);
&-option-active {
background-color: #f5f5f5 !important;
}
&-option-selected {
background-color: var(--bg-vanilla-300) !important;
}
}
}
.custom-select-dropdown,
.custom-multiselect-dropdown {
border: 1px solid #f0f0f0;
background-color: var(--bg-vanilla-100);
&::-webkit-scrollbar-thumb {
background-color: #ccc;
}
&::-webkit-scrollbar-track {
background-color: #f0f0f0;
}
.select-group {
.group-label {
color: rgba(0, 0, 0, 0.85);
background-color: #fafafa;
border-bottom: 1px solid #f0f0f0;
border-top: 1px solid #f0f0f0;
}
}
.option-item {
color: rgba(0, 0, 0, 0.85);
&:hover {
background-color: #f5f5f5;
}
&.selected {
background-color: var(--bg-vanilla-300);
}
&.active {
background-color: var(--bg-vanilla-300);
border-color: #91d5ff;
}
.option-content {
.option-badge {
background-color: #f0f0f0;
color: #666;
}
}
}
}
.navigation-footer {
border-top: 1px solid #f0f0f0;
background-color: var(--bg-vanilla-100);
.navigation-icons {
color: rgba(0, 0, 0, 0.45);
}
.navigation-text {
color: rgba(0, 0, 0, 0.45);
}
.navigate {
.icons {
border-top: 1.143px solid var(--bg-ink-200);
border-right: 1.143px solid var(--bg-ink-200);
border-bottom: 2.286px solid var(--bg-ink-200);
border-left: 1.143px solid var(--bg-ink-200);
background: var(--bg-vanilla-300);
}
}
}
.custom-multiselect-dropdown {
.select-all-option,
.custom-value-option {
border-bottom: 1px solid #f0f0f0;
background-color: #fafafa;
}
.selected-values-section {
border-bottom: 1px solid #f0f0f0;
}
.status-message {
color: rgba(0, 0, 0, 0.65);
border-top: 1px dashed #f0f0f0;
}
.option-item {
&.all-option {
border-bottom: 1px solid #f0f0f0;
}
}
}
.highlight-text {
background-color: rgba(24, 144, 255, 0.2);
}
.custom-option {
&.focused,
&.ant-select-item-option-active {
background-color: #f5f5f5 !important;
}
}
.custom-select-dropdown-container {
.group-label,
.ant-select-item-group {
background-color: #f5f0f0;
border-bottom: 1px solid #e8e8e8;
box-shadow: 0 1px 2px rgba(0, 0, 0, 0.05);
}
}
// Light mode scrollbar overrides
.custom-select-dropdown,
.custom-multiselect-dropdown,
.custom-select-dropdown-container,
.custom-multiselect-dropdown-container {
scrollbar-color: rgba(0, 0, 0, 0.2) rgba(0, 0, 0, 0.05);
&::-webkit-scrollbar-track {
background-color: rgba(0, 0, 0, 0.05);
}
&::-webkit-scrollbar-thumb {
background-color: rgba(0, 0, 0, 0.2);
&:hover {
background-color: rgba(0, 0, 0, 0.3);
}
}
}
}


@@ -0,0 +1,60 @@
import { SelectProps } from 'antd';
export interface OptionData {
label: string;
value?: string;
disabled?: boolean;
className?: string;
style?: React.CSSProperties;
options?: OptionData[];
type?: 'defined' | 'custom' | 'regex';
}
export interface CustomSelectProps extends Omit<SelectProps, 'options'> {
placeholder?: string;
className?: string;
loading?: boolean;
onSearch?: (value: string) => void;
options?: OptionData[];
defaultActiveFirstOption?: boolean;
noDataMessage?: string;
onClear?: () => void;
getPopupContainer?: (triggerNode: HTMLElement) => HTMLElement;
dropdownRender?: (menu: React.ReactElement) => React.ReactElement;
highlightSearch?: boolean;
placement?: 'topLeft' | 'topRight' | 'bottomLeft' | 'bottomRight';
popupMatchSelectWidth?: boolean;
errorMessage?: string;
allowClear?: SelectProps['allowClear'];
onRetry?: () => void;
}
export interface CustomTagProps {
label: React.ReactNode;
value: string;
closable: boolean;
onClose: () => void;
}
export interface CustomMultiSelectProps
extends Omit<SelectProps<string[] | string>, 'options'> {
placeholder?: string;
className?: string;
loading?: boolean;
onSearch?: (value: string) => void;
options?: OptionData[];
defaultActiveFirstOption?: boolean;
dropdownMatchSelectWidth?: boolean | number;
noDataMessage?: string;
onClear?: () => void;
enableAllSelection?: boolean;
getPopupContainer?: (triggerNode: HTMLElement) => HTMLElement;
dropdownRender?: (menu: React.ReactElement) => React.ReactElement;
highlightSearch?: boolean;
errorMessage?: string;
popupClassName?: string;
placement?: 'topLeft' | 'topRight' | 'bottomLeft' | 'bottomRight';
maxTagCount?: number;
allowClear?: SelectProps['allowClear'];
onRetry?: () => void;
}
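As a quick illustration of how these shapes compose, a hedged sketch follows; the option names and the './types' import path are assumptions, not part of the change.

import { OptionData } from './types';

// A flat option, a grouped option with nested `options`, and a user-created
// entry tagged as 'custom' (the tag the dropdown badge renders).
const exampleOptions: OptionData[] = [
	{ label: 'frontend', value: 'frontend', type: 'defined' },
	{
		label: 'Workers',
		options: [{ label: 'billing-worker', value: 'billing-worker' }],
	},
	{ label: 'my-search-text', value: 'my-search-text', type: 'custom' },
];

export default exampleOptions;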


@@ -0,0 +1,135 @@
/* eslint-disable sonarjs/cognitive-complexity */
import { OptionData } from './types';
export const SPACEKEY = ' ';
export const prioritizeOrAddOptionForSingleSelect = (
options: OptionData[],
value: string,
label?: string,
): OptionData[] => {
let foundOption: OptionData | null = null;
// Separate the found option and the rest
const filteredOptions = options
.map((option) => {
if ('options' in option && Array.isArray(option.options)) {
// Filter out the value from nested options
const remainingSubOptions = option.options.filter(
(subOption) => subOption.value !== value,
);
const extractedOption = option.options.find(
(subOption) => subOption.value === value,
);
if (extractedOption) foundOption = extractedOption;
// Keep the group if it still has remaining options
return remainingSubOptions.length > 0
? { ...option, options: remainingSubOptions }
: null;
}
// Check top-level options
if (option.value === value) {
foundOption = option;
return null; // Remove it from the list
}
return option;
})
.filter(Boolean) as OptionData[]; // Remove null values
// If not found, create a new option
if (!foundOption) {
foundOption = { value, label: label ?? value };
}
// Add the found/new option at the top
return [foundOption, ...filteredOptions];
};
export const prioritizeOrAddOptionForMultiSelect = (
options: OptionData[],
values: string[], // Only supports multiple values (string[])
labels?: Record<string, string>,
): OptionData[] => {
const foundOptions: OptionData[] = [];
// Separate the found options and the rest
const filteredOptions = options
.map((option) => {
if ('options' in option && Array.isArray(option.options)) {
// Filter out selected values from nested options
const remainingSubOptions = option.options.filter(
(subOption) => subOption.value && !values.includes(subOption.value),
);
const extractedOptions = option.options.filter(
(subOption) => subOption.value && values.includes(subOption.value),
);
if (extractedOptions.length > 0) {
foundOptions.push(...extractedOptions);
}
// Keep the group if it still has remaining options
return remainingSubOptions.length > 0
? { ...option, options: remainingSubOptions }
: null;
}
// Check top-level options
if (option.value && values.includes(option.value)) {
foundOptions.push(option);
return null; // Remove it from the list
}
return option;
})
.filter(Boolean) as OptionData[]; // Remove null values
// Find missing values that were not present in the original options and create new ones
const missingValues = values.filter(
(value) => !foundOptions.some((opt) => opt.value === value),
);
const newOptions = missingValues.map((value) => ({
value,
label: labels?.[value] ?? value, // Use provided label or default to value
}));
// Add found & new options to the top
return [...newOptions, ...foundOptions, ...filteredOptions];
};
/**
* Filters options based on search text
*/
export const filterOptionsBySearch = (
options: OptionData[],
searchText: string,
): OptionData[] => {
if (!searchText.trim()) return options;
const lowerSearchText = searchText.toLowerCase();
return options
.map((option) => {
if ('options' in option && Array.isArray(option.options)) {
// Filter nested options
const filteredSubOptions = option.options.filter((subOption) =>
subOption.label.toLowerCase().includes(lowerSearchText),
);
return filteredSubOptions.length > 0
? { ...option, options: filteredSubOptions }
: undefined;
}
// Filter top-level options
return option.label.toLowerCase().includes(lowerSearchText)
? option
: undefined;
})
.filter(Boolean) as OptionData[];
};
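A small usage sketch of the two helpers above, assuming they are imported from this file (paths './utils' and './types' are assumed); the sample options are made up, and the commented results follow from the logic shown above.

import { OptionData } from './types';
import {
	filterOptionsBySearch,
	prioritizeOrAddOptionForSingleSelect,
} from './utils';

const options: OptionData[] = [
	{
		label: 'Workers',
		options: [{ label: 'billing-worker', value: 'billing-worker' }],
	},
	{ label: 'frontend', value: 'frontend' },
];

// Pulls the selected value to the top and drops its now-empty group:
// => [{ label: 'billing-worker', value: 'billing-worker' }, { label: 'frontend', value: 'frontend' }]
const prioritized = prioritizeOrAddOptionForSingleSelect(options, 'billing-worker');

// Keeps only entries whose label matches the search text, preserving groups:
// => [{ label: 'Workers', options: [{ label: 'billing-worker', value: 'billing-worker' }] }]
const filtered = filterOptionsBySearch(options, 'billing');

export { filtered, prioritized };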


@@ -3,7 +3,7 @@ import 'dayjs/locale/en';
import { PlusOutlined } from '@ant-design/icons';
import { Color } from '@signozhq/design-tokens';
import { Button, Flex, Form, Input, Typography } from 'antd';
import { Button, Flex, Form, Input, Tooltip, Typography } from 'antd';
import getAll from 'api/alerts/getAll';
import { useDeleteDowntimeSchedule } from 'api/plannedDowntime/deleteDowntimeSchedule';
import {
@@ -13,8 +13,10 @@ import {
import dayjs from 'dayjs';
import { useNotifications } from 'hooks/useNotifications';
import { Search } from 'lucide-react';
import { useAppContext } from 'providers/App/App';
import React, { ChangeEvent, useEffect, useState } from 'react';
import { useQuery } from 'react-query';
import { USER_ROLES } from 'types/roles';
import { PlannedDowntimeDeleteModal } from './PlannedDowntimeDeleteModal';
import { PlannedDowntimeForm } from './PlannedDowntimeForm';
@@ -33,6 +35,7 @@ export function PlannedDowntime(): JSX.Element {
});
const [isOpen, setIsOpen] = React.useState(false);
const [form] = Form.useForm();
const { user } = useAppContext();
const [initialValues, setInitialValues] = useState<
Partial<DowntimeSchedules & { editMode: boolean }>
@@ -108,18 +111,27 @@ export function PlannedDowntime(): JSX.Element {
value={searchValue}
onChange={handleSearch}
/>
<Button
icon={<PlusOutlined />}
type="primary"
onClick={(): void => {
setInitialValues({ ...defautlInitialValues, editMode: false });
setIsOpen(true);
setEditMode(false);
form.resetFields();
}}
<Tooltip
title={
user?.role === USER_ROLES.VIEWER
? 'You need edit permissions to create a planned downtime'
: ''
}
>
New downtime
</Button>
<Button
icon={<PlusOutlined />}
type="primary"
onClick={(): void => {
setInitialValues({ ...defautlInitialValues, editMode: false });
setIsOpen(true);
setEditMode(false);
form.resetFields();
}}
disabled={user?.role === USER_ROLES.VIEWER}
>
New downtime
</Button>
</Tooltip>
</Flex>
<br />
<PlannedDowntimeList


@@ -0,0 +1,44 @@
import { screen } from '@testing-library/react';
import { render } from 'tests/test-utils';
import { USER_ROLES } from 'types/roles';
import { PlannedDowntime } from '../PlannedDowntime';
describe('PlannedDowntime Component', () => {
it('renders the PlannedDowntime component properly', () => {
render(<PlannedDowntime />, {}, 'ADMIN');
// Check if title is rendered
expect(screen.getByText('Planned Downtime')).toBeInTheDocument();
// Check if subtitle is rendered
expect(
screen.getByText('Create and manage planned downtimes.'),
).toBeInTheDocument();
// Check if search input is rendered
expect(
screen.getByPlaceholderText('Search for a planned downtime...'),
).toBeInTheDocument();
// Check if "New downtime" button is enabled for ADMIN
const newDowntimeButton = screen.getByRole('button', {
name: /new downtime/i,
});
expect(newDowntimeButton).toBeInTheDocument();
expect(newDowntimeButton).not.toBeDisabled();
});
it('disables the "New downtime" button for users with VIEWER role', () => {
render(<PlannedDowntime />, {}, USER_ROLES.VIEWER);
// Check if "New downtime" button is disabled for VIEWER
const newDowntimeButton = screen.getByRole('button', {
name: /new downtime/i,
});
expect(newDowntimeButton).toBeInTheDocument();
expect(newDowntimeButton).toBeDisabled();
expect(newDowntimeButton).toHaveAttribute('disabled');
});
});


@@ -110,9 +110,16 @@
}
.nav-wrapper {
height: calc(100% - 52px);
display: flex;
flex-direction: column;
justify-content: space-between;
.primary-nav-items {
max-height: 65%;
display: flex;
flex-direction: column;
flex: 1;
min-height: 0;
max-height: 100%;
overflow-y: auto;
overflow-x: hidden;
@@ -121,15 +128,14 @@
}
}
.secondary-nav-items {
max-height: 35%;
display: flex;
flex-direction: column;
flex-shrink: 0;
overflow-y: auto;
overflow-x: hidden;
border-top: 1px solid var(--bg-slate-400);
padding: 8px 0;
max-width: 100%;
position: fixed;
bottom: 0;
left: 0;
width: 64px;
transition: all 0.2s, background 0s, border 0s;


@@ -24,6 +24,7 @@ const plugins = [
CUSTOMERIO_SITE_ID: process.env.CUSTOMERIO_SITE_ID,
CUSTOMERIO_ID: process.env.CUSTOMERIO_ID,
POSTHOG_KEY: process.env.POSTHOG_KEY,
USERPILOT_KEY: process.env.USERPILOT_KEY,
SENTRY_AUTH_TOKEN: process.env.SENTRY_AUTH_TOKEN,
SENTRY_ORG: process.env.SENTRY_ORG,
SENTRY_PROJECT_ID: process.env.SENTRY_PROJECT_ID,
@@ -43,6 +44,7 @@ const plugins = [
CUSTOMERIO_SITE_ID: process.env.CUSTOMERIO_SITE_ID,
CUSTOMERIO_ID: process.env.CUSTOMERIO_ID,
POSTHOG_KEY: process.env.POSTHOG_KEY,
USERPILOT_KEY: process.env.USERPILOT_KEY,
SENTRY_AUTH_TOKEN: process.env.SENTRY_AUTH_TOKEN,
SENTRY_ORG: process.env.SENTRY_ORG,
SENTRY_PROJECT_ID: process.env.SENTRY_PROJECT_ID,


@@ -29,6 +29,7 @@ const plugins = [
CUSTOMERIO_SITE_ID: process.env.CUSTOMERIO_SITE_ID,
CUSTOMERIO_ID: process.env.CUSTOMERIO_ID,
POSTHOG_KEY: process.env.POSTHOG_KEY,
USERPILOT_KEY: process.env.USERPILOT_KEY,
SENTRY_AUTH_TOKEN: process.env.SENTRY_AUTH_TOKEN,
SENTRY_ORG: process.env.SENTRY_ORG,
SENTRY_PROJECT_ID: process.env.SENTRY_PROJECT_ID,
@@ -53,6 +54,7 @@ const plugins = [
CUSTOMERIO_SITE_ID: process.env.CUSTOMERIO_SITE_ID,
CUSTOMERIO_ID: process.env.CUSTOMERIO_ID,
POSTHOG_KEY: process.env.POSTHOG_KEY,
USERPILOT_KEY: process.env.USERPILOT_KEY,
SENTRY_AUTH_TOKEN: process.env.SENTRY_AUTH_TOKEN,
SENTRY_ORG: process.env.SENTRY_ORG,
SENTRY_PROJECT_ID: process.env.SENTRY_PROJECT_ID,


@@ -3135,6 +3135,30 @@
strict-event-emitter "^0.2.4"
web-encoding "^1.1.5"
"@ndhoule/each@^2.0.1":
version "2.0.1"
resolved "https://registry.yarnpkg.com/@ndhoule/each/-/each-2.0.1.tgz#bbed372a603e0713a3193c706a73ddebc5b426a9"
integrity sha512-wHuJw6x+rF6Q9Skgra++KccjBozCr9ymtna0FhxmV/8xT/hZ2ExGYR8SV8prg8x4AH/7mzDYErNGIVHuzHeybw==
dependencies:
"@ndhoule/keys" "^2.0.0"
"@ndhoule/includes@^2.0.1":
version "2.0.1"
resolved "https://registry.yarnpkg.com/@ndhoule/includes/-/includes-2.0.1.tgz#051ff5eb042c8fa17e7158f0a8a70172e1affaa5"
integrity sha512-Q8zN6f3yIhxgBwZ5ldLozHqJlc/fRQ5+hFFsPMFeC9SJvz0nq8vG9hoRXL1c1iaNFQd7yAZIy2igQpERoFqxqg==
dependencies:
"@ndhoule/each" "^2.0.1"
"@ndhoule/keys@^2.0.0":
version "2.0.0"
resolved "https://registry.yarnpkg.com/@ndhoule/keys/-/keys-2.0.0.tgz#3d64ae677c65a261747bf3a457c62eb292a4e0ce"
integrity sha512-vtCqKBC1Av6dsBA8xpAO+cgk051nfaI+PnmTZep2Px0vYrDvpUmLxv7z40COlWH5yCpu3gzNhepk+02yiQiZNw==
"@ndhoule/pick@^2.0.0":
version "2.0.0"
resolved "https://registry.yarnpkg.com/@ndhoule/pick/-/pick-2.0.0.tgz#e1eb1a6ca3243eef56daa095c3a1612c74a52156"
integrity sha512-xkYtpf1pRd8egwvl5tJcdGu+GBd6ZZH3S/zoIQ9txEI+pHF9oTIlxMC9G4CB3sRugAeLgu8qYJGl3tnxWq74Qw==
"@nodelib/fs.scandir@2.1.5":
version "2.1.5"
resolved "https://registry.npmjs.org/@nodelib/fs.scandir/-/fs.scandir-2.1.5.tgz"
@@ -5522,10 +5546,10 @@ axe-core@^4.6.2:
resolved "https://registry.npmjs.org/axe-core/-/axe-core-4.7.0.tgz"
integrity sha512-M0JtH+hlOL5pLQwHOLNYZaXuhqmvS8oExsqB1SBYgA4Dk7u/xx+YdGHXaK5pyUfed5mYXdlYiphWq3G8cRi5JQ==
axios@1.7.7:
version "1.7.7"
resolved "https://registry.yarnpkg.com/axios/-/axios-1.7.7.tgz#2f554296f9892a72ac8d8e4c5b79c14a91d0a47f"
integrity sha512-S4kL7XrjgBmvdGut0sN3yJxqYzrDOnivkBiN0OFs6hLiUam3UPvswUo0kqGyhqUZGEOytHyumEdXsAkgCOUf3Q==
axios@1.8.2:
version "1.8.2"
resolved "https://registry.yarnpkg.com/axios/-/axios-1.8.2.tgz#fabe06e241dfe83071d4edfbcaa7b1c3a40f7979"
integrity sha512-ls4GYBm5aig9vWx8AWDSGLpnpDQRtWAfrjU+EuytuODrFBkqesN2RkOQCBzrA1RQNHw1SmRMSDDDSwzNAYQ6Rg==
dependencies:
follow-redirects "^1.15.6"
form-data "^4.0.0"
@@ -6713,6 +6737,11 @@ compare-func@^2.0.0:
array-ify "^1.0.0"
dot-prop "^5.1.0"
component-indexof@0.0.3:
version "0.0.3"
resolved "https://registry.yarnpkg.com/component-indexof/-/component-indexof-0.0.3.tgz#11d091312239eb8f32c8f25ae9cb002ffe8d3c24"
integrity sha512-puDQKvx/64HZXb4hBwIcvQLaLgux8o1CbWl39s41hrIIZDl1lJiD5jc22gj3RBeGK0ovxALDYpIbyjqDUUl0rw==
compressible@~2.0.16:
version "2.0.18"
resolved "https://registry.npmjs.org/compressible/-/compressible-2.0.18.tgz"
@@ -10742,6 +10771,11 @@ is-wsl@^2.2.0:
dependencies:
is-docker "^2.0.0"
is@^3.1.0:
version "3.3.0"
resolved "https://registry.yarnpkg.com/is/-/is-3.3.0.tgz#61cff6dd3c4193db94a3d62582072b44e5645d79"
integrity sha512-nW24QBoPcFGGHJGUwnfpI7Yc5CdqWNdsyHQszVE/z2pKHXzh7FZ5GWhJqSyaQ9wMkQnsTx+kAI8bHlCX4tKdbg==
isarray@0.0.1:
version "0.0.1"
resolved "https://registry.yarnpkg.com/isarray/-/isarray-0.0.1.tgz#8a18acfca9a8f4177e09abfc6038939b05d1eedf"
@@ -13130,6 +13164,11 @@ nwsapi@^2.2.0:
resolved "https://registry.npmjs.org/nwsapi/-/nwsapi-2.2.4.tgz"
integrity sha512-NHj4rzRo0tQdijE9ZqAx6kYDcoRwYwSYzCA8MY3JzfxlrvEU0jhnhJT9BhqhJs7I/dKcrDm6TyulaRqZPIhN5g==
obj-case@^0.2.0:
version "0.2.1"
resolved "https://registry.yarnpkg.com/obj-case/-/obj-case-0.2.1.tgz#13a554d04e5ca32dfd9d566451fd2b0e11007f1a"
integrity sha512-PquYBBTy+Y6Ob/O2574XHhDtHJlV1cJHMCgW+rDRc9J5hhmRelJB3k5dTK/3cVmFVtzvAKuENeuLpoyTzMzkOg==
object-assign@^4.0.1, object-assign@^4.1.0, object-assign@^4.1.1:
version "4.1.1"
resolved "https://registry.npmjs.org/object-assign/-/object-assign-4.1.1.tgz"
@@ -17466,6 +17505,17 @@ use-sync-external-store@^1.0.0:
resolved "https://registry.npmjs.org/use-sync-external-store/-/use-sync-external-store-1.2.0.tgz"
integrity sha512-eEgnFxGQ1Ife9bzYs6VLi8/4X6CObHMw9Qr9tPY43iKwsPw8xE8+EFsf/2cFZ5S3esXgpWgtSCtLNS41F+sKPA==
userpilot@1.3.9:
version "1.3.9"
resolved "https://registry.yarnpkg.com/userpilot/-/userpilot-1.3.9.tgz#6374083f3e84cbf1fc825133588b5b499054271b"
integrity sha512-V0QIuIlAJPB8s3j+qtv7BW7NKSXthlZWuowIu+IZOMGLgUbqQTaSW5m1Ct4wJviPKUNOi8kbhCXN4c4b3zcJzg==
dependencies:
"@ndhoule/includes" "^2.0.1"
"@ndhoule/pick" "^2.0.0"
component-indexof "0.0.3"
is "^3.1.0"
obj-case "^0.2.0"
util-deprecate@^1.0.1, util-deprecate@^1.0.2, util-deprecate@~1.0.1:
version "1.0.2"
resolved "https://registry.npmjs.org/util-deprecate/-/util-deprecate-1.0.2.tgz"


@@ -6,31 +6,31 @@ import (
)
var (
CodeInvalidInput code = code{"invalid_input"}
CodeInternal = code{"internal"}
CodeUnsupported = code{"unsupported"}
CodeNotFound = code{"not_found"}
CodeMethodNotAllowed = code{"method_not_allowed"}
CodeAlreadyExists = code{"already_exists"}
CodeUnauthenticated = code{"unauthenticated"}
CodeForbidden = code{"forbidden"}
CodeInvalidInput Code = Code{"invalid_input"}
CodeInternal = Code{"internal"}
CodeUnsupported = Code{"unsupported"}
CodeNotFound = Code{"not_found"}
CodeMethodNotAllowed = Code{"method_not_allowed"}
CodeAlreadyExists = Code{"already_exists"}
CodeUnauthenticated = Code{"unauthenticated"}
CodeForbidden = Code{"forbidden"}
)
var (
codeRegex = regexp.MustCompile(`^[a-z_]+$`)
)
type code struct{ s string }
type Code struct{ s string }
func NewCode(s string) (code, error) {
func NewCode(s string) (Code, error) {
if !codeRegex.MatchString(s) {
return code{}, fmt.Errorf("invalid code: %v", s)
return Code{}, fmt.Errorf("invalid code: %v", s)
}
return code{s: s}, nil
return Code{s: s}, nil
}
func MustNewCode(s string) code {
func MustNewCode(s string) Code {
code, err := NewCode(s)
if err != nil {
panic(err)
@@ -39,6 +39,6 @@ func MustNewCode(s string) code {
return code
}
func (c code) String() string {
func (c Code) String() string {
return c.s
}


@@ -7,7 +7,7 @@ import (
)
var (
codeUnknown code = MustNewCode("unknown")
codeUnknown Code = MustNewCode("unknown")
)
// base is the fundamental struct that implements the error interface.
@@ -16,7 +16,7 @@ type base struct {
// t denotes the custom type of the error.
t typ
// c denotes the short code for the error message.
c code
c Code
// m contains error message passed through errors.New.
m string
// e is the actual error being wrapped.
@@ -47,7 +47,7 @@ func (b *base) Error() string {
}
// New returns a base error. It requires type, code and message as input.
func New(t typ, code code, message string) *base {
func New(t typ, code Code, message string) *base {
return &base{
t: t,
c: code,
@@ -59,7 +59,7 @@ func New(t typ, code code, message string) *base {
}
// Newf returns a new base by formatting the error message with the supplied format specifier.
func Newf(t typ, code code, format string, args ...interface{}) *base {
func Newf(t typ, code Code, format string, args ...interface{}) *base {
return &base{
t: t,
c: code,
@@ -70,7 +70,7 @@ func Newf(t typ, code code, format string, args ...interface{}) *base {
// Wrapf returns a new error by formatting the error message with the supplied format specifier
// and wrapping another error with base.
func Wrapf(cause error, t typ, code code, format string, args ...interface{}) *base {
func Wrapf(cause error, t typ, code Code, format string, args ...interface{}) *base {
return &base{
t: t,
c: code,
@@ -110,7 +110,7 @@ func (b *base) WithAdditional(a ...string) *base {
// and the error itself.
//
//lint:ignore ST1008 we want to return arguments in the 'TCMEUA' order of the struct
func Unwrapb(cause error) (typ, code, string, error, string, []string) {
func Unwrapb(cause error) (typ, Code, string, error, string, []string) {
base, ok := cause.(*base)
if ok {
return base.t, base.c, base.m, base.e, base.u, base.a
@@ -127,7 +127,7 @@ func Ast(cause error, typ typ) bool {
}
// Asc checks if the provided error matches the specified custom error code.
func Asc(cause error, code code) bool {
func Asc(cause error, code Code) bool {
_, c, _, _, _, _ := Unwrapb(cause)
return c.s == code.s
@@ -137,3 +137,7 @@ func Asc(cause error, code code) bool {
func Join(errs ...error) error {
return errors.Join(errs...)
}
func As(err error, target any) bool {
return errors.As(err, target)
}


@@ -6,6 +6,7 @@ import (
"time"
"github.com/SigNoz/signoz/pkg/http/client/plugin"
"github.com/gojek/heimdall/v7"
"github.com/gojek/heimdall/v7/httpclient"
"go.opentelemetry.io/contrib/instrumentation/net/http/otelhttp"
"go.opentelemetry.io/otel/metric"
@@ -33,6 +34,15 @@ func New(logger *slog.Logger, tracerProvider trace.TracerProvider, meterProvider
Transport: otelhttp.NewTransport(http.DefaultTransport, otelhttp.WithTracerProvider(tracerProvider), otelhttp.WithMeterProvider(meterProvider)),
}
if clientOpts.retriable == nil {
clientOpts.retriable = heimdall.NewRetrier(
heimdall.NewConstantBackoff(
2*time.Second,
100*time.Millisecond,
),
)
}
c := httpclient.NewClient(
httpclient.WithHTTPClient(netc),
httpclient.WithRetrier(clientOpts.retriable),


@@ -63,7 +63,7 @@ func (plugin *reqResLog) OnRequestEnd(request *http.Request, response *http.Resp
func (plugin *reqResLog) OnError(request *http.Request, err error) {
host, port, _ := net.SplitHostPort(request.Host)
fields := []any{
err,
"error", err,
string(semconv.HTTPRequestMethodKey), request.Method,
string(semconv.URLPathKey), request.URL.Path,
string(semconv.URLSchemeKey), request.URL.Scheme,


@@ -1,8 +1,10 @@
package implorganization
import (
"context"
"encoding/json"
"net/http"
"time"
"github.com/SigNoz/signoz/pkg/errors"
"github.com/SigNoz/signoz/pkg/http/render"
@@ -12,16 +14,19 @@ import (
"github.com/SigNoz/signoz/pkg/valuer"
)
type organizationAPI struct {
type handler struct {
module organization.Module
}
func NewAPI(module organization.Module) organization.API {
return &organizationAPI{module: module}
func NewHandler(module organization.Module) organization.Handler {
return &handler{module: module}
}
func (api *organizationAPI) Get(rw http.ResponseWriter, r *http.Request) {
claims, err := authtypes.ClaimsFromContext(r.Context())
func (handler *handler) Get(rw http.ResponseWriter, r *http.Request) {
ctx, cancel := context.WithTimeout(r.Context(), 10*time.Second)
defer cancel()
claims, err := authtypes.ClaimsFromContext(ctx)
if err != nil {
render.Error(rw, err)
return
@@ -29,11 +34,11 @@ func (api *organizationAPI) Get(rw http.ResponseWriter, r *http.Request) {
orgID, err := valuer.NewUUID(claims.OrgID)
if err != nil {
render.Error(rw, errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, "invalid org id"))
render.Error(rw, errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, "orgId is invalid"))
return
}
organization, err := api.module.Get(r.Context(), orgID)
organization, err := handler.module.Get(ctx, orgID)
if err != nil {
render.Error(rw, err)
return
@@ -42,18 +47,11 @@ func (api *organizationAPI) Get(rw http.ResponseWriter, r *http.Request) {
render.Success(rw, http.StatusOK, organization)
}
func (api *organizationAPI) GetAll(rw http.ResponseWriter, r *http.Request) {
organizations, err := api.module.GetAll(r.Context())
if err != nil {
render.Error(rw, err)
return
}
func (handler *handler) Update(rw http.ResponseWriter, r *http.Request) {
ctx, cancel := context.WithTimeout(r.Context(), 10*time.Second)
defer cancel()
render.Success(rw, http.StatusOK, organizations)
}
func (api *organizationAPI) Update(rw http.ResponseWriter, r *http.Request) {
claims, err := authtypes.ClaimsFromContext(r.Context())
claims, err := authtypes.ClaimsFromContext(ctx)
if err != nil {
render.Error(rw, err)
return
@@ -72,7 +70,7 @@ func (api *organizationAPI) Update(rw http.ResponseWriter, r *http.Request) {
}
req.ID = orgID
err = api.module.Update(r.Context(), req)
err = handler.module.Update(ctx, req)
if err != nil {
render.Error(rw, err)
return


@@ -8,26 +8,26 @@ import (
"github.com/SigNoz/signoz/pkg/valuer"
)
type organizationModule struct {
type module struct {
store types.OrganizationStore
}
func NewModule(organizationStore types.OrganizationStore) organization.Module {
return &organizationModule{store: organizationStore}
return &module{store: organizationStore}
}
func (o *organizationModule) Create(ctx context.Context, organization *types.Organization) error {
return o.store.Create(ctx, organization)
func (module *module) Create(ctx context.Context, organization *types.Organization) error {
return module.store.Create(ctx, organization)
}
func (o *organizationModule) Get(ctx context.Context, id valuer.UUID) (*types.Organization, error) {
return o.store.Get(ctx, id)
func (module *module) Get(ctx context.Context, id valuer.UUID) (*types.Organization, error) {
return module.store.Get(ctx, id)
}
func (o *organizationModule) GetAll(ctx context.Context) ([]*types.Organization, error) {
return o.store.GetAll(ctx)
func (module *module) GetAll(ctx context.Context) ([]*types.Organization, error) {
return module.store.GetAll(ctx)
}
func (o *organizationModule) Update(ctx context.Context, updatedOrganization *types.Organization) error {
return o.store.Update(ctx, updatedOrganization)
func (module *module) Update(ctx context.Context, updatedOrganization *types.Organization) error {
return module.store.Update(ctx, updatedOrganization)
}

View File

@@ -2,77 +2,69 @@ package implorganization
import (
"context"
"database/sql"
"time"
"github.com/SigNoz/signoz/pkg/errors"
"github.com/SigNoz/signoz/pkg/sqlstore"
"github.com/SigNoz/signoz/pkg/types"
"github.com/SigNoz/signoz/pkg/valuer"
)
type store struct {
store sqlstore.SQLStore
sqlstore sqlstore.SQLStore
}
func NewStore(db sqlstore.SQLStore) types.OrganizationStore {
return &store{store: db}
func NewStore(sqlstore sqlstore.SQLStore) types.OrganizationStore {
return &store{sqlstore: sqlstore}
}
func (s *store) Create(ctx context.Context, organization *types.Organization) error {
_, err := s.
store.
func (store *store) Create(ctx context.Context, organization *types.Organization) error {
_, err := store.
sqlstore.
BunDB().
NewInsert().
Model(organization).
Exec(ctx)
if err != nil {
return errors.Wrapf(err, errors.TypeInternal, errors.CodeInternal, "failed to create organization")
return store.sqlstore.WrapAlreadyExistsErrf(err, types.ErrOrganizationAlreadyExists, "organization with name: %s already exists", organization.Name)
}
return nil
}
func (s *store) Get(ctx context.Context, id valuer.UUID) (*types.Organization, error) {
func (store *store) Get(ctx context.Context, id valuer.UUID) (*types.Organization, error) {
organization := new(types.Organization)
err := s.
store.
err := store.
sqlstore.
BunDB().
NewSelect().
Model(organization).
Where("id = ?", id.StringValue()).
Scan(ctx)
if err != nil {
if err == sql.ErrNoRows {
return nil, errors.Wrapf(err, errors.TypeInternal, errors.CodeInternal, "no organization found with id: %s", id.StringValue())
}
return nil, errors.Wrapf(err, errors.TypeInternal, errors.CodeInternal, "failed to get organization with id: %s", id.StringValue())
return nil, store.sqlstore.WrapNotFoundErrf(err, types.ErrOrganizationNotFound, "organization with id: %s does not exist", id.StringValue())
}
return organization, nil
}
func (s *store) GetAll(ctx context.Context) ([]*types.Organization, error) {
func (store *store) GetAll(ctx context.Context) ([]*types.Organization, error) {
organizations := make([]*types.Organization, 0)
err := s.
store.
err := store.
sqlstore.
BunDB().
NewSelect().
Model(&organizations).
Scan(ctx)
if err != nil {
if err == sql.ErrNoRows {
return nil, errors.Wrapf(err, errors.TypeInternal, errors.CodeInternal, "no organizations found")
}
return nil, errors.Wrapf(err, errors.TypeInternal, errors.CodeInternal, "failed to get all organizations")
return nil, err
}
return organizations, nil
}
func (s *store) Update(ctx context.Context, organization *types.Organization) error {
_, err := s.
store.
func (store *store) Update(ctx context.Context, organization *types.Organization) error {
_, err := store.
sqlstore.
BunDB().
NewUpdate().
Model(organization).
@@ -81,21 +73,21 @@ func (s *store) Update(ctx context.Context, organization *types.Organization) er
Where("id = ?", organization.ID.StringValue()).
Exec(ctx)
if err != nil {
return errors.Wrapf(err, errors.TypeInternal, errors.CodeInternal, "failed to update organization with id: %s", organization.ID.StringValue())
return store.sqlstore.WrapAlreadyExistsErrf(err, types.ErrOrganizationAlreadyExists, "organization already exists")
}
return nil
}
func (s *store) Delete(ctx context.Context, id valuer.UUID) error {
_, err := s.
store.
func (store *store) Delete(ctx context.Context, id valuer.UUID) error {
_, err := store.
sqlstore.
BunDB().
NewDelete().
Model(new(types.Organization)).
Where("id = ?", id.StringValue()).
Exec(ctx)
if err != nil {
return errors.Wrapf(err, errors.TypeInternal, errors.CodeInternal, "failed to delete organization with id: %s", id.StringValue())
return err
}
return nil
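The store above now funnels driver errors through WrapNotFoundErrf / WrapAlreadyExistsErrf so callers can distinguish "not found" from "already exists" instead of receiving a generic internal error. A generic, self-contained illustration of that idea using standard-library sentinels (SigNoz's errors package is richer; the names below are stand-ins):

package main

import (
	"database/sql"
	"errors"
	"fmt"
)

// errNotFound stands in for a domain sentinel such as types.ErrOrganizationNotFound.
var errNotFound = errors.New("organization not found")

// wrapNotFound plays the role of WrapNotFoundErrf: translate a driver-level
// error into a domain error while keeping the original cause in the chain.
func wrapNotFound(err error, msg string) error {
	if errors.Is(err, sql.ErrNoRows) {
		return fmt.Errorf("%s: %w", msg, errNotFound)
	}
	return err
}

func main() {
	err := wrapNotFound(sql.ErrNoRows, "organization with id: 42 does not exist")
	if errors.Is(err, errNotFound) {
		fmt.Println("render 404:", err) // caller can now branch on the domain error
	}
}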

View File

@@ -22,13 +22,10 @@ type Module interface {
Update(context.Context, *types.Organization) error
}
type API interface {
type Handler interface {
// Get gets the organization based on the id in claims
Get(http.ResponseWriter, *http.Request)
// GetAll gets all the organizations
GetAll(http.ResponseWriter, *http.Request)
// Update updates the organization based on the id in claims
Update(http.ResponseWriter, *http.Request)
}
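The Handler interface above exposes plain http.HandlerFunc-shaped methods, so it can be mounted on any router. A hedged sketch of wiring such a handler onto gorilla/mux (which the codebase already imports); the paths and the dummy type are placeholders, not the actual SigNoz routes:

package main

import (
	"net/http"

	"github.com/gorilla/mux"
)

// dummy satisfies the Get/Update shape of the Handler interface for illustration only.
type dummy struct{}

func (dummy) Get(rw http.ResponseWriter, r *http.Request)    { rw.WriteHeader(http.StatusOK) }
func (dummy) Update(rw http.ResponseWriter, r *http.Request) { rw.WriteHeader(http.StatusNoContent) }

func main() {
	h := dummy{}
	router := mux.NewRouter()
	// Placeholder paths; the real routes are registered elsewhere in the codebase.
	router.HandleFunc("/api/v1/organizations/me", h.Get).Methods(http.MethodGet)
	router.HandleFunc("/api/v1/organizations/me", h.Update).Methods(http.MethodPut)
	_ = http.ListenAndServe(":8080", router)
}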

View File

@@ -1,147 +0,0 @@
package preference
import (
"encoding/json"
"net/http"
"github.com/SigNoz/signoz/pkg/http/render"
"github.com/SigNoz/signoz/pkg/types/authtypes"
"github.com/SigNoz/signoz/pkg/types/preferencetypes"
"github.com/gorilla/mux"
)
type API interface {
GetOrgPreference(http.ResponseWriter, *http.Request)
UpdateOrgPreference(http.ResponseWriter, *http.Request)
GetAllOrgPreferences(http.ResponseWriter, *http.Request)
GetUserPreference(http.ResponseWriter, *http.Request)
UpdateUserPreference(http.ResponseWriter, *http.Request)
GetAllUserPreferences(http.ResponseWriter, *http.Request)
}
type preferenceAPI struct {
usecase Usecase
}
func NewAPI(usecase Usecase) API {
return &preferenceAPI{usecase: usecase}
}
func (p *preferenceAPI) GetOrgPreference(rw http.ResponseWriter, r *http.Request) {
preferenceId := mux.Vars(r)["preferenceId"]
claims, err := authtypes.ClaimsFromContext(r.Context())
if err != nil {
render.Error(rw, err)
return
}
preference, err := p.usecase.GetOrgPreference(
r.Context(), preferenceId, claims.OrgID,
)
if err != nil {
render.Error(rw, err)
return
}
render.Success(rw, http.StatusOK, preference)
}
func (p *preferenceAPI) UpdateOrgPreference(rw http.ResponseWriter, r *http.Request) {
preferenceId := mux.Vars(r)["preferenceId"]
req := preferencetypes.UpdatablePreference{}
claims, err := authtypes.ClaimsFromContext(r.Context())
if err != nil {
render.Error(rw, err)
return
}
err = json.NewDecoder(r.Body).Decode(&req)
if err != nil {
render.Error(rw, err)
return
}
err = p.usecase.UpdateOrgPreference(r.Context(), preferenceId, req.PreferenceValue, claims.OrgID)
if err != nil {
render.Error(rw, err)
return
}
render.Success(rw, http.StatusNoContent, nil)
}
func (p *preferenceAPI) GetAllOrgPreferences(rw http.ResponseWriter, r *http.Request) {
claims, err := authtypes.ClaimsFromContext(r.Context())
if err != nil {
render.Error(rw, err)
return
}
preferences, err := p.usecase.GetAllOrgPreferences(
r.Context(), claims.OrgID,
)
if err != nil {
render.Error(rw, err)
return
}
render.Success(rw, http.StatusOK, preferences)
}
func (p *preferenceAPI) GetUserPreference(rw http.ResponseWriter, r *http.Request) {
preferenceId := mux.Vars(r)["preferenceId"]
claims, err := authtypes.ClaimsFromContext(r.Context())
if err != nil {
render.Error(rw, err)
return
}
preference, err := p.usecase.GetUserPreference(
r.Context(), preferenceId, claims.OrgID, claims.UserID,
)
if err != nil {
render.Error(rw, err)
return
}
render.Success(rw, http.StatusOK, preference)
}
func (p *preferenceAPI) UpdateUserPreference(rw http.ResponseWriter, r *http.Request) {
preferenceId := mux.Vars(r)["preferenceId"]
claims, err := authtypes.ClaimsFromContext(r.Context())
if err != nil {
render.Error(rw, err)
return
}
req := preferencetypes.UpdatablePreference{}
err = json.NewDecoder(r.Body).Decode(&req)
if err != nil {
render.Error(rw, err)
return
}
err = p.usecase.UpdateUserPreference(r.Context(), preferenceId, req.PreferenceValue, claims.UserID)
if err != nil {
render.Error(rw, err)
return
}
render.Success(rw, http.StatusNoContent, nil)
}
func (p *preferenceAPI) GetAllUserPreferences(rw http.ResponseWriter, r *http.Request) {
claims, err := authtypes.ClaimsFromContext(r.Context())
if err != nil {
render.Error(rw, err)
return
}
preferences, err := p.usecase.GetAllUserPreferences(
r.Context(), claims.OrgID, claims.UserID,
)
if err != nil {
render.Error(rw, err)
return
}
render.Success(rw, http.StatusOK, preferences)
}

View File

@@ -0,0 +1,176 @@
package implpreference
import (
"context"
"encoding/json"
"net/http"
"time"
"github.com/SigNoz/signoz/pkg/errors"
"github.com/SigNoz/signoz/pkg/http/render"
"github.com/SigNoz/signoz/pkg/modules/preference"
"github.com/SigNoz/signoz/pkg/types/authtypes"
"github.com/SigNoz/signoz/pkg/types/preferencetypes"
"github.com/gorilla/mux"
)
type handler struct {
module preference.Module
}
func NewHandler(module preference.Module) preference.Handler {
return &handler{module: module}
}
func (handler *handler) GetOrg(rw http.ResponseWriter, r *http.Request) {
ctx, cancel := context.WithTimeout(r.Context(), 10*time.Second)
defer cancel()
claims, err := authtypes.ClaimsFromContext(ctx)
if err != nil {
render.Error(rw, err)
return
}
id, ok := mux.Vars(r)["preferenceId"]
if !ok {
render.Error(rw, errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, "id is required"))
return
}
preference, err := handler.module.GetOrg(ctx, id, claims.OrgID)
if err != nil {
render.Error(rw, err)
return
}
render.Success(rw, http.StatusOK, preference)
}
func (handler *handler) UpdateOrg(rw http.ResponseWriter, r *http.Request) {
ctx, cancel := context.WithTimeout(r.Context(), 10*time.Second)
defer cancel()
claims, err := authtypes.ClaimsFromContext(ctx)
if err != nil {
render.Error(rw, err)
return
}
id, ok := mux.Vars(r)["preferenceId"]
if !ok {
render.Error(rw, errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, "id is required"))
return
}
req := new(preferencetypes.UpdatablePreference)
err = json.NewDecoder(r.Body).Decode(req)
if err != nil {
render.Error(rw, err)
return
}
err = handler.module.UpdateOrg(ctx, id, req.PreferenceValue, claims.OrgID)
if err != nil {
render.Error(rw, err)
return
}
render.Success(rw, http.StatusNoContent, nil)
}
func (handler *handler) GetAllOrg(rw http.ResponseWriter, r *http.Request) {
ctx, cancel := context.WithTimeout(r.Context(), 10*time.Second)
defer cancel()
claims, err := authtypes.ClaimsFromContext(ctx)
if err != nil {
render.Error(rw, err)
return
}
preferences, err := handler.module.GetAllOrg(ctx, claims.OrgID)
if err != nil {
render.Error(rw, err)
return
}
render.Success(rw, http.StatusOK, preferences)
}
func (handler *handler) GetUser(rw http.ResponseWriter, r *http.Request) {
ctx, cancel := context.WithTimeout(r.Context(), 10*time.Second)
defer cancel()
claims, err := authtypes.ClaimsFromContext(ctx)
if err != nil {
render.Error(rw, err)
return
}
id, ok := mux.Vars(r)["preferenceId"]
if !ok {
render.Error(rw, errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, "id is required"))
return
}
preference, err := handler.module.GetUser(ctx, id, claims.OrgID, claims.UserID)
if err != nil {
render.Error(rw, err)
return
}
render.Success(rw, http.StatusOK, preference)
}
func (handler *handler) UpdateUser(rw http.ResponseWriter, r *http.Request) {
ctx, cancel := context.WithTimeout(r.Context(), 10*time.Second)
defer cancel()
claims, err := authtypes.ClaimsFromContext(ctx)
if err != nil {
render.Error(rw, err)
return
}
id, ok := mux.Vars(r)["preferenceId"]
if !ok {
render.Error(rw, errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, "id is required"))
return
}
req := new(preferencetypes.UpdatablePreference)
err = json.NewDecoder(r.Body).Decode(req)
if err != nil {
render.Error(rw, err)
return
}
err = handler.module.UpdateUser(ctx, id, req.PreferenceValue, claims.UserID)
if err != nil {
render.Error(rw, err)
return
}
render.Success(rw, http.StatusNoContent, nil)
}
func (handler *handler) GetAllUser(rw http.ResponseWriter, r *http.Request) {
ctx, cancel := context.WithTimeout(r.Context(), 10*time.Second)
defer cancel()
claims, err := authtypes.ClaimsFromContext(ctx)
if err != nil {
render.Error(rw, err)
return
}
preferences, err := handler.module.GetAllUser(ctx, claims.OrgID, claims.UserID)
if err != nil {
render.Error(rw, err)
return
}
render.Success(rw, http.StatusOK, preferences)
}

View File

@@ -1,10 +1,9 @@
package core
package implpreference
import (
"context"
"database/sql"
"encoding/json"
"fmt"
"github.com/SigNoz/signoz/pkg/errors"
"github.com/SigNoz/signoz/pkg/modules/preference"
@@ -12,27 +11,28 @@ import (
"github.com/SigNoz/signoz/pkg/valuer"
)
type usecase struct {
store preferencetypes.PreferenceStore
// Do not take inspiration from this code, it is a work in progress. See Organization module for a better implementation.
type module struct {
store preferencetypes.Store
defaultMap map[string]preferencetypes.Preference
}
func NewPreference(store preferencetypes.PreferenceStore, defaultMap map[string]preferencetypes.Preference) preference.Usecase {
return &usecase{store: store, defaultMap: defaultMap}
func NewModule(store preferencetypes.Store, defaultMap map[string]preferencetypes.Preference) preference.Module {
return &module{store: store, defaultMap: defaultMap}
}
func (usecase *usecase) GetOrgPreference(ctx context.Context, preferenceID string, orgID string) (*preferencetypes.GettablePreference, error) {
preference, seen := usecase.defaultMap[preferenceID]
func (module *module) GetOrg(ctx context.Context, preferenceID string, orgID string) (*preferencetypes.GettablePreference, error) {
preference, seen := module.defaultMap[preferenceID]
if !seen {
return nil, errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, fmt.Sprintf("no such preferenceID exists: %s", preferenceID))
return nil, errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, "cannot find preference with id: %s", preferenceID)
}
isPreferenceEnabled := preference.IsEnabledForScope(preferencetypes.OrgAllowedScope)
if !isPreferenceEnabled {
return nil, errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, fmt.Sprintf("preference is not enabled at org scope: %s", preferenceID))
isEnabled := preference.IsEnabledForScope(preferencetypes.OrgAllowedScope)
if !isEnabled {
return nil, errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, "preference is not enabled at org scope: %s", preferenceID)
}
orgPreference, err := usecase.store.GetOrgPreference(ctx, orgID, preferenceID)
org, err := module.store.GetOrg(ctx, orgID, preferenceID)
if err != nil {
if err == sql.ErrNoRows {
return &preferencetypes.GettablePreference{
@@ -40,24 +40,24 @@ func (usecase *usecase) GetOrgPreference(ctx context.Context, preferenceID strin
PreferenceValue: preference.DefaultValue,
}, nil
}
return nil, errors.Wrapf(err, errors.TypeInternal, errors.CodeInternal, fmt.Sprintf("error in fetching the org preference: %s", preferenceID))
return nil, errors.Wrapf(err, errors.TypeInternal, errors.CodeInternal, "error in fetching the org preference: %s", preferenceID)
}
return &preferencetypes.GettablePreference{
PreferenceID: preferenceID,
PreferenceValue: preference.SanitizeValue(orgPreference.PreferenceValue),
PreferenceValue: preference.SanitizeValue(org.PreferenceValue),
}, nil
}
func (usecase *usecase) UpdateOrgPreference(ctx context.Context, preferenceID string, preferenceValue interface{}, orgID string) error {
preference, seen := usecase.defaultMap[preferenceID]
func (module *module) UpdateOrg(ctx context.Context, preferenceID string, preferenceValue interface{}, orgID string) error {
preference, seen := module.defaultMap[preferenceID]
if !seen {
return errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, fmt.Sprintf("no such preferenceID exists: %s", preferenceID))
return errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, "cannot find preference with id: %s", preferenceID)
}
isPreferenceEnabled := preference.IsEnabledForScope(preferencetypes.OrgAllowedScope)
if !isPreferenceEnabled {
return errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, fmt.Sprintf("preference is not enabled at org scope: %s", preferenceID))
isEnabled := preference.IsEnabledForScope(preferencetypes.OrgAllowedScope)
if !isEnabled {
return errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, "preference is not enabled at org scope: %s", preferenceID)
}
err := preference.IsValidValue(preferenceValue)
@@ -65,26 +65,26 @@ func (usecase *usecase) UpdateOrgPreference(ctx context.Context, preferenceID st
return err
}
storablePreferenceValue, encodeErr := json.Marshal(preferenceValue)
storableValue, encodeErr := json.Marshal(preferenceValue)
if encodeErr != nil {
return errors.Wrapf(encodeErr, errors.TypeInvalidInput, errors.CodeInvalidInput, "error in encoding the preference value")
}
orgPreference, dberr := usecase.store.GetOrgPreference(ctx, orgID, preferenceID)
org, dberr := module.store.GetOrg(ctx, orgID, preferenceID)
if dberr != nil && dberr != sql.ErrNoRows {
return errors.Wrapf(dberr, errors.TypeInternal, errors.CodeInternal, "error in getting the preference value")
}
if dberr != nil {
orgPreference.ID = valuer.GenerateUUID()
orgPreference.PreferenceID = preferenceID
orgPreference.PreferenceValue = string(storablePreferenceValue)
orgPreference.OrgID = orgID
org.ID = valuer.GenerateUUID()
org.PreferenceID = preferenceID
org.PreferenceValue = string(storableValue)
org.OrgID = orgID
} else {
orgPreference.PreferenceValue = string(storablePreferenceValue)
org.PreferenceValue = string(storableValue)
}
dberr = usecase.store.UpsertOrgPreference(ctx, orgPreference)
dberr = module.store.UpsertOrg(ctx, org)
if dberr != nil {
return errors.Wrapf(dberr, errors.TypeInternal, errors.CodeInternal, "error in setting the preference value")
}
@@ -92,19 +92,19 @@ func (usecase *usecase) UpdateOrgPreference(ctx context.Context, preferenceID st
return nil
}
func (usecase *usecase) GetAllOrgPreferences(ctx context.Context, orgID string) ([]*preferencetypes.PreferenceWithValue, error) {
allOrgPreferences := []*preferencetypes.PreferenceWithValue{}
orgPreferences, err := usecase.store.GetAllOrgPreferences(ctx, orgID)
func (module *module) GetAllOrg(ctx context.Context, orgID string) ([]*preferencetypes.PreferenceWithValue, error) {
allOrgs := []*preferencetypes.PreferenceWithValue{}
orgs, err := module.store.GetAllOrg(ctx, orgID)
if err != nil {
return nil, errors.Wrapf(err, errors.TypeInternal, errors.CodeInternal, "error in setting all org preference values")
}
preferenceValueMap := map[string]interface{}{}
for _, preferenceValue := range orgPreferences {
for _, preferenceValue := range orgs {
preferenceValueMap[preferenceValue.PreferenceID] = preferenceValue.PreferenceValue
}
for _, preference := range usecase.defaultMap {
for _, preference := range module.defaultMap {
isEnabledForOrgScope := preference.IsEnabledForScope(preferencetypes.OrgAllowedScope)
if isEnabledForOrgScope {
preferenceWithValue := &preferencetypes.PreferenceWithValue{}
@@ -126,16 +126,16 @@ func (usecase *usecase) GetAllOrgPreferences(ctx context.Context, orgID string)
}
preferenceWithValue.Value = preference.SanitizeValue(preferenceWithValue.Value)
allOrgPreferences = append(allOrgPreferences, preferenceWithValue)
allOrgs = append(allOrgs, preferenceWithValue)
}
}
return allOrgPreferences, nil
return allOrgs, nil
}
func (usecase *usecase) GetUserPreference(ctx context.Context, preferenceID string, orgID string, userID string) (*preferencetypes.GettablePreference, error) {
preference, seen := usecase.defaultMap[preferenceID]
func (module *module) GetUser(ctx context.Context, preferenceID string, orgID string, userID string) (*preferencetypes.GettablePreference, error) {
preference, seen := module.defaultMap[preferenceID]
if !seen {
return nil, errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, fmt.Sprintf("no such preferenceID exists: %s", preferenceID))
return nil, errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, "cannot find preference with id: %s", preferenceID)
}
preferenceValue := preferencetypes.GettablePreference{
@@ -143,29 +143,29 @@ func (usecase *usecase) GetUserPreference(ctx context.Context, preferenceID stri
PreferenceValue: preference.DefaultValue,
}
isPreferenceEnabledAtUserScope := preference.IsEnabledForScope(preferencetypes.UserAllowedScope)
if !isPreferenceEnabledAtUserScope {
return nil, errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, fmt.Sprintf("preference is not enabled at user scope: %s", preferenceID))
isEnabledAtUserScope := preference.IsEnabledForScope(preferencetypes.UserAllowedScope)
if !isEnabledAtUserScope {
return nil, errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, "preference is not enabled at user scope: %s", preferenceID)
}
isPreferenceEnabledAtOrgScope := preference.IsEnabledForScope(preferencetypes.OrgAllowedScope)
if isPreferenceEnabledAtOrgScope {
orgPreference, err := usecase.store.GetOrgPreference(ctx, orgID, preferenceID)
isEnabledAtOrgScope := preference.IsEnabledForScope(preferencetypes.OrgAllowedScope)
if isEnabledAtOrgScope {
org, err := module.store.GetOrg(ctx, orgID, preferenceID)
if err != nil && err != sql.ErrNoRows {
return nil, errors.Wrapf(err, errors.TypeInternal, errors.CodeInternal, fmt.Sprintf("error in fetching the org preference: %s", preferenceID))
return nil, errors.Wrapf(err, errors.TypeInternal, errors.CodeInternal, "error in fetching the org preference: %s", preferenceID)
}
if err == nil {
preferenceValue.PreferenceValue = orgPreference.PreferenceValue
preferenceValue.PreferenceValue = org.PreferenceValue
}
}
userPreference, err := usecase.store.GetUserPreference(ctx, userID, preferenceID)
user, err := module.store.GetUser(ctx, userID, preferenceID)
if err != nil && err != sql.ErrNoRows {
return nil, errors.Wrapf(err, errors.TypeInternal, errors.CodeInternal, fmt.Sprintf("error in fetching the user preference: %s", preferenceID))
return nil, errors.Wrapf(err, errors.TypeInternal, errors.CodeInternal, "error in fetching the user preference: %s", preferenceID)
}
if err == nil {
preferenceValue.PreferenceValue = userPreference.PreferenceValue
preferenceValue.PreferenceValue = user.PreferenceValue
}
return &preferencetypes.GettablePreference{
@@ -174,15 +174,15 @@ func (usecase *usecase) GetUserPreference(ctx context.Context, preferenceID stri
}, nil
}
func (usecase *usecase) UpdateUserPreference(ctx context.Context, preferenceID string, preferenceValue interface{}, userID string) error {
preference, seen := usecase.defaultMap[preferenceID]
func (module *module) UpdateUser(ctx context.Context, preferenceID string, preferenceValue interface{}, userID string) error {
preference, seen := module.defaultMap[preferenceID]
if !seen {
return errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, fmt.Sprintf("no such preferenceID exists: %s", preferenceID))
return errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, "cannot find preference with id: %s", preferenceID)
}
isPreferenceEnabledAtUserScope := preference.IsEnabledForScope(preferencetypes.UserAllowedScope)
if !isPreferenceEnabledAtUserScope {
return errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, fmt.Sprintf("preference is not enabled at user scope: %s", preferenceID))
isEnabledAtUserScope := preference.IsEnabledForScope(preferencetypes.UserAllowedScope)
if !isEnabledAtUserScope {
return errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, "preference is not enabled at user scope: %s", preferenceID)
}
err := preference.IsValidValue(preferenceValue)
@@ -190,26 +190,26 @@ func (usecase *usecase) UpdateUserPreference(ctx context.Context, preferenceID s
return err
}
storablePreferenceValue, encodeErr := json.Marshal(preferenceValue)
storableValue, encodeErr := json.Marshal(preferenceValue)
if encodeErr != nil {
return errors.Wrapf(encodeErr, errors.TypeInvalidInput, errors.CodeInvalidInput, "error in encoding the preference value")
}
userPreference, dberr := usecase.store.GetUserPreference(ctx, userID, preferenceID)
user, dberr := module.store.GetUser(ctx, userID, preferenceID)
if dberr != nil && dberr != sql.ErrNoRows {
return errors.Wrapf(dberr, errors.TypeInternal, errors.CodeInternal, "error in getting the preference value")
}
if dberr != nil {
userPreference.ID = valuer.GenerateUUID()
userPreference.PreferenceID = preferenceID
userPreference.PreferenceValue = string(storablePreferenceValue)
userPreference.UserID = userID
user.ID = valuer.GenerateUUID()
user.PreferenceID = preferenceID
user.PreferenceValue = string(storableValue)
user.UserID = userID
} else {
userPreference.PreferenceValue = string(storablePreferenceValue)
user.PreferenceValue = string(storableValue)
}
dberr = usecase.store.UpsertUserPreference(ctx, userPreference)
dberr = module.store.UpsertUser(ctx, user)
if dberr != nil {
return errors.Wrapf(dberr, errors.TypeInternal, errors.CodeInternal, "error in setting the preference value")
}
@@ -217,30 +217,30 @@ func (usecase *usecase) UpdateUserPreference(ctx context.Context, preferenceID s
return nil
}
func (usecase *usecase) GetAllUserPreferences(ctx context.Context, orgID string, userID string) ([]*preferencetypes.PreferenceWithValue, error) {
allUserPreferences := []*preferencetypes.PreferenceWithValue{}
func (module *module) GetAllUser(ctx context.Context, orgID string, userID string) ([]*preferencetypes.PreferenceWithValue, error) {
allUsers := []*preferencetypes.PreferenceWithValue{}
orgPreferences, err := usecase.store.GetAllOrgPreferences(ctx, orgID)
orgs, err := module.store.GetAllOrg(ctx, orgID)
if err != nil {
return nil, errors.Wrapf(err, errors.TypeInternal, errors.CodeInternal, "error in setting all org preference values")
}
preferenceOrgValueMap := map[string]interface{}{}
for _, preferenceValue := range orgPreferences {
for _, preferenceValue := range orgs {
preferenceOrgValueMap[preferenceValue.PreferenceID] = preferenceValue.PreferenceValue
}
userPreferences, err := usecase.store.GetAllUserPreferences(ctx, userID)
users, err := module.store.GetAllUser(ctx, userID)
if err != nil {
return nil, errors.Wrapf(err, errors.TypeInternal, errors.CodeInternal, "error in setting all user preference values")
}
preferenceUserValueMap := map[string]interface{}{}
for _, preferenceValue := range userPreferences {
for _, preferenceValue := range users {
preferenceUserValueMap[preferenceValue.PreferenceID] = preferenceValue.PreferenceValue
}
for _, preference := range usecase.defaultMap {
for _, preference := range module.defaultMap {
isEnabledForUserScope := preference.IsEnabledForScope(preferencetypes.UserAllowedScope)
if isEnabledForUserScope {
@@ -271,8 +271,8 @@ func (usecase *usecase) GetAllUserPreferences(ctx context.Context, orgID string,
}
preferenceWithValue.Value = preference.SanitizeValue(preferenceWithValue.Value)
allUserPreferences = append(allUserPreferences, preferenceWithValue)
allUsers = append(allUsers, preferenceWithValue)
}
}
return allUserPreferences, nil
return allUsers, nil
}
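The module above resolves a preference by starting from the default map and overriding it with any stored value, treating sql.ErrNoRows as "use the default" rather than as a failure. A self-contained miniature of that fallback logic (names are illustrative, not the SigNoz types):

package main

import (
	"database/sql"
	"errors"
	"fmt"
)

// resolve returns the stored value when one exists and the default otherwise;
// only unexpected errors are propagated.
func resolve(defaultValue string, lookup func() (string, error)) (string, error) {
	stored, err := lookup()
	if err != nil {
		if errors.Is(err, sql.ErrNoRows) {
			return defaultValue, nil
		}
		return "", err
	}
	return stored, nil
}

func main() {
	missing := func() (string, error) { return "", sql.ErrNoRows }
	present := func() (string, error) { return "dark", nil }

	v1, _ := resolve("light", missing)
	v2, _ := resolve("light", present)
	fmt.Println(v1, v2) // light dark
}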

View File

@@ -1,4 +1,4 @@
package core
package implpreference
import (
"context"
@@ -11,11 +11,11 @@ type store struct {
store sqlstore.SQLStore
}
func NewStore(db sqlstore.SQLStore) preferencetypes.PreferenceStore {
func NewStore(db sqlstore.SQLStore) preferencetypes.Store {
return &store{store: db}
}
func (store *store) GetOrgPreference(ctx context.Context, orgID string, preferenceID string) (*preferencetypes.StorableOrgPreference, error) {
func (store *store) GetOrg(ctx context.Context, orgID string, preferenceID string) (*preferencetypes.StorableOrgPreference, error) {
orgPreference := new(preferencetypes.StorableOrgPreference)
err := store.
store.
@@ -33,7 +33,7 @@ func (store *store) GetOrgPreference(ctx context.Context, orgID string, preferen
return orgPreference, nil
}
func (store *store) GetAllOrgPreferences(ctx context.Context, orgID string) ([]*preferencetypes.StorableOrgPreference, error) {
func (store *store) GetAllOrg(ctx context.Context, orgID string) ([]*preferencetypes.StorableOrgPreference, error) {
orgPreferences := make([]*preferencetypes.StorableOrgPreference, 0)
err := store.
store.
@@ -50,7 +50,7 @@ func (store *store) GetAllOrgPreferences(ctx context.Context, orgID string) ([]*
return orgPreferences, nil
}
func (store *store) UpsertOrgPreference(ctx context.Context, orgPreference *preferencetypes.StorableOrgPreference) error {
func (store *store) UpsertOrg(ctx context.Context, orgPreference *preferencetypes.StorableOrgPreference) error {
_, err := store.
store.
BunDB().
@@ -65,7 +65,7 @@ func (store *store) UpsertOrgPreference(ctx context.Context, orgPreference *pref
return nil
}
func (store *store) GetUserPreference(ctx context.Context, userID string, preferenceID string) (*preferencetypes.StorableUserPreference, error) {
func (store *store) GetUser(ctx context.Context, userID string, preferenceID string) (*preferencetypes.StorableUserPreference, error) {
userPreference := new(preferencetypes.StorableUserPreference)
err := store.
store.
@@ -83,7 +83,7 @@ func (store *store) GetUserPreference(ctx context.Context, userID string, prefer
return userPreference, nil
}
func (store *store) GetAllUserPreferences(ctx context.Context, userID string) ([]*preferencetypes.StorableUserPreference, error) {
func (store *store) GetAllUser(ctx context.Context, userID string) ([]*preferencetypes.StorableUserPreference, error) {
userPreferences := make([]*preferencetypes.StorableUserPreference, 0)
err := store.
store.
@@ -100,7 +100,7 @@ func (store *store) GetAllUserPreferences(ctx context.Context, userID string) ([
return userPreferences, nil
}
func (store *store) UpsertUserPreference(ctx context.Context, userPreference *preferencetypes.StorableUserPreference) error {
func (store *store) UpsertUser(ctx context.Context, userPreference *preferencetypes.StorableUserPreference) error {
_, err := store.
store.
BunDB().

View File

@@ -0,0 +1,48 @@
package preference
import (
"context"
"net/http"
"github.com/SigNoz/signoz/pkg/types/preferencetypes"
)
type Module interface {
// Returns the preference for the given organization
GetOrg(ctx context.Context, preferenceId string, orgId string) (*preferencetypes.GettablePreference, error)
// Returns the preference for the given user
GetUser(ctx context.Context, preferenceId string, orgId string, userId string) (*preferencetypes.GettablePreference, error)
// Returns all preferences for the given organization
GetAllOrg(ctx context.Context, orgId string) ([]*preferencetypes.PreferenceWithValue, error)
// Returns all preferences for the given user
GetAllUser(ctx context.Context, orgId string, userId string) ([]*preferencetypes.PreferenceWithValue, error)
// Updates the preference for the given organization
UpdateOrg(ctx context.Context, preferenceId string, preferenceValue interface{}, orgId string) error
// Updates the preference for the given user
UpdateUser(ctx context.Context, preferenceId string, preferenceValue interface{}, userId string) error
}
type Handler interface {
// Returns the preference for the given organization
GetOrg(http.ResponseWriter, *http.Request)
// Updates the preference for the given organization
UpdateOrg(http.ResponseWriter, *http.Request)
// Returns all preferences for the given organization
GetAllOrg(http.ResponseWriter, *http.Request)
// Returns the preference for the given user
GetUser(http.ResponseWriter, *http.Request)
// Updates the preference for the given user
UpdateUser(http.ResponseWriter, *http.Request)
// Returns all preferences for the given user
GetAllUser(http.ResponseWriter, *http.Request)
}

View File

@@ -1,16 +0,0 @@
package preference
import (
"context"
"github.com/SigNoz/signoz/pkg/types/preferencetypes"
)
type Usecase interface {
GetOrgPreference(ctx context.Context, preferenceId string, orgId string) (*preferencetypes.GettablePreference, error)
UpdateOrgPreference(ctx context.Context, preferenceId string, preferenceValue interface{}, orgId string) error
GetAllOrgPreferences(ctx context.Context, orgId string) ([]*preferencetypes.PreferenceWithValue, error)
GetUserPreference(ctx context.Context, preferenceId string, orgId string, userId string) (*preferencetypes.GettablePreference, error)
UpdateUserPreference(ctx context.Context, preferenceId string, preferenceValue interface{}, userId string) error
GetAllUserPreferences(ctx context.Context, orgId string, userId string) ([]*preferencetypes.PreferenceWithValue, error)
}

View File

@@ -0,0 +1,533 @@
package impltracefunnel
import (
"encoding/json"
"net/http"
"time"
"github.com/SigNoz/signoz/pkg/errors"
"github.com/SigNoz/signoz/pkg/http/render"
"github.com/SigNoz/signoz/pkg/modules/tracefunnel"
"github.com/SigNoz/signoz/pkg/types/authtypes"
tf "github.com/SigNoz/signoz/pkg/types/tracefunnel"
"github.com/SigNoz/signoz/pkg/valuer"
"github.com/gorilla/mux"
)
type handler struct {
module tracefunnel.Module
}
func NewHandler(module tracefunnel.Module) tracefunnel.Handler {
return &handler{module: module}
}
func (handler *handler) New(rw http.ResponseWriter, r *http.Request) {
var req tf.FunnelRequest
if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
render.Error(rw, err)
return
}
claims, err := authtypes.ClaimsFromContext(r.Context())
if err != nil {
render.Error(rw, err)
return
}
userID := claims.UserID
orgID := claims.OrgID
funnels, err := handler.module.List(r.Context(), orgID)
if err != nil {
render.Error(rw, err)
return
}
for _, f := range funnels {
if f.Name == req.Name {
render.Error(rw, errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, "a funnel with name '%s' already exists in this organization", req.Name))
return
}
}
funnel, err := handler.module.Create(r.Context(), req.Timestamp, req.Name, userID, orgID)
if err != nil {
render.Error(rw, errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, "failed to create funnel"))
return
}
response := tf.FunnelResponse{
FunnelID: funnel.ID.String(),
FunnelName: funnel.Name,
CreatedAt: req.Timestamp,
UserEmail: claims.Email,
OrgID: orgID,
}
render.Success(rw, http.StatusOK, response)
}
func (handler *handler) UpdateSteps(rw http.ResponseWriter, r *http.Request) {
var req tf.FunnelRequest
if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
render.Error(rw, err)
return
}
claims, err := authtypes.ClaimsFromContext(r.Context())
if err != nil {
render.Error(rw, err)
return
}
userID := claims.UserID
orgID := claims.OrgID
if err := tracefunnel.ValidateTimestamp(req.Timestamp, "timestamp"); err != nil {
render.Error(rw, errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, "timestamp is invalid: %v", err))
return
}
funnel, err := handler.module.Get(r.Context(), req.FunnelID.String())
if err != nil {
render.Error(rw, errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, "funnel not found: %v", err))
return
}
// Check if name is being updated and if it already exists
if req.Name != "" && req.Name != funnel.Name {
funnels, err := handler.module.List(r.Context(), orgID)
if err != nil {
render.Error(rw, errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, "failed to list funnels: %v", err))
return
}
for _, f := range funnels {
if f.Name == req.Name {
render.Error(rw, errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, "a funnel with name '%s' already exists in this organization", req.Name))
return
}
}
}
// Process each step in the request
for i := range req.Steps {
if req.Steps[i].Order < 1 {
req.Steps[i].Order = int64(i + 1) // Default to sequential ordering if not specified
}
// Generate a new UUID for the step if it doesn't have one
if req.Steps[i].Id.IsZero() {
newUUID := valuer.GenerateUUID()
req.Steps[i].Id = newUUID
}
}
if err := tracefunnel.ValidateFunnelSteps(req.Steps); err != nil {
render.Error(rw, errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, "invalid funnel steps: %v", err))
return
}
// Normalize step orders
req.Steps = tracefunnel.NormalizeFunnelSteps(req.Steps)
// Update the funnel with the new steps
funnel.Steps = req.Steps
funnel.UpdatedAt = time.Unix(0, req.Timestamp*1000000) // Convert milliseconds to nanoseconds
funnel.UpdatedBy = userID
if req.Name != "" {
funnel.Name = req.Name
}
if req.Description != "" {
funnel.Description = req.Description
}
// Update the funnel in the database
err = handler.module.Update(r.Context(), funnel, userID)
if err != nil {
render.Error(rw, errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, "failed to update funnel in database: %v", err))
return
}
//// UpdateSteps name and description if provided
//if req.Name != "" || req.Description != "" {
// name := req.Name
//
// description := req.Description
//
// err = handler.module.UpdateMetadata(r.Context(), funnel.ID, name, description, userID)
// if err != nil {
// render.Error(rw, errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, "failed to update funnel metadata: %v", err))
// return
// }
//}
// Get the updated funnel to return in response
updatedFunnel, err := handler.module.Get(r.Context(), funnel.ID.String())
if err != nil {
render.Error(rw, errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, "failed to get updated funnel: %v", err))
return
}
response := tf.FunnelResponse{
FunnelName: updatedFunnel.Name,
FunnelID: updatedFunnel.ID.String(),
Steps: updatedFunnel.Steps,
CreatedAt: updatedFunnel.CreatedAt.UnixNano() / 1000000,
CreatedBy: updatedFunnel.CreatedBy,
OrgID: updatedFunnel.OrgID.String(),
UpdatedBy: userID,
UpdatedAt: updatedFunnel.UpdatedAt.UnixNano() / 1000000,
Description: updatedFunnel.Description,
}
render.Success(rw, http.StatusOK, response)
}
func (handler *handler) UpdateFunnel(rw http.ResponseWriter, r *http.Request) {
var req tf.FunnelRequest
if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
render.Error(rw, err)
return
}
claims, err := authtypes.ClaimsFromContext(r.Context())
if err != nil {
render.Error(rw, err)
return
}
userID := claims.UserID
orgID := claims.OrgID
if err := tracefunnel.ValidateTimestamp(req.Timestamp, "timestamp"); err != nil {
render.Error(rw, errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, "timestamp is invalid: %v", err))
return
}
vars := mux.Vars(r)
funnelID := vars["funnel_id"]
funnel, err := handler.module.Get(r.Context(), funnelID)
if err != nil {
render.Error(rw, errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, "funnel not found: %v", err))
return
}
// Check if name is being updated and if it already exists
if req.Name != "" && req.Name != funnel.Name {
funnels, err := handler.module.List(r.Context(), orgID)
if err != nil {
render.Error(rw, errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, "failed to list funnels: %v", err))
return
}
for _, f := range funnels {
if f.Name == req.Name {
render.Error(rw, errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, "a funnel with name '%s' already exists in this organization", req.Name))
return
}
}
}
funnel.UpdatedAt = time.Unix(0, req.Timestamp*1000000) // Convert milliseconds to nanoseconds
funnel.UpdatedBy = userID
if req.Name != "" {
funnel.Name = req.Name
}
if req.Description != "" {
funnel.Description = req.Description
}
// Update funnel in database
err = handler.module.Update(r.Context(), funnel, userID)
if err != nil {
render.Error(rw, errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, "failed to update funnel in database: %v", err))
return
}
// Get the updated funnel to return in response
updatedFunnel, err := handler.module.Get(r.Context(), funnel.ID.String())
if err != nil {
render.Error(rw, errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, "failed to get updated funnel: %v", err))
return
}
response := tf.FunnelResponse{
FunnelName: updatedFunnel.Name,
FunnelID: updatedFunnel.ID.String(),
Steps: updatedFunnel.Steps,
CreatedAt: updatedFunnel.CreatedAt.UnixNano() / 1000000,
CreatedBy: updatedFunnel.CreatedBy,
OrgID: updatedFunnel.OrgID.String(),
UpdatedBy: userID,
UpdatedAt: updatedFunnel.UpdatedAt.UnixNano() / 1000000,
Description: updatedFunnel.Description,
}
render.Success(rw, http.StatusOK, response)
}
func (handler *handler) List(rw http.ResponseWriter, r *http.Request) {
claims, err := authtypes.ClaimsFromContext(r.Context())
if err != nil {
render.Error(rw, errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, "unauthenticated"))
return
}
orgID := claims.OrgID
funnels, err := handler.module.List(r.Context(), orgID)
if err != nil {
render.Error(rw, errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, "failed to list funnels: %v", err))
return
}
var response []tf.FunnelResponse
for _, f := range funnels {
funnelResp := tf.FunnelResponse{
FunnelName: f.Name,
FunnelID: f.ID.String(),
CreatedAt: f.CreatedAt.UnixNano() / 1000000,
CreatedBy: f.CreatedBy,
OrgID: f.OrgID.String(),
UpdatedAt: f.UpdatedAt.UnixNano() / 1000000,
UpdatedBy: f.UpdatedBy,
Description: f.Description,
}
// Get user email if available
if f.CreatedByUser != nil {
funnelResp.UserEmail = f.CreatedByUser.Email
}
response = append(response, funnelResp)
}
render.Success(rw, http.StatusOK, response)
}
func (handler *handler) Get(rw http.ResponseWriter, r *http.Request) {
vars := mux.Vars(r)
funnelID := vars["funnel_id"]
funnel, err := handler.module.Get(r.Context(), funnelID)
if err != nil {
render.Error(rw, errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, "funnel not found: %v", err))
return
}
// Create a response with all funnel details including step IDs
response := tf.FunnelResponse{
FunnelID: funnel.ID.String(),
FunnelName: funnel.Name,
Description: funnel.Description,
CreatedAt: funnel.CreatedAt.UnixNano() / 1000000,
UpdatedAt: funnel.UpdatedAt.UnixNano() / 1000000,
CreatedBy: funnel.CreatedBy,
UpdatedBy: funnel.UpdatedBy,
OrgID: funnel.OrgID.String(),
Steps: funnel.Steps,
}
// Add user email if available
if funnel.CreatedByUser != nil {
response.UserEmail = funnel.CreatedByUser.Email
}
render.Success(rw, http.StatusOK, response)
}
func (handler *handler) Delete(rw http.ResponseWriter, r *http.Request) {
vars := mux.Vars(r)
funnelID := vars["funnel_id"]
err := handler.module.Delete(r.Context(), funnelID)
if err != nil {
render.Error(rw, errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, "failed to delete funnel: %v", err))
return
}
render.Success(rw, http.StatusOK, nil)
}
func (handler *handler) Save(rw http.ResponseWriter, r *http.Request) {
var req tf.FunnelRequest
if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
render.Error(rw, errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, "invalid request: %v", err))
return
}
claims, err := authtypes.ClaimsFromContext(r.Context())
if err != nil {
render.Error(rw, errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, "unauthenticated"))
return
}
orgID := claims.OrgID
usrID := claims.UserID
funnel, err := handler.module.Get(r.Context(), req.FunnelID.String())
if err != nil {
render.Error(rw, errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, "funnel not found: %v", err))
return
}
updateTimestamp := req.Timestamp
if updateTimestamp == 0 {
updateTimestamp = time.Now().UnixMilli()
} else if !tracefunnel.ValidateTimestampIsMilliseconds(updateTimestamp) {
render.Error(rw, errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, "timestamp must be in milliseconds format (13 digits)"))
return
}
funnel.UpdatedAt = time.Unix(0, updateTimestamp*1000000) // Convert milliseconds to nanoseconds
if req.UserID != "" {
funnel.UpdatedBy = usrID
}
funnel.Description = req.Description
if err := handler.module.Save(r.Context(), funnel, funnel.UpdatedBy, orgID); err != nil {
render.Error(rw, errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, "failed to save funnel: %v", err))
return
}
// Try to fetch metadata from DB
createdAt, updatedAt, extraDataFromDB, err := handler.module.GetFunnelMetadata(r.Context(), funnel.ID.String())
if err != nil {
render.Error(rw, errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, "failed to get funnel metadata: %v", err))
return
}
resp := tf.FunnelResponse{
FunnelName: funnel.Name,
CreatedAt: createdAt,
UpdatedAt: updatedAt,
CreatedBy: funnel.CreatedBy,
UpdatedBy: funnel.UpdatedBy,
OrgID: funnel.OrgID.String(),
Description: extraDataFromDB,
}
render.Success(rw, http.StatusOK, resp)
}
//func (handler *handler) ValidateTraces(rw http.ResponseWriter, r *http.Request) {
// vars := mux.Vars(r)
// funnelID := vars["funnel_id"]
//
// funnel, err := handler.module.Get(r.Context(), funnelID)
// if err != nil {
// render.Error(rw, errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, "funnel not found: %v", err))
// return
// }
//
// var timeRange tf.TimeRange
// if err := json.NewDecoder(r.Body).Decode(&timeRange); err != nil {
// render.Error(rw, errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, "error decoding time range: %v", err))
// return
// }
//
// response, err := handler.module.ValidateTraces(r.Context(), funnel, timeRange)
// if err != nil {
// render.Error(rw, errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, "error validating traces: %v", err))
// return
// }
//
// render.Success(rw, http.StatusOK, response)
//}
//
//func (handler *handler) FunnelAnalytics(rw http.ResponseWriter, r *http.Request) {
// vars := mux.Vars(r)
// funnelID := vars["funnel_id"]
//
// funnel, err := handler.module.Get(r.Context(), funnelID)
// if err != nil {
// render.Error(rw, errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, "funnel not found: %v", err))
// return
// }
//
// var timeRange tf.TimeRange
// if err := json.NewDecoder(r.Body).Decode(&timeRange); err != nil {
// render.Error(rw, errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, "error decoding time range: %v", err))
// return
// }
//
// response, err := handler.module.GetFunnelAnalytics(r.Context(), funnel, timeRange)
// if err != nil {
// render.Error(rw, errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, "error getting funnel analytics: %v", err))
// return
// }
//
// render.Success(rw, http.StatusOK, response)
//}
//
//func (handler *handler) StepAnalytics(rw http.ResponseWriter, r *http.Request) {
// vars := mux.Vars(r)
// funnelID := vars["funnel_id"]
//
// funnel, err := handler.module.Get(r.Context(), funnelID)
// if err != nil {
// render.Error(rw, errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, "funnel not found: %v", err))
// return
// }
//
// var timeRange tf.TimeRange
// if err := json.NewDecoder(r.Body).Decode(&timeRange); err != nil {
// render.Error(rw, errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, "error decoding time range: %v", err))
// return
// }
//
// response, err := handler.module.GetStepAnalytics(r.Context(), funnel, timeRange)
// if err != nil {
// render.Error(rw, errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, "error getting step analytics: %v", err))
// return
// }
//
// render.Success(rw, http.StatusOK, response)
//}
//
//func (handler *handler) SlowestTraces(rw http.ResponseWriter, r *http.Request) {
// handler.handleTracesWithLatency(rw, r, false)
//}
//
//func (handler *handler) ErrorTraces(rw http.ResponseWriter, r *http.Request) {
// handler.handleTracesWithLatency(rw, r, true)
//}
//
//// handleTracesWithLatency handles both slow and error traces with common logic
//func (handler *handler) handleTracesWithLatency(rw http.ResponseWriter, r *http.Request, isError bool) {
// funnel, req, err := handler.validateTracesRequest(r)
// if err != nil {
// render.Error(rw, errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, "%v", err))
// return
// }
//
// if err := tracefunnel.ValidateSteps(funnel, req.StepAOrder, req.StepBOrder); err != nil {
// render.Error(rw, errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, "%v", err))
// return
// }
//
// response, err := handler.module.GetSlowestTraces(r.Context(), funnel, req.StepAOrder, req.StepBOrder, req.TimeRange, isError)
// if err != nil {
// render.Error(rw, errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, "error getting traces: %v", err))
// return
// }
//
// render.Success(rw, http.StatusOK, response)
//}
//
//// validateTracesRequest validates and extracts the request parameters
//func (handler *handler) validateTracesRequest(r *http.Request) (*tf.Funnel, *tf.StepTransitionRequest, error) {
// vars := mux.Vars(r)
// funnelID := vars["funnel_id"]
//
// funnel, err := handler.module.Get(r.Context(), funnelID)
// if err != nil {
// return nil, nil, fmt.Errorf("funnel not found: %v", err)
// }
//
// var req tf.StepTransitionRequest
// if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
// return nil, nil, fmt.Errorf("invalid request body: %v", err)
// }
//
// return funnel, &req, nil
//}
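The handler converts millisecond timestamps with time.Unix(0, ts*1000000) in several places. A small self-contained check of that conversion; time.UnixMilli is the standard-library equivalent and is shown only for comparison:

package main

import (
	"fmt"
	"time"
)

func main() {
	ts := int64(1714300000000) // a millisecond timestamp, like req.Timestamp

	a := time.Unix(0, ts*1_000_000) // multiply by 1e6: milliseconds -> nanoseconds
	b := time.UnixMilli(ts)         // equivalent helper, available since Go 1.17

	fmt.Println(a.Equal(b)) // true
}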

View File

@@ -0,0 +1,220 @@
package impltracefunnel
import (
"context"
"fmt"
"time"
"github.com/SigNoz/signoz/pkg/modules/tracefunnel"
"github.com/SigNoz/signoz/pkg/types"
traceFunnels "github.com/SigNoz/signoz/pkg/types/tracefunnel"
"github.com/SigNoz/signoz/pkg/valuer"
)
type module struct {
store traceFunnels.TraceFunnelStore
}
func NewModule(store traceFunnels.TraceFunnelStore) tracefunnel.Module {
return &module{
store: store,
}
}
func (module *module) Create(ctx context.Context, timestamp int64, name string, userID string, orgID string) (*traceFunnels.Funnel, error) {
orgUUID, err := valuer.NewUUID(orgID)
if err != nil {
return nil, fmt.Errorf("invalid org ID: %v", err)
}
funnel := &traceFunnels.Funnel{
BaseMetadata: traceFunnels.BaseMetadata{
Name: name,
OrgID: orgUUID,
},
}
funnel.CreatedAt = time.Unix(0, timestamp*1000000) // Convert milliseconds to nanoseconds
funnel.CreatedBy = userID
// Set up the user relationship
funnel.CreatedByUser = &types.User{
ID: userID,
}
if err := module.store.Create(ctx, funnel); err != nil {
return nil, fmt.Errorf("failed to create funnel: %v", err)
}
return funnel, nil
}
// Get gets a funnel by ID
func (module *module) Get(ctx context.Context, funnelID string) (*traceFunnels.Funnel, error) {
uuid, err := valuer.NewUUID(funnelID)
if err != nil {
return nil, fmt.Errorf("invalid funnel ID: %v", err)
}
return module.store.Get(ctx, uuid)
}
// Update updates a funnel
func (module *module) Update(ctx context.Context, funnel *traceFunnels.Funnel, userID string) error {
funnel.UpdatedBy = userID
return module.store.Update(ctx, funnel)
}
// List lists all funnels for an organization
func (module *module) List(ctx context.Context, orgID string) ([]*traceFunnels.Funnel, error) {
orgUUID, err := valuer.NewUUID(orgID)
if err != nil {
return nil, fmt.Errorf("invalid org ID: %v", err)
}
funnels, err := module.store.List(ctx)
if err != nil {
return nil, err
}
// Filter by orgID
var orgFunnels []*traceFunnels.Funnel
for _, f := range funnels {
if f.OrgID == orgUUID {
orgFunnels = append(orgFunnels, f)
}
}
return orgFunnels, nil
}
// Delete deletes a funnel
func (module *module) Delete(ctx context.Context, funnelID string) error {
uuid, err := valuer.NewUUID(funnelID)
if err != nil {
return fmt.Errorf("invalid funnel ID: %v", err)
}
return module.store.Delete(ctx, uuid)
}
// Save saves a funnel
func (module *module) Save(ctx context.Context, funnel *traceFunnels.Funnel, userID string, orgID string) error {
orgUUID, err := valuer.NewUUID(orgID)
if err != nil {
return fmt.Errorf("invalid org ID: %v", err)
}
funnel.UpdatedBy = userID
funnel.OrgID = orgUUID
return module.store.Update(ctx, funnel)
}
// GetFunnelMetadata gets metadata for a funnel
func (module *module) GetFunnelMetadata(ctx context.Context, funnelID string) (int64, int64, string, error) {
uuid, err := valuer.NewUUID(funnelID)
if err != nil {
return 0, 0, "", fmt.Errorf("invalid funnel ID: %v", err)
}
funnel, err := module.store.Get(ctx, uuid)
if err != nil {
return 0, 0, "", err
}
return funnel.CreatedAt.UnixNano() / 1000000, funnel.UpdatedAt.UnixNano() / 1000000, funnel.Description, nil
}
// ValidateTraces validates traces in a funnel
//func (module *module) ValidateTraces(ctx context.Context, funnel *traceFunnels.Funnel, timeRange traceFunnels.TimeRange) ([]*v3.Row, error) {
// chq, err := tracefunnel.ValidateTraces(funnel, timeRange)
// if err != nil {
// RespondError(w, &model.ApiError{Typ: model.ErrorInternal, Err: fmt.Errorf("error building clickhouse query: %v", err)}, nil)
// return
// }
//
// results, err := aH.reader.GetListResultV3(r.Context(), chq.Query)
// if err != nil {
// RespondError(w, &model.ApiError{Typ: model.ErrorInternal, Err: fmt.Errorf("error converting clickhouse results to list: %v", err)}, nil)
// return
// }
//
//}
// GetFunnelAnalytics gets analytics for a funnel
//func (module *module) GetFunnelAnalytics(ctx context.Context, funnel *traceFunnels.Funnel, timeRange traceFunnels.TimeRange) (*traceFunnels.FunnelAnalytics, error) {
// if err := tracefunnel.ValidateFunnel(funnel); err != nil {
// return nil, fmt.Errorf("invalid funnel: %v", err)
// }
//
// if err := tracefunnel.ValidateTimeRange(timeRange); err != nil {
// return nil, fmt.Errorf("invalid time range: %v", err)
// }
//
// _, err := tracefunnel.ValidateTracesWithLatency(funnel, timeRange)
// if err != nil {
// return nil, fmt.Errorf("error building clickhouse query: %v", err)
// }
//
// // TODO: Execute query and return results
// // For now, return empty analytics
// return &traceFunnels.FunnelAnalytics{
// TotalStart: 0,
// TotalComplete: 0,
// ErrorCount: 0,
// AvgDurationMs: 0,
// P99LatencyMs: 0,
// ConversionRate: 0,
// }, nil
//}
// GetStepAnalytics gets analytics for each step
//func (module *module) GetStepAnalytics(ctx context.Context, funnel *traceFunnels.Funnel, timeRange traceFunnels.TimeRange) (*traceFunnels.FunnelAnalytics, error) {
// if err := tracefunnel.ValidateFunnel(funnel); err != nil {
// return nil, fmt.Errorf("invalid funnel: %v", err)
// }
//
// if err := tracefunnel.ValidateTimeRange(timeRange); err != nil {
// return nil, fmt.Errorf("invalid time range: %v", err)
// }
//
// _, err := tracefunnel.GetStepAnalytics(funnel, timeRange)
// if err != nil {
// return nil, fmt.Errorf("error building clickhouse query: %v", err)
// }
//
// // TODO: Execute query and return results
// // For now, return empty analytics
// return &traceFunnels.FunnelAnalytics{
// TotalStart: 0,
// TotalComplete: 0,
// ErrorCount: 0,
// AvgDurationMs: 0,
// P99LatencyMs: 0,
// ConversionRate: 0,
// }, nil
//}
// GetSlowestTraces gets the slowest traces between two steps
//func (module *module) GetSlowestTraces(ctx context.Context, funnel *traceFunnels.Funnel, stepAOrder, stepBOrder int64, timeRange traceFunnels.TimeRange, isError bool) (*traceFunnels.ValidTracesResponse, error) {
// if err := tracefunnel.ValidateFunnel(funnel); err != nil {
// return nil, fmt.Errorf("invalid funnel: %v", err)
// }
//
// if err := tracefunnel.ValidateTimeRange(timeRange); err != nil {
// return nil, fmt.Errorf("invalid time range: %v", err)
// }
//
// _, err := tracefunnel.GetSlowestTraces(funnel, stepAOrder, stepBOrder, timeRange, isError)
// if err != nil {
// return nil, fmt.Errorf("error building clickhouse query: %v", err)
// }
//
// // TODO: Execute query and return results
// // For now, return empty response
// return &traceFunnels.ValidTracesResponse{
// TraceIDs: []string{},
// }, nil
//}
//UpdateMetadata updates the metadata of a funnel
//func (module *module) UpdateMetadata(ctx context.Context, funnelID valuer.UUID, name, description string, userID string) error {
// return module.store.UpdateMetadata(ctx, funnelID, name, description, userID)
//}
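List above fetches every funnel and filters by organization in memory; the commented-out ListByOrg helper in the store file further down hints at pushing that filter into a WHERE clause instead. A self-contained miniature of the in-memory variant:

package main

import "fmt"

// filterByOrg keeps only the items the predicate accepts, mirroring the loop in List.
func filterByOrg[T any](items []T, belongs func(T) bool) []T {
	var out []T
	for _, it := range items {
		if belongs(it) {
			out = append(out, it)
		}
	}
	return out
}

func main() {
	type funnel struct{ org, name string }
	all := []funnel{{"a", "checkout"}, {"b", "signup"}, {"a", "search"}}
	fmt.Println(filterByOrg(all, func(f funnel) bool { return f.org == "a" }))
}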

View File

@@ -0,0 +1,220 @@
package impltracefunnel
import (
"context"
"fmt"
"time"
"github.com/SigNoz/signoz/pkg/sqlstore"
traceFunnels "github.com/SigNoz/signoz/pkg/types/tracefunnel"
"github.com/SigNoz/signoz/pkg/valuer"
)
type store struct {
sqlstore sqlstore.SQLStore
}
func NewStore(sqlstore sqlstore.SQLStore) traceFunnels.TraceFunnelStore {
return &store{sqlstore: sqlstore}
}
func (store *store) Create(ctx context.Context, funnel *traceFunnels.Funnel) error {
if funnel.ID.IsZero() {
funnel.ID = valuer.GenerateUUID()
}
if funnel.CreatedAt.IsZero() {
funnel.CreatedAt = time.Now()
}
if funnel.UpdatedAt.IsZero() {
funnel.UpdatedAt = time.Now()
}
_, err := store.
sqlstore.
BunDB().
NewInsert().
Model(funnel).
Exec(ctx)
if err != nil {
return fmt.Errorf("failed to create funnel: %v", err)
}
if funnel.CreatedByUser != nil {
_, err = store.sqlstore.BunDB().NewUpdate().
Model(funnel).
Set("created_by = ?", funnel.CreatedByUser.ID).
Where("id = ?", funnel.ID).
Exec(ctx)
if err != nil {
return fmt.Errorf("failed to update funnel user relationship: %v", err)
}
}
return nil
}
// Get retrieves a funnel by ID
func (store *store) Get(ctx context.Context, uuid valuer.UUID) (*traceFunnels.Funnel, error) {
funnel := &traceFunnels.Funnel{}
err := store.
sqlstore.
BunDB().
NewSelect().
Model(funnel).
Relation("CreatedByUser").
Where("?TableAlias.id = ?", uuid).
Scan(ctx)
if err != nil {
return nil, fmt.Errorf("failed to get funnel: %v", err)
}
return funnel, nil
}
// Update updates an existing funnel
func (store *store) Update(ctx context.Context, funnel *traceFunnels.Funnel) error {
// Update the updated_at timestamp
funnel.UpdatedAt = time.Now()
_, err := store.
sqlstore.
BunDB().
NewUpdate().
Model(funnel).
WherePK().
Exec(ctx)
if err != nil {
return fmt.Errorf("failed to update funnel: %v", err)
}
return nil
}
// List retrieves all funnels
func (store *store) List(ctx context.Context) ([]*traceFunnels.Funnel, error) {
var funnels []*traceFunnels.Funnel
err := store.
sqlstore.
BunDB().
NewSelect().
Model(&funnels).
Relation("CreatedByUser").
Scan(ctx)
if err != nil {
return nil, fmt.Errorf("failed to list funnels: %v", err)
}
return funnels, nil
}
// Delete removes a funnel by ID
func (store *store) Delete(ctx context.Context, uuid valuer.UUID) error {
_, err := store.
sqlstore.
BunDB().
NewDelete().
Model((*traceFunnels.Funnel)(nil)).
Where("id = ?", uuid).Exec(ctx)
if err != nil {
return fmt.Errorf("failed to delete funnel: %v", err)
}
return nil
}
// ListByOrg retrieves all funnels for a specific organization
//func (store *store) ListByOrg(ctx context.Context, orgID valuer.UUID) ([]*traceFunnels.Funnel, error) {
// var funnels []*traceFunnels.Funnel
// err := store.
// sqlstore.
// BunDB().
// NewSelect().
// Model(&funnels).
// Relation("CreatedByUser").
// Where("org_id = ?", orgID).
// Scan(ctx)
// if err != nil {
// return nil, fmt.Errorf("failed to list funnels by org: %v", err)
// }
// return funnels, nil
//}
// GetByIDAndOrg retrieves a funnel by ID and organization ID
//func (store *store) GetByIDAndOrg(ctx context.Context, id, orgID valuer.UUID) (*traceFunnels.Funnel, error) {
// funnel := &traceFunnels.Funnel{}
// err := store.
// sqlstore.
// BunDB().
// NewSelect().
// Model(funnel).
// Relation("CreatedByUser").
// Where("?TableAlias.id = ? AND ?TableAlias.org_id = ?", id, orgID).
// Scan(ctx)
// if err != nil {
// return nil, fmt.Errorf("failed to get funnel by ID and org: %v", err)
// }
// return funnel, nil
//}
// UpdateSteps updates the steps of a funnel
//func (store *store) UpdateSteps(ctx context.Context, funnelID valuer.UUID, steps []traceFunnels.FunnelStep) error {
// _, err := store.
// sqlstore.
// BunDB().
// NewUpdate().
// Model((*traceFunnels.Funnel)(nil)).
// Set("steps = ?", steps).
// Where("id = ?", funnelID).
// Exec(ctx)
// if err != nil {
// return fmt.Errorf("failed to update funnel steps: %v", err)
// }
// return nil
//}
// UpdateMetadata updates the metadata of a funnel
//func (store *store) UpdateMetadata(ctx context.Context, funnelID valuer.UUID, name, description string, userID string) error {
//
// // First get the current funnel to preserve other fields
// funnel := &traceFunnels.Funnel{}
// err := store.
// sqlstore.
// BunDB().
// NewSelect().
// Model(funnel).
// Where("id = ?", funnelID).
// Scan(ctx)
// if err != nil {
// return fmt.Errorf("failed to get funnel: %v", err)
// }
//
// // Update the fields
// funnel.Name = name
// funnel.Description = description
// funnel.UpdatedAt = time.Now()
// funnel.UpdatedBy = userID
//
// // Save the updated funnel
// _, err = store.
// sqlstore.
// BunDB().
// NewUpdate().
// Model(funnel).
// WherePK().
// Exec(ctx)
// if err != nil {
// return fmt.Errorf("failed to update funnel metadata: %v", err)
// }
//
// // Verify the update
// updatedFunnel := &traceFunnels.Funnel{}
// err = store.
// sqlstore.
// BunDB().
// NewSelect().
// Model(updatedFunnel).
// Where("id = ?", funnelID).
// Scan(ctx)
// if err != nil {
// return fmt.Errorf("failed to verify update: %v", err)
// }
//
// return nil
//}
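A minimal usage sketch of the store defined in this file, assuming a wired sqlstore.SQLStore (its construction is outside this diff); the funnel name below is a placeholder and error handling is abbreviated.
// Sketch only: db wiring and the funnel name are placeholders for illustration.
func exampleStoreUsage(ctx context.Context, db sqlstore.SQLStore) error {
	funnelStore := NewStore(db)
	// Create fills in ID, CreatedAt and UpdatedAt when they are zero-valued.
	funnel := &traceFunnels.Funnel{Name: "checkout-funnel"}
	if err := funnelStore.Create(ctx, funnel); err != nil {
		return err
	}
	got, err := funnelStore.Get(ctx, funnel.ID)
	if err != nil {
		return err
	}
	_ = got
	return funnelStore.Delete(ctx, funnel.ID)
}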


@@ -0,0 +1,442 @@
package tracefunnel
import (
"fmt"
v3 "github.com/SigNoz/signoz/pkg/query-service/model/v3"
tracefunnel "github.com/SigNoz/signoz/pkg/types/tracefunnel"
"strings"
)
// GetSlowestTraces builds a ClickHouse query to get the slowest traces between two steps
func GetSlowestTraces(funnel *tracefunnel.Funnel, stepAOrder, stepBOrder int64, timeRange tracefunnel.TimeRange, withErrors bool) (*v3.ClickHouseQuery, error) {
// Find steps by order
var stepA, stepB *tracefunnel.FunnelStep
for i := range funnel.Steps {
if funnel.Steps[i].Order == stepAOrder {
stepA = &funnel.Steps[i]
}
if funnel.Steps[i].Order == stepBOrder {
stepB = &funnel.Steps[i]
}
}
if stepA == nil || stepB == nil {
return nil, fmt.Errorf("step not found")
}
// Build having clause based on withErrors flag
havingClause := ""
if withErrors {
havingClause = "HAVING has_error = 1"
}
// Build filter strings for each step
stepAFilters := ""
if stepA.Filters != nil && len(stepA.Filters.Items) > 0 {
// TODO: need to implement where clause filtering with minimal code duplication
stepAFilters = "/* Custom filters for step A would be applied here */"
}
stepBFilters := ""
if stepB.Filters != nil && len(stepB.Filters.Items) > 0 {
// TODO: need to implement where clause filtering with minimal code duplication
stepBFilters = "/* Custom filters for step B would be applied here */"
}
query := fmt.Sprintf(`
WITH
toUInt64(%d) AS start_time,
toUInt64(%d) AS end_time,
toString(intDiv(start_time, 1000000000) - 1800) AS tsBucketStart,
toString(intDiv(end_time, 1000000000)) AS tsBucketEnd
SELECT
trace_id,
concat(toString((max_end_time_ns - min_start_time_ns) / 1e6), ' ms') AS duration_ms,
COUNT(*) AS span_count
FROM (
SELECT
s1.trace_id,
MIN(toUnixTimestamp64Nano(s1.timestamp)) AS min_start_time_ns,
MAX(toUnixTimestamp64Nano(s2.timestamp) + s2.duration_nano) AS max_end_time_ns,
MAX(s1.has_error OR s2.has_error) AS has_error
FROM %s AS s1
JOIN %s AS s2
ON s1.trace_id = s2.trace_id
WHERE s1.resource_string_service$$name = '%s'
AND s1.name = '%s'
AND s2.resource_string_service$$name = '%s'
AND s2.name = '%s'
AND s1.timestamp BETWEEN toString(start_time) AND toString(end_time)
AND s1.ts_bucket_start BETWEEN tsBucketStart AND tsBucketEnd
AND s2.timestamp BETWEEN toString(start_time) AND toString(end_time)
AND s2.ts_bucket_start BETWEEN tsBucketStart AND tsBucketEnd
%s
%s
GROUP BY s1.trace_id
%s
) AS trace_durations
JOIN %s AS spans
ON spans.trace_id = trace_durations.trace_id
WHERE spans.timestamp BETWEEN toString(start_time) AND toString(end_time)
AND spans.ts_bucket_start BETWEEN tsBucketStart AND tsBucketEnd
GROUP BY trace_id, duration_ms
ORDER BY CAST(replaceRegexpAll(duration_ms, ' ms$', '') AS Float64) DESC
LIMIT 5`,
timeRange.StartTime,
timeRange.EndTime,
TracesTable,
TracesTable,
escapeString(stepA.ServiceName),
escapeString(stepA.SpanName),
escapeString(stepB.ServiceName),
escapeString(stepB.SpanName),
stepAFilters,
stepBFilters,
havingClause,
TracesTable,
)
return &v3.ClickHouseQuery{
Query: query,
}, nil
}
// GetStepAnalytics builds a ClickHouse query to get analytics for each step
func GetStepAnalytics(funnel *tracefunnel.Funnel, timeRange tracefunnel.TimeRange) (*v3.ClickHouseQuery, error) {
if len(funnel.Steps) == 0 {
return nil, fmt.Errorf("funnel has no steps")
}
// Build funnel steps array
var steps []string
for _, step := range funnel.Steps {
steps = append(steps, fmt.Sprintf("('%s', '%s')",
escapeString(step.ServiceName), escapeString(step.SpanName)))
}
stepsArray := fmt.Sprintf("array(%s)", strings.Join(steps, ","))
// Build step CTEs
var stepCTEs []string
for i, step := range funnel.Steps {
filterStr := ""
if step.Filters != nil && len(step.Filters.Items) > 0 {
// TODO: need to implement where clause filtering with minimal code duplication
filterStr = "/* Custom filters would be applied here */"
}
cte := fmt.Sprintf(`
step%d_traces AS (
SELECT DISTINCT trace_id
FROM %s
WHERE resource_string_service$$name = '%s'
AND name = '%s'
AND timestamp BETWEEN toString(start_time) AND toString(end_time)
AND ts_bucket_start BETWEEN tsBucketStart AND tsBucketEnd
%s
)`,
i+1,
TracesTable,
escapeString(step.ServiceName),
escapeString(step.SpanName),
filterStr,
)
stepCTEs = append(stepCTEs, cte)
}
// Build intersecting traces CTE
var intersections []string
for i := 1; i <= len(funnel.Steps); i++ {
intersections = append(intersections, fmt.Sprintf("SELECT trace_id FROM step%d_traces", i))
}
intersectingTracesCTE := fmt.Sprintf(`
intersecting_traces AS (
%s
)`,
strings.Join(intersections, "\nINTERSECT\n"),
)
// Build CASE expressions for each step
var caseExpressions []string
for i, step := range funnel.Steps {
totalSpansExpr := fmt.Sprintf(`
COUNT(CASE WHEN resource_string_service$$name = '%s'
AND name = '%s'
THEN trace_id END) AS total_s%d_spans`,
escapeString(step.ServiceName), escapeString(step.SpanName), i+1)
erroredSpansExpr := fmt.Sprintf(`
COUNT(CASE WHEN resource_string_service$$name = '%s'
AND name = '%s'
AND has_error = true
THEN trace_id END) AS total_s%d_errored_spans`,
escapeString(step.ServiceName), escapeString(step.SpanName), i+1)
caseExpressions = append(caseExpressions, totalSpansExpr, erroredSpansExpr)
}
query := fmt.Sprintf(`
WITH
toUInt64(%d) AS start_time,
toUInt64(%d) AS end_time,
toString(intDiv(start_time, 1000000000) - 1800) AS tsBucketStart,
toString(intDiv(end_time, 1000000000)) AS tsBucketEnd,
%s AS funnel_steps,
%s,
%s
SELECT
%s
FROM %s
WHERE trace_id IN (SELECT trace_id FROM intersecting_traces)
AND timestamp BETWEEN toString(start_time) AND toString(end_time)
AND ts_bucket_start BETWEEN tsBucketStart AND tsBucketEnd`,
timeRange.StartTime,
timeRange.EndTime,
stepsArray,
strings.Join(stepCTEs, ",\n"),
intersectingTracesCTE,
strings.Join(caseExpressions, ",\n "),
TracesTable,
)
return &v3.ClickHouseQuery{
Query: query,
}, nil
}
// ValidateTracesWithLatency builds a ClickHouse query to validate traces with latency information
func ValidateTracesWithLatency(funnel *tracefunnel.Funnel, timeRange tracefunnel.TimeRange) (*v3.ClickHouseQuery, error) {
filters, err := buildFunnelFiltersWithLatency(funnel)
if err != nil {
return nil, fmt.Errorf("error building funnel filters with latency: %w", err)
}
query := generateFunnelSQLWithLatency(timeRange.StartTime, timeRange.EndTime, filters)
return &v3.ClickHouseQuery{
Query: query,
}, nil
}
func generateFunnelSQLWithLatency(start, end int64, filters []tracefunnel.FunnelStepFilter) string {
var expressions []string
// Convert timestamps to nanoseconds
startTime := fmt.Sprintf("toUInt64(%d)", start)
endTime := fmt.Sprintf("toUInt64(%d)", end)
expressions = append(expressions, fmt.Sprintf("%s AS start_time", startTime))
expressions = append(expressions, fmt.Sprintf("%s AS end_time", endTime))
expressions = append(expressions, "toString(intDiv(start_time, 1000000000) - 1800) AS tsBucketStart")
expressions = append(expressions, "toString(intDiv(end_time, 1000000000)) AS tsBucketEnd")
expressions = append(expressions, "(end_time - start_time) / 1e9 AS total_time_seconds")
// Define step configurations dynamically
for _, f := range filters {
expressions = append(expressions, fmt.Sprintf("('%s', '%s') AS s%d_config",
escapeString(f.ServiceName),
escapeString(f.SpanName),
f.StepNumber))
}
withClause := "WITH \n" + strings.Join(expressions, ",\n") + "\n"
// Build step raw expressions and cumulative logic
var stepRaws []string
var cumulativeLogic []string
var filterConditions []string
stepCount := len(filters)
// Build raw step detection
for i := 1; i <= stepCount; i++ {
stepRaws = append(stepRaws, fmt.Sprintf(
"MAX(CASE WHEN (resource_string_service$$name, name) = s%d_config THEN 1 ELSE 0 END) AS has_s%d_raw", i, i))
filterConditions = append(filterConditions, fmt.Sprintf("s%d_config", i))
}
// Build cumulative IF logic
for i := 1; i <= stepCount; i++ {
if i == 1 {
cumulativeLogic = append(cumulativeLogic, fmt.Sprintf(`
IF(MAX(CASE WHEN (resource_string_service$$name, name) = s1_config THEN 1 ELSE 0 END) = 1, 1, 0) AS has_s1`))
} else {
innerIf := "IF(MAX(CASE WHEN (resource_string_service$$name, name) = s1_config THEN 1 ELSE 0 END) = 1, 1, 0)"
for j := 2; j < i; j++ {
innerIf = fmt.Sprintf(`IF(%s = 1 AND MAX(CASE WHEN (resource_string_service$$name, name) = s%d_config THEN 1 ELSE 0 END) = 1, 1, 0)`, innerIf, j)
}
cumulativeLogic = append(cumulativeLogic, fmt.Sprintf(`
IF(
%s = 1 AND MAX(CASE WHEN (resource_string_service$$name, name) = s%d_config THEN 1 ELSE 0 END) = 1,
1, 0
) AS has_s%d`, innerIf, i, i))
}
}
// Final SELECT counts using FILTER clauses
var stepCounts []string
for i := 1; i <= stepCount; i++ {
stepCounts = append(stepCounts, fmt.Sprintf("COUNT(DISTINCT trace_id) FILTER (WHERE has_s%d = 1) AS step%d_count", i, i))
}
// Final query assembly
lastStep := fmt.Sprint(stepCount)
query := withClause + `
SELECT
` + strings.Join(stepCounts, ",\n ") + `,
IF(total_time_seconds = 0 OR COUNT(DISTINCT trace_id) FILTER (WHERE has_s` + lastStep + ` = 1) = 0, 0,
COUNT(DISTINCT trace_id) FILTER (WHERE has_s` + lastStep + ` = 1) / total_time_seconds
) AS avg_rate,
COUNT(DISTINCT trace_id) FILTER (WHERE has_s` + lastStep + ` = 1 AND has_error = true) AS errors,
IF(COUNT(*) = 0, 0, avg(trace_duration)) AS avg_duration,
IF(COUNT(*) = 0, 0, quantile(0.99)(trace_duration)) AS p99_latency,
IF(COUNT(DISTINCT trace_id) FILTER (WHERE has_s1 = 1) = 0, 0,
100.0 * COUNT(DISTINCT trace_id) FILTER (WHERE has_s` + lastStep + ` = 1) /
COUNT(DISTINCT trace_id) FILTER (WHERE has_s1 = 1)
) AS conversion_rate
FROM (
SELECT
trace_id,
MAX(has_error) AS has_error,
` + strings.Join(stepRaws, ",\n ") + `,
MAX(toUnixTimestamp64Nano(timestamp) + duration_nano) - MIN(toUnixTimestamp64Nano(timestamp)) AS trace_duration,
` + strings.Join(cumulativeLogic, ",\n ") + `
FROM ` + TracesTable + `
WHERE
timestamp BETWEEN toString(start_time) AND toString(end_time)
AND ts_bucket_start BETWEEN tsBucketStart AND tsBucketEnd
AND (resource_string_service$$name, name) IN (` + strings.Join(filterConditions, ", ") + `)
GROUP BY trace_id
) AS funnel_data;`
return query
}
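To make the cumulative IF construction above concrete: for a three-step funnel, has_s3 is 1 only when a trace contains spans matching s1_config, s2_config and s3_config, because each level wraps the previous one. A sketch of the generated expression (whitespace simplified, shown here as a Go raw string for illustration):
// Illustration only: the shape of has_s3 produced by the loop for a 3-step funnel.
const exampleHasS3 = `
IF(
    IF(
        IF(MAX(CASE WHEN (resource_string_service$$name, name) = s1_config THEN 1 ELSE 0 END) = 1, 1, 0) = 1
        AND MAX(CASE WHEN (resource_string_service$$name, name) = s2_config THEN 1 ELSE 0 END) = 1,
        1, 0
    ) = 1
    AND MAX(CASE WHEN (resource_string_service$$name, name) = s3_config THEN 1 ELSE 0 END) = 1,
    1, 0
) AS has_s3`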
func buildFunnelFiltersWithLatency(funnel *tracefunnel.Funnel) ([]tracefunnel.FunnelStepFilter, error) {
if funnel == nil {
return nil, fmt.Errorf("funnel cannot be nil")
}
if len(funnel.Steps) == 0 {
return nil, fmt.Errorf("funnel must have at least one step")
}
filters := make([]tracefunnel.FunnelStepFilter, len(funnel.Steps))
for i, step := range funnel.Steps {
latencyPointer := "start" // Default value
if step.LatencyPointer != "" {
latencyPointer = step.LatencyPointer
}
filters[i] = tracefunnel.FunnelStepFilter{
StepNumber: i + 1,
ServiceName: step.ServiceName,
SpanName: step.SpanName,
LatencyPointer: latencyPointer,
CustomFilters: step.Filters,
}
}
return filters, nil
}
func buildFunnelFilters(funnel *tracefunnel.Funnel) ([]tracefunnel.FunnelStepFilter, error) {
if funnel == nil {
return nil, fmt.Errorf("funnel cannot be nil")
}
if len(funnel.Steps) == 0 {
return nil, fmt.Errorf("funnel must have at least one step")
}
filters := make([]tracefunnel.FunnelStepFilter, len(funnel.Steps))
for i, step := range funnel.Steps {
filters[i] = tracefunnel.FunnelStepFilter{
StepNumber: i + 1,
ServiceName: step.ServiceName,
SpanName: step.SpanName,
CustomFilters: step.Filters,
}
}
return filters, nil
}
func escapeString(s string) string {
// Replace single quotes with double single quotes to escape them in SQL
return strings.ReplaceAll(s, "'", "''")
}
const TracesTable = "signoz_traces.signoz_index_v3"
func generateFunnelSQL(start, end int64, filters []tracefunnel.FunnelStepFilter) string {
var expressions []string
// Basic time expressions.
expressions = append(expressions, fmt.Sprintf("toUInt64(%d) AS start_time", start))
expressions = append(expressions, fmt.Sprintf("toUInt64(%d) AS end_time", end))
expressions = append(expressions, "toString(intDiv(start_time, 1000000000) - 1800) AS tsBucketStart")
expressions = append(expressions, "toString(intDiv(end_time, 1000000000)) AS tsBucketEnd")
// Add service and span alias definitions from each filter.
for _, f := range filters {
expressions = append(expressions, fmt.Sprintf("'%s' AS service_%d", escapeString(f.ServiceName), f.StepNumber))
expressions = append(expressions, fmt.Sprintf("'%s' AS span_%d", escapeString(f.SpanName), f.StepNumber))
}
// Add the CTE for each step.
for _, f := range filters {
cte := fmt.Sprintf(`step%d_traces AS (
SELECT DISTINCT trace_id
FROM %s
WHERE serviceName = service_%d
AND name = span_%d
AND timestamp BETWEEN toString(start_time) AND toString(end_time)
AND ts_bucket_start BETWEEN tsBucketStart AND tsBucketEnd
)`, f.StepNumber, TracesTable, f.StepNumber, f.StepNumber)
expressions = append(expressions, cte)
}
withClause := "WITH \n" + strings.Join(expressions, ",\n") + "\n"
// Build the intersect clause for each step.
var intersectQueries []string
for _, f := range filters {
intersectQueries = append(intersectQueries, fmt.Sprintf("SELECT trace_id FROM step%d_traces", f.StepNumber))
}
intersectClause := strings.Join(intersectQueries, "\nINTERSECT\n")
query := withClause + `
SELECT trace_id
FROM ` + TracesTable + `
WHERE trace_id IN (
` + intersectClause + `
)
AND timestamp BETWEEN toString(start_time) AND toString(end_time)
AND ts_bucket_start BETWEEN tsBucketStart AND tsBucketEnd
GROUP BY trace_id
LIMIT 5
`
return query
}
// ValidateTraces builds a ClickHouse query to validate traces in a funnel
func ValidateTraces(funnel *tracefunnel.Funnel, timeRange tracefunnel.TimeRange) (*v3.ClickHouseQuery, error) {
filters, err := buildFunnelFilters(funnel)
if err != nil {
return nil, fmt.Errorf("error building funnel filters: %w", err)
}
query := generateFunnelSQL(timeRange.StartTime, timeRange.EndTime, filters)
return &v3.ClickHouseQuery{
Query: query,
}, nil
}
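A short sketch of how these builders are meant to be driven (mirroring the HTTP handlers later in this diff): build the query from a funnel and a time range, then hand the SQL string to whatever ClickHouse execution path the caller already has. The runQuery parameter below is hypothetical.
// Hypothetical caller; runQuery stands in for any ClickHouse execution path.
func exampleBuildQueries(funnel *tracefunnel.Funnel, tr tracefunnel.TimeRange, runQuery func(sql string) error) error {
	// Membership check: which traces pass through every funnel step.
	chq, err := ValidateTraces(funnel, tr)
	if err != nil {
		return fmt.Errorf("building validate query: %w", err)
	}
	if err := runQuery(chq.Query); err != nil {
		return err
	}
	// Slowest traces between step 1 and step 2; pass true to restrict to errored traces.
	slow, err := GetSlowestTraces(funnel, 1, 2, tr, false)
	if err != nil {
		return fmt.Errorf("building slowest-traces query: %w", err)
	}
	return runQuery(slow.Query)
}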


@@ -0,0 +1,67 @@
package tracefunnel
import (
"context"
"net/http"
traceFunnels "github.com/SigNoz/signoz/pkg/types/tracefunnel"
)
// Module defines the interface for trace funnel operations
type Module interface {
// operations on a funnel
Create(ctx context.Context, timestamp int64, name string, userID string, orgID string) (*traceFunnels.Funnel, error)
Get(ctx context.Context, funnelID string) (*traceFunnels.Funnel, error)
Update(ctx context.Context, funnel *traceFunnels.Funnel, userID string) error
List(ctx context.Context, orgID string) ([]*traceFunnels.Funnel, error)
Delete(ctx context.Context, funnelID string) error
Save(ctx context.Context, funnel *traceFunnels.Funnel, userID string, orgID string) error
GetFunnelMetadata(ctx context.Context, funnelID string) (int64, int64, string, error)
//
//GetFunnelAnalytics(ctx context.Context, funnel *traceFunnels.Funnel, timeRange traceFunnels.TimeRange) (*traceFunnels.FunnelAnalytics, error)
//
//GetStepAnalytics(ctx context.Context, funnel *traceFunnels.Funnel, timeRange traceFunnels.TimeRange) (*traceFunnels.FunnelAnalytics, error)
//
//GetSlowestTraces(ctx context.Context, funnel *traceFunnels.Funnel, stepAOrder, stepBOrder int64, timeRange traceFunnels.TimeRange, isError bool) (*traceFunnels.ValidTracesResponse, error)
// updates funnel metadata
//UpdateMetadata(ctx context.Context, funnelID valuer.UUID, name, description string, userID string) error
// validates funnel
//ValidateTraces(ctx context.Context, funnel *traceFunnels.Funnel, timeRange traceFunnels.TimeRange) ([]*v3.Row, error)
}
type Handler interface {
// CRUD handlers for funnels
New(http.ResponseWriter, *http.Request)
UpdateSteps(http.ResponseWriter, *http.Request)
UpdateFunnel(http.ResponseWriter, *http.Request)
List(http.ResponseWriter, *http.Request)
Get(http.ResponseWriter, *http.Request)
Delete(http.ResponseWriter, *http.Request)
Save(http.ResponseWriter, *http.Request)
// validator handlers
//ValidateTraces(http.ResponseWriter, *http.Request)
//
//// Analytics handlers
//FunnelAnalytics(http.ResponseWriter, *http.Request)
//
//StepAnalytics(http.ResponseWriter, *http.Request)
//
//SlowestTraces(http.ResponseWriter, *http.Request)
//
//ErrorTraces(http.ResponseWriter, *http.Request)
}
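A hedged sketch of how a caller might exercise the Module interface above; the timestamp, names and IDs are placeholder values and error handling is trimmed.
// Sketch only: m is any Module implementation; all literal values are placeholders.
func exampleModuleUsage(ctx context.Context, m Module) error {
	funnel, err := m.Create(ctx, 1714286400000, "checkout-funnel", "user-id", "org-id")
	if err != nil {
		return err
	}
	_ = funnel
	funnels, err := m.List(ctx, "org-id")
	if err != nil {
		return err
	}
	_ = funnels
	return nil
}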


@@ -0,0 +1,171 @@
package tracefunnel
import (
"fmt"
tracefunnel "github.com/SigNoz/signoz/pkg/types/tracefunnel"
"sort"
)
// ValidateTimestamp validates a timestamp
func ValidateTimestamp(timestamp int64, fieldName string) error {
if timestamp == 0 {
return fmt.Errorf("%s is required", fieldName)
}
if timestamp < 0 {
return fmt.Errorf("%s must be positive", fieldName)
}
return nil
}
// ValidateTimestampIsMilliseconds validates that a timestamp is in milliseconds
func ValidateTimestampIsMilliseconds(timestamp int64) bool {
// Check if timestamp is in milliseconds (13 digits)
return timestamp >= 1000000000000 && timestamp <= 9999999999999
}
// ValidateFunnelSteps validates funnel steps
func ValidateFunnelSteps(steps []tracefunnel.FunnelStep) error {
if len(steps) < 2 {
return fmt.Errorf("funnel must have at least 2 steps")
}
for i, step := range steps {
if step.ServiceName == "" {
return fmt.Errorf("step %d: service name is required", i+1)
}
if step.SpanName == "" {
return fmt.Errorf("step %d: span name is required", i+1)
}
if step.Order < 0 {
return fmt.Errorf("step %d: order must be non-negative", i+1)
}
}
return nil
}
// NormalizeFunnelSteps normalizes step orders to be sequential
func NormalizeFunnelSteps(steps []tracefunnel.FunnelStep) []tracefunnel.FunnelStep {
// Sort steps by order
sort.Slice(steps, func(i, j int) bool {
return steps[i].Order < steps[j].Order
})
// Normalize orders to be sequential
for i := range steps {
steps[i].Order = int64(i + 1)
}
return steps
}
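A small illustrative example of the two helpers above, using placeholder service and span names: ValidateTimestampIsMilliseconds accepts 13-digit epoch values only, and NormalizeFunnelSteps sorts by the existing Order and then re-numbers from 1.
// Example values only; service and span names are placeholders.
func exampleNormalize() {
	_ = ValidateTimestampIsMilliseconds(1714286400000) // true: 13-digit epoch milliseconds
	_ = ValidateTimestampIsMilliseconds(1714286400)    // false: epoch seconds
	steps := []tracefunnel.FunnelStep{
		{Order: 7, ServiceName: "frontend", SpanName: "HTTP GET /cart"},
		{Order: 2, ServiceName: "payment", SpanName: "charge"},
	}
	// After normalization "payment" becomes Order 1 and "frontend" becomes Order 2.
	steps = NormalizeFunnelSteps(steps)
	_ = steps
}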
//// ValidateSteps checks if the requested steps exist in the funnel
//func ValidateSteps(funnel *tracefunnel.Funnel, stepAOrder, stepBOrder int64) error {
// stepAExists, stepBExists := false, false
// for _, step := range funnel.Steps {
// if step.Order == stepAOrder {
// stepAExists = true
// }
// if step.Order == stepBOrder {
// stepBExists = true
// }
// }
//
// if !stepAExists || !stepBExists {
// return fmt.Errorf("one or both steps not found. Step A Order: %d, Step B Order: %d", stepAOrder, stepBOrder)
// }
//
// return nil
//}
//// ValidateFunnel validates a funnel's data
//func ValidateFunnel(funnel *tracefunnel.Funnel) error {
// if funnel == nil {
// return fmt.Errorf("funnel cannot be nil")
// }
//
// if len(funnel.Steps) < 2 {
// return fmt.Errorf("funnel must have at least 2 steps")
// }
//
// // Validate each step
// for i, step := range funnel.Steps {
// if err := ValidateStep(step, i+1); err != nil {
// return err
// }
// }
//
// return nil
//}
// ValidateStep validates a single funnel step
//func ValidateStep(step tracefunnel.FunnelStep, stepNum int) error {
// if step.ServiceName == "" {
// return fmt.Errorf("step %d: service name is required", stepNum)
// }
//
// if step.SpanName == "" {
// return fmt.Errorf("step %d: span name is required", stepNum)
// }
//
// if step.Order < 0 {
// return fmt.Errorf("step %d: order must be non-negative", stepNum)
// }
//
// return nil
//}
//
//// ValidateTimeRange validates a time range
//func ValidateTimeRange(timeRange tracefunnel.TimeRange) error {
// if timeRange.StartTime <= 0 {
// return fmt.Errorf("start time must be positive")
// }
//
// if timeRange.EndTime <= 0 {
// return fmt.Errorf("end time must be positive")
// }
//
// if timeRange.EndTime < timeRange.StartTime {
// return fmt.Errorf("end time must be after start time")
// }
//
// // Check if the time range is not too far in the future
// now := time.Now().UnixNano() / 1000000 // Convert to milliseconds
// if timeRange.EndTime > now {
// return fmt.Errorf("end time cannot be in the future")
// }
//
// // Check if the time range is not too old (e.g., more than 30 days)
// maxAge := int64(30 * 24 * 60 * 60 * 1000) // 30 days in milliseconds
// if now-timeRange.StartTime > maxAge {
// return fmt.Errorf("time range cannot be older than 30 days")
// }
//
// return nil
//}
//
//// ValidateStepOrder validates that step orders are sequential
//func ValidateStepOrder(steps []tracefunnel.FunnelStep) error {
// if len(steps) < 2 {
// return nil
// }
//
// // Create a map to track used orders
// usedOrders := make(map[int64]bool)
//
// for i, step := range steps {
// if usedOrders[step.Order] {
// return fmt.Errorf("duplicate step order %d at step %d", step.Order, i+1)
// }
// usedOrders[step.Order] = true
// }
//
// // Check if orders are sequential
// for i := 0; i < len(steps)-1; i++ {
// if steps[i+1].Order != steps[i].Order+1 {
// return fmt.Errorf("step orders must be sequential")
// }
// }
//
// return nil
//}

File diff suppressed because it is too large


@@ -23,11 +23,11 @@ import (
errorsV2 "github.com/SigNoz/signoz/pkg/errors"
"github.com/SigNoz/signoz/pkg/http/middleware"
"github.com/SigNoz/signoz/pkg/http/render"
"github.com/SigNoz/signoz/pkg/modules/organization"
"github.com/SigNoz/signoz/pkg/modules/preference"
tracefunnels "github.com/SigNoz/signoz/pkg/modules/tracefunnel"
"github.com/SigNoz/signoz/pkg/query-service/app/integrations"
"github.com/SigNoz/signoz/pkg/query-service/app/metricsexplorer"
"github.com/SigNoz/signoz/pkg/signoz"
traceFunnels "github.com/SigNoz/signoz/pkg/types/tracefunnel"
"github.com/SigNoz/signoz/pkg/valuer"
"github.com/prometheus/prometheus/promql"
@@ -91,7 +91,6 @@ func NewRouter() *mux.Router {
// APIHandler implements the query service public API
type APIHandler struct {
reader interfaces.Reader
skipConfig *model.SkipConfig
appDao dao.ModelDao
ruleManager *rules.Manager
featureFlags interfaces.FeatureLookup
@@ -121,9 +120,6 @@ type APIHandler struct {
// Websocket connection upgrader
Upgrader *websocket.Upgrader
UseLogsNewSchema bool
UseTraceNewSchema bool
hostsRepo *inframetrics.HostsRepo
processesRepo *inframetrics.ProcessesRepo
podsRepo *inframetrics.PodsRepo
@@ -147,11 +143,6 @@ type APIHandler struct {
FieldsAPI *fields.API
Signoz *signoz.SigNoz
Preference preference.API
OrganizationAPI organization.API
OrganizationModule organization.Module
}
type APIHandlerOpts struct {
@@ -159,8 +150,6 @@ type APIHandlerOpts struct {
// business data reader e.g. clickhouse
Reader interfaces.Reader
SkipConfig *model.SkipConfig
PreferSpanMetrics bool
// dao layer to perform crud on app objects like dashboard, alerts etc
@@ -187,11 +176,6 @@ type APIHandlerOpts struct {
// Querier Influx Interval
FluxInterval time.Duration
// Use Logs New schema
UseLogsNewSchema bool
UseTraceNewSchema bool
JWT *authtypes.JWT
AlertmanagerAPI *alertmanager.API
@@ -199,30 +183,22 @@ type APIHandlerOpts struct {
FieldsAPI *fields.API
Signoz *signoz.SigNoz
Preference preference.API
OrganizationAPI organization.API
OrganizationModule organization.Module
}
// NewAPIHandler returns an APIHandler
func NewAPIHandler(opts APIHandlerOpts) (*APIHandler, error) {
querierOpts := querier.QuerierOptions{
Reader: opts.Reader,
Cache: opts.Cache,
KeyGenerator: queryBuilder.NewKeyGenerator(),
FluxInterval: opts.FluxInterval,
UseLogsNewSchema: opts.UseLogsNewSchema,
UseTraceNewSchema: opts.UseTraceNewSchema,
Reader: opts.Reader,
Cache: opts.Cache,
KeyGenerator: queryBuilder.NewKeyGenerator(),
FluxInterval: opts.FluxInterval,
}
querierOptsV2 := querierV2.QuerierOptions{
Reader: opts.Reader,
Cache: opts.Cache,
KeyGenerator: queryBuilder.NewKeyGenerator(),
FluxInterval: opts.FluxInterval,
UseLogsNewSchema: opts.UseLogsNewSchema,
UseTraceNewSchema: opts.UseTraceNewSchema,
Reader: opts.Reader,
Cache: opts.Cache,
KeyGenerator: queryBuilder.NewKeyGenerator(),
FluxInterval: opts.FluxInterval,
}
querier := querier.NewQuerier(querierOpts)
@@ -244,7 +220,6 @@ func NewAPIHandler(opts APIHandlerOpts) (*APIHandler, error) {
aH := &APIHandler{
reader: opts.Reader,
appDao: opts.AppDao,
skipConfig: opts.SkipConfig,
preferSpanMetrics: opts.PreferSpanMetrics,
temporalityMap: make(map[string]map[v3.Temporality]bool),
ruleManager: opts.RuleManager,
@@ -254,8 +229,6 @@ func NewAPIHandler(opts APIHandlerOpts) (*APIHandler, error) {
LogsParsingPipelineController: opts.LogsParsingPipelineController,
querier: querier,
querierV2: querierv2,
UseLogsNewSchema: opts.UseLogsNewSchema,
UseTraceNewSchema: opts.UseTraceNewSchema,
hostsRepo: hostsRepo,
processesRepo: processesRepo,
podsRepo: podsRepo,
@@ -271,21 +244,11 @@ func NewAPIHandler(opts APIHandlerOpts) (*APIHandler, error) {
SummaryService: summaryService,
AlertmanagerAPI: opts.AlertmanagerAPI,
Signoz: opts.Signoz,
Preference: opts.Preference,
FieldsAPI: opts.FieldsAPI,
OrganizationAPI: opts.OrganizationAPI,
OrganizationModule: opts.OrganizationModule,
}
logsQueryBuilder := logsv3.PrepareLogsQuery
if opts.UseLogsNewSchema {
logsQueryBuilder = logsv4.PrepareLogsQuery
}
tracesQueryBuilder := tracesV3.PrepareTracesQuery
if opts.UseTraceNewSchema {
tracesQueryBuilder = tracesV4.PrepareTracesQuery
}
logsQueryBuilder := logsv4.PrepareLogsQuery
tracesQueryBuilder := tracesV4.PrepareTracesQuery
builderOpts := queryBuilder.QueryBuilderOptions{
BuildMetricQuery: metricsv3.PrepareMetricQuery,
@@ -596,23 +559,13 @@ func (aH *APIHandler) RegisterRoutes(router *mux.Router, am *middleware.AuthZ) {
router.HandleFunc("/api/v1/disks", am.ViewAccess(aH.getDisks)).Methods(http.MethodGet)
// === Preference APIs ===
router.HandleFunc("/api/v1/user/preferences", am.ViewAccess(aH.Signoz.Handlers.Preference.GetAllUser)).Methods(http.MethodGet)
router.HandleFunc("/api/v1/user/preferences/{preferenceId}", am.ViewAccess(aH.Signoz.Handlers.Preference.GetUser)).Methods(http.MethodGet)
router.HandleFunc("/api/v1/user/preferences/{preferenceId}", am.ViewAccess(aH.Signoz.Handlers.Preference.UpdateUser)).Methods(http.MethodPut)
router.HandleFunc("/api/v1/org/preferences", am.AdminAccess(aH.Signoz.Handlers.Preference.GetAllOrg)).Methods(http.MethodGet)
router.HandleFunc("/api/v1/org/preferences/{preferenceId}", am.AdminAccess(aH.Signoz.Handlers.Preference.GetOrg)).Methods(http.MethodGet)
router.HandleFunc("/api/v1/org/preferences/{preferenceId}", am.AdminAccess(aH.Signoz.Handlers.Preference.UpdateOrg)).Methods(http.MethodPut)
// user actions
router.HandleFunc("/api/v1/user/preferences", am.ViewAccess(aH.getAllUserPreferences)).Methods(http.MethodGet)
router.HandleFunc("/api/v1/user/preferences/{preferenceId}", am.ViewAccess(aH.getUserPreference)).Methods(http.MethodGet)
router.HandleFunc("/api/v1/user/preferences/{preferenceId}", am.ViewAccess(aH.updateUserPreference)).Methods(http.MethodPut)
// org actions
router.HandleFunc("/api/v1/org/preferences", am.AdminAccess(aH.getAllOrgPreferences)).Methods(http.MethodGet)
router.HandleFunc("/api/v1/org/preferences/{preferenceId}", am.AdminAccess(aH.getOrgPreference)).Methods(http.MethodGet)
router.HandleFunc("/api/v1/org/preferences/{preferenceId}", am.AdminAccess(aH.updateOrgPreference)).Methods(http.MethodPut)
// === Authentication APIs ===
router.HandleFunc("/api/v1/invite", am.AdminAccess(aH.inviteUser)).Methods(http.MethodPost)
router.HandleFunc("/api/v1/invite/bulk", am.AdminAccess(aH.inviteUsers)).Methods(http.MethodPost)
router.HandleFunc("/api/v1/invite/{token}", am.OpenAccess(aH.getInvite)).Methods(http.MethodGet)
@@ -633,9 +586,8 @@ func (aH *APIHandler) RegisterRoutes(router *mux.Router, am *middleware.AuthZ) {
router.HandleFunc("/api/v1/orgUsers/{id}", am.AdminAccess(aH.getOrgUsers)).Methods(http.MethodGet)
router.HandleFunc("/api/v2/orgs", am.AdminAccess(aH.getOrgs)).Methods(http.MethodGet)
router.HandleFunc("/api/v2/orgs/me", am.AdminAccess(aH.getOrg)).Methods(http.MethodGet)
router.HandleFunc("/api/v2/orgs/me", am.AdminAccess(aH.updateOrg)).Methods(http.MethodPut)
router.HandleFunc("/api/v2/orgs/me", am.AdminAccess(aH.Signoz.Handlers.Organization.Get)).Methods(http.MethodGet)
router.HandleFunc("/api/v2/orgs/me", am.AdminAccess(aH.Signoz.Handlers.Organization.Update)).Methods(http.MethodPut)
router.HandleFunc("/api/v1/getResetPasswordToken/{id}", am.AdminAccess(aH.getResetPasswordToken)).Methods(http.MethodGet)
router.HandleFunc("/api/v1/resetPassword", am.OpenAccess(aH.resetPassword)).Methods(http.MethodPost)
@@ -1709,7 +1661,7 @@ func (aH *APIHandler) getServicesTopLevelOps(w http.ResponseWriter, r *http.Requ
end = time.Unix(0, endEpochInt)
}
result, apiErr := aH.reader.GetTopLevelOperations(r.Context(), aH.skipConfig, start, end, services)
result, apiErr := aH.reader.GetTopLevelOperations(r.Context(), start, end, services)
if apiErr != nil {
RespondError(w, apiErr, nil)
return
@@ -1725,7 +1677,7 @@ func (aH *APIHandler) getServices(w http.ResponseWriter, r *http.Request) {
return
}
result, apiErr := aH.reader.GetServices(r.Context(), query, aH.skipConfig)
result, apiErr := aH.reader.GetServices(r.Context(), query)
if apiErr != nil && aH.HandleError(w, apiErr.Err, http.StatusInternalServerError) {
return
}
@@ -2064,7 +2016,7 @@ func (aH *APIHandler) inviteUsers(w http.ResponseWriter, r *http.Request) {
func (aH *APIHandler) getInvite(w http.ResponseWriter, r *http.Request) {
token := mux.Vars(r)["token"]
resp, err := auth.GetInvite(context.Background(), token, aH.OrganizationModule)
resp, err := auth.GetInvite(context.Background(), token, aH.Signoz.Modules.Organization)
if err != nil {
RespondError(w, &model.ApiError{Err: err, Typ: model.ErrorNotFound}, nil)
return
@@ -2105,7 +2057,7 @@ func (aH *APIHandler) listPendingInvites(w http.ResponseWriter, r *http.Request)
if err != nil {
render.Error(w, errorsV2.Newf(errorsV2.TypeInvalidInput, errorsV2.CodeInvalidInput, "invalid org_id in the invite"))
}
org, err := aH.OrganizationModule.Get(ctx, orgID)
org, err := aH.Signoz.Modules.Organization.Get(ctx, orgID)
if err != nil {
render.Error(w, errorsV2.Newf(errorsV2.TypeInternal, errorsV2.CodeInternal, err.Error()))
}
@@ -2132,7 +2084,7 @@ func (aH *APIHandler) registerUser(w http.ResponseWriter, r *http.Request) {
return
}
_, apiErr := auth.Register(context.Background(), req, aH.Signoz.Alertmanager, aH.OrganizationModule)
_, apiErr := auth.Register(context.Background(), req, aH.Signoz.Alertmanager, aH.Signoz.Modules.Organization)
if apiErr != nil {
RespondError(w, apiErr, nil)
return
@@ -2391,18 +2343,6 @@ func (aH *APIHandler) editRole(w http.ResponseWriter, r *http.Request) {
aH.WriteJSON(w, r, map[string]string{"data": "user group updated successfully"})
}
func (aH *APIHandler) getOrgs(w http.ResponseWriter, r *http.Request) {
aH.OrganizationAPI.GetAll(w, r)
}
func (aH *APIHandler) getOrg(w http.ResponseWriter, r *http.Request) {
aH.OrganizationAPI.Get(w, r)
}
func (aH *APIHandler) updateOrg(w http.ResponseWriter, r *http.Request) {
aH.OrganizationAPI.Update(w, r)
}
func (aH *APIHandler) getOrgUsers(w http.ResponseWriter, r *http.Request) {
id := mux.Vars(r)["id"]
users, apiErr := dao.DB().GetUsersByOrg(context.Background(), id)
@@ -3437,44 +3377,6 @@ func (aH *APIHandler) getProducerConsumerEval(
aH.Respond(w, resp)
}
// Preferences
func (aH *APIHandler) getUserPreference(
w http.ResponseWriter, r *http.Request,
) {
aH.Preference.GetUserPreference(w, r)
}
func (aH *APIHandler) updateUserPreference(
w http.ResponseWriter, r *http.Request,
) {
aH.Preference.UpdateUserPreference(w, r)
}
func (aH *APIHandler) getAllUserPreferences(
w http.ResponseWriter, r *http.Request,
) {
aH.Preference.GetAllUserPreferences(w, r)
}
func (aH *APIHandler) getOrgPreference(
w http.ResponseWriter, r *http.Request,
) {
aH.Preference.GetOrgPreference(w, r)
}
func (aH *APIHandler) updateOrgPreference(
w http.ResponseWriter, r *http.Request,
) {
aH.Preference.UpdateOrgPreference(w, r)
}
func (aH *APIHandler) getAllOrgPreferences(
w http.ResponseWriter, r *http.Request,
) {
aH.Preference.GetAllOrgPreferences(w, r)
}
// RegisterIntegrationRoutes Registers all Integrations
func (aH *APIHandler) RegisterIntegrationRoutes(router *mux.Router, am *middleware.AuthZ) {
subRouter := router.PathPrefix("/api/v1/integrations").Subrouter()
@@ -4700,7 +4602,6 @@ func (aH *APIHandler) updateSavedView(w http.ResponseWriter, r *http.Request) {
}
func (aH *APIHandler) deleteSavedView(w http.ResponseWriter, r *http.Request) {
viewID := mux.Vars(r)["viewId"]
viewUUID, err := valuer.NewUUID(viewID)
if err != nil {
@@ -4919,11 +4820,7 @@ func (aH *APIHandler) queryRangeV3(ctx context.Context, queryRangeParams *v3.Que
RespondError(w, apiErrObj, errQuriesByName)
return
}
if aH.UseTraceNewSchema {
tracesV4.Enrich(queryRangeParams, spanKeys)
} else {
tracesV3.Enrich(queryRangeParams, spanKeys)
}
tracesV4.Enrich(queryRangeParams, spanKeys)
}
@@ -5282,88 +5179,7 @@ func (aH *APIHandler) liveTailLogsV2(w http.ResponseWriter, r *http.Request) {
}
func (aH *APIHandler) liveTailLogs(w http.ResponseWriter, r *http.Request) {
if aH.UseLogsNewSchema {
aH.liveTailLogsV2(w, r)
return
}
// get the param from url and add it to body
stringReader := strings.NewReader(r.URL.Query().Get("q"))
r.Body = io.NopCloser(stringReader)
queryRangeParams, apiErrorObj := ParseQueryRangeParams(r)
if apiErrorObj != nil {
zap.L().Error(apiErrorObj.Err.Error())
RespondError(w, apiErrorObj, nil)
return
}
var err error
var queryString string
switch queryRangeParams.CompositeQuery.QueryType {
case v3.QueryTypeBuilder:
// check if any enrichment is required for logs if yes then enrich them
if logsv3.EnrichmentRequired(queryRangeParams) {
logsFields, err := aH.reader.GetLogFields(r.Context())
if err != nil {
apiErrObj := &model.ApiError{Typ: model.ErrorInternal, Err: err}
RespondError(w, apiErrObj, nil)
return
}
// get the fields if any logs query is present
fields := model.GetLogFieldsV3(r.Context(), queryRangeParams, logsFields)
logsv3.Enrich(queryRangeParams, fields)
}
queryString, err = aH.queryBuilder.PrepareLiveTailQuery(queryRangeParams)
if err != nil {
RespondError(w, &model.ApiError{Typ: model.ErrorBadData, Err: err}, nil)
return
}
default:
err = fmt.Errorf("invalid query type")
RespondError(w, &model.ApiError{Typ: model.ErrorBadData, Err: err}, nil)
return
}
// create the client
client := &model.LogsLiveTailClient{Name: r.RemoteAddr, Logs: make(chan *model.SignozLog, 1000), Done: make(chan *bool), Error: make(chan error)}
go aH.reader.LiveTailLogsV3(r.Context(), queryString, uint64(queryRangeParams.Start), "", client)
w.Header().Set("Connection", "keep-alive")
w.Header().Set("Content-Type", "text/event-stream")
w.Header().Set("Cache-Control", "no-cache")
w.Header().Set("Access-Control-Allow-Origin", "*")
w.WriteHeader(200)
flusher, ok := w.(http.Flusher)
if !ok {
err := model.ApiError{Typ: model.ErrorStreamingNotSupported, Err: nil}
RespondError(w, &err, "streaming is not supported")
return
}
// flush the headers
flusher.Flush()
for {
select {
case log := <-client.Logs:
var buf bytes.Buffer
enc := json.NewEncoder(&buf)
enc.Encode(log)
fmt.Fprintf(w, "data: %v\n\n", buf.String())
flusher.Flush()
case <-client.Done:
zap.L().Debug("done!")
return
case err := <-client.Error:
zap.L().Error("error occurred", zap.Error(err))
fmt.Fprintf(w, "event: error\ndata: %v\n\n", err.Error())
flusher.Flush()
return
}
}
aH.liveTailLogsV2(w, r)
}
func (aH *APIHandler) getMetricMetadata(w http.ResponseWriter, r *http.Request) {
@@ -5404,11 +5220,7 @@ func (aH *APIHandler) queryRangeV4(ctx context.Context, queryRangeParams *v3.Que
RespondError(w, apiErrObj, errQuriesByName)
return
}
if aH.UseTraceNewSchema {
tracesV4.Enrich(queryRangeParams, spanKeys)
} else {
tracesV3.Enrich(queryRangeParams, spanKeys)
}
tracesV4.Enrich(queryRangeParams, spanKeys)
}
// WARN: Only works for AND operator in traces query
@@ -5615,3 +5427,210 @@ func (aH *APIHandler) getDomainInfo(w http.ResponseWriter, r *http.Request) {
}
aH.Respond(w, resp)
}
// RegisterTraceFunnelsRoutes adds trace funnels routes
func (aH *APIHandler) RegisterTraceFunnelsRoutes(router *mux.Router, am *middleware.AuthZ) {
// Main trace funnels router
traceFunnelsRouter := router.PathPrefix("/api/v1/trace-funnels").Subrouter()
// API endpoints
traceFunnelsRouter.HandleFunc("/new",
am.ViewAccess(aH.Signoz.Handlers.TraceFunnel.New)).
Methods(http.MethodPost)
traceFunnelsRouter.HandleFunc("/list",
am.ViewAccess(aH.Signoz.Handlers.TraceFunnel.List)).
Methods(http.MethodGet)
traceFunnelsRouter.HandleFunc("/steps/update",
am.ViewAccess(aH.Signoz.Handlers.TraceFunnel.UpdateSteps)).
Methods(http.MethodPut)
traceFunnelsRouter.HandleFunc("/{funnel_id}",
am.ViewAccess(aH.Signoz.Handlers.TraceFunnel.Get)).
Methods(http.MethodGet)
traceFunnelsRouter.HandleFunc("/{funnel_id}",
am.ViewAccess(aH.Signoz.Handlers.TraceFunnel.Delete)).
Methods(http.MethodDelete)
traceFunnelsRouter.HandleFunc("/{funnel_id}",
am.ViewAccess(aH.Signoz.Handlers.TraceFunnel.UpdateFunnel)).
Methods(http.MethodPut)
traceFunnelsRouter.HandleFunc("/save",
am.ViewAccess(aH.Signoz.Handlers.TraceFunnel.Save)).
Methods(http.MethodPost)
// Analytics endpoints
traceFunnelsRouter.HandleFunc("/{funnel_id}/analytics/validate", aH.handleValidateTraces).Methods("POST")
traceFunnelsRouter.HandleFunc("/{funnel_id}/analytics/overview", aH.handleFunnelAnalytics).Methods("POST")
traceFunnelsRouter.HandleFunc("/{funnel_id}/analytics/steps", aH.handleStepAnalytics).Methods("POST")
traceFunnelsRouter.HandleFunc("/{funnel_id}/analytics/slow-traces", aH.handleFunnelSlowTraces).Methods("POST")
traceFunnelsRouter.HandleFunc("/{funnel_id}/analytics/error-traces", aH.handleFunnelErrorTraces).Methods("POST")
}
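As a rough sketch of calling one of the analytics endpoints registered above: the base URL, funnel ID and JSON keys below are assumptions (the exact keys depend on the JSON tags of traceFunnels.TimeRange, which this diff does not show), and authentication headers required by ViewAccess are omitted.
// Hypothetical client call; URL, funnel ID, JSON keys and timestamps are assumptions.
func exampleValidateCall(ctx context.Context) error {
	body := bytes.NewBufferString(`{"start_time": 1714286400000000000, "end_time": 1714290000000000000}`)
	req, err := http.NewRequestWithContext(ctx, http.MethodPost,
		"http://localhost:8080/api/v1/trace-funnels/<funnel-id>/analytics/validate", body)
	if err != nil {
		return err
	}
	req.Header.Set("Content-Type", "application/json")
	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		return err
	}
	defer resp.Body.Close()
	return nil
}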
func (aH *APIHandler) handleValidateTraces(w http.ResponseWriter, r *http.Request) {
vars := mux.Vars(r)
funnelID := vars["funnel_id"]
funnel, err := aH.Signoz.Modules.TraceFunnel.Get(r.Context(), funnelID)
if err != nil {
RespondError(w, &model.ApiError{Typ: model.ErrorNotFound, Err: fmt.Errorf("funnel not found: %v", err)}, nil)
return
}
var timeRange traceFunnels.TimeRange
if err := json.NewDecoder(r.Body).Decode(&timeRange); err != nil {
RespondError(w, &model.ApiError{Typ: model.ErrorBadData, Err: fmt.Errorf("error decoding time range: %v", err)}, nil)
return
}
if len(funnel.Steps) < 2 {
RespondError(w, &model.ApiError{Typ: model.ErrorBadData, Err: fmt.Errorf("funnel must have at least 2 steps")}, nil)
return
}
chq, err := tracefunnels.ValidateTraces(funnel, timeRange)
if err != nil {
RespondError(w, &model.ApiError{Typ: model.ErrorInternal, Err: fmt.Errorf("error building clickhouse query: %v", err)}, nil)
return
}
results, err := aH.reader.GetListResultV3(r.Context(), chq.Query)
if err != nil {
RespondError(w, &model.ApiError{Typ: model.ErrorInternal, Err: fmt.Errorf("error converting clickhouse results to list: %v", err)}, nil)
return
}
aH.Respond(w, results)
}
func (aH *APIHandler) handleFunnelAnalytics(w http.ResponseWriter, r *http.Request) {
vars := mux.Vars(r)
funnelID := vars["funnel_id"]
funnel, err := aH.Signoz.Modules.TraceFunnel.Get(r.Context(), funnelID)
if err != nil {
RespondError(w, &model.ApiError{Typ: model.ErrorNotFound, Err: fmt.Errorf("funnel not found: %v", err)}, nil)
return
}
var timeRange traceFunnels.TimeRange
if err := json.NewDecoder(r.Body).Decode(&timeRange); err != nil {
RespondError(w, &model.ApiError{Typ: model.ErrorBadData, Err: fmt.Errorf("error decoding time range: %v", err)}, nil)
return
}
chq, err := tracefunnels.ValidateTracesWithLatency(funnel, timeRange)
if err != nil {
RespondError(w, &model.ApiError{Typ: model.ErrorInternal, Err: fmt.Errorf("error building clickhouse query: %v", err)}, nil)
return
}
results, err := aH.reader.GetListResultV3(r.Context(), chq.Query)
if err != nil {
RespondError(w, &model.ApiError{Typ: model.ErrorInternal, Err: fmt.Errorf("error converting clickhouse results to list: %v", err)}, nil)
return
}
aH.Respond(w, results)
}
func (aH *APIHandler) handleStepAnalytics(w http.ResponseWriter, r *http.Request) {
vars := mux.Vars(r)
funnelID := vars["funnel_id"]
funnel, err := aH.Signoz.Modules.TraceFunnel.Get(r.Context(), funnelID)
if err != nil {
RespondError(w, &model.ApiError{Typ: model.ErrorNotFound, Err: fmt.Errorf("funnel not found: %v", err)}, nil)
return
}
var timeRange traceFunnels.TimeRange
if err := json.NewDecoder(r.Body).Decode(&timeRange); err != nil {
RespondError(w, &model.ApiError{Typ: model.ErrorBadData, Err: fmt.Errorf("error decoding time range: %v", err)}, nil)
return
}
chq, err := tracefunnels.GetStepAnalytics(funnel, timeRange)
if err != nil {
RespondError(w, &model.ApiError{Typ: model.ErrorInternal, Err: fmt.Errorf("error building clickhouse query: %v", err)}, nil)
return
}
results, err := aH.reader.GetListResultV3(r.Context(), chq.Query)
if err != nil {
RespondError(w, &model.ApiError{Typ: model.ErrorInternal, Err: fmt.Errorf("error converting clickhouse results to list: %v", err)}, nil)
return
}
aH.Respond(w, results)
}
// handleFunnelSlowTraces handles requests for slow traces in a funnel
func (aH *APIHandler) handleFunnelSlowTraces(w http.ResponseWriter, r *http.Request) {
aH.handleTracesWithLatency(w, r, false)
}
// handleFunnelErrorTraces handles requests for error traces in a funnel
func (aH *APIHandler) handleFunnelErrorTraces(w http.ResponseWriter, r *http.Request) {
aH.handleTracesWithLatency(w, r, true)
}
// handleTracesWithLatency handles both slow and error traces with common logic
func (aH *APIHandler) handleTracesWithLatency(w http.ResponseWriter, r *http.Request, isError bool) {
funnel, req, err := aH.validateTracesRequest(r)
if err != nil {
RespondError(w, &model.ApiError{Typ: model.ErrorBadData, Err: err}, nil)
return
}
if err := aH.validateSteps(funnel, req.StepAOrder, req.StepBOrder); err != nil {
RespondError(w, &model.ApiError{Typ: model.ErrorBadData, Err: err}, nil)
return
}
chq, err := tracefunnels.GetSlowestTraces(funnel, req.StepAOrder, req.StepBOrder, req.TimeRange, isError)
if err != nil {
RespondError(w, &model.ApiError{Typ: model.ErrorInternal, Err: fmt.Errorf("error building clickhouse query: %v", err)}, nil)
return
}
results, err := aH.reader.GetListResultV3(r.Context(), chq.Query)
if err != nil {
RespondError(w, &model.ApiError{Typ: model.ErrorInternal, Err: fmt.Errorf("error converting clickhouse results to list: %v", err)}, nil)
return
}
aH.Respond(w, results)
}
// validateTracesRequest validates and extracts the request parameters
func (aH *APIHandler) validateTracesRequest(r *http.Request) (*traceFunnels.Funnel, *traceFunnels.StepTransitionRequest, error) {
vars := mux.Vars(r)
funnelID := vars["funnel_id"]
funnel, err := aH.Signoz.Modules.TraceFunnel.Get(r.Context(), funnelID)
if err != nil {
return nil, nil, fmt.Errorf("funnel not found: %v", err)
}
var req traceFunnels.StepTransitionRequest
if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
return nil, nil, fmt.Errorf("invalid request body: %v", err)
}
return funnel, &req, nil
}
// validateSteps checks if the requested steps exist in the funnel
func (aH *APIHandler) validateSteps(funnel *traceFunnels.Funnel, stepAOrder, stepBOrder int64) error {
stepAExists, stepBExists := false, false
for _, step := range funnel.Steps {
if step.Order == stepAOrder {
stepAExists = true
}
if step.Order == stepBOrder {
stepBExists = true
}
}
if !stepAExists || !stepBExists {
return fmt.Errorf("one or both steps not found. Step A Order: %d, Step B Order: %d", stepAOrder, stepBOrder)
}
return nil
}
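The slow- and error-trace handlers above decode a traceFunnels.StepTransitionRequest from the request body; the field names in the sketch below come from the handler code, while the numeric values are placeholders.
// Placeholder values; StepAOrder and StepBOrder refer to funnel step orders.
func exampleStepTransitionRequest() traceFunnels.StepTransitionRequest {
	return traceFunnels.StepTransitionRequest{
		StepAOrder: 1,
		StepBOrder: 2,
		TimeRange: traceFunnels.TimeRange{
			StartTime: 1714286400000000000,
			EndTime:   1714290000000000000,
		},
	}
}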


@@ -104,7 +104,7 @@ func (agent *Agent) updateAgentDescription(newStatus *protobufs.AgentToServer) (
agent.Status = newStatus
agentDescrChanged = true
} else {
// Not a new Agent. Update the Status.
// Not a new Agent. UpdateSteps the Status.
agent.Status.SequenceNum = newStatus.SequenceNum
// Check what's changed in the AgentDescription.
@@ -127,7 +127,7 @@ func (agent *Agent) updateAgentDescription(newStatus *protobufs.AgentToServer) (
agentDescrChanged = false
}
// Update remote config status if it is included and is different from what we have.
// UpdateSteps remote config status if it is included and is different from what we have.
if newStatus.RemoteConfigStatus != nil &&
!proto.Equal(agent.Status.RemoteConfigStatus, newStatus.RemoteConfigStatus) {
agent.Status.RemoteConfigStatus = newStatus.RemoteConfigStatus
@@ -164,7 +164,7 @@ func (agent *Agent) updateHealth(newStatus *protobufs.AgentToServer) {
}
func (agent *Agent) updateRemoteConfigStatus(newStatus *protobufs.AgentToServer) {
// Update remote config status if it is included and is different from what we have.
// UpdateSteps remote config status if it is included and is different from what we have.
if newStatus.RemoteConfigStatus != nil {
agent.Status.RemoteConfigStatus = newStatus.RemoteConfigStatus
}
@@ -184,7 +184,7 @@ func (agent *Agent) updateStatusField(newStatus *protobufs.AgentToServer) (agent
}
func (agent *Agent) updateEffectiveConfig(newStatus *protobufs.AgentToServer, response *protobufs.ServerToAgent) {
// Update effective config if provided.
// UpdateSteps effective config if provided.
if newStatus.EffectiveConfig != nil {
if newStatus.EffectiveConfig.ConfigMap != nil {
agent.Status.EffectiveConfig = newStatus.EffectiveConfig


@@ -6,10 +6,8 @@ import (
"strings"
"sync"
logsV3 "github.com/SigNoz/signoz/pkg/query-service/app/logs/v3"
logsV4 "github.com/SigNoz/signoz/pkg/query-service/app/logs/v4"
metricsV3 "github.com/SigNoz/signoz/pkg/query-service/app/metrics/v3"
tracesV3 "github.com/SigNoz/signoz/pkg/query-service/app/traces/v3"
tracesV4 "github.com/SigNoz/signoz/pkg/query-service/app/traces/v4"
"github.com/SigNoz/signoz/pkg/query-service/common"
"github.com/SigNoz/signoz/pkg/query-service/constants"
@@ -19,19 +17,15 @@ import (
"go.uber.org/zap"
)
func prepareLogsQuery(_ context.Context,
useLogsNewSchema bool,
func prepareLogsQuery(
_ context.Context,
start,
end int64,
builderQuery *v3.BuilderQuery,
params *v3.QueryRangeParamsV3,
) (string, error) {
query := ""
logsQueryBuilder := logsV3.PrepareLogsQuery
if useLogsNewSchema {
logsQueryBuilder = logsV4.PrepareLogsQuery
}
logsQueryBuilder := logsV4.PrepareLogsQuery
if params == nil || builderQuery == nil {
return query, fmt.Errorf("params and builderQuery cannot be nil")
@@ -102,7 +96,7 @@ func (q *querier) runBuilderQuery(
var err error
if _, ok := cacheKeys[queryName]; !ok || params.NoCache {
zap.L().Info("skipping cache for logs query", zap.String("queryName", queryName), zap.Int64("start", start), zap.Int64("end", end), zap.Int64("step", builderQuery.StepInterval), zap.Bool("noCache", params.NoCache), zap.String("cacheKey", cacheKeys[queryName]))
query, err = prepareLogsQuery(ctx, q.UseLogsNewSchema, start, end, builderQuery, params)
query, err = prepareLogsQuery(ctx, start, end, builderQuery, params)
if err != nil {
ch <- channelResult{Err: err, Name: queryName, Query: query, Series: nil}
return
@@ -117,7 +111,7 @@ func (q *querier) runBuilderQuery(
missedSeries := make([]querycache.CachedSeriesData, 0)
filteredMissedSeries := make([]querycache.CachedSeriesData, 0)
for _, miss := range misses {
query, err = prepareLogsQuery(ctx, q.UseLogsNewSchema, miss.Start, miss.End, builderQuery, params)
query, err = prepareLogsQuery(ctx, miss.Start, miss.End, builderQuery, params)
if err != nil {
ch <- channelResult{Err: err, Name: queryName, Query: query, Series: nil}
return
@@ -169,11 +163,7 @@ func (q *querier) runBuilderQuery(
}
if builderQuery.DataSource == v3.DataSourceTraces {
tracesQueryBuilder := tracesV3.PrepareTracesQuery
if q.UseTraceNewSchema {
tracesQueryBuilder = tracesV4.PrepareTracesQuery
}
tracesQueryBuilder := tracesV4.PrepareTracesQuery
var query string
var err error


@@ -6,11 +6,9 @@ import (
"sync"
"time"
logsV3 "github.com/SigNoz/signoz/pkg/query-service/app/logs/v3"
logsV4 "github.com/SigNoz/signoz/pkg/query-service/app/logs/v4"
metricsV3 "github.com/SigNoz/signoz/pkg/query-service/app/metrics/v3"
"github.com/SigNoz/signoz/pkg/query-service/app/queryBuilder"
tracesV3 "github.com/SigNoz/signoz/pkg/query-service/app/traces/v3"
tracesV4 "github.com/SigNoz/signoz/pkg/query-service/app/traces/v4"
"github.com/SigNoz/signoz/pkg/query-service/common"
"github.com/SigNoz/signoz/pkg/query-service/constants"
@@ -52,9 +50,6 @@ type querier struct {
timeRanges [][]int
returnedSeries []*v3.Series
returnedErr error
UseLogsNewSchema bool
UseTraceNewSchema bool
}
type QuerierOptions struct {
@@ -64,22 +59,14 @@ type QuerierOptions struct {
FluxInterval time.Duration
// used for testing
TestingMode bool
ReturnedSeries []*v3.Series
ReturnedErr error
UseLogsNewSchema bool
UseTraceNewSchema bool
TestingMode bool
ReturnedSeries []*v3.Series
ReturnedErr error
}
func NewQuerier(opts QuerierOptions) interfaces.Querier {
logsQueryBuilder := logsV3.PrepareLogsQuery
if opts.UseLogsNewSchema {
logsQueryBuilder = logsV4.PrepareLogsQuery
}
tracesQueryBuilder := tracesV3.PrepareTracesQuery
if opts.UseTraceNewSchema {
tracesQueryBuilder = tracesV4.PrepareTracesQuery
}
logsQueryBuilder := logsV4.PrepareLogsQuery
tracesQueryBuilder := tracesV4.PrepareTracesQuery
qc := querycache.NewQueryCache(querycache.WithCache(opts.Cache), querycache.WithFluxInterval(opts.FluxInterval))
@@ -96,11 +83,9 @@ func NewQuerier(opts QuerierOptions) interfaces.Querier {
BuildMetricQuery: metricsV3.PrepareMetricQuery,
}),
testingMode: opts.TestingMode,
returnedSeries: opts.ReturnedSeries,
returnedErr: opts.ReturnedErr,
UseLogsNewSchema: opts.UseLogsNewSchema,
UseTraceNewSchema: opts.UseTraceNewSchema,
testingMode: opts.TestingMode,
returnedSeries: opts.ReturnedSeries,
returnedErr: opts.ReturnedErr,
}
}
@@ -445,11 +430,6 @@ func (q *querier) runBuilderListQueries(ctx context.Context, params *v3.QueryRan
len(params.CompositeQuery.BuilderQueries) == 1 &&
params.CompositeQuery.PanelType != v3.PanelTypeTrace {
for _, v := range params.CompositeQuery.BuilderQueries {
if (v.DataSource == v3.DataSourceLogs && !q.UseLogsNewSchema) ||
(v.DataSource == v3.DataSourceTraces && !q.UseTraceNewSchema) {
break
}
// only allow logs queries with timestamp ordering desc
// TODO(nitya): allow for timestamp asc
if (v.DataSource == v3.DataSourceLogs || v.DataSource == v3.DataSourceTraces) &&


@@ -1370,8 +1370,6 @@ func Test_querier_runWindowBasedListQuery(t *testing.T) {
telemetryStore,
prometheustest.New(instrumentationtest.New().Logger(), prometheus.Config{}),
"",
true,
true,
time.Duration(time.Second),
nil,
)
@@ -1384,7 +1382,7 @@ func Test_querier_runWindowBasedListQuery(t *testing.T) {
},
),
}
// Update query parameters
// UpdateSteps query parameters
params.Start = tc.queryParams.start
params.End = tc.queryParams.end
params.CompositeQuery.BuilderQueries["A"].Limit = tc.queryParams.limit


@@ -6,11 +6,9 @@ import (
"strings"
"sync"
logsV3 "github.com/SigNoz/signoz/pkg/query-service/app/logs/v3"
logsV4 "github.com/SigNoz/signoz/pkg/query-service/app/logs/v4"
metricsV3 "github.com/SigNoz/signoz/pkg/query-service/app/metrics/v3"
metricsV4 "github.com/SigNoz/signoz/pkg/query-service/app/metrics/v4"
tracesV3 "github.com/SigNoz/signoz/pkg/query-service/app/traces/v3"
tracesV4 "github.com/SigNoz/signoz/pkg/query-service/app/traces/v4"
"github.com/SigNoz/signoz/pkg/query-service/common"
"github.com/SigNoz/signoz/pkg/query-service/constants"
@@ -19,17 +17,14 @@ import (
"go.uber.org/zap"
)
func prepareLogsQuery(_ context.Context,
useLogsNewSchema bool,
func prepareLogsQuery(
_ context.Context,
start,
end int64,
builderQuery *v3.BuilderQuery,
params *v3.QueryRangeParamsV3,
) (string, error) {
logsQueryBuilder := logsV3.PrepareLogsQuery
if useLogsNewSchema {
logsQueryBuilder = logsV4.PrepareLogsQuery
}
logsQueryBuilder := logsV4.PrepareLogsQuery
query := ""
if params == nil || builderQuery == nil {
@@ -102,7 +97,7 @@ func (q *querier) runBuilderQuery(
var err error
if _, ok := cacheKeys[queryName]; !ok || params.NoCache {
zap.L().Info("skipping cache for logs query", zap.String("queryName", queryName), zap.Int64("start", params.Start), zap.Int64("end", params.End), zap.Int64("step", params.Step), zap.Bool("noCache", params.NoCache), zap.String("cacheKey", cacheKeys[queryName]))
query, err = prepareLogsQuery(ctx, q.UseLogsNewSchema, start, end, builderQuery, params)
query, err = prepareLogsQuery(ctx, start, end, builderQuery, params)
if err != nil {
ch <- channelResult{Err: err, Name: queryName, Query: query, Series: nil}
return
@@ -116,7 +111,7 @@ func (q *querier) runBuilderQuery(
missedSeries := make([]querycache.CachedSeriesData, 0)
filteredMissedSeries := make([]querycache.CachedSeriesData, 0)
for _, miss := range misses {
query, err = prepareLogsQuery(ctx, q.UseLogsNewSchema, miss.Start, miss.End, builderQuery, params)
query, err = prepareLogsQuery(ctx, miss.Start, miss.End, builderQuery, params)
if err != nil {
ch <- channelResult{Err: err, Name: queryName, Query: query, Series: nil}
return
@@ -169,11 +164,7 @@ func (q *querier) runBuilderQuery(
}
if builderQuery.DataSource == v3.DataSourceTraces {
tracesQueryBuilder := tracesV3.PrepareTracesQuery
if q.UseTraceNewSchema {
tracesQueryBuilder = tracesV4.PrepareTracesQuery
}
tracesQueryBuilder := tracesV4.PrepareTracesQuery
var query string
var err error


@@ -6,11 +6,9 @@ import (
"sync"
"time"
logsV3 "github.com/SigNoz/signoz/pkg/query-service/app/logs/v3"
logsV4 "github.com/SigNoz/signoz/pkg/query-service/app/logs/v4"
metricsV4 "github.com/SigNoz/signoz/pkg/query-service/app/metrics/v4"
"github.com/SigNoz/signoz/pkg/query-service/app/queryBuilder"
tracesV3 "github.com/SigNoz/signoz/pkg/query-service/app/traces/v3"
tracesV4 "github.com/SigNoz/signoz/pkg/query-service/app/traces/v4"
"github.com/SigNoz/signoz/pkg/query-service/common"
"github.com/SigNoz/signoz/pkg/query-service/constants"
@@ -49,11 +47,9 @@ type querier struct {
testingMode bool
queriesExecuted []string
// tuple of start and end time in milliseconds
timeRanges [][]int
returnedSeries []*v3.Series
returnedErr error
UseLogsNewSchema bool
UseTraceNewSchema bool
timeRanges [][]int
returnedSeries []*v3.Series
returnedErr error
}
type QuerierOptions struct {
@@ -63,23 +59,14 @@ type QuerierOptions struct {
FluxInterval time.Duration
// used for testing
TestingMode bool
ReturnedSeries []*v3.Series
ReturnedErr error
UseLogsNewSchema bool
UseTraceNewSchema bool
TestingMode bool
ReturnedSeries []*v3.Series
ReturnedErr error
}
func NewQuerier(opts QuerierOptions) interfaces.Querier {
logsQueryBuilder := logsV3.PrepareLogsQuery
if opts.UseLogsNewSchema {
logsQueryBuilder = logsV4.PrepareLogsQuery
}
tracesQueryBuilder := tracesV3.PrepareTracesQuery
if opts.UseTraceNewSchema {
tracesQueryBuilder = tracesV4.PrepareTracesQuery
}
logsQueryBuilder := logsV4.PrepareLogsQuery
tracesQueryBuilder := tracesV4.PrepareTracesQuery
qc := querycache.NewQueryCache(querycache.WithCache(opts.Cache), querycache.WithFluxInterval(opts.FluxInterval))
@@ -96,11 +83,9 @@ func NewQuerier(opts QuerierOptions) interfaces.Querier {
BuildMetricQuery: metricsV4.PrepareMetricQuery,
}),
testingMode: opts.TestingMode,
returnedSeries: opts.ReturnedSeries,
returnedErr: opts.ReturnedErr,
UseLogsNewSchema: opts.UseLogsNewSchema,
UseTraceNewSchema: opts.UseTraceNewSchema,
testingMode: opts.TestingMode,
returnedSeries: opts.ReturnedSeries,
returnedErr: opts.ReturnedErr,
}
}
@@ -446,11 +431,6 @@ func (q *querier) runBuilderListQueries(ctx context.Context, params *v3.QueryRan
len(params.CompositeQuery.BuilderQueries) == 1 &&
params.CompositeQuery.PanelType != v3.PanelTypeTrace {
for _, v := range params.CompositeQuery.BuilderQueries {
if (v.DataSource == v3.DataSourceLogs && !q.UseLogsNewSchema) ||
(v.DataSource == v3.DataSourceTraces && !q.UseTraceNewSchema) {
break
}
// only allow of logs queries with timestamp ordering desc
// TODO(nitya): allow for timestamp asc
if (v.DataSource == v3.DataSourceLogs || v.DataSource == v3.DataSourceTraces) &&

View File

@@ -1424,8 +1424,6 @@ func Test_querier_runWindowBasedListQuery(t *testing.T) {
telemetryStore,
prometheustest.New(instrumentationtest.New().Logger(), prometheus.Config{}),
"",
true,
true,
time.Duration(time.Second),
nil,
)
@@ -1438,7 +1436,7 @@ func Test_querier_runWindowBasedListQuery(t *testing.T) {
},
),
}
// Update query parameters
// UpdateSteps query parameters
params.Start = tc.queryParams.start
params.End = tc.queryParams.end
params.CompositeQuery.BuilderQueries["A"].Limit = tc.queryParams.limit

View File

@@ -140,7 +140,7 @@ func funcEWMA(result *v3.Result, alpha float64) *v3.Result {
}
if !math.IsNaN(point.Value) {
// Update EWMA with the current value
// UpdateSteps EWMA with the current value
ewma = alpha*point.Value + (1-alpha)*ewma
}
// Set the EWMA value for the current point
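The hunk above touches the standard exponential smoothing recurrence, ewma = alpha*value + (1-alpha)*ewma, applied only to non-NaN points. A small self-contained sketch of that recurrence (the seeding rule and the sample values are illustrative, not taken from funcEWMA):

package main

import (
	"fmt"
	"math"
)

// ewmaSeries applies the same recurrence: ewma = alpha*value + (1-alpha)*ewma,
// skipping NaN points so a gap keeps the previous smoothed value instead of
// poisoning it. Seeding with the first non-NaN sample is an illustrative
// choice here, not necessarily what the SigNoz implementation does.
func ewmaSeries(values []float64, alpha float64) []float64 {
	out := make([]float64, len(values))
	ewma := math.NaN()
	for i, v := range values {
		if !math.IsNaN(v) {
			if math.IsNaN(ewma) {
				ewma = v
			} else {
				ewma = alpha*v + (1-alpha)*ewma
			}
		}
		out[i] = ewma
	}
	return out
}

func main() {
	vals := []float64{10, 12, math.NaN(), 11, 30}
	fmt.Println(ewmaSeries(vals, 0.3))
}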

View File

@@ -14,9 +14,6 @@ import (
"github.com/SigNoz/signoz/pkg/alertmanager"
"github.com/SigNoz/signoz/pkg/apis/fields"
"github.com/SigNoz/signoz/pkg/http/middleware"
"github.com/SigNoz/signoz/pkg/modules/organization/implorganization"
"github.com/SigNoz/signoz/pkg/modules/preference"
preferencecore "github.com/SigNoz/signoz/pkg/modules/preference/core"
"github.com/SigNoz/signoz/pkg/prometheus"
"github.com/SigNoz/signoz/pkg/query-service/agentConf"
"github.com/SigNoz/signoz/pkg/query-service/app/clickhouseReader"
@@ -30,7 +27,6 @@ import (
"github.com/SigNoz/signoz/pkg/sqlstore"
"github.com/SigNoz/signoz/pkg/telemetrystore"
"github.com/SigNoz/signoz/pkg/types/authtypes"
"github.com/SigNoz/signoz/pkg/types/preferencetypes"
"github.com/SigNoz/signoz/pkg/web"
"github.com/rs/cors"
"github.com/soheilhy/cmux"
@@ -42,7 +38,6 @@ import (
"github.com/SigNoz/signoz/pkg/query-service/featureManager"
"github.com/SigNoz/signoz/pkg/query-service/healthcheck"
"github.com/SigNoz/signoz/pkg/query-service/interfaces"
"github.com/SigNoz/signoz/pkg/query-service/model"
"github.com/SigNoz/signoz/pkg/query-service/rules"
"github.com/SigNoz/signoz/pkg/query-service/telemetry"
"github.com/SigNoz/signoz/pkg/query-service/utils"
@@ -50,21 +45,14 @@ import (
)
type ServerOptions struct {
Config signoz.Config
PromConfigPath string
SkipTopLvlOpsPath string
HTTPHostPort string
PrivateHostPort string
// alert specific params
DisableRules bool
RuleRepoURL string
Config signoz.Config
HTTPHostPort string
PrivateHostPort string
PreferSpanMetrics bool
CacheConfigPath string
FluxInterval string
FluxIntervalForTraceDetail string
Cluster string
UseLogsNewSchema bool
UseTraceNewSchema bool
SigNoz *signoz.SigNoz
Jwt *authtypes.JWT
}
@@ -120,21 +108,10 @@ func NewServer(serverOptions *ServerOptions) (*Server, error) {
serverOptions.SigNoz.TelemetryStore,
serverOptions.SigNoz.Prometheus,
serverOptions.Cluster,
serverOptions.UseLogsNewSchema,
serverOptions.UseTraceNewSchema,
fluxIntervalForTraceDetail,
serverOptions.SigNoz.Cache,
)
skipConfig := &model.SkipConfig{}
if serverOptions.SkipTopLvlOpsPath != "" {
// read skip config
skipConfig, err = model.ReadSkipConfig(serverOptions.SkipTopLvlOpsPath)
if err != nil {
return nil, err
}
}
var c cache.Cache
if serverOptions.CacheConfigPath != "" {
cacheOpts, err := cache.LoadFromYAMLCacheConfigFile(serverOptions.CacheConfigPath)
@@ -145,13 +122,9 @@ func NewServer(serverOptions *ServerOptions) (*Server, error) {
}
rm, err := makeRulesManager(
serverOptions.RuleRepoURL,
serverOptions.SigNoz.SQLStore.SQLxDB(),
reader,
c,
serverOptions.DisableRules,
serverOptions.UseLogsNewSchema,
serverOptions.UseTraceNewSchema,
serverOptions.SigNoz.SQLStore,
serverOptions.SigNoz.TelemetryStore,
serverOptions.SigNoz.Prometheus,
@@ -183,12 +156,8 @@ func NewServer(serverOptions *ServerOptions) (*Server, error) {
}
telemetry.GetInstance().SetReader(reader)
preferenceAPI := preference.NewAPI(preferencecore.NewPreference(preferencecore.NewStore(serverOptions.SigNoz.SQLStore), preferencetypes.NewDefaultPreferenceMap()))
organizationAPI := implorganization.NewAPI(implorganization.NewModule(implorganization.NewStore(serverOptions.SigNoz.SQLStore)))
organizationModule := implorganization.NewModule(implorganization.NewStore(serverOptions.SigNoz.SQLStore))
apiHandler, err := NewAPIHandler(APIHandlerOpts{
Reader: reader,
SkipConfig: skipConfig,
PreferSpanMetrics: serverOptions.PreferSpanMetrics,
AppDao: dao.DB(),
RuleManager: rm,
@@ -198,23 +167,16 @@ func NewServer(serverOptions *ServerOptions) (*Server, error) {
LogsParsingPipelineController: logParsingPipelineController,
Cache: c,
FluxInterval: fluxInterval,
UseLogsNewSchema: serverOptions.UseLogsNewSchema,
UseTraceNewSchema: serverOptions.UseTraceNewSchema,
JWT: serverOptions.Jwt,
AlertmanagerAPI: alertmanager.NewAPI(serverOptions.SigNoz.Alertmanager),
FieldsAPI: fields.NewAPI(serverOptions.SigNoz.TelemetryStore),
Signoz: serverOptions.SigNoz,
Preference: preferenceAPI,
OrganizationAPI: organizationAPI,
OrganizationModule: organizationModule,
})
if err != nil {
return nil, err
}
s := &Server{
// logger: logger,
// tracer: tracer,
ruleManager: rm,
serverOptions: serverOptions,
unavailableChannel: make(chan healthcheck.Status),
@@ -319,6 +281,7 @@ func (s *Server) createPublicServer(api *APIHandler, web web.Web) (*http.Server,
api.RegisterMessagingQueuesRoutes(r, am)
api.RegisterThirdPartyApiRoutes(r, am)
api.MetricExplorerRoutes(r, am)
api.RegisterTraceFunnelsRoutes(r, am)
c := cors.New(cors.Options{
AllowedOrigins: []string{"*"},
@@ -374,13 +337,7 @@ func (s *Server) initListeners() error {
// Start listening on http and private http port concurrently
func (s *Server) Start(ctx context.Context) error {
// initiate rule manager first
if !s.serverOptions.DisableRules {
s.ruleManager.Start(ctx)
} else {
zap.L().Info("msg: Rules disabled as rules.disable is set to TRUE")
}
s.ruleManager.Start(ctx)
err := s.initListeners()
if err != nil {
@@ -468,32 +425,24 @@ func (s *Server) Stop(ctx context.Context) error {
}
func makeRulesManager(
ruleRepoURL string,
db *sqlx.DB,
ch interfaces.Reader,
cache cache.Cache,
disableRules bool,
useLogsNewSchema bool,
useTraceNewSchema bool,
sqlstore sqlstore.SQLStore,
telemetryStore telemetrystore.TelemetryStore,
prometheus prometheus.Prometheus,
) (*rules.Manager, error) {
// create manager opts
managerOpts := &rules.ManagerOptions{
TelemetryStore: telemetryStore,
Prometheus: prometheus,
RepoURL: ruleRepoURL,
DBConn: db,
Context: context.Background(),
Logger: zap.L(),
DisableRules: disableRules,
Reader: ch,
Cache: cache,
EvalDelay: constants.GetEvalDelay(),
UseLogsNewSchema: useLogsNewSchema,
UseTraceNewSchema: useTraceNewSchema,
SQLStore: sqlstore,
TelemetryStore: telemetryStore,
Prometheus: prometheus,
DBConn: db,
Context: context.Background(),
Logger: zap.L(),
Reader: ch,
Cache: cache,
EvalDelay: constants.GetEvalDelay(),
SQLStore: sqlstore,
}
// create Manager

View File

@@ -153,7 +153,7 @@ func InviteUsers(ctx context.Context, req *model.BulkInviteRequest) (*model.Bulk
}
}
// Update the status based on the results
// UpdateSteps the status based on the results
if response.Summary.FailedInvites == response.Summary.TotalInvites {
response.Status = "failure"
} else if response.Summary.FailedInvites > 0 {

View File

@@ -18,10 +18,6 @@ const (
OpAmpWsEndpoint = "0.0.0.0:4320" // address for opamp websocket
)
type ContextKey string
const ContextUserKey ContextKey = "user"
var DEFAULT_TELEMETRY_ANONYMOUS = false
func IsOSSTelemetryEnabled() bool {
@@ -57,9 +53,6 @@ var TELEMETRY_ACTIVE_USER_DURATION_MINUTES = GetOrDefaultEnvInt("TELEMETRY_ACTIV
var InviteEmailTemplate = GetOrDefaultEnv("INVITE_EMAIL_TEMPLATE", "/root/templates/invitation_email_template.html")
// [Deprecated] SIGNOZ_LOCAL_DB_PATH is deprecated and scheduled for removal. Please use SIGNOZ_SQLSTORE_SQLITE_PATH instead.
var RELATIONAL_DATASOURCE_PATH = GetOrDefaultEnv("SIGNOZ_LOCAL_DB_PATH", "/var/lib/signoz/signoz.db")
var MetricsExplorerClickhouseThreads = GetOrDefaultEnvInt("METRICS_EXPLORER_CLICKHOUSE_THREADS", 8)
var UpdatedMetricsMetadataCachePrefix = GetOrDefaultEnv("METRICS_UPDATED_METADATA_CACHE_KEY", "UPDATED_METRICS_METADATA")

View File

@@ -208,7 +208,7 @@ func (mds *ModelDaoSqlite) GetUser(ctx context.Context,
query := mds.bundb.NewSelect().
Table("users").
Column("users.id", "users.name", "users.email", "users.password", "users.created_at", "users.profile_picture_url", "users.org_id", "users.role").
ColumnExpr("o.name as organization").
ColumnExpr("o.display_name as organization").
Join("JOIN organizations o ON o.id = users.org_id").
Where("users.id = ?", id)
@@ -243,7 +243,7 @@ func (mds *ModelDaoSqlite) GetUserByEmail(ctx context.Context,
query := mds.bundb.NewSelect().
Table("users").
Column("users.id", "users.name", "users.email", "users.password", "users.created_at", "users.profile_picture_url", "users.org_id", "users.role").
ColumnExpr("o.name as organization").
ColumnExpr("o.display_name as organization").
Join("JOIN organizations o ON o.id = users.org_id").
Where("users.email = ?", email)
@@ -277,7 +277,7 @@ func (mds *ModelDaoSqlite) GetUsersWithOpts(ctx context.Context, limit int) ([]t
Table("users").
Column("users.id", "users.name", "users.email", "users.password", "users.created_at", "users.profile_picture_url", "users.org_id", "users.role").
ColumnExpr("users.role as role").
ColumnExpr("o.name as organization").
ColumnExpr("o.display_name as organization").
Join("JOIN organizations o ON o.id = users.org_id")
if limit > 0 {
@@ -300,7 +300,7 @@ func (mds *ModelDaoSqlite) GetUsersByOrg(ctx context.Context,
Table("users").
Column("users.id", "users.name", "users.email", "users.password", "users.created_at", "users.profile_picture_url", "users.org_id", "users.role").
ColumnExpr("users.role as role").
ColumnExpr("o.name as organization").
ColumnExpr("o.display_name as organization").
Join("JOIN organizations o ON o.id = users.org_id").
Where("users.org_id = ?", orgId)
@@ -318,7 +318,7 @@ func (mds *ModelDaoSqlite) GetUsersByRole(ctx context.Context, role authtypes.Ro
Table("users").
Column("users.id", "users.name", "users.email", "users.password", "users.created_at", "users.profile_picture_url", "users.org_id", "users.role").
ColumnExpr("users.role as role").
ColumnExpr("o.name as organization").
ColumnExpr("o.display_name as organization").
Join("JOIN organizations o ON o.id = users.org_id").
Where("users.role = ?", role)

View File

@@ -15,8 +15,8 @@ import (
type Reader interface {
GetInstantQueryMetricsResult(ctx context.Context, query *model.InstantQueryMetricsParams) (*promql.Result, *stats.QueryStats, *model.ApiError)
GetQueryRangeResult(ctx context.Context, query *model.QueryRangeParams) (*promql.Result, *stats.QueryStats, *model.ApiError)
GetTopLevelOperations(ctx context.Context, skipConfig *model.SkipConfig, start, end time.Time, services []string) (*map[string][]string, *model.ApiError)
GetServices(ctx context.Context, query *model.GetServicesParams, skipConfig *model.SkipConfig) (*[]model.ServiceItem, *model.ApiError)
GetTopLevelOperations(ctx context.Context, start, end time.Time, services []string) (*map[string][]string, *model.ApiError)
GetServices(ctx context.Context, query *model.GetServicesParams) (*[]model.ServiceItem, *model.ApiError)
GetTopOperations(ctx context.Context, query *model.GetTopOperationsParams) (*[]model.TopOperationsItem, *model.ApiError)
GetUsage(ctx context.Context, query *model.GetUsageParams) (*[]model.UsageItem, error)
GetServicesList(ctx context.Context) (*[]string, error)

View File

@@ -14,6 +14,8 @@ import (
"github.com/SigNoz/signoz/pkg/signoz"
"github.com/SigNoz/signoz/pkg/types/authtypes"
"github.com/SigNoz/signoz/pkg/version"
"github.com/SigNoz/signoz/pkg/zeus"
"github.com/SigNoz/signoz/pkg/zeus/noopzeus"
"go.uber.org/zap"
"go.uber.org/zap/zapcore"
@@ -45,12 +47,18 @@ func main() {
var maxOpenConns int
var dialTimeout time.Duration
// Deprecated
flag.BoolVar(&useLogsNewSchema, "use-logs-new-schema", false, "use logs_v2 schema for logs")
// Deprecated
flag.BoolVar(&useTraceNewSchema, "use-trace-new-schema", false, "use new schema for traces")
// Deprecated
flag.StringVar(&promConfigPath, "config", "./config/prometheus.yml", "(prometheus config to read metrics)")
// Deprecated
flag.StringVar(&skipTopLvlOpsPath, "skip-top-level-ops", "", "(config file to skip top level operations)")
// Deprecated
flag.BoolVar(&disableRules, "rules.disable", false, "(disable rule evaluation)")
flag.BoolVar(&preferSpanMetrics, "prefer-span-metrics", false, "(prefer span metrics for service level metrics)")
// Deprecated
flag.StringVar(&ruleRepoURL, "rules.repo-url", constants.AlertHelpPage, "(host address used to build rule link in alert messages)")
flag.StringVar(&cacheConfigPath, "experimental.cache-config", "", "(cache config to use)")
flag.StringVar(&fluxInterval, "flux-interval", "5m", "(the interval to exclude data from being cached to avoid incorrect cache for data in motion)")
@@ -58,8 +66,11 @@ func main() {
flag.StringVar(&cluster, "cluster", "cluster", "(cluster name - defaults to 'cluster')")
// Allow using the consistent naming with the signoz collector
flag.StringVar(&cluster, "cluster-name", "cluster", "(cluster name - defaults to 'cluster')")
// Deprecated
flag.IntVar(&maxIdleConns, "max-idle-conns", 50, "(number of connections to maintain in the pool, only used with clickhouse if not set in ClickHouseUrl env var DSN.)")
// Deprecated
flag.IntVar(&maxOpenConns, "max-open-conns", 100, "(max connections for use at any time, only used with clickhouse if not set in ClickHouseUrl env var DSN.)")
// Deprecated
flag.DurationVar(&dialTimeout, "dial-timeout", 5*time.Second, "(the maximum time to establish a connection, only used with clickhouse if not set in ClickHouseUrl env var DSN.)")
flag.Parse()
@@ -90,6 +101,8 @@ func main() {
signoz, err := signoz.New(
context.Background(),
config,
zeus.Config{},
noopzeus.NewProviderFactory(),
signoz.NewCacheProviderFactories(),
signoz.NewWebProviderFactories(),
signoz.NewSQLStoreProviderFactories(),
@@ -113,18 +126,12 @@ func main() {
serverOptions := &app.ServerOptions{
Config: config,
HTTPHostPort: constants.HTTPHostPort,
PromConfigPath: promConfigPath,
SkipTopLvlOpsPath: skipTopLvlOpsPath,
PreferSpanMetrics: preferSpanMetrics,
PrivateHostPort: constants.PrivateHostPort,
DisableRules: disableRules,
RuleRepoURL: ruleRepoURL,
CacheConfigPath: cacheConfigPath,
FluxInterval: fluxInterval,
FluxIntervalForTraceDetail: fluxIntervalForTraceDetail,
Cluster: cluster,
UseLogsNewSchema: useLogsNewSchema,
UseTraceNewSchema: useTraceNewSchema,
SigNoz: signoz,
Jwt: jwt,
}
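The flag block above keeps the old switches registered but marks them deprecated: they still parse, they just no longer feed ServerOptions, so existing start scripts keep working. A self-contained sketch of that pattern with the standard flag package (the flag names here are illustrative):

package main

import (
	"flag"
	"fmt"
)

func main() {
	// Deprecated switch: still registered so existing start scripts do not
	// break, but the parsed value is never wired into the server options.
	var useNewSchema bool
	flag.BoolVar(&useNewSchema, "use-new-schema", false, "[Deprecated] no longer has any effect")

	// Live switch: still feeds configuration.
	var cluster string
	flag.StringVar(&cluster, "cluster", "cluster", "cluster name")

	flag.Parse()

	fmt.Printf("cluster=%q (deprecated flags are accepted but ignored)\n", cluster)
}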

View File

@@ -88,6 +88,11 @@ type ChangePasswordRequest struct {
NewPassword string `json:"newPassword"`
}
type ResetPasswordRequest struct {
Password string `json:"password"`
Token string `json:"token"`
}
type UserRole struct {
UserId string `json:"user_id"`
GroupName string `json:"group_name"`

View File

@@ -1,57 +0,0 @@
package model
import (
"os"
"gopkg.in/yaml.v2"
)
type SkipConfig struct {
Services []ServiceSkipConfig `yaml:"services"`
}
type ServiceSkipConfig struct {
Name string `yaml:"name"`
Operations []string `yaml:"operations"`
}
func (s *SkipConfig) ShouldSkip(serviceName, name string) bool {
for _, service := range s.Services {
if service.Name == serviceName {
for _, operation := range service.Operations {
if name == operation {
return true
}
}
}
}
return false
}
func ReadYaml(path string, v interface{}) error {
f, err := os.Open(path)
if err != nil {
return err
}
defer f.Close()
decoder := yaml.NewDecoder(f)
err = decoder.Decode(v)
if err != nil {
return err
}
return nil
}
func ReadSkipConfig(path string) (*SkipConfig, error) {
if path == "" {
return &SkipConfig{}, nil
}
skipConfig := &SkipConfig{}
err := ReadYaml(path, skipConfig)
if err != nil {
return nil, err
}
return skipConfig, nil
}

View File

@@ -1,17 +0,0 @@
package model
import "time"
type ResetPasswordRequest struct {
Password string `json:"password"`
Token string `json:"token"`
}
type IngestionKey struct {
KeyId string `json:"keyId" db:"key_id"`
Name string `json:"name" db:"name"`
CreatedAt time.Time `json:"createdAt" db:"created_at"`
IngestionKey string `json:"ingestionKey" db:"ingestion_key"`
IngestionURL string `json:"ingestionURL" db:"ingestion_url"`
DataRegion string `json:"dataRegion" db:"data_region"`
}

View File

@@ -50,7 +50,7 @@ func PostProcessResult(result []*v3.Result, queryRangeParams *v3.QueryRangeParam
for _, query := range queryRangeParams.CompositeQuery.BuilderQueries {
// The way we distinguish between a formula and a query is by checking if the expression
// is the same as the query name
// TODO(srikanthccv): Update the UI to send a flag to distinguish between a formula and a query
// TODO(srikanthccv): UpdateSteps the UI to send a flag to distinguish between a formula and a query
if query.Expression != query.QueryName {
expression, err := govaluate.NewEvaluableExpressionWithFunctions(query.Expression, EvalFuncs())
// This shouldn't happen here, because it should have been caught earlier in validation

View File

@@ -115,7 +115,7 @@ func (q *queryCache) FindMissingTimeRangesV2(start, end int64, step int64, cache
missingRanges = append(missingRanges, MissInterval{Start: currentTime, End: min(data.Start, end)})
}
// Update currentTime, but don't go past the end time
// UpdateSteps currentTime, but don't go past the end time
currentTime = max(currentTime, min(data.End, end))
}
@@ -205,7 +205,7 @@ func (q *queryCache) FindMissingTimeRanges(start, end, step int64, cacheKey stri
missingRanges = append(missingRanges, MissInterval{Start: currentTime, End: min(data.Start, end)})
}
// Update currentTime, but don't go past the end time
// UpdateSteps currentTime, but don't go past the end time
currentTime = max(currentTime, min(data.End, end))
}
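Both FindMissingTimeRanges variants sweep the cached intervals in order, emit a miss for each gap before a cached block, and advance the cursor with max(currentTime, min(data.End, end)) so overlapping cache entries never push it backwards or past the requested window. A self-contained sketch of that sweep under those assumptions (step handling and the flux interval are deliberately left out):

package main

import "fmt"

type interval struct{ Start, End int64 }

// findMissingRanges mirrors the gap sweep: cached must be sorted by Start;
// everything in [start, end) not covered by a cached interval is returned
// as a miss.
func findMissingRanges(start, end int64, cached []interval) []interval {
	var misses []interval
	currentTime := start
	for _, data := range cached {
		if currentTime >= end {
			break
		}
		if data.Start > currentTime {
			// Gap before this cached block.
			misses = append(misses, interval{Start: currentTime, End: min(data.Start, end)})
		}
		// Advance the cursor, but never move backwards or past the window end.
		currentTime = max(currentTime, min(data.End, end))
	}
	if currentTime < end {
		misses = append(misses, interval{Start: currentTime, End: end})
	}
	return misses
}

func main() {
	cached := []interval{{Start: 100, End: 200}, {Start: 250, End: 300}}
	fmt.Println(findMissingRanges(50, 400, cached)) // [{50 100} {200 250} {300 400}]
}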

View File

@@ -34,33 +34,29 @@ import (
)
type PrepareTaskOptions struct {
Rule *ruletypes.PostableRule
TaskName string
RuleStore ruletypes.RuleStore
MaintenanceStore ruletypes.MaintenanceStore
Logger *zap.Logger
Reader interfaces.Reader
Cache cache.Cache
ManagerOpts *ManagerOptions
NotifyFunc NotifyFunc
SQLStore sqlstore.SQLStore
UseLogsNewSchema bool
UseTraceNewSchema bool
OrgID string
Rule *ruletypes.PostableRule
TaskName string
RuleStore ruletypes.RuleStore
MaintenanceStore ruletypes.MaintenanceStore
Logger *zap.Logger
Reader interfaces.Reader
Cache cache.Cache
ManagerOpts *ManagerOptions
NotifyFunc NotifyFunc
SQLStore sqlstore.SQLStore
OrgID string
}
type PrepareTestRuleOptions struct {
Rule *ruletypes.PostableRule
RuleStore ruletypes.RuleStore
MaintenanceStore ruletypes.MaintenanceStore
Logger *zap.Logger
Reader interfaces.Reader
Cache cache.Cache
ManagerOpts *ManagerOptions
NotifyFunc NotifyFunc
SQLStore sqlstore.SQLStore
UseLogsNewSchema bool
UseTraceNewSchema bool
Rule *ruletypes.PostableRule
RuleStore ruletypes.RuleStore
MaintenanceStore ruletypes.MaintenanceStore
Logger *zap.Logger
Reader interfaces.Reader
Cache cache.Cache
ManagerOpts *ManagerOptions
NotifyFunc NotifyFunc
SQLStore sqlstore.SQLStore
}
const taskNamesuffix = "webAppEditor"
@@ -84,25 +80,18 @@ func prepareTaskName(ruleId interface{}) string {
type ManagerOptions struct {
TelemetryStore telemetrystore.TelemetryStore
Prometheus prometheus.Prometheus
// RepoURL is used to generate a backlink in sent alert messages
RepoURL string
// rule db conn
DBConn *sqlx.DB
Context context.Context
Logger *zap.Logger
ResendDelay time.Duration
DisableRules bool
Reader interfaces.Reader
Cache cache.Cache
Context context.Context
Logger *zap.Logger
ResendDelay time.Duration
Reader interfaces.Reader
Cache cache.Cache
EvalDelay time.Duration
PrepareTaskFunc func(opts PrepareTaskOptions) (Task, error)
UseLogsNewSchema bool
UseTraceNewSchema bool
PrepareTaskFunc func(opts PrepareTaskOptions) (Task, error)
PrepareTestRuleFunc func(opts PrepareTestRuleOptions) (int, *model.ApiError)
Alertmanager alertmanager.Alertmanager
SQLStore sqlstore.SQLStore
@@ -125,9 +114,6 @@ type Manager struct {
prepareTaskFunc func(opts PrepareTaskOptions) (Task, error)
prepareTestRuleFunc func(opts PrepareTestRuleOptions) (int, *model.ApiError)
UseLogsNewSchema bool
UseTraceNewSchema bool
alertmanager alertmanager.Alertmanager
sqlstore sqlstore.SQLStore
}
@@ -160,8 +146,6 @@ func defaultPrepareTaskFunc(opts PrepareTaskOptions) (Task, error) {
ruleId,
opts.Rule,
opts.Reader,
opts.UseLogsNewSchema,
opts.UseTraceNewSchema,
WithEvalDelay(opts.ManagerOpts.EvalDelay),
WithSQLStore(opts.SQLStore),
)
@@ -395,11 +379,9 @@ func (m *Manager) EditRule(ctx context.Context, ruleStr string, idStr string) er
return err
}
if !m.opts.DisableRules {
err = m.syncRuleStateWithTask(ctx, claims.OrgID, prepareTaskName(existingRule.ID.StringValue()), parsedRule)
if err != nil {
return err
}
err = m.syncRuleStateWithTask(ctx, claims.OrgID, prepareTaskName(existingRule.ID.StringValue()), parsedRule)
if err != nil {
return err
}
return nil
@@ -413,19 +395,17 @@ func (m *Manager) editTask(_ context.Context, orgID string, rule *ruletypes.Post
zap.L().Debug("editing a rule task", zap.String("name", taskName))
newTask, err := m.prepareTaskFunc(PrepareTaskOptions{
Rule: rule,
TaskName: taskName,
RuleStore: m.ruleStore,
MaintenanceStore: m.maintenanceStore,
Logger: m.logger,
Reader: m.reader,
Cache: m.cache,
ManagerOpts: m.opts,
NotifyFunc: m.prepareNotifyFunc(),
SQLStore: m.sqlstore,
UseLogsNewSchema: m.opts.UseLogsNewSchema,
UseTraceNewSchema: m.opts.UseTraceNewSchema,
OrgID: orgID,
Rule: rule,
TaskName: taskName,
RuleStore: m.ruleStore,
MaintenanceStore: m.maintenanceStore,
Logger: m.logger,
Reader: m.reader,
Cache: m.cache,
ManagerOpts: m.opts,
NotifyFunc: m.prepareNotifyFunc(),
SQLStore: m.sqlstore,
OrgID: orgID,
})
if err != nil {
@@ -496,9 +476,7 @@ func (m *Manager) DeleteRule(ctx context.Context, idStr string) error {
}
taskName := prepareTaskName(id.StringValue())
if !m.opts.DisableRules {
m.deleteTask(taskName)
}
m.deleteTask(taskName)
return nil
})
@@ -581,10 +559,8 @@ func (m *Manager) CreateRule(ctx context.Context, ruleStr string) (*ruletypes.Ge
}
taskName := prepareTaskName(id.StringValue())
if !m.opts.DisableRules {
if err := m.addTask(ctx, claims.OrgID, parsedRule, taskName); err != nil {
return err
}
if err := m.addTask(ctx, claims.OrgID, parsedRule, taskName); err != nil {
return err
}
return nil
@@ -605,19 +581,17 @@ func (m *Manager) addTask(_ context.Context, orgID string, rule *ruletypes.Posta
zap.L().Debug("adding a new rule task", zap.String("name", taskName))
newTask, err := m.prepareTaskFunc(PrepareTaskOptions{
Rule: rule,
TaskName: taskName,
RuleStore: m.ruleStore,
MaintenanceStore: m.maintenanceStore,
Logger: m.logger,
Reader: m.reader,
Cache: m.cache,
ManagerOpts: m.opts,
NotifyFunc: m.prepareNotifyFunc(),
SQLStore: m.sqlstore,
UseLogsNewSchema: m.opts.UseLogsNewSchema,
UseTraceNewSchema: m.opts.UseTraceNewSchema,
OrgID: orgID,
Rule: rule,
TaskName: taskName,
RuleStore: m.ruleStore,
MaintenanceStore: m.maintenanceStore,
Logger: m.logger,
Reader: m.reader,
Cache: m.cache,
ManagerOpts: m.opts,
NotifyFunc: m.prepareNotifyFunc(),
SQLStore: m.sqlstore,
OrgID: orgID,
})
if err != nil {
@@ -724,9 +698,6 @@ func (m *Manager) prepareNotifyFunc() NotifyFunc {
for _, alert := range alerts {
generatorURL := alert.GeneratorURL
if generatorURL == "" {
generatorURL = m.opts.RepoURL
}
a := &alertmanagertypes.PostableAlert{
Annotations: alert.Annotations.Map(),
@@ -759,9 +730,6 @@ func (m *Manager) prepareTestNotifyFunc() NotifyFunc {
alert := alerts[0]
generatorURL := alert.GeneratorURL
if generatorURL == "" {
generatorURL = m.opts.RepoURL
}
a := &alertmanagertypes.PostableAlert{
Annotations: alert.Annotations.Map(),
@@ -1003,17 +971,15 @@ func (m *Manager) TestNotification(ctx context.Context, ruleStr string) (int, *m
}
alertCount, apiErr := m.prepareTestRuleFunc(PrepareTestRuleOptions{
Rule: parsedRule,
RuleStore: m.ruleStore,
MaintenanceStore: m.maintenanceStore,
Logger: m.logger,
Reader: m.reader,
Cache: m.cache,
ManagerOpts: m.opts,
NotifyFunc: m.prepareTestNotifyFunc(),
SQLStore: m.sqlstore,
UseLogsNewSchema: m.opts.UseLogsNewSchema,
UseTraceNewSchema: m.opts.UseTraceNewSchema,
Rule: parsedRule,
RuleStore: m.ruleStore,
MaintenanceStore: m.maintenanceStore,
Logger: m.logger,
Reader: m.reader,
Cache: m.cache,
ManagerOpts: m.opts,
NotifyFunc: m.prepareTestNotifyFunc(),
SQLStore: m.sqlstore,
})
return alertCount, apiErr

View File

@@ -209,7 +209,7 @@ func (r *PromRule) Eval(ctx context.Context, ts time.Time) (interface{}, error)
// alerts[h] is ready, add or update active list now
for h, a := range alerts {
// Check whether we already have alerting state for the identifying label set.
// Update the last value and annotations if so, create a new alert entry otherwise.
// UpdateSteps the last value and annotations if so, create a new alert entry otherwise.
if alert, ok := r.Active[h]; ok && alert.State != model.StateInactive {
alert.Value = a.Value
alert.Annotations = a.Annotations

View File

@@ -15,7 +15,6 @@ import (
// TestNotification prepares a dummy rule for given rule parameters and
// sends a test notification. returns alert count and error (if any)
func defaultTestNotification(opts PrepareTestRuleOptions) (int, *model.ApiError) {
ctx := context.Background()
if opts.Rule == nil {
@@ -48,8 +47,6 @@ func defaultTestNotification(opts PrepareTestRuleOptions) (int, *model.ApiError)
alertname,
parsedRule,
opts.Reader,
opts.UseLogsNewSchema,
opts.UseTraceNewSchema,
WithSendAlways(),
WithSendUnmatched(),
WithSQLStore(opts.SQLStore),

View File

@@ -29,7 +29,6 @@ import (
"github.com/SigNoz/signoz/pkg/query-service/utils/timestamp"
logsv3 "github.com/SigNoz/signoz/pkg/query-service/app/logs/v3"
tracesV3 "github.com/SigNoz/signoz/pkg/query-service/app/traces/v3"
tracesV4 "github.com/SigNoz/signoz/pkg/query-service/app/traces/v4"
"github.com/SigNoz/signoz/pkg/query-service/formatter"
@@ -52,16 +51,12 @@ type ThresholdRule struct {
// used for attribute metadata enrichment for logs and traces
logsKeys map[string]v3.AttributeKey
spansKeys map[string]v3.AttributeKey
useTraceNewSchema bool
}
func NewThresholdRule(
id string,
p *ruletypes.PostableRule,
reader interfaces.Reader,
useLogsNewSchema bool,
useTraceNewSchema bool,
opts ...RuleOption,
) (*ThresholdRule, error) {
@@ -73,25 +68,20 @@ func NewThresholdRule(
}
t := ThresholdRule{
BaseRule: baseRule,
version: p.Version,
useTraceNewSchema: useTraceNewSchema,
BaseRule: baseRule,
version: p.Version,
}
querierOption := querier.QuerierOptions{
Reader: reader,
Cache: nil,
KeyGenerator: queryBuilder.NewKeyGenerator(),
UseLogsNewSchema: useLogsNewSchema,
UseTraceNewSchema: useTraceNewSchema,
Reader: reader,
Cache: nil,
KeyGenerator: queryBuilder.NewKeyGenerator(),
}
querierOptsV2 := querierV2.QuerierOptions{
Reader: reader,
Cache: nil,
KeyGenerator: queryBuilder.NewKeyGenerator(),
UseLogsNewSchema: useLogsNewSchema,
UseTraceNewSchema: useTraceNewSchema,
Reader: reader,
Cache: nil,
KeyGenerator: queryBuilder.NewKeyGenerator(),
}
t.querier = querier.NewQuerier(querierOption)
@@ -301,11 +291,7 @@ func (r *ThresholdRule) buildAndRunQuery(ctx context.Context, ts time.Time) (rul
return nil, err
}
r.spansKeys = spanKeys
if r.useTraceNewSchema {
tracesV4.Enrich(params, spanKeys)
} else {
tracesV3.Enrich(params, spanKeys)
}
tracesV4.Enrich(params, spanKeys)
}
}
@@ -485,7 +471,7 @@ func (r *ThresholdRule) Eval(ctx context.Context, ts time.Time) (interface{}, er
// alerts[h] is ready, add or update active list now
for h, a := range alerts {
// Check whether we already have alerting state for the identifying label set.
// Update the last value and annotations if so, create a new alert entry otherwise.
// UpdateSteps the last value and annotations if so, create a new alert entry otherwise.
if alert, ok := r.Active[h]; ok && alert.State != model.StateInactive {
alert.Value = a.Value

View File

@@ -801,7 +801,7 @@ func TestThresholdRuleShouldAlert(t *testing.T) {
postableRule.RuleCondition.MatchType = ruletypes.MatchType(c.matchType)
postableRule.RuleCondition.Target = &c.target
rule, err := NewThresholdRule("69", &postableRule, nil, true, true, WithEvalDelay(2*time.Minute))
rule, err := NewThresholdRule("69", &postableRule, nil, WithEvalDelay(2*time.Minute))
if err != nil {
assert.NoError(t, err)
}
@@ -889,7 +889,7 @@ func TestPrepareLinksToLogs(t *testing.T) {
},
}
rule, err := NewThresholdRule("69", &postableRule, nil, true, true, WithEvalDelay(2*time.Minute))
rule, err := NewThresholdRule("69", &postableRule, nil, WithEvalDelay(2*time.Minute))
if err != nil {
assert.NoError(t, err)
}
@@ -930,7 +930,7 @@ func TestPrepareLinksToTraces(t *testing.T) {
},
}
rule, err := NewThresholdRule("69", &postableRule, nil, true, true, WithEvalDelay(2*time.Minute))
rule, err := NewThresholdRule("69", &postableRule, nil, WithEvalDelay(2*time.Minute))
if err != nil {
assert.NoError(t, err)
}
@@ -1005,7 +1005,7 @@ func TestThresholdRuleLabelNormalization(t *testing.T) {
postableRule.RuleCondition.MatchType = ruletypes.MatchType(c.matchType)
postableRule.RuleCondition.Target = &c.target
rule, err := NewThresholdRule("69", &postableRule, nil, true, true, WithEvalDelay(2*time.Minute))
rule, err := NewThresholdRule("69", &postableRule, nil, WithEvalDelay(2*time.Minute))
if err != nil {
assert.NoError(t, err)
}
@@ -1057,7 +1057,7 @@ func TestThresholdRuleEvalDelay(t *testing.T) {
}
for idx, c := range cases {
rule, err := NewThresholdRule("69", &postableRule, nil, true, true) // no eval delay
rule, err := NewThresholdRule("69", &postableRule, nil) // no eval delay
if err != nil {
assert.NoError(t, err)
}
@@ -1105,7 +1105,7 @@ func TestThresholdRuleClickHouseTmpl(t *testing.T) {
}
for idx, c := range cases {
rule, err := NewThresholdRule("69", &postableRule, nil, true, true, WithEvalDelay(2*time.Minute))
rule, err := NewThresholdRule("69", &postableRule, nil, WithEvalDelay(2*time.Minute))
if err != nil {
assert.NoError(t, err)
}
@@ -1244,8 +1244,8 @@ func TestThresholdRuleUnitCombinations(t *testing.T) {
options := clickhouseReader.NewOptions("", "", "archiveNamespace")
readerCache, err := memorycache.New(context.Background(), factorytest.NewSettings(), cache.Config{Provider: "memory", Memory: cache.Memory{TTL: DefaultFrequency}})
require.NoError(t, err)
reader := clickhouseReader.NewReaderFromClickhouseConnection(options, nil, telemetryStore, prometheustest.New(instrumentationtest.New().Logger(), prometheus.Config{}), "", true, true, time.Duration(time.Second), readerCache)
rule, err := NewThresholdRule("69", &postableRule, reader, true, true)
reader := clickhouseReader.NewReaderFromClickhouseConnection(options, nil, telemetryStore, prometheustest.New(instrumentationtest.New().Logger(), prometheus.Config{}), "", time.Duration(time.Second), readerCache)
rule, err := NewThresholdRule("69", &postableRule, reader)
rule.TemporalityMap = map[string]map[v3.Temporality]bool{
"signoz_calls_total": {
v3.Delta: true,
@@ -1340,9 +1340,9 @@ func TestThresholdRuleNoData(t *testing.T) {
}
readerCache, err := memorycache.New(context.Background(), factorytest.NewSettings(), cache.Config{Provider: "memory", Memory: cache.Memory{TTL: DefaultFrequency}})
options := clickhouseReader.NewOptions("", "", "archiveNamespace")
reader := clickhouseReader.NewReaderFromClickhouseConnection(options, nil, telemetryStore, prometheustest.New(instrumentationtest.New().Logger(), prometheus.Config{}), "", true, true, time.Duration(time.Second), readerCache)
reader := clickhouseReader.NewReaderFromClickhouseConnection(options, nil, telemetryStore, prometheustest.New(instrumentationtest.New().Logger(), prometheus.Config{}), "", time.Duration(time.Second), readerCache)
rule, err := NewThresholdRule("69", &postableRule, reader, true, true)
rule, err := NewThresholdRule("69", &postableRule, reader)
rule.TemporalityMap = map[string]map[v3.Temporality]bool{
"signoz_calls_total": {
v3.Delta: true,
@@ -1444,9 +1444,9 @@ func TestThresholdRuleTracesLink(t *testing.T) {
}
options := clickhouseReader.NewOptions("", "", "archiveNamespace")
reader := clickhouseReader.NewReaderFromClickhouseConnection(options, nil, telemetryStore, prometheustest.New(instrumentationtest.New().Logger(), prometheus.Config{}), "", true, true, time.Duration(time.Second), nil)
reader := clickhouseReader.NewReaderFromClickhouseConnection(options, nil, telemetryStore, prometheustest.New(instrumentationtest.New().Logger(), prometheus.Config{}), "", time.Duration(time.Second), nil)
rule, err := NewThresholdRule("69", &postableRule, reader, true, true)
rule, err := NewThresholdRule("69", &postableRule, reader)
rule.TemporalityMap = map[string]map[v3.Temporality]bool{
"signoz_calls_total": {
v3.Delta: true,
@@ -1565,9 +1565,9 @@ func TestThresholdRuleLogsLink(t *testing.T) {
}
options := clickhouseReader.NewOptions("", "", "archiveNamespace")
reader := clickhouseReader.NewReaderFromClickhouseConnection(options, nil, telemetryStore, prometheustest.New(instrumentationtest.New().Logger(), prometheus.Config{}), "", true, true, time.Duration(time.Second), nil)
reader := clickhouseReader.NewReaderFromClickhouseConnection(options, nil, telemetryStore, prometheustest.New(instrumentationtest.New().Logger(), prometheus.Config{}), "", time.Duration(time.Second), nil)
rule, err := NewThresholdRule("69", &postableRule, reader, true, true)
rule, err := NewThresholdRule("69", &postableRule, reader)
rule.TemporalityMap = map[string]map[v3.Temporality]bool{
"signoz_calls_total": {
v3.Delta: true,
@@ -1643,7 +1643,7 @@ func TestThresholdRuleShiftBy(t *testing.T) {
},
}
rule, err := NewThresholdRule("69", &postableRule, nil, true, true)
rule, err := NewThresholdRule("69", &postableRule, nil)
if err != nil {
assert.NoError(t, err)
}

View File

@@ -18,6 +18,7 @@ import (
"github.com/SigNoz/signoz/pkg/query-service/featureManager"
v3 "github.com/SigNoz/signoz/pkg/query-service/model/v3"
"github.com/SigNoz/signoz/pkg/query-service/utils"
"github.com/SigNoz/signoz/pkg/signoz"
"github.com/SigNoz/signoz/pkg/types"
mockhouse "github.com/srikanthccv/ClickHouse-go-mock"
"github.com/stretchr/testify/require"
@@ -297,11 +298,17 @@ func NewFilterSuggestionsTestBed(t *testing.T) *FilterSuggestionsTestBed {
reader, mockClickhouse := NewMockClickhouseReader(t, testDB)
mockClickhouse.MatchExpectationsInOrder(false)
modules := signoz.NewModules(testDB)
apiHandler, err := app.NewAPIHandler(app.APIHandlerOpts{
Reader: reader,
AppDao: dao.DB(),
FeatureFlags: fm,
JWT: jwt,
Signoz: &signoz.SigNoz{
Modules: modules,
Handlers: signoz.NewHandlers(modules),
},
})
if err != nil {
t.Fatalf("could not create a new ApiHandler: %v", err)

View File

@@ -10,6 +10,7 @@ import (
"github.com/SigNoz/signoz/pkg/http/middleware"
"github.com/SigNoz/signoz/pkg/modules/organization/implorganization"
"github.com/SigNoz/signoz/pkg/signoz"
"github.com/SigNoz/signoz/pkg/instrumentation/instrumentationtest"
"github.com/SigNoz/signoz/pkg/query-service/app"
@@ -360,12 +361,19 @@ func NewCloudIntegrationsTestBed(t *testing.T, testDB sqlstore.SQLStore) *CloudI
reader, mockClickhouse := NewMockClickhouseReader(t, testDB)
mockClickhouse.MatchExpectationsInOrder(false)
modules := signoz.NewModules(testDB)
handlers := signoz.NewHandlers(modules)
apiHandler, err := app.NewAPIHandler(app.APIHandlerOpts{
Reader: reader,
AppDao: dao.DB(),
CloudIntegrationsController: controller,
FeatureFlags: fm,
JWT: jwt,
Signoz: &signoz.SigNoz{
Modules: modules,
Handlers: handlers,
},
})
if err != nil {
t.Fatalf("could not create a new ApiHandler: %v", err)

View File

@@ -19,6 +19,7 @@ import (
"github.com/SigNoz/signoz/pkg/query-service/model"
v3 "github.com/SigNoz/signoz/pkg/query-service/model/v3"
"github.com/SigNoz/signoz/pkg/query-service/utils"
"github.com/SigNoz/signoz/pkg/signoz"
"github.com/SigNoz/signoz/pkg/sqlstore"
"github.com/SigNoz/signoz/pkg/types"
"github.com/SigNoz/signoz/pkg/types/pipelinetypes"
@@ -566,6 +567,9 @@ func NewIntegrationsTestBed(t *testing.T, testDB sqlstore.SQLStore) *Integration
t.Fatalf("could not create cloud integrations controller: %v", err)
}
modules := signoz.NewModules(testDB)
handlers := signoz.NewHandlers(modules)
apiHandler, err := app.NewAPIHandler(app.APIHandlerOpts{
Reader: reader,
AppDao: dao.DB(),
@@ -573,6 +577,10 @@ func NewIntegrationsTestBed(t *testing.T, testDB sqlstore.SQLStore) *Integration
FeatureFlags: fm,
JWT: jwt,
CloudIntegrationsController: cloudIntegrationsController,
Signoz: &signoz.SigNoz{
Modules: modules,
Handlers: handlers,
},
})
if err != nil {
t.Fatalf("could not create a new ApiHandler: %v", err)

View File

@@ -46,8 +46,6 @@ func NewMockClickhouseReader(t *testing.T, testDB sqlstore.SQLStore) (*clickhous
telemetryStore,
prometheustest.New(instrumentationtest.New().Logger(), prometheus.Config{}),
"",
true,
true,
time.Duration(time.Second),
nil,
)

View File

@@ -146,7 +146,7 @@ func (r *rule) ListOrgs(ctx context.Context) ([]string, error) {
func (r *rule) getChannels() (*[]model.ChannelItem, *model.ApiError) {
channels := []model.ChannelItem{}
query := "SELECT id, created_at, updated_at, name, type, data FROM notification_channels"
query := "SELECT id, created_at, updated_at, name, type, data FROM notification_channel"
err := r.Select(&channels, query)
@@ -163,7 +163,7 @@ func (r *rule) getChannels() (*[]model.ChannelItem, *model.ApiError) {
func (r *rule) GetAlertsInfo(ctx context.Context) (*model.AlertsInfo, error) {
alertsInfo := model.AlertsInfo{}
// fetch alerts from rules db
query := "SELECT data FROM rules"
query := "SELECT data FROM rule"
var alertsData []string
var alertNames []string
err := r.Select(&alertsData, query)

pkg/signoz/handler.go (new file)
View File

@@ -0,0 +1,24 @@
package signoz
import (
"github.com/SigNoz/signoz/pkg/modules/organization"
"github.com/SigNoz/signoz/pkg/modules/organization/implorganization"
"github.com/SigNoz/signoz/pkg/modules/preference"
"github.com/SigNoz/signoz/pkg/modules/preference/implpreference"
"github.com/SigNoz/signoz/pkg/modules/tracefunnel"
"github.com/SigNoz/signoz/pkg/modules/tracefunnel/impltracefunnel"
)
type Handlers struct {
Organization organization.Handler
Preference preference.Handler
TraceFunnel tracefunnel.Handler
}
func NewHandlers(modules Modules) Handlers {
return Handlers{
Organization: implorganization.NewHandler(modules.Organization),
Preference: implpreference.NewHandler(modules.Preference),
TraceFunnel: impltracefunnel.NewHandler(modules.TraceFunnel),
}
}

pkg/signoz/module.go (new file)
View File

@@ -0,0 +1,26 @@
package signoz
import (
"github.com/SigNoz/signoz/pkg/modules/organization"
"github.com/SigNoz/signoz/pkg/modules/organization/implorganization"
"github.com/SigNoz/signoz/pkg/modules/preference"
"github.com/SigNoz/signoz/pkg/modules/preference/implpreference"
"github.com/SigNoz/signoz/pkg/modules/tracefunnel"
"github.com/SigNoz/signoz/pkg/modules/tracefunnel/impltracefunnel"
"github.com/SigNoz/signoz/pkg/sqlstore"
"github.com/SigNoz/signoz/pkg/types/preferencetypes"
)
type Modules struct {
Organization organization.Module
Preference preference.Module
TraceFunnel tracefunnel.Module
}
func NewModules(sqlstore sqlstore.SQLStore) Modules {
return Modules{
Organization: implorganization.NewModule(implorganization.NewStore(sqlstore)),
Preference: implpreference.NewModule(implpreference.NewStore(sqlstore), preferencetypes.NewDefaultPreferenceMap()),
TraceFunnel: impltracefunnel.NewModule(impltracefunnel.NewStore(sqlstore)),
}
}
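The two new files define the layering used throughout this diff: NewModules builds the domain modules on top of the SQLStore, NewHandlers wraps each module in its handler, and the test beds then pass both into APIHandlerOpts via a signoz.SigNoz value. A hedged sketch of that layering with stand-in types (the names below are illustrative, not the real module contracts):

package main

import "fmt"

// store stands in for sqlstore.SQLStore; the module and handler types
// below are illustrative stand-ins, not the real organization contracts.
type store struct{ dsn string }

type organizationModule struct{ s store }

func (m organizationModule) DisplayName() string { return "Acme" }

type organizationHandler struct{ m organizationModule }

func (h organizationHandler) Get() string { return h.m.DisplayName() }

// modules / handlers mirror the layering of signoz.Modules and
// signoz.Handlers: modules depend only on the store, handlers only on modules.
type modules struct{ Organization organizationModule }

type handlers struct{ Organization organizationHandler }

func newModules(s store) modules {
	return modules{Organization: organizationModule{s: s}}
}

func newHandlers(m modules) handlers {
	return handlers{Organization: organizationHandler{m: m.Organization}}
}

func main() {
	m := newModules(store{dsn: "sqlite:///var/lib/signoz/signoz.db"})
	h := newHandlers(m)
	fmt.Println(h.Organization.Get())
}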

View File

@@ -74,6 +74,7 @@ func NewSQLMigrationProviderFactories(sqlstore sqlstore.SQLStore) factory.NamedM
sqlmigration.NewUpdateIntegrationsFactory(sqlstore),
sqlmigration.NewUpdateOrganizationsFactory(sqlstore),
sqlmigration.NewDropGroupsFactory(sqlstore),
sqlmigration.NewAddTraceFunnelsFactory(sqlstore),
)
}

View File

@@ -13,6 +13,7 @@ import (
"github.com/SigNoz/signoz/pkg/sqlstore"
"github.com/SigNoz/signoz/pkg/telemetrystore"
"github.com/SigNoz/signoz/pkg/version"
"github.com/SigNoz/signoz/pkg/zeus"
"github.com/SigNoz/signoz/pkg/web"
)
@@ -26,11 +27,16 @@ type SigNoz struct {
TelemetryStore telemetrystore.TelemetryStore
Prometheus prometheus.Prometheus
Alertmanager alertmanager.Alertmanager
Zeus zeus.Zeus
Modules Modules
Handlers Handlers
}
func New(
ctx context.Context,
config Config,
zeusConfig zeus.Config,
zeusProviderFactory factory.ProviderFactory[zeus.Zeus, zeus.Config],
cacheProviderFactories factory.NamedMap[factory.ProviderFactory[cache.Cache, cache.Config]],
webProviderFactories factory.NamedMap[factory.ProviderFactory[web.Web, web.Config]],
sqlstoreProviderFactories factory.NamedMap[factory.ProviderFactory[sqlstore.SQLStore, sqlstore.Config]],
@@ -48,6 +54,17 @@ func New(
// Get the provider settings from instrumentation
providerSettings := instrumentation.ToProviderSettings()
// Initialize zeus from the available zeus provider factory. This is not config controlled
// and depends on the variant of the build.
zeus, err := zeusProviderFactory.New(
ctx,
providerSettings,
zeusConfig,
)
if err != nil {
return nil, err
}
// Initialize cache from the available cache provider factories
cache, err := factory.NewProviderFromNamedMap(
ctx,
@@ -124,6 +141,7 @@ func New(
return nil, err
}
// Initialize alertmanager from the available alertmanager provider factories
alertmanager, err := factory.NewProviderFromNamedMap(
ctx,
providerSettings,
@@ -135,6 +153,12 @@ func New(
return nil, err
}
// Initialize all modules
modules := NewModules(sqlstore)
// Initialize all handlers for the modules
handlers := NewHandlers(modules)
registry, err := factory.NewRegistry(
instrumentation.Logger(),
factory.NewNamedService(factory.MustNewName("instrumentation"), instrumentation),
@@ -153,5 +177,8 @@ func New(
TelemetryStore: telemetrystore,
Prometheus: prometheus,
Alertmanager: alertmanager,
Zeus: zeus,
Modules: modules,
Handlers: handlers,
}, nil
}
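With this change signoz.New receives the zeus provider factory from its caller, so the community build can hand in noopzeus.NewProviderFactory() (as main.go does in this diff) while other build variants inject a real client. A self-contained sketch of that factory-injection pattern (the interface, method, and config below are stand-ins, not the real zeus package):

package main

import (
	"context"
	"fmt"
)

// licenseClient stands in for the capability zeus provides; the method is
// made up for illustration, not the real zeus interface.
type licenseClient interface {
	Ping(ctx context.Context) error
}

type zeusConfig struct{ URL string }

// providerFactory mirrors the factory-injection shape in the diff: the app
// constructor does not pick a variant, it builds whatever it was handed.
type providerFactory func(ctx context.Context, cfg zeusConfig) (licenseClient, error)

// noop is the community-build analogue of noopzeus.
type noop struct{}

func (noop) Ping(context.Context) error { return nil }

func newNoopFactory() providerFactory {
	return func(context.Context, zeusConfig) (licenseClient, error) { return noop{}, nil }
}

func newApp(ctx context.Context, cfg zeusConfig, factory providerFactory) (licenseClient, error) {
	return factory(ctx, cfg)
}

func main() {
	client, err := newApp(context.Background(), zeusConfig{}, newNoopFactory())
	if err != nil {
		panic(err)
	}
	fmt.Println(client.Ping(context.Background()))
}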

View File

@@ -38,14 +38,26 @@ func (migration *dropLicensesSites) Up(ctx context.Context, db *bun.DB) error {
}
defer tx.Rollback()
if _, err := tx.NewDropTable().IfExists().Table("sites").Exec(ctx); err != nil {
return err
}
if _, err := tx.NewDropTable().IfExists().Table("licenses").Exec(ctx); err != nil {
if _, err := tx.
NewDropTable().
IfExists().
Table("sites").
Exec(ctx); err != nil {
return err
}
_, err = migration.store.Dialect().RenameColumn(ctx, tx, "saved_views", "uuid", "id")
if _, err := tx.
NewDropTable().
IfExists().
Table("licenses").
Exec(ctx); err != nil {
return err
}
_, err = migration.
store.
Dialect().
RenameColumn(ctx, tx, "saved_views", "uuid", "id")
if err != nil {
return err
}

View File

@@ -42,12 +42,9 @@ type newInvite struct {
}
func NewUpdateInvitesFactory(sqlstore sqlstore.SQLStore) factory.ProviderFactory[SQLMigration, Config] {
return factory.
NewProviderFactory(
factory.MustNewName("update_invites"),
func(ctx context.Context, ps factory.ProviderSettings, c Config) (SQLMigration, error) {
return newUpdateInvites(ctx, ps, c, sqlstore)
})
return factory.NewProviderFactory(factory.MustNewName("update_invites"), func(ctx context.Context, ps factory.ProviderSettings, c Config) (SQLMigration, error) {
return newUpdateInvites(ctx, ps, c, sqlstore)
})
}
func newUpdateInvites(_ context.Context, _ factory.ProviderSettings, _ Config, store sqlstore.SQLStore) (SQLMigration, error) {
@@ -55,8 +52,7 @@ func newUpdateInvites(_ context.Context, _ factory.ProviderSettings, _ Config, s
}
func (migration *updateInvites) Register(migrations *migrate.Migrations) error {
if err := migrations.
Register(migration.Up, migration.Down); err != nil {
if err := migrations.Register(migration.Up, migration.Down); err != nil {
return err
}
@@ -64,8 +60,7 @@ func (migration *updateInvites) Register(migrations *migrate.Migrations) error {
}
func (migration *updateInvites) Up(ctx context.Context, db *bun.DB) error {
tx, err := db.
BeginTx(ctx, nil)
tx, err := db.BeginTx(ctx, nil)
if err != nil {
return err
}
@@ -88,8 +83,7 @@ func (migration *updateInvites) Up(ctx context.Context, db *bun.DB) error {
}
if err == nil && len(existingInvites) > 0 {
newInvites := migration.
CopyOldInvitesToNewInvites(existingInvites)
newInvites := migration.CopyOldInvitesToNewInvites(existingInvites)
_, err = tx.
NewInsert().
Model(&newInvites).

View File

@@ -20,9 +20,7 @@ func NewUpdatePatFactory(sqlstore sqlstore.SQLStore) factory.ProviderFactory[SQL
}
func newUpdatePat(_ context.Context, _ factory.ProviderSettings, _ Config, store sqlstore.SQLStore) (SQLMigration, error) {
return &updatePat{
store: store,
}, nil
return &updatePat{store: store}, nil
}
func (migration *updatePat) Register(migrations *migrate.Migrations) error {
@@ -34,25 +32,33 @@ func (migration *updatePat) Register(migrations *migrate.Migrations) error {
}
func (migration *updatePat) Up(ctx context.Context, db *bun.DB) error {
// begin transaction
tx, err := db.BeginTx(ctx, nil)
if err != nil {
return err
}
defer tx.Rollback()
for _, column := range []string{"last_used", "expires_at"} {
if err := migration.store.Dialect().AddNotNullDefaultToColumn(ctx, tx, "personal_access_tokens", column, "INTEGER", "0"); err != nil {
if err := migration.
store.
Dialect().
AddNotNullDefaultToColumn(ctx, tx, "personal_access_tokens", column, "INTEGER", "0"); err != nil {
return err
}
}
if err := migration.store.Dialect().AddNotNullDefaultToColumn(ctx, tx, "personal_access_tokens", "revoked", "BOOLEAN", "false"); err != nil {
if err := migration.
store.
Dialect().
AddNotNullDefaultToColumn(ctx, tx, "personal_access_tokens", "revoked", "BOOLEAN", "false"); err != nil {
return err
}
if err := migration.store.Dialect().AddNotNullDefaultToColumn(ctx, tx, "personal_access_tokens", "updated_by_user_id", "TEXT", "''"); err != nil {
if err := migration.
store.
Dialect().
AddNotNullDefaultToColumn(ctx, tx, "personal_access_tokens", "updated_by_user_id", "TEXT", "''"); err != nil {
return err
}

View File

@@ -77,12 +77,9 @@ type newAlertmanagerState struct {
}
func NewUpdateAlertmanagerFactory(sqlstore sqlstore.SQLStore) factory.ProviderFactory[SQLMigration, Config] {
return factory.
NewProviderFactory(
factory.MustNewName("update_alertmanager"),
func(ctx context.Context, ps factory.ProviderSettings, c Config) (SQLMigration, error) {
return newUpdateAlertmanager(ctx, ps, c, sqlstore)
})
return factory.NewProviderFactory(factory.MustNewName("update_alertmanager"), func(ctx context.Context, ps factory.ProviderSettings, c Config) (SQLMigration, error) {
return newUpdateAlertmanager(ctx, ps, c, sqlstore)
})
}
func newUpdateAlertmanager(_ context.Context, _ factory.ProviderSettings, _ Config, store sqlstore.SQLStore) (SQLMigration, error) {
@@ -90,8 +87,7 @@ func newUpdateAlertmanager(_ context.Context, _ factory.ProviderSettings, _ Conf
}
func (migration *updateAlertmanager) Register(migrations *migrate.Migrations) error {
if err := migrations.
Register(migration.Up, migration.Down); err != nil {
if err := migrations.Register(migration.Up, migration.Down); err != nil {
return err
}
@@ -99,8 +95,7 @@ func (migration *updateAlertmanager) Register(migrations *migrate.Migrations) er
}
func (migration *updateAlertmanager) Up(ctx context.Context, db *bun.DB) error {
tx, err := db.
BeginTx(ctx, nil)
tx, err := db.BeginTx(ctx, nil)
if err != nil {
return err
}

Some files were not shown because too many files have changed in this diff.