Compare commits

46 Commits

Author SHA1 Message Date
Nityananda Gohain
ae6bbc7192 Merge branch 'main' into issue_7183_filter 2025-03-06 20:10:47 +05:30
nityanandagohain
da627b9779 fix: move function to common query range 2025-03-06 20:09:49 +05:30
nityanandagohain
5bd30af3f7 Merge remote-tracking branch 'origin/issue_7183_filter' into issue_7183_filter 2025-03-06 18:36:35 +05:30
nityanandagohain
29f72451d8 fix: address comments 2025-03-06 18:36:20 +05:30
Raj Kamal Singh
d09b85bea8 feat: aws integration: support for lambda (#7196)
* feat: aws integration: add service definition for lambda

* feat: aws integration: lambda: add details of metrics collected

* feat: aws integrations: lambda overview: use sum for relevant metrics
2025-03-06 13:03:46 +00:00
nityanandagohain
85fe1a2a18 fix: added comments 2025-03-06 18:33:30 +05:30
Nityananda Gohain
2115656a5b Merge branch 'main' into issue_7183_filter 2025-03-06 18:31:07 +05:30
nityanandagohain
c9da6006db fix: handle case where end is equal to a complete window end 2025-03-06 18:29:49 +05:30
Sahil
114a979b14 feat: disabled same url check for traces query builder as well 2025-03-06 16:01:09 +05:30
Sahil
cb69cd91a0 fix: disabled same url check for redirect in logs explorer 2025-03-06 16:01:09 +05:30
Nityananda Gohain
2d73f91380 Fix: Multitenancy support for ORG (#7155)
* fix: support multitenancy in org

* fix: register and login working now

* fix: changes to migration

* fix: migrations run both on sqlite and postgres

* fix: remove user flags from fe and be

* fix: remove ingestion keys from update

* fix: multitenancy support for apdex settings

* fix: render ts for users correctly

* fix: fix migration to run for new tenants

* fix: clean up migrations

* fix: address comments

* Update pkg/sqlmigration/013_update_organization.go

Co-authored-by: ellipsis-dev[bot] <65095814+ellipsis-dev[bot]@users.noreply.github.com>

* fix: fix build

* fix: force invites with org id

* Update pkg/query-service/auth/auth.go

Co-authored-by: ellipsis-dev[bot] <65095814+ellipsis-dev[bot]@users.noreply.github.com>

* fix: address comments

* fix: address comments

* fix: provider with their own dialect

* fix: update dialects

* fix: remove unwanted change

* Update pkg/query-service/app/http_handler.go

Co-authored-by: ellipsis-dev[bot] <65095814+ellipsis-dev[bot]@users.noreply.github.com>

* fix: different files for types

---------

Co-authored-by: ellipsis-dev[bot] <65095814+ellipsis-dev[bot]@users.noreply.github.com>
2025-03-06 15:39:45 +05:30
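One bullet above ("provider with their own dialect") captures the core design of PR #7155: migrations must run on both SQLite and Postgres, so each store provider carries its own SQL dialect. The sketch below illustrates that pattern in Go; the interface and method names are hypothetical, not SigNoz's actual API.

```go
package main

import "fmt"

// Dialect abstracts the SQL differences between supported stores so a
// single migration can target both SQLite and Postgres. Names here are
// illustrative only.
type Dialect interface {
	// TextType returns the column type used for free-form strings.
	TextType() string
	// AddColumn renders an ALTER TABLE statement for this store.
	AddColumn(table, column, typ string) string
}

type sqliteDialect struct{}

func (sqliteDialect) TextType() string { return "TEXT" }
func (sqliteDialect) AddColumn(table, column, typ string) string {
	return fmt.Sprintf("ALTER TABLE %s ADD COLUMN %s %s", table, column, typ)
}

type postgresDialect struct{}

func (postgresDialect) TextType() string { return "TEXT" }
func (postgresDialect) AddColumn(table, column, typ string) string {
	// Postgres supports IF NOT EXISTS, which makes the migration idempotent.
	return fmt.Sprintf("ALTER TABLE %s ADD COLUMN IF NOT EXISTS %s %s", table, column, typ)
}

// migrationSQL builds the same logical migration for whichever dialect
// the provider was constructed with.
func migrationSQL(d Dialect) string {
	return d.AddColumn("users", "org_id", d.TextType())
}

func main() {
	fmt.Println(migrationSQL(sqliteDialect{}))
	fmt.Println(migrationSQL(postgresDialect{}))
}
```

The migration code then depends only on `Dialect`, so adding a store means adding one implementation rather than branching inside every migration.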
Nityananda Gohain
efe86b0a00 Merge branch 'main' into issue_7183_filter 2025-03-06 14:15:26 +05:30
nityanandagohain
52780a7ad9 fix: add error log 2025-03-06 14:14:48 +05:30
Nityananda Gohain
44b46c089b Update pkg/query-service/common/query_range.go
Co-authored-by: ellipsis-dev[bot] <65095814+ellipsis-dev[bot]@users.noreply.github.com>
2025-03-06 14:13:16 +05:30
nityanandagohain
ca65b4148c fix: address comments 2025-03-06 14:12:09 +05:30
nityanandagohain
064a522293 fix: address comments 2025-03-06 14:08:02 +05:30
nityanandagohain
8563bcdacf fix: address comments 2025-03-06 14:06:17 +05:30
Shaheer Kochai
296a444bd8 fix(logs): centralize time range handling in DateTimeSelectionV2 (#7175)
- Remove custom time range handling logic from logs components
- Use unified time range selection through DateTimeSelectionV2 component
2025-03-06 06:15:11 +00:00
Nityananda Gohain
727cd7747b Update pkg/query-service/app/querier/v2/helper.go
Co-authored-by: ellipsis-dev[bot] <65095814+ellipsis-dev[bot]@users.noreply.github.com>
2025-03-06 11:30:43 +05:30
nityanandagohain
a7ff27ef07 fix: name updated 2025-03-06 11:28:14 +05:30
nityanandagohain
f61e33aa23 fix: update logic to handle actual empty series 2025-03-06 11:26:19 +05:30
dependabot[bot]
36ebde5470 chore(deps): bump dompurify from 3.1.3 to 3.2.4 in /frontend (#7124)
Bumps [dompurify](https://github.com/cure53/DOMPurify) from 3.1.3 to 3.2.4.
- [Release notes](https://github.com/cure53/DOMPurify/releases)
- [Commits](https://github.com/cure53/DOMPurify/compare/3.1.3...3.2.4)

---
updated-dependencies:
- dependency-name: dompurify
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-03-06 11:05:59 +05:30
Raj Kamal Singh
509d9c7fe5 feat: upgrade aws integration agent version (#7223)
Co-authored-by: Srikanth Chekuri <srikanth.chekuri92@gmail.com>
2025-03-06 04:52:13 +00:00
Srikanth Chekuri
816cae3aac fix: update bearer to capital case and handle undefined (#7226) 2025-03-06 04:00:30 +00:00
Yunus M
cb2c492618 chore: use platform property to evaluate type of user, update all references (#7162)
* feat: use platform property to evaluate type of user, update all references
2025-03-05 21:50:29 +05:30
Vibhu Pandey
4177b88a4e fix(alertmanager): fix tests for alertmanager (#7225) 2025-03-05 15:54:54 +00:00
nityanandagohain
5bf79edb8b fix: tests 2025-03-05 20:29:23 +05:30
nityanandagohain
3e2c23d015 Merge remote-tracking branch 'origin/main' into issue_7183_filter 2025-03-05 20:27:57 +05:30
Nityananda Gohain
b1e3f03bb5 fix: new implementation for finding missing timerange (#7201)
* fix: new implementation for finding missing timerange

* fix: remove unwanted code

* fix: update if condition

* fix: update logic and the test cases

* fix: correct name

* fix: fix overlapping test case

* fix: add comments

* Update pkg/query-service/querycache/query_range_cache.go

Co-authored-by: ellipsis-dev[bot] <65095814+ellipsis-dev[bot]@users.noreply.github.com>

* fix: use step ms

* fix: add one more test case

* fix: test case

* fix: merge missing ranges

* Update pkg/query-service/querycache/query_range_cache.go

Co-authored-by: Srikanth Chekuri <srikanth.chekuri92@gmail.com>

---------

Co-authored-by: ellipsis-dev[bot] <65095814+ellipsis-dev[bot]@users.noreply.github.com>
Co-authored-by: Srikanth Chekuri <srikanth.chekuri92@gmail.com>
2025-03-05 20:26:37 +05:30
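The bullets in PR #7201 describe finding the sub-ranges of a query window that the cache does not cover, then merging them. A minimal Go sketch of that idea is below; it is not the actual `querycache` implementation, and it assumes cached ranges are already sorted, non-overlapping, and step-aligned.

```go
package main

import "fmt"

// timeRange is a half-open interval [Start, End) in unix milliseconds.
type timeRange struct{ Start, End int64 }

// missingRanges returns the parts of [start, end) not covered by the
// cached ranges. Cached ranges are assumed sorted and non-overlapping.
func missingRanges(start, end int64, cached []timeRange) []timeRange {
	var missing []timeRange
	cursor := start
	for _, c := range cached {
		if c.End <= cursor || c.Start >= end {
			continue // no overlap with the requested window
		}
		if c.Start > cursor {
			// gap before this cached range
			missing = append(missing, timeRange{cursor, c.Start})
		}
		if c.End > cursor {
			cursor = c.End
		}
	}
	if cursor < end {
		// tail gap after the last cached range
		missing = append(missing, timeRange{cursor, end})
	}
	return missing
}

func main() {
	cached := []timeRange{{100, 200}, {300, 400}}
	fmt.Println(missingRanges(0, 500, cached))
}
```

The "use step ms" bullets suggest the real implementation additionally aligns these boundaries to the aggregation step, so that partial windows at the edges are not served from cache.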
Vibhu Pandey
02865cf49e feat(sqlstore): add transaction support for sqlstore (#7224)
### Summary

- add transaction support for sqlstore
- use transactions in alertmanager
2025-03-05 18:50:48 +05:30
nityanandagohain
c6bd1dd283 fix: use step ms 2025-03-05 16:14:49 +05:30
nityanandagohain
51b4c8d85b Merge remote-tracking branch 'origin/issue_7183' into issue_7183_filter 2025-03-05 16:13:56 +05:30
nityanandagohain
697f16743f fix: use step ms 2025-03-05 16:13:38 +05:30
Nityananda Gohain
0f4e4473ef Update pkg/query-service/querycache/query_range_cache.go
Co-authored-by: ellipsis-dev[bot] <65095814+ellipsis-dev[bot]@users.noreply.github.com>
2025-03-05 16:11:08 +05:30
nityanandagohain
4eb2e0b97b Merge remote-tracking branch 'origin/issue_7183' into issue_7183_filter 2025-03-05 16:08:08 +05:30
Nityananda Gohain
8e5526c66c Merge branch 'main' into issue_7183 2025-03-05 16:07:06 +05:30
nityanandagohain
423561f652 fix: add comments 2025-03-05 16:06:13 +05:30
nityanandagohain
dc61db6936 fix: fix overlapping test case 2025-03-05 15:55:23 +05:30
nityanandagohain
e9bba641bc fix: fix the logic to use the points correctly 2025-03-05 15:40:54 +05:30
Amlan Kumar Nandy
c2d038c025 feat: implement metrics explorer summary section (#7200) 2025-03-05 15:23:23 +05:30
nityanandagohain
fbc4e50136 fix: filter points which are not a complete agg interval 2025-03-04 18:54:55 +05:30
nityanandagohain
6049ba194a fix: correct name 2025-03-04 15:40:47 +05:30
nityanandagohain
068126db40 fix: update logic and the test cases 2025-03-04 14:03:43 +05:30
nityanandagohain
4b87ac6424 fix: update if condition 2025-03-04 10:41:34 +05:30
nityanandagohain
f47a4207a9 fix: remove unwanted code 2025-03-04 01:08:02 +05:30
nityanandagohain
657240c71b fix: new implementation for finding missing timerange 2025-03-04 00:02:01 +05:30
134 changed files with 5824 additions and 1187 deletions

View File

@@ -18,6 +18,7 @@ import (
baseconstants "go.signoz.io/signoz/pkg/query-service/constants"
"go.signoz.io/signoz/pkg/query-service/dao"
basemodel "go.signoz.io/signoz/pkg/query-service/model"
"go.signoz.io/signoz/pkg/types"
"go.uber.org/zap"
)
@@ -45,7 +46,7 @@ func (ah *APIHandler) CloudIntegrationsGenerateConnectionParams(w http.ResponseW
return
}
apiKey, apiErr := ah.getOrCreateCloudIntegrationPAT(r.Context(), currentUser.OrgId, cloudProvider)
apiKey, apiErr := ah.getOrCreateCloudIntegrationPAT(r.Context(), currentUser.OrgID, cloudProvider)
if apiErr != nil {
RespondError(w, basemodel.WrapApiError(
apiErr, "couldn't provision PAT for cloud integration:",
@@ -124,7 +125,7 @@ func (ah *APIHandler) getOrCreateCloudIntegrationPAT(ctx context.Context, orgId
))
}
for _, p := range allPats {
if p.UserID == integrationUser.Id && p.Name == integrationPATName {
if p.UserID == integrationUser.ID && p.Name == integrationPATName {
return p.Token, nil
}
}
@@ -136,7 +137,7 @@ func (ah *APIHandler) getOrCreateCloudIntegrationPAT(ctx context.Context, orgId
newPAT := model.PAT{
Token: generatePATToken(),
UserID: integrationUser.Id,
UserID: integrationUser.ID,
Name: integrationPATName,
Role: baseconstants.ViewerGroup,
ExpiresAt: 0,
@@ -154,7 +155,7 @@ func (ah *APIHandler) getOrCreateCloudIntegrationPAT(ctx context.Context, orgId
func (ah *APIHandler) getOrCreateCloudIntegrationUser(
ctx context.Context, orgId string, cloudProvider string,
) (*basemodel.User, *basemodel.ApiError) {
) (*types.User, *basemodel.ApiError) {
cloudIntegrationUserId := fmt.Sprintf("%s-integration", cloudProvider)
integrationUserResult, apiErr := ah.AppDao().GetUser(ctx, cloudIntegrationUserId)
@@ -171,19 +172,21 @@ func (ah *APIHandler) getOrCreateCloudIntegrationUser(
zap.String("cloudProvider", cloudProvider),
)
newUser := &basemodel.User{
Id: cloudIntegrationUserId,
Name: fmt.Sprintf("%s integration", cloudProvider),
Email: fmt.Sprintf("%s@signoz.io", cloudIntegrationUserId),
CreatedAt: time.Now().Unix(),
OrgId: orgId,
newUser := &types.User{
ID: cloudIntegrationUserId,
Name: fmt.Sprintf("%s integration", cloudProvider),
Email: fmt.Sprintf("%s@signoz.io", cloudIntegrationUserId),
TimeAuditable: types.TimeAuditable{
CreatedAt: time.Now(),
},
OrgID: orgId,
}
viewerGroup, apiErr := dao.DB().GetGroupByName(ctx, baseconstants.ViewerGroup)
if apiErr != nil {
return nil, basemodel.WrapApiError(apiErr, "couldn't get viewer group for creating integration user")
}
newUser.GroupId = viewerGroup.ID
newUser.GroupID = viewerGroup.ID
passwordHash, err := auth.PasswordHash(uuid.NewString())
if err != nil {

View File

@@ -54,7 +54,7 @@ func (ah *APIHandler) createPAT(w http.ResponseWriter, r *http.Request) {
}
// All the PATs are associated with the user creating the PAT.
pat.UserID = user.Id
pat.UserID = user.ID
pat.CreatedAt = time.Now().Unix()
pat.UpdatedAt = time.Now().Unix()
pat.LastUsed = 0
@@ -112,7 +112,7 @@ func (ah *APIHandler) updatePAT(w http.ResponseWriter, r *http.Request) {
return
}
req.UpdatedByUserID = user.Id
req.UpdatedByUserID = user.ID
id := mux.Vars(r)["id"]
req.UpdatedAt = time.Now().Unix()
zap.L().Info("Got Update PAT request", zap.Any("pat", req))
@@ -135,7 +135,7 @@ func (ah *APIHandler) getPATs(w http.ResponseWriter, r *http.Request) {
}, nil)
return
}
zap.L().Info("Get PATs for user", zap.String("user_id", user.Id))
zap.L().Info("Get PATs for user", zap.String("user_id", user.ID))
pats, apierr := ah.AppDao().ListPATs(ctx)
if apierr != nil {
RespondError(w, apierr, nil)
@@ -157,7 +157,7 @@ func (ah *APIHandler) revokePAT(w http.ResponseWriter, r *http.Request) {
}
zap.L().Info("Revoke PAT with id", zap.String("id", id))
if apierr := ah.AppDao().RevokePAT(ctx, id, user.Id); apierr != nil {
if apierr := ah.AppDao().RevokePAT(ctx, id, user.ID); apierr != nil {
RespondError(w, apierr, nil)
return
}

View File

@@ -25,6 +25,7 @@ import (
"go.signoz.io/signoz/ee/query-service/rules"
"go.signoz.io/signoz/pkg/http/middleware"
"go.signoz.io/signoz/pkg/signoz"
"go.signoz.io/signoz/pkg/types"
"go.signoz.io/signoz/pkg/types/authtypes"
"go.signoz.io/signoz/pkg/web"
@@ -340,14 +341,14 @@ func (s *Server) createPublicServer(apiHandler *api.APIHandler, web web.Web) (*h
r := baseapp.NewRouter()
// add auth middleware
getUserFromRequest := func(ctx context.Context) (*basemodel.UserPayload, error) {
getUserFromRequest := func(ctx context.Context) (*types.GettableUser, error) {
user, err := auth.GetUserFromRequestContext(ctx, apiHandler)
if err != nil {
return nil, err
}
if user.User.OrgId == "" {
if user.User.OrgID == "" {
return nil, basemodel.UnauthorizedError(errors.New("orgId is missing in the claims"))
}

View File

@@ -7,14 +7,14 @@ import (
"go.signoz.io/signoz/ee/query-service/app/api"
baseauth "go.signoz.io/signoz/pkg/query-service/auth"
basemodel "go.signoz.io/signoz/pkg/query-service/model"
"go.signoz.io/signoz/pkg/query-service/telemetry"
"go.signoz.io/signoz/pkg/types"
"go.signoz.io/signoz/pkg/types/authtypes"
"go.uber.org/zap"
)
func GetUserFromRequestContext(ctx context.Context, apiHandler *api.APIHandler) (*basemodel.UserPayload, error) {
func GetUserFromRequestContext(ctx context.Context, apiHandler *api.APIHandler) (*types.GettableUser, error) {
patToken, ok := authtypes.UUIDFromContext(ctx)
if ok && patToken != "" {
zap.L().Debug("Received a non-zero length PAT token")
@@ -40,9 +40,9 @@ func GetUserFromRequestContext(ctx context.Context, apiHandler *api.APIHandler)
}
telemetry.GetInstance().SetPatTokenUser()
dao.UpdatePATLastUsed(ctx, patToken, time.Now().Unix())
user.User.GroupId = group.ID
user.User.Id = pat.Id
return &basemodel.UserPayload{
user.User.GroupID = group.ID
user.User.ID = pat.Id
return &types.GettableUser{
User: user.User,
Role: pat.Role,
}, nil

View File

@@ -10,6 +10,7 @@ import (
basedao "go.signoz.io/signoz/pkg/query-service/dao"
baseint "go.signoz.io/signoz/pkg/query-service/interfaces"
basemodel "go.signoz.io/signoz/pkg/query-service/model"
"go.signoz.io/signoz/pkg/types"
"go.signoz.io/signoz/pkg/types/authtypes"
)
@@ -39,7 +40,7 @@ type ModelDao interface {
GetPAT(ctx context.Context, pat string) (*model.PAT, basemodel.BaseApiError)
UpdatePATLastUsed(ctx context.Context, pat string, lastUsed int64) basemodel.BaseApiError
GetPATByID(ctx context.Context, id string) (*model.PAT, basemodel.BaseApiError)
GetUserByPAT(ctx context.Context, token string) (*basemodel.UserPayload, basemodel.BaseApiError)
GetUserByPAT(ctx context.Context, token string) (*types.GettableUser, basemodel.BaseApiError)
ListPATs(ctx context.Context) ([]model.PAT, basemodel.BaseApiError)
RevokePAT(ctx context.Context, id string, userID string) basemodel.BaseApiError
}

View File

@@ -14,11 +14,12 @@ import (
baseconst "go.signoz.io/signoz/pkg/query-service/constants"
basemodel "go.signoz.io/signoz/pkg/query-service/model"
"go.signoz.io/signoz/pkg/query-service/utils"
"go.signoz.io/signoz/pkg/types"
"go.signoz.io/signoz/pkg/types/authtypes"
"go.uber.org/zap"
)
func (m *modelDao) createUserForSAMLRequest(ctx context.Context, email string) (*basemodel.User, basemodel.BaseApiError) {
func (m *modelDao) createUserForSAMLRequest(ctx context.Context, email string) (*types.User, basemodel.BaseApiError) {
// get auth domain from email domain
domain, apierr := m.GetDomainByEmail(ctx, email)
if apierr != nil {
@@ -42,15 +43,17 @@ func (m *modelDao) createUserForSAMLRequest(ctx context.Context, email string) (
return nil, apiErr
}
user := &basemodel.User{
Id: uuid.NewString(),
Name: "",
Email: email,
Password: hash,
CreatedAt: time.Now().Unix(),
user := &types.User{
ID: uuid.NewString(),
Name: "",
Email: email,
Password: hash,
TimeAuditable: types.TimeAuditable{
CreatedAt: time.Now(),
},
ProfilePictureURL: "", // Currently unused
GroupId: group.ID,
OrgId: domain.OrgId,
GroupID: group.ID,
OrgID: domain.OrgId,
}
user, apiErr = m.CreateUser(ctx, user, false)
@@ -73,7 +76,7 @@ func (m *modelDao) PrepareSsoRedirect(ctx context.Context, redirectUri, email st
return "", model.BadRequestStr("invalid user email received from the auth provider")
}
user := &basemodel.User{}
user := &types.User{}
if userPayload == nil {
newUser, apiErr := m.createUserForSAMLRequest(ctx, email)
@@ -95,7 +98,7 @@ func (m *modelDao) PrepareSsoRedirect(ctx context.Context, redirectUri, email st
return fmt.Sprintf("%s?jwt=%s&usr=%s&refreshjwt=%s",
redirectUri,
tokenStore.AccessJwt,
user.Id,
user.ID,
tokenStore.RefreshJwt), nil
}

View File

@@ -8,6 +8,7 @@ import (
"go.signoz.io/signoz/ee/query-service/model"
basemodel "go.signoz.io/signoz/pkg/query-service/model"
"go.signoz.io/signoz/pkg/types"
"go.uber.org/zap"
)
@@ -42,10 +43,10 @@ func (m *modelDao) CreatePAT(ctx context.Context, p model.PAT) (model.PAT, basem
}
} else {
p.CreatedByUser = model.User{
Id: createdByUser.Id,
Id: createdByUser.ID,
Name: createdByUser.Name,
Email: createdByUser.Email,
CreatedAt: createdByUser.CreatedAt,
CreatedAt: createdByUser.CreatedAt.Unix(),
ProfilePictureURL: createdByUser.ProfilePictureURL,
NotFound: false,
}
@@ -95,10 +96,10 @@ func (m *modelDao) ListPATs(ctx context.Context) ([]model.PAT, basemodel.BaseApi
}
} else {
pats[i].CreatedByUser = model.User{
Id: createdByUser.Id,
Id: createdByUser.ID,
Name: createdByUser.Name,
Email: createdByUser.Email,
CreatedAt: createdByUser.CreatedAt,
CreatedAt: createdByUser.CreatedAt.Unix(),
ProfilePictureURL: createdByUser.ProfilePictureURL,
NotFound: false,
}
@@ -111,10 +112,10 @@ func (m *modelDao) ListPATs(ctx context.Context) ([]model.PAT, basemodel.BaseApi
}
} else {
pats[i].UpdatedByUser = model.User{
Id: updatedByUser.Id,
Id: updatedByUser.ID,
Name: updatedByUser.Name,
Email: updatedByUser.Email,
CreatedAt: updatedByUser.CreatedAt,
CreatedAt: updatedByUser.CreatedAt.Unix(),
ProfilePictureURL: updatedByUser.ProfilePictureURL,
NotFound: false,
}
@@ -170,8 +171,8 @@ func (m *modelDao) GetPATByID(ctx context.Context, id string) (*model.PAT, basem
}
// deprecated
func (m *modelDao) GetUserByPAT(ctx context.Context, token string) (*basemodel.UserPayload, basemodel.BaseApiError) {
users := []basemodel.UserPayload{}
func (m *modelDao) GetUserByPAT(ctx context.Context, token string) (*types.GettableUser, basemodel.BaseApiError) {
users := []types.GettableUser{}
query := `SELECT
u.id,

View File

@@ -11,7 +11,7 @@ import (
saml2 "github.com/russellhaering/gosaml2"
"go.signoz.io/signoz/ee/query-service/sso"
"go.signoz.io/signoz/ee/query-service/sso/saml"
basemodel "go.signoz.io/signoz/pkg/query-service/model"
"go.signoz.io/signoz/pkg/types"
"go.uber.org/zap"
)
@@ -33,7 +33,7 @@ type OrgDomain struct {
SamlConfig *SamlConfig `json:"samlConfig"`
GoogleAuthConfig *GoogleOAuthConfig `json:"googleAuthConfig"`
Org *basemodel.Organization
Org *types.Organization
}
func (od *OrgDomain) String() string {

View File

@@ -47,6 +47,7 @@
"@tanstack/react-virtual": "3.11.2",
"@uiw/react-md-editor": "3.23.5",
"@visx/group": "3.3.0",
"@visx/hierarchy": "3.12.0",
"@visx/shape": "3.5.0",
"@visx/tooltip": "3.3.0",
"@xstate/react": "^3.0.0",
@@ -69,8 +70,9 @@
"cross-env": "^7.0.3",
"css-loader": "5.0.0",
"css-minimizer-webpack-plugin": "5.0.1",
"d3-hierarchy": "3.1.2",
"dayjs": "^1.10.7",
"dompurify": "3.1.3",
"dompurify": "3.2.4",
"dotenv": "8.2.0",
"event-source-polyfill": "1.0.31",
"eventemitter3": "5.0.1",

View File

@@ -4,6 +4,7 @@ import getOrgUser from 'api/user/getOrgUser';
import { FeatureKeys } from 'constants/features';
import { LOCALSTORAGE } from 'constants/localStorage';
import ROUTES from 'constants/routes';
import { useGetTenantLicense } from 'hooks/useGetTenantLicense';
import history from 'lib/history';
import { isEmpty } from 'lodash-es';
import { useAppContext } from 'providers/App/App';
@@ -13,7 +14,6 @@ import { matchPath, useLocation } from 'react-router-dom';
import { LicenseState, LicenseStatus } from 'types/api/licensesV3/getActive';
import { Organization } from 'types/api/user/getOrganization';
import { USER_ROLES } from 'types/roles';
import { isCloudUser } from 'utils/app';
import { routePermission } from 'utils/permission';
import routes, {
@@ -55,7 +55,7 @@ function PrivateRoute({ children }: PrivateRouteProps): JSX.Element {
);
const isOldRoute = oldRoutes.indexOf(pathname) > -1;
const currentRoute = mapRoutes.get('current');
const isCloudUserVal = isCloudUser();
const { isCloudUser: isCloudUserVal } = useGetTenantLicense();
const [orgData, setOrgData] = useState<Organization | undefined>(undefined);

View File

@@ -10,6 +10,7 @@ import AppLayout from 'container/AppLayout';
import useAnalytics from 'hooks/analytics/useAnalytics';
import { KeyboardHotkeysProvider } from 'hooks/hotkeys/useKeyboardHotkeys';
import { useThemeConfig } from 'hooks/useDarkMode';
import { useGetTenantLicense } from 'hooks/useGetTenantLicense';
import { LICENSE_PLAN_KEY } from 'hooks/useLicense';
import { NotificationProvider } from 'hooks/useNotifications';
import { ResourceProvider } from 'hooks/useResourceAttribute';
@@ -24,7 +25,7 @@ import { QueryBuilderProvider } from 'providers/QueryBuilder';
import { Suspense, useCallback, useEffect, useState } from 'react';
import { Route, Router, Switch } from 'react-router-dom';
import { CompatRouter } from 'react-router-dom-v5-compat';
import { extractDomain, isCloudUser, isEECloudUser } from 'utils/app';
import { extractDomain } from 'utils/app';
import PrivateRoute from './Private';
import defaultRoutes, {
@@ -54,7 +55,10 @@ function App(): JSX.Element {
const { hostname, pathname } = window.location;
const isCloudUserVal = isCloudUser();
const {
isCloudUser: isCloudUserVal,
isEECloudUser: isEECloudUserVal,
} = useGetTenantLicense();
const enableAnalytics = useCallback(
(user: IUser): void => {
@@ -150,7 +154,7 @@ function App(): JSX.Element {
let updatedRoutes = defaultRoutes;
// if the user is a cloud user
if (isCloudUserVal || isEECloudUser()) {
if (isCloudUserVal || isEECloudUserVal) {
// if the user is on basic plan then remove billing
if (isOnBasicPlan) {
updatedRoutes = updatedRoutes.filter(
@@ -175,6 +179,7 @@ function App(): JSX.Element {
isCloudUserVal,
isFetchingLicenses,
isFetchingUser,
isEECloudUserVal,
]);
useEffect(() => {

View File

@@ -9,19 +9,21 @@ const create = async (
): Promise<SuccessResponse<PayloadProps> | ErrorResponse> => {
try {
let httpConfig = {};
const username = props.username ? props.username.trim() : '';
const password = props.password ? props.password.trim() : '';
if (props.username !== '' && props.password !== '') {
if (username !== '' && password !== '') {
httpConfig = {
basic_auth: {
username: props.username,
password: props.password,
username,
password,
},
};
} else if (props.username === '' && props.password !== '') {
} else if (username === '' && password !== '') {
httpConfig = {
authorization: {
type: 'bearer',
credentials: props.password,
type: 'Bearer',
credentials: password,
},
};
}

View File

@@ -9,18 +9,21 @@ const editWebhook = async (
): Promise<SuccessResponse<PayloadProps> | ErrorResponse> => {
try {
let httpConfig = {};
if (props.username !== '' && props.password !== '') {
const username = props.username ? props.username.trim() : '';
const password = props.password ? props.password.trim() : '';
if (username !== '' && password !== '') {
httpConfig = {
basic_auth: {
username: props.username,
password: props.password,
username,
password,
},
};
} else if (props.username === '' && props.password !== '') {
} else if (username === '' && password !== '') {
httpConfig = {
authorization: {
type: 'bearer',
credentials: props.password,
type: 'Bearer',
credentials: password,
},
};
}

View File

@@ -9,19 +9,21 @@ const testWebhook = async (
): Promise<SuccessResponse<PayloadProps> | ErrorResponse> => {
try {
let httpConfig = {};
const username = props.username ? props.username.trim() : '';
const password = props.password ? props.password.trim() : '';
if (props.username !== '' && props.password !== '') {
if (username !== '' && password !== '') {
httpConfig = {
basic_auth: {
username: props.username,
password: props.password,
username,
password,
},
};
} else if (props.username === '' && props.password !== '') {
} else if (username === '' && password !== '') {
httpConfig = {
authorization: {
type: 'bearer',
credentials: props.password,
type: 'Bearer',
credentials: password,
},
};
}
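The three webhook diffs above (create, edit, test) apply the same fix: trim the credentials before comparing, pick basic auth when both fields are set, and fall back to a bearer token when only the password is set — with the scheme capitalized as "Bearer". The original is TypeScript; the decision logic is re-sketched in Go below for illustration, with a hypothetical function name.

```go
package main

import (
	"fmt"
	"strings"
)

// buildHTTPConfig mirrors the channel-config branching above: trimmed
// credentials, basic auth when both are present, and a "Bearer"
// authorization when only a password/token is given.
func buildHTTPConfig(username, password string) map[string]any {
	username = strings.TrimSpace(username)
	password = strings.TrimSpace(password)
	switch {
	case username != "" && password != "":
		return map[string]any{
			"basic_auth": map[string]string{"username": username, "password": password},
		}
	case username == "" && password != "":
		return map[string]any{
			// The fix capitalizes the scheme: "Bearer", not "bearer".
			"authorization": map[string]string{"type": "Bearer", "credentials": password},
		}
	default:
		return map[string]any{}
	}
}

func main() {
	fmt.Println(buildHTTPConfig("", "  token  "))
}
```

Trimming first also fixes the earlier behavior where whitespace-only input was treated as a real credential.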

View File

@@ -0,0 +1,67 @@
import axios from 'api';
import { ErrorResponseHandler } from 'api/ErrorResponseHandler';
import { AxiosError } from 'axios';
import {
OrderByPayload,
TreemapViewType,
} from 'container/MetricsExplorer/Summary/types';
import { ErrorResponse, SuccessResponse } from 'types/api';
import { BaseAutocompleteData } from 'types/api/queryBuilder/queryAutocompleteResponse';
import { TagFilter } from 'types/api/queryBuilder/queryBuilderData';
export interface MetricsListPayload {
filters: TagFilter;
groupBy?: BaseAutocompleteData[];
offset?: number;
limit?: number;
orderBy?: OrderByPayload;
}
export enum MetricType {
SUM = 'Sum',
GAUGE = 'Gauge',
HISTOGRAM = 'Histogram',
SUMMARY = 'Summary',
EXPONENTIAL_HISTOGRAM = 'ExponentialHistogram',
}
export interface MetricsListItemData {
metric_name: string;
description: string;
type: MetricType;
unit: string;
[TreemapViewType.CARDINALITY]: number;
[TreemapViewType.DATAPOINTS]: number;
lastReceived: string;
}
export interface MetricsListResponse {
status: string;
data: {
metrics: MetricsListItemData[];
total?: number;
};
}
export const getMetricsList = async (
props: MetricsListPayload,
signal?: AbortSignal,
headers?: Record<string, string>,
): Promise<SuccessResponse<MetricsListResponse> | ErrorResponse> => {
try {
const response = await axios.post('/metrics', props, {
signal,
headers,
});
return {
statusCode: 200,
error: null,
message: response.data.status,
payload: response.data,
params: props,
};
} catch (error) {
return ErrorResponseHandler(error as AxiosError);
}
};

View File

@@ -0,0 +1,34 @@
import axios from 'api';
import { ErrorResponseHandler } from 'api/ErrorResponseHandler';
import { AxiosError } from 'axios';
import { ErrorResponse, SuccessResponse } from 'types/api';
import { BaseAutocompleteData } from 'types/api/queryBuilder/queryAutocompleteResponse';
export interface MetricsListFilterKeysResponse {
status: string;
data: {
metricColumns: string[];
attributeKeys: BaseAutocompleteData[];
};
}
export const getMetricsListFilterKeys = async (
signal?: AbortSignal,
headers?: Record<string, string>,
): Promise<SuccessResponse<MetricsListFilterKeysResponse> | ErrorResponse> => {
try {
const response = await axios.get('/metrics/filters/keys', {
signal,
headers,
});
return {
statusCode: 200,
error: null,
message: response.data.status,
payload: response.data,
};
} catch (error) {
return ErrorResponseHandler(error as AxiosError);
}
};

View File

@@ -0,0 +1,44 @@
import axios from 'api';
import { ErrorResponseHandler } from 'api/ErrorResponseHandler';
import { AxiosError } from 'axios';
import { ErrorResponse, SuccessResponse } from 'types/api';
import { BaseAutocompleteData } from 'types/api/queryBuilder/queryAutocompleteResponse';
export interface MetricsListFilterValuesPayload {
filterAttributeKeyDataType: string;
filterKey: string;
searchText: string;
limit: number;
}
export interface MetricsListFilterValuesResponse {
status: string;
data: {
FilterValues: BaseAutocompleteData[];
};
}
export const getMetricsListFilterValues = async (
props: MetricsListFilterValuesPayload,
signal?: AbortSignal,
headers?: Record<string, string>,
): Promise<
SuccessResponse<MetricsListFilterValuesResponse> | ErrorResponse
> => {
try {
const response = await axios.post('/metrics/filters/values', props, {
signal,
headers,
});
return {
statusCode: 200,
error: null,
message: response.data.status,
payload: response.data,
params: props,
};
} catch (error) {
return ErrorResponseHandler(error as AxiosError);
}
};

View File

@@ -0,0 +1,54 @@
import axios from 'api';
import { ErrorResponseHandler } from 'api/ErrorResponseHandler';
import { AxiosError } from 'axios';
import { TreemapViewType } from 'container/MetricsExplorer/Summary/types';
import { ErrorResponse, SuccessResponse } from 'types/api';
import { TagFilter } from 'types/api/queryBuilder/queryBuilderData';
export interface MetricsTreeMapPayload {
filters: TagFilter;
limit?: number;
treemap?: TreemapViewType;
}
export interface MetricsTreeMapResponse {
status: string;
data: {
[TreemapViewType.CARDINALITY]: CardinalityData[];
[TreemapViewType.DATAPOINTS]: DatapointsData[];
};
}
export interface CardinalityData {
percentage: number;
total_value: number;
metric_name: string;
}
export interface DatapointsData {
percentage: number;
metric_name: string;
}
export const getMetricsTreeMap = async (
props: MetricsTreeMapPayload,
signal?: AbortSignal,
headers?: Record<string, string>,
): Promise<SuccessResponse<MetricsTreeMapResponse> | ErrorResponse> => {
try {
const response = await axios.post('/metrics/treemap', props, {
signal,
headers,
});
return {
statusCode: 200,
error: null,
message: response.data.status,
payload: response.data,
params: props,
};
} catch (error) {
return ErrorResponseHandler(error as AxiosError);
}
};

View File

@@ -1,26 +0,0 @@
import axios from 'api';
import { ErrorResponseHandler } from 'api/ErrorResponseHandler';
import { AxiosError } from 'axios';
import { ErrorResponse, SuccessResponse } from 'types/api';
import { PayloadProps, Props } from 'types/api/user/setFlags';
const setFlags = async (
props: Props,
): Promise<SuccessResponse<PayloadProps> | ErrorResponse> => {
try {
const response = await axios.patch(`/user/${props.userId}/flags`, {
...props.flags,
});
return {
statusCode: 200,
error: null,
message: response.data?.status,
payload: response.data,
};
} catch (error) {
return ErrorResponseHandler(error as AxiosError);
}
};
export default setFlags;

View File

@@ -6,6 +6,7 @@ import logEvent from 'api/common/logEvent';
import cx from 'classnames';
import { SOMETHING_WENT_WRONG } from 'constants/api';
import { FeatureKeys } from 'constants/features';
import { useGetTenantLicense } from 'hooks/useGetTenantLicense';
import { useNotifications } from 'hooks/useNotifications';
import { defaultTo } from 'lodash-es';
import { CreditCard, HelpCircle, X } from 'lucide-react';
@@ -16,7 +17,6 @@ import { useLocation } from 'react-router-dom';
import { ErrorResponse, SuccessResponse } from 'types/api';
import { CheckoutSuccessPayloadProps } from 'types/api/billing/checkout';
import { License } from 'types/api/licenses/def';
import { isCloudUser } from 'utils/app';
export interface LaunchChatSupportProps {
eventName: string;
@@ -38,7 +38,7 @@ function LaunchChatSupport({
onHoverText = '',
intercomMessageDisabled = false,
}: LaunchChatSupportProps): JSX.Element | null {
const isCloudUserVal = isCloudUser();
const { isCloudUser: isCloudUserVal } = useGetTenantLicense();
const { notifications } = useNotifications();
const {
licenses,
@@ -77,7 +77,6 @@ function LaunchChatSupport({
) {
let isChatSupportEnabled = false;
let isPremiumSupportEnabled = false;
const isCloudUserVal = isCloudUser();
if (featureFlags && featureFlags.length > 0) {
isChatSupportEnabled =
featureFlags.find((flag) => flag.name === FeatureKeys.CHAT_SUPPORT)
@@ -99,6 +98,7 @@ function LaunchChatSupport({
}, [
featureFlags,
featureFlagsFetchError,
isCloudUserVal,
isFetchingFeatureFlags,
isLoggedIn,
licenses,

View File

@@ -16,6 +16,7 @@ import { OnboardingStatusResponse } from 'api/messagingQueues/onboarding/getOnbo
import { QueryParams } from 'constants/query';
import ROUTES from 'constants/routes';
import { History } from 'history';
import { useGetTenantLicense } from 'hooks/useGetTenantLicense';
import { Bolt, Check, OctagonAlert, X } from 'lucide-react';
import {
KAFKA_SETUP_DOC_LINK,
@@ -23,7 +24,6 @@ import {
} from 'pages/MessagingQueues/MessagingQueuesUtils';
import { ReactNode, useEffect, useState } from 'react';
import { useHistory } from 'react-router-dom';
import { isCloudUser } from 'utils/app';
import { v4 as uuid } from 'uuid';
interface AttributeCheckListProps {
@@ -181,7 +181,7 @@ function AttributeCheckList({
const handleFilterChange = (value: AttributesFilters): void => {
setFilter(value);
};
const isCloudUserVal = isCloudUser();
const { isCloudUser: isCloudUserVal } = useGetTenantLicense();
const history = useHistory();
useEffect(() => {


@@ -1,4 +0,0 @@
export default interface ReleaseNoteProps {
path?: string;
release?: string;
}


@@ -1,61 +0,0 @@
import { Button, Space } from 'antd';
import setFlags from 'api/user/setFlags';
import MessageTip from 'components/MessageTip';
import { useAppContext } from 'providers/App/App';
import { useCallback } from 'react';
import { UserFlags } from 'types/api/user/setFlags';
import ReleaseNoteProps from '../ReleaseNoteProps';
export default function ReleaseNote0120({
release,
}: ReleaseNoteProps): JSX.Element | null {
const { user, setUserFlags } = useAppContext();
const handleDontShow = useCallback(async (): Promise<void> => {
const flags: UserFlags = { ReleaseNote0120Hide: 'Y' };
try {
setUserFlags(flags);
if (!user) {
// no user is set, so escape the routine
return;
}
const response = await setFlags({ userId: user.id, flags });
if (response.statusCode !== 200) {
console.log('failed to complete do not show status', response.error);
}
} catch (e) {
// here we do nothing, as the cost of the error is minor;
// the user can toggle the do-not-show option again in the future.
console.log('unexpected error: failed to complete do not show status', e);
}
}, [setUserFlags, user]);
return (
<MessageTip
show
message={
<div>
You are using {release} of SigNoz. We have introduced distributed setup in
v0.12.0 release. If you use or plan to use clickhouse queries in dashboard
or alerts, you might want to read about querying the new distributed tables{' '}
<a
href="https://signoz.io/docs/operate/migration/upgrade-0.12/#querying-distributed-tables"
target="_blank"
rel="noreferrer"
>
here
</a>
</div>
}
action={
<Space>
<Button onClick={handleDontShow}>Do not show again</Button>
</Space>
}
/>
);
}


@@ -1,68 +0,0 @@
import ReleaseNoteProps from 'components/ReleaseNote/ReleaseNoteProps';
import ReleaseNote0120 from 'components/ReleaseNote/Releases/ReleaseNote0120';
import ROUTES from 'constants/routes';
import { useAppContext } from 'providers/App/App';
import { useSelector } from 'react-redux';
import { AppState } from 'store/reducers';
import { UserFlags } from 'types/api/user/setFlags';
import AppReducer from 'types/reducer/app';
interface ComponentMapType {
match: (
path: string | undefined,
version: string,
userFlags: UserFlags | null,
) => boolean;
component: ({ path, release }: ReleaseNoteProps) => JSX.Element | null;
}
const allComponentMap: ComponentMapType[] = [
{
match: (
path: string | undefined,
version: string,
userFlags: UserFlags | null,
): boolean => {
if (!path) {
return false;
}
const allowedPaths: string[] = [
ROUTES.LIST_ALL_ALERT,
ROUTES.APPLICATION,
ROUTES.ALL_DASHBOARD,
];
return (
userFlags?.ReleaseNote0120Hide !== 'Y' &&
allowedPaths.includes(path) &&
version.startsWith('v0.12')
);
},
component: ReleaseNote0120,
},
];
// ReleaseNote prints release specific warnings and notes that
// user needs to be aware of before using the upgraded version.
function ReleaseNote({ path }: ReleaseNoteProps): JSX.Element | null {
const { user } = useAppContext();
const { currentVersion } = useSelector<AppState, AppReducer>(
(state) => state.app,
);
const c = allComponentMap.find((item) =>
item.match(path, currentVersion, user.flags),
);
if (!c) {
return null;
}
return <c.component path={path} release={currentVersion} />;
}
ReleaseNote.defaultProps = {
path: '',
};
export default ReleaseNote;


@@ -43,4 +43,10 @@ export const REACT_QUERY_KEY = {
AWS_GENERATE_CONNECTION_URL: 'AWS_GENERATE_CONNECTION_URL',
AWS_GET_CONNECTION_PARAMS: 'AWS_GET_CONNECTION_PARAMS',
GET_ATTRIBUTE_VALUES: 'GET_ATTRIBUTE_VALUES',
// Metrics Explorer Query Keys
GET_METRICS_LIST: 'GET_METRICS_LIST',
GET_METRICS_TREE_MAP: 'GET_METRICS_TREE_MAP',
GET_METRICS_LIST_FILTER_KEYS: 'GET_METRICS_LIST_FILTER_KEYS',
GET_METRICS_LIST_FILTER_VALUES: 'GET_METRICS_LIST_FILTER_VALUES',
};


@@ -23,6 +23,7 @@ import SideNav from 'container/SideNav';
import TopNav from 'container/TopNav';
import dayjs from 'dayjs';
import { useIsDarkMode } from 'hooks/useDarkMode';
import { useGetTenantLicense } from 'hooks/useGetTenantLicense';
import { useNotifications } from 'hooks/useNotifications';
import history from 'lib/history';
import { isNull } from 'lodash-es';
@@ -54,7 +55,6 @@ import { ErrorResponse, SuccessResponse } from 'types/api';
import { CheckoutSuccessPayloadProps } from 'types/api/billing/checkout';
import { LicenseEvent } from 'types/api/licensesV3/getActive';
import { USER_ROLES } from 'types/roles';
import { isCloudUser } from 'utils/app';
import { eventEmitter } from 'utils/getEventEmitter';
import {
getFormattedDate,
@@ -122,6 +122,8 @@ function AppLayout(props: AppLayoutProps): JSX.Element {
const { pathname } = useLocation();
const { t } = useTranslation(['titles']);
const { isCloudUser: isCloudUserVal } = useGetTenantLicense();
const [getUserVersionResponse, getUserLatestVersionResponse] = useQueries([
{
queryFn: getUserVersion,
@@ -354,7 +356,6 @@ function AppLayout(props: AppLayoutProps): JSX.Element {
) {
let isChatSupportEnabled = false;
let isPremiumSupportEnabled = false;
const isCloudUserVal = isCloudUser();
if (featureFlags && featureFlags.length > 0) {
isChatSupportEnabled =
featureFlags.find((flag) => flag.name === FeatureKeys.CHAT_SUPPORT)
@@ -376,6 +377,7 @@ function AppLayout(props: AppLayoutProps): JSX.Element {
}, [
featureFlags,
featureFlagsFetchError,
isCloudUserVal,
isFetchingFeatureFlags,
isLoggedIn,
licenses,


@@ -24,6 +24,7 @@ import Spinner from 'components/Spinner';
import { SOMETHING_WENT_WRONG } from 'constants/api';
import { REACT_QUERY_KEY } from 'constants/reactQueryKeys';
import useAxiosError from 'hooks/useAxiosError';
import { useGetTenantLicense } from 'hooks/useGetTenantLicense';
import { useNotifications } from 'hooks/useNotifications';
import { isEmpty, pick } from 'lodash-es';
import { useAppContext } from 'providers/App/App';
@@ -33,7 +34,6 @@ import { useMutation, useQuery } from 'react-query';
import { ErrorResponse, SuccessResponse } from 'types/api';
import { CheckoutSuccessPayloadProps } from 'types/api/billing/checkout';
import { License } from 'types/api/licenses/def';
import { isCloudUser } from 'utils/app';
import { getFormattedDate, getRemainingDays } from 'utils/timeUtils';
import { BillingUsageGraph } from './BillingUsageGraph/BillingUsageGraph';
@@ -145,7 +145,7 @@ export default function BillingContainer(): JSX.Element {
const handleError = useAxiosError();
const isCloudUserVal = isCloudUser();
const { isCloudUser: isCloudUserVal } = useGetTenantLicense();
const processUsageData = useCallback(
(data: any): void => {


@@ -5,6 +5,7 @@ import setRetentionApi from 'api/settings/setRetention';
import TextToolTip from 'components/TextToolTip';
import GeneralSettingsCloud from 'container/GeneralSettingsCloud';
import useComponentPermission from 'hooks/useComponentPermission';
import { useGetTenantLicense } from 'hooks/useGetTenantLicense';
import { useNotifications } from 'hooks/useNotifications';
import find from 'lodash-es/find';
import { useAppContext } from 'providers/App/App';
@@ -23,7 +24,6 @@ import {
PayloadPropsMetrics as GetRetentionPeriodMetricsPayload,
PayloadPropsTraces as GetRetentionPeriodTracesPayload,
} from 'types/api/settings/getRetention';
import { isCloudUser } from 'utils/app';
import Retention from './Retention';
import StatusMessage from './StatusMessage';
@@ -394,7 +394,7 @@ function GeneralSettings({
onModalToggleHandler(type);
};
const isCloudUserVal = isCloudUser();
const { isCloudUser: isCloudUserVal } = useGetTenantLicense();
const renderConfig = [
{


@@ -1,4 +1,5 @@
import { Col, Row, Select } from 'antd';
import { useGetTenantLicense } from 'hooks/useGetTenantLicense';
import { find } from 'lodash-es';
import {
ChangeEvent,
@@ -8,7 +9,6 @@ import {
useRef,
useState,
} from 'react';
import { isCloudUser } from 'utils/app';
import {
Input,
@@ -39,6 +39,9 @@ function Retention({
initialValue,
);
const interacted = useRef(false);
const { isCloudUser: isCloudUserVal } = useGetTenantLicense();
useEffect(() => {
if (!interacted.current) setSelectedValue(initialValue);
}, [initialValue]);
@@ -91,8 +94,6 @@ function Retention({
return null;
}
const isCloudUserVal = isCloudUser();
return (
<RetentionContainer>
<Row justify="space-between">


@@ -1,21 +1,18 @@
import { Space } from 'antd';
import getAll from 'api/alerts/getAll';
import logEvent from 'api/common/logEvent';
import ReleaseNote from 'components/ReleaseNote';
import Spinner from 'components/Spinner';
import { useNotifications } from 'hooks/useNotifications';
import { isUndefined } from 'lodash-es';
import { useEffect, useRef } from 'react';
import { useTranslation } from 'react-i18next';
import { useQuery } from 'react-query';
import { useLocation } from 'react-router-dom';
import { AlertsEmptyState } from './AlertsEmptyState/AlertsEmptyState';
import ListAlert from './ListAlert';
function ListAlertRules(): JSX.Element {
const { t } = useTranslation('common');
const location = useLocation();
const { data, isError, isLoading, refetch, status } = useQuery('allAlerts', {
queryFn: getAll,
cacheTime: 0,
@@ -70,7 +67,6 @@ function ListAlertRules(): JSX.Element {
return (
<Space direction="vertical" size="large" style={{ width: '100%' }}>
<ReleaseNote path={location.pathname} />
<ListAlert
{...{
allAlertRules: data.payload,


@@ -34,6 +34,7 @@ import { Base64Icons } from 'container/NewDashboard/DashboardSettings/General/ut
import dayjs from 'dayjs';
import { useGetAllDashboard } from 'hooks/dashboard/useGetAllDashboard';
import useComponentPermission from 'hooks/useComponentPermission';
import { useGetTenantLicense } from 'hooks/useGetTenantLicense';
import { useNotifications } from 'hooks/useNotifications';
import { useSafeNavigate } from 'hooks/useSafeNavigate';
import { get, isEmpty, isUndefined } from 'lodash-es';
@@ -82,7 +83,6 @@ import {
WidgetRow,
Widgets,
} from 'types/api/dashboard/getAll';
import { isCloudUser } from 'utils/app';
import DashboardTemplatesModal from './DashboardTemplates/DashboardTemplatesModal';
import ImportJSON from './ImportJSON';
@@ -111,6 +111,8 @@ function DashboardsList(): JSX.Element {
setListSortOrder: setSortOrder,
} = useDashboard();
const { isCloudUser: isCloudUserVal } = useGetTenantLicense();
const [searchString, setSearchString] = useState<string>(
sortOrder.search || '',
);
@@ -694,7 +696,7 @@ function DashboardsList(): JSX.Element {
Create and manage dashboards for your workspace.
</Typography.Text>
</Flex>
{isCloudUser() && (
{isCloudUserVal && (
<div className="integrations-container">
<div className="integrations-content">
<RequestDashboardBtn />
@@ -735,7 +737,7 @@ function DashboardsList(): JSX.Element {
<Button
type="text"
className="learn-more"
onClick={(): void => handleContactSupport(isCloudUser())}
onClick={(): void => handleContactSupport(isCloudUserVal)}
>
Contact Support
</Button>


@@ -3,13 +3,15 @@
import './LogsError.styles.scss';
import { Typography } from 'antd';
import { useGetTenantLicense } from 'hooks/useGetTenantLicense';
import history from 'lib/history';
import { ArrowRight } from 'lucide-react';
import { isCloudUser } from 'utils/app';
export default function LogsError(): JSX.Element {
const { isCloudUser: isCloudUserVal } = useGetTenantLicense();
const handleContactSupport = (): void => {
if (isCloudUser()) {
if (isCloudUserVal) {
history.push('/support');
} else {
window.open('https://signoz.io/slack', '_blank');


@@ -30,7 +30,6 @@ import TimeSeriesView from 'container/TimeSeriesView/TimeSeriesView';
import dayjs from 'dayjs';
import { useUpdateDashboard } from 'hooks/dashboard/useUpdateDashboard';
import { addEmptyWidgetInDashboardJSONWithQuery } from 'hooks/dashboard/utils';
import { LogTimeRange } from 'hooks/logs/types';
import { useCopyLogLink } from 'hooks/logs/useCopyLogLink';
import { useGetExplorerQueryRange } from 'hooks/queryBuilder/useGetExplorerQueryRange';
import { useGetPanelTypesQueryParam } from 'hooks/queryBuilder/useGetPanelTypesQueryParam';
@@ -104,7 +103,7 @@ function LogsExplorerViews({
// this is to respect the panel type present in the URL rather than defaulting it to list always.
const panelTypes = useGetPanelTypesQueryParam(PANEL_TYPES.LIST);
const { activeLogId, onTimeRangeChange } = useCopyLogLink();
const { activeLogId } = useCopyLogLink();
const { queryData: pageSize } = useUrlQueryData(
QueryParams.pageSize,
@@ -562,7 +561,6 @@ function LogsExplorerViews({
}, [handleSetConfig, panelTypes]);
useEffect(() => {
const currentParams = data?.params as Omit<LogTimeRange, 'pageSize'>;
const currentData = data?.payload?.data?.newResult?.data?.result || [];
if (currentData.length > 0 && currentData[0].list) {
const currentLogs: ILog[] = currentData[0].list.map((item) => ({
@@ -572,11 +570,6 @@ function LogsExplorerViews({
const newLogs = [...logs, ...currentLogs];
setLogs(newLogs);
onTimeRangeChange({
start: currentParams?.start,
end: currentParams?.end,
pageSize: newLogs.length,
});
}
// eslint-disable-next-line react-hooks/exhaustive-deps
@@ -612,7 +605,6 @@ function LogsExplorerViews({
pageSize,
minTime,
activeLogId,
onTimeRangeChange,
panelType,
selectedView,
]);


@@ -0,0 +1,40 @@
import { Select } from 'antd';
import QueryBuilderSearch from 'container/QueryBuilder/filters/QueryBuilderSearch';
import DateTimeSelectionV2 from 'container/TopNav/DateTimeSelectionV2';
import { HardHat } from 'lucide-react';
import { TREEMAP_VIEW_OPTIONS } from './constants';
import { MetricsSearchProps } from './types';
function MetricsSearch({
query,
onChange,
heatmapView,
setHeatmapView,
}: MetricsSearchProps): JSX.Element {
return (
<div className="metrics-search-container">
<div className="metrics-search-options">
<Select
style={{ width: 140 }}
options={TREEMAP_VIEW_OPTIONS}
value={heatmapView}
onChange={setHeatmapView}
/>
<DateTimeSelectionV2
showAutoRefresh={false}
showRefreshText={false}
hideShareModal
/>
</div>
<QueryBuilderSearch
query={query}
onChange={onChange}
suffixIcon={<HardHat size={16} />}
isMetricsExplorer
/>
</div>
);
}
export default MetricsSearch;


@@ -0,0 +1,86 @@
import { LoadingOutlined } from '@ant-design/icons';
import {
Spin,
Table,
TablePaginationConfig,
TableProps,
Typography,
} from 'antd';
import { SorterResult } from 'antd/es/table/interface';
import { useCallback } from 'react';
import { MetricsListItemRowData, MetricsTableProps } from './types';
import { metricsTableColumns } from './utils';
function MetricsTable({
isLoading,
data,
pageSize,
currentPage,
onPaginationChange,
setOrderBy,
totalCount,
}: MetricsTableProps): JSX.Element {
const handleTableChange: TableProps<MetricsListItemRowData>['onChange'] = useCallback(
(
_pagination: TablePaginationConfig,
_filters: Record<string, (string | number | boolean)[] | null>,
sorter:
| SorterResult<MetricsListItemRowData>
| SorterResult<MetricsListItemRowData>[],
): void => {
if ('field' in sorter && sorter.order) {
setOrderBy({
columnName: sorter.field as string,
order: sorter.order === 'ascend' ? 'asc' : 'desc',
});
} else {
setOrderBy({
columnName: 'type',
order: 'asc',
});
}
},
[setOrderBy],
);
return (
<div className="metrics-table-container">
<Table
loading={{
spinning: isLoading,
indicator: <Spin indicator={<LoadingOutlined size={14} spin />} />,
}}
dataSource={data}
columns={metricsTableColumns}
locale={{
emptyText: (
<div className="no-metrics-message-container">
<img
src="/Icons/emptyState.svg"
alt="thinking-emoji"
className="empty-state-svg"
/>
<Typography.Text className="no-metrics-message">
This query had no results. Edit your query and try again!
</Typography.Text>
</div>
),
}}
tableLayout="fixed"
onChange={handleTableChange}
pagination={{
current: currentPage,
pageSize,
showSizeChanger: true,
hideOnSinglePage: false,
onChange: onPaginationChange,
total: totalCount,
}}
/>
</div>
);
}
export default MetricsTable;


@@ -0,0 +1,130 @@
import { Group } from '@visx/group';
import { Treemap } from '@visx/hierarchy';
import { Empty, Skeleton, Tooltip } from 'antd';
import { stratify, treemapBinary } from 'd3-hierarchy';
import { useMemo } from 'react';
import { useWindowSize } from 'react-use';
import {
TREEMAP_HEIGHT,
TREEMAP_MARGINS,
TREEMAP_SQUARE_PADDING,
} from './constants';
import { TreemapProps, TreemapTile, TreemapViewType } from './types';
import {
getTreemapTileStyle,
getTreemapTileTextStyle,
transformTreemapData,
} from './utils';
function MetricsTreemap({
viewType,
data,
isLoading,
}: TreemapProps): JSX.Element {
const { width: windowWidth } = useWindowSize();
const treemapWidth = useMemo(
() =>
Math.max(
windowWidth - TREEMAP_MARGINS.LEFT - TREEMAP_MARGINS.RIGHT - 70,
300,
),
[windowWidth],
);
const treemapData = useMemo(() => {
const extractedTreemapData =
(viewType === TreemapViewType.CARDINALITY
? data?.data?.[TreemapViewType.CARDINALITY]
: data?.data?.[TreemapViewType.DATAPOINTS]) || [];
return transformTreemapData(extractedTreemapData, viewType);
}, [data, viewType]);
const transformedTreemapData = stratify<TreemapTile>()
.id((d) => d.id)
.parentId((d) => d.parent)(treemapData)
.sum((d) => d.size ?? 0);
const xMax = treemapWidth - TREEMAP_MARGINS.LEFT - TREEMAP_MARGINS.RIGHT;
const yMax = TREEMAP_HEIGHT - TREEMAP_MARGINS.TOP - TREEMAP_MARGINS.BOTTOM;
if (isLoading) {
return (
<Skeleton style={{ width: treemapWidth, height: TREEMAP_HEIGHT }} active />
);
}
if (
!data ||
!data.data ||
data?.status === 'error' ||
(data?.status === 'success' && !data?.data?.[viewType])
) {
return (
<Empty
description="No metrics found"
style={{ width: treemapWidth, height: TREEMAP_HEIGHT, paddingTop: 30 }}
/>
);
}
return (
<div className="metrics-treemap">
<svg width={treemapWidth} height={TREEMAP_HEIGHT}>
<rect
width={treemapWidth}
height={TREEMAP_HEIGHT}
rx={14}
fill="transparent"
/>
<Treemap<TreemapTile>
top={TREEMAP_MARGINS.TOP}
root={transformedTreemapData}
size={[xMax, yMax]}
tile={treemapBinary}
round
>
{(treemap): JSX.Element => (
<Group>
{treemap
.descendants()
.reverse()
.map((node, i) => {
const nodeWidth = node.x1 - node.x0 - TREEMAP_SQUARE_PADDING;
const nodeHeight = node.y1 - node.y0 - TREEMAP_SQUARE_PADDING;
return (
<Group
// eslint-disable-next-line react/no-array-index-key
key={node.data.id || `node-${i}`}
top={node.y0 + TREEMAP_MARGINS.TOP}
left={node.x0 + TREEMAP_MARGINS.LEFT}
>
{node.depth > 0 && (
<Tooltip
title={`${node.data.id}: ${node.data.displayValue}%`}
placement="top"
>
<foreignObject
width={nodeWidth}
height={nodeHeight}
style={getTreemapTileStyle(node.data)}
>
<div style={getTreemapTileTextStyle()}>
{`${node.data.displayValue}%`}
</div>
</foreignObject>
</Tooltip>
)}
</Group>
);
})}
</Group>
)}
</Treemap>
</svg>
</div>
);
}
export default MetricsTreemap;


@@ -0,0 +1,223 @@
.metrics-explorer-summary-tab {
display: flex;
flex-direction: column;
gap: 16px;
padding: 16px 0;
.metrics-search-container {
display: flex;
flex-direction: column;
gap: 16px;
.metrics-search-options {
display: flex;
justify-content: space-between;
}
}
.metrics-table-container {
margin-left: -16px;
margin-right: -16px;
.ant-table {
max-height: 500px;
overflow-y: auto;
.ant-table-thead > tr > th {
padding: 12px;
font-weight: 500;
font-size: 12px;
line-height: 18px;
background: var(--bg-ink-500);
border-bottom: none;
color: var(--Vanilla-400, #c0c1c3);
font-family: Inter;
font-size: 11px;
font-style: normal;
font-weight: 600;
line-height: 18px; /* 163.636% */
letter-spacing: 0.44px;
text-transform: uppercase;
&::before {
background-color: transparent;
}
}
.ant-table-thead > tr > th:has(.metric-name-column-header) {
background: var(--bg-ink-400);
}
.ant-table-cell {
padding: 12px;
font-size: 13px;
line-height: 20px;
color: var(--bg-vanilla-100);
background: var(--bg-ink-500);
}
.ant-table-cell:has(.metric-name-column-value) {
background: var(--bg-ink-400);
}
.metric-name-column-value {
color: var(--bg-vanilla-100);
font-family: 'Geist Mono';
font-style: normal;
font-weight: 600;
line-height: 20px; /* 142.857% */
letter-spacing: -0.07px;
}
.status-cell {
.active-tag {
color: var(--bg-forest-500);
padding: 4px 8px;
border-radius: 4px;
font-size: 12px;
font-weight: 500;
}
}
.progress-container {
.ant-progress-bg {
height: 8px !important;
border-radius: 4px;
}
}
.ant-table-tbody > tr:hover > td {
background: rgba(255, 255, 255, 0.04);
}
.ant-table-cell:first-child {
text-align: justify;
}
.ant-table-cell:nth-child(2) {
padding-left: 16px;
padding-right: 16px;
}
.ant-table-cell:nth-child(n + 3) {
padding-right: 24px;
}
.column-header-right {
text-align: right;
}
.column-header-left {
text-align: left;
}
.ant-table-tbody > tr > td {
border-bottom: none;
}
.ant-table-thead
> tr
> th:not(:last-child):not(.ant-table-selection-column):not(.ant-table-row-expand-icon-cell):not([colspan])::before {
background-color: transparent;
}
.ant-empty-normal {
visibility: hidden;
}
}
.ant-pagination {
position: fixed;
bottom: 0;
width: calc(100% - 64px);
background: var(--bg-ink-500);
padding: 16px;
margin: 0;
// this is to offset intercom icon till we improve the design
right: 20px;
.ant-pagination-item {
border-radius: 4px;
&-active {
background: var(--bg-robin-500);
border-color: var(--bg-robin-500);
a {
color: var(--bg-ink-500) !important;
}
}
}
}
}
.no-metrics-message-container {
display: flex;
flex-direction: column;
justify-content: flex-start;
align-items: center;
min-height: 400px;
gap: 16px;
padding-top: 32px;
}
.metric-type-renderer {
border-radius: 50px;
height: 24px;
width: fit-content;
display: flex;
align-items: center;
justify-content: flex-start;
gap: 4px;
text-transform: uppercase;
font-size: 12px;
padding: 5px 10px;
align-self: flex-end;
}
}
.lightMode {
.metrics-table-container {
.ant-table {
.ant-table-thead > tr > th {
background: var(--bg-vanilla-100);
color: var(--text-ink-300);
}
.ant-table-thead > tr > th:has(.metric-name-column-header) {
background: var(--bg-vanilla-100);
}
.ant-table-cell {
background: var(--bg-vanilla-100);
color: var(--bg-ink-500);
}
.ant-table-cell:has(.metric-name-column-value) {
background: var(--bg-vanilla-100);
}
.metric-name-column-value {
color: var(--bg-ink-300);
}
.ant-table-tbody > tr:hover > td {
background: rgba(0, 0, 0, 0.04);
}
}
.ant-pagination {
background: var(--bg-vanilla-100);
.ant-pagination-item {
&-active {
background: var(--bg-robin-500);
border-color: var(--bg-robin-500);
a {
color: var(--bg-vanilla-100) !important;
}
}
}
}
}
}


@@ -1,10 +1,162 @@
import './Summary.styles.scss';
import * as Sentry from '@sentry/react';
import { usePageSize } from 'container/InfraMonitoringK8s/utils';
import { useGetMetricsList } from 'hooks/metricsExplorer/useGetMetricsList';
import { useGetMetricsTreeMap } from 'hooks/metricsExplorer/useGetMetricsTreeMap';
import { useQueryBuilder } from 'hooks/queryBuilder/useQueryBuilder';
import { useQueryOperations } from 'hooks/queryBuilder/useQueryBuilderOperations';
import ErrorBoundaryFallback from 'pages/ErrorBoundaryFallback/ErrorBoundaryFallback';
import { useCallback, useMemo, useState } from 'react';
import { useSelector } from 'react-redux';
import { AppState } from 'store/reducers';
import { TagFilter } from 'types/api/queryBuilder/queryBuilderData';
import { GlobalReducer } from 'types/reducer/globalTime';
import MetricsSearch from './MetricsSearch';
import MetricsTable from './MetricsTable';
import MetricsTreemap from './MetricsTreemap';
import { OrderByPayload, TreemapViewType } from './types';
import {
convertNanoToMilliseconds,
formatDataForMetricsTable,
getMetricsListQuery,
} from './utils';
function Summary(): JSX.Element {
const { pageSize, setPageSize } = usePageSize('metricsExplorer');
const [currentPage, setCurrentPage] = useState(1);
const [orderBy, setOrderBy] = useState<OrderByPayload>({
columnName: 'type',
order: 'asc',
});
const [heatmapView, setHeatmapView] = useState<TreemapViewType>(
TreemapViewType.CARDINALITY,
);
const { maxTime, minTime } = useSelector<AppState, GlobalReducer>(
(state) => state.globalTime,
);
const { currentQuery } = useQueryBuilder();
const queryFilters = useMemo(
() =>
currentQuery?.builder?.queryData[0]?.filters || {
items: [],
op: 'and',
},
[currentQuery],
);
const { handleChangeQueryData } = useQueryOperations({
index: 0,
query: currentQuery.builder.queryData[0],
entityVersion: '',
});
const metricsListQuery = useMemo(() => {
const baseQuery = getMetricsListQuery();
return {
...baseQuery,
limit: pageSize,
offset: (currentPage - 1) * pageSize,
filters: queryFilters,
start: convertNanoToMilliseconds(minTime),
end: convertNanoToMilliseconds(maxTime),
orderBy,
};
}, [queryFilters, minTime, maxTime, orderBy, pageSize, currentPage]);
const metricsTreemapQuery = useMemo(
() => ({
limit: 100,
filters: queryFilters,
treemap: heatmapView,
start: convertNanoToMilliseconds(minTime),
end: convertNanoToMilliseconds(maxTime),
}),
[queryFilters, heatmapView, minTime, maxTime],
);
const {
data: metricsData,
isLoading: isMetricsLoading,
isFetching: isMetricsFetching,
} = useGetMetricsList(metricsListQuery, {
enabled: !!metricsListQuery,
});
const {
data: treeMapData,
isLoading: isTreeMapLoading,
isFetching: isTreeMapFetching,
} = useGetMetricsTreeMap(metricsTreemapQuery, {
enabled: !!metricsTreemapQuery,
});
const handleFilterChange = useCallback(
(value: TagFilter) => {
handleChangeQueryData('filters', value);
setCurrentPage(1);
},
[handleChangeQueryData],
);
const updatedCurrentQuery = useMemo(
() => ({
...currentQuery,
builder: {
...currentQuery.builder,
queryData: [
{
...currentQuery.builder.queryData[0],
aggregateOperator: 'noop',
aggregateAttribute: {
...currentQuery.builder.queryData[0].aggregateAttribute,
},
},
],
},
}),
[currentQuery],
);
const searchQuery = updatedCurrentQuery?.builder?.queryData[0] || null;
const onPaginationChange = (page: number, pageSize: number): void => {
setCurrentPage(page);
setPageSize(pageSize);
};
const formattedMetricsData = useMemo(
() => formatDataForMetricsTable(metricsData?.payload?.data?.metrics || []),
[metricsData],
);
return (
<Sentry.ErrorBoundary fallback={<ErrorBoundaryFallback />}>
Summary
<div className="metrics-explorer-summary-tab">
<MetricsSearch
query={searchQuery}
onChange={handleFilterChange}
heatmapView={heatmapView}
setHeatmapView={setHeatmapView}
/>
<MetricsTreemap
data={treeMapData?.payload}
isLoading={isTreeMapLoading || isTreeMapFetching}
viewType={heatmapView}
/>
<MetricsTable
isLoading={isMetricsLoading || isMetricsFetching}
data={formattedMetricsData}
pageSize={pageSize}
currentPage={currentPage}
onPaginationChange={onPaginationChange}
setOrderBy={setOrderBy}
totalCount={metricsData?.payload?.data.total || 0}
/>
</div>
</Sentry.ErrorBoundary>
);
}


@@ -0,0 +1,26 @@
import { MetricType } from 'api/metricsExplorer/getMetricsList';
import { TreemapViewType } from './types';
export const METRICS_TABLE_PAGE_SIZE = 10;
export const TREEMAP_VIEW_OPTIONS: {
value: TreemapViewType;
label: string;
}[] = [
{ value: TreemapViewType.CARDINALITY, label: 'Cardinality' },
{ value: TreemapViewType.DATAPOINTS, label: 'Datapoints' },
];
export const TREEMAP_HEIGHT = 300;
export const TREEMAP_SQUARE_PADDING = 5;
export const TREEMAP_MARGINS = { TOP: 10, LEFT: 10, RIGHT: 10, BOTTOM: 10 };
export const METRIC_TYPE_LABEL_MAP = {
[MetricType.SUM]: 'Sum',
[MetricType.GAUGE]: 'Gauge',
[MetricType.HISTOGRAM]: 'Histogram',
[MetricType.SUMMARY]: 'Summary',
[MetricType.EXPONENTIAL_HISTOGRAM]: 'Exp. Histogram',
};


@@ -0,0 +1,56 @@
import { MetricsTreeMapResponse } from 'api/metricsExplorer/getMetricsTreeMap';
import React, { Dispatch, SetStateAction } from 'react';
import {
IBuilderQuery,
TagFilter,
} from 'types/api/queryBuilder/queryBuilderData';
export interface MetricsTableProps {
isLoading: boolean;
data: MetricsListItemRowData[];
pageSize: number;
currentPage: number;
onPaginationChange: (page: number, pageSize: number) => void;
setOrderBy: Dispatch<SetStateAction<OrderByPayload>>;
totalCount: number;
}
export interface MetricsSearchProps {
query: IBuilderQuery;
onChange: (value: TagFilter) => void;
heatmapView: TreemapViewType;
setHeatmapView: (value: TreemapViewType) => void;
}
export interface TreemapProps {
data: MetricsTreeMapResponse | null | undefined;
isLoading: boolean;
viewType: TreemapViewType;
}
export interface OrderByPayload {
columnName: string;
order: 'asc' | 'desc';
}
export interface MetricsListItemRowData {
key: string;
metric_name: React.ReactNode;
description: React.ReactNode;
metric_type: React.ReactNode;
unit: React.ReactNode;
samples: React.ReactNode;
timeseries: React.ReactNode;
}
export enum TreemapViewType {
CARDINALITY = 'timeseries',
DATAPOINTS = 'samples',
}
export interface TreemapTile {
id: string;
size: number;
displayValue: number | string | null;
parent: string | null;
}


@@ -0,0 +1,241 @@
import { Color } from '@signozhq/design-tokens';
import { Tooltip, Typography } from 'antd';
import { ColumnType } from 'antd/es/table';
import {
MetricsListItemData,
MetricsListPayload,
MetricType,
} from 'api/metricsExplorer/getMetricsList';
import {
CardinalityData,
DatapointsData,
} from 'api/metricsExplorer/getMetricsTreeMap';
import {
BarChart,
BarChart2,
BarChartHorizontal,
Diff,
Gauge,
} from 'lucide-react';
import { useMemo } from 'react';
import { METRIC_TYPE_LABEL_MAP } from './constants';
import { MetricsListItemRowData, TreemapTile, TreemapViewType } from './types';
export const metricsTableColumns: ColumnType<MetricsListItemRowData>[] = [
{
title: <div className="metric-name-column-header">METRIC</div>,
dataIndex: 'metric_name',
width: 400,
sorter: true,
className: 'metric-name-column-header',
render: (value: string): React.ReactNode => (
<div className="metric-name-column-value">{value}</div>
),
},
{
title: 'DESCRIPTION',
dataIndex: 'description',
width: 400,
},
{
title: 'TYPE',
dataIndex: 'metric_type',
sorter: true,
width: 150,
},
{
title: 'UNIT',
dataIndex: 'unit',
width: 150,
},
{
title: 'DATAPOINTS',
dataIndex: TreemapViewType.DATAPOINTS,
width: 150,
sorter: true,
},
{
title: 'CARDINALITY',
dataIndex: TreemapViewType.CARDINALITY,
width: 150,
sorter: true,
},
];
export const getMetricsListQuery = (): MetricsListPayload => ({
filters: {
items: [],
op: 'and',
},
orderBy: { columnName: 'metric_name', order: 'asc' },
});
function MetricTypeRenderer({ type }: { type: MetricType }): JSX.Element {
const [icon, color] = useMemo(() => {
switch (type) {
case MetricType.SUM:
return [
<Diff key={type} size={12} color={Color.BG_ROBIN_500} />,
Color.BG_ROBIN_500,
];
case MetricType.GAUGE:
return [
<Gauge key={type} size={12} color={Color.BG_SAKURA_500} />,
Color.BG_SAKURA_500,
];
case MetricType.HISTOGRAM:
return [
<BarChart2 key={type} size={12} color={Color.BG_SIENNA_500} />,
Color.BG_SIENNA_500,
];
case MetricType.SUMMARY:
return [
<BarChartHorizontal key={type} size={12} color={Color.BG_FOREST_500} />,
Color.BG_FOREST_500,
];
case MetricType.EXPONENTIAL_HISTOGRAM:
return [
<BarChart key={type} size={12} color={Color.BG_AQUA_500} />,
Color.BG_AQUA_500,
];
default:
return [null, ''];
}
}, [type]);
return (
<div
className="metric-type-renderer"
style={{
backgroundColor: `${color}33`,
border: `1px solid ${color}`,
color,
}}
>
{icon}
<Typography.Text style={{ color, fontSize: 12 }}>
{METRIC_TYPE_LABEL_MAP[type]}
</Typography.Text>
</div>
);
}
function ValidateRowValueWrapper({
value,
children,
}: {
value: string | number | null;
children: React.ReactNode;
}): JSX.Element {
if (!value) {
return <div>-</div>;
}
return <div>{children}</div>;
}
export const formatDataForMetricsTable = (
data: MetricsListItemData[],
): MetricsListItemRowData[] =>
data.map((metric) => ({
key: metric.metric_name,
metric_name: (
<ValidateRowValueWrapper value={metric.metric_name}>
<Tooltip title={metric.metric_name}>{metric.metric_name}</Tooltip>
</ValidateRowValueWrapper>
),
description: (
<ValidateRowValueWrapper value={metric.description}>
<Tooltip title={metric.description}>{metric.description}</Tooltip>
</ValidateRowValueWrapper>
),
metric_type: <MetricTypeRenderer type={metric.type} />,
unit: (
<ValidateRowValueWrapper value={metric.unit}>
{metric.unit}
</ValidateRowValueWrapper>
),
[TreemapViewType.DATAPOINTS]: (
<ValidateRowValueWrapper value={metric[TreemapViewType.DATAPOINTS]}>
{metric[TreemapViewType.DATAPOINTS]}
</ValidateRowValueWrapper>
),
[TreemapViewType.CARDINALITY]: (
<ValidateRowValueWrapper value={metric[TreemapViewType.CARDINALITY]}>
{metric[TreemapViewType.CARDINALITY]}
</ValidateRowValueWrapper>
),
}));
export const transformTreemapData = (
data: CardinalityData[] | DatapointsData[],
viewType: TreemapViewType,
): TreemapTile[] => {
const totalSize = (data as (CardinalityData | DatapointsData)[]).reduce(
(acc: number, item: CardinalityData | DatapointsData) =>
acc + item.percentage,
0,
);
const children = data.map((item) => ({
id: item.metric_name,
size: totalSize > 0 ? Number((item.percentage / totalSize).toFixed(2)) : 0,
displayValue: Number(item.percentage).toFixed(2),
parent: viewType,
}));
return [
{
id: viewType,
size: 0,
parent: null,
displayValue: null,
},
...children,
];
};
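As a sanity check on the normalization in `transformTreemapData`, here is a standalone sketch of the reduce/map step with hypothetical sample data (the real function also prepends a root tile whose `id` is the view type):

```typescript
// Re-implementation of the size normalization: each tile's size is its
// percentage share of the total, rounded to two decimals, so sibling
// sizes sum to ~1.
interface SampleItem {
	metric_name: string;
	percentage: number;
}

function normalize(data: SampleItem[]): { id: string; size: number }[] {
	const totalSize = data.reduce((acc, item) => acc + item.percentage, 0);
	return data.map((item) => ({
		id: item.metric_name,
		size: totalSize > 0 ? Number((item.percentage / totalSize).toFixed(2)) : 0,
	}));
}

const tiles = normalize([
	{ metric_name: 'http_requests_total', percentage: 60 },
	{ metric_name: 'cpu_usage', percentage: 20 },
]);
console.log(tiles); // sizes 0.75 and 0.25
```

The `totalSize > 0` guard keeps an all-zero (or empty) payload from producing `NaN` sizes.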
const getTreemapTileBackgroundColor = (node: TreemapTile): string => {
const size = node.size * 10;
if (size > 0.8) {
return Color.BG_AMBER_600;
}
if (size > 0.6) {
return Color.BG_AMBER_500;
}
if (size > 0.4) {
return Color.BG_AMBER_400;
}
if (size > 0.2) {
return Color.BG_AMBER_300;
}
if (size > 0.1) {
return Color.BG_AMBER_200;
}
return Color.BG_AMBER_100;
};
export const getTreemapTileStyle = (
node: TreemapTile,
): React.CSSProperties => ({
overflow: 'visible',
cursor: 'pointer',
backgroundColor: getTreemapTileBackgroundColor(node),
borderRadius: 4,
});
export const getTreemapTileTextStyle = (): React.CSSProperties => ({
width: '100%',
height: '100%',
display: 'flex',
alignItems: 'center',
justifyContent: 'center',
fontSize: '12px',
fontWeight: 'bold',
color: Color.TEXT_SLATE_400,
textAlign: 'center',
padding: '4px',
});
export const convertNanoToMilliseconds = (time: number): number =>
Math.floor(time / 1000000);
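`convertNanoToMilliseconds` truncates toward zero with `Math.floor`, so sub-millisecond remainders are dropped rather than rounded — a quick sketch with small illustrative values:

```typescript
// Nanosecond durations/timestamps are floored to millisecond precision,
// which is what Date and most charting libraries expect.
const convertNanoToMilliseconds = (time: number): number =>
	Math.floor(time / 1000000);

console.log(convertNanoToMilliseconds(1500000)); // 1
console.log(convertNanoToMilliseconds(999999)); // 0
```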

View File

@@ -3,10 +3,10 @@ import './NoLogs.styles.scss';
import { Typography } from 'antd';
import logEvent from 'api/common/logEvent';
import ROUTES from 'constants/routes';
import { useGetTenantLicense } from 'hooks/useGetTenantLicense';
import history from 'lib/history';
import { ArrowUpRight } from 'lucide-react';
import { DataSource } from 'types/common/queryBuilder';
import { isCloudUser } from 'utils/app';
import DOCLINKS from 'utils/docLinks';
export default function NoLogs({
@@ -14,14 +14,15 @@ export default function NoLogs({
}: {
dataSource: DataSource;
}): JSX.Element {
const cloudUser = isCloudUser();
const { isCloudUser: isCloudUserVal } = useGetTenantLicense();
const handleLinkClick = (
e: React.MouseEvent<HTMLAnchorElement, MouseEvent>,
): void => {
e.preventDefault();
e.stopPropagation();
if (cloudUser) {
if (isCloudUserVal) {
if (dataSource === DataSource.TRACES) {
logEvent('Traces Explorer: Navigate to onboarding', {});
} else if (dataSource === DataSource.LOGS) {

View File

@@ -281,7 +281,7 @@ function Members(): JSX.Element {
const { joinedOn } = record;
return (
<Typography>
{dayjs.unix(Number(joinedOn)).format(DATE_TIME_FORMATS.MONTH_DATE_FULL)}
{dayjs(joinedOn).format(DATE_TIME_FORMATS.MONTH_DATE_FULL)}
</Typography>
);
},
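The change in the Members hunk stops treating `joinedOn` as unix seconds (`dayjs.unix`) and passes it straight to `dayjs`, which interprets a number as a millisecond timestamp. The same distinction shown with the built-in `Date` API, using a hypothetical timestamp:

```typescript
// 1700000000 interpreted as seconds vs milliseconds: off by a factor of 1000.
const joinedOnSeconds = 1700000000; // unix seconds (what dayjs.unix expects)

const fromSeconds = new Date(joinedOnSeconds * 1000);
const fromMillis = new Date(joinedOnSeconds); // wrong if the value is seconds

console.log(fromSeconds.getUTCFullYear()); // 2023
console.log(fromMillis.getUTCFullYear()); // 1970
```

So the edit only renders correctly if the backend now returns `joinedOn` in milliseconds (or an ISO string) rather than epoch seconds.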

View File

@@ -75,6 +75,7 @@ function QueryBuilderSearch({
placeholder,
suffixIcon,
isInfraMonitoring,
isMetricsExplorer,
disableNavigationShortcuts,
entity,
}: QueryBuilderSearchProps): JSX.Element {
@@ -113,6 +114,7 @@ function QueryBuilderSearch({
isLogsExplorerPage,
isInfraMonitoring,
entity,
isMetricsExplorer,
);
const [isOpen, setIsOpen] = useState<boolean>(false);
@@ -129,6 +131,7 @@ function QueryBuilderSearch({
isLogsExplorerPage,
isInfraMonitoring,
entity,
isMetricsExplorer,
);
const { registerShortcut, deregisterShortcut } = useKeyboardHotkeys();
@@ -137,12 +140,12 @@ function QueryBuilderSearch({
const toggleEditMode = useCallback(
(value: boolean) => {
// Editing mode is required only in infra monitoring mode
if (isInfraMonitoring) {
// Editing mode is required only in infra monitoring or metrics explorer
if (isInfraMonitoring || isMetricsExplorer) {
setIsEditingTag(value);
}
},
[isInfraMonitoring],
[isInfraMonitoring, isMetricsExplorer],
);
const onTagRender = ({
@@ -168,7 +171,7 @@ function QueryBuilderSearch({
updateTag(value);
// Editing starts
toggleEditMode(true);
if (isInfraMonitoring) {
if (isInfraMonitoring || isMetricsExplorer) {
setSearchValue(value);
} else {
handleSearch(value);
@@ -240,8 +243,11 @@ function QueryBuilderSearch({
);
const isMetricsDataSource = useMemo(
() => query.dataSource === DataSource.METRICS && !isInfraMonitoring,
[query.dataSource, isInfraMonitoring],
() =>
query.dataSource === DataSource.METRICS &&
!isInfraMonitoring &&
!isMetricsExplorer,
[query.dataSource, isInfraMonitoring, isMetricsExplorer],
);
const fetchValueDataType = (value: unknown, operator: string): DataTypes => {
@@ -291,8 +297,8 @@ function QueryBuilderSearch({
};
});
// If in infra monitoring, only run the onChange query when editing is finished.
if (isInfraMonitoring) {
// If in infra monitoring or metrics explorer, only run the onChange query when editing is finished.
if (isInfraMonitoring || isMetricsExplorer) {
if (!isEditingTag) {
onChange(initialTagFilters);
}
@@ -498,6 +504,7 @@ interface QueryBuilderSearchProps {
isInfraMonitoring?: boolean;
disableNavigationShortcuts?: boolean;
entity?: K8sCategory | null;
isMetricsExplorer?: boolean;
}
QueryBuilderSearch.defaultProps = {
@@ -508,6 +515,7 @@ QueryBuilderSearch.defaultProps = {
isInfraMonitoring: false,
disableNavigationShortcuts: false,
entity: null,
isMetricsExplorer: false,
};
export interface CustomTagProps {

View File

@@ -5,6 +5,7 @@ import { ENTITY_VERSION_V4 } from 'constants/app';
import { MAX_RPS_LIMIT } from 'constants/global';
import ResourceAttributesFilter from 'container/ResourceAttributesFilter';
import { useGetQueriesRange } from 'hooks/queryBuilder/useGetQueriesRange';
import { useGetTenantLicense } from 'hooks/useGetTenantLicense';
import { useNotifications } from 'hooks/useNotifications';
import { useAppContext } from 'providers/App/App';
import { useEffect, useMemo, useState } from 'react';
@@ -14,7 +15,6 @@ import { useLocation } from 'react-router-dom';
import { AppState } from 'store/reducers';
import { ServicesList } from 'types/api/metrics/getService';
import { GlobalReducer } from 'types/reducer/globalTime';
import { isCloudUser } from 'utils/app';
import { getTotalRPS } from 'utils/services';
import { getColumns } from '../Columns/ServiceColumn';
@@ -34,7 +34,7 @@ function ServiceMetricTable({
const { t: getText } = useTranslation(['services']);
const { licenses, isFetchingLicenses } = useAppContext();
const isCloudUserVal = isCloudUser();
const { isCloudUser: isCloudUserVal } = useGetTenantLicense();
const queries = useGetQueriesRange(queryRangeRequestData, ENTITY_VERSION_V4, {
queryKey: [

View File

@@ -3,11 +3,11 @@ import { Flex, Typography } from 'antd';
import { ResizeTable } from 'components/ResizeTable';
import { MAX_RPS_LIMIT } from 'constants/global';
import ResourceAttributesFilter from 'container/ResourceAttributesFilter';
import { useGetTenantLicense } from 'hooks/useGetTenantLicense';
import { useAppContext } from 'providers/App/App';
import { useEffect, useMemo, useState } from 'react';
import { useTranslation } from 'react-i18next';
import { useLocation } from 'react-router-dom';
import { isCloudUser } from 'utils/app';
import { getTotalRPS } from 'utils/services';
import { getColumns } from '../Columns/ServiceColumn';
@@ -22,7 +22,7 @@ function ServiceTraceTable({
const { t: getText } = useTranslation(['services']);
const { licenses, isFetchingLicenses } = useAppContext();
const isCloudUserVal = isCloudUser();
const { isCloudUser: isCloudUserVal } = useGetTenantLicense();
const tableColumns = useMemo(() => getColumns(search, false), [search]);
useEffect(() => {

View File

@@ -11,6 +11,7 @@ import ROUTES from 'constants/routes';
import { GlobalShortcuts } from 'constants/shortcuts/globalShortcuts';
import { useKeyboardHotkeys } from 'hooks/hotkeys/useKeyboardHotkeys';
import useComponentPermission from 'hooks/useComponentPermission';
import { useGetTenantLicense } from 'hooks/useGetTenantLicense';
import { LICENSE_PLAN_KEY, LICENSE_PLAN_STATUS } from 'hooks/useLicense';
import history from 'lib/history';
import {
@@ -28,7 +29,7 @@ import { AppState } from 'store/reducers';
import { License } from 'types/api/licenses/def';
import AppReducer from 'types/reducer/app';
import { USER_ROLES } from 'types/roles';
import { checkVersionState, isCloudUser, isEECloudUser } from 'utils/app';
import { checkVersionState } from 'utils/app';
import { routeConfig } from './config';
import { getQueryString } from './helper';
@@ -86,7 +87,10 @@ function SideNav(): JSX.Element {
const { registerShortcut, deregisterShortcut } = useKeyboardHotkeys();
const isCloudUserVal = isCloudUser();
const {
isCloudUser: isCloudUserVal,
isEECloudUser: isEECloudUserVal,
} = useGetTenantLicense();
const { t } = useTranslation('');
@@ -275,7 +279,7 @@ function SideNav(): JSX.Element {
let updatedUserManagementItems: UserManagementMenuItems[] = [
manageLicenseMenuItem,
];
if (isCloudUserVal || isEECloudUser()) {
if (isCloudUserVal || isEECloudUserVal) {
const isOnboardingEnabled =
featureFlags?.find((feature) => feature.name === FeatureKeys.ONBOARDING)
?.active || false;
@@ -330,6 +334,7 @@ function SideNav(): JSX.Element {
featureFlags,
isCloudUserVal,
isCurrentVersionError,
isEECloudUserVal,
isLatestVersion,
licenses?.licenses,
onClickVersionHandler,

View File

@@ -30,7 +30,7 @@ import getTimeString from 'lib/getTimeString';
import { isObject } from 'lodash-es';
import { Check, Copy, Info, Send, Undo } from 'lucide-react';
import { useTimezone } from 'providers/Timezone';
import { useCallback, useEffect, useMemo, useState } from 'react';
import { useCallback, useEffect, useState } from 'react';
import { useQueryClient } from 'react-query';
import { connect, useDispatch, useSelector } from 'react-redux';
import { RouteComponentProps, withRouter } from 'react-router-dom';
@@ -314,11 +314,6 @@ function DateTimeSelection({
return `Refreshed ${secondsDiff} sec ago`;
}, [maxTime, minTime, selectedTime]);
const isLogsExplorerPage = useMemo(
() => location.pathname === ROUTES.LOGS_EXPLORER,
[location.pathname],
);
const onSelectHandler = useCallback(
(value: Time | CustomTimeType): void => {
if (isModalTimeSelection) {
@@ -347,15 +342,13 @@ function DateTimeSelection({
return;
}
if (!isLogsExplorerPage) {
urlQuery.delete('startTime');
urlQuery.delete('endTime');
urlQuery.delete('startTime');
urlQuery.delete('endTime');
urlQuery.set(QueryParams.relativeTime, value);
urlQuery.set(QueryParams.relativeTime, value);
const generatedUrl = `${location.pathname}?${urlQuery.toString()}`;
safeNavigate(generatedUrl);
}
const generatedUrl = `${location.pathname}?${urlQuery.toString()}`;
safeNavigate(generatedUrl);
// For logs explorer - time range handling is managed in useCopyLogLink.ts:52
@@ -368,7 +361,6 @@ function DateTimeSelection({
},
[
initQueryBuilderData,
isLogsExplorerPage,
isModalTimeSelection,
location.pathname,
onTimeChange,
@@ -438,16 +430,14 @@ function DateTimeSelection({
updateLocalStorageForRoutes(JSON.stringify({ startTime, endTime }));
if (!isLogsExplorerPage) {
urlQuery.set(
QueryParams.startTime,
startTime?.toDate().getTime().toString(),
);
urlQuery.set(QueryParams.endTime, endTime?.toDate().getTime().toString());
urlQuery.delete(QueryParams.relativeTime);
const generatedUrl = `${location.pathname}?${urlQuery.toString()}`;
safeNavigate(generatedUrl);
}
urlQuery.set(
QueryParams.startTime,
startTime?.toDate().getTime().toString(),
);
urlQuery.set(QueryParams.endTime, endTime?.toDate().getTime().toString());
urlQuery.delete(QueryParams.relativeTime);
const generatedUrl = `${location.pathname}?${urlQuery.toString()}`;
safeNavigate(generatedUrl);
}
}
};
@@ -466,15 +456,13 @@ function DateTimeSelection({
setIsValidteRelativeTime(true);
if (!isLogsExplorerPage) {
urlQuery.delete('startTime');
urlQuery.delete('endTime');
urlQuery.delete('startTime');
urlQuery.delete('endTime');
urlQuery.set(QueryParams.relativeTime, dateTimeStr);
urlQuery.set(QueryParams.relativeTime, dateTimeStr);
const generatedUrl = `${location.pathname}?${urlQuery.toString()}`;
safeNavigate(generatedUrl);
}
const generatedUrl = `${location.pathname}?${urlQuery.toString()}`;
safeNavigate(generatedUrl);
if (!stagedQuery) {
return;

View File

@@ -5,7 +5,6 @@ import { DataTypes } from 'types/api/queryBuilder/queryAutocompleteResponse';
export type LogTimeRange = {
start: number;
end: number;
pageSize: number;
};
export type UseCopyLogLink = {
@@ -13,7 +12,6 @@ export type UseCopyLogLink = {
isLogsExplorerPage: boolean;
activeLogId: string | null;
onLogCopy: MouseEventHandler<HTMLElement>;
onTimeRangeChange: (newTimeRange: LogTimeRange | null) => void;
};
export type UseActiveLog = {

View File

@@ -3,7 +3,6 @@ import ROUTES from 'constants/routes';
import { useNotifications } from 'hooks/useNotifications';
import useUrlQuery from 'hooks/useUrlQuery';
import useUrlQueryData from 'hooks/useUrlQueryData';
import history from 'lib/history';
import {
MouseEventHandler,
useCallback,
@@ -18,7 +17,7 @@ import { AppState } from 'store/reducers';
import { GlobalReducer } from 'types/reducer/globalTime';
import { HIGHLIGHTED_DELAY } from './configs';
import { LogTimeRange, UseCopyLogLink } from './types';
import { UseCopyLogLink } from './types';
export const useCopyLogLink = (logId?: string): UseCopyLogLink => {
const urlQuery = useUrlQuery();
@@ -31,27 +30,8 @@ export const useCopyLogLink = (logId?: string): UseCopyLogLink => {
null,
);
const { selectedTime, minTime, maxTime } = useSelector<
AppState,
GlobalReducer
>((state) => state.globalTime);
const onTimeRangeChange = useCallback(
(newTimeRange: LogTimeRange | null): void => {
if (selectedTime !== 'custom') {
urlQuery.delete(QueryParams.startTime);
urlQuery.delete(QueryParams.endTime);
urlQuery.set(QueryParams.relativeTime, selectedTime);
} else {
urlQuery.set(QueryParams.startTime, newTimeRange?.start.toString() || '');
urlQuery.set(QueryParams.endTime, newTimeRange?.end.toString() || '');
urlQuery.delete(QueryParams.relativeTime);
}
const generatedUrl = `${pathname}?${urlQuery.toString()}`;
history.replace(generatedUrl);
},
[pathname, urlQuery, selectedTime],
const { minTime, maxTime } = useSelector<AppState, GlobalReducer>(
(state) => state.globalTime,
);
const isActiveLog = useMemo(() => activeLogId === logId, [activeLogId, logId]);
@@ -101,6 +81,5 @@ export const useCopyLogLink = (logId?: string): UseCopyLogLink => {
isLogsExplorerPage,
activeLogId,
onLogCopy,
onTimeRangeChange,
};
};

View File

@@ -0,0 +1,47 @@
import {
getMetricsList,
MetricsListPayload,
MetricsListResponse,
} from 'api/metricsExplorer/getMetricsList';
import { REACT_QUERY_KEY } from 'constants/reactQueryKeys';
import { useMemo } from 'react';
import { useQuery, UseQueryOptions, UseQueryResult } from 'react-query';
import { ErrorResponse, SuccessResponse } from 'types/api';
type UseGetMetricsList = (
requestData: MetricsListPayload,
options?: UseQueryOptions<
SuccessResponse<MetricsListResponse> | ErrorResponse,
Error
>,
headers?: Record<string, string>,
) => UseQueryResult<
SuccessResponse<MetricsListResponse> | ErrorResponse,
Error
>;
export const useGetMetricsList: UseGetMetricsList = (
requestData,
options,
headers,
) => {
const queryKey = useMemo(() => {
if (options?.queryKey && Array.isArray(options.queryKey)) {
return [...options.queryKey];
}
if (options?.queryKey && typeof options.queryKey === 'string') {
return options.queryKey;
}
return [REACT_QUERY_KEY.GET_METRICS_LIST, requestData];
}, [options?.queryKey, requestData]);
return useQuery<SuccessResponse<MetricsListResponse> | ErrorResponse, Error>({
queryFn: ({ signal }) => getMetricsList(requestData, signal, headers),
...options,
queryKey,
});
};
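The `queryKey` handling above repeats in each of the three new metrics-explorer hooks: a provided array is cloned, a provided string passes through, and anything else falls back to a default `[key, payload]` tuple. A standalone sketch of that branch logic (function and key names here are illustrative):

```typescript
// Normalize an optional react-query queryKey: clone arrays so callers can't
// mutate the memoized key, pass strings through, else use the fallback.
type QueryKey = string | readonly unknown[];

function resolveQueryKey(
	provided: QueryKey | undefined,
	fallback: unknown[],
): QueryKey {
	if (provided && Array.isArray(provided)) {
		return [...provided];
	}
	if (provided && typeof provided === 'string') {
		return provided;
	}
	return fallback;
}

const key = resolveQueryKey(undefined, ['GET_METRICS_LIST', { limit: 10 }]);
console.log(key); // [ 'GET_METRICS_LIST', { limit: 10 } ]
```

Since the pattern is identical in all three hooks, it is a natural candidate for extraction into a shared helper like this.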

View File

@@ -0,0 +1,45 @@
import {
getMetricsListFilterKeys,
MetricsListFilterKeysResponse,
} from 'api/metricsExplorer/getMetricsListFilterKeys';
import { REACT_QUERY_KEY } from 'constants/reactQueryKeys';
import { useMemo } from 'react';
import { useQuery, UseQueryOptions, UseQueryResult } from 'react-query';
import { ErrorResponse, SuccessResponse } from 'types/api';
type UseGetMetricsListFilterKeys = (
options?: UseQueryOptions<
SuccessResponse<MetricsListFilterKeysResponse> | ErrorResponse,
Error
>,
headers?: Record<string, string>,
) => UseQueryResult<
SuccessResponse<MetricsListFilterKeysResponse> | ErrorResponse,
Error
>;
export const useGetMetricsListFilterKeys: UseGetMetricsListFilterKeys = (
options,
headers,
) => {
const queryKey = useMemo(() => {
if (options?.queryKey && Array.isArray(options.queryKey)) {
return [...options.queryKey];
}
if (options?.queryKey && typeof options.queryKey === 'string') {
return options.queryKey;
}
return [REACT_QUERY_KEY.GET_METRICS_LIST_FILTER_KEYS];
}, [options?.queryKey]);
return useQuery<
SuccessResponse<MetricsListFilterKeysResponse> | ErrorResponse,
Error
>({
queryFn: ({ signal }) => getMetricsListFilterKeys(signal, headers),
...options,
queryKey,
});
};

View File

@@ -0,0 +1,50 @@
import {
getMetricsTreeMap,
MetricsTreeMapPayload,
MetricsTreeMapResponse,
} from 'api/metricsExplorer/getMetricsTreeMap';
import { REACT_QUERY_KEY } from 'constants/reactQueryKeys';
import { useMemo } from 'react';
import { useQuery, UseQueryOptions, UseQueryResult } from 'react-query';
import { ErrorResponse, SuccessResponse } from 'types/api';
type UseGetMetricsTreeMap = (
requestData: MetricsTreeMapPayload,
options?: UseQueryOptions<
SuccessResponse<MetricsTreeMapResponse> | ErrorResponse,
Error
>,
headers?: Record<string, string>,
) => UseQueryResult<
SuccessResponse<MetricsTreeMapResponse> | ErrorResponse,
Error
>;
export const useGetMetricsTreeMap: UseGetMetricsTreeMap = (
requestData,
options,
headers,
) => {
const queryKey = useMemo(() => {
if (options?.queryKey && Array.isArray(options.queryKey)) {
return [...options.queryKey];
}
if (options?.queryKey && typeof options.queryKey === 'string') {
return options.queryKey;
}
return [REACT_QUERY_KEY.GET_METRICS_TREE_MAP, requestData];
}, [options?.queryKey, requestData]);
return useQuery<
SuccessResponse<MetricsTreeMapResponse> | ErrorResponse,
Error
>({
queryFn: ({ signal }) => getMetricsTreeMap(requestData, signal, headers),
...options,
queryKey,
});
};

View File

@@ -31,6 +31,7 @@ export const useAutoComplete = (
shouldUseSuggestions?: boolean,
isInfraMonitoring?: boolean,
entity?: K8sCategory | null,
isMetricsExplorer?: boolean,
): IAutoComplete => {
const [searchValue, setSearchValue] = useState<string>('');
const [searchKey, setSearchKey] = useState<string>('');
@@ -42,6 +43,7 @@ export const useAutoComplete = (
shouldUseSuggestions,
isInfraMonitoring,
entity,
isMetricsExplorer,
);
const [key, operator, result] = useSetCurrentKeyAndOperator(searchValue, keys);

View File

@@ -1,4 +1,5 @@
/* eslint-disable sonarjs/cognitive-complexity */
import { getMetricsListFilterValues } from 'api/metricsExplorer/getMetricsListFilterValues';
import { getAttributesValues } from 'api/queryBuilder/getAttributesValues';
import { DEBOUNCE_DELAY } from 'constants/queryBuilderFilterConfig';
import {
@@ -10,6 +11,7 @@ import {
getTagToken,
isInNInOperator,
} from 'container/QueryBuilder/filters/QueryBuilderSearch/utils';
import { useGetMetricsListFilterKeys } from 'hooks/metricsExplorer/useGetMetricsListFilterKeys';
import useDebounceValue from 'hooks/useDebounce';
import { cloneDeep, isEqual, uniqWith, unset } from 'lodash-es';
import { useCallback, useEffect, useMemo, useRef, useState } from 'react';
@@ -50,6 +52,7 @@ export const useFetchKeysAndValues = (
shouldUseSuggestions?: boolean,
isInfraMonitoring?: boolean,
entity?: K8sCategory | null,
isMetricsExplorer?: boolean,
): IuseFetchKeysAndValues => {
const [keys, setKeys] = useState<BaseAutocompleteData[]>([]);
const [exampleQueries, setExampleQueries] = useState<TagFilter[]>([]);
@@ -98,10 +101,17 @@ export const useFetchKeysAndValues = (
const isQueryEnabled = useMemo(
() =>
query.dataSource === DataSource.METRICS && !isInfraMonitoring
query.dataSource === DataSource.METRICS &&
!isInfraMonitoring &&
!isMetricsExplorer
? !!query.dataSource && !!query.aggregateAttribute.dataType
: true,
[isInfraMonitoring, query.aggregateAttribute.dataType, query.dataSource],
[
isInfraMonitoring,
isMetricsExplorer,
query.aggregateAttribute.dataType,
query.dataSource,
],
);
const { data, isFetching, status } = useGetAggregateKeys(
@@ -139,6 +149,14 @@ export const useFetchKeysAndValues = (
},
);
const {
data: metricsListFilterKeysData,
isFetching: isFetchingMetricsListFilterKeys,
status: fetchingMetricsListFilterKeysStatus,
} = useGetMetricsListFilterKeys({
enabled: isMetricsExplorer && isQueryEnabled && !shouldUseSuggestions,
});
/**
* Fetches the options to be displayed based on the selected value
* @param value - the selected value
@@ -182,6 +200,15 @@ export const useFetchKeysAndValues = (
: tagValue?.toString() ?? '',
});
payload = response.payload;
} else if (isMetricsExplorer) {
const response = await getMetricsListFilterValues({
searchText: searchKey,
filterKey: filterAttributeKey?.key ?? tagKey,
filterAttributeKeyDataType:
filterAttributeKey?.dataType ?? DataTypes.EMPTY,
limit: 10,
});
payload = response.payload?.data;
} else {
const response = await getAttributesValues({
aggregateOperator: query.aggregateOperator,
@@ -238,6 +265,32 @@ export const useFetchKeysAndValues = (
}
}, [data?.payload?.attributeKeys, status]);
useEffect(() => {
if (
isMetricsExplorer &&
fetchingMetricsListFilterKeysStatus === 'success' &&
!isFetchingMetricsListFilterKeys &&
metricsListFilterKeysData?.payload?.data?.attributeKeys
) {
setKeys(metricsListFilterKeysData.payload.data.attributeKeys);
setSourceKeys((prevState) =>
uniqWith(
[
...(metricsListFilterKeysData.payload.data.attributeKeys ?? []),
...prevState,
],
isEqual,
),
);
}
}, [
metricsListFilterKeysData?.payload?.data?.attributeKeys,
fetchingMetricsListFilterKeysStatus,
isMetricsExplorer,
metricsListFilterKeysData,
isFetchingMetricsListFilterKeys,
]);
useEffect(() => {
if (
fetchingSuggestionsStatus === 'success' &&

View File

@@ -0,0 +1,15 @@
import { useAppContext } from 'providers/App/App';
import { LicensePlatform } from 'types/api/licensesV3/getActive';
export const useGetTenantLicense = (): {
isCloudUser: boolean;
isEECloudUser: boolean;
} => {
const { activeLicenseV3 } = useAppContext();
return {
isCloudUser: activeLicenseV3?.platform === LicensePlatform.CLOUD || false,
isEECloudUser:
activeLicenseV3?.platform === LicensePlatform.SELF_HOSTED || false,
};
};
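`useGetTenantLicense` derives both flags from `activeLicenseV3?.platform`; note that `isEECloudUser` is true when the platform is `SELF_HOSTED`. A pure sketch of that mapping (the string values of `LicensePlatform` are assumed here, not confirmed by the diff):

```typescript
// Assumed platform values for illustration; the real enum lives in
// types/api/licensesV3/getActive.
const LicensePlatform = {
	CLOUD: 'CLOUD',
	SELF_HOSTED: 'SELF_HOSTED',
} as const;
type Platform = typeof LicensePlatform[keyof typeof LicensePlatform];

function deriveTenantFlags(
	platform?: Platform,
): { isCloudUser: boolean; isEECloudUser: boolean } {
	return {
		isCloudUser: platform === LicensePlatform.CLOUD,
		isEECloudUser: platform === LicensePlatform.SELF_HOSTED,
	};
}

console.log(deriveTenantFlags(LicensePlatform.CLOUD));
// { isCloudUser: true, isEECloudUser: false }
```

When no license is loaded (`platform` undefined), both flags are false, matching the hook's `|| false` fallbacks.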

View File

@@ -19,7 +19,6 @@ import {
import { QueryDataV3 } from 'types/api/widgets/getQuery';
import { GlobalReducer } from 'types/reducer/globalTime';
import { LogTimeRange } from './logs/types';
import { useCopyLogLink } from './logs/useCopyLogLink';
import { useGetExplorerQueryRange } from './queryBuilder/useGetExplorerQueryRange';
import useUrlQueryData from './useUrlQueryData';
@@ -129,7 +128,7 @@ export const useLogsData = ({
return data;
};
const { activeLogId, onTimeRangeChange } = useCopyLogLink();
const { activeLogId } = useCopyLogLink();
const { data, isFetching } = useGetExplorerQueryRange(
requestData,
@@ -150,7 +149,6 @@ export const useLogsData = ({
);
useEffect(() => {
const currentParams = data?.params as Omit<LogTimeRange, 'pageSize'>;
const currentData = data?.payload?.data?.newResult?.data?.result || [];
if (currentData.length > 0 && currentData[0].list) {
const currentLogs: ILog[] = currentData[0].list.map((item) => ({
@@ -160,11 +158,6 @@ export const useLogsData = ({
const newLogs = [...logs, ...currentLogs];
setLogs(newLogs);
onTimeRangeChange({
start: currentParams?.start,
end: currentParams?.end,
pageSize: newLogs.length,
});
}
// eslint-disable-next-line react-hooks/exhaustive-deps

View File

@@ -12,6 +12,10 @@ interface SafeNavigateParams {
search?: string;
}
interface UseSafeNavigateProps {
preventSameUrlNavigation?: boolean;
}
const areUrlsEffectivelySame = (url1: URL, url2: URL): boolean => {
if (url1.pathname !== url2.pathname) return false;
@@ -78,7 +82,11 @@ const isDefaultNavigation = (currentUrl: URL, targetUrl: URL): boolean => {
return newKeys.length > 0;
};
export const useSafeNavigate = (): {
export const useSafeNavigate = (
{ preventSameUrlNavigation }: UseSafeNavigateProps = {
preventSameUrlNavigation: true,
},
): {
safeNavigate: (
to: string | SafeNavigateParams,
options?: NavigateOptions,
@@ -108,7 +116,7 @@ export const useSafeNavigate = (): {
const urlsAreSame = areUrlsEffectivelySame(currentUrl, targetUrl);
const isDefaultParamsNavigation = isDefaultNavigation(currentUrl, targetUrl);
if (urlsAreSame) {
if (preventSameUrlNavigation && urlsAreSame) {
return;
}
@@ -129,7 +137,7 @@ export const useSafeNavigate = (): {
);
}
},
[navigate, location.pathname, location.search],
[navigate, location.pathname, location.search, preventSameUrlNavigation],
);
return { safeNavigate };

View File

@@ -1,14 +1,10 @@
import './DashboardsListPage.styles.scss';
import { Space, Typography } from 'antd';
import ReleaseNote from 'components/ReleaseNote';
import ListOfAllDashboard from 'container/ListOfDashboard';
import { LayoutGrid } from 'lucide-react';
import { useLocation } from 'react-router-dom';
function DashboardsListPage(): JSX.Element {
const location = useLocation();
return (
<Space
direction="vertical"
@@ -16,7 +12,6 @@ function DashboardsListPage(): JSX.Element {
style={{ width: '100%' }}
className="dashboard-list-page"
>
<ReleaseNote path={location.pathname} />
<div className="dashboard-header">
<LayoutGrid size={14} className="icon" />
<Typography.Text className="text">Dashboards</Typography.Text>

View File

@@ -7,9 +7,9 @@ import { Color } from '@signozhq/design-tokens';
import { Button, Flex, Skeleton, Typography } from 'antd';
import { useGetIntegration } from 'hooks/Integrations/useGetIntegration';
import { useGetIntegrationStatus } from 'hooks/Integrations/useGetIntegrationStatus';
import { useGetTenantLicense } from 'hooks/useGetTenantLicense';
import { defaultTo } from 'lodash-es';
import { ArrowLeft, MoveUpRight, RotateCw } from 'lucide-react';
import { isCloudUser } from 'utils/app';
import { handleContactSupport } from '../utils';
import IntegrationDetailContent from './IntegrationDetailContent';
@@ -44,6 +44,8 @@ function IntegrationDetailPage(props: IntegrationDetailPageProps): JSX.Element {
integrationId: selectedIntegration,
});
const { isCloudUser: isCloudUserVal } = useGetTenantLicense();
const {
data: integrationStatus,
isLoading: isStatusLoading,
@@ -104,7 +106,7 @@ function IntegrationDetailPage(props: IntegrationDetailPageProps): JSX.Element {
</Button>
<div
className="contact-support"
onClick={(): void => handleContactSupport(isCloudUser())}
onClick={(): void => handleContactSupport(isCloudUserVal)}
>
<Typography.Link className="text">Contact Support </Typography.Link>

View File

@@ -5,10 +5,10 @@ import './Integrations.styles.scss';
import { Color } from '@signozhq/design-tokens';
import { Button, List, Typography } from 'antd';
import { useGetAllIntegrations } from 'hooks/Integrations/useGetAllIntegrations';
import { useGetTenantLicense } from 'hooks/useGetTenantLicense';
import { MoveUpRight, RotateCw } from 'lucide-react';
import { Dispatch, SetStateAction, useMemo } from 'react';
import { IntegrationsProps } from 'types/api/integrations/types';
import { isCloudUser } from 'utils/app';
import { handleContactSupport, INTEGRATION_TYPES } from './utils';
@@ -44,6 +44,8 @@ function IntegrationsList(props: IntegrationsListProps): JSX.Element {
refetch,
} = useGetAllIntegrations();
const { isCloudUser: isCloudUserVal } = useGetTenantLicense();
const filteredDataList = useMemo(() => {
let integrationsList: IntegrationsProps[] = [];
@@ -90,7 +92,7 @@ function IntegrationsList(props: IntegrationsListProps): JSX.Element {
</Button>
<div
className="contact-support"
onClick={(): void => handleContactSupport(isCloudUser())}
onClick={(): void => handleContactSupport(isCloudUserVal)}
>
<Typography.Link className="text">Contact Support </Typography.Link>

View File

@@ -8,10 +8,10 @@ import MessagingQueueHealthCheck from 'components/MessagingQueueHealthCheck/Mess
import { QueryParams } from 'constants/query';
import ROUTES from 'constants/routes';
import DateTimeSelectionV2 from 'container/TopNav/DateTimeSelectionV2';
import { useGetTenantLicense } from 'hooks/useGetTenantLicense';
import { useEffect } from 'react';
import { useTranslation } from 'react-i18next';
import { useHistory } from 'react-router-dom';
import { isCloudUser } from 'utils/app';
import {
KAFKA_SETUP_DOC_LINK,
@@ -34,7 +34,7 @@ function MessagingQueues(): JSX.Element {
);
};
const isCloudUserVal = isCloudUser();
const { isCloudUser: isCloudUserVal } = useGetTenantLicense();
const getStartedRedirect = (link: string, sourceCard: string): void => {
logEvent('Messaging Queues: Get started clicked', {

View File

@@ -1,15 +1,9 @@
import { Space } from 'antd';
import ReleaseNote from 'components/ReleaseNote';
import ServicesApplication from 'container/ServiceApplication';
import { useLocation } from 'react-router-dom';
function Metrics(): JSX.Element {
const location = useLocation();
return (
<Space direction="vertical" style={{ width: '100%' }}>
<ReleaseNote path={location.pathname} />
<ServicesApplication />
</Space>
);

View File

@@ -1,6 +1,7 @@
import RouteTab from 'components/RouteTab';
import { FeatureKeys } from 'constants/features';
import useComponentPermission from 'hooks/useComponentPermission';
import { useGetTenantLicense } from 'hooks/useGetTenantLicense';
import history from 'lib/history';
import { useAppContext } from 'providers/App/App';
import { useMemo } from 'react';
@@ -12,6 +13,10 @@ import { getRoutes } from './utils';
function SettingsPage(): JSX.Element {
const { pathname } = useLocation();
const { user, featureFlags, licenses } = useAppContext();
const {
isCloudUser: isCloudAccount,
isEECloudUser: isEECloudAccount,
} = useGetTenantLicense();
const isWorkspaceBlocked = licenses?.workSpaceBlock || false;
@@ -32,9 +37,19 @@ function SettingsPage(): JSX.Element {
isCurrentOrgSettings,
isGatewayEnabled,
isWorkspaceBlocked,
isCloudAccount,
isEECloudAccount,
t,
),
[user.role, isCurrentOrgSettings, isGatewayEnabled, isWorkspaceBlocked, t],
[
user.role,
isCurrentOrgSettings,
isGatewayEnabled,
isWorkspaceBlocked,
isCloudAccount,
isEECloudAccount,
t,
],
);
return <RouteTab routes={routes} activeKey={pathname} history={history} />;

View File

@@ -1,7 +1,6 @@
import { RouteTabProps } from 'components/RouteTab/types';
import { TFunction } from 'i18next';
import { ROLES, USER_ROLES } from 'types/roles';
import { isCloudUser, isEECloudUser } from 'utils/app';
import {
alertChannels,
@@ -18,13 +17,12 @@ export const getRoutes = (
isCurrentOrgSettings: boolean,
isGatewayEnabled: boolean,
isWorkspaceBlocked: boolean,
isCloudAccount: boolean,
isEECloudAccount: boolean,
t: TFunction,
): RouteTabProps['routes'] => {
const settings = [];
const isCloudAccount = isCloudUser();
const isEECloudAccount = isEECloudUser();
const isAdmin = userRole === USER_ROLES.ADMIN;
const isEditor = userRole === USER_ROLES.EDITOR;

View File

@@ -1,12 +1,13 @@
import './NoData.styles.scss';
import { Button, Typography } from 'antd';
import { useGetTenantLicense } from 'hooks/useGetTenantLicense';
import { LifeBuoy, RefreshCw } from 'lucide-react';
import { handleContactSupport } from 'pages/Integrations/utils';
import { isCloudUser } from 'utils/app';
function NoData(): JSX.Element {
const isCloudUserVal = isCloudUser();
const { isCloudUser: isCloudUserVal } = useGetTenantLicense();
return (
<div className="not-found-trace">
<section className="description">

View File

@@ -21,7 +21,6 @@ import { FeatureFlagProps as FeatureFlags } from 'types/api/features/getFeatures
import { PayloadProps as LicensesResModel } from 'types/api/licenses/getAll';
import { LicenseV3ResModel } from 'types/api/licensesV3/getActive';
import { Organization } from 'types/api/user/getOrganization';
import { UserFlags } from 'types/api/user/setFlags';
import { OrgPreference } from 'types/reducer/app';
import { USER_ROLES } from 'types/roles';
@@ -158,13 +157,6 @@ export function AppProvider({ children }: PropsWithChildren): JSX.Element {
}
}, [orgPreferencesData, isFetchingOrgPreferences]);
function setUserFlags(userflags: UserFlags): void {
setUser((prev) => ({
...prev,
flags: userflags,
}));
}
function updateUser(user: IUser): void {
setUser((prev) => ({
...prev,
@@ -252,7 +244,6 @@ export function AppProvider({ children }: PropsWithChildren): JSX.Element {
orgPreferencesFetchError,
licensesRefetch,
updateUser,
setUserFlags,
updateOrgPreferences,
updateOrg,
}),


@@ -3,7 +3,6 @@ import { PayloadProps as LicensesResModel } from 'types/api/licenses/getAll';
import { LicenseV3ResModel } from 'types/api/licensesV3/getActive';
import { Organization } from 'types/api/user/getOrganization';
import { PayloadProps as User } from 'types/api/user/getUser';
import { UserFlags } from 'types/api/user/setFlags';
import { OrgPreference } from 'types/reducer/app';
export interface IAppContext {
@@ -26,7 +25,6 @@ export interface IAppContext {
orgPreferencesFetchError: unknown;
licensesRefetch: () => void;
updateUser: (user: IUser) => void;
setUserFlags: (flags: UserFlags) => void;
updateOrgPreferences: (orgPreferences: OrgPreference[]) => void;
updateOrg(orgId: string, updatedOrgName: string): void;
}


@@ -20,7 +20,6 @@ function getUserDefaults(): IUser {
name: '',
profilePictureURL: '',
createdAt: 0,
flags: {},
organization: '',
orgId: '',
role: 'VIEWER',


@@ -763,7 +763,12 @@ export function QueryBuilderProvider({
[panelType, stagedQuery],
);
const { safeNavigate } = useSafeNavigate();
const { safeNavigate } = useSafeNavigate({
preventSameUrlNavigation: !(
initialDataSource === DataSource.LOGS ||
initialDataSource === DataSource.TRACES
),
});
const redirectWithQueryBuilderData = useCallback(
(


@@ -16,6 +16,7 @@ import thunk from 'redux-thunk';
import store from 'store';
import {
LicenseEvent,
LicensePlatform,
LicenseState,
LicenseStatus,
} from 'types/api/licensesV3/getActive';
@@ -115,6 +116,7 @@ export function getAppContextMock(
key: 'does-not-matter',
state: LicenseState.ACTIVE,
status: LicenseStatus.VALID,
platform: LicensePlatform.CLOUD,
},
isFetchingActiveLicenseV3: false,
activeLicenseV3FetchError: null,
@@ -126,7 +128,6 @@ export function getAppContextMock(
name: 'John Doe',
profilePictureURL: '',
createdAt: 1732544623,
flags: {},
organization: 'Nightswatch',
orgId: 'does-not-matter-id',
role: role as ROLES,
@@ -324,7 +325,6 @@ export function getAppContextMock(
orgPreferencesFetchError: null,
isLoggedIn: true,
updateUser: jest.fn(),
setUserFlags: jest.fn(),
updateOrg: jest.fn(),
updateOrgPreferences: jest.fn(),
licensesRefetch: jest.fn(),


@@ -13,6 +13,11 @@ export enum LicenseState {
ACTIVE = 'ACTIVE',
}
export enum LicensePlatform {
SELF_HOSTED = 'SELF_HOSTED',
CLOUD = 'CLOUD',
}
export type LicenseV3EventQueueResModel = {
event: LicenseEvent;
status: string;
@@ -26,4 +31,5 @@ export type LicenseV3ResModel = {
status: LicenseStatus;
state: LicenseState;
event_queue: LicenseV3EventQueueResModel;
platform: LicensePlatform;
};


@@ -1,4 +1,3 @@
import { UserFlags } from 'types/api/user/setFlags';
import { User } from 'types/reducer/app';
import { ROLES } from 'types/roles';
@@ -16,6 +15,5 @@ export interface PayloadProps {
profilePictureURL: string;
organization: string;
role: ROLES;
flags: UserFlags;
groupId: string;
}


@@ -1,12 +0,0 @@
import { User } from 'types/reducer/app';
export interface UserFlags {
ReleaseNote0120Hide?: string;
}
export type PayloadProps = UserFlags;
export interface Props {
userId: User['userId'];
flags: UserFlags;
}


@@ -13,18 +13,6 @@ export function extractDomain(email: string): string {
return emailParts[1];
}
export const isCloudUser = (): boolean => {
const { hostname } = window.location;
return hostname?.endsWith('signoz.cloud');
};
export const isEECloudUser = (): boolean => {
const { hostname } = window.location;
return hostname?.endsWith('signoz.io');
};
export const checkVersionState = (
currentVersion: string,
latestVersion: string,


@@ -3936,6 +3936,11 @@
dependencies:
"@types/geojson" "*"
"@types/d3-hierarchy@^1.1.6":
version "1.1.11"
resolved "https://registry.yarnpkg.com/@types/d3-hierarchy/-/d3-hierarchy-1.1.11.tgz#c3bd70d025621f73cb3319e97e08ae4c9051c791"
integrity sha512-lnQiU7jV+Gyk9oQYk0GGYccuexmQPTp08E0+4BidgFdiJivjEvf+esPSdZqCZ2C7UwTWejWpqetVaU8A+eX3FA==
"@types/d3-interpolate@3.0.1", "@types/d3-interpolate@^3.0.0":
version "3.0.1"
resolved "https://registry.yarnpkg.com/@types/d3-interpolate/-/d3-interpolate-3.0.1.tgz#e7d17fa4a5830ad56fe22ce3b4fac8541a9572dc"
@@ -4474,6 +4479,11 @@
resolved "https://registry.npmjs.org/@types/trusted-types/-/trusted-types-2.0.3.tgz"
integrity sha512-NfQ4gyz38SL8sDNrSixxU2Os1a5xcdFxipAFxYEuLUlvU2uDwS4NUpsImcf1//SlWItCVMMLiylsxbmNMToV/g==
"@types/trusted-types@^2.0.7":
version "2.0.7"
resolved "https://registry.yarnpkg.com/@types/trusted-types/-/trusted-types-2.0.7.tgz#baccb07a970b91707df3a3e8ba6896c57ead2d11"
integrity sha512-ScaPdn1dQczgbl0QFTeTOmVHFULt394XJgOQNoyVhZ6r2vLnMLJfBPd53SB52T/3G36VI1/g2MZaX0cwDuXsfw==
"@types/unist@*", "@types/unist@^3.0.0":
version "3.0.2"
resolved "https://registry.yarnpkg.com/@types/unist/-/unist-3.0.2.tgz#6dd61e43ef60b34086287f83683a5c1b2dc53d20"
@@ -4716,6 +4726,15 @@
"@types/d3-shape" "^1.3.1"
d3-shape "^1.0.6"
"@visx/group@3.12.0":
version "3.12.0"
resolved "https://registry.yarnpkg.com/@visx/group/-/group-3.12.0.tgz#2c69b810b52f1c1e69bf6f2fe923d184e32078c7"
integrity sha512-Dye8iS1alVXPv7nj/7M37gJe6sSKqJLH7x6sEWAsRQ9clI0kFvjbKcKgF+U3aAVQr0NCohheFV+DtR8trfK/Ag==
dependencies:
"@types/react" "*"
classnames "^2.3.1"
prop-types "^15.6.2"
"@visx/group@3.3.0":
version "3.3.0"
resolved "https://registry.yarnpkg.com/@visx/group/-/group-3.3.0.tgz#20c1b75c1ab31798c3c702b6f58c412c688a6373"
@@ -4725,6 +4744,18 @@
classnames "^2.3.1"
prop-types "^15.6.2"
"@visx/hierarchy@3.12.0":
version "3.12.0"
resolved "https://registry.yarnpkg.com/@visx/hierarchy/-/hierarchy-3.12.0.tgz#38295d2469cf957ed6d7700fe968aa16cbb878f0"
integrity sha512-+X1HOeLEOODxjAD7ixrWJ4KCVei4wFe8ra3dYU0uZ14RdPPgUeiuyBfdeXWZuAHM6Ix9qrryneatQjkC3h4mvA==
dependencies:
"@types/d3-hierarchy" "^1.1.6"
"@types/react" "*"
"@visx/group" "3.12.0"
classnames "^2.3.1"
d3-hierarchy "^1.1.4"
prop-types "^15.6.1"
"@visx/scale@3.5.0":
version "3.5.0"
resolved "https://registry.yarnpkg.com/@visx/scale/-/scale-3.5.0.tgz#c3db3863bbdd24d44781104ef5ee4cdc8df6f11d"
@@ -7095,6 +7126,16 @@ d3-geo@3.1.0:
dependencies:
d3-array "2.5.0 - 3"
d3-hierarchy@3.1.2:
version "3.1.2"
resolved "https://registry.yarnpkg.com/d3-hierarchy/-/d3-hierarchy-3.1.2.tgz#b01cd42c1eed3d46db77a5966cf726f8c09160c6"
integrity sha512-FX/9frcub54beBdugHjDCdikxThEqjnR93Qt7PvQTOHxyiNCAlvMrHhclk3cD5VeAaq9fxmfRp+CnWw9rEMBuA==
d3-hierarchy@^1.1.4:
version "1.1.9"
resolved "https://registry.yarnpkg.com/d3-hierarchy/-/d3-hierarchy-1.1.9.tgz#2f6bee24caaea43f8dc37545fa01628559647a83"
integrity sha512-j8tPxlqh1srJHAtxfvOUwKNYJkQuBFdM1+JAUfq6xqH5eAqf93L7oG1NVqDa4CpFZNvnNKtCYEUC8KY9yEn9lQ==
"d3-interpolate@1 - 3", "d3-interpolate@1.2.0 - 3", d3-interpolate@3.0.1:
version "3.0.1"
resolved "https://registry.yarnpkg.com/d3-interpolate/-/d3-interpolate-3.0.1.tgz#3c47aa5b32c5b3dfb56ef3fd4342078a632b400d"
@@ -7538,15 +7579,12 @@ domhandler@^5.0.2, domhandler@^5.0.3:
dependencies:
domelementtype "^2.3.0"
dompurify@3.1.3:
version "3.1.3"
resolved "https://registry.yarnpkg.com/dompurify/-/dompurify-3.1.3.tgz#cfe3ce4232c216d923832f68f2aa18b2fb9bd223"
integrity sha512-5sOWYSNPaxz6o2MUPvtyxTTqR4D3L77pr5rUQoWgD5ROQtVIZQgJkXbo1DLlK3vj11YGw5+LnF4SYti4gZmwng==
dompurify@^3.0.0:
version "3.1.7"
resolved "https://registry.yarnpkg.com/dompurify/-/dompurify-3.1.7.tgz#711a8c96479fb6ced93453732c160c3c72418a6a"
integrity sha512-VaTstWtsneJY8xzy7DekmYWEOZcmzIe3Qb3zPd4STve1OBTa+e+WmS1ITQec1fZYXI3HCsOZZiSMpG6oxoWMWQ==
dompurify@3.2.4, dompurify@^3.0.0:
version "3.2.4"
resolved "https://registry.yarnpkg.com/dompurify/-/dompurify-3.2.4.tgz#af5a5a11407524431456cf18836c55d13441cd8e"
integrity sha512-ysFSFEDVduQpyhzAob/kkuJjf5zWkZD8/A9ywSp1byueyuCfHamrCBa14/Oc2iiB0e51B+NpxSl5gmzn+Ms/mg==
optionalDependencies:
"@types/trusted-types" "^2.0.7"
domutils@^2.5.2, domutils@^2.8.0:
version "2.8.0"


@@ -314,7 +314,7 @@ func (server *Server) SetConfig(ctx context.Context, alertmanagerConfig *alertma
}
func (server *Server) TestReceiver(ctx context.Context, receiver alertmanagertypes.Receiver) error {
return alertmanagertypes.TestReceiver(ctx, receiver, server.tmpl, server.logger, alertmanagertypes.NewTestAlert(receiver, time.Now(), time.Now()))
return alertmanagertypes.TestReceiver(ctx, receiver, server.alertmanagerConfig, server.tmpl, server.logger, alertmanagertypes.NewTestAlert(receiver, time.Now(), time.Now()))
}
func (server *Server) TestAlert(ctx context.Context, postableAlert *alertmanagertypes.PostableAlert, receivers []string) error {
@@ -335,7 +335,7 @@ func (server *Server) TestAlert(ctx context.Context, postableAlert *alertmanager
ch <- err
return
}
ch <- alertmanagertypes.TestReceiver(ctx, receiver, server.tmpl, server.logger, alerts[0])
ch <- alertmanagertypes.TestReceiver(ctx, receiver, server.alertmanagerConfig, server.tmpl, server.logger, alerts[0])
}(receiverName)
}


@@ -48,21 +48,23 @@ func (store *config) Get(ctx context.Context, orgID string) (*alertmanagertypes.
}
// Set implements alertmanagertypes.ConfigStore.
func (store *config) Set(ctx context.Context, config *alertmanagertypes.Config) error {
if _, err := store.
sqlstore.
BunDB().
NewInsert().
Model(config.StoreableConfig()).
On("CONFLICT (org_id) DO UPDATE").
Set("config = ?", config.StoreableConfig().Config).
Set("hash = ?", config.StoreableConfig().Hash).
Set("updated_at = ?", config.StoreableConfig().UpdatedAt).
Exec(ctx); err != nil {
return err
}
func (store *config) Set(ctx context.Context, config *alertmanagertypes.Config, opts ...alertmanagertypes.StoreOption) error {
return store.wrap(ctx, func(ctx context.Context) error {
if _, err := store.
sqlstore.
BunDBCtx(ctx).
NewInsert().
Model(config.StoreableConfig()).
On("CONFLICT (org_id) DO UPDATE").
Set("config = ?", config.StoreableConfig().Config).
Set("hash = ?", config.StoreableConfig().Hash).
Set("updated_at = ?", config.StoreableConfig().UpdatedAt).
Exec(ctx); err != nil {
return err
}
return nil
return nil
}, opts...)
}
func (store *config) ListOrgs(ctx context.Context) ([]string, error) {
@@ -82,31 +84,19 @@ func (store *config) ListOrgs(ctx context.Context) ([]string, error) {
return orgIDs, nil
}
func (store *config) CreateChannel(ctx context.Context, channel *alertmanagertypes.Channel, cb func(context.Context) error) error {
tx, err := store.sqlstore.BunDB().BeginTx(ctx, nil)
if err != nil {
return err
}
defer tx.Rollback() //nolint:errcheck
if _, err = tx.NewInsert().
Model(channel).
Exec(ctx); err != nil {
return err
}
if cb != nil {
if err = cb(ctx); err != nil {
func (store *config) CreateChannel(ctx context.Context, channel *alertmanagertypes.Channel, opts ...alertmanagertypes.StoreOption) error {
return store.wrap(ctx, func(ctx context.Context) error {
if _, err := store.
sqlstore.
BunDBCtx(ctx).
NewInsert().
Model(channel).
Exec(ctx); err != nil {
return err
}
}
if err = tx.Commit(); err != nil {
return err
}
return nil
return nil
}, opts...)
}
func (store *config) GetChannelByID(ctx context.Context, orgID string, id int) (*alertmanagertypes.Channel, error) {
@@ -130,65 +120,39 @@ func (store *config) GetChannelByID(ctx context.Context, orgID string, id int) (
return channel, nil
}
func (store *config) UpdateChannel(ctx context.Context, orgID string, channel *alertmanagertypes.Channel, cb func(context.Context) error) error {
tx, err := store.sqlstore.BunDB().BeginTx(ctx, nil)
if err != nil {
return err
}
defer tx.Rollback() //nolint:errcheck
_, err = tx.NewUpdate().
Model(channel).
WherePK().
Exec(ctx)
if err != nil {
return err
}
if cb != nil {
if err = cb(ctx); err != nil {
func (store *config) UpdateChannel(ctx context.Context, orgID string, channel *alertmanagertypes.Channel, opts ...alertmanagertypes.StoreOption) error {
return store.wrap(ctx, func(ctx context.Context) error {
if _, err := store.
sqlstore.
BunDBCtx(ctx).
NewUpdate().
Model(channel).
WherePK().
Exec(ctx); err != nil {
return err
}
}
if err = tx.Commit(); err != nil {
return err
}
return nil
return nil
}, opts...)
}
func (store *config) DeleteChannelByID(ctx context.Context, orgID string, id int, cb func(context.Context) error) error {
channel := new(alertmanagertypes.Channel)
func (store *config) DeleteChannelByID(ctx context.Context, orgID string, id int, opts ...alertmanagertypes.StoreOption) error {
return store.wrap(ctx, func(ctx context.Context) error {
channel := new(alertmanagertypes.Channel)
tx, err := store.sqlstore.BunDB().BeginTx(ctx, nil)
if err != nil {
return err
}
defer tx.Rollback() //nolint:errcheck
_, err = tx.NewDelete().
Model(channel).
Where("org_id = ?", orgID).
Where("id = ?", id).
Exec(ctx)
if err != nil {
return err
}
if cb != nil {
if err = cb(ctx); err != nil {
if _, err := store.
sqlstore.
BunDBCtx(ctx).
NewDelete().
Model(channel).
Where("org_id = ?", orgID).
Where("id = ?", id).
Exec(ctx); err != nil {
return err
}
}
if err = tx.Commit(); err != nil {
return err
}
return nil
return nil
}, opts...)
}
func (store *config) ListChannels(ctx context.Context, orgID string) ([]*alertmanagertypes.Channel, error) {
@@ -254,3 +218,19 @@ func (store *config) GetMatchers(ctx context.Context, orgID string) (map[string]
return matchersMap, nil
}
func (store *config) wrap(ctx context.Context, fn func(ctx context.Context) error, opts ...alertmanagertypes.StoreOption) error {
storeOpts := alertmanagertypes.NewStoreOptions(opts...)
if storeOpts.Cb == nil {
return fn(ctx)
}
return store.sqlstore.RunInTxCtx(ctx, nil, func(ctx context.Context) error {
if err := fn(ctx); err != nil {
return err
}
return storeOpts.Cb(ctx)
})
}


@@ -252,7 +252,16 @@ func (provider *provider) UpdateChannelByReceiverAndID(ctx context.Context, orgI
return err
}
err = provider.configStore.UpdateChannel(ctx, orgID, channel, func(ctx context.Context) error {
config, err := provider.configStore.Get(ctx, orgID)
if err != nil {
return err
}
if err := config.UpdateReceiver(receiver); err != nil {
return err
}
err = provider.configStore.UpdateChannel(ctx, orgID, channel, alertmanagertypes.WithCb(func(ctx context.Context) error {
url := provider.url.JoinPath(routesPath)
body, err := json.Marshal(receiver)
@@ -278,8 +287,12 @@ func (provider *provider) UpdateChannelByReceiverAndID(ctx context.Context, orgI
return fmt.Errorf("bad response status %v", resp.Status)
}
if err := provider.configStore.Set(ctx, config); err != nil {
return err
}
return nil
})
}))
if err != nil {
return err
}
@@ -290,7 +303,16 @@ func (provider *provider) UpdateChannelByReceiverAndID(ctx context.Context, orgI
func (provider *provider) CreateChannel(ctx context.Context, orgID string, receiver alertmanagertypes.Receiver) error {
channel := alertmanagertypes.NewChannelFromReceiver(receiver, orgID)
err := provider.configStore.CreateChannel(ctx, channel, func(ctx context.Context) error {
config, err := provider.configStore.Get(ctx, orgID)
if err != nil {
return err
}
if err := config.CreateReceiver(receiver); err != nil {
return err
}
return provider.configStore.CreateChannel(ctx, channel, alertmanagertypes.WithCb(func(ctx context.Context) error {
url := provider.url.JoinPath(routesPath)
body, err := json.Marshal(receiver)
@@ -316,22 +338,30 @@ func (provider *provider) CreateChannel(ctx context.Context, orgID string, recei
return fmt.Errorf("bad response status %v", resp.Status)
}
if err := provider.configStore.Set(ctx, config); err != nil {
return err
}
return nil
})
}))
}
func (provider *provider) DeleteChannelByID(ctx context.Context, orgID string, channelID int) error {
channel, err := provider.configStore.GetChannelByID(ctx, orgID, channelID)
if err != nil {
return err
}
return nil
}
config, err := provider.configStore.Get(ctx, orgID)
if err != nil {
return err
}
func (provider *provider) DeleteChannelByID(ctx context.Context, orgID string, channelID int) error {
err := provider.configStore.DeleteChannelByID(ctx, orgID, channelID, func(ctx context.Context) error {
channel, err := provider.configStore.GetChannelByID(ctx, orgID, channelID)
if err != nil {
return err
}
if err := config.DeleteReceiver(channel.Name); err != nil {
return err
}
return provider.configStore.DeleteChannelByID(ctx, orgID, channelID, alertmanagertypes.WithCb(func(ctx context.Context) error {
url := provider.url.JoinPath(routesPath)
body, err := json.Marshal(map[string]string{"name": channel.Name})
@@ -357,13 +387,12 @@ func (provider *provider) DeleteChannelByID(ctx context.Context, orgID string, c
return fmt.Errorf("bad response status %v", resp.Status)
}
return nil
})
if err != nil {
return err
}
if err := provider.configStore.Set(ctx, config); err != nil {
return err
}
return nil
return nil
}))
}
func (provider *provider) SetConfig(ctx context.Context, config *alertmanagertypes.Config) error {


@@ -109,6 +109,10 @@ func (provider *provider) UpdateChannelByReceiverAndID(ctx context.Context, orgI
return err
}
if err := channel.Update(receiver); err != nil {
return err
}
config, err := provider.configStore.Get(ctx, orgID)
if err != nil {
return err
@@ -118,15 +122,9 @@ func (provider *provider) UpdateChannelByReceiverAndID(ctx context.Context, orgI
return err
}
if err := provider.configStore.Set(ctx, config); err != nil {
return err
}
if err := channel.Update(receiver); err != nil {
return err
}
return provider.configStore.UpdateChannel(ctx, orgID, channel, alertmanagertypes.ConfigStoreNoopCallback)
return provider.configStore.UpdateChannel(ctx, orgID, channel, alertmanagertypes.WithCb(func(ctx context.Context) error {
return provider.configStore.Set(ctx, config)
}))
}
func (provider *provider) DeleteChannelByID(ctx context.Context, orgID string, channelID int) error {
@@ -144,11 +142,9 @@ func (provider *provider) DeleteChannelByID(ctx context.Context, orgID string, c
return err
}
if err := provider.configStore.Set(ctx, config); err != nil {
return err
}
return provider.configStore.DeleteChannelByID(ctx, orgID, channelID, alertmanagertypes.ConfigStoreNoopCallback)
return provider.configStore.DeleteChannelByID(ctx, orgID, channelID, alertmanagertypes.WithCb(func(ctx context.Context) error {
return provider.configStore.Set(ctx, config)
}))
}
func (provider *provider) CreateChannel(ctx context.Context, orgID string, receiver alertmanagertypes.Receiver) error {
@@ -161,12 +157,10 @@ func (provider *provider) CreateChannel(ctx context.Context, orgID string, recei
return err
}
if err := provider.configStore.Set(ctx, config); err != nil {
return err
}
channel := alertmanagertypes.NewChannelFromReceiver(receiver, orgID)
return provider.configStore.CreateChannel(ctx, channel, alertmanagertypes.ConfigStoreNoopCallback)
return provider.configStore.CreateChannel(ctx, channel, alertmanagertypes.WithCb(func(ctx context.Context) error {
return provider.configStore.Set(ctx, config)
}))
}
func (provider *provider) SetConfig(ctx context.Context, config *alertmanagertypes.Config) error {


@@ -1,21 +1,27 @@
package app
import (
"context"
"errors"
"net/http"
"strings"
"go.signoz.io/signoz/pkg/query-service/dao"
"go.signoz.io/signoz/pkg/query-service/model"
"go.signoz.io/signoz/pkg/types/authtypes"
)
func (aH *APIHandler) setApdexSettings(w http.ResponseWriter, r *http.Request) {
claims, ok := authtypes.ClaimsFromContext(r.Context())
if !ok {
RespondError(w, &model.ApiError{Err: errors.New("unauthorized"), Typ: model.ErrorUnauthorized}, nil)
return
}
req, err := parseSetApdexScoreRequest(r)
if aH.HandleError(w, err, http.StatusBadRequest) {
return
}
if err := dao.DB().SetApdexSettings(context.Background(), req); err != nil {
if err := dao.DB().SetApdexSettings(r.Context(), claims.OrgID, req); err != nil {
RespondError(w, &model.ApiError{Err: err, Typ: model.ErrorInternal}, nil)
return
}
@@ -25,7 +31,12 @@ func (aH *APIHandler) setApdexSettings(w http.ResponseWriter, r *http.Request) {
func (aH *APIHandler) getApdexSettings(w http.ResponseWriter, r *http.Request) {
services := r.URL.Query().Get("services")
apdexSet, err := dao.DB().GetApdexSettings(context.Background(), strings.Split(strings.TrimSpace(services), ","))
claims, ok := authtypes.ClaimsFromContext(r.Context())
if !ok {
RespondError(w, &model.ApiError{Err: errors.New("unauthorized"), Typ: model.ErrorUnauthorized}, nil)
return
}
apdexSet, err := dao.DB().GetApdexSettings(r.Context(), claims.OrgID, strings.Split(strings.TrimSpace(services), ","))
if err != nil {
RespondError(w, &model.ApiError{Err: err, Typ: model.ErrorInternal}, nil)
return


@@ -9,13 +9,14 @@ import (
"go.signoz.io/signoz/pkg/query-service/auth"
"go.signoz.io/signoz/pkg/query-service/constants"
"go.signoz.io/signoz/pkg/query-service/model"
"go.signoz.io/signoz/pkg/types"
)
type AuthMiddleware struct {
GetUserFromRequest func(r context.Context) (*model.UserPayload, error)
GetUserFromRequest func(r context.Context) (*types.GettableUser, error)
}
func NewAuthMiddleware(f func(ctx context.Context) (*model.UserPayload, error)) *AuthMiddleware {
func NewAuthMiddleware(f func(ctx context.Context) (*types.GettableUser, error)) *AuthMiddleware {
return &AuthMiddleware{
GetUserFromRequest: f,
}


@@ -5910,7 +5910,7 @@ func (r *ClickHouseReader) ListSummaryMetrics(ctx context.Context, req *metrics_
sb.WriteString(orderByClauseFirstQuery)
}
sb.WriteString(fmt.Sprintf(" LIMIT %d;", req.Limit))
sb.WriteString(fmt.Sprintf(" LIMIT %d OFFSET %d;", req.Limit, req.Offset))
sampleQuery := sb.String()


@@ -124,7 +124,7 @@ func (c *Controller) GenerateConnectionUrl(
}
// TODO(Raj): parameterized this in follow up changes
agentVersion := "0.0.1"
agentVersion := "0.0.2"
connectionUrl := fmt.Sprintf(
"https://%s.console.aws.amazon.com/cloudformation/home?region=%s#/stacks/quickcreate?",

Binary file not shown.


@@ -0,0 +1 @@
<svg width="800px" height="800px" viewBox="0 0 16 16" xmlns="http://www.w3.org/2000/svg" fill="none"><path fill="#FA7E14" d="M7.983 8.37c-.053.073-.098.133-.141.194L5.775 11.5c-.64.91-1.282 1.82-1.924 2.73a.128.128 0 01-.092.051c-.906-.007-1.813-.017-2.719-.028-.01 0-.02-.003-.04-.006a.455.455 0 01.025-.053 13977.496 13977.496 0 015.446-8.146c.092-.138.188-.273.275-.413a.165.165 0 00.018-.124c-.167-.515-.338-1.03-.508-1.543-.073-.22-.15-.44-.218-.66-.022-.072-.059-.094-.134-.093-.57.002-1.136.001-1.704.001-.108 0-.108 0-.108-.103 0-.674 0-1.347-.002-2.021 0-.075.026-.092.099-.092 1.143.002 2.286.002 3.43 0a.113.113 0 01.076.017.107.107 0 01.045.061 18266.184 18266.184 0 003.92 9.51c.218.53.438 1.059.654 1.59.026.064.053.076.12.056.6-.178 1.2-.352 1.8-.531.075-.023.102-.008.126.064.204.62.412 1.239.62 1.858l.02.073c-.043.015-.083.032-.124.043l-4.085 1.25c-.065.02-.085 0-.106-.054l-1.25-3.048-1.226-2.984-.183-.449c-.01-.026-.023-.048-.043-.087z"/></svg>


@@ -0,0 +1,299 @@
{
"id": "lambda",
"title": "AWS Lambda",
"icon": "file://icon.svg",
"overview": "file://overview.md",
"supported_signals": {
"metrics": true,
"logs": true
},
"data_collected": {
"metrics": [
{
"name": "aws_Lambda_AsyncEventAge_count",
"unit": "Milliseconds",
"type": "Gauge",
"description": ""
},
{
"name": "aws_Lambda_AsyncEventAge_max",
"unit": "Milliseconds",
"type": "Gauge",
"description": ""
},
{
"name": "aws_Lambda_AsyncEventAge_min",
"unit": "Milliseconds",
"type": "Gauge",
"description": ""
},
{
"name": "aws_Lambda_AsyncEventAge_sum",
"unit": "Milliseconds",
"type": "Gauge",
"description": ""
},
{
"name": "aws_Lambda_AsyncEventsDropped_count",
"unit": "Count",
"type": "Gauge",
"description": ""
},
{
"name": "aws_Lambda_AsyncEventsDropped_max",
"unit": "Count",
"type": "Gauge",
"description": ""
},
{
"name": "aws_Lambda_AsyncEventsDropped_min",
"unit": "Count",
"type": "Gauge",
"description": ""
},
{
"name": "aws_Lambda_AsyncEventsDropped_sum",
"unit": "Count",
"type": "Gauge",
"description": ""
},
{
"name": "aws_Lambda_AsyncEventsReceived_count",
"unit": "Count",
"type": "Gauge",
"description": ""
},
{
"name": "aws_Lambda_AsyncEventsReceived_max",
"unit": "Count",
"type": "Gauge",
"description": ""
},
{
"name": "aws_Lambda_AsyncEventsReceived_min",
"unit": "Count",
"type": "Gauge",
"description": ""
},
{
"name": "aws_Lambda_AsyncEventsReceived_sum",
"unit": "Count",
"type": "Gauge",
"description": ""
},
{
"name": "aws_Lambda_ClaimedAccountConcurrency_count",
"unit": "Count",
"type": "Gauge",
"description": ""
},
{
"name": "aws_Lambda_ClaimedAccountConcurrency_max",
"unit": "Count",
"type": "Gauge",
"description": ""
},
{
"name": "aws_Lambda_ClaimedAccountConcurrency_min",
"unit": "Count",
"type": "Gauge",
"description": ""
},
{
"name": "aws_Lambda_ClaimedAccountConcurrency_sum",
"unit": "Count",
"type": "Gauge",
"description": ""
},
{
"name": "aws_Lambda_ConcurrentExecutions_count",
"unit": "Count",
"type": "Gauge",
"description": ""
},
{
"name": "aws_Lambda_ConcurrentExecutions_max",
"unit": "Count",
"type": "Gauge",
"description": ""
},
{
"name": "aws_Lambda_ConcurrentExecutions_min",
"unit": "Count",
"type": "Gauge",
"description": ""
},
{
"name": "aws_Lambda_ConcurrentExecutions_sum",
"unit": "Count",
"type": "Gauge",
"description": ""
},
{
"name": "aws_Lambda_Duration_count",
"unit": "Milliseconds",
"type": "Gauge",
"description": ""
},
{
"name": "aws_Lambda_Duration_max",
"unit": "Milliseconds",
"type": "Gauge",
"description": ""
},
{
"name": "aws_Lambda_Duration_min",
"unit": "Milliseconds",
"type": "Gauge",
"description": ""
},
{
"name": "aws_Lambda_Duration_sum",
"unit": "Milliseconds",
"type": "Gauge",
"description": ""
},
{
"name": "aws_Lambda_Errors_count",
"unit": "Count",
"type": "Gauge",
"description": ""
},
{
"name": "aws_Lambda_Errors_max",
"unit": "Count",
"type": "Gauge",
"description": ""
},
{
"name": "aws_Lambda_Errors_min",
"unit": "Count",
"type": "Gauge",
"description": ""
},
{
"name": "aws_Lambda_Errors_sum",
"unit": "Count",
"type": "Gauge",
"description": ""
},
{
"name": "aws_Lambda_Invocations_count",
"unit": "Count",
"type": "Gauge",
"description": ""
},
{
"name": "aws_Lambda_Invocations_max",
"unit": "Count",
"type": "Gauge",
"description": ""
},
{
"name": "aws_Lambda_Invocations_min",
"unit": "Count",
"type": "Gauge",
"description": ""
},
{
"name": "aws_Lambda_Invocations_sum",
"unit": "Count",
"type": "Gauge",
"description": ""
},
{
"name": "aws_Lambda_Throttles_count",
"unit": "Count",
"type": "Gauge",
"description": ""
},
{
"name": "aws_Lambda_Throttles_max",
"unit": "Count",
"type": "Gauge",
"description": ""
},
{
"name": "aws_Lambda_Throttles_min",
"unit": "Count",
"type": "Gauge",
"description": ""
},
{
"name": "aws_Lambda_Throttles_sum",
"unit": "Count",
"type": "Gauge",
"description": ""
},
{
"name": "aws_Lambda_UnreservedConcurrentExecutions_count",
"unit": "Count",
"type": "Gauge",
"description": ""
},
{
"name": "aws_Lambda_UnreservedConcurrentExecutions_max",
"unit": "Count",
"type": "Gauge",
"description": ""
},
{
"name": "aws_Lambda_UnreservedConcurrentExecutions_min",
"unit": "Count",
"type": "Gauge",
"description": ""
},
{
"name": "aws_Lambda_UnreservedConcurrentExecutions_sum",
"unit": "Count",
"type": "Gauge",
"description": ""
}
],
"logs": [
{
"name": "Account Id",
"path": "resources.cloud.account.id",
"type": "string"
},
{
"name": "Log Group Name",
"path": "resources.aws.cloudwatch.log_group_name",
"type": "string"
},
{
"name": "Log Stream Name",
"path": "resources.aws.cloudwatch.log_stream_name",
"type": "string"
}
]
},
"telemetry_collection_strategy": {
"aws_metrics": {
"cloudwatch_metric_stream_filters": [
{
"Namespace": "AWS/Lambda"
}
]
},
"aws_logs": {
"cloudwatch_logs_subscriptions": [
{
"log_group_name_prefix": "/aws/lambda",
"filter_pattern": ""
}
]
}
},
"assets": {
"dashboards": [
{
"id": "overview",
"title": "AWS Lambda Overview",
"description": "Overview of AWS Lambda",
"image": "file://assets/dashboards/overview.png",
"definition": "file://assets/dashboards/overview.json"
}
]
}
}


@@ -0,0 +1,3 @@
### Monitor AWS Lambda with SigNoz
Collect key AWS Lambda metrics and view them with an out-of-the-box dashboard.


@@ -6,7 +6,6 @@ import (
"encoding/json"
"errors"
"fmt"
"go.signoz.io/signoz/pkg/query-service/app/metricsexplorer"
"io"
"math"
"net/http"
@@ -19,6 +18,8 @@ import (
"text/template"
"time"
"go.signoz.io/signoz/pkg/query-service/app/metricsexplorer"
"github.com/gorilla/mux"
"github.com/gorilla/websocket"
jsoniter "github.com/json-iterator/go"
@@ -51,6 +52,7 @@ import (
"go.signoz.io/signoz/pkg/query-service/contextlinks"
v3 "go.signoz.io/signoz/pkg/query-service/model/v3"
"go.signoz.io/signoz/pkg/query-service/postprocess"
"go.signoz.io/signoz/pkg/types"
"go.signoz.io/signoz/pkg/types/authtypes"
"go.uber.org/zap"
@@ -597,8 +599,6 @@ func (aH *APIHandler) RegisterRoutes(router *mux.Router, am *AuthMiddleware) {
router.HandleFunc("/api/v1/user/{id}", am.SelfAccess(aH.editUser)).Methods(http.MethodPut)
router.HandleFunc("/api/v1/user/{id}", am.AdminAccess(aH.deleteUser)).Methods(http.MethodDelete)
router.HandleFunc("/api/v1/user/{id}/flags", am.SelfAccess(aH.patchUserFlag)).Methods(http.MethodPatch)
router.HandleFunc("/api/v1/rbac/role/{id}", am.SelfAccess(aH.getRole)).Methods(http.MethodGet)
router.HandleFunc("/api/v1/rbac/role/{id}", am.AdminAccess(aH.editRole)).Methods(http.MethodPut)
@@ -2108,8 +2108,13 @@ func (aH *APIHandler) revokeInvite(w http.ResponseWriter, r *http.Request) {
// listPendingInvites is used to list the pending invites.
func (aH *APIHandler) listPendingInvites(w http.ResponseWriter, r *http.Request) {
ctx := context.Background()
invites, err := dao.DB().GetInvites(ctx)
ctx := r.Context()
claims, ok := authtypes.ClaimsFromContext(ctx)
if !ok {
RespondError(w, &model.ApiError{Err: errors.New("failed to get org id from context"), Typ: model.ErrorInternal}, nil)
return
}
invites, err := dao.DB().GetInvites(ctx, claims.OrgID)
if err != nil {
RespondError(w, err, nil)
return
@@ -2120,7 +2125,7 @@ func (aH *APIHandler) listPendingInvites(w http.ResponseWriter, r *http.Request)
var resp []*model.InvitationResponseObject
for _, inv := range invites {
org, apiErr := dao.DB().GetOrg(ctx, inv.OrgId)
org, apiErr := dao.DB().GetOrg(ctx, inv.OrgID)
if apiErr != nil {
RespondError(w, apiErr, nil)
}
@@ -2128,7 +2133,7 @@ func (aH *APIHandler) listPendingInvites(w http.ResponseWriter, r *http.Request)
Name: inv.Name,
Email: inv.Email,
Token: inv.Token,
CreatedAt: inv.CreatedAt,
CreatedAt: inv.CreatedAt.Unix(),
Role: inv.Role,
Organization: org.Name,
})
@@ -2271,13 +2276,15 @@ func (aH *APIHandler) editUser(w http.ResponseWriter, r *http.Request) {
old.ProfilePictureURL = update.ProfilePictureURL
}
_, apiErr = dao.DB().EditUser(ctx, &model.User{
Id: old.Id,
Name: old.Name,
OrgId: old.OrgId,
Email: old.Email,
Password: old.Password,
CreatedAt: old.CreatedAt,
_, apiErr = dao.DB().EditUser(ctx, &types.User{
ID: old.ID,
Name: old.Name,
OrgID: old.OrgID,
Email: old.Email,
Password: old.Password,
TimeAuditable: types.TimeAuditable{
CreatedAt: old.CreatedAt,
},
ProfilePictureURL: old.ProfilePictureURL,
})
if apiErr != nil {
@@ -2319,7 +2326,7 @@ func (aH *APIHandler) deleteUser(w http.ResponseWriter, r *http.Request) {
return
}
if user.GroupId == adminGroup.ID && len(adminUsers) == 1 {
if user.GroupID == adminGroup.ID && len(adminUsers) == 1 {
RespondError(w, &model.ApiError{
Typ: model.ErrorInternal,
Err: errors.New("cannot delete the last admin user")}, nil)
@@ -2334,37 +2341,6 @@ func (aH *APIHandler) deleteUser(w http.ResponseWriter, r *http.Request) {
aH.WriteJSON(w, r, map[string]string{"data": "user deleted successfully"})
}
// addUserFlag patches a user flags with the changes
func (aH *APIHandler) patchUserFlag(w http.ResponseWriter, r *http.Request) {
// read user id from path var
userId := mux.Vars(r)["id"]
// read input into user flag
defer r.Body.Close()
b, err := io.ReadAll(r.Body)
if err != nil {
zap.L().Error("failed to read user flags from http request for userId", zap.String("userId", userId), zap.Error(err))
RespondError(w, model.BadRequestStr("received user flags in invalid format"), nil)
return
}
flags := make(map[string]string, 0)
err = json.Unmarshal(b, &flags)
if err != nil {
zap.L().Error("failed parsing user flags for userId ", zap.String("userId", userId), zap.Error(err))
RespondError(w, model.BadRequestStr("received user flags in invalid format"), nil)
return
}
newflags, apiError := dao.DB().UpdateUserFlags(r.Context(), userId, flags)
if !apiError.IsNil() {
RespondError(w, apiError, nil)
return
}
aH.Respond(w, newflags)
}
func (aH *APIHandler) getRole(w http.ResponseWriter, r *http.Request) {
id := mux.Vars(r)["id"]
@@ -2380,7 +2356,7 @@ func (aH *APIHandler) getRole(w http.ResponseWriter, r *http.Request) {
}, nil)
return
}
group, err := dao.DB().GetGroup(context.Background(), user.GroupId)
group, err := dao.DB().GetGroup(context.Background(), user.GroupID)
if err != nil {
RespondError(w, err, "Failed to get group")
return
@@ -2416,7 +2392,7 @@ func (aH *APIHandler) editRole(w http.ResponseWriter, r *http.Request) {
}
// Make sure that the request is not demoting the last admin user.
if user.GroupId == auth.AuthCacheObj.AdminGroupId {
if user.GroupID == auth.AuthCacheObj.AdminGroupId {
adminUsers, apiErr := dao.DB().GetUsersByGroup(ctx, auth.AuthCacheObj.AdminGroupId)
if apiErr != nil {
RespondError(w, apiErr, "Failed to fetch adminUsers")
@@ -2431,7 +2407,7 @@ func (aH *APIHandler) editRole(w http.ResponseWriter, r *http.Request) {
}
}
apiErr = dao.DB().UpdateUserGroup(context.Background(), user.Id, newGroup.ID)
apiErr = dao.DB().UpdateUserGroup(context.Background(), user.ID, newGroup.ID)
if apiErr != nil {
RespondError(w, apiErr, "Failed to add user to group")
return
@@ -2465,7 +2441,7 @@ func (aH *APIHandler) editOrg(w http.ResponseWriter, r *http.Request) {
return
}
req.Id = id
req.ID = id
if apiErr := dao.DB().EditOrg(context.Background(), req); apiErr != nil {
RespondError(w, apiErr, "Failed to update org in the DB")
return
@@ -3528,7 +3504,7 @@ func (aH *APIHandler) getUserPreference(
user := common.GetUserFromContext(r.Context())
preference, apiErr := preferences.GetUserPreference(
r.Context(), preferenceId, user.User.OrgId, user.User.Id,
r.Context(), preferenceId, user.User.OrgID, user.User.ID,
)
if apiErr != nil {
RespondError(w, apiErr, nil)
@@ -3551,7 +3527,7 @@ func (aH *APIHandler) updateUserPreference(
RespondError(w, model.BadRequest(err), nil)
return
}
preference, apiErr := preferences.UpdateUserPreference(r.Context(), preferenceId, req.PreferenceValue, user.User.Id)
preference, apiErr := preferences.UpdateUserPreference(r.Context(), preferenceId, req.PreferenceValue, user.User.ID)
if apiErr != nil {
RespondError(w, apiErr, nil)
return
@@ -3565,7 +3541,7 @@ func (aH *APIHandler) getAllUserPreferences(
) {
user := common.GetUserFromContext(r.Context())
preference, apiErr := preferences.GetAllUserPreferences(
r.Context(), user.User.OrgId, user.User.Id,
r.Context(), user.User.OrgID, user.User.ID,
)
if apiErr != nil {
RespondError(w, apiErr, nil)
@@ -3581,7 +3557,7 @@ func (aH *APIHandler) getOrgPreference(
preferenceId := mux.Vars(r)["preferenceId"]
user := common.GetUserFromContext(r.Context())
preference, apiErr := preferences.GetOrgPreference(
r.Context(), preferenceId, user.User.OrgId,
r.Context(), preferenceId, user.User.OrgID,
)
if apiErr != nil {
RespondError(w, apiErr, nil)
@@ -3604,7 +3580,7 @@ func (aH *APIHandler) updateOrgPreference(
RespondError(w, model.BadRequest(err), nil)
return
}
preference, apiErr := preferences.UpdateOrgPreference(r.Context(), preferenceId, req.PreferenceValue, user.User.OrgId)
preference, apiErr := preferences.UpdateOrgPreference(r.Context(), preferenceId, req.PreferenceValue, user.User.OrgID)
if apiErr != nil {
RespondError(w, apiErr, nil)
return
@@ -3618,7 +3594,7 @@ func (aH *APIHandler) getAllOrgPreferences(
) {
user := common.GetUserFromContext(r.Context())
preference, apiErr := preferences.GetAllOrgPreferences(
r.Context(), user.User.OrgId,
r.Context(), user.User.OrgID,
)
if apiErr != nil {
RespondError(w, apiErr, nil)


@@ -6,9 +6,6 @@ import (
"encoding/json"
"errors"
"fmt"
"go.signoz.io/signoz/pkg/query-service/app/integrations/messagingQueues/kafka"
queues2 "go.signoz.io/signoz/pkg/query-service/app/integrations/messagingQueues/queues"
"go.signoz.io/signoz/pkg/query-service/app/integrations/thirdPartyApi"
"math"
"net/http"
"strconv"
@@ -16,6 +13,10 @@ import (
"text/template"
"time"
"go.signoz.io/signoz/pkg/query-service/app/integrations/messagingQueues/kafka"
queues2 "go.signoz.io/signoz/pkg/query-service/app/integrations/messagingQueues/queues"
"go.signoz.io/signoz/pkg/query-service/app/integrations/thirdPartyApi"
"github.com/SigNoz/govaluate"
"github.com/gorilla/mux"
promModel "github.com/prometheus/common/model"
@@ -32,6 +33,7 @@ import (
"go.signoz.io/signoz/pkg/query-service/postprocess"
"go.signoz.io/signoz/pkg/query-service/utils"
querytemplate "go.signoz.io/signoz/pkg/query-service/utils/queryTemplate"
"go.signoz.io/signoz/pkg/types"
)
var allowedFunctions = []string{"count", "ratePerSec", "sum", "avg", "min", "max", "p50", "p90", "p95", "p99"}
@@ -465,8 +467,8 @@ func parseGetTTL(r *http.Request) (*model.GetTTLParams, error) {
return &model.GetTTLParams{Type: typeTTL}, nil
}
func parseUserRequest(r *http.Request) (*model.User, error) {
var req model.User
func parseUserRequest(r *http.Request) (*types.User, error) {
var req types.User
if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
return nil, err
}
@@ -519,8 +521,8 @@ func parseInviteUsersRequest(r *http.Request) (*model.BulkInviteRequest, error)
return &req, nil
}
func parseSetApdexScoreRequest(r *http.Request) (*model.ApdexSettings, error) {
var req model.ApdexSettings
func parseSetApdexScoreRequest(r *http.Request) (*types.ApdexSettings, error) {
var req types.ApdexSettings
if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
return nil, err
}
@@ -566,8 +568,8 @@ func parseUserRoleRequest(r *http.Request) (*model.UserRole, error) {
return &req, nil
}
func parseEditOrgRequest(r *http.Request) (*model.Organization, error) {
var req model.Organization
func parseEditOrgRequest(r *http.Request) (*types.Organization, error) {
var req types.Organization
if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
return nil, err
}


@@ -122,6 +122,7 @@ func (q *querier) runBuilderQuery(
misses := q.queryCache.FindMissingTimeRanges(start, end, builderQuery.StepInterval, cacheKeys[queryName])
zap.L().Info("cache misses for logs query", zap.Any("misses", misses))
missedSeries := make([]querycache.CachedSeriesData, 0)
filteredMissedSeries := make([]querycache.CachedSeriesData, 0)
for _, miss := range misses {
query, err = prepareLogsQuery(ctx, q.UseLogsNewSchema, miss.Start, miss.End, builderQuery, params, preferRPM)
if err != nil {
@@ -138,15 +139,32 @@ func (q *querier) runBuilderQuery(
}
return
}
filteredSeries, startTime, endTime := common.FilterSeriesPoints(series, miss.Start, miss.End, builderQuery.StepInterval)
// making sure that an empty range doesn't enter the cache
// an empty filteredSeries means data was filtered out, whereas an empty series means the data was actually empty
if len(filteredSeries) > 0 || len(series) == 0 {
filteredMissedSeries = append(filteredMissedSeries, querycache.CachedSeriesData{
Data: filteredSeries,
Start: startTime,
End: endTime,
})
}
// for the actual response
missedSeries = append(missedSeries, querycache.CachedSeriesData{
Data: series,
Start: miss.Start,
End: miss.End,
})
}
mergedSeries := q.queryCache.MergeWithCachedSeriesData(cacheKeys[queryName], missedSeries)
resultSeries := common.GetSeriesFromCachedData(mergedSeries, start, end)
filteredMergedSeries := q.queryCache.MergeWithCachedSeriesDataV2(cacheKeys[queryName], filteredMissedSeries)
q.queryCache.StoreSeriesInCache(cacheKeys[queryName], filteredMergedSeries)
mergedSeries := q.queryCache.MergeWithCachedSeriesDataV2(cacheKeys[queryName], missedSeries)
resultSeries := common.GetSeriesFromCachedDataV2(mergedSeries, start, end, builderQuery.StepInterval)
ch <- channelResult{
Err: nil,


@@ -119,9 +119,10 @@ func (q *querier) runBuilderQuery(
ch <- channelResult{Err: err, Name: queryName, Query: query, Series: series}
return
}
misses := q.queryCache.FindMissingTimeRanges(start, end, builderQuery.StepInterval, cacheKeys[queryName])
misses := q.queryCache.FindMissingTimeRangesV2(start, end, builderQuery.StepInterval, cacheKeys[queryName])
zap.L().Info("cache misses for logs query", zap.Any("misses", misses))
missedSeries := make([]querycache.CachedSeriesData, 0)
filteredMissedSeries := make([]querycache.CachedSeriesData, 0)
for _, miss := range misses {
query, err = prepareLogsQuery(ctx, q.UseLogsNewSchema, miss.Start, miss.End, builderQuery, params, preferRPM)
if err != nil {
@@ -138,15 +139,33 @@ func (q *querier) runBuilderQuery(
}
return
}
filteredSeries, startTime, endTime := common.FilterSeriesPoints(series, miss.Start, miss.End, builderQuery.StepInterval)
// making sure that an empty range doesn't enter the cache
// an empty filteredSeries means data was filtered out, whereas an empty series means the data was actually empty
if len(filteredSeries) > 0 || len(series) == 0 {
filteredMissedSeries = append(filteredMissedSeries, querycache.CachedSeriesData{
Data: filteredSeries,
Start: startTime,
End: endTime,
})
}
// for the actual response
missedSeries = append(missedSeries, querycache.CachedSeriesData{
Data: series,
Start: miss.Start,
End: miss.End,
})
}
mergedSeries := q.queryCache.MergeWithCachedSeriesData(cacheKeys[queryName], missedSeries)
resultSeries := common.GetSeriesFromCachedData(mergedSeries, start, end)
filteredMergedSeries := q.queryCache.MergeWithCachedSeriesDataV2(cacheKeys[queryName], filteredMissedSeries)
q.queryCache.StoreSeriesInCache(cacheKeys[queryName], filteredMergedSeries)
mergedSeries := q.queryCache.MergeWithCachedSeriesDataV2(cacheKeys[queryName], missedSeries)
resultSeries := common.GetSeriesFromCachedDataV2(mergedSeries, start, end, builderQuery.StepInterval)
ch <- channelResult{
Err: nil,


@@ -25,6 +25,7 @@ import (
opAmpModel "go.signoz.io/signoz/pkg/query-service/app/opamp/model"
"go.signoz.io/signoz/pkg/query-service/app/preferences"
"go.signoz.io/signoz/pkg/signoz"
"go.signoz.io/signoz/pkg/types"
"go.signoz.io/signoz/pkg/types/authtypes"
"go.signoz.io/signoz/pkg/web"
@@ -291,14 +292,14 @@ func (s *Server) createPublicServer(api *APIHandler, web web.Web) (*http.Server,
r.Use(middleware.NewLogging(zap.L(), s.serverOptions.Config.APIServer.Logging.ExcludedRoutes).Wrap)
// add auth middleware
getUserFromRequest := func(ctx context.Context) (*model.UserPayload, error) {
getUserFromRequest := func(ctx context.Context) (*types.GettableUser, error) {
user, err := auth.GetUserFromReqContext(ctx)
if err != nil {
return nil, err
}
if user.User.OrgId == "" {
if user.User.OrgID == "" {
return nil, model.UnauthorizedError(errors.New("orgId is missing in the claims"))
}


@@ -17,6 +17,7 @@ import (
"go.signoz.io/signoz/pkg/query-service/telemetry"
"go.signoz.io/signoz/pkg/query-service/utils"
smtpservice "go.signoz.io/signoz/pkg/query-service/utils/smtpService"
"go.signoz.io/signoz/pkg/types"
"go.signoz.io/signoz/pkg/types/authtypes"
"go.uber.org/zap"
"golang.org/x/crypto/bcrypt"
@@ -61,6 +62,10 @@ func Invite(ctx context.Context, req *model.InviteRequest) (*model.InviteRespons
return nil, errors.New("User already exists with the same email")
}
claims, ok := authtypes.ClaimsFromContext(ctx)
if !ok {
return nil, errors.New("failed to extract OrgID from context")
}
// Check if an invite already exists
invite, apiErr := dao.DB().GetInviteFromEmail(ctx, req.Email)
if apiErr != nil {
@@ -75,23 +80,18 @@ func Invite(ctx context.Context, req *model.InviteRequest) (*model.InviteRespons
return nil, errors.Wrap(err, "invalid invite request")
}
claims, ok := authtypes.ClaimsFromContext(ctx)
if !ok {
return nil, errors.Wrap(err, "failed to extract admin user id")
}
au, apiErr := dao.DB().GetUser(ctx, claims.UserID)
if apiErr != nil {
return nil, errors.Wrap(err, "failed to query admin user from the DB")
}
inv := &model.InvitationObject{
inv := &types.Invite{
Name: req.Name,
Email: req.Email,
Token: token,
CreatedAt: time.Now().Unix(),
CreatedAt: time.Now(),
Role: req.Role,
OrgId: au.OrgId,
OrgID: au.OrgID,
}
if err := dao.DB().CreateInviteEntry(ctx, inv); err != nil {
@@ -157,7 +157,7 @@ func InviteUsers(ctx context.Context, req *model.BulkInviteRequest) (*model.Bulk
}
// Helper function to handle individual invites
func inviteUser(ctx context.Context, req *model.InviteRequest, au *model.UserPayload) (*model.InviteResponse, error) {
func inviteUser(ctx context.Context, req *model.InviteRequest, au *types.GettableUser) (*model.InviteResponse, error) {
token, err := utils.RandomHex(opaqueTokenSize)
if err != nil {
return nil, errors.Wrap(err, "failed to generate invite token")
@@ -186,13 +186,13 @@ func inviteUser(ctx context.Context, req *model.InviteRequest, au *model.UserPay
return nil, errors.Wrap(err, "invalid invite request")
}
inv := &model.InvitationObject{
inv := &types.Invite{
Name: req.Name,
Email: req.Email,
Token: token,
CreatedAt: time.Now().Unix(),
CreatedAt: time.Now(),
Role: req.Role,
OrgId: au.OrgId,
OrgID: au.OrgID,
}
if err := dao.DB().CreateInviteEntry(ctx, inv); err != nil {
@@ -211,7 +211,7 @@ func inviteUser(ctx context.Context, req *model.InviteRequest, au *model.UserPay
return &model.InviteResponse{Email: inv.Email, InviteToken: inv.Token}, nil
}
func inviteEmail(req *model.InviteRequest, au *model.UserPayload, token string) {
func inviteEmail(req *model.InviteRequest, au *types.GettableUser, token string) {
smtp := smtpservice.GetInstance()
data := InviteEmailData{
CustomerName: req.Name,
@@ -251,7 +251,12 @@ func RevokeInvite(ctx context.Context, email string) error {
return ErrorInvalidInviteToken
}
if err := dao.DB().DeleteInvitation(ctx, email); err != nil {
claims, ok := authtypes.ClaimsFromContext(ctx)
if !ok {
return errors.New("failed to get org id from context")
}
if err := dao.DB().DeleteInvitation(ctx, claims.OrgID, email); err != nil {
return errors.Wrap(err.Err, "failed to write to DB")
}
return nil
@@ -272,7 +277,7 @@ func GetInvite(ctx context.Context, token string) (*model.InvitationResponseObje
// TODO(Ahsan): This is not the best way to add org name in the invite response. We should
// either include org name in the invite table or do a join query.
org, apiErr := dao.DB().GetOrg(ctx, inv.OrgId)
org, apiErr := dao.DB().GetOrg(ctx, inv.OrgID)
if apiErr != nil {
return nil, errors.Wrap(apiErr.Err, "failed to query the DB")
}
@@ -280,13 +285,13 @@ func GetInvite(ctx context.Context, token string) (*model.InvitationResponseObje
Name: inv.Name,
Email: inv.Email,
Token: inv.Token,
CreatedAt: inv.CreatedAt,
CreatedAt: inv.CreatedAt.Unix(),
Role: inv.Role,
Organization: org.Name,
}, nil
}
func ValidateInvite(ctx context.Context, req *RegisterRequest) (*model.InvitationObject, error) {
func ValidateInvite(ctx context.Context, req *RegisterRequest) (*types.Invite, error) {
invitation, err := dao.DB().GetInviteFromEmail(ctx, req.Email)
if err != nil {
return nil, errors.Wrap(err.Err, "Failed to read from DB")
@@ -303,14 +308,14 @@ func ValidateInvite(ctx context.Context, req *RegisterRequest) (*model.Invitatio
return invitation, nil
}
func CreateResetPasswordToken(ctx context.Context, userId string) (*model.ResetPasswordEntry, error) {
func CreateResetPasswordToken(ctx context.Context, userId string) (*types.ResetPasswordRequest, error) {
token, err := utils.RandomHex(opaqueTokenSize)
if err != nil {
return nil, errors.Wrap(err, "failed to generate reset password token")
}
req := &model.ResetPasswordEntry{
UserId: userId,
req := &types.ResetPasswordRequest{
UserID: userId,
Token: token,
}
if apiErr := dao.DB().CreateResetPasswordEntry(ctx, req); err != nil {
@@ -334,7 +339,7 @@ func ResetPassword(ctx context.Context, req *model.ResetPasswordRequest) error {
return errors.Wrap(err, "Failed to generate password hash")
}
if apiErr := dao.DB().UpdateUserPassword(ctx, hash, entry.UserId); apiErr != nil {
if apiErr := dao.DB().UpdateUserPassword(ctx, hash, entry.UserID); apiErr != nil {
return apiErr.Err
}
@@ -360,7 +365,7 @@ func ChangePassword(ctx context.Context, req *model.ChangePasswordRequest) *mode
return model.InternalError(errors.New("Failed to generate password hash"))
}
if apiErr := dao.DB().UpdateUserPassword(ctx, hash, user.Id); apiErr != nil {
if apiErr := dao.DB().UpdateUserPassword(ctx, hash, user.ID); apiErr != nil {
return apiErr
}
@@ -369,6 +374,7 @@ func ChangePassword(ctx context.Context, req *model.ChangePasswordRequest) *mode
type RegisterRequest struct {
Name string `json:"name"`
OrgID string `json:"orgId"`
OrgName string `json:"orgName"`
Email string `json:"email"`
Password string `json:"password"`
@@ -380,7 +386,7 @@ type RegisterRequest struct {
SourceUrl string `json:"sourceUrl"`
}
func RegisterFirstUser(ctx context.Context, req *RegisterRequest) (*model.User, *model.ApiError) {
func RegisterFirstUser(ctx context.Context, req *RegisterRequest) (*types.User, *model.ApiError) {
if req.Email == "" {
return nil, model.BadRequest(model.ErrEmailRequired{})
@@ -392,8 +398,9 @@ func RegisterFirstUser(ctx context.Context, req *RegisterRequest) (*model.User,
groupName := constants.AdminGroup
// modify this to use bun
org, apierr := dao.DB().CreateOrg(ctx,
&model.Organization{Name: req.OrgName, IsAnonymous: req.IsAnonymous, HasOptedUpdates: req.HasOptedUpdates})
&types.Organization{Name: req.OrgName, IsAnonymous: req.IsAnonymous, HasOptedUpdates: req.HasOptedUpdates})
if apierr != nil {
zap.L().Error("CreateOrg failed", zap.Error(apierr.ToError()))
return nil, apierr
@@ -414,22 +421,24 @@ func RegisterFirstUser(ctx context.Context, req *RegisterRequest) (*model.User,
return nil, model.InternalError(model.ErrSignupFailed{})
}
user := &model.User{
Id: uuid.NewString(),
Name: req.Name,
Email: req.Email,
Password: hash,
CreatedAt: time.Now().Unix(),
user := &types.User{
ID: uuid.NewString(),
Name: req.Name,
Email: req.Email,
Password: hash,
TimeAuditable: types.TimeAuditable{
CreatedAt: time.Now(),
},
ProfilePictureURL: "", // Currently unused
GroupId: group.ID,
OrgId: org.Id,
GroupID: group.ID,
OrgID: org.ID,
}
return dao.DB().CreateUser(ctx, user, true)
}
// RegisterInvitedUser handles registering an invited user
func RegisterInvitedUser(ctx context.Context, req *RegisterRequest, nopassword bool) (*model.User, *model.ApiError) {
func RegisterInvitedUser(ctx context.Context, req *RegisterRequest, nopassword bool) (*types.User, *model.ApiError) {
if req.InviteToken == "" {
return nil, model.BadRequest(ErrorAskAdmin)
@@ -459,7 +468,7 @@ func RegisterInvitedUser(ctx context.Context, req *RegisterRequest, nopassword b
return &userPayload.User, nil
}
if invite.OrgId == "" {
if invite.OrgID == "" {
zap.L().Error("failed to find org in the invite")
return nil, model.InternalError(fmt.Errorf("invalid invite, org not found"))
}
@@ -492,15 +501,17 @@ func RegisterInvitedUser(ctx context.Context, req *RegisterRequest, nopassword b
}
}
user := &model.User{
Id: uuid.NewString(),
Name: req.Name,
Email: req.Email,
Password: hash,
CreatedAt: time.Now().Unix(),
user := &types.User{
ID: uuid.NewString(),
Name: req.Name,
Email: req.Email,
Password: hash,
TimeAuditable: types.TimeAuditable{
CreatedAt: time.Now(),
},
ProfilePictureURL: "", // Currently unused
GroupId: group.ID,
OrgId: invite.OrgId,
GroupID: group.ID,
OrgID: invite.OrgID,
}
// TODO(Ahsan): Ideally create user and delete invitation should happen in a txn.
@@ -510,7 +521,7 @@ func RegisterInvitedUser(ctx context.Context, req *RegisterRequest, nopassword b
return nil, apiErr
}
apiErr = dao.DB().DeleteInvitation(ctx, user.Email)
apiErr = dao.DB().DeleteInvitation(ctx, user.OrgID, user.Email)
if apiErr != nil {
zap.L().Error("delete invitation failed", zap.Error(apiErr.Err))
return nil, apiErr
@@ -525,7 +536,7 @@ func RegisterInvitedUser(ctx context.Context, req *RegisterRequest, nopassword b
// Register registers a new user. For the first register request, it doesn't need an invite token
// and also the first registration is an enforced ADMIN registration. Every subsequent request will
// need an invite token to go through.
func Register(ctx context.Context, req *RegisterRequest) (*model.User, *model.ApiError) {
func Register(ctx context.Context, req *RegisterRequest) (*types.User, *model.ApiError) {
users, err := dao.DB().GetUsers(ctx)
if err != nil {
return nil, model.InternalError(fmt.Errorf("failed to get user count"))
@@ -562,24 +573,24 @@ func Login(ctx context.Context, request *model.LoginRequest, jwt *authtypes.JWT)
return &model.LoginResponse{
UserJwtObject: userjwt,
UserId: user.User.Id,
UserId: user.User.ID,
}, nil
}
func claimsToUserPayload(claims authtypes.Claims) (*model.UserPayload, error) {
user := &model.UserPayload{
User: model.User{
Id: claims.UserID,
GroupId: claims.GroupID,
func claimsToUserPayload(claims authtypes.Claims) (*types.GettableUser, error) {
user := &types.GettableUser{
User: types.User{
ID: claims.UserID,
GroupID: claims.GroupID,
Email: claims.Email,
OrgId: claims.OrgID,
OrgID: claims.OrgID,
},
}
return user, nil
}
// authenticateLogin is responsible for querying the DB and validating the credentials.
func authenticateLogin(ctx context.Context, req *model.LoginRequest, jwt *authtypes.JWT) (*model.UserPayload, error) {
func authenticateLogin(ctx context.Context, req *model.LoginRequest, jwt *authtypes.JWT) (*types.GettableUser, error) {
// If refresh token is valid, then simply authorize the login request.
if len(req.RefreshToken) > 0 {
// parse the refresh token
@@ -624,17 +635,17 @@ func passwordMatch(hash, password string) bool {
return err == nil
}
func GenerateJWTForUser(user *model.User, jwt *authtypes.JWT) (model.UserJwtObject, error) {
func GenerateJWTForUser(user *types.User, jwt *authtypes.JWT) (model.UserJwtObject, error) {
j := model.UserJwtObject{}
var err error
j.AccessJwtExpiry = time.Now().Add(jwt.JwtExpiry).Unix()
j.AccessJwt, err = jwt.AccessToken(user.OrgId, user.Id, user.GroupId, user.Email)
j.AccessJwt, err = jwt.AccessToken(user.OrgID, user.ID, user.GroupID, user.Email)
if err != nil {
return j, errors.Errorf("failed to encode jwt: %v", err)
}
j.RefreshJwtExpiry = time.Now().Add(jwt.JwtRefresh).Unix()
j.RefreshJwt, err = jwt.RefreshToken(user.OrgId, user.Id, user.GroupId, user.Email)
j.RefreshJwt, err = jwt.RefreshToken(user.OrgID, user.ID, user.GroupID, user.Email)
if err != nil {
return j, errors.Errorf("failed to encode jwt: %v", err)
}


@@ -6,7 +6,7 @@ import (
"github.com/pkg/errors"
"go.signoz.io/signoz/pkg/query-service/constants"
"go.signoz.io/signoz/pkg/query-service/dao"
"go.signoz.io/signoz/pkg/query-service/model"
"go.signoz.io/signoz/pkg/types"
"go.signoz.io/signoz/pkg/types/authtypes"
)
@@ -48,28 +48,28 @@ func InitAuthCache(ctx context.Context) error {
return nil
}
func GetUserFromReqContext(ctx context.Context) (*model.UserPayload, error) {
func GetUserFromReqContext(ctx context.Context) (*types.GettableUser, error) {
claims, ok := authtypes.ClaimsFromContext(ctx)
if !ok {
return nil, errors.New("no claims found in context")
}
user := &model.UserPayload{
User: model.User{
Id: claims.UserID,
GroupId: claims.GroupID,
user := &types.GettableUser{
User: types.User{
ID: claims.UserID,
GroupID: claims.GroupID,
Email: claims.Email,
OrgId: claims.OrgID,
OrgID: claims.OrgID,
},
}
return user, nil
}
func IsSelfAccessRequest(user *model.UserPayload, id string) bool { return user.Id == id }
func IsSelfAccessRequest(user *types.GettableUser, id string) bool { return user.ID == id }
func IsViewer(user *model.UserPayload) bool { return user.GroupId == AuthCacheObj.ViewerGroupId }
func IsEditor(user *model.UserPayload) bool { return user.GroupId == AuthCacheObj.EditorGroupId }
func IsAdmin(user *model.UserPayload) bool { return user.GroupId == AuthCacheObj.AdminGroupId }
func IsViewer(user *types.GettableUser) bool { return user.GroupID == AuthCacheObj.ViewerGroupId }
func IsEditor(user *types.GettableUser) bool { return user.GroupID == AuthCacheObj.EditorGroupId }
func IsAdmin(user *types.GettableUser) bool { return user.GroupID == AuthCacheObj.AdminGroupId }
func ValidatePassword(password string) error {
if len(password) < minimumPasswordLength {


@@ -3,6 +3,7 @@ package common
import (
"math"
"regexp"
"sort"
"time"
"unicode"
@@ -123,3 +124,108 @@ func GetSeriesFromCachedData(data []querycache.CachedSeriesData, start, end int6
}
return newSeries
}
// It is different from GetSeriesFromCachedData because it doesn't remove a point if it is >= (start - (start % (step*1000)))
func GetSeriesFromCachedDataV2(data []querycache.CachedSeriesData, start, end, step int64) []*v3.Series {
series := make(map[uint64]*v3.Series)
for _, cachedData := range data {
for _, data := range cachedData.Data {
h := labels.FromMap(data.Labels).Hash()
if _, ok := series[h]; !ok {
series[h] = &v3.Series{
Labels: data.Labels,
LabelsArray: data.LabelsArray,
Points: make([]v3.Point, 0),
}
}
for _, point := range data.Points {
if point.Timestamp >= (start-(start%(step*1000))) && point.Timestamp <= end {
series[h].Points = append(series[h].Points, point)
}
}
}
}
newSeries := make([]*v3.Series, 0, len(series))
for _, s := range series {
s.SortPoints()
s.RemoveDuplicatePoints()
newSeries = append(newSeries, s)
}
return newSeries
}
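The retention check above can be isolated as a small sketch. `keepPoint` below is a hypothetical helper (not part of the PR) that mirrors the V2 condition: `start` is first floored to the aggregation window boundary, so a point belonging to the window that *contains* `start` survives, where the V1 logic would drop it.

```go
package main

import "fmt"

// keepPoint mirrors the point-retention condition in GetSeriesFromCachedDataV2.
// Timestamps are in milliseconds, step in seconds.
func keepPoint(ts, start, end, step int64) bool {
	// floor start to the aggregation window boundary before comparing
	windowStart := start - (start % (step * 1000))
	return ts >= windowStart && ts <= end
}

func main() {
	// start is 01:30:00 UTC with a 1h step: the 01:00:00 UTC point
	// falls inside the containing window and is kept under V2
	fmt.Println(keepPoint(1609462800000, 1609464600000, 1609471800000, 3600)) // true
	// a point one full window earlier is still dropped
	fmt.Println(keepPoint(1609459200000, 1609464600000, 1609471800000, 3600)) // false
}
```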
// FilterSeriesPoints filters series points down to complete aggregation windows before storing them in the cache
func FilterSeriesPoints(seriesList []*v3.Series, missStart, missEnd int64, stepInterval int64) ([]*v3.Series, int64, int64) {
filteredSeries := make([]*v3.Series, 0)
startTime := missStart
endTime := missEnd
stepMs := stepInterval * 1000
// return empty series if the interval is not complete
if missStart+stepMs > missEnd {
return []*v3.Series{}, missStart, missEnd
}
// if the end time is not a complete aggregation window, then we will have to adjust the end time
// to the previous complete aggregation window end
endCompleteWindow := missEnd%stepMs == 0
if !endCompleteWindow {
endTime = missEnd - (missEnd % stepMs)
}
// if the start time is not a complete aggregation window, then we will have to adjust the start time
// to the next complete aggregation window
if missStart%stepMs != 0 {
startTime = missStart + stepMs - (missStart % stepMs)
}
for _, series := range seriesList {
// if data for the series is empty, then we will add it to the cache
if len(series.Points) == 0 {
filteredSeries = append(filteredSeries, &v3.Series{
Labels: series.Labels,
LabelsArray: series.LabelsArray,
Points: make([]v3.Point, 0),
})
continue
}
// Sort the points based on timestamp
sort.Slice(series.Points, func(i, j int) bool {
return series.Points[i].Timestamp < series.Points[j].Timestamp
})
points := make([]v3.Point, len(series.Points))
copy(points, series.Points)
// Filter the first point that is not a complete aggregation window
if series.Points[0].Timestamp < missStart {
// Remove the first point
points = points[1:]
}
// filter the last point if it is not a complete aggregation window
// adding or condition to handle the end time is equal to a complete window end https://github.com/SigNoz/signoz/pull/7212#issuecomment-2703677190
if (!endCompleteWindow && series.Points[len(series.Points)-1].Timestamp == missEnd-(missEnd%stepMs)) ||
(endCompleteWindow && series.Points[len(series.Points)-1].Timestamp == missEnd) {
// Remove the last point
points = points[:len(points)-1]
}
// making sure that empty range doesn't enter the cache
if len(points) > 0 {
filteredSeries = append(filteredSeries, &v3.Series{
Labels: series.Labels,
LabelsArray: series.LabelsArray,
Points: points,
})
}
}
return filteredSeries, startTime, endTime
}
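The start/end adjustment in `FilterSeriesPoints` can be sketched on its own. `alignWindow` below is a hypothetical helper (not in the PR) showing the boundary math: the cached range is shrunk inward so only complete aggregation windows are stored.

```go
package main

import "fmt"

// alignWindow mirrors the start/end adjustment in FilterSeriesPoints:
// start advances to the next complete window boundary, end retreats to
// the previous one. Timestamps are in milliseconds, stepInterval in seconds.
func alignWindow(missStart, missEnd, stepInterval int64) (int64, int64) {
	stepMs := stepInterval * 1000
	start, end := missStart, missEnd
	if missStart%stepMs != 0 {
		start = missStart + stepMs - (missStart % stepMs)
	}
	if missEnd%stepMs != 0 {
		end = missEnd - (missEnd % stepMs)
	}
	return start, end
}

func main() {
	// 01:30:00 .. 03:30:00 UTC with a 1h step shrinks to 02:00:00 .. 03:00:00,
	// matching the expectedStart/expectedEnd values in the tests below
	s, e := alignWindow(1609464600000, 1609471800000, 3600)
	fmt.Println(s, e) // 1609466400000 1609470000000
}
```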


@@ -0,0 +1,435 @@
package common
import (
"testing"
v3 "go.signoz.io/signoz/pkg/query-service/model/v3"
"go.signoz.io/signoz/pkg/query-service/querycache"
)
func TestFilterSeriesPoints(t *testing.T) {
// Define test cases
testCases := []struct {
name string
seriesList []*v3.Series
missStart int64 // in milliseconds
missEnd int64 // in milliseconds
stepInterval int64 // in seconds
expectedPoints []*v3.Series
expectedStart int64 // in milliseconds
expectedEnd int64 // in milliseconds
}{
{
name: "Complete aggregation window",
missStart: 1609459200000, // 01 Jan 2021 00:00:00 UTC
missEnd: 1609466400000, // 01 Jan 2021 02:00:00 UTC
stepInterval: 3600, // 1 hour
seriesList: []*v3.Series{
{
Points: []v3.Point{
{Timestamp: 1609459200000, Value: 1.0}, // 01 Jan 2021 00:00:00 UTC
{Timestamp: 1609462800000, Value: 2.0}, // 01 Jan 2021 01:00:00 UTC
{Timestamp: 1609466400000, Value: 3.0}, // 01 Jan 2021 02:00:00 UTC
},
},
},
expectedPoints: []*v3.Series{
{
Points: []v3.Point{
{Timestamp: 1609459200000, Value: 1.0},
{Timestamp: 1609462800000, Value: 2.0},
},
},
},
expectedStart: 1609459200000,
expectedEnd: 1609466400000,
},
{
name: "Filter first point",
missStart: 1609464600000, // 01 Jan 2021 01:30:00 UTC
missEnd: 1609470000000, // 01 Jan 2021 03:00:00 UTC
stepInterval: 3600, // 1 hour
seriesList: []*v3.Series{
{
Points: []v3.Point{
{Timestamp: 1609462800000, Value: 2.0}, // 01 Jan 2021 01:00:00 UTC
{Timestamp: 1609466400000, Value: 3.0}, // 01 Jan 2021 02:00:00 UTC
},
},
},
expectedPoints: []*v3.Series{
{
Points: []v3.Point{
{Timestamp: 1609466400000, Value: 3.0},
},
},
},
expectedStart: 1609466400000,
expectedEnd: 1609470000000,
},
{
name: "Filter last point",
missStart: 1609466400000, // 01 Jan 2021 02:00:00 UTC
missEnd: 1609471800000, // 01 Jan 2021 03:30:00 UTC
stepInterval: 3600, // 1 hour
seriesList: []*v3.Series{
{
Points: []v3.Point{
{Timestamp: 1609466400000, Value: 3.0}, // 01 Jan 2021 02:00:00 UTC
{Timestamp: 1609470000000, Value: 3.0}, // 01 Jan 2021 03:00:00 UTC
},
},
},
expectedPoints: []*v3.Series{
{
Points: []v3.Point{
{Timestamp: 1609466400000, Value: 3.0},
},
},
},
expectedStart: 1609466400000,
expectedEnd: 1609470000000,
},
{
name: "Incomplete aggregation window",
missStart: 1609470000000, // 01 Jan 2021 03:00:00 UTC
missEnd: 1609471800000, // 01 Jan 2021 03:30:00 UTC
stepInterval: 3600, // 1 hour
seriesList: []*v3.Series{
{
Points: []v3.Point{},
},
},
expectedPoints: []*v3.Series{},
expectedStart: 1609470000000,
expectedEnd: 1609471800000,
},
{
name: "Filter first point with multiple series",
missStart: 1609464600000, // 01 Jan 2021 01:30:00 UTC
missEnd: 1609477200000, // 01 Jan 2021 05:00:00 UTC
stepInterval: 3600, // 1 hour
seriesList: []*v3.Series{
{
Points: []v3.Point{
{Timestamp: 1609462800000, Value: 2.0}, // 01 Jan 2021 01:00:00 UTC
{Timestamp: 1609466400000, Value: 3.0}, // 01 Jan 2021 02:00:00 UTC
{Timestamp: 1609470000000, Value: 4.0}, // 01 Jan 2021 03:00:00 UTC
{Timestamp: 1609473600000, Value: 5.0}, // 01 Jan 2021 04:00:00 UTC
},
},
{
Points: []v3.Point{
{Timestamp: 1609466400000, Value: 6.0}, // 01 Jan 2021 02:00:00 UTC
{Timestamp: 1609470000000, Value: 7.0}, // 01 Jan 2021 03:00:00 UTC
{Timestamp: 1609473600000, Value: 8.0}, // 01 Jan 2021 04:00:00 UTC
},
},
{
Points: []v3.Point{
{Timestamp: 1609466400000, Value: 9.0}, // 01 Jan 2021 02:00:00 UTC
{Timestamp: 1609470000000, Value: 10.0}, // 01 Jan 2021 03:00:00 UTC
{Timestamp: 1609473600000, Value: 11.0}, // 01 Jan 2021 04:00:00 UTC
},
},
},
expectedPoints: []*v3.Series{
{
Points: []v3.Point{
{Timestamp: 1609466400000, Value: 3.0}, // 01 Jan 2021 02:00:00 UTC
{Timestamp: 1609470000000, Value: 4.0}, // 01 Jan 2021 03:00:00 UTC
{Timestamp: 1609473600000, Value: 5.0}, // 01 Jan 2021 04:00:00 UTC
},
},
{
Points: []v3.Point{
{Timestamp: 1609466400000, Value: 6.0}, // 01 Jan 2021 02:00:00 UTC
{Timestamp: 1609470000000, Value: 7.0}, // 01 Jan 2021 03:00:00 UTC
{Timestamp: 1609473600000, Value: 8.0}, // 01 Jan 2021 04:00:00 UTC
},
},
{
Points: []v3.Point{
{Timestamp: 1609466400000, Value: 9.0}, // 01 Jan 2021 02:00:00 UTC
{Timestamp: 1609470000000, Value: 10.0}, // 01 Jan 2021 03:00:00 UTC
{Timestamp: 1609473600000, Value: 11.0}, // 01 Jan 2021 04:00:00 UTC
},
},
},
expectedStart: 1609466400000,
expectedEnd: 1609477200000,
},
{
name: "Filter last point",
missStart: 1609466400000, // 01 Jan 2021 02:00:00 UTC
missEnd: 1609475400000, // 01 Jan 2021 04:30:00 UTC
stepInterval: 3600, // 1 hour
seriesList: []*v3.Series{
{
Points: []v3.Point{
{Timestamp: 1609466400000, Value: 3.0}, // 01 Jan 2021 02:00:00 UTC
{Timestamp: 1609470000000, Value: 4.0}, // 01 Jan 2021 03:00:00 UTC
{Timestamp: 1609473600000, Value: 5.0}, // 01 Jan 2021 04:00:00 UTC
},
},
{
Points: []v3.Point{
{Timestamp: 1609466400000, Value: 6.0}, // 01 Jan 2021 02:00:00 UTC
},
},
{
Points: []v3.Point{
{Timestamp: 1609466400000, Value: 9.0}, // 01 Jan 2021 02:00:00 UTC
{Timestamp: 1609470000000, Value: 10.0}, // 01 Jan 2021 03:00:00 UTC
},
},
},
expectedPoints: []*v3.Series{
{
Points: []v3.Point{
{Timestamp: 1609466400000, Value: 3.0}, // 01 Jan 2021 02:00:00 UTC
{Timestamp: 1609470000000, Value: 4.0}, // 01 Jan 2021 03:00:00 UTC
},
},
{
Points: []v3.Point{
{Timestamp: 1609466400000, Value: 6.0}, // 01 Jan 2021 02:00:00 UTC
},
},
{
Points: []v3.Point{
{Timestamp: 1609466400000, Value: 9.0}, // 01 Jan 2021 02:00:00 UTC
{Timestamp: 1609470000000, Value: 10.0}, // 01 Jan 2021 03:00:00 UTC
},
},
},
expectedStart: 1609466400000,
expectedEnd: 1609473600000,
},
{
name: "half range should return empty result",
missStart: 1609473600000, // 01 Jan 2021 04:00:00 UTC
missEnd: 1609475400000, // 01 Jan 2021 04:30:00 UTC
stepInterval: 3600, // 1 hour
seriesList: []*v3.Series{
{
Points: []v3.Point{
{Timestamp: 1609473600000, Value: 1.0}, // 01 Jan 2021 04:00:00 UTC
},
},
},
expectedPoints: []*v3.Series{},
expectedStart: 1609473600000,
expectedEnd: 1609475400000,
},
{
name: "respect actual empty series",
missStart: 1609466400000, // 01 Jan 2021 02:00:00 UTC
missEnd: 1609475400000, // 01 Jan 2021 04:30:00 UTC
stepInterval: 3600, // 1 hour
seriesList: []*v3.Series{
{
Points: []v3.Point{},
},
},
expectedPoints: []*v3.Series{
{
Points: []v3.Point{},
},
},
expectedStart: 1609466400000,
expectedEnd: 1609473600000,
},
{
name: "Remove point that is not a complete aggregation window",
missStart: 1609466400000, // 01 Jan 2021 02:00:00 UTC
missEnd: 1609470000000, // 01 Jan 2021 03:00:00 UTC
stepInterval: 3600, // 1 hour
seriesList: []*v3.Series{
{
Points: []v3.Point{
{Timestamp: 1609466400000, Value: 2.0}, // 01 Jan 2021 02:00:00 UTC
{Timestamp: 1609470000000, Value: 3.0}, // 01 Jan 2021 03:00:00 UTC
},
},
},
expectedPoints: []*v3.Series{
{
Points: []v3.Point{
{Timestamp: 1609466400000, Value: 2.0},
},
},
},
expectedStart: 1609466400000,
expectedEnd: 1609470000000,
},
}
for _, tc := range testCases {
t.Run(tc.name, func(t *testing.T) {
filteredSeries, startTime, endTime := FilterSeriesPoints(tc.seriesList, tc.missStart, tc.missEnd, tc.stepInterval)
if len(tc.expectedPoints) != len(filteredSeries) {
t.Errorf("Expected %d series, got %d", len(tc.expectedPoints), len(filteredSeries))
return
}
for i := range tc.expectedPoints {
if len(tc.expectedPoints[i].Points) != len(filteredSeries[i].Points) {
t.Errorf("Series %d: Expected %d points, got %d\nExpected points: %+v\nGot points: %+v",
i,
len(tc.expectedPoints[i].Points),
len(filteredSeries[i].Points),
tc.expectedPoints[i].Points,
filteredSeries[i].Points)
continue
}
for j := range tc.expectedPoints[i].Points {
if tc.expectedPoints[i].Points[j].Timestamp != filteredSeries[i].Points[j].Timestamp {
t.Errorf("Series %d Point %d: Expected timestamp %d, got %d", i, j, tc.expectedPoints[i].Points[j].Timestamp, filteredSeries[i].Points[j].Timestamp)
}
if tc.expectedPoints[i].Points[j].Value != filteredSeries[i].Points[j].Value {
t.Errorf("Series %d Point %d: Expected value %f, got %f", i, j, tc.expectedPoints[i].Points[j].Value, filteredSeries[i].Points[j].Value)
}
}
}
if tc.expectedStart != startTime {
t.Errorf("Expected start time %d, got %d", tc.expectedStart, startTime)
}
if tc.expectedEnd != endTime {
t.Errorf("Expected end time %d, got %d", tc.expectedEnd, endTime)
}
})
}
}
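
Inferred from the table-driven cases above, a minimal sketch of the windowing behavior that FilterSeriesPoints appears to be tested for: round the missing-range start up and the end down to complete aggregation windows, drop points that fall in partial windows, and return the original range with an empty result when less than one full window remains. This is a hypothetical reconstruction (lower-cased name, simplified types), not the actual SigNoz implementation.

```go
package main

import "fmt"

type Point struct {
	Timestamp int64 // epoch milliseconds
	Value     float64
}

type Series struct {
	Points []Point
}

// filterSeriesPoints is a sketch of the behavior the tests exercise.
// stepInterval is in seconds; timestamps are in milliseconds.
func filterSeriesPoints(seriesList []*Series, start, end, stepInterval int64) ([]*Series, int64, int64) {
	stepMs := stepInterval * 1000

	// Round start up to the next window boundary if it is misaligned.
	adjStart := start
	if adjStart%stepMs != 0 {
		adjStart += stepMs - adjStart%stepMs
	}
	// Round end down so the last (possibly incomplete) window is dropped.
	adjEnd := end - end%stepMs

	// Less than one complete window: nothing usable, keep the original range.
	if adjStart >= adjEnd {
		return []*Series{}, start, end
	}

	out := make([]*Series, 0, len(seriesList))
	for _, s := range seriesList {
		filtered := &Series{Points: []Point{}}
		for _, p := range s.Points {
			if p.Timestamp >= adjStart && p.Timestamp < adjEnd {
				filtered.Points = append(filtered.Points, p)
			}
		}
		// Genuinely empty series are kept, matching "respect actual empty series".
		out = append(out, filtered)
	}
	return out, adjStart, adjEnd
}

func main() {
	series := []*Series{{Points: []Point{
		{Timestamp: 1609462800000, Value: 2.0}, // 01:00, partial window, dropped
		{Timestamp: 1609466400000, Value: 3.0}, // 02:00, kept
	}}}
	out, start, end := filterSeriesPoints(series, 1609464600000, 1609477200000, 3600)
	fmt.Println(len(out[0].Points), start, end) // 1 1609466400000 1609477200000
}
```

Run against the "Filter last point" case above, the sketch reproduces the expected trimming of the 04:00 point and the adjusted end of 04:00.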
func TestGetSeriesFromCachedData(t *testing.T) {
testCases := []struct {
name string
data []querycache.CachedSeriesData
start int64
end int64
expectedCount int
expectedPoints int
}{
{
name: "Single point outside range",
data: []querycache.CachedSeriesData{
{
Data: []*v3.Series{
{
Labels: map[string]string{"label1": "value1"},
Points: []v3.Point{
{Timestamp: 1609473600000, Value: 1.0},
},
},
},
},
},
start: 1609475400000, // 01 Jan 2021 04:30:00 UTC
end: 1609477200000, // 01 Jan 2021 05:00:00 UTC
expectedCount: 1,
expectedPoints: 0,
},
{
name: "Single point inside range",
data: []querycache.CachedSeriesData{
{
Data: []*v3.Series{
{
Labels: map[string]string{"label1": "value1"},
Points: []v3.Point{
{Timestamp: 1609476000000, Value: 1.0},
},
},
},
},
},
start: 1609475400000, // 01 Jan 2021 04:30:00 UTC
end: 1609477200000, // 01 Jan 2021 05:00:00 UTC
expectedCount: 1,
expectedPoints: 1,
},
}
for _, tc := range testCases {
t.Run(tc.name, func(t *testing.T) {
series := GetSeriesFromCachedData(tc.data, tc.start, tc.end)
if len(series) != tc.expectedCount {
t.Errorf("Expected %d series, got %d", tc.expectedCount, len(series))
return // avoid indexing series[0] below when the count is wrong
}
if len(series[0].Points) != tc.expectedPoints {
t.Errorf("Expected %d points, got %d", tc.expectedPoints, len(series[0].Points))
}
})
}
}
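
Based on the cases above, a minimal sketch of what GetSeriesFromCachedData appears to do: stitch cached buckets back into one series per label set, keeping only points inside the requested range, and keeping the series even when all of its points fall outside. The name, types, and the label-keying strategy here are assumptions for illustration, not the actual SigNoz code.

```go
package main

import "fmt"

type Point struct {
	Timestamp int64
	Value     float64
}

type Series struct {
	Labels map[string]string
	Points []Point
}

// CachedSeriesData is a simplified stand-in for the querycache type;
// only the field the tests populate is modeled.
type CachedSeriesData struct {
	Data []*Series
}

func getSeriesFromCachedData(data []CachedSeriesData, start, end int64) []*Series {
	byKey := map[string]*Series{}
	var order []string // preserve first-seen order of label sets
	for _, bucket := range data {
		for _, s := range bucket.Data {
			// Assumption: the real code likely hashes the label set;
			// fmt.Sprint prints maps in sorted key order, so it is a
			// stable key for this sketch.
			key := fmt.Sprint(s.Labels)
			merged, ok := byKey[key]
			if !ok {
				merged = &Series{Labels: s.Labels, Points: []Point{}}
				byKey[key] = merged
				order = append(order, key)
			}
			for _, p := range s.Points {
				if p.Timestamp >= start && p.Timestamp <= end {
					merged.Points = append(merged.Points, p)
				}
			}
		}
	}
	out := make([]*Series, 0, len(order))
	for _, k := range order {
		out = append(out, byKey[k])
	}
	return out
}

func main() {
	data := []CachedSeriesData{{Data: []*Series{{
		Labels: map[string]string{"label1": "value1"},
		Points: []Point{{Timestamp: 1609473600000, Value: 1.0}}, // before range
	}}}}
	merged := getSeriesFromCachedData(data, 1609475400000, 1609477200000)
	fmt.Println(len(merged), len(merged[0].Points)) // 1 0
}
```

This matches the two cases above: an out-of-range point yields one series with zero points; an in-range point yields one series with one point.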
func TestGetSeriesFromCachedDataV2(t *testing.T) {
testCases := []struct {
name string
data []querycache.CachedSeriesData
start int64
end int64
step int64
expectedCount int
expectedPoints int
}{
{
name: "Single point outside range",
data: []querycache.CachedSeriesData{
{
Data: []*v3.Series{
{
Labels: map[string]string{"label1": "value1"},
Points: []v3.Point{
{Timestamp: 1609473600000, Value: 1.0},
},
},
},
},
},
start: 1609475400000,
end: 1609477200000,
step: 1000,
expectedCount: 1,
expectedPoints: 0,
},
{
name: "Single point inside range",
data: []querycache.CachedSeriesData{
{
Data: []*v3.Series{
{
Labels: map[string]string{"label1": "value1"},
Points: []v3.Point{
{Timestamp: 1609476000000, Value: 1.0},
},
},
},
},
},
start: 1609475400000,
end: 1609477200000,
step: 1000,
expectedCount: 1,
expectedPoints: 1,
},
}
for _, tc := range testCases {
t.Run(tc.name, func(t *testing.T) {
series := GetSeriesFromCachedDataV2(tc.data, tc.start, tc.end, tc.step)
if len(series) != tc.expectedCount {
t.Errorf("Expected %d series, got %d", tc.expectedCount, len(series))
return // avoid indexing series[0] below when the count is wrong
}
if len(series[0].Points) != tc.expectedPoints {
t.Errorf("Expected %d points, got %d", tc.expectedPoints, len(series[0].Points))
}
})
}
}


@@ -4,11 +4,11 @@ import (
 	"context"
 
 	"go.signoz.io/signoz/pkg/query-service/constants"
-	"go.signoz.io/signoz/pkg/query-service/model"
+	"go.signoz.io/signoz/pkg/types"
 )
 
-func GetUserFromContext(ctx context.Context) *model.UserPayload {
-	user, ok := ctx.Value(constants.ContextUserKey).(*model.UserPayload)
+func GetUserFromContext(ctx context.Context) *types.GettableUser {
+	user, ok := ctx.Value(constants.ContextUserKey).(*types.GettableUser)
 	if !ok {
 		return nil
 	}


@@ -13,28 +13,28 @@ type ModelDao interface {
 }
 
 type Queries interface {
-	GetInviteFromEmail(ctx context.Context, email string) (*model.InvitationObject, *model.ApiError)
-	GetInviteFromToken(ctx context.Context, token string) (*model.InvitationObject, *model.ApiError)
-	GetInvites(ctx context.Context) ([]model.InvitationObject, *model.ApiError)
+	GetInviteFromEmail(ctx context.Context, email string) (*types.Invite, *model.ApiError)
+	GetInviteFromToken(ctx context.Context, token string) (*types.Invite, *model.ApiError)
+	GetInvites(ctx context.Context, orgID string) ([]types.Invite, *model.ApiError)
 
-	GetUser(ctx context.Context, id string) (*model.UserPayload, *model.ApiError)
-	GetUserByEmail(ctx context.Context, email string) (*model.UserPayload, *model.ApiError)
-	GetUsers(ctx context.Context) ([]model.UserPayload, *model.ApiError)
-	GetUsersWithOpts(ctx context.Context, limit int) ([]model.UserPayload, *model.ApiError)
+	GetUser(ctx context.Context, id string) (*types.GettableUser, *model.ApiError)
+	GetUserByEmail(ctx context.Context, email string) (*types.GettableUser, *model.ApiError)
+	GetUsers(ctx context.Context) ([]types.GettableUser, *model.ApiError)
+	GetUsersWithOpts(ctx context.Context, limit int) ([]types.GettableUser, *model.ApiError)
 
-	GetGroup(ctx context.Context, id string) (*model.Group, *model.ApiError)
-	GetGroups(ctx context.Context) ([]model.Group, *model.ApiError)
+	GetGroup(ctx context.Context, id string) (*types.Group, *model.ApiError)
+	GetGroupByName(ctx context.Context, name string) (*types.Group, *model.ApiError)
+	GetGroups(ctx context.Context) ([]types.Group, *model.ApiError)
 
-	GetOrgs(ctx context.Context) ([]model.Organization, *model.ApiError)
-	GetOrgByName(ctx context.Context, name string) (*model.Organization, *model.ApiError)
-	GetOrg(ctx context.Context, id string) (*model.Organization, *model.ApiError)
+	GetOrgs(ctx context.Context) ([]types.Organization, *model.ApiError)
+	GetOrgByName(ctx context.Context, name string) (*types.Organization, *model.ApiError)
+	GetOrg(ctx context.Context, id string) (*types.Organization, *model.ApiError)
 
-	GetResetPasswordEntry(ctx context.Context, token string) (*model.ResetPasswordEntry, *model.ApiError)
-	GetUsersByOrg(ctx context.Context, orgId string) ([]model.UserPayload, *model.ApiError)
-	GetUsersByGroup(ctx context.Context, groupId string) ([]model.UserPayload, *model.ApiError)
+	GetResetPasswordEntry(ctx context.Context, token string) (*types.ResetPasswordRequest, *model.ApiError)
+	GetUsersByOrg(ctx context.Context, orgId string) ([]types.GettableUser, *model.ApiError)
+	GetUsersByGroup(ctx context.Context, groupId string) ([]types.GettableUser, *model.ApiError)
 
-	GetApdexSettings(ctx context.Context, services []string) ([]model.ApdexSettings, *model.ApiError)
+	GetApdexSettings(ctx context.Context, orgID string, services []string) ([]types.ApdexSettings, *model.ApiError)
 
 	GetIngestionKeys(ctx context.Context) ([]model.IngestionKey, *model.ApiError)
@@ -42,29 +42,27 @@ type Queries interface {
 }
 
 type Mutations interface {
-	CreateInviteEntry(ctx context.Context, req *model.InvitationObject) *model.ApiError
-	DeleteInvitation(ctx context.Context, email string) *model.ApiError
+	CreateInviteEntry(ctx context.Context, req *types.Invite) *model.ApiError
+	DeleteInvitation(ctx context.Context, orgID string, email string) *model.ApiError
 
-	CreateUser(ctx context.Context, user *model.User, isFirstUser bool) (*model.User, *model.ApiError)
-	EditUser(ctx context.Context, update *model.User) (*model.User, *model.ApiError)
+	CreateUser(ctx context.Context, user *types.User, isFirstUser bool) (*types.User, *model.ApiError)
+	EditUser(ctx context.Context, update *types.User) (*types.User, *model.ApiError)
 	DeleteUser(ctx context.Context, id string) *model.ApiError
-	UpdateUserFlags(ctx context.Context, userId string, flags map[string]string) (model.UserFlag, *model.ApiError)
 
 	CreateGroup(ctx context.Context, group *types.Group) (*types.Group, *model.ApiError)
 	DeleteGroup(ctx context.Context, id string) *model.ApiError
 
-	CreateOrg(ctx context.Context, org *model.Organization) (*model.Organization, *model.ApiError)
-	EditOrg(ctx context.Context, org *model.Organization) *model.ApiError
+	CreateOrg(ctx context.Context, org *types.Organization) (*types.Organization, *model.ApiError)
+	EditOrg(ctx context.Context, org *types.Organization) *model.ApiError
 	DeleteOrg(ctx context.Context, id string) *model.ApiError
 
-	CreateResetPasswordEntry(ctx context.Context, req *model.ResetPasswordEntry) *model.ApiError
+	CreateResetPasswordEntry(ctx context.Context, req *types.ResetPasswordRequest) *model.ApiError
 	DeleteResetPasswordEntry(ctx context.Context, token string) *model.ApiError
 	UpdateUserPassword(ctx context.Context, hash, userId string) *model.ApiError
 	UpdateUserGroup(ctx context.Context, userId, groupId string) *model.ApiError
 
-	SetApdexSettings(ctx context.Context, set *model.ApdexSettings) *model.ApiError
+	SetApdexSettings(ctx context.Context, orgID string, set *types.ApdexSettings) *model.ApiError
 
 	InsertIngestionKey(ctx context.Context, ingestionKey *model.IngestionKey) *model.ApiError
 }

Some files were not shown because too many files have changed in this diff.