Compare commits


214 Commits
0.1.0 ... 0.2.0

Author SHA1 Message Date
Ankit Nayan
daebe83e32 updated files for docker and helm files v 0.2.0 2021-05-02 23:17:50 +05:30
Ankit Nayan
7bafcdb3da Merge pull request #79 from SigNoz/prod-deployment-issues
fixes prod build issue
2021-05-02 23:06:58 +05:30
dhrubesh
2deb8e5b9d fixes prod build issue 2021-05-02 22:54:02 +05:30
Ankit Nayan
d7f2d9f58b added druid dimensions for external and db calls 2021-05-02 22:05:14 +05:30
Ankit Nayan
42b7a51080 Merge branch 'backend' into main 2021-05-02 21:49:16 +05:30
Ankit Nayan
c4bba43667 fixed externalCallAvgDuration API for sinusoidal pattern 2021-05-02 21:48:55 +05:30
Ankit Nayan
3f176cda8d Merge branch 'main' of https://github.com/signoz/signoz into main 2021-05-02 20:54:05 +05:30
Ankit Nayan
e00b4a503e Merge pull request #75 from SigNoz/fix-bug-link-err-traces
Fixes bugs on Tracing and Metrics
2021-05-02 20:52:24 +05:30
Ankit Nayan
30bdd6792c Merge branch 'backend' into main 2021-05-02 20:50:06 +05:30
Ankit Nayan
9c6e66f315 fixed - now getting latest data for past time rather than get 30s stale there too 2021-05-02 20:49:54 +05:30
dhrubesh
9010d16319 equal --> equals 2021-05-02 20:31:30 +05:30
Ankit Nayan
3ef7b10f5b Merge pull request #74 from SigNoz/update-err-chart-title
Change title in Error chart in ServiceOverview
2021-05-02 20:13:19 +05:30
Ankit Nayan
3294463599 Merge pull request #73 from SigNoz/update-graph-title
Change Title in DB Calls Tab
2021-05-02 20:12:23 +05:30
dhrubesh
fc2d32e72d removes *100 since the logic hsa been moved to BE 2021-05-02 20:04:12 +05:30
dhrubesh
62ad8433bf adds service name in the trace when visiting from error 2021-05-02 20:02:44 +05:30
dhrubesh
66ec0a2d8d updates DB graph title and label 2021-05-02 19:51:21 +05:30
dhrubesh
e767886565 updates DB graph title 2021-05-02 19:47:50 +05:30
Ankit Nayan
00c3342b4d added Error tag from error% mertics to traces page 2021-05-02 18:09:01 +05:30
Ankit Nayan
0e86e37235 Merge branch 'pull-66' into main 2021-05-02 18:00:53 +05:30
Ankit Nayan
1ac544ad78 pull-65 2021-05-02 17:55:06 +05:30
Ankit Nayan
c22d6dd1cc Merge branch 'pull-68' into main 2021-05-02 17:44:14 +05:30
Ankit Nayan
4113b1aacc Merge pull request #70 from SigNoz/fixes-side-nav-bug
Fixes sidebar highlight on route change
2021-05-02 17:38:03 +05:30
Ankit Nayan
501d4729d6 Merge pull request #69 from SigNoz/link-error-to-traces
Links Error chart from ServiceOverview to Traces Page
2021-05-02 17:37:47 +05:30
dhrubesh
63f0eadb61 fixes error percentage key 2021-05-02 17:11:02 +05:30
Ankit Nayan
c110a71fff Merge pull request #64 from SigNoz/adds-external-calls
Adds external API monitoring calls
2021-05-02 17:08:09 +05:30
dhrubesh
25803e660c integrated API, populates graph data 2021-05-02 16:58:34 +05:30
Ankit Nayan
7c6a4ed402 Merge branch 'main' of https://github.com/signoz/signoz into main 2021-05-02 16:52:04 +05:30
Ankit Nayan
eb39850f63 added statusCode filtering with error:true tag in searchSpans API 2021-05-02 16:51:06 +05:30
Ankit Nayan
409929841d added statusCode filter for error:true filter in searchSpans 2021-05-02 16:50:54 +05:30
dhrubesh
7aeffacaa5 links error to traces 2021-05-02 16:39:36 +05:30
Ankit Nayan
883982bf36 Merge pull request #67 from pranay01/main
Checking if mapped_array[parentid[ is undefined, before pushing a tree object to it.
2021-05-02 16:21:25 +05:30
dhrubesh
e78d979dd3 fixes sidebar highlight on route change 2021-05-02 16:06:31 +05:30
Pranay Prateek
613f62e518 Merge branch 'spantree_issue' into main 2021-05-02 15:46:43 +05:30
Pranay Prateek
b3bf9fe670 fixed undefined spanUtil mapped array error 2021-05-02 15:45:56 +05:30
dhrubesh
39012d86d7 updates Graph title and adds initial set up for db overview 2021-05-02 14:42:30 +05:30
dhrubesh
eafc2919c7 adds color tokens 2021-05-02 14:41:18 +05:30
Ankit Nayan
ee69c3aed2 Merge branch 'backend' into main 2021-05-02 12:53:12 +05:30
Ankit Nayan
a6b1c271ee changed errors to percent in external calls 2021-05-02 12:52:57 +05:30
Ankit Nayan
cd90ac8e72 Adding DB Overview tab in service 2021-05-02 10:55:22 +05:30
Ankit Nayan
ade18bc11f added dbOverview tab for service 2021-05-02 10:54:59 +05:30
dhrubesh
4935936afc adds plus minus 15mins to timestamp 2021-05-01 20:54:33 +05:30
dhrubesh
a59e33d241 links end points to traces 2021-05-01 20:27:46 +05:30
Ankit Nayan
d5e77d2c57 Merge branch 'pull-52' into main 2021-05-01 20:15:27 +05:30
Ankit Nayan
75c5615d10 Merge branch 'pull-51' into main 2021-05-01 20:15:09 +05:30
Ankit Nayan
b494b380db Merge branch 'pull-50' into main 2021-05-01 20:14:49 +05:30
dhrubesh
f412573972 updates label of graph 2021-05-01 19:18:18 +05:30
dhrubesh
ec52ad7636 adds error and external call duration graph 2021-05-01 19:13:31 +05:30
Ankit Nayan
3b3ca64296 returning spans which are atleast 30s stale 2021-05-01 18:25:46 +05:30
dhrubesh
61c26d7727 adds external call RPS and duration via address 2021-05-01 16:30:20 +05:30
Ankit Nayan
ea93b65ab7 fixed error data in APIs 2021-05-01 12:37:53 +05:30
Ankit Nayan
c323068362 fixing TagValues for int value 2021-05-01 12:37:29 +05:30
Ankit Nayan
9008815790 rerfactoring 2021-04-30 23:00:15 +05:30
Ankit Nayan
99b52e5f19 Merge branch 'main' of https://github.com/signoz/signoz into main 2021-04-30 22:57:58 +05:30
Ankit Nayan
f5abbd2e64 Merge branch 'backend' into main 2021-04-30 22:57:41 +05:30
Ankit Nayan
f32ece7788 Merge pull request #48 from SigNoz/time-fixes-on-trace
Timestamp related fixes on trace
2021-04-30 22:55:08 +05:30
Ankit Nayan
d76b2ba8d2 Kubernetes Deployment: increased storage of Historical pod to 20Gi 2021-04-30 11:32:20 +05:30
Ankit Nayan
93138c91e2 commented druid response log line 2021-04-26 23:33:38 +05:30
Ankit Nayan
f0e7614443 commented druid response log line 2021-04-26 23:33:20 +05:30
Ankit Nayan
d3911faebc fixed NaN response from Druid during span aggregates results 2021-04-26 23:20:22 +05:30
Ankit Nayan
584215b186 fixed NaN response from Druid during span aggregates results 2021-04-26 23:19:34 +05:30
Ankit Nayan
99c073d6da External API Calls commit for query-service and flatten-processor 2021-04-26 21:55:37 +05:30
Ankit Nayan
44304229cb added APIs for external calls 2021-04-26 21:55:11 +05:30
Ankit Nayan
381fcd710e added dimensions for external calls and db calls 2021-04-26 21:54:32 +05:30
dhrubesh-makeen
c8b92ce4d5 removes hardcoding--2 2021-04-25 17:44:07 +05:30
dhrubesh-makeen
d5cb191299 removes hardcoding from routes 2021-04-25 17:37:43 +05:30
Ankit Nayan
3f901c8692 added status code in processor and supervisor 2021-04-24 11:21:32 +05:30
Ankit Nayan
e24577b663 added statusCode dimension in druid supervisor config 2021-04-24 11:20:07 +05:30
Ankit Nayan
e9d403493f added statusCode field in spans 2021-04-24 11:19:34 +05:30
dhrubesh-makeen
fa7e3f3d95 adds spacing 2021-04-24 04:00:05 +05:30
dhrubesh-makeen
05f40224b9 window.location.search --> location.search(react-router) 2021-04-24 03:55:22 +05:30
dhrubesh-makeen
55e86ead02 moves routing to a single place 2021-04-24 03:51:31 +05:30
dhrubesh-makeen
515766452d updates incorrect import 2021-04-24 02:36:19 +05:30
dhrubesh-makeen
dbe3488ad1 components --> modules & refactored TopNav and SideNav 2021-04-24 02:29:12 +05:30
dhrubesh-makeen
88e488bdc7 revamps api layer 2021-04-24 01:51:45 +05:30
dhrubesh-makeen
001f7414db moves store to a separate folder 2021-04-24 01:21:24 +05:30
dhrubesh-makeen
1d65ed38f9 adds /application to open when dev server starts 2021-04-24 00:59:45 +05:30
dhrubesh-makeen
e1ea39e287 updates timeformat 2021-04-24 00:58:59 +05:30
dhrubesh-makeen
079d742ea0 moves old PR changes to this PR 2021-04-24 00:58:38 +05:30
Ankit Nayan
733b137b2a Fixed timezone issue in timestamp conversion to string in query service 2021-04-23 13:37:55 +05:30
Ankit Nayan
1dad2bb791 fixed query service timestamp to string conversion timezone issue 2021-04-23 13:37:32 +05:30
Pranay Prateek
840246bd77 Merge pull request #46 from pranay01/main
Updated ReadMe image & title
2021-04-20 13:25:00 +05:30
Pranay Prateek
4403a5c334 updated README image 2021-04-20 13:23:37 +05:30
Pranay Prateek
3d7b79fecd updated README image 2021-04-20 13:12:03 +05:30
Pranay Prateek
ba95b36e09 updated README image 2021-04-20 13:06:31 +05:30
Pranay Prateek
15471090d5 updated image & title in README 2021-04-20 01:28:06 +05:30
Ankit Nayan
881feef4e9 changed TagKeys and TagValues type for correct ordering 2021-04-19 23:06:20 +05:30
Ankit Nayan
c45f09be08 added separate kafka exporters for traces and metrics in otel-collector config 2021-04-15 21:40:30 +05:30
Ankit Nayan
35648ba195 added s3 config test file to .gitignore 2021-04-02 10:53:55 +05:30
Ankit Nayan
eaa50f83bc docker installation command changed for Linux2 AMI 2021-04-02 10:37:21 +05:30
Ankit Nayan
c4d3f7fd2a adding amazon linux docker installation 2021-04-01 22:12:50 +05:30
Ankit Nayan
f0497bbbd4 adding amazon linux docker installation 2021-04-01 22:04:39 +05:30
Ankit Nayan
47825ef2ad adding amazon linux docker installation 2021-04-01 21:58:58 +05:30
Ankit Nayan
f3ce9e92f4 adding amazon linux docker installation 2021-04-01 21:56:03 +05:30
Ankit Nayan
4e9e4f25cc adding amazon linux docker installation 2021-04-01 21:41:25 +05:30
Ankit Nayan
d820d6462a adding separate s3 configs 2021-04-01 21:12:45 +05:30
Ankit Nayan
87171cc880 adding separate s3 configs 2021-04-01 20:44:29 +05:30
Ankit Nayan
9d53fc6055 adding separate s3 configs 2021-04-01 20:11:33 +05:30
Ankit Nayan
8419d113f4 adding separate s3 configs 2021-04-01 19:48:27 +05:30
Ankit Nayan
2173ab6775 adding separate s3 configs 2021-04-01 19:23:24 +05:30
Ankit Nayan
bb67ac9b55 adding separate s3 configs 2021-04-01 19:22:26 +05:30
Ankit Nayan
6d78104d03 adding separate s3 configs 2021-04-01 18:36:53 +05:30
Ankit Nayan
812752972f adding separate s3 configs 2021-04-01 17:52:52 +05:30
Ankit Nayan
5d53065519 adding separate s3 configs 2021-04-01 17:45:06 +05:30
Ankit Nayan
0fd39e17b7 adding separate s3 configs 2021-04-01 17:30:01 +05:30
Ankit Nayan
2b55458e30 adding separate s3 configs 2021-04-01 17:23:59 +05:30
Ankit Nayan
d414eaec72 adding separate s3 configs 2021-04-01 17:22:31 +05:30
Ankit Nayan
6334650d22 adding separate s3 configs 2021-04-01 16:48:45 +05:30
Ankit Nayan
ccf57d9c5c adding s3 config to docker-compose 2021-04-01 12:13:04 +05:30
EC2 Default User
a23520782c adding s3 config to docker deployment 2021-03-31 18:50:27 +00:00
Ankit Nayan
86698a50bb exposed grpc legacy port 55680 in docker deployment 2021-03-27 00:06:31 +05:30
Ankit Nayan
00c744c004 added scarf for docker 2021-03-22 12:17:00 +05:30
Ankit Nayan
67dda78cbe added metrics pipeline in otel collector config 2021-03-03 00:48:04 +05:30
Ankit Nayan
62e0ec7ea4 fixed typos in readme 2021-03-03 00:47:33 +05:30
Ankit Nayan
b43d2a7567 changed query-service image tag to 0.1.4 2021-03-01 02:45:01 +05:30
Ankit Nayan
b6c718a536 added cors 2021-03-01 02:43:05 +05:30
Ankit Nayan
96d012a34b removed span.Kind=2 check in filtered spans aggregates 2021-03-01 01:36:33 +05:30
Ankit Nayan
bd8d50bab9 opening docs in new tab 2021-02-26 21:17:44 +05:30
Ankit Nayan
d9055b5030 Merge pull request #31 from himanshu-source21/ft-saas-opensource-parity-1
Add null check in GenericVisualization
2021-02-24 12:43:16 +05:30
Himanshu DIxit
52b7d38df8 Add null check in GenericVisualization 2021-02-24 12:37:53 +05:30
Ankit Nayan
4d431f0476 Merge pull request #30 from himanshu-source21/ft-saas-opensource-parity-1
Fix latency initial values in traces page
2021-02-24 11:05:59 +05:30
Himanshu DIxit
caeeec803e Fix initial values 2021-02-24 01:46:20 +05:30
Ankit Nayan
2c4dc07d2d changed node version to 12 2021-02-23 22:12:54 +05:30
Ankit Nayan
2234d31974 changed signoz/frontend image tag 2021-02-23 22:12:36 +05:30
Ankit Nayan
917e397c97 added grpc port 4317 2021-02-23 22:12:10 +05:30
Ankit Nayan
aa320386a2 changed signoz/frontend image tag 2021-02-23 22:11:32 +05:30
Ankit Nayan
2169f5498c Merge pull request #29 from himanshu-source21/ft-saas-opensource-parity-1
Remove unused ref
2021-02-23 20:21:00 +05:30
Himanshu DIxit
dd5357b975 Remove unused ref 2021-02-23 20:19:05 +05:30
Ankit Nayan
cb59805ff0 Merge pull request #28 from himanshu-source21/ft-saas-opensource-parity-1
Forcefully add env file
2021-02-23 20:18:35 +05:30
Himanshu DIxit
c708b29657 Remove unused var and change baseURl 2021-02-23 20:15:54 +05:30
Himanshu DIxit
2d238ff6a2 Forcefully add env file 2021-02-23 20:09:28 +05:30
Ankit Nayan
7b778d6951 Merge pull request #26 from himanshu-source21/ft-saas-opensource-parity-1
Fix references error found during deployment
2021-02-23 19:17:57 +05:30
Himanshu DIxit
e4dbb323a5 Fix references error found during deployment 2021-02-23 17:28:18 +05:30
Ankit Nayan
f478a6e894 Merge branch 'main' of https://github.com/signoz/signoz into main 2021-02-23 17:19:49 +05:30
Ankit Nayan
88a756fe50 k8s installation script in progress 2021-02-23 17:18:50 +05:30
Ankit Nayan
ae2dfe59d9 Merge pull request #25 from himanshu-source21/ft-saas-opensource-parity-1
Ft saas opensource parity 1
2021-02-23 16:54:06 +05:30
Himanshu DIxit
fb1ade15b5 Fix final changes 2021-02-23 16:46:58 +05:30
Himanshu DIxit
4ec389c449 Fix multiple keys selected bug 2021-02-22 05:14:59 +05:30
Himanshu DIxit
cb5713216a Add SIG-60 2021-02-22 05:05:07 +05:30
Himanshu DIxit
3a79778ce4 Fix SIG-21 2021-02-22 04:44:34 +05:30
Himanshu DIxit
81a1d2bb37 Fix SIG-55 2021-02-22 04:22:11 +05:30
Himanshu DIxit
864ef41fef Fix SIG-58 2021-02-22 03:58:57 +05:30
Himanshu DIxit
999a5094bb Prettify: Add basic indentation hygiene 2021-02-21 06:23:56 +05:30
Himanshu Dixit
a1331536ca Refactor: Bring open source to parity with SAAS 2021-02-21 06:21:15 +05:30
Ankit Nayan
3795aa059e enter key escaped in read 2021-02-18 01:03:50 +05:30
Ankit Nayan
3c9b024e34 text changes 2021-02-17 01:19:51 +05:30
Ankit Nayan
2ff2b7485e adding installation script
Merge branch 'backend' into main
2021-02-16 23:43:40 +05:30
Ankit Nayan
19dff5fdf2 checks done 2021-02-16 23:30:51 +05:30
Ankit Nayan
7bfc184ee6 checks done 2021-02-16 23:10:36 +05:30
Ankit Nayan
4954c18baa fixed text 2021-02-16 12:27:44 +05:30
Ankit Nayan
65a4649696 fixed text 2021-02-16 12:25:21 +05:30
Ankit Nayan
03233cf6be added installaton success and error ping 2021-02-16 12:23:32 +05:30
Ankit Nayan
1176f61791 status_code var fixed 2021-02-16 11:53:39 +05:30
Ankit Nayan
61f0674a13 added install.sh 2021-02-16 11:02:57 +05:30
Ankit Nayan
364c68b138 added 4317 port for OTLP 2021-02-14 12:34:12 +05:30
Ankit Nayan
1de802688c added 4317 port for otlp 2021-02-14 12:33:59 +05:30
Ankit Nayan
cf58f77400 added zipkin receiver 2021-02-14 12:16:43 +05:30
Ankit Nayan
b96e1b5466 Merge branch 'backend' into main 2021-02-10 01:40:17 +05:30
Ankit Nayan
b4073dfaa8 updated README.md with docker instructions 2021-02-10 01:40:01 +05:30
Ankit Nayan
14ac30a79d Merge branch 'main' of https://github.com/signoz/signoz into main 2021-02-09 03:05:23 +05:30
Ankit Nayan
d526e15fa8 added batch size in otel collector config 2021-02-09 03:04:50 +05:30
Ankit Nayan
f7de4fcbd9 Merge pull request #20 from SigNoz/pranay01-patch-3
Update README.md
2021-02-08 00:54:59 +05:30
Pranay Prateek
27b6024d2a Update README.md 2021-02-08 00:53:14 +05:30
Ankit Nayan
c145f92125 otel collector image changed to 0.18.0 2021-02-07 01:09:59 +05:30
Ankit Nayan
cde268c40e changed otel image version to 0.19.0 2021-02-07 01:05:07 +05:30
Ankit Nayan
78bb92827b Merge branch 'main' of https://github.com/signoz/signoz into main 2021-02-05 13:55:13 +05:30
Ankit Nayan
1a0fa0900d Merge branch 'backend' into main 2021-02-05 13:54:37 +05:30
Ankit Nayan
6f9a33d6b4 add s3 config in comments 2021-02-05 13:53:58 +05:30
Ankit Nayan
beb4ba512a Merge pull request #11 from himanshu-source21/ft-carry-forward-service-name
Ft carry forward service name from metrics page -> traces page
2021-01-24 18:32:22 +05:30
Ankit Nayan
712f825525 tMerge branch 'backend' into main 2021-01-24 14:05:42 +05:30
Ankit Nayan
bf51c2948b changed README.md 2021-01-24 14:01:50 +05:30
Ankit Nayan
40bae79a35 added .gitattriibutes file 2021-01-24 13:09:49 +05:30
Ankit Nayan
bf79d537d7 Merge pull request #10 from SigNoz/pranay01-patch-1
Updated Logo in  README.md
2021-01-24 13:03:42 +05:30
“himanshu”
53a50efa00 Reset env 2021-01-23 11:12:04 +05:30
“himanshu”
bdd23c504c Fix key rendering issue 2021-01-23 10:16:10 +05:30
“himanshu”
e98e3be33e Preserve state in latencymodalform 2021-01-23 10:05:12 +05:30
“himanshu”
5ed648bf0e Merge branch 'main' of https://github.com/SigNoz/signoz into ft-carry-forward-service-name 2021-01-23 09:55:51 +05:30
“himanshu”
b0dd622aa3 Remove unused var 2021-01-23 09:28:45 +05:30
“himanshu”
1ebfa0679e Add support for service from metrics to trace page 2021-01-23 09:25:35 +05:30
Pranay Prateek
a3ea50401d Update README.md 2021-01-22 13:28:14 +05:30
Ankit Nayan
21c0de6f77 Merge pull request #9 from SigNoz/add-code-of-conduct-1
Added code of conduct
2021-01-22 11:51:41 +05:30
Pranay Prateek
254e2f1532 Added code of conduct 2021-01-22 11:50:17 +05:30
Ankit Nayan
307c26ae50 Merge pull request #8 from pranay01/main
improving flamegraph tooltip vis
2021-01-22 10:59:36 +05:30
Pranay Prateek
4ccbe6fdd6 improving flamegraph tooltip vis 2021-01-22 01:22:47 +05:30
Ankit Nayan
f8a05e535f Merge pull request #7 from himanshu-source21/ft-traces-time-interval
Ft traces time interval
2021-01-21 00:37:03 +05:30
“himanshu”
08ba714637 Change env 2021-01-21 00:30:10 +05:30
“himanshu”
7f495181a7 View traces fix for past 15 min interval 2021-01-21 00:27:54 +05:30
Ankit Nayan
4c629721bd changing deployments 2021-01-20 19:32:21 +05:30
Ankit Nayan
7e9fe17e76 docker druid environment changes 2021-01-20 14:43:35 +05:30
Ankit Nayan
56dc0824c8 changed folder for data storage in druid 2021-01-20 13:23:42 +05:30
Ankit Nayan
cd16bf43bd Merge pull request #6 from himanshu-source21/main
Fix Sig-13, Fix-11, Logo and lint
2021-01-20 11:48:45 +05:30
“himanshu”
23a0059e41 Revert image name in docker compose 2021-01-20 11:29:39 +05:30
“himanshu”
546af6e46b Gitignore env and Add redux dev tools 2021-01-20 04:45:52 +05:30
“himanshu”
a6811aca6b Merge branch 'main' of https://github.com/SigNoz/signoz into main 2021-01-20 04:40:55 +05:30
“himanshu”
d9b0c1da1c Create docker image and tag 2021-01-20 03:59:58 +05:30
Ankit Nayan
8930ff1c88 added druid environments for small instance 2021-01-19 18:30:17 +05:30
“himanshu”
6cf2fb5490 Add Dockerfile 2021-01-19 11:10:05 +05:30
“himanshu”
14a585641c Fix flamegraph 2021-01-19 10:26:49 +05:30
“himanshu”
2a039150a8 Use API status for loading 2021-01-19 06:01:57 +05:30
“himanshu”
b7dea68ff5 Add loading for table 2021-01-19 05:56:45 +05:30
“himanshu”
5605a6210f Fix image use base ref 2021-01-19 05:22:34 +05:30
Ankit Nayan
61a8a9c17b increased retries to 10 for kafka 2021-01-18 19:07:59 +05:30
Ankit Nayan
db73cd1cdd added docker-compose-tiny with environment to run on 3GB RAM 2021-01-18 16:09:15 +05:30
Ankit Nayan
38335cbd4c added storage folders for local development 2021-01-18 16:08:29 +05:30
Ankit Nayan
6d50599ba0 restarting pods set-retention and create-supervisor on failure 2021-01-18 12:37:25 +05:30
Himanshu Dixit
cc6f755f07 Commit uncommitted changes 2021-01-18 02:43:22 +05:30
Himanshu Dixit
4bd1790a52 Refactor API endpoint 2021-01-18 02:33:48 +05:30
Himanshu Dixit
2505e01fce Exclude idea and fix theme toggle 2021-01-18 02:23:46 +05:30
Himanshu Dixit
5ff2d9e9e7 Sanity prettify 2021-01-18 02:18:49 +05:30
Himanshu Dixit
ff47f0978f Fix flamegraph.
// Alternative fix, useRef or createRef with reference to DOM chart. Weird way to handle it, looks like it's relying on immutability. Ideally any componentLibrary will use useRef for opposite data transfer.
2021-01-17 20:48:28 +05:30
Ankit Nayan
67125542a9 updated deployment link 2021-01-17 19:17:20 +05:30
Ankit Nayan
6bd8bf7d9f changed backofflimit of jobs 2021-01-17 11:46:01 +05:30
Ankit Nayan
c7399bc8e7 added OTLP HTTP receiver 2021-01-17 03:07:15 +05:30
Ankit Nayan
5f796a982b added zookeeper auto purge interval to prevent full disk space error 2021-01-16 14:02:25 +05:30
Ankit Nayan
416d943d14 added docker compose 2021-01-16 12:18:54 +05:30
180 changed files with 60843 additions and 6852 deletions

.gitattributes (new file, 1 line)

@@ -0,0 +1 @@
*.css linguist-detectable=false

.gitignore (9 lines changed)

@@ -1,3 +1,4 @@
deploy/docker/environment_tiny/common_test
frontend/node_modules
frontend/.pnp
*.pnp.js
@@ -19,7 +20,13 @@ frontend/.yarnclean
frontend/npm-debug.log*
frontend/yarn-debug.log*
frontend/yarn-error.log*
frontend/src/constants/env.ts
.idea
**/.vscode
*.tgz
**/build
**/build
**/storage
**/locust-scripts/__pycache__/

CODE_OF_CONDUCT.md (new file, 76 lines)

@@ -0,0 +1,76 @@
# Contributor Covenant Code of Conduct
## Our Pledge
In the interest of fostering an open and welcoming environment, we as
contributors and maintainers pledge to making participation in our project and
our community a harassment-free experience for everyone, regardless of age, body
size, disability, ethnicity, sex characteristics, gender identity and expression,
level of experience, education, socio-economic status, nationality, personal
appearance, race, religion, or sexual identity and orientation.
## Our Standards
Examples of behavior that contributes to creating a positive environment
include:
* Using welcoming and inclusive language
* Being respectful of differing viewpoints and experiences
* Gracefully accepting constructive criticism
* Focusing on what is best for the community
* Showing empathy towards other community members
Examples of unacceptable behavior by participants include:
* The use of sexualized language or imagery and unwelcome sexual attention or
advances
* Trolling, insulting/derogatory comments, and personal or political attacks
* Public or private harassment
* Publishing others' private information, such as a physical or electronic
address, without explicit permission
* Other conduct which could reasonably be considered inappropriate in a
professional setting
## Our Responsibilities
Project maintainers are responsible for clarifying the standards of acceptable
behavior and are expected to take appropriate and fair corrective action in
response to any instances of unacceptable behavior.
Project maintainers have the right and responsibility to remove, edit, or
reject comments, commits, code, wiki edits, issues, and other contributions
that are not aligned to this Code of Conduct, or to ban temporarily or
permanently any contributor for other behaviors that they deem inappropriate,
threatening, offensive, or harmful.
## Scope
This Code of Conduct applies both within project spaces and in public spaces
when an individual is representing the project or its community. Examples of
representing a project or community include using an official project e-mail
address, posting via an official social media account, or acting as an appointed
representative at an online or offline event. Representation of a project may be
further defined and clarified by project maintainers.
## Enforcement
Instances of abusive, harassing, or otherwise unacceptable behavior may be
reported by contacting the project team at dev@signoz.io. All
complaints will be reviewed and investigated and will result in a response that
is deemed necessary and appropriate to the circumstances. The project team is
obligated to maintain confidentiality with regard to the reporter of an incident.
Further details of specific enforcement policies may be posted separately.
Project maintainers who do not follow or enforce the Code of Conduct in good
faith may face temporary or permanent repercussions as determined by other
members of the project's leadership.
## Attribution
This Code of Conduct is adapted from the [Contributor Covenant][homepage], version 1.4,
available at https://www.contributor-covenant.org/version/1/4/code-of-conduct.html
[homepage]: https://www.contributor-covenant.org
For answers to common questions about this code of conduct, see
https://www.contributor-covenant.org/faq


@@ -1,34 +1,49 @@
<p align="center"><img src="https://signoz.io/img/SigNozLogo-200x200.svg" alt="SigNoz Logo" width="100"></p>
<p align="center">
<img src="https://res.cloudinary.com/dcv3epinx/image/upload/v1618904450/signoz-images/LogoGithub_sigfbu.svg" alt="SigNoz-logo" width="240" />
# SigNoz
SigNoz is an opensource observability platform. SigNoz uses distributed tracing to gain visibility into your systems and powers data using [Kafka](https://kafka.apache.org/) (to handle high ingestion rate and backpressure) and [Apache Druid](https://druid.apache.org/) (Apache Druid is a high performance real-time analytics database), both proven in industry to handle scale.
<p align="center">Monitor your applications and troubleshoot problems in your deployed applications, an open-source alternative to DataDog, New Relic, etc.</p>
</p>
[![MIT](https://img.shields.io/badge/license-MIT-brightgreen)](LICENSE)
![SigNoz Feature](https://signoz.io/img/readme_feature1.jpg)
##
SigNoz is an opensource observability platform. SigNoz uses distributed tracing to gain visibility into your systems and powers data using [Kafka](https://kafka.apache.org/) (to handle high ingestion rate and backpressure) and [Apache Druid](https://druid.apache.org/) (Apache Druid is a high performance real-time analytics database), both proven in the industry to handle scale.
<!-- ![SigNoz Feature](https://signoz.io/img/readme_feature1.jpg) -->
![SigNoz Feature](https://res.cloudinary.com/dcv3epinx/image/upload/v1618904032/signoz-images/screenzy-1618904013729_clssvy.png)
### Features:
- Application overview metrics like RPS, 50th/90th/99th Percentile latencies and Error Rate
- Application overview metrics like RPS, 50th/90th/99th Percentile latencies, and Error Rate
- Slowest endpoints in your application
- See exact request trace to figure out issues in downstream services, slow DB queries, call to 3rd party services like payment gateways, etc
- Filter traces by service name, operation, latency, error, tags/annotations.
- Filter traces by service name, operation, latency, error, tags/annotations.
- Aggregate metrics on filtered traces. Eg, you can get error rate and 99th percentile latency of `customer_type: gold` or `deployment_version: v2` or `external_call: paypal`
- Unified UI for metrics and traces. No need to switch from Prometheus to Jaeger to debug issues.
- In-built workflows to reduce your efforts in detecting common issues like new deployment failures, 3rd party slow APIs, etc (Coming Soon)
- Anomaly Detection Framework (Coming Soon)
### Motivation:
- SaaS vendors charge insane amount to provide Application Monitoring. They often surprise you by huge month end bills without any tranparency of data sent to them.
- SaaS vendors charge an insane amount to provide Application Monitoring. They often surprise you with huge month end bills without any transparency of data sent to them.
- Data privacy and compliance demands data to not leave the network boundary
- No more magic happening in agents installed in your infra. You take control of sampling, uptime, configuration. Also, you can build modules over SigNoz to extend business specific capabilities.
- Highly scalable architecture
- No more magic happening in agents installed in your infra. You take control of sampling, uptime, configuration.
- Build modules over SigNoz to extend business specific capabilities
# Getting Started
Deploy in Kubernetes using Helm. Below steps will install the SigNoz in platform namespace inside you k8s cluster.
## Deploy using docker-compose
We have a tiny-cluster setup and a standard setup to deploy using docker-compose.
Follow the steps listed at https://signoz.io/docs/deployment/docker/.
The troubleshooting instructions at https://signoz.io/docs/deployment/docker/#troubleshooting may be helpful
## Deploy in Kubernetes using Helm.
Below steps will install the SigNoz in platform namespace inside your k8s cluster.
```console
git clone https://github.com/SigNoz/signoz.git && cd signoz
@@ -38,8 +53,8 @@ helm -n platform install signoz deploy/kubernetes/platform
kubectl -n platform apply -Rf deploy/kubernetes/jobs
kubectl -n platform apply -f deploy/kubernetes/otel-collector
```
**You can choose a different namespace too. In that case, you need to point your applications to correct address to send traces. In our sample application just change the `JAEGER_ENDPOINT` environment variable in `sample-apps/hotrod/deployment.yaml`*
\*_You can choose a different namespace too. In that case, you need to point your applications to correct address to send traces. In our sample application just change the `JAEGER_ENDPOINT` environment variable in `sample-apps/hotrod/deployment.yaml`_
### Test HotROD application with SigNoz
@@ -53,17 +68,19 @@ kubectl -n sample-application apply -Rf sample-apps/hotrod/
`kubectl -n sample-application run strzal --image=djbingham/curl --restart='OnFailure' -i --tty --rm --command -- curl -X POST -F 'locust_count=6' -F 'hatch_rate=2' http://locust-master:8089/swarm`
### See UI
`kubectl -n platform port-forward svc/signoz-frontend 3000:3000`
### How to stop load
`kubectl -n sample-application run strzal --image=djbingham/curl --restart='OnFailure' -i --tty --rm --command -- curl http://locust-master:8089/stop`
# Documentation
You can find docs at https://signoz.io/docs/installation. If you need any clarification or find something missing, feel free to raise a github issue with label `documentation` or reach out to us at community slack channel.
You can find docs at https://signoz.io/docs/deployment/docker. If you need any clarification or find something missing, feel free to raise a GitHub issue with the label `documentation` or reach out to us at the community slack channel.
# Community
Join the [slack community](https://app.slack.com/client/T01HWUTP0LT#/) to know more about distributed tracing, observability or SigNoz and to connect with other users and contributors.
If you have any ideas, questions or any feedback, please share on our [Github Discussions](https://github.com/SigNoz/signoz/discussions)
Join the [slack community](https://app.slack.com/client/T01HWUTP0LT#/) to know more about distributed tracing, observability, or SigNoz and to connect with other users and contributors.
If you have any ideas, questions, or any feedback, please share on our [Github Discussions](https://github.com/SigNoz/signoz/discussions)
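The docker-compose route described in the README diff above reduces to a few commands. This is a minimal sketch: the compose file path is an assumption based on the commit history ("docker-compose-tiny" for ~3GB RAM hosts), not taken verbatim from the source.

```shell
# Clone the repository and move into the deployment directory
git clone https://github.com/SigNoz/signoz.git && cd signoz/deploy

# Bring up the stack; swap in the tiny variant on low-memory hosts
# (exact file name/path is an assumption -- check the repo's deploy/ layout)
docker-compose -f docker/docker-compose-tiny.yaml up -d
```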


@@ -0,0 +1,264 @@
version: "2.4"

volumes:
  metadata_data: {}
  middle_var: {}
  historical_var: {}
  broker_var: {}
  coordinator_var: {}
  router_var: {}

# If able to connect to kafka but not able to write to topic otlp_spans look into below link
# https://github.com/wurstmeister/kafka-docker/issues/409#issuecomment-428346707

services:
  zookeeper:
    image: bitnami/zookeeper:3.6.2-debian-10-r100
    ports:
      - "2181:2181"
    environment:
      - ALLOW_ANONYMOUS_LOGIN=yes

  kafka:
    # image: wurstmeister/kafka
    image: bitnami/kafka:2.7.0-debian-10-r1
    ports:
      - "9092:9092"
    hostname: kafka
    environment:
      KAFKA_ADVERTISED_HOST_NAME: kafka
      KAFKA_ADVERTISED_PORT: 9092
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      ALLOW_PLAINTEXT_LISTENER: 'yes'
      KAFKA_CFG_AUTO_CREATE_TOPICS_ENABLE: 'true'
      KAFKA_TOPICS: 'otlp_spans:1:1,flattened_spans:1:1'
    healthcheck:
      # test: ["CMD", "kafka-topics.sh", "--create", "--topic", "otlp_spans", "--zookeeper", "zookeeper:2181"]
      test: ["CMD", "kafka-topics.sh", "--list", "--zookeeper", "zookeeper:2181"]
      interval: 30s
      timeout: 10s
      retries: 10
    depends_on:
      - zookeeper

  postgres:
    container_name: postgres
    image: postgres:latest
    volumes:
      - metadata_data:/var/lib/postgresql/data
    environment:
      - POSTGRES_PASSWORD=FoolishPassword
      - POSTGRES_USER=druid
      - POSTGRES_DB=druid

  coordinator:
    image: apache/druid:0.20.0
    container_name: coordinator
    volumes:
      - ./storage:/opt/data
      - coordinator_var:/opt/druid/var
    depends_on:
      - zookeeper
      - postgres
    ports:
      - "8081:8081"
    command:
      - coordinator
    env_file:
      - environment_tiny/coordinator
      - environment_tiny/common

  broker:
    image: apache/druid:0.20.0
    container_name: broker
    volumes:
      - broker_var:/opt/druid/var
    depends_on:
      - zookeeper
      - postgres
      - coordinator
    ports:
      - "8082:8082"
    command:
      - broker
    env_file:
      - environment_tiny/broker
      - environment_tiny/common

  historical:
    image: apache/druid:0.20.0
    container_name: historical
    volumes:
      - ./storage:/opt/data
      - historical_var:/opt/druid/var
    depends_on:
      - zookeeper
      - postgres
      - coordinator
    ports:
      - "8083:8083"
    command:
      - historical
    env_file:
      - environment_tiny/historical
      - environment_tiny/common

  middlemanager:
    image: apache/druid:0.20.0
    container_name: middlemanager
    volumes:
      - ./storage:/opt/data
      - middle_var:/opt/druid/var
    depends_on:
      - zookeeper
      - postgres
      - coordinator
    ports:
      - "8091:8091"
    command:
      - middleManager
    env_file:
      - environment_tiny/middlemanager
      - environment_tiny/common

  router:
    image: apache/druid:0.20.0
    container_name: router
    volumes:
      - router_var:/opt/druid/var
    depends_on:
      - zookeeper
      - postgres
      - coordinator
    ports:
      - "8888:8888"
    command:
      - router
    env_file:
      - environment_tiny/router
      - environment_tiny/common

  flatten-processor:
    image: signoz/flattener-processor:0.2.0
    container_name: flattener-processor
    depends_on:
      - kafka
      - otel-collector
    ports:
      - "8000:8000"
    environment:
      - KAFKA_BROKER=kafka:9092
      - KAFKA_INPUT_TOPIC=otlp_spans
      - KAFKA_OUTPUT_TOPIC=flattened_spans

  query-service:
    image: signoz.docker.scarf.sh/signoz/query-service:0.2.0
    container_name: query-service
    depends_on:
      - router
    ports:
      - "8080:8080"
    environment:
      - DruidClientUrl=http://router:8888
      - DruidDatasource=flattened_spans
      - POSTHOG_API_KEY=H-htDCae7CR3RV57gUzmol6IAKtm5IMCvbcm_fwnL-w

  frontend:
    image: signoz/frontend:0.2.1
    container_name: frontend
    depends_on:
      - query-service
    links:
      - "query-service"
    ports:
      - "3000:3000"
    volumes:
      - ./nginx-config.conf:/etc/nginx/conf.d/default.conf

  create-supervisor:
    image: theithollow/hollowapp-blog:curl
    container_name: create-supervisor
    command:
      - /bin/sh
      - -c
      - "curl -X POST -H 'Content-Type: application/json' -d @/app/supervisor-spec.json http://router:8888/druid/indexer/v1/supervisor"
    depends_on:
      - router
    restart: on-failure:6
    volumes:
      - ./druid-jobs/supervisor-spec.json:/app/supervisor-spec.json

  set-retention:
image: theithollow/hollowapp-blog:curl
container_name: set-retention
command:
- /bin/sh
- -c
- "curl -X POST -H 'Content-Type: application/json' -d @/app/retention-spec.json http://router:8888/druid/coordinator/v1/rules/flattened_spans"
depends_on:
- router
restart: on-failure:6
volumes:
- ./druid-jobs/retention-spec.json:/app/retention-spec.json
otel-collector:
image: otel/opentelemetry-collector:0.18.0
command: ["--config=/etc/otel-collector-config.yaml", "--mem-ballast-size-mib=683"]
volumes:
- ./otel-collector-config.yaml:/etc/otel-collector-config.yaml
ports:
- "1777:1777" # pprof extension
- "8887:8888" # Prometheus metrics exposed by the agent
- "14268:14268" # Jaeger receiver
- "55678" # OpenCensus receiver
- "55680:55680" # OTLP HTTP/2.0 legacy port
- "55681:55681" # OTLP HTTP/1.0 receiver
- "4317:4317" # OTLP GRPC receiver
- "55679:55679" # zpages extension
- "13133" # health_check
depends_on:
kafka:
condition: service_healthy
hotrod:
image: jaegertracing/example-hotrod:latest
container_name: hotrod
ports:
- "9000:8080"
command: ["all"]
environment:
- JAEGER_ENDPOINT=http://otel-collector:14268/api/traces
load-hotrod:
image: "grubykarol/locust:1.2.3-python3.9-alpine3.12"
container_name: load-hotrod
hostname: load-hotrod
ports:
- "8089:8089"
environment:
ATTACKED_HOST: http://hotrod:8080
LOCUST_MODE: standalone
NO_PROXY: standalone
TASK_DELAY_FROM: 5
TASK_DELAY_TO: 30
QUIET_MODE: "${QUIET_MODE:-false}"
LOCUST_OPTS: "--headless -u 10 -r 1"
volumes:
- ./locust-scripts:/locust

@@ -0,0 +1,259 @@
version: "2.4"
volumes:
metadata_data: {}
middle_var: {}
historical_var: {}
broker_var: {}
coordinator_var: {}
router_var: {}
# If you can connect to Kafka but cannot write to the topic otlp_spans, see the link below
# https://github.com/wurstmeister/kafka-docker/issues/409#issuecomment-428346707
services:
zookeeper:
image: bitnami/zookeeper:3.6.2-debian-10-r100
ports:
- "2181:2181"
environment:
- ALLOW_ANONYMOUS_LOGIN=yes
kafka:
# image: wurstmeister/kafka
image: bitnami/kafka:2.7.0-debian-10-r1
ports:
- "9092:9092"
hostname: kafka
environment:
KAFKA_ADVERTISED_HOST_NAME: kafka
KAFKA_ADVERTISED_PORT: 9092
KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
ALLOW_PLAINTEXT_LISTENER: 'yes'
KAFKA_CFG_AUTO_CREATE_TOPICS_ENABLE: 'true'
KAFKA_TOPICS: 'otlp_spans:1:1,flattened_spans:1:1'
healthcheck:
# test: ["CMD", "kafka-topics.sh", "--create", "--topic", "otlp_spans", "--zookeeper", "zookeeper:2181"]
test: ["CMD", "kafka-topics.sh", "--list", "--zookeeper", "zookeeper:2181"]
interval: 30s
timeout: 10s
retries: 10
depends_on:
- zookeeper
postgres:
container_name: postgres
image: postgres:latest
volumes:
- metadata_data:/var/lib/postgresql/data
environment:
- POSTGRES_PASSWORD=FoolishPassword
- POSTGRES_USER=druid
- POSTGRES_DB=druid
coordinator:
image: apache/druid:0.20.0
container_name: coordinator
volumes:
- ./storage:/opt/druid/deepStorage
- coordinator_var:/opt/druid/data
depends_on:
- zookeeper
- postgres
ports:
- "8081:8081"
command:
- coordinator
env_file:
- environment_small/coordinator
broker:
image: apache/druid:0.20.0
container_name: broker
volumes:
- broker_var:/opt/druid/data
depends_on:
- zookeeper
- postgres
- coordinator
ports:
- "8082:8082"
command:
- broker
env_file:
- environment_small/broker
historical:
image: apache/druid:0.20.0
container_name: historical
volumes:
- ./storage:/opt/druid/deepStorage
- historical_var:/opt/druid/data
depends_on:
- zookeeper
- postgres
- coordinator
ports:
- "8083:8083"
command:
- historical
env_file:
- environment_small/historical
middlemanager:
image: apache/druid:0.20.0
container_name: middlemanager
volumes:
- ./storage:/opt/druid/deepStorage
- middle_var:/opt/druid/data
depends_on:
- zookeeper
- postgres
- coordinator
ports:
- "8091:8091"
command:
- middleManager
env_file:
- environment_small/middlemanager
router:
image: apache/druid:0.20.0
container_name: router
volumes:
- router_var:/opt/druid/data
depends_on:
- zookeeper
- postgres
- coordinator
ports:
- "8888:8888"
command:
- router
env_file:
- environment_small/router
flatten-processor:
image: signoz/flattener-processor:0.2.0
container_name: flattener-processor
depends_on:
- kafka
- otel-collector
ports:
- "8000:8000"
environment:
- KAFKA_BROKER=kafka:9092
- KAFKA_INPUT_TOPIC=otlp_spans
- KAFKA_OUTPUT_TOPIC=flattened_spans
query-service:
image: signoz.docker.scarf.sh/signoz/query-service:0.2.0
container_name: query-service
depends_on:
- router
ports:
- "8080:8080"
environment:
- DruidClientUrl=http://router:8888
- DruidDatasource=flattened_spans
- POSTHOG_API_KEY=H-htDCae7CR3RV57gUzmol6IAKtm5IMCvbcm_fwnL-w
frontend:
image: signoz/frontend:0.2.1
container_name: frontend
depends_on:
- query-service
links:
- "query-service"
ports:
- "3000:3000"
volumes:
- ./nginx-config.conf:/etc/nginx/conf.d/default.conf
create-supervisor:
image: theithollow/hollowapp-blog:curl
container_name: create-supervisor
command:
- /bin/sh
- -c
- "curl -X POST -H 'Content-Type: application/json' -d @/app/supervisor-spec.json http://router:8888/druid/indexer/v1/supervisor"
depends_on:
- router
restart: on-failure:6
volumes:
- ./druid-jobs/supervisor-spec.json:/app/supervisor-spec.json
set-retention:
image: theithollow/hollowapp-blog:curl
container_name: set-retention
command:
- /bin/sh
- -c
- "curl -X POST -H 'Content-Type: application/json' -d @/app/retention-spec.json http://router:8888/druid/coordinator/v1/rules/flattened_spans"
depends_on:
- router
restart: on-failure:6
volumes:
- ./druid-jobs/retention-spec.json:/app/retention-spec.json
otel-collector:
image: otel/opentelemetry-collector:0.18.0
command: ["--config=/etc/otel-collector-config.yaml", "--mem-ballast-size-mib=683"]
volumes:
- ./otel-collector-config.yaml:/etc/otel-collector-config.yaml
ports:
- "1777:1777" # pprof extension
- "8887:8888" # Prometheus metrics exposed by the agent
- "14268:14268" # Jaeger receiver
- "55678" # OpenCensus receiver
- "55680:55680" # OTLP HTTP/2.0 legacy grpc receiver
- "55681:55681" # OTLP HTTP/1.0 receiver
- "4317:4317" # OTLP GRPC receiver
- "55679:55679" # zpages extension
- "13133" # health_check
depends_on:
kafka:
condition: service_healthy
hotrod:
image: jaegertracing/example-hotrod:latest
container_name: hotrod
ports:
- "9000:8080"
command: ["all"]
environment:
- JAEGER_ENDPOINT=http://otel-collector:14268/api/traces
load-hotrod:
image: "grubykarol/locust:1.2.3-python3.9-alpine3.12"
container_name: load-hotrod
hostname: load-hotrod
ports:
- "8089:8089"
environment:
ATTACKED_HOST: http://hotrod:8080
LOCUST_MODE: standalone
NO_PROXY: standalone
TASK_DELAY_FROM: 5
TASK_DELAY_TO: 30
QUIET_MODE: "${QUIET_MODE:-false}"
LOCUST_OPTS: "--headless -u 10 -r 1"
volumes:
- ./locust-scripts:/locust

@@ -0,0 +1 @@
[{"period":"P3D","includeFuture":true,"tieredReplicants":{"_default_tier":1},"type":"loadByPeriod"},{"type":"dropForever"}]
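The retention rule above tells Druid to load segments covering the trailing 3 days (ISO-8601 period `P3D`, including future timestamps) with a single replica in the default tier, and to drop everything older. A minimal Python sketch of how that period string decodes (the parsing helper is illustrative, not part of this repo):

```python
from datetime import timedelta

def parse_day_period(period: str) -> timedelta:
    # Decode a day-granularity ISO-8601 duration such as "P3D".
    # Druid accepts richer period strings; this illustrative helper
    # only handles the "P<n>D" shape used in the retention spec above.
    if not (period.startswith("P") and period.endswith("D")):
        raise ValueError(f"unsupported period: {period}")
    return timedelta(days=int(period[1:-1]))

# Segments newer than now - 3 days are kept; older ones are dropped.
print(parse_day_period("P3D"))  # 3 days, 0:00:00
```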

@@ -0,0 +1,69 @@
{
"type": "kafka",
"dataSchema": {
"dataSource": "flattened_spans",
"parser": {
"type": "string",
"parseSpec": {
"format": "json",
"timestampSpec": {
"column": "StartTimeUnixNano",
"format": "nano"
},
"dimensionsSpec": {
"dimensions": [
"TraceId",
"SpanId",
"ParentSpanId",
"Name",
"ServiceName",
"References",
"Tags",
"ExternalHttpMethod",
"ExternalHttpUrl",
"Component",
"DBSystem",
"DBName",
"DBOperation",
"PeerService",
{
"type": "string",
"name": "TagsKeys",
"multiValueHandling": "ARRAY"
},
{
"type": "string",
"name": "TagsValues",
"multiValueHandling": "ARRAY"
},
{ "name": "DurationNano", "type": "Long" },
{ "name": "Kind", "type": "int" },
{ "name": "StatusCode", "type": "int" }
]
}
}
},
"metricsSpec" : [
{ "type": "quantilesDoublesSketch", "name": "QuantileDuration", "fieldName": "DurationNano" }
],
"granularitySpec": {
"type": "uniform",
"segmentGranularity": "DAY",
"queryGranularity": "NONE",
"rollup": false
}
},
"tuningConfig": {
"type": "kafka",
"reportParseExceptions": true
},
"ioConfig": {
"topic": "flattened_spans",
"replicas": 1,
"taskDuration": "PT20M",
"completionTimeout": "PT30M",
"consumerProperties": {
"bootstrap.servers": "kafka:9092"
}
}
}
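In the `timestampSpec` above, `"format": "nano"` means Druid reads the `StartTimeUnixNano` column as nanoseconds since the Unix epoch. A quick Python sketch of the same conversion, handy for sanity-checking span timestamps by hand:

```python
from datetime import datetime, timezone

def nano_to_utc(ns: int) -> datetime:
    # StartTimeUnixNano is nanoseconds since the Unix epoch; Python
    # datetimes carry only microsecond precision, so sub-microsecond
    # detail is discarded here.
    return datetime.fromtimestamp(ns / 1_000_000_000, tz=timezone.utc)

print(nano_to_utc(1_619_980_800_000_000_000))  # 2021-05-02 18:40:00+00:00
```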

@@ -0,0 +1,53 @@
#
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
#
# Java tuning
DRUID_XMX=512m
DRUID_XMS=512m
DRUID_MAXNEWSIZE=256m
DRUID_NEWSIZE=256m
DRUID_MAXDIRECTMEMORYSIZE=768m
druid_emitter_logging_logLevel=debug
druid_extensions_loadList=["druid-histogram", "druid-datasketches", "druid-lookups-cached-global", "postgresql-metadata-storage", "druid-kafka-indexing-service"]
druid_zk_service_host=zookeeper
druid_metadata_storage_host=
druid_metadata_storage_type=postgresql
druid_metadata_storage_connector_connectURI=jdbc:postgresql://postgres:5432/druid
druid_metadata_storage_connector_user=druid
druid_metadata_storage_connector_password=FoolishPassword
druid_coordinator_balancer_strategy=cachingCost
druid_indexer_runner_javaOptsArray=["-server", "-Xms512m", "-Xmx512m", "-XX:MaxDirectMemorySize=768m", "-Duser.timezone=UTC", "-Dfile.encoding=UTF-8", "-Djava.util.logging.manager=org.apache.logging.log4j.jul.LogManager"]
druid_indexer_fork_property_druid_processing_buffer_sizeBytes=25000000
druid_processing_buffer_sizeBytes=100MiB
druid_storage_type=local
druid_storage_storageDirectory=/opt/druid/deepStorage
druid_indexer_logs_type=file
druid_indexer_logs_directory=/opt/druid/data/indexing-logs
druid_processing_numThreads=1
druid_processing_numMergeBuffers=2
DRUID_LOG4J=<?xml version="1.0" encoding="UTF-8" ?><Configuration status="WARN"><Appenders><Console name="Console" target="SYSTEM_OUT"><PatternLayout pattern="%d{ISO8601} %p [%t] %c - %m%n"/></Console></Appenders><Loggers><Root level="info"><AppenderRef ref="Console"/></Root><Logger name="org.apache.druid.jetty.RequestLog" additivity="false" level="DEBUG"><AppenderRef ref="Console"/></Logger></Loggers></Configuration>
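A Druid process's memory footprint is roughly its heap (`DRUID_XMX`) plus direct memory (`DRUID_MAXDIRECTMEMORYSIZE`). A small sketch for turning the JVM size strings above into a per-process estimate (the helper is illustrative, not part of this repo):

```python
def jvm_size_to_bytes(size: str) -> int:
    # Convert JVM-style size strings ("512m", "1g") to bytes.
    units = {"k": 1 << 10, "m": 1 << 20, "g": 1 << 30}
    return int(size[:-1]) * units[size[-1].lower()]

# Rough footprint for this profile: 512m heap + 768m direct memory.
total = jvm_size_to_bytes("512m") + jvm_size_to_bytes("768m")
print(total // (1 << 20))  # 1280 (MiB)
```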

@@ -0,0 +1,52 @@
#
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
#
# Java tuning
DRUID_XMX=64m
DRUID_XMS=64m
DRUID_MAXNEWSIZE=256m
DRUID_NEWSIZE=256m
DRUID_MAXDIRECTMEMORYSIZE=400m
druid_emitter_logging_logLevel=debug
druid_extensions_loadList=["druid-histogram", "druid-datasketches", "druid-lookups-cached-global", "postgresql-metadata-storage", "druid-kafka-indexing-service"]
druid_zk_service_host=zookeeper
druid_metadata_storage_host=
druid_metadata_storage_type=postgresql
druid_metadata_storage_connector_connectURI=jdbc:postgresql://postgres:5432/druid
druid_metadata_storage_connector_user=druid
druid_metadata_storage_connector_password=FoolishPassword
druid_coordinator_balancer_strategy=cachingCost
druid_indexer_runner_javaOptsArray=["-server", "-Xms64m", "-Xmx64m", "-XX:MaxDirectMemorySize=400m", "-Duser.timezone=UTC", "-Dfile.encoding=UTF-8", "-Djava.util.logging.manager=org.apache.logging.log4j.jul.LogManager"]
druid_indexer_fork_property_druid_processing_buffer_sizeBytes=25000000
druid_storage_type=local
druid_storage_storageDirectory=/opt/druid/deepStorage
druid_indexer_logs_type=file
druid_indexer_logs_directory=/opt/druid/data/indexing-logs
druid_processing_numThreads=1
druid_processing_numMergeBuffers=2
DRUID_LOG4J=<?xml version="1.0" encoding="UTF-8" ?><Configuration status="WARN"><Appenders><Console name="Console" target="SYSTEM_OUT"><PatternLayout pattern="%d{ISO8601} %p [%t] %c - %m%n"/></Console></Appenders><Loggers><Root level="info"><AppenderRef ref="Console"/></Root><Logger name="org.apache.druid.jetty.RequestLog" additivity="false" level="DEBUG"><AppenderRef ref="Console"/></Logger></Loggers></Configuration>

@@ -0,0 +1,53 @@
#
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
#
# Java tuning
DRUID_XMX=512m
DRUID_XMS=512m
DRUID_MAXNEWSIZE=256m
DRUID_NEWSIZE=256m
DRUID_MAXDIRECTMEMORYSIZE=1280m
druid_emitter_logging_logLevel=debug
druid_extensions_loadList=["druid-histogram", "druid-datasketches", "druid-lookups-cached-global", "postgresql-metadata-storage", "druid-kafka-indexing-service"]
druid_zk_service_host=zookeeper
druid_metadata_storage_host=
druid_metadata_storage_type=postgresql
druid_metadata_storage_connector_connectURI=jdbc:postgresql://postgres:5432/druid
druid_metadata_storage_connector_user=druid
druid_metadata_storage_connector_password=FoolishPassword
druid_coordinator_balancer_strategy=cachingCost
druid_indexer_runner_javaOptsArray=["-server", "-Xms512m", "-Xmx512m", "-XX:MaxDirectMemorySize=1280m", "-Duser.timezone=UTC", "-Dfile.encoding=UTF-8", "-Djava.util.logging.manager=org.apache.logging.log4j.jul.LogManager"]
druid_indexer_fork_property_druid_processing_buffer_sizeBytes=25000000
druid_processing_buffer_sizeBytes=200MiB
druid_storage_type=local
druid_storage_storageDirectory=/opt/druid/deepStorage
druid_indexer_logs_type=file
druid_indexer_logs_directory=/opt/druid/data/indexing-logs
druid_processing_numThreads=2
druid_processing_numMergeBuffers=2
DRUID_LOG4J=<?xml version="1.0" encoding="UTF-8" ?><Configuration status="WARN"><Appenders><Console name="Console" target="SYSTEM_OUT"><PatternLayout pattern="%d{ISO8601} %p [%t] %c - %m%n"/></Console></Appenders><Loggers><Root level="info"><AppenderRef ref="Console"/></Root><Logger name="org.apache.druid.jetty.RequestLog" additivity="false" level="DEBUG"><AppenderRef ref="Console"/></Logger></Loggers></Configuration>

@@ -0,0 +1,53 @@
#
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
#
# Java tuning
DRUID_XMX=1g
DRUID_XMS=1g
DRUID_MAXNEWSIZE=256m
DRUID_NEWSIZE=256m
DRUID_MAXDIRECTMEMORYSIZE=2g
druid_emitter_logging_logLevel=debug
druid_extensions_loadList=["druid-histogram", "druid-datasketches", "druid-lookups-cached-global", "postgresql-metadata-storage", "druid-kafka-indexing-service"]
druid_zk_service_host=zookeeper
druid_metadata_storage_host=
druid_metadata_storage_type=postgresql
druid_metadata_storage_connector_connectURI=jdbc:postgresql://postgres:5432/druid
druid_metadata_storage_connector_user=druid
druid_metadata_storage_connector_password=FoolishPassword
druid_coordinator_balancer_strategy=cachingCost
druid_indexer_runner_javaOptsArray=["-server", "-Xms1g", "-Xmx1g", "-XX:MaxDirectMemorySize=2g", "-Duser.timezone=UTC", "-Dfile.encoding=UTF-8", "-Djava.util.logging.manager=org.apache.logging.log4j.jul.LogManager"]
druid_indexer_fork_property_druid_processing_buffer_sizeBytes=25000000
druid_processing_buffer_sizeBytes=200MiB
druid_storage_type=local
druid_storage_storageDirectory=/opt/druid/deepStorage
druid_indexer_logs_type=file
druid_indexer_logs_directory=/opt/druid/data/indexing-logs
druid_processing_numThreads=2
druid_processing_numMergeBuffers=2
DRUID_LOG4J=<?xml version="1.0" encoding="UTF-8" ?><Configuration status="WARN"><Appenders><Console name="Console" target="SYSTEM_OUT"><PatternLayout pattern="%d{ISO8601} %p [%t] %c - %m%n"/></Console></Appenders><Loggers><Root level="info"><AppenderRef ref="Console"/></Root><Logger name="org.apache.druid.jetty.RequestLog" additivity="false" level="DEBUG"><AppenderRef ref="Console"/></Logger></Loggers></Configuration>

@@ -0,0 +1,52 @@
#
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
#
# Java tuning
DRUID_XMX=128m
DRUID_XMS=128m
DRUID_MAXNEWSIZE=256m
DRUID_NEWSIZE=256m
DRUID_MAXDIRECTMEMORYSIZE=128m
druid_emitter_logging_logLevel=debug
druid_extensions_loadList=["druid-histogram", "druid-datasketches", "druid-lookups-cached-global", "postgresql-metadata-storage", "druid-kafka-indexing-service"]
druid_zk_service_host=zookeeper
druid_metadata_storage_host=
druid_metadata_storage_type=postgresql
druid_metadata_storage_connector_connectURI=jdbc:postgresql://postgres:5432/druid
druid_metadata_storage_connector_user=druid
druid_metadata_storage_connector_password=FoolishPassword
druid_coordinator_balancer_strategy=cachingCost
druid_indexer_runner_javaOptsArray=["-server", "-Xms128m", "-Xmx128m", "-XX:MaxDirectMemorySize=128m", "-Duser.timezone=UTC", "-Dfile.encoding=UTF-8", "-Djava.util.logging.manager=org.apache.logging.log4j.jul.LogManager"]
druid_indexer_fork_property_druid_processing_buffer_sizeBytes=25000000
druid_storage_type=local
druid_storage_storageDirectory=/opt/druid/deepStorage
druid_indexer_logs_type=file
druid_indexer_logs_directory=/opt/druid/data/indexing-logs
druid_processing_numThreads=1
druid_processing_numMergeBuffers=2
DRUID_LOG4J=<?xml version="1.0" encoding="UTF-8" ?><Configuration status="WARN"><Appenders><Console name="Console" target="SYSTEM_OUT"><PatternLayout pattern="%d{ISO8601} %p [%t] %c - %m%n"/></Console></Appenders><Loggers><Root level="info"><AppenderRef ref="Console"/></Root><Logger name="org.apache.druid.jetty.RequestLog" additivity="false" level="DEBUG"><AppenderRef ref="Console"/></Logger></Loggers></Configuration>

@@ -0,0 +1,52 @@
#
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
#
# Java tuning
DRUID_XMX=512m
DRUID_XMS=512m
DRUID_MAXNEWSIZE=256m
DRUID_NEWSIZE=256m
DRUID_MAXDIRECTMEMORYSIZE=400m
druid_emitter_logging_logLevel=debug
# druid_extensions_loadList=["druid-histogram", "druid-datasketches", "druid-lookups-cached-global", "postgresql-metadata-storage", "druid-kafka-indexing-service"]
druid_zk_service_host=zookeeper
druid_metadata_storage_host=
druid_metadata_storage_type=postgresql
druid_metadata_storage_connector_connectURI=jdbc:postgresql://postgres:5432/druid
druid_metadata_storage_connector_user=druid
druid_metadata_storage_connector_password=FoolishPassword
druid_coordinator_balancer_strategy=cachingCost
druid_indexer_runner_javaOptsArray=["-server", "-Xms512m", "-Xmx512m", "-XX:MaxDirectMemorySize=400m", "-Duser.timezone=UTC", "-Dfile.encoding=UTF-8", "-Djava.util.logging.manager=org.apache.logging.log4j.jul.LogManager"]
druid_indexer_fork_property_druid_processing_buffer_sizeBytes=25000000
druid_processing_buffer_sizeBytes=50MiB
druid_processing_numThreads=1
druid_processing_numMergeBuffers=2
DRUID_LOG4J=<?xml version="1.0" encoding="UTF-8" ?><Configuration status="WARN"><Appenders><Console name="Console" target="SYSTEM_OUT"><PatternLayout pattern="%d{ISO8601} %p [%t] %c - %m%n"/></Console></Appenders><Loggers><Root level="info"><AppenderRef ref="Console"/></Root><Logger name="org.apache.druid.jetty.RequestLog" additivity="false" level="DEBUG"><AppenderRef ref="Console"/></Logger></Loggers></Configuration>

@@ -0,0 +1,26 @@
# For S3 storage
# druid_extensions_loadList=["druid-histogram", "druid-datasketches", "druid-lookups-cached-global", "postgresql-metadata-storage", "druid-kafka-indexing-service", "druid-s3-extensions"]
# druid_storage_type=s3
# druid_storage_bucket=<s3-bucket-name>
# druid_storage_baseKey=druid/segments
# AWS_ACCESS_KEY_ID=<s3-access-id>
# AWS_SECRET_ACCESS_KEY=<s3-access-key>
# AWS_REGION=<s3-aws-region>
# druid_indexer_logs_type=s3
# druid_indexer_logs_s3Bucket=<s3-bucket-name>
# druid_indexer_logs_s3Prefix=druid/indexing-logs
# -----------------------------------------------------------
# For local storage
druid_extensions_loadList=["druid-histogram", "druid-datasketches", "druid-lookups-cached-global", "postgresql-metadata-storage", "druid-kafka-indexing-service"]
druid_storage_type=local
druid_storage_storageDirectory=/opt/data/segments
druid_indexer_logs_type=file
druid_indexer_logs_directory=/opt/data/indexing-logs

@@ -0,0 +1,49 @@
#
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
#
# Java tuning
DRUID_XMX=64m
DRUID_XMS=64m
DRUID_MAXNEWSIZE=256m
DRUID_NEWSIZE=256m
DRUID_MAXDIRECTMEMORYSIZE=400m
druid_emitter_logging_logLevel=debug
# druid_extensions_loadList=["druid-histogram", "druid-datasketches", "druid-lookups-cached-global", "postgresql-metadata-storage", "druid-kafka-indexing-service"]
druid_zk_service_host=zookeeper
druid_metadata_storage_host=
druid_metadata_storage_type=postgresql
druid_metadata_storage_connector_connectURI=jdbc:postgresql://postgres:5432/druid
druid_metadata_storage_connector_user=druid
druid_metadata_storage_connector_password=FoolishPassword
druid_coordinator_balancer_strategy=cachingCost
druid_indexer_runner_javaOptsArray=["-server", "-Xms64m", "-Xmx64m", "-XX:MaxDirectMemorySize=400m", "-Duser.timezone=UTC", "-Dfile.encoding=UTF-8", "-Djava.util.logging.manager=org.apache.logging.log4j.jul.LogManager"]
druid_indexer_fork_property_druid_processing_buffer_sizeBytes=25000000
druid_processing_numThreads=1
druid_processing_numMergeBuffers=2
DRUID_LOG4J=<?xml version="1.0" encoding="UTF-8" ?><Configuration status="WARN"><Appenders><Console name="Console" target="SYSTEM_OUT"><PatternLayout pattern="%d{ISO8601} %p [%t] %c - %m%n"/></Console></Appenders><Loggers><Root level="info"><AppenderRef ref="Console"/></Root><Logger name="org.apache.druid.jetty.RequestLog" additivity="false" level="DEBUG"><AppenderRef ref="Console"/></Logger></Loggers></Configuration>

@@ -0,0 +1,49 @@
#
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
#
# Java tuning
DRUID_XMX=512m
DRUID_XMS=512m
DRUID_MAXNEWSIZE=256m
DRUID_NEWSIZE=256m
DRUID_MAXDIRECTMEMORYSIZE=400m
druid_emitter_logging_logLevel=debug
# druid_extensions_loadList=["druid-histogram", "druid-datasketches", "druid-lookups-cached-global", "postgresql-metadata-storage", "druid-kafka-indexing-service"]
druid_zk_service_host=zookeeper
druid_metadata_storage_host=
druid_metadata_storage_type=postgresql
druid_metadata_storage_connector_connectURI=jdbc:postgresql://postgres:5432/druid
druid_metadata_storage_connector_user=druid
druid_metadata_storage_connector_password=FoolishPassword
druid_coordinator_balancer_strategy=cachingCost
druid_indexer_runner_javaOptsArray=["-server", "-Xms512m", "-Xmx512m", "-XX:MaxDirectMemorySize=400m", "-Duser.timezone=UTC", "-Dfile.encoding=UTF-8", "-Djava.util.logging.manager=org.apache.logging.log4j.jul.LogManager"]
druid_indexer_fork_property_druid_processing_buffer_sizeBytes=25000000
druid_processing_buffer_sizeBytes=50MiB
druid_processing_numThreads=1
druid_processing_numMergeBuffers=2
DRUID_LOG4J=<?xml version="1.0" encoding="UTF-8" ?><Configuration status="WARN"><Appenders><Console name="Console" target="SYSTEM_OUT"><PatternLayout pattern="%d{ISO8601} %p [%t] %c - %m%n"/></Console></Appenders><Loggers><Root level="info"><AppenderRef ref="Console"/></Root><Logger name="org.apache.druid.jetty.RequestLog" additivity="false" level="DEBUG"><AppenderRef ref="Console"/></Logger></Loggers></Configuration>

@@ -0,0 +1,50 @@
#
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
#
# Java tuning
DRUID_XMX=64m
DRUID_XMS=64m
DRUID_MAXNEWSIZE=256m
DRUID_NEWSIZE=256m
DRUID_MAXDIRECTMEMORYSIZE=400m
druid_emitter_logging_logLevel=debug
# druid_extensions_loadList=["druid-histogram", "druid-datasketches", "druid-lookups-cached-global", "postgresql-metadata-storage", "druid-kafka-indexing-service"]
druid_zk_service_host=zookeeper
druid_metadata_storage_host=
druid_metadata_storage_type=postgresql
druid_metadata_storage_connector_connectURI=jdbc:postgresql://postgres:5432/druid
druid_metadata_storage_connector_user=druid
druid_metadata_storage_connector_password=FoolishPassword
druid_coordinator_balancer_strategy=cachingCost
druid_indexer_runner_javaOptsArray=["-server", "-Xms256m", "-Xmx256m", "-XX:MaxDirectMemorySize=400m", "-Duser.timezone=UTC", "-Dfile.encoding=UTF-8", "-Djava.util.logging.manager=org.apache.logging.log4j.jul.LogManager"]
druid_indexer_fork_property_druid_processing_buffer_sizeBytes=25000000
druid_processing_numThreads=1
druid_processing_numMergeBuffers=2
DRUID_LOG4J=<?xml version="1.0" encoding="UTF-8" ?><Configuration status="WARN"><Appenders><Console name="Console" target="SYSTEM_OUT"><PatternLayout pattern="%d{ISO8601} %p [%t] %c - %m%n"/></Console></Appenders><Loggers><Root level="info"><AppenderRef ref="Console"/></Root><Logger name="org.apache.druid.jetty.RequestLog" additivity="false" level="DEBUG"><AppenderRef ref="Console"/></Logger></Loggers></Configuration>


@@ -0,0 +1,49 @@
#
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
#
# Java tuning
DRUID_XMX=64m
DRUID_XMS=64m
DRUID_MAXNEWSIZE=256m
DRUID_NEWSIZE=256m
DRUID_MAXDIRECTMEMORYSIZE=128m
druid_emitter_logging_logLevel=debug
# druid_extensions_loadList=["druid-histogram", "druid-datasketches", "druid-lookups-cached-global", "postgresql-metadata-storage", "druid-kafka-indexing-service"]
druid_zk_service_host=zookeeper
druid_metadata_storage_host=
druid_metadata_storage_type=postgresql
druid_metadata_storage_connector_connectURI=jdbc:postgresql://postgres:5432/druid
druid_metadata_storage_connector_user=druid
druid_metadata_storage_connector_password=FoolishPassword
druid_coordinator_balancer_strategy=cachingCost
druid_indexer_runner_javaOptsArray=["-server", "-Xms64m", "-Xmx64m", "-XX:MaxDirectMemorySize=128m", "-Duser.timezone=UTC", "-Dfile.encoding=UTF-8", "-Djava.util.logging.manager=org.apache.logging.log4j.jul.LogManager"]
druid_indexer_fork_property_druid_processing_buffer_sizeBytes=25000000
druid_processing_numThreads=1
druid_processing_numMergeBuffers=2
DRUID_LOG4J=<?xml version="1.0" encoding="UTF-8" ?><Configuration status="WARN"><Appenders><Console name="Console" target="SYSTEM_OUT"><PatternLayout pattern="%d{ISO8601} %p [%t] %c - %m%n"/></Console></Appenders><Loggers><Root level="info"><AppenderRef ref="Console"/></Root><Logger name="org.apache.druid.jetty.RequestLog" additivity="false" level="DEBUG"><AppenderRef ref="Console"/></Logger></Loggers></Configuration>


@@ -0,0 +1,16 @@
from locust import HttpUser, task, between
class UserTasks(HttpUser):
wait_time = between(5, 15)
@task
def rachel(self):
self.client.get("/dispatch?customer=123&nonse=0.6308392664170006")
@task
def trom(self):
self.client.get("/dispatch?customer=392&nonse=0.015296363321630757")
@task
def japanese(self):
self.client.get("/dispatch?customer=731&nonse=0.8022286220408668")
@task
def coffee(self):
self.client.get("/dispatch?customer=567&nonse=0.0022220379420636593")
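For context, Locust picks one `@task` method uniformly at random after each wait drawn from `between(5, 15)`. A minimal sketch of that scheduling model, with the endpoint paths taken from the file above and everything else purely illustrative:

```python
import random

# The four dispatch endpoints exercised by the locustfile above.
ENDPOINTS = [
    "/dispatch?customer=123&nonse=0.6308392664170006",
    "/dispatch?customer=392&nonse=0.015296363321630757",
    "/dispatch?customer=731&nonse=0.8022286220408668",
    "/dispatch?customer=567&nonse=0.0022220379420636593",
]

def next_request(rng: random.Random):
    """Pick the next endpoint (uniformly, as Locust does for equal-weight
    tasks) and a wait time drawn from between(5, 15) seconds."""
    return rng.choice(ENDPOINTS), rng.uniform(5, 15)
```

To drive the target for real, the file would be run with something like `locust -f locustfile.py --host <demo-app-url>` (filename and host are assumptions, not part of the diff).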


@@ -0,0 +1,20 @@
server {
listen 3000;
server_name _;
location / {
root /usr/share/nginx/html;
index index.html index.htm;
try_files $uri $uri/ /index.html;
}
location /api {
proxy_pass http://query-service:8080/api;
}
# redirect server error pages to the static page /50x.html
#
error_page 500 502 503 504 /50x.html;
location = /50x.html {
root /usr/share/nginx/html;
}
}
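The server block above serves the SPA with an `/index.html` fallback and proxies `/api` to the query service. A rough Python model of that routing decision, purely illustrative and not part of the deployment:

```python
def route(path: str, static_files: set) -> str:
    """Mimic the nginx rules above: /api* is proxied to query-service,
    an existing static file is served directly, and everything else
    falls back to /index.html (try_files $uri $uri/ /index.html)."""
    if path == "/api" or path.startswith("/api/"):
        return "proxy:http://query-service:8080" + path
    if path in static_files:
        return "static:" + path
    return "static:/index.html"
```

The fallback is what lets client-side routes like `/application` reload without a 404.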


@@ -0,0 +1,51 @@
receivers:
otlp:
protocols:
grpc:
http:
jaeger:
protocols:
grpc:
thrift_http:
processors:
batch:
send_batch_size: 1000
timeout: 10s
memory_limiter:
# Same as --mem-ballast-size-mib CLI argument
ballast_size_mib: 683
# 80% of maximum memory up to 2G
limit_mib: 1500
# 25% of limit up to 2G
spike_limit_mib: 512
check_interval: 5s
queued_retry:
num_workers: 4
queue_size: 100
retry_on_failure: true
extensions:
health_check: {}
zpages: {}
exporters:
kafka/traces:
brokers:
- kafka:9092
topic: 'otlp_spans'
protocol_version: 2.0.0
kafka/metrics:
brokers:
- kafka:9092
topic: 'otlp_metrics'
protocol_version: 2.0.0
service:
extensions: [health_check, zpages]
pipelines:
traces:
receivers: [jaeger, otlp]
processors: [memory_limiter, batch, queued_retry]
exporters: [kafka/traces]
metrics:
receivers: [otlp]
processors: [batch]
exporters: [kafka/metrics]
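A common failure mode in collector configs like the one above is a pipeline referencing a component that was never declared. A small sanity-check sketch, with the relevant wiring expressed as a Python dict so no YAML dependency is assumed:

```python
# Declared components and pipeline wiring, mirroring the config above.
CONFIG = {
    "receivers": {"otlp", "jaeger"},
    "processors": {"batch", "memory_limiter", "queued_retry"},
    "exporters": {"kafka/traces", "kafka/metrics"},
    "pipelines": {
        "traces": {
            "receivers": ["jaeger", "otlp"],
            "processors": ["memory_limiter", "batch", "queued_retry"],
            "exporters": ["kafka/traces"],
        },
        "metrics": {
            "receivers": ["otlp"],
            "processors": ["batch"],
            "exporters": ["kafka/metrics"],
        },
    },
}

def undeclared_components(config: dict) -> list:
    """Return pipeline references that point at components never declared
    in the top-level receivers/processors/exporters sections."""
    missing = []
    for name, pipeline in config["pipelines"].items():
        for kind in ("receivers", "processors", "exporters"):
            for ref in pipeline[kind]:
                if ref not in config[kind]:
                    missing.append(f"{name}.{kind}: {ref}")
    return missing
```

The collector performs the equivalent validation at startup; this sketch just makes the rule explicit.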

deploy/install.k8s.sh Normal file

@@ -0,0 +1,256 @@
#!/bin/bash
set -o errexit
is_command_present() {
type "$1" >/dev/null 2>&1
}
is_mac() {
[[ $OSTYPE == darwin* ]]
}
check_k8s_setup() {
echo "Checking your k8s setup status"
if ! is_command_present kubectl; then
echo "Please install kubectl on your machine"
exit 1
else
if ! is_command_present jq; then
install_jq
fi
clusters=`kubectl config view -o json | jq -r '."current-context"'`
if [[ ! -n $clusters ]]; then
echo "Please setup a k8s cluster & config kubectl to connect to it"
exit 1
fi
k8s_minor_version=`kubectl version --short -o json | jq ."serverVersion.minor" | sed 's/[^0-9]*//g'`
# if [[ $k8s_minor_version < 18 ]]; then
# echo "+++++++++++ ERROR ++++++++++++++++++++++"
# echo "SigNoz deployments require Kubernetes >= v1.18. Found version: v1.$k8s_minor_version"
# echo "+++++++++++ ++++++++++++++++++++++++++++"
# exit 1
# fi;
fi
}
install_jq(){
if [ $package_manager == "brew" ]; then
brew install jq
elif [ $package_manager == "yum" ]; then
yum_cmd="sudo yum --assumeyes --quiet"
$yum_cmd install jq
else
apt_cmd="sudo apt-get --yes --quiet"
$apt_cmd update
$apt_cmd install jq
fi
}
check_os() {
if is_mac; then
package_manager="brew"
desired_os=1
os="Mac"
return
fi
os_name="$(cat /etc/*-release | awk -F= '$1 == "NAME" { gsub(/"/, ""); print $2; exit }')"
case "$os_name" in
Ubuntu*)
desired_os=1
os="ubuntu"
package_manager="apt-get"
;;
Debian*)
desired_os=1
os="debian"
package_manager="apt-get"
;;
Red\ Hat*)
desired_os=1
os="red hat"
package_manager="yum"
;;
CentOS*)
desired_os=1
os="centos"
package_manager="yum"
;;
*)
desired_os=0
os="Not Found"
esac
}
echo_contact_support() {
echo "Please contact <support@signoz.io> with your OS details and version${1:-.}"
}
bye() { # Prints a friendly good bye message and exits the script.
set +o errexit
echo "Please share your email to receive support with the installation"
read -rp 'Email: ' email
while [[ $email == "" ]]
do
read -rp 'Email: ' email
done
DATA='{ "api_key": "H-htDCae7CR3RV57gUzmol6IAKtm5IMCvbcm_fwnL-w", "type": "capture", "event": "Installation Support", "distinct_id": "'"$SIGNOZ_INSTALLATION_ID"'", "properties": { "os": "'"$os"'", "email": "'"$email"'", "platform": "k8s", "k8s_minor_version": "'"$k8s_minor_version"'" } }'
URL="https://app.posthog.com/capture"
HEADER="Content-Type: application/json"
if has_curl; then
curl -sfL -d "$DATA" --header "$HEADER" "$URL" > /dev/null 2>&1
elif has_wget; then
wget -q --post-data="$DATA" --header="$HEADER" "$URL" > /dev/null 2>&1
fi
echo -e "\nExiting for now. Bye! \U1F44B\n"
exit 1
}
deploy_app() {
kubectl apply -f "$install_dir/config-template"
kubectl apply -f "$install_dir"
}
wait_for_application_start() {
local timeout=$1
address=$custom_domain
if [[ "$ssl_enable" == "true" ]]; then
protocol="https"
else
protocol="http"
fi
# The while loop is important because for-loops don't work for dynamic values
while [[ $timeout -gt 0 ]]; do
if [[ $address == "" || $address == null ]]; then
address=`kubectl get ingress appsmith-ingress -o json | jq -r '.status.loadBalancer.ingress[0].ip'`
fi
status_code="$(curl -s -o /dev/null -w "%{http_code}" $protocol://$address/api/v1 || true)"
if [[ $status_code -eq 401 ]]; then
break
else
echo -ne "Waiting for all containers to start. This check will timeout in $timeout seconds...\r\c"
fi
((timeout--))
sleep 1
done
echo ""
}
echo -e "👋 Thank you for trying out SigNoz! "
echo ""
# Checking OS and assigning package manager
desired_os=0
os=""
echo -e "🕵️ Detecting your OS"
check_os
SIGNOZ_INSTALLATION_ID=$(curl -s 'https://api64.ipify.org')
# Run bye if failure happens
trap bye EXIT
DATA='{ "api_key": "H-htDCae7CR3RV57gUzmol6IAKtm5IMCvbcm_fwnL-w", "type": "capture", "event": "Installation Started", "distinct_id": "'"$SIGNOZ_INSTALLATION_ID"'", "properties": { "os": "'"$os"'", "platform": "k8s", "k8s_minor_version": "'"$k8s_minor_version"'" } }'
URL="https://app.posthog.com/capture"
HEADER="Content-Type: application/json"
if has_curl; then
curl -sfL -d "$DATA" --header "$HEADER" "$URL" > /dev/null 2>&1
elif has_wget; then
wget -q --post-data="$DATA" --header="$HEADER" "$URL" > /dev/null 2>&1
fi
# Check for kubernetes setup
check_k8s_setup
echo ""
echo "Deploying SigNoz on your cluster"
echo ""
deploy_app
wait_for_application_start 60
if [[ $status_code -ne 200 ]]; then
echo "+++++++++++ ERROR ++++++++++++++++++++++"
echo "The containers didn't seem to start correctly. Please run the following command to check containers that may have errored out:"
echo ""
echo -e "kubectl get pods"
echo "Please read our troubleshooting guide https://signoz.io/docs/deployment/docker#troubleshooting"
echo "or reach us on SigNoz for support https://join.slack.com/t/signoz-community/shared_invite/zt-lrjknbbp-J_mI13rlw8pGF4EWBnorJA"
echo "++++++++++++++++++++++++++++++++++++++++"
SUPERVISORS="$(curl -so - http://localhost:8888/druid/indexer/v1/supervisor)"
DATASOURCES="$(curl -so - http://localhost:8888/druid/coordinator/v1/datasources)"
DATA='{ "api_key": "H-htDCae7CR3RV57gUzmol6IAKtm5IMCvbcm_fwnL-w", "type": "capture", "event": "Installation Error - Checks", "distinct_id": "'"$SIGNOZ_INSTALLATION_ID"'", "properties": { "os": "'"$os"'", "platform": "k8s", "error": "Containers not started", "SUPERVISORS": '"$SUPERVISORS"', "DATASOURCES": '"$DATASOURCES"' } }'
URL="https://app.posthog.com/capture"
HEADER="Content-Type: application/json"
if has_curl; then
curl -sfL -d "$DATA" --header "$HEADER" "$URL" > /dev/null 2>&1
elif has_wget; then
wget -q --post-data="$DATA" --header="$HEADER" "$URL" > /dev/null 2>&1
fi
exit 1
else
DATA='{ "api_key": "H-htDCae7CR3RV57gUzmol6IAKtm5IMCvbcm_fwnL-w", "type": "capture", "event": "Installation Success", "distinct_id": "'"$SIGNOZ_INSTALLATION_ID"'", "properties": { "os": "'"$os"'"} }'
URL="https://app.posthog.com/capture"
HEADER="Content-Type: application/json"
if has_curl; then
curl -sfL -d "$DATA" --header "$HEADER" "$URL" > /dev/null 2>&1
elif has_wget; then
wget -q --post-data="$DATA" --header="$HEADER" "$URL" > /dev/null 2>&1
fi
echo "++++++++++++++++++ SUCCESS ++++++++++++++++++++++"
echo "Your installation is complete!"
echo ""
echo "Your frontend is running on 'http://localhost:3000'."
echo ""
echo "+++++++++++++++++++++++++++++++++++++++++++++++++"
echo ""
echo "Need help Getting Started?"
echo "Join us on Slack https://join.slack.com/t/signoz-community/shared_invite/zt-lrjknbbp-J_mI13rlw8pGF4EWBnorJA"
echo ""
echo "Please share your email to receive support & updates about SigNoz!"
read -rp 'Email: ' email
while [[ $email == "" ]]
do
read -rp 'Email: ' email
done
DATA='{ "api_key": "H-htDCae7CR3RV57gUzmol6IAKtm5IMCvbcm_fwnL-w", "type": "capture", "event": "Identify Successful Installation", "distinct_id": "'"$SIGNOZ_INSTALLATION_ID"'", "properties": { "os": "'"$os"'", "email": "'"$email"'", "platform": "k8s" } }'
URL="https://app.posthog.com/capture"
HEADER="Content-Type: application/json"
if has_curl; then
curl -sfL -d "$DATA" --header "$HEADER" "$URL" > /dev/null 2>&1
elif has_wget; then
wget -q --post-data="$DATA" --header="$HEADER" "$URL" > /dev/null 2>&1
fi
fi
echo -e "\nThank you!\n"
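The commented-out version gate in `check_k8s_setup` strips non-digits from `serverVersion.minor` with `sed 's/[^0-9]*//g'` (GKE-style minors like `"21+"` are why) before comparing against 18. The same check sketched in Python:

```python
import re

def k8s_minor_ok(minor: str, required: int = 18) -> bool:
    """Strip non-digits from kubectl's serverVersion.minor (e.g. "21+" -> 21)
    and compare numerically against the required minimum, as the script's
    disabled check intends."""
    digits = re.sub(r"[^0-9]", "", minor)
    return bool(digits) and int(digits) >= required
```

Note that the bash snippet uses `<` inside `[[ ]]`, which compares lexically; `-lt` is the numeric comparison it actually wants.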

deploy/install.sh Executable file

@@ -0,0 +1,475 @@
#!/bin/bash
set -o errexit
is_command_present() {
type "$1" >/dev/null 2>&1
}
# Check whether 'wget' command exists.
has_wget() {
has_cmd wget
}
# Check whether 'curl' command exists.
has_curl() {
has_cmd curl
}
# Check whether the given command exists.
has_cmd() {
command -v "$1" > /dev/null 2>&1
}
is_mac() {
[[ $OSTYPE == darwin* ]]
}
check_os() {
if is_mac; then
package_manager="brew"
desired_os=1
os="Mac"
return
fi
os_name="$(cat /etc/*-release | awk -F= '$1 == "NAME" { gsub(/"/, ""); print $2; exit }')"
case "$os_name" in
Ubuntu*)
desired_os=1
os="ubuntu"
package_manager="apt-get"
;;
Amazon\ Linux*)
desired_os=1
os="amazon linux"
package_manager="yum"
;;
Debian*)
desired_os=1
os="debian"
package_manager="apt-get"
;;
Linux\ Mint*)
desired_os=1
os="linux mint"
package_manager="apt-get"
;;
Red\ Hat*)
desired_os=1
os="red hat"
package_manager="yum"
;;
CentOS*)
desired_os=1
os="centos"
package_manager="yum"
;;
SLES*)
desired_os=1
os="sles"
package_manager="zypper"
;;
openSUSE*)
desired_os=1
os="opensuse"
package_manager="zypper"
;;
*)
desired_os=0
os="Not Found: $os_name"
esac
}
# This function checks if the relevant ports required by SigNoz are available or not
# The script should error out in case they aren't available
check_ports_occupied() {
local port_check_output
local ports_pattern="80|443"
if is_mac; then
port_check_output="$(netstat -anp tcp | awk '$6 == "LISTEN" && $4 ~ /^.*\.('"$ports_pattern"')$/')"
elif is_command_present ss; then
# The `ss` command seems to be a better/faster version of `netstat`, but is not available on all Linux
# distributions by default. Other distributions have `ss` but no `netstat`. So, we try for `ss` first, then
# fallback to `netstat`.
port_check_output="$(ss --all --numeric --tcp | awk '$1 == "LISTEN" && $4 ~ /^.*:('"$ports_pattern"')$/')"
elif is_command_present netstat; then
port_check_output="$(netstat --all --numeric --tcp | awk '$6 == "LISTEN" && $4 ~ /^.*:('"$ports_pattern"')$/')"
fi
if [[ -n $port_check_output ]]; then
DATA='{ "api_key": "H-htDCae7CR3RV57gUzmol6IAKtm5IMCvbcm_fwnL-w", "type": "capture", "event": "Installation Error", "distinct_id": "'"$SIGNOZ_INSTALLATION_ID"'", "properties": { "os": "'"$os"'", "error": "port not available" } }'
URL="https://app.posthog.com/capture"
HEADER="Content-Type: application/json"
if has_curl; then
curl -sfL -d "$DATA" --header "$HEADER" "$URL" > /dev/null 2>&1
elif has_wget; then
wget -q --post-data="$DATA" --header="$HEADER" "$URL" > /dev/null 2>&1
fi
echo "+++++++++++ ERROR ++++++++++++++++++++++"
echo "SigNoz requires ports 80 & 443 to be open. Please shut down any other service(s) that may be running on these ports."
echo "You can run SigNoz on another port following this guide https://signoz.io/docs/deployment/docker#troubleshooting"
echo "++++++++++++++++++++++++++++++++++++++++"
echo ""
exit 1
fi
}
install_docker() {
echo "++++++++++++++++++++++++"
echo "Setting up docker repos"
if [[ $package_manager == apt-get ]]; then
apt_cmd="sudo apt-get --yes --quiet"
$apt_cmd update
$apt_cmd install software-properties-common gnupg-agent
curl -fsSL "https://download.docker.com/linux/$os/gpg" | sudo apt-key add -
sudo add-apt-repository \
"deb [arch=amd64] https://download.docker.com/linux/$os $(lsb_release -cs) stable"
$apt_cmd update
echo "Installing docker"
$apt_cmd install docker-ce docker-ce-cli containerd.io
elif [[ $package_manager == zypper ]]; then
zypper_cmd="sudo zypper --quiet --no-gpg-checks --non-interactive"
echo "Installing docker"
if [[ $os == sles ]]; then
os_sp="$(cat /etc/*-release | awk -F= '$1 == "VERSION_ID" { gsub(/"/, ""); print $2; exit }')"
os_arch="$(uname -i)"
sudo SUSEConnect -p sle-module-containers/$os_sp/$os_arch -r ''
fi
$zypper_cmd install docker docker-runc containerd
sudo systemctl enable docker.service
elif [[ $package_manager == yum && $os == 'amazon linux' ]]; then
echo
echo "Amazon Linux detected ... "
echo
sudo yum install docker
sudo service docker start
else
yum_cmd="sudo yum --assumeyes --quiet"
$yum_cmd install yum-utils
sudo yum-config-manager --add-repo https://download.docker.com/linux/$os/docker-ce.repo
echo "Installing docker"
$yum_cmd install docker-ce docker-ce-cli containerd.io
fi
}
install_docker_machine() {
echo -e "\nInstalling docker-machine ..."
if [[ $os == "Mac" ]];then
curl -sL https://github.com/docker/machine/releases/download/v0.16.2/docker-machine-`uname -s`-`uname -m` >/usr/local/bin/docker-machine
chmod +x /usr/local/bin/docker-machine
else
curl -sL https://github.com/docker/machine/releases/download/v0.16.2/docker-machine-`uname -s`-`uname -m` >/tmp/docker-machine
chmod +x /tmp/docker-machine
sudo cp /tmp/docker-machine /usr/local/bin/docker-machine
fi
}
install_docker_compose() {
if [[ $package_manager == "apt-get" || $package_manager == "zypper" || $package_manager == "yum" ]]; then
if [[ ! -f /usr/bin/docker-compose ]];then
echo "++++++++++++++++++++++++"
echo "Installing docker-compose"
sudo curl -L "https://github.com/docker/compose/releases/download/1.26.0/docker-compose-$(uname -s)-$(uname -m)" -o /usr/local/bin/docker-compose
sudo chmod +x /usr/local/bin/docker-compose
sudo ln -s /usr/local/bin/docker-compose /usr/bin/docker-compose
echo "docker-compose installed!"
echo ""
fi
else
DATA='{ "api_key": "H-htDCae7CR3RV57gUzmol6IAKtm5IMCvbcm_fwnL-w", "type": "capture", "event": "Installation Error", "distinct_id": "'"$SIGNOZ_INSTALLATION_ID"'", "properties": { "os": "'"$os"'", "error": "Docker Compose not found" } }'
URL="https://app.posthog.com/capture"
HEADER="Content-Type: application/json"
if has_curl; then
curl -sfL -d "$DATA" --header "$HEADER" "$URL" > /dev/null 2>&1
elif has_wget; then
wget -q --post-data="$DATA" --header="$HEADER" "$URL" > /dev/null 2>&1
fi
echo "+++++++++++ IMPORTANT READ ++++++++++++++++++++++"
echo "docker-compose not found! Please install docker-compose first and then continue with this installation."
echo "Refer https://docs.docker.com/compose/install/ for installing docker-compose."
echo "+++++++++++++++++++++++++++++++++++++++++++++++++"
exit 1
fi
}
start_docker() {
echo "Starting Docker ..."
if [ $os == "Mac" ]
then
open --background -a Docker && while ! docker system info > /dev/null 2>&1; do sleep 1; done
else
if ! sudo systemctl is-active docker.service > /dev/null; then
echo "Starting docker service"
sudo systemctl start docker.service
fi
fi
}
wait_for_containers_start() {
local timeout=$1
# The while loop is important because for-loops don't work for dynamic values
while [[ $timeout -gt 0 ]]; do
status_code="$(curl -s -o /dev/null -w "%{http_code}" http://localhost:3000/api/v1/services/list || true)"
if [[ $status_code -eq 200 ]]; then
break
else
SUPERVISORS="$(curl -so - http://localhost:8888/druid/indexer/v1/supervisor)"
LEN_SUPERVISORS="${#SUPERVISORS}"
if [[ LEN_SUPERVISORS -ne 19 && $timeout -eq 50 ]];then
echo -e "No Supervisors found... Re-applying docker compose\n"
sudo docker-compose -f ./docker/docker-compose-tiny.yaml up -d
fi
echo -ne "Waiting for all containers to start. This check will timeout in $timeout seconds...\r\c"
fi
((timeout--))
sleep 1
done
echo ""
}
bye() { # Prints a friendly good bye message and exits the script.
if [ "$?" -ne 0 ]; then
set +o errexit
echo "The containers didn't seem to start correctly. Please run the following command to check containers that may have errored out:"
echo ""
echo -e "sudo docker-compose -f docker/docker-compose-tiny.yaml ps -a"
# echo "Please read our troubleshooting guide https://signoz.io/docs/deployment/docker#troubleshooting"
echo "or reach us on SigNoz for support https://join.slack.com/t/signoz-community/shared_invite/zt-lrjknbbp-J_mI13rlw8pGF4EWBnorJA"
echo "++++++++++++++++++++++++++++++++++++++++"
echo "Please share your email to receive support with the installation"
read -rp 'Email: ' email
while [[ $email == "" ]]
do
read -rp 'Email: ' email
done
DATA='{ "api_key": "H-htDCae7CR3RV57gUzmol6IAKtm5IMCvbcm_fwnL-w", "type": "capture", "event": "Installation Support", "distinct_id": "'"$SIGNOZ_INSTALLATION_ID"'", "properties": { "os": "'"$os"'", "email": "'"$email"'" } }'
URL="https://app.posthog.com/capture"
HEADER="Content-Type: application/json"
if has_curl; then
curl -sfL -d "$DATA" --header "$HEADER" "$URL" > /dev/null 2>&1
elif has_wget; then
wget -q --post-data="$DATA" --header="$HEADER" "$URL" > /dev/null 2>&1
fi
echo ""
echo -e "\nWe will reach out to you at the provided email shortly. Exiting for now. Bye! 👋 \n"
exit 0
fi
}
echo -e "👋 Thank you for trying out SigNoz! "
echo ""
# Checking OS and assigning package manager
desired_os=0
os=""
echo -e "🕵️ Detecting your OS"
check_os
SIGNOZ_INSTALLATION_ID=$(curl -s 'https://api64.ipify.org')
# Run bye if failure happens
trap bye EXIT
DATA='{ "api_key": "H-htDCae7CR3RV57gUzmol6IAKtm5IMCvbcm_fwnL-w", "type": "capture", "event": "Installation Started", "distinct_id": "'"$SIGNOZ_INSTALLATION_ID"'", "properties": { "os": "'"$os"'" } }'
URL="https://app.posthog.com/capture"
HEADER="Content-Type: application/json"
if has_curl; then
curl -sfL -d "$DATA" --header "$HEADER" "$URL" > /dev/null 2>&1
elif has_wget; then
wget -q --post-data="$DATA" --header="$HEADER" "$URL" > /dev/null 2>&1
fi
if [[ $desired_os -eq 0 ]];then
DATA='{ "api_key": "H-htDCae7CR3RV57gUzmol6IAKtm5IMCvbcm_fwnL-w", "type": "capture", "event": "Installation Error", "distinct_id": "'"$SIGNOZ_INSTALLATION_ID"'", "properties": { "os": "'"$os"'", "error": "OS Not Supported" } }'
URL="https://app.posthog.com/capture"
HEADER="Content-Type: application/json"
if has_curl; then
curl -sfL -d "$DATA" --header "$HEADER" "$URL" > /dev/null 2>&1
elif has_wget; then
wget -q --post-data="$DATA" --header="$HEADER" "$URL" > /dev/null 2>&1
fi
fi
# check_ports_occupied
# Check if the Docker daemon is installed and available. If not, install & start Docker on Linux machines. We cannot automatically install Docker Desktop on macOS
if ! is_command_present docker; then
if [[ $package_manager == "apt-get" || $package_manager == "zypper" || $package_manager == "yum" ]]; then
install_docker
else
echo ""
echo "+++++++++++ IMPORTANT READ ++++++++++++++++++++++"
echo "Docker Desktop must be installed manually on macOS to proceed. Docker can only be installed automatically on Ubuntu / openSUSE / SLES / Red Hat / CentOS"
echo "https://docs.docker.com/docker-for-mac/install/"
echo "++++++++++++++++++++++++++++++++++++++++++++++++"
DATA='{ "api_key": "H-htDCae7CR3RV57gUzmol6IAKtm5IMCvbcm_fwnL-w", "type": "capture", "event": "Installation Error", "distinct_id": "'"$SIGNOZ_INSTALLATION_ID"'", "properties": { "os": "'"$os"'", "error": "Docker not installed" } }'
URL="https://app.posthog.com/capture"
HEADER="Content-Type: application/json"
if has_curl; then
curl -sfL -d "$DATA" --header "$HEADER" "$URL" > /dev/null 2>&1
elif has_wget; then
wget -q --post-data="$DATA" --header="$HEADER" "$URL" > /dev/null 2>&1
fi
exit 1
fi
fi
# Install docker-compose
if ! is_command_present docker-compose; then
install_docker_compose
fi
# if ! is_command_present docker-compose; then
# install_docker_machine
# docker-machine create -d virtualbox --virtualbox-memory 3584 signoz
# fi
start_docker
echo ""
echo "Pulling the latest container images for SigNoz. This runs with sudo, so you may be asked for your system password."
sudo docker-compose -f ./docker/docker-compose-tiny.yaml pull
echo ""
echo "Starting the SigNoz containers. This may take a few minutes ..."
echo
# The docker-compose command does some nasty stuff for the `--detach` functionality. So we add a `|| true` so that the
# script doesn't exit because this command looks like it failed to do its thing.
sudo docker-compose -f ./docker/docker-compose-tiny.yaml up --detach --remove-orphans || true
wait_for_containers_start 60
echo ""
if [[ $status_code -ne 200 ]]; then
echo "+++++++++++ ERROR ++++++++++++++++++++++"
echo "The containers didn't seem to start correctly. Please run the following command to check containers that may have errored out:"
echo ""
echo -e "sudo docker-compose -f docker/docker-compose-tiny.yaml ps -a"
echo "Please read our troubleshooting guide https://signoz.io/docs/deployment/docker#troubleshooting"
echo "or reach us on SigNoz for support https://join.slack.com/t/signoz-community/shared_invite/zt-lrjknbbp-J_mI13rlw8pGF4EWBnorJA"
echo "++++++++++++++++++++++++++++++++++++++++"
SUPERVISORS="$(curl -so - http://localhost:8888/druid/indexer/v1/supervisor)"
DATASOURCES="$(curl -so - http://localhost:8888/druid/coordinator/v1/datasources)"
DATA='{ "api_key": "H-htDCae7CR3RV57gUzmol6IAKtm5IMCvbcm_fwnL-w", "type": "capture", "event": "Installation Error - Checks", "distinct_id": "'"$SIGNOZ_INSTALLATION_ID"'", "properties": { "os": "'"$os"'", "error": "Containers not started", "SUPERVISORS": '"$SUPERVISORS"', "DATASOURCES": '"$DATASOURCES"' } }'
URL="https://app.posthog.com/capture"
HEADER="Content-Type: application/json"
if has_curl; then
curl -sfL -d "$DATA" --header "$HEADER" "$URL" > /dev/null 2>&1
elif has_wget; then
wget -q --post-data="$DATA" --header="$HEADER" "$URL" > /dev/null 2>&1
fi
exit 1
else
DATA='{ "api_key": "H-htDCae7CR3RV57gUzmol6IAKtm5IMCvbcm_fwnL-w", "type": "capture", "event": "Installation Success", "distinct_id": "'"$SIGNOZ_INSTALLATION_ID"'", "properties": { "os": "'"$os"'"} }'
URL="https://app.posthog.com/capture"
HEADER="Content-Type: application/json"
if has_curl; then
curl -sfL -d "$DATA" --header "$HEADER" "$URL" > /dev/null 2>&1
elif has_wget; then
wget -q --post-data="$DATA" --header="$HEADER" "$URL" > /dev/null 2>&1
fi
echo "++++++++++++++++++ SUCCESS ++++++++++++++++++++++"
echo "Your installation is complete!"
echo ""
echo "Your frontend is running on 'http://localhost:3000'."
echo ""
echo "+++++++++++++++++++++++++++++++++++++++++++++++++"
echo ""
echo "Need help Getting Started?"
echo "Join us on Slack https://join.slack.com/t/signoz-community/shared_invite/zt-lrjknbbp-J_mI13rlw8pGF4EWBnorJA"
echo ""
echo "Please share your email to receive support & updates about SigNoz!"
read -rp 'Email: ' email
while [[ $email == "" ]]
do
read -rp 'Email: ' email
done
DATA='{ "api_key": "H-htDCae7CR3RV57gUzmol6IAKtm5IMCvbcm_fwnL-w", "type": "capture", "event": "Identify Successful Installation", "distinct_id": "'"$SIGNOZ_INSTALLATION_ID"'", "properties": { "os": "'"$os"'", "email": "'"$email"'" } }'
URL="https://app.posthog.com/capture"
HEADER="Content-Type: application/json"
if has_curl; then
curl -sfL -d "$DATA" --header "$HEADER" "$URL" > /dev/null 2>&1
elif has_wget; then
wget -q --post-data="$DATA" --header="$HEADER" "$URL" > /dev/null 2>&1
fi
fi
echo -e "\nThank you!\n"
##### Changing default memory limit of docker ############
# # Check if memory is less and Confirm to increase size of docker machine
# # https://github.com/docker/machine/releases
# # On OS X
# $ curl -L https://github.com/docker/machine/releases/download/v0.16.2/docker-machine-`uname -s`-`uname -m` >/usr/local/bin/docker-machine && \
# chmod +x /usr/local/bin/docker-machine
# # On Linux
# $ curl -L https://github.com/docker/machine/releases/download/v0.16.2/docker-machine-`uname -s`-`uname -m` >/tmp/docker-machine &&
# chmod +x /tmp/docker-machine &&
# sudo cp /tmp/docker-machine /usr/local/bin/docker-machine
# VBoxManage list vms
# docker-machine stop
# VBoxManage modifyvm default --cpus 2
# VBoxManage modifyvm default --memory 4096
# docker-machine start
# VBoxManage showvminfo default | grep Memory
# VBoxManage showvminfo default | grep CPU
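Both install scripts share the same poll-until-ready pattern: probe an HTTP endpoint once per second until it returns the expected status or a timeout lapses. The core loop sketched in Python, with an injectable probe and sleep so the logic is testable (names are illustrative):

```python
import time

def wait_until_ready(probe, timeout: int, expect: int = 200,
                     sleep=time.sleep) -> bool:
    """Call probe() once per second until it returns the expected HTTP
    status code or the timeout (in seconds) is exhausted, mirroring
    wait_for_containers_start in the script above."""
    while timeout > 0:
        if probe() == expect:
            return True
        timeout -= 1
        sleep(1)
    return False
```

In the script, `probe` corresponds to the `curl -s -o /dev/null -w "%{http_code}"` call against `http://localhost:3000/api/v1/services/list`.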


@@ -26,4 +26,4 @@ spec:
name: retention-config
restartPolicy: Never
backoffLimit: 4
backoffLimit: 8


@@ -25,10 +25,19 @@ data:
"ServiceName",
"References",
"Tags",
"TagsKeys",
"TagsValues",
{
"type": "string",
"name": "TagsKeys",
"multiValueHandling": "ARRAY"
},
{
"type": "string",
"name": "TagsValues",
"multiValueHandling": "ARRAY"
},
{ "name": "DurationNano", "type": "Long" },
{ "name": "Kind", "type": "int" }
{ "name": "Kind", "type": "int" },
{ "name": "StatusCode", "type": "int" }
]
}
}


@@ -24,4 +24,4 @@ spec:
configMap:
name: supervisor-config
restartPolicy: Never
backoffLimit: 4
backoffLimit: 8


@@ -12,8 +12,14 @@ data:
protocols:
grpc:
thrift_http:
otlp:
protocols:
grpc:
http:
processors:
batch:
send_batch_size: 1000
timeout: 10s
memory_limiter:
# Same as --mem-ballast-size-mib CLI argument
ballast_size_mib: 683
@@ -38,6 +44,10 @@ data:
extensions: [health_check, zpages]
pipelines:
traces:
receivers: [jaeger]
receivers: [jaeger, otlp]
processors: [memory_limiter, batch, queued_retry]
exporters: [kafka]
metrics:
receivers: [otlp]
processors: [batch]
exporters: [kafka]


@@ -25,7 +25,7 @@ spec:
- "--config=/conf/otel-collector-config.yaml"
# Memory Ballast size should be max 1/3 to 1/2 of memory.
- "--mem-ballast-size-mib=683"
image: otel/opentelemetry-collector-dev:latest
image: otel/opentelemetry-collector:0.18.0
name: otel-collector
resources:
limits:
@@ -37,7 +37,9 @@ spec:
ports:
- containerPort: 55679 # Default endpoint for ZPages.
- containerPort: 55680 # Default endpoint for OpenTelemetry receiver.
- containerPort: 14250 # Default endpoint for Jaeger HTTP receiver.
- containerPort: 55681 # Default endpoint for OpenTelemetry HTTP/1.0 receiver.
- containerPort: 4317 # Default endpoint for OpenTelemetry GRPC receiver.
- containerPort: 14250 # Default endpoint for Jaeger GRPC receiver.
- containerPort: 14268 # Default endpoint for Jaeger HTTP receiver.
- containerPort: 9411 # Default endpoint for Zipkin receiver.
- containerPort: 8888 # Default endpoint for querying metrics.


@@ -11,6 +11,14 @@ spec:
port: 55680
protocol: TCP
targetPort: 55680
- name: otlp-http-legacy # Default endpoint for OpenTelemetry receiver.
port: 55681
protocol: TCP
targetPort: 55681
- name: otlp-grpc # Default endpoint for OpenTelemetry receiver.
port: 4317
protocol: TCP
targetPort: 4317
- name: jaeger-grpc # Default endpoint for Jaeger gRPC receiver
port: 14250
- name: jaeger-thrift-http # Default endpoint for Jaeger HTTP receiver.


@@ -10,12 +10,12 @@ dependencies:
version: 0.2.18
- name: flattener-processor
repository: file://./signoz-charts/flattener-processor
version: 0.1.1
version: 0.2.0
- name: query-service
repository: file://./signoz-charts/query-service
version: 0.1.1
version: 0.2.0
- name: frontend
repository: file://./signoz-charts/frontend
version: 0.1.6
digest: sha256:1acfbd4b86e7ca6b70101f7dc3b2ab26aa5e72df2f454800f6dda7e645580978
generated: "2021-01-07T17:50:04.227534+05:30"
version: 0.2.1
digest: sha256:7ea89a82fabae53ff97cbdaddab0c9edf952a3d212237efc5897b32937d940fd
generated: "2021-05-02T23:16:58.998702+05:30"


@@ -15,12 +15,12 @@ type: application
# This is the chart version. This version number should be incremented each time you make changes
# to the chart and its templates, including the app version.
# Versions are expected to follow Semantic Versioning (https://semver.org/)
version: 0.1.6
version: 0.2.0
# This is the version number of the application being deployed. This version number should be
# incremented each time you make changes to the application. Versions are not expected to
# follow Semantic Versioning. They should reflect the version the application is using.
appVersion: 0.1.1
appVersion: 0.2.0
dependencies:
- name: zookeeper
@@ -34,10 +34,10 @@ dependencies:
version: 0.2.18
- name: flattener-processor
repository: "file://./signoz-charts/flattener-processor"
version: 0.1.1
version: 0.2.0
- name: query-service
repository: "file://./signoz-charts/query-service"
version: 0.1.1
version: 0.2.0
- name: frontend
repository: "file://./signoz-charts/frontend"
version: 0.1.6
version: 0.2.1


@@ -14,8 +14,8 @@ type: application
# This is the chart version. This version number should be incremented each time you make changes
# to the chart and its templates, including the app version.
version: 0.1.1
version: 0.2.0
# This is the version number of the application being deployed. This version number should be
# incremented each time you make changes to the application.
appVersion: 0.1.1
appVersion: 0.2.0


@@ -14,8 +14,8 @@ type: application
# This is the chart version. This version number should be incremented each time you make changes
# to the chart and its templates, including the app version.
version: 0.1.6
version: 0.2.1
# This is the version number of the application being deployed. This version number should be
# incremented each time you make changes to the application.
appVersion: 0.1.7
appVersion: 0.2.1


@@ -14,8 +14,8 @@ type: application
# This is the chart version. This version number should be incremented each time you make changes
# to the chart and its templates, including the app version.
version: 0.1.1
version: 0.2.0
# This is the version number of the application being deployed. This version number should be
# incremented each time you make changes to the application.
appVersion: 0.1.3
appVersion: 0.2.0


@@ -1,3 +1,7 @@
zookeeper:
autopurge:
purgeInterval: 1
kafka:
zookeeper:
enabled: false
@@ -7,8 +11,14 @@ kafka:
druid:
configVars:
# To store data on local disks attached
druid_extensions_loadList: '["druid-histogram", "druid-datasketches", "druid-lookups-cached-global", "postgresql-metadata-storage", "druid-kafka-indexing-service"]'
druid_storage_type: local
# # To store data in S3
# druid_extensions_loadList: '["druid-histogram", "druid-datasketches", "druid-lookups-cached-global", "postgresql-metadata-storage", "druid-kafka-indexing-service", "druid-s3-extensions"]'
# druid_storage_type: s3
# druid_storage_bucket: signoz-druid
# druid_storage_baseKey: baseKey
@@ -18,7 +28,7 @@ druid:
historical:
persistence:
size: "4Gi"
size: "20Gi"
zkHosts: "signoz-zookeeper:2181"

frontend/.babelrc Normal file (16 lines)

@@ -0,0 +1,16 @@
{
"presets": [
"@babel/preset-env",
"@babel/preset-react",
"@babel/preset-typescript"
],
"plugins": [
"react-hot-loader/babel",
"@babel/plugin-proposal-class-properties"
],
"env": {
"production": {
"presets": ["minify"]
}
}
}


@@ -0,0 +1,7 @@
{
"trailingComma": "all",
"useTabs": true,
"tabWidth": 1,
"singleQuote": false,
"jsxSingleQuote": false
}


@@ -1,21 +1,20 @@
# stage1 as builder
FROM node:14-alpine as builder
FROM node:12-alpine as builder
WORKDIR /frontend
# copy the package.json to install dependencies
COPY package.json ./
# Install the dependencies and make the folder
RUN npm install && mkdir /react-ui && mv ./node_modules ./react-ui
WORKDIR /react-ui
RUN yarn install
COPY . .
# Build the project and copy the files
RUN npm run build
RUN yarn build
FROM nginx:1.15-alpine
FROM nginx:1.12-alpine
#!/bin/sh
@@ -25,7 +24,7 @@ COPY conf/default.conf /etc/nginx/conf.d/default.conf
RUN rm -rf /usr/share/nginx/html/*
# Copy from stage 1
COPY --from=builder /react-ui/build /usr/share/nginx/html
COPY --from=builder /frontend/build /usr/share/nginx/html
EXPOSE 3000


@@ -1,3 +1,25 @@
# Docker
**Building image**
`docker-compose up` (this will also build the image and run the container)
or
`docker build . -t tagname`
**Tag to remote URL - introduce versioning later on**
```
docker tag signoz/frontend:latest 7296823551/signoz:latest
```
**Running locally**
```
docker-compose up
```
# Getting Started with Create React App
This project was bootstrapped with [Create React App](https://github.com/facebook/create-react-app).


@@ -0,0 +1,7 @@
version: "3.9"
services:
web:
build: .
image: signoz/frontend:latest
ports:
- "3000:3000"


@@ -1,28 +1,28 @@
const gulp = require('gulp')
const gulpless = require('gulp-less')
const postcss = require('gulp-postcss')
const debug = require('gulp-debug')
var csso = require('gulp-csso')
const autoprefixer = require('autoprefixer')
const NpmImportPlugin = require('less-plugin-npm-import')
const gulp = require("gulp");
const gulpless = require("gulp-less");
const postcss = require("gulp-postcss");
const debug = require("gulp-debug");
var csso = require("gulp-csso");
const autoprefixer = require("autoprefixer");
const NpmImportPlugin = require("less-plugin-npm-import");
gulp.task('less', function () {
const plugins = [autoprefixer()]
gulp.task("less", function () {
const plugins = [autoprefixer()];
return gulp
.src('src/themes/*-theme.less')
.pipe(debug({title: 'Less files:'}))
.pipe(
gulpless({
javascriptEnabled: true,
plugins: [new NpmImportPlugin({prefix: '~'})],
}),
)
.pipe(postcss(plugins))
.pipe(
csso({
debug: true,
}),
)
.pipe(gulp.dest('./public'))
})
return gulp
.src("src/themes/*-theme.less")
.pipe(debug({ title: "Less files:" }))
.pipe(
gulpless({
javascriptEnabled: true,
plugins: [new NpmImportPlugin({ prefix: "~" })],
}),
)
.pipe(postcss(plugins))
.pipe(
csso({
debug: true,
}),
)
.pipe(gulp.dest("./public"));
});


@@ -1,87 +1,157 @@
{
"name": "client",
"version": "0.1.0",
"private": true,
"dependencies": {
"@material-ui/core": "^4.0.0",
"@testing-library/jest-dom": "^5.11.4",
"@testing-library/react": "^11.1.0",
"@testing-library/user-event": "^12.1.10",
"@types/chart.js": "^2.9.28",
"@types/d3": "^6.2.0",
"@types/jest": "^26.0.15",
"@types/node": "^14.14.7",
"@types/react": "^17.0.0",
"@types/react-dom": "^16.9.9",
"@types/react-redux": "^7.1.11",
"@types/react-router-dom": "^5.1.6",
"@types/redux": "^3.6.0",
"@types/styled-components": "^5.1.4",
"@types/vis": "^4.21.21",
"antd": "^4.8.0",
"axios": "^0.21.0",
"chart.js": "^2.9.4",
"d3": "^6.2.0",
"d3-array": "^2.8.0",
"d3-ease": "^2.0.0",
"d3-flame-graph": "^3.1.1",
"d3-tip": "^0.9.1",
"material-ui-chip-input": "^2.0.0-beta.2",
"prop-types": "^15.6.2",
"react": "17.0.0",
"react-chartjs-2": "^2.11.1",
"react-chips": "^0.8.0",
"react-css-theme-switcher": "^0.1.6",
"react-dom": "17.0.0",
"react-graph-vis": "^1.0.5",
"react-redux": "^7.2.2",
"react-router-dom": "^5.2.0",
"react-scripts": "4.0.0",
"react-vis": "^1.11.7",
"recharts": "^1.8.5",
"redux": "^4.0.5",
"redux-thunk": "^2.3.0",
"styled-components": "^5.2.1",
"typescript": "^4.0.5",
"web-vitals": "^0.2.4"
},
"scripts": {
"start": "react-scripts start",
"build": "react-scripts build",
"test": "react-scripts test",
"eject": "react-scripts eject",
"storybook": "start-storybook -p 6006 -s public --no-dll",
"build-storybook": "build-storybook -s public --no-dll"
},
"eslintConfig": {
"extends": [
"react-app",
"react-app/jest"
]
},
"browserslist": {
"production": [
">0.2%",
"not dead",
"not op_mini all"
],
"development": [
"last 1 chrome version",
"last 1 firefox version",
"last 1 safari version"
]
},
"devDependencies": {
"@babel/core": "^7.12.3",
"@babel/preset-typescript": "^7.12.7",
"autoprefixer": "^9.0.0",
"babel-plugin-styled-components": "^1.12.0",
"gulp": "^4.0.2",
"gulp-csso": "^4.0.1",
"gulp-debug": "^4.0.0",
"gulp-less": "^4.0.1",
"gulp-postcss": "^9.0.0",
"less-plugin-npm-import": "^2.1.0",
"react-is": "^17.0.1"
}
"name": "frontend",
"version": "1.0.0",
"description": "",
"main": "webpack.config.js",
"scripts": {
"dev": "NODE_ENV=development webpack serve",
"start": "node scripts/start.js",
"build": "webpack --config=webpack.config.prod.js",
"prettify": "prettier --write ."
},
"author": "",
"license": "ISC",
"dependencies": {
"@auth0/auth0-react": "^1.2.0",
"@babel/core": "7.12.3",
"@material-ui/core": "^4.0.0",
"@pmmmwh/react-refresh-webpack-plugin": "0.4.2",
"@svgr/webpack": "5.4.0",
"@testing-library/jest-dom": "^5.11.4",
"@testing-library/react": "^11.1.0",
"@testing-library/user-event": "^12.1.10",
"@types/chart.js": "^2.9.28",
"@types/d3": "^6.2.0",
"@types/jest": "^26.0.15",
"@types/node": "^14.14.7",
"@types/react": "^17.0.0",
"@types/react-dom": "^16.9.9",
"@types/react-redux": "^7.1.11",
"@types/react-router-dom": "^5.1.6",
"@types/redux": "^3.6.0",
"@types/styled-components": "^5.1.4",
"@types/vis": "^4.21.21",
"@typescript-eslint/eslint-plugin": "^4.5.0",
"@typescript-eslint/parser": "^4.5.0",
"antd": "^4.8.0",
"axios": "^0.21.0",
"babel-eslint": "^10.1.0",
"babel-jest": "^26.6.0",
"babel-loader": "8.1.0",
"babel-plugin-named-asset-import": "^0.3.7",
"babel-preset-minify": "^0.5.1",
"babel-preset-react-app": "^10.0.0",
"bfj": "^7.0.2",
"camelcase": "^6.1.0",
"case-sensitive-paths-webpack-plugin": "2.3.0",
"chart.js": "^2.9.4",
"css-loader": "4.3.0",
"d3": "^6.2.0",
"d3-array": "^2.8.0",
"d3-ease": "^2.0.0",
"d3-flame-graph": "^3.1.1",
"d3-tip": "^0.9.1",
"dotenv": "8.2.0",
"dotenv-expand": "5.1.0",
"eslint": "^7.11.0",
"eslint-config-react-app": "^6.0.0",
"eslint-plugin-flowtype": "^5.2.0",
"eslint-plugin-import": "^2.22.1",
"eslint-plugin-jest": "^24.1.0",
"eslint-plugin-jsx-a11y": "^6.3.1",
"eslint-plugin-react": "^7.21.5",
"eslint-plugin-react-hooks": "^4.2.0",
"eslint-plugin-testing-library": "^3.9.2",
"eslint-webpack-plugin": "^2.1.0",
"file-loader": "6.1.1",
"fs-extra": "^9.0.1",
"html-webpack-plugin": "5.1.0",
"identity-obj-proxy": "3.0.0",
"jest": "26.6.0",
"jest-circus": "26.6.0",
"jest-resolve": "26.6.0",
"jest-watch-typeahead": "0.6.1",
"material-ui-chip-input": "^2.0.0-beta.2",
"mini-css-extract-plugin": "0.11.3",
"optimize-css-assets-webpack-plugin": "5.0.4",
"pnp-webpack-plugin": "1.6.4",
"postcss-flexbugs-fixes": "4.2.1",
"postcss-loader": "3.0.0",
"postcss-normalize": "8.0.1",
"postcss-preset-env": "6.7.0",
"postcss-safe-parser": "5.0.2",
"prop-types": "^15.6.2",
"react": "17.0.0",
"react-app-polyfill": "^2.0.0",
"react-chartjs-2": "^2.11.1",
"react-chips": "^0.8.0",
"react-css-theme-switcher": "^0.1.6",
"react-dev-utils": "^11.0.0",
"react-dom": "17.0.0",
"react-graph-vis": "^1.0.5",
"react-modal": "^3.12.1",
"react-redux": "^7.2.2",
"react-refresh": "^0.8.3",
"react-router-dom": "^5.2.0",
"react-vis": "^1.11.7",
"recharts": "^1.8.5",
"redux": "^4.0.5",
"redux-thunk": "^2.3.0",
"resolve": "1.18.1",
"resolve-url-loader": "^3.1.2",
"sass-loader": "8.0.2",
"semver": "7.3.2",
"style-loader": "1.3.0",
"styled-components": "^5.2.1",
"terser-webpack-plugin": "4.2.3",
"ts-pnp": "1.2.0",
"typescript": "^4.0.5",
"url-loader": "4.1.1",
"web-vitals": "^0.2.4",
"webpack": "^5.23.0",
"webpack-dev-server": "^3.11.2",
"webpack-manifest-plugin": "2.2.0",
"workbox-webpack-plugin": "5.1.4"
},
"eslintConfig": {
"extends": [
"react-app",
"react-app/jest"
]
},
"browserslist": {
"production": [
">0.2%",
"not dead",
"not op_mini all"
],
"development": [
"last 1 chrome version",
"last 1 firefox version",
"last 1 safari version"
]
},
"devDependencies": {
"@babel/core": "^7.12.3",
"@babel/plugin-proposal-class-properties": "^7.12.13",
"@babel/plugin-syntax-jsx": "^7.12.13",
"@babel/preset-env": "^7.12.17",
"@babel/preset-react": "^7.12.13",
"@babel/preset-typescript": "^7.12.17",
"autoprefixer": "^9.0.0",
"babel-plugin-styled-components": "^1.12.0",
"copy-webpack-plugin": "^7.0.0",
"gulp": "^4.0.2",
"gulp-csso": "^4.0.1",
"gulp-debug": "^4.0.0",
"gulp-less": "^4.0.1",
"gulp-postcss": "^9.0.0",
"husky": "4.3.8",
"less-plugin-npm-import": "^2.1.0",
"lint-staged": "10.5.3",
"prettier": "2.2.1",
"react-hot-loader": "^4.13.0",
"react-is": "^17.0.1",
"webpack-cli": "^4.5.0"
}
}

File diff suppressed because one or more lines are too long

Binary file not shown (image; before: 3.8 KiB, after: 2.2 KiB)


@@ -1,44 +0,0 @@
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="utf-8" />
<link rel="icon" href="%PUBLIC_URL%/favicon.ico" />
<meta name="viewport" content="width=device-width, initial-scale=1" />
<meta name="theme-color" content="#000000" />
<meta
name="description"
content="Web site created using create-react-app"
/>
<link rel="apple-touch-icon" href="%PUBLIC_URL%/logo192.png" />
<link rel="stylesheet" href="https://stackpath.bootstrapcdn.com/bootstrap/4.3.1/css/bootstrap.min.css" integrity="sha384-ggOyR0iXCbMQv3Xipma34MD+dH/1fQ784/j6cY/iJTQUOhcWr7x9JvoRxT2MZw1T" crossorigin="anonymous">
<!--
manifest.json provides metadata used when your web app is installed on a
user's mobile device or desktop. See https://developers.google.com/web/fundamentals/web-app-manifest/
-->
<link rel="manifest" href="%PUBLIC_URL%/manifest.json" />
<!--
Notice the use of %PUBLIC_URL% in the tags above.
It will be replaced with the URL of the `public` folder during the build.
Only files inside the `public` folder can be referenced from the HTML.
Unlike "/favicon.ico" or "favicon.ico", "%PUBLIC_URL%/favicon.ico" will
work correctly both with client-side routing and a non-root public URL.
Learn how to configure a non-root public URL by running `npm run build`.
-->
<title>React App</title>
</head>
<body>
<noscript>You need to enable JavaScript to run this app.</noscript>
<div id="root"></div>
<!--
This HTML file is a template.
If you open it directly in the browser, you will see an empty page.
You can add webfonts, meta tags, or analytics to this file.
The build step will place the bundled scripts into the <body> tag.
To begin the development, run `npm start` or `yarn start`.
To create a production bundle, use `npm run build` or `yarn build`.
-->
</body>
</html>

File diff suppressed because one or more lines are too long


@@ -1,25 +1,25 @@
{
"short_name": "React App",
"name": "Create React App Sample",
"icons": [
{
"src": "favicon.ico",
"sizes": "64x64 32x32 24x24 16x16",
"type": "image/x-icon"
},
{
"src": "logo192.png",
"type": "image/png",
"sizes": "192x192"
},
{
"src": "logo512.png",
"type": "image/png",
"sizes": "512x512"
}
],
"start_url": ".",
"display": "standalone",
"theme_color": "#000000",
"background_color": "#ffffff"
"short_name": "React App",
"name": "Create React App Sample",
"icons": [
{
"src": "favicon.ico",
"sizes": "64x64 32x32 24x24 16x16",
"type": "image/x-icon"
},
{
"src": "logo192.png",
"type": "image/png",
"sizes": "192x192"
},
{
"src": "logo512.png",
"type": "image/png",
"sizes": "512x512"
}
],
"start_url": ".",
"display": "standalone",
"theme_color": "#000000",
"background_color": "#ffffff"
}


@@ -1,75 +0,0 @@
import { ActionTypes } from './types';
import { Moment } from 'moment'
export type DateTimeRangeType = [Moment|null,Moment|null]|null;
export interface GlobalTime {
maxTime: number;
minTime: number;
}
export interface updateTimeIntervalAction {
type: ActionTypes.updateTimeInterval;
payload: GlobalTime;
}
export const updateTimeInterval = (interval:string, datetimeRange?:[number,number]) => {
let maxTime: number = 0;
let minTime: number = 0;
// if interval string is custom, then datetimeRange should be present and max & min time should be
// set directly based on that. Assuming datetimeRange values are in ms, and minTime is 0th element
switch (interval) {
case '15min':
maxTime=Date.now()*1000000; // in nano sec
minTime=(Date.now()-15*60*1000)*1000000;
break;
case '30min':
maxTime=Date.now()*1000000; // in nano sec
minTime=(Date.now()-30*60*1000)*1000000;
break;
case '1hr':
maxTime=Date.now()*1000000; // in nano sec
minTime=(Date.now()-1*60*60*1000)*1000000;
break;
case '6hr':
maxTime=Date.now()*1000000; // in nano sec
minTime=(Date.now()-6*60*60*1000)*1000000;
break;
case '1day':
maxTime=Date.now()*1000000; // in nano sec
minTime=(Date.now()-24*60*60*1000)*1000000;
break;
case '1week':
maxTime=Date.now()*1000000; // in nano sec
minTime=(Date.now()-7*24*60*60*1000)*1000000;
break;
case 'custom':
if (datetimeRange !== undefined)
{
maxTime=datetimeRange[1]*1000000;// in nano sec
minTime=datetimeRange[0]*1000000;// in nano sec
}
break;
default:
console.log('no matching interval case found')
}
return {
type: ActionTypes.updateTimeInterval,
payload: {maxTime:maxTime, minTime:minTime},
};
};
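The removed `updateTimeInterval` action creator repeats the same millisecond-to-nanosecond conversion in every `switch` branch. A minimal table-driven sketch of that logic (helper names are hypothetical, and `now` is injected so the function is testable; this is not the shipped implementation):

```typescript
// Lookback windows in milliseconds, keyed by the interval strings the
// removed action creator accepted.
const LOOKBACK_MS: Record<string, number> = {
	"15min": 15 * 60 * 1000,
	"30min": 30 * 60 * 1000,
	"1hr": 60 * 60 * 1000,
	"6hr": 6 * 60 * 60 * 1000,
	"1day": 24 * 60 * 60 * 1000,
	"1week": 7 * 24 * 60 * 60 * 1000,
};

const MS_TO_NS = 1_000_000; // the query API expects nanosecond timestamps

interface GlobalTime {
	maxTime: number;
	minTime: number;
}

// For "custom", datetimeRange is [minMs, maxMs]; otherwise the window
// looks back from `now` (injected instead of calling Date.now() inline).
function getTimeRange(
	interval: string,
	datetimeRange?: [number, number],
	now: number = Date.now(),
): GlobalTime {
	if (interval === "custom" && datetimeRange !== undefined) {
		return {
			minTime: datetimeRange[0] * MS_TO_NS,
			maxTime: datetimeRange[1] * MS_TO_NS,
		};
	}
	const lookback = LOOKBACK_MS[interval];
	if (lookback === undefined) {
		throw new Error(`no matching interval case found: ${interval}`);
	}
	return { maxTime: now * MS_TO_NS, minTime: (now - lookback) * MS_TO_NS };
}
```

The table keeps the conversion in one place, so adding a new interval is a one-line change instead of a new `switch` branch.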


@@ -1,6 +0,0 @@
export * from './types';
export * from './traceFilters';
export * from './traces';
export * from './metrics';
export * from './usage';
export * from './global';


@@ -1,112 +0,0 @@
import { Dispatch } from 'redux';
import metricsAPI from '../api/metricsAPI';
import { GlobalTime } from './global';
import { ActionTypes } from './types';
export interface servicesListItem{
"serviceName": string;
"p99": number;
"avgDuration": number;
"numCalls": number;
"callRate": number;
"numErrors": number;
"errorRate": number;
};
export interface metricItem{
"timestamp":number;
"p50":number;
"p90":number;
"p99":number;
"numCalls":number;
"callRate":number;
"numErrors":number;
"errorRate":number;
}
export interface topEndpointListItem{
"p50": number;
"p90": number;
"p99": number;
"numCalls": number;
"name": string;
};
export interface customMetricsItem{
"timestamp": number;
"value": number;
};
export interface getServicesListAction {
type: ActionTypes.getServicesList;
payload: servicesListItem[];
}
export interface getServiceMetricsAction{
type: ActionTypes.getServiceMetrics;
payload: metricItem[];
}
export interface getTopEndpointsAction {
type: ActionTypes.getTopEndpoints;
payload: topEndpointListItem[];
}
export interface getFilteredTraceMetricsAction{
type: ActionTypes.getFilteredTraceMetrics;
payload: customMetricsItem[];
}
export const getServicesList = (globalTime: GlobalTime) => {
return async (dispatch: Dispatch) => {
let request_string = 'services?start='+globalTime.minTime+'&end='+globalTime.maxTime;
const response = await metricsAPI.get<servicesListItem[]>(request_string);
dispatch<getServicesListAction>({
type: ActionTypes.getServicesList,
payload: response.data
//PNOTE - response.data in the axios response has the actual API response
});
};
};
export const getServicesMetrics = (serviceName:string, globalTime: GlobalTime) => {
return async (dispatch: Dispatch) => {
let request_string = 'service/overview?service='+serviceName+'&start='+globalTime.minTime+'&end='+globalTime.maxTime+'&step=60';
const response = await metricsAPI.get<metricItem[]>(request_string);
dispatch<getServiceMetricsAction>({
type: ActionTypes.getServiceMetrics,
payload: response.data
//PNOTE - response.data in the axios response has the actual API response
});
};
};
export const getTopEndpoints = (serviceName:string, globalTime: GlobalTime) => {
return async (dispatch: Dispatch) => {
let request_string = 'service/top_endpoints?service='+serviceName+'&start='+globalTime.minTime+'&end='+globalTime.maxTime;
const response = await metricsAPI.get<topEndpointListItem[]>(request_string);
dispatch<getTopEndpointsAction>({
type: ActionTypes.getTopEndpoints,
payload: response.data
//PNOTE - response.data in the axios response has the actual API response
});
};
};
export const getFilteredTraceMetrics = (filter_params: string, globalTime: GlobalTime) => {
return async (dispatch: Dispatch) => {
let request_string = 'spans/aggregates?start='+globalTime.minTime+'&end='+globalTime.maxTime+'&'+filter_params;
const response = await metricsAPI.get<customMetricsItem[]>(request_string);
dispatch<getFilteredTraceMetricsAction>({
type: ActionTypes.getFilteredTraceMetrics,
payload: response.data
//PNOTE - response.data in the axios response has the actual API response
});
};
};
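Each action creator above assembles its query string by hand (`'services?start='+globalTime.minTime+...`). A hedged sketch of the same construction using `URLSearchParams` (the helper name is hypothetical; the endpoint paths and parameter names are the ones used above):

```typescript
interface GlobalTime {
	maxTime: number;
	minTime: number;
}

// Builds the request strings the action creators concatenate manually,
// e.g. "service/overview?start=...&end=...&service=driver&step=60".
function buildRequestString(
	path: string,
	globalTime: GlobalTime,
	extra: Record<string, string> = {},
): string {
	const params = new URLSearchParams({
		start: String(globalTime.minTime),
		end: String(globalTime.maxTime),
		...extra,
	});
	return `${path}?${params.toString()}`;
}
```

Besides being shorter, `URLSearchParams` percent-encodes values, which the string concatenation above does not.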


@@ -1,49 +0,0 @@
// Action creator must have a type and optionally a payload
import { ActionTypes } from './types'
export interface TagItem {
key: string;
value: string;
operator: 'equals'|'contains';
}
export interface LatencyValue {
min:string;
max:string;
}
export interface TraceFilters{
tags?: TagItem[];
service?:string;
latency?:LatencyValue;
operation?:string;
}
//define interface for action. Action creator always returns object of this type
export interface updateTraceFiltersAction {
type: ActionTypes.updateTraceFilters,
payload: TraceFilters,
}
export const updateTraceFilters = (traceFilters: TraceFilters) => {
return {
type: ActionTypes.updateTraceFilters,
payload: traceFilters,
};
};
export interface updateInputTagAction {
type: ActionTypes.updateInput,
payload: string,
}
export const updateInputTag = (Input: string) => {
return {
type: ActionTypes.updateInput,
payload: Input,
};
};
//named export when you want to export multiple functions from the same file


@@ -1,146 +0,0 @@
import { ActionTypes } from './types';
import tracesAPI from '../api/tracesAPI';
import { Dispatch } from 'redux';
import { GlobalTime } from './global';
// PNOTE
// define trace interface - what it should return
// define action creator to show list of traces on component mount -- use useEffect ( , []) -> Mounts when loaded first time
// Date() - takes number of milliseconds as input, our API takes in microseconds
// Sample API call for traces - https://api.signoz.io/api/traces?end=1606968273667000&limit=20&lookback=2d&maxDuration=&minDuration=&service=driver&operation=&start=1606968100867000
export interface Tree{
name: string;
value: number;
children?: Tree[];
}
export interface RefItem{
refType: string;
traceID: string;
spanID: string;
}
export interface TraceTagItem{
key: string;
// type: string;
value: string;
}
export interface ProcessItem{
serviceName: string;
tags: TraceTagItem[];
}
// PNOTE - Temp DS for converting span to tree which can be consumed by flamegraph
export interface pushDStree {
id: string;
name: string;
value: number;
time: number;
startTime: number;
tags: TraceTagItem[];
children: pushDStree[];
}
export interface spanItem{
traceID: string; // index 0
spanID: string; // index 1
operationName: string; // index 2
startTime: number; // index 3
duration: number; // index 4
references: RefItem[]; // index 5
tags: []; //index 6
logs: []; // index 7
processID: string; // index 8
warnings: []; // index 9
children: pushDStree[]; // index 10 // PNOTE - added this as we are adding extra field in span item to convert trace to tree.
// Should this field be optional?
}
//let mapped_array :{ [id: string] : spanItem; } = {};
export interface traceItem{
traceID: string;
spans: spanItem[];
processes: { [id: string] : ProcessItem; } ;
warnings: [];
}
export interface traceResponse{
data: traceItem[];
total: number;
limit: number;
offset: number;
error: [];
}
export type span = [number, string, string, string, string, string, string, string|string[], string|string[], string|string[], pushDStree[]];
export interface spanList{
events: span[];
segmentID: string;
columns: string[];
}
// export interface traceResponseNew{
// [id: string] : spanList;
// }
export interface traceResponseNew{
[id: string] : spanList;
}
export interface spansWSameTraceIDResponse{
[id: string] : spanList;
}
export interface FetchTracesAction {
type: ActionTypes.fetchTraces;
payload: traceResponseNew;
}
export interface FetchTraceItemAction {
type: ActionTypes.fetchTraceItem;
payload: spansWSameTraceIDResponse;
}
export const fetchTraces = (globalTime: GlobalTime, filter_params: string ) => {
return async (dispatch: Dispatch) => {
if (globalTime){
let request_string = 'spans?limit=100&lookback=2d&start='+globalTime.minTime+'&end='+globalTime.maxTime+'&'+filter_params;
const response = await tracesAPI.get<traceResponseNew>(request_string);
dispatch<FetchTracesAction>({
type: ActionTypes.fetchTraces,
payload: response.data
//PNOTE - response.data in the axios response has the actual API response?
});
}
};
};
export const fetchTraceItem = (traceID: string) => {
return async (dispatch: Dispatch) => {
let request_string = 'traces/'+traceID;
const response = await tracesAPI.get<spansWSameTraceIDResponse>(request_string);
dispatch<FetchTraceItemAction>({
type: ActionTypes.fetchTraceItem,
payload: response.data
//PNOTE - response.data in the axios response has the actual API response?
});
};
};


@@ -1,22 +0,0 @@
import { FetchTracesAction, FetchTraceItemAction } from './traces';
import { updateTraceFiltersAction, updateInputTagAction } from './traceFilters';
import {getServicesListAction,getServiceMetricsAction, getTopEndpointsAction, getFilteredTraceMetricsAction} from './metrics'
import {getUsageDataAction} from './usage'
import {updateTimeIntervalAction} from './global';
export enum ActionTypes {
updateTraceFilters,
updateInput,
fetchTraces,
fetchTraceItem,
getServicesList,
getServiceMetrics,
getTopEndpoints,
getUsageData,
updateTimeInterval,
getFilteredTraceMetrics,
}
export type Action = FetchTraceItemAction | FetchTracesAction | updateTraceFiltersAction | updateInputTagAction |getServicesListAction| getServiceMetricsAction| getTopEndpointsAction | getUsageDataAction | updateTimeIntervalAction| getFilteredTraceMetricsAction;


@@ -1,29 +0,0 @@
import { Dispatch } from 'redux';
import metricsAPI from '../api/metricsAPI';
import { ActionTypes } from './types';
import { GlobalTime } from './global';
export interface usageDataItem{
"timestamp":number;
"count":number;
}
export interface getUsageDataAction {
type: ActionTypes.getUsageData;
payload: usageDataItem[];
}
export const getUsageData = (globalTime: GlobalTime) => {
return async (dispatch: Dispatch) => {
let request_string = 'usage?start='+globalTime.minTime+'&end='+globalTime.maxTime+'&step=3600&service=driver';
//Step can only be multiple of 3600
const response = await metricsAPI.get<usageDataItem[]>(request_string);
dispatch<getUsageDataAction>({
type: ActionTypes.getUsageData,
payload: response.data
//PNOTE - response.data in the axios response has the actual API response
});
};
};


@@ -0,0 +1,3 @@
const apiV1 = "/api/v1/";
export default apiV1;


@@ -1,9 +0,0 @@
import axios from 'axios';
// No auth for the API
export default axios.create({
baseURL: 'https://api.signoz.io/api/prom/api/v1',
}
);


@@ -0,0 +1,9 @@
import axios, { AxiosRequestConfig } from "axios";
import { ENVIRONMENT } from "Src/constants/env";
import apiV1 from "./apiV1";
export default axios.create({
baseURL: `${ENVIRONMENT.baseURL}`,
});
export { apiV1 };


@@ -1,10 +0,0 @@
import axios from 'axios';
export default axios.create({
// baseURL: 'http://104.211.113.204:8080/api/v1/',
// baseURL: process.env.REACT_APP_QUERY_SERVICE_URL,
// console.log('in metrics API', process.env.QUERY_SERVICE_URL)
baseURL: '/api/v1/',
}
);


@@ -1,16 +0,0 @@
import axios from 'axios';
export default axios.create({
// baseURL: 'https://api.telegram.org/bot1518273960:AAHcgVvym9a0Qkl-PKiCI84X1VZaVbkTud0/',
// baseURL: 'http://104.211.113.204:8080/api/v1/',
baseURL: '/api/v1/',
}
);
//https://api.telegram.org/bot1518273960:AAHcgVvym9a0Qkl-PKiCI84X1VZaVbkTud0/sendMessage?chat_id=351813222&text=Hello%20there
// Chat ID can be obtained from here
//https://api.telegram.org/bot1518273960:AAHcgVvym9a0Qkl-PKiCI84X1VZaVbkTud0/getUpdates


@@ -1,12 +0,0 @@
import axios from 'axios';
//import { format } from 'path';
export default axios.create({
// baseURL: 'http://104.211.113.204:8080/api/v1/' //comment this line and remove this comment before pushing
// baseURL: process.env.QUERY_SERVICE_URL,
// console.log('in traces API', process.env.QUERY_SERVICE_URL)
baseURL: '/api/v1/',
});


@@ -1,6 +1,9 @@
@import "~antd/dist/antd.dark.css";
@import "~antd/dist/antd.compact.css";
.ant-space-item {
margin-right: 0 !important;
}
/* #components-layout-demo-side .logo {
height: 32px;
margin: 16px;
@@ -10,3 +13,17 @@
.site-layout .site-layout-background {
background: #fff;
} */
.instrument-card{
border-radius: 4px;
background: #313131;
padding: 33px 23px;
max-width: 800px;
margin-top: 40px;
}
#chart svg{
width: 100%;
}
#chart{
width: 100%;
}


@@ -1,122 +0,0 @@
import React, { Suspense, useState } from 'react';
import { Layout, Menu, Switch as ToggleButton, Spin, Row, Col } from 'antd';
import { useThemeSwitcher } from "react-css-theme-switcher";
import { NavLink, BrowserRouter as Router, Route, Switch } from 'react-router-dom';
import {
LineChartOutlined,
BarChartOutlined,
DeploymentUnitOutlined,
AlignLeftOutlined,
} from '@ant-design/icons';
import DateTimeSelector from './DateTimeSelector';
import ShowBreadcrumbs from './ShowBreadcrumbs';
const { Content, Footer, Sider } = Layout;
const ServiceMetrics = React.lazy(() => import('./metrics/ServiceMetricsDef'));
const ServiceMap = React.lazy(() => import('./servicemap/ServiceMap'));
const TraceDetail = React.lazy(() => import('./traces/TraceDetail'));
const TraceGraph = React.lazy(() => import ('./traces/TraceGraphDef' ));
const UsageExplorer = React.lazy(() => import ('./usage/UsageExplorerDef' ));
const ServicesTable = React.lazy(() => import('./metrics/ServicesTableDef'));
// const Signup = React.lazy(() => import('./Signup'));
//PNOTE
//React.lazy currently only supports default exports. If the module you want to import uses named exports, you can create an intermediate module that re-exports it as the default. This ensures that tree shaking keeps working and that you don't pull in unused components.
const App = () => {
// state = { collapsed: false, isDarkMode: true };
const { switcher, currentTheme, status, themes } = useThemeSwitcher();
const [isDarkMode, setIsDarkMode] = useState(true);
const [collapsed, setCollapsed] = useState(false);
const toggleTheme = (isChecked :boolean) => {
setIsDarkMode(isChecked);
switcher({ theme: isChecked ? themes.dark : themes.light });
};
const onCollapse = (): void => {
setCollapsed(!collapsed);
};
if (status === "loading") {
return null;
}
return (
<Router basename="/">
<Layout style={{ minHeight: '100vh' }}>
<Sider collapsible collapsed={collapsed} onCollapse={onCollapse} width={160}>
<div className="logo">
<ToggleButton checked={isDarkMode} onChange={toggleTheme} />
<NavLink to='/'><img src={"signoz.svg"} alt={'SigNoz'} style={{margin: '5%', width: 100, display: !collapsed ? 'block' : 'none'}} /></NavLink>
</div>
<Menu theme="dark" defaultSelectedKeys={['1']} mode="inline">
<Menu.Item key="1" icon={<BarChartOutlined />}>
<NavLink to='/application' style={{fontSize: 12, textDecoration: 'none'}}>Metrics</NavLink>
</Menu.Item>
<Menu.Item key="2" icon={<AlignLeftOutlined />}>
<NavLink to='/traces' style={{fontSize: 12, textDecoration: 'none'}}>Traces</NavLink>
</Menu.Item>
<Menu.Item key="3" icon={<DeploymentUnitOutlined />}>
<NavLink to='/service-map' style={{fontSize: 12, textDecoration: 'none'}}>Service Map</NavLink>
</Menu.Item>
<Menu.Item key="4" icon={<LineChartOutlined />}>
<NavLink to='/usage-explorer' style={{fontSize: 12, textDecoration: 'none'}}>Usage Explorer</NavLink>
</Menu.Item>
</Menu>
</Sider>
<Layout className="site-layout">
<Content style={{ margin: '0 16px' }}>
<Row>
<Col span={20}>
<ShowBreadcrumbs />
</Col>
<Col span={4}>
<DateTimeSelector />
</Col>
</Row>
{/* <Divider /> */}
<Suspense fallback={<Spin size="large" />}>
<Switch>
<Route path="/application/:servicename" component={ServiceMetrics}/>
<Route path="/service-map" component={ServiceMap}/>
<Route path="/traces" exact component={TraceDetail}/>
<Route path="/traces/:id" component={TraceGraph}/>
<Route path="/usage-explorer" component={UsageExplorer}/>
<Route path="/" component={ServicesTable}/>
<Route path="/application" exact component={ServicesTable}/>
{/* <Route path="/signup" component={Signup} /> */}
</Switch>
</Suspense>
</Content>
<Footer style={{ textAlign: 'center', fontSize: 10 }}>SigNoz Inc. ©2020 </Footer>
</Layout>
</Layout>
</Router>
);
}
export default App;


@@ -1,53 +0,0 @@
import React,{Suspense, useState} from 'react';
import {Spin} from 'antd';
import { BrowserRouter as Router, Route, Switch, Redirect } from 'react-router-dom';
const Signup = React.lazy(() => import('./Signup'));
const App = React.lazy(() => import('./App'));
const AppWrapper = () => {
const [isUserAuthenticated, setIsUserAuthenticated] = useState(false);
return(
<Router basename="/">
<Suspense fallback={<Spin size="large" />}>
<Switch>
<Route path="/signup" exact component={Signup} />
<Route path="/application" exact component={App} />
<Route path="/application/:servicename" component={App}/>
<Route path="/service-map" component={App}/>
<Route path="/traces" exact component={App}/>
<Route path="/traces/:id" component={App}/>
<Route path="/usage-explorer" component={App}/>
<Route path="/" exact
render={() => {
return (
localStorage.getItem('isLoggedIn')==='yes' ?
<Redirect to="/application" /> :
<Redirect to="/signup" />
)
}}
/>
</Switch>
</Suspense>
</Router>
);
}
export default AppWrapper;


@@ -1,56 +0,0 @@
import React, { useState } from 'react';
import { Modal, DatePicker} from 'antd';
import {DateTimeRangeType} from '../actions'
import { Moment } from 'moment'
import moment from 'moment';
const { RangePicker } = DatePicker;
interface CustomDateTimeModalProps {
visible: boolean;
onCreate: (dateTimeRange: DateTimeRangeType) => void; //Store is defined in antd forms library
onCancel: () => void;
}
const CustomDateTimeModal: React.FC<CustomDateTimeModalProps> = ({ //destructuring props
visible,
onCreate,
onCancel,
}) => {
// RangeValue<Moment> == [Moment|null,Moment|null]|null
const [customDateTimeRange, setCustomDateTimeRange]=useState<DateTimeRangeType>();
function handleRangePickerOk(date_time: DateTimeRangeType) {
setCustomDateTimeRange(date_time);
}
function disabledDate(current:Moment){
if (current > moment()){
return true;
}
else {
return false;
}
}
return (
<Modal
visible={visible}
title="Choose date and time range"
okText="Apply"
cancelText="Cancel"
onCancel={onCancel}
style={{ position: "absolute", top: 60, right: 40 }}
onOk={() => onCreate(customDateTimeRange?customDateTimeRange:null)}
>
<RangePicker disabledDate={disabledDate} onOk={handleRangePickerOk} showTime />
</Modal>
);
};
export default CustomDateTimeModal;
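The consuming selector only applies the picked range when both endpoints are present. That null-guarding can be factored into a small helper (a sketch — `toMsRange` is a hypothetical name, and plain `Date` stands in for the Moment values the real code uses):

```typescript
// Returns [minTime, maxTime] in epoch milliseconds, or null when
// either endpoint of the picked range is missing — the same guard
// handleOk applies before dispatching updateTimeInterval.
type RangeInput = [Date | null, Date | null] | null | undefined;

function toMsRange(range: RangeInput): [number, number] | null {
  if (!range || range[0] === null || range[1] === null) {
    return null;
  }
  return [range[0].valueOf(), range[1].valueOf()];
}
```

`toMsRange(null)` yields `null`, so callers can skip dispatching for incomplete selections.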


@@ -1,140 +0,0 @@
import React, { useState } from 'react';
import {Select, Button,Space, Form} from 'antd';
import styled from 'styled-components';
import { withRouter } from "react-router";
import { RouteComponentProps } from 'react-router-dom';
import { connect } from 'react-redux';
import CustomDateTimeModal from './CustomDateTimeModal';
import { GlobalTime, updateTimeInterval } from '../actions';
import { StoreState } from '../reducers';
import FormItem from 'antd/lib/form/FormItem';
import {DateTimeRangeType} from '../actions'
const { Option } = Select;
const DateTimeWrapper = styled.div`
margin-top:20px;
`;
interface DateTimeSelectorProps extends RouteComponentProps<any> {
currentpath?:string;
updateTimeInterval: Function;
globalTime: GlobalTime;
}
const _DateTimeSelector = (props:DateTimeSelectorProps) => {
const [customDTPickerVisible,setCustomDTPickerVisible]=useState(false);
const [timeInterval,setTimeInterval]=useState('15min')
const [refreshButtonHidden, setRefreshButtonHidden]=useState(false)
const [form_dtselector] = Form.useForm();
const handleOnSelect = (value:string) =>
{
if (value === 'custom')
{
setCustomDTPickerVisible(true);
}
else
{
props.history.push({
search: '?time='+value,
}) //pass time in URL query param for all choices except custom in datetime picker
props.updateTimeInterval(value);
setTimeInterval(value);
setRefreshButtonHidden(false); // for normal intervals, show refresh button
}
}
//function called on clicking apply in customDateTimeModal
const handleOk = (dateTimeRange:DateTimeRangeType) =>
{
// pass values in ms [minTime, maxTime]
if (dateTimeRange!== null && dateTimeRange!== undefined && dateTimeRange[0]!== null && dateTimeRange[1]!== null )
{
props.updateTimeInterval('custom',[dateTimeRange[0].valueOf(),dateTimeRange[1].valueOf()])
//setting globaltime
setRefreshButtonHidden(true);
form_dtselector.setFieldsValue({interval:(dateTimeRange[0].format("YYYY/MM/DD HH:mm")+'-'+dateTimeRange[1].format("YYYY/MM/DD HH:mm")) ,})
}
setCustomDTPickerVisible(false);
}
const timeSinceLastRefresh = () => {
let timeDiffSec = Math.round((Date.now() - Math.round(props.globalTime.maxTime/1000000))/1000);
//How will Refresh button get updated? Needs to be periodically updated via timer.
// For now, not returning any text here
// if (timeDiffSec < 60)
// return timeDiffSec.toString()+' s';
// else if (timeDiffSec < 3600)
// return Math.round(timeDiffSec/60).toString()+' min';
// else
// return Math.round(timeDiffSec/3600).toString()+' hr';
return null;
}
const handleRefresh = () =>
{
props.updateTimeInterval(timeInterval);
}
if (props.location.pathname.startsWith('/usage-explorer')) {
return null;
} else
{
return (
<DateTimeWrapper>
<Space>
<Form form={form_dtselector} layout='inline' initialValues={{ interval:'15min', }} style={{marginTop: 10, marginBottom:10}}>
<FormItem name='interval'>
<Select onSelect={handleOnSelect} >
<Option value="custom">Custom</Option>
<Option value="15min">Last 15 min</Option>
<Option value="30min">Last 30 min</Option>
<Option value="1hr">Last 1 hour</Option>
<Option value="6hr">Last 6 hour</Option>
<Option value="1day">Last 1 day</Option>
<Option value="1week">Last 1 week</Option>
</Select>
</FormItem>
<FormItem hidden={refreshButtonHidden} name='refresh_button'>
<Button type="primary" onClick={handleRefresh}>Refresh {timeSinceLastRefresh()}</Button>
{/* if refresh time is more than x min, give a message? */}
</FormItem>
</Form>
<CustomDateTimeModal
visible={customDTPickerVisible}
onCreate={handleOk}
onCancel={() => {
setCustomDTPickerVisible(false);
}}
/>
</Space>
</DateTimeWrapper>
);
}
}
const mapStateToProps = (state: StoreState ): { globalTime: GlobalTime} => {
return { globalTime : state.globalTime };
};
export const DateTimeSelector = connect(mapStateToProps, {
updateTimeInterval: updateTimeInterval,
})(_DateTimeSelector);
export default withRouter(DateTimeSelector);
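The commented-out branches in `timeSinceLastRefresh` sketch a humanized elapsed-time label. A standalone version of that logic (hypothetical helper name — the component currently returns `null` pending a periodic-refresh mechanism):

```typescript
// Formats a second count the way the commented branches intend:
// raw seconds under a minute, rounded minutes under an hour,
// rounded hours otherwise.
function formatTimeSince(timeDiffSec: number): string {
  if (timeDiffSec < 60) return `${timeDiffSec} s`;
  if (timeDiffSec < 3600) return `${Math.round(timeDiffSec / 60)} min`;
  return `${Math.round(timeDiffSec / 3600)} hr`;
}
```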


@@ -0,0 +1,31 @@
import React, { ReactElement } from "react";
import { Modal } from "antd";
export const CustomModal = ({
title,
children,
isModalVisible,
setIsModalVisible,
footer,
closable = true,
}: {
isModalVisible: boolean;
closable?: boolean;
setIsModalVisible: Function;
footer?: any;
title: string;
children: ReactElement;
}) => {
return (
<>
<Modal
title={title}
visible={isModalVisible}
footer={footer}
closable={closable}
onCancel={() => setIsModalVisible(false)}
>
{children}
</Modal>
</>
);
};


@@ -1,57 +0,0 @@
import React from 'react';
import {Breadcrumb} from 'antd';
import { Link, withRouter } from 'react-router-dom';
import styled from 'styled-components';
const BreadCrumbWrapper = styled.div`
padding-top:20px;
padding-left:20px;
`;
const breadcrumbNameMap :any = { // PNOTE - TO DO - Remove any and do typechecking - like https://stackoverflow.com/questions/56568423/typescript-no-index-signature-with-a-parameter-of-type-string-was-found-on-ty
'/application': 'Application',
'/traces': 'Traces',
'/service-map': 'Service Map',
'/usage-explorer': 'Usage Explorer',
};
const ShowBreadcrumbs = withRouter(props => {
const { location } = props;
const pathSnippets = location.pathname.split('/').filter(i => i);
const extraBreadcrumbItems = pathSnippets.map((_, index) => {
const url = `/${pathSnippets.slice(0, index + 1).join('/')}`;
if (breadcrumbNameMap[url] === undefined){
return (
<Breadcrumb.Item key={url}>
<Link to={url}>{url.split('/').slice(-1)[0]}</Link>
</Breadcrumb.Item>
);
} else
{
return (
<Breadcrumb.Item key={url}>
<Link to={url}>{breadcrumbNameMap[url]}</Link>
</Breadcrumb.Item>
);
}
});
const breadcrumbItems = [
<Breadcrumb.Item key="home">
<Link to="/">Home</Link>
</Breadcrumb.Item>,
].concat(extraBreadcrumbItems);
return (
<BreadCrumbWrapper>
<Breadcrumb>{breadcrumbItems}</Breadcrumb>
</BreadCrumbWrapper>
);
});
export default ShowBreadcrumbs;
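The PNOTE above asks for the `any` on `breadcrumbNameMap` to be removed. An index signature does that, and also lets the two render branches collapse into one lookup with a fallback (a sketch of the typing fix; `breadcrumbLabel` is a hypothetical name):

```typescript
const breadcrumbNameMap: Record<string, string> = {
  '/application': 'Application',
  '/traces': 'Traces',
  '/service-map': 'Service Map',
  '/usage-explorer': 'Usage Explorer',
};

// Label for a breadcrumb URL: the mapped name when known,
// otherwise the last path segment (e.g. a service name).
function breadcrumbLabel(url: string): string {
  return breadcrumbNameMap[url] ?? url.split('/').slice(-1)[0];
}
```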


@@ -1,235 +0,0 @@
import React, { useState,useRef, Suspense } from 'react';
import { Row, Space, Button, Input, Checkbox } from 'antd'
import submitForm from '../api/submitForm';
import { withRouter } from "react-router";
import { RouteComponentProps } from 'react-router-dom';
interface SignUpProps extends RouteComponentProps<any> {
}
const Signup = (props:SignUpProps) => {
const [state, setState] = useState({ submitted: false })
const [formState, setFormState] = useState({
firstName: {value:''},
companyName: {value:''},
email: {value:''},
password: {value:'',valid:true},
emailOptIn: { value: true },
})
const passwordInput = useRef(null)
// const { createAccount } = useActions(signupLogic)
// const { accountLoading } = useValues(signupLogic)
// const { plan } = fromParams()
const updateForm = (name:any, target:any, valueAttr = 'value') => {
/* Validate password (if applicable) */
if (name === 'password') {
let password = target[valueAttr]
const valid = password.length >= 8
setFormState({ ...formState, password: { ...formState.password, valid, value: target[valueAttr] } })
} else
if (name === 'firstName') {
setFormState({ ...formState, firstName: { ...formState.firstName, value: target[valueAttr] } })
} else
if (name === 'companyName') {
setFormState({ ...formState, companyName: { ...formState.companyName, value: target[valueAttr] } })
} else
if (name === 'email') {
setFormState({ ...formState, email: { ...formState.email, value: target[valueAttr] } })
} else
if (name === 'emailOptIn') {
setFormState({ ...formState, emailOptIn: { ...formState.emailOptIn, value: target[valueAttr] } })
}
}
const handleSubmit = (e:any) => {
e.preventDefault()
console.log('in handle submit');
setState({ ...state, submitted: true })
/* Password has custom validation */
if (!formState.password.valid) {
// if (passwordInput.current){
//     passwordInput.current.focus()
// }
return
}
const payload = {
first_name: formState.firstName.value,
company_name: formState.companyName.value || undefined,
email: formState.email.value,
password: formState.password.value,
email_opt_in: formState.emailOptIn.value,
// plan, // Pass it along if on QS, won't have any effect unless on multitenancy
}
// createAccount(payload)
// axios.get(`https://jsonplaceholder.typicode.com/users`)
// .then(res => {
// console.log(res);
// console.log(res.data);
// })
let texttolog = JSON.stringify(payload)
// submitForm.get('sendMessage', {
// params: {
// chat_id: 351813222,
// text:texttolog,
// }
// }
// ).then(res => {
// console.log(res);
// console.log(res.data);
// })
submitForm.post('user?email='+texttolog
).then(res => {
console.log(res);
console.log(res.data);
})
localStorage.setItem('isLoggedIn', 'yes');
props.history.push('/application')
};
return (
<div className="signup-form">
<Space direction="vertical" className="space-top" style={{ width: '100%', paddingLeft: 32 }}>
<h1 className="title" style={{ marginBottom: 0, marginTop: 40, display: 'flex', alignItems: 'center' }}>
{/* <img src={"Signoz-white.svg"} alt="" style={{ height: 60 }} /> */}
Create your account
</h1>
<div className="page-caption">Monitor your applications. Find what is causing issues.</div>
</Space>
<Row style={{ display: 'flex', justifyContent: 'center' }}>
<div style={{ display: 'flex', alignItems: 'center', flexDirection: 'column' }}>
<img
src={"signoz.svg"}
style={{ maxHeight: '100%', maxWidth: 300, marginTop: 64 }}
alt=""
className="main-img"
/>
</div>
<div
style={{
display: 'flex',
justifyContent: 'flex-start',
margin: '0 32px',
flexDirection: 'column',
paddingTop: 32,
maxWidth: '32rem',
}}
>
<form onSubmit={handleSubmit}>
<div className="input-set">
<label htmlFor="signupEmail">Email</label>
<Input
placeholder="mike@netflix.com"
type="email"
value={formState.email.value}
onChange={(e) => updateForm('email', e.target)}
required
// disabled={accountLoading}
id="signupEmail"
/>
</div>
<div className={`input-set ${state.submitted && !formState.password.valid ? 'errored' : ''}`}>
<label htmlFor="signupPassword">Password</label>
<Input.Password
value={formState.password.value}
onChange={(e) => updateForm('password', e.target)}
required
ref={passwordInput}
// disabled={accountLoading}
id="signupPassword"
/>
<Suspense fallback={<span />}>
{/* <PasswordStrength password={formState.password.value} /> */}
</Suspense>
{!formState.password.valid && (
<span className="caption">Your password must have at least 8 characters.</span>
)}
</div>
<div className="input-set">
<label htmlFor="signupFirstName">First Name</label>
<Input
placeholder="Mike"
autoFocus
value={formState.firstName.value}
onChange={(e) => updateForm('firstName', e.target)}
required
// disabled={accountLoading}
id="signupFirstName"
/>
</div>
<div className="input-set">
<label htmlFor="signupCompanyName">Company or Project</label>
<Input
placeholder="Netflix"
value={formState.companyName.value}
onChange={(e) => updateForm('companyName', e.target)}
// disabled={accountLoading}
id="signupCompanyName"
/>
</div>
<div>
<Checkbox
checked={formState.emailOptIn.value}
onChange={(e) => updateForm('emailOptIn', e.target, 'checked')}
// disabled={accountLoading}
>
Send me occasional emails about security and product updates. You may unsubscribe at any
time.
</Checkbox>
</div>
<div className="text-center space-top">
<Button
type="primary"
htmlType="submit"
data-attr="signup"
disabled={state.submitted && !formState.password.valid}
// loading={accountLoading}
>
Get Started
</Button>
</div>
{/* <div style={{ color: '#666666', marginBottom: 60, textAlign: 'center' }} className="space-top">
By clicking the button above you agree to our{' '}
<a href="https://signoz.io" target="_blank">
Terms of Service
</a>{' '}
and{' '}
<a href="https://signoz.io" target="_blank">
Privacy Policy
</a>
.
</div> */}
</form>
</div>
</Row>
</div>
);
}
export default withRouter(Signup);
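The only custom rule the form applies is the length check buried inside `updateForm`. Factoring it out lets `handleSubmit` re-check before posting (a sketch — `isValidPassword` is a hypothetical name; the rule itself is the one used inline above):

```typescript
// Minimum-length rule applied to the signup password.
function isValidPassword(password: string): boolean {
  return password.length >= 8;
}
```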


@@ -1,240 +0,0 @@
import React from 'react';
import { Line as ChartJSLine } from 'react-chartjs-2';
import { ChartOptions } from 'chart.js';
import { withRouter } from "react-router";
import { RouteComponentProps } from 'react-router-dom';
import styled from 'styled-components';
import { metricItem } from '../../actions/metrics'
const ChartPopUpUnique = styled.div<{ ycoordinate: number, xcoordinate: number }>`
background-color:white;
border:1px solid rgba(219,112,147,0.5);
z-index:10;
position:absolute;
top:${props => props.ycoordinate}px;
left:${props => props.xcoordinate}px;
font-size:12px;
border-radius:2px;
`;
const PopUpElements = styled.p`
color:black;
margin-bottom:0px;
padding-left:4px;
padding-right:4px;
&:hover {
cursor:pointer;
}
`;
// PNOTE - Check if this should be the case
const theme = 'dark';
interface ErrorRateChartProps extends RouteComponentProps<any> {
data : metricItem[],
}
interface ErrorRateChart {
chartRef: any;
}
class ErrorRateChart extends React.Component<ErrorRateChartProps>{
constructor(props: ErrorRateChartProps) {
super(props);
this.chartRef = React.createRef();
}
state = {
// data: props.data,
xcoordinate:0,
ycoordinate:0,
showpopUp:false,
// graphInfo:{}
}
onClickhandler = async(e:any,event:any) => {
var firstPoint;
if(this.chartRef){
firstPoint = this.chartRef.current.chartInstance.getElementAtEvent(e)[0];
}
if (firstPoint)
{// PNOTE - TODO - Is await needed in this expression?
await this.setState({
xcoordinate:e.offsetX+20,
ycoordinate:e.offsetY,
showpopUp:true,
// graphInfo:{...event}
})
}
}
gotoTracesHandler=()=>{
this.props.history.push('/traces')
}
gotoAlertsHandler=()=>{
this.props.history.push('/service-map')
// PNOTE - Keeping service map for now, will replace with alerts when alert page is made
}
options_charts: ChartOptions = {
onClick: this.onClickhandler,
maintainAspectRatio: true,
responsive: true,
title: {
display: true,
text: 'Errors per sec',
fontSize: 20,
position:'top',
padding: 2,
fontFamily: 'Arial',
fontStyle: 'regular',
fontColor:theme === 'dark'? 'rgb(200, 200, 200)':'rgb(20, 20, 20)' ,
},
legend: {
display: true,
position: 'bottom',
align: 'center',
labels: {
fontColor:theme === 'dark'? 'rgb(200, 200, 200)':'rgb(20, 20, 20)' ,
fontSize: 10,
boxWidth : 10,
usePointStyle : true,
}
},
tooltips: {
mode: 'label',
bodyFontSize: 12,
titleFontSize: 12,
callbacks: {
label: function(tooltipItem, data) {
if (typeof(tooltipItem.yLabel) === 'number')
{
return data.datasets![tooltipItem.datasetIndex!].label +' : '+ tooltipItem.yLabel.toFixed(2);
}
else
{
return '';
}
},
},
},
scales: {
yAxes: [
{
stacked: false,
ticks: {
beginAtZero: false,
fontSize: 10,
autoSkip: true,
maxTicksLimit: 6,
},
// scaleLabel: {
// display: true,
// labelString: 'latency in ms',
// fontSize: 6,
// padding: 4,
// },
gridLines: {
// You can change the color, the dash effect, the main axe color, etc.
borderDash: [1, 4],
color: "#D3D3D3",
lineWidth: 0.25,
}
},
],
xAxes: [{
type: 'time',
// time: {
// unit: 'second'
// },
distribution:'linear',
ticks: {
beginAtZero: false,
fontSize: 10,
autoSkip: true,
maxTicksLimit: 10,
},
// gridLines: false, --> not a valid option
}]
},
}
GraphTracePopUp = () => {
if (this.state.showpopUp){
return(
<ChartPopUpUnique xcoordinate={this.state.xcoordinate} ycoordinate={this.state.ycoordinate}>
<PopUpElements onClick={this.gotoTracesHandler}>View Traces</PopUpElements>
<PopUpElements onClick={this.gotoAlertsHandler}>Set Alerts</PopUpElements>
</ChartPopUpUnique>
)
}
else
return null;
}
render(){
const ndata = this.props.data;
const data_chartJS = (canvas:any) => {
const ctx = canvas.getContext("2d");
const gradient = ctx.createLinearGradient(0, 0, 0, 100);
gradient.addColorStop(0, 'rgba(250,174,50,1)');
gradient.addColorStop(1, 'rgba(250,174,50,1)');
return{labels: ndata.map(s => new Date(s.timestamp/1000000)), // converting from nanoseconds to milliseconds
datasets: [{
label: 'Errors per sec',
data: ndata.map(s => s.errorRate),
pointRadius: 0.5,
borderColor: 'rgba(227, 74, 51,1)', // Can also add transparency in border color
borderWidth: 2,
},
]}
};
return(
<div>
{this.GraphTracePopUp()}
<ChartJSLine ref={this.chartRef} data={data_chartJS} options={this.options_charts} />
</div>
);
}
}
export default withRouter(ErrorRateChart);


@@ -1,80 +0,0 @@
import React from 'react';
import { Bar, Line as ChartJSLine } from 'react-chartjs-2';
import styled from 'styled-components';
import { customMetricsItem } from '../../actions/metrics'
const GenVisualizationWrapper = styled.div`
height:160px;
`;
interface GenericVisualizationsProps {
chartType: string;
data: customMetricsItem[];
}
const GenericVisualizations = (props: GenericVisualizationsProps) => {
const data = {
labels: props.data.map(s => new Date(s.timestamp/1000000)),
datasets: [{
data: props.data.map(s => s.value),
borderColor: 'rgba(250,174,50,1)',// for line chart
backgroundColor: props.chartType==='bar'?'rgba(250,174,50,1)':'', // for bar charts only; don't assign backgroundColor when it's not a bar chart (may be relevant for area graphs though)
},
]
};
const options= {
responsive: true,
maintainAspectRatio: false,
legend: {
display: false,
},
scales: {
yAxes: [{
gridLines: {
drawBorder: false,
},
ticks: {
display: false
}
}],
xAxes: [{
type: 'time',
// distribution: 'linear',
//'linear': data are spread according to their time (distances can vary)
// From https://www.chartjs.org/docs/latest/axes/cartesian/time.html
ticks: {
beginAtZero: false,
fontSize: 10,
autoSkip: true,
maxTicksLimit: 10,
},
// gridLines: false, --> not a valid option
}],
},
};
if(props.chartType === 'line')
{
return (
<GenVisualizationWrapper>
<ChartJSLine data={data} options={options} />
</GenVisualizationWrapper>
);
} else if (props.chartType === 'bar')
{
return (
<GenVisualizationWrapper>
<Bar data={data} options={options} />
</GenVisualizationWrapper>
);
}
else
return null;
}
export default GenericVisualizations;


@@ -1,257 +0,0 @@
import React from 'react';
import { Line as ChartJSLine } from 'react-chartjs-2';
import { ChartOptions } from 'chart.js';
import { withRouter } from "react-router";
import { RouteComponentProps } from 'react-router-dom';
import styled from 'styled-components';
import { metricItem } from '../../actions/metrics'
const ChartPopUpUnique = styled.div<{ ycoordinate: number, xcoordinate: number }>`
background-color:white;
border:1px solid rgba(219,112,147,0.5);
z-index:10;
position:absolute;
top:${props => props.ycoordinate}px;
left:${props => props.xcoordinate}px;
font-size:12px;
border-radius:2px;
`;
const PopUpElements = styled.p`
color:black;
margin-bottom:0px;
padding-left:4px;
padding-right:4px;
&:hover {
cursor:pointer;
}
`;
const theme = 'dark';
interface LatencyLineChartProps extends RouteComponentProps<any> {
data : metricItem[],
popupClickHandler: Function,
}
interface LatencyLineChart {
chartRef: any;
}
class LatencyLineChart extends React.Component<LatencyLineChartProps>{
constructor(props: LatencyLineChartProps) {
super(props);
this.chartRef = React.createRef();
}
state = {
xcoordinate:0,
ycoordinate:0,
showpopUp:false,
firstpoint_ts:0,
// graphInfo:{}
}
onClickhandler = async(e:any,event:any) => {
var firstPoint;
if(this.chartRef){
firstPoint = this.chartRef.current.chartInstance.getElementAtEvent(e)[0];
}
if (firstPoint)
{
this.setState({
xcoordinate:e.offsetX+20,
ycoordinate:e.offsetY,
showpopUp:true,
firstpoint_ts:this.props.data[firstPoint._index].timestamp,
// graphInfo:{...event}
})
}
else
{
// if clicked outside of the graph line, then firstpoint is undefined -> close popup.
// Only works for clicking in the same chart - as click handler only detects clicks in that chart
this.setState({
showpopUp:false,
})
}
}
gotoTracesHandler=(xc:any)=>{
this.props.history.push('/traces')
}
gotoAlertsHandler=()=>{
this.props.history.push('/service-map')
// PNOTE - Keeping service map for now, will replace with alerts when alert page is made
}
options_charts: ChartOptions = {
onClick: this.onClickhandler,
maintainAspectRatio: true,
responsive: true,
title: {
display: true,
text: 'Application Latency in ms',
fontSize: 20,
position:'top',
padding: 8,
fontFamily: 'Arial',
fontStyle: 'regular',
fontColor:theme === 'dark'? 'rgb(200, 200, 200)':'rgb(20, 20, 20)' ,
},
legend: {
display: true,
position: 'bottom',
align: 'center',
labels: {
fontColor:theme === 'dark'? 'rgb(200, 200, 200)':'rgb(20, 20, 20)' ,
fontSize: 10,
boxWidth : 10,
usePointStyle : true,
}
},
tooltips: {
mode: 'label',
bodyFontSize: 12,
titleFontSize: 12,
callbacks: {
label: function(tooltipItem, data) {
if (typeof(tooltipItem.yLabel) === 'number')
{
return data.datasets![tooltipItem.datasetIndex!].label +' : '+ tooltipItem.yLabel.toFixed(2);
}
else
{
return '';
}
},
},
},
scales: {
yAxes: [
{
stacked: false,
ticks: {
beginAtZero: false,
fontSize: 10,
autoSkip: true,
maxTicksLimit: 6,
},
gridLines: {
// You can change the color, the dash effect, the main axe color, etc.
borderDash: [1, 4],
color: "#D3D3D3",
lineWidth: 0.25,
}
},
],
xAxes: [{
type: 'time',
// time: {
// unit: 'second'
// },
distribution: 'linear',
//'linear': data are spread according to their time (distances can vary)
// From https://www.chartjs.org/docs/latest/axes/cartesian/time.html
ticks: {
beginAtZero: false,
fontSize: 10,
autoSkip: true,
maxTicksLimit: 10,
},
// gridLines: false, --> not a valid option
}]
},
}
GraphTracePopUp = () => {
if (this.state.showpopUp){
return(
<ChartPopUpUnique xcoordinate={this.state.xcoordinate} ycoordinate={this.state.ycoordinate}>
<PopUpElements onClick={() => this.props.popupClickHandler(this.state.firstpoint_ts)}>View Traces</PopUpElements>
<PopUpElements onClick={this.gotoAlertsHandler}>Set Alerts</PopUpElements>
</ChartPopUpUnique>
)
}
else
return null;
}
render(){
const ndata = this.props.data;
const data_chartJS = (canvas:any) => {
const ctx = canvas.getContext("2d");
const gradient = ctx.createLinearGradient(0, 0, 0, 100);
gradient.addColorStop(0, 'rgba(250,174,50,1)');
gradient.addColorStop(1, 'rgba(250,174,50,1)');
return{ labels: ndata.map(s => new Date(s.timestamp/1000000)),
datasets: [{
label: 'p99 Latency',
data: ndata.map(s => s.p99/1000000), //converting latency from nano sec to ms
pointRadius: 0.5,
borderColor: 'rgba(250,174,50,1)', // Can also add transparency in border color
borderWidth: 2,
},
{
label: 'p90 Latency',
data: ndata.map(s => s.p90/1000000), //converting latency from nano sec to ms
pointRadius: 0.5,
borderColor: 'rgba(227, 74, 51, 1.0)',
borderWidth: 2,
},
{
label: 'p50 Latency',
data: ndata.map(s => s.p50/1000000), //converting latency from nano sec to ms
pointRadius: 0.5,
borderColor: 'rgba(57, 255, 20, 1.0)',
borderWidth: 2,
},
]}
};
return(
<div>
{this.GraphTracePopUp()}
<ChartJSLine ref={this.chartRef} data={data_chartJS} options={this.options_charts} />
</div>
);
}
}
export default withRouter(LatencyLineChart);
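The chart components repeat the nanosecond-to-millisecond conversion inline (`s.timestamp/1000000`, `s.p99/1000000`, …). A shared helper would make the unit explicit (a sketch — `nsToMs` is a hypothetical name):

```typescript
// metricItem timestamps and latency percentiles arrive in
// nanoseconds; Chart.js wants milliseconds.
const NS_PER_MS = 1_000_000;

function nsToMs(ns: number): number {
  return ns / NS_PER_MS;
}

// e.g. chart labels: ndata.map(s => new Date(nsToMs(s.timestamp)))
```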


@@ -1,223 +0,0 @@
import React from 'react';
import { Line as ChartJSLine } from 'react-chartjs-2';
import { ChartOptions } from 'chart.js';
import { withRouter } from "react-router";
import { RouteComponentProps } from 'react-router-dom';
import styled from 'styled-components';
import { metricItem } from '../../actions/metrics'
const ChartPopUpUnique = styled.div<{ ycoordinate: number, xcoordinate: number }>`
background-color:white;
border:1px solid rgba(219,112,147,0.5);
z-index:10;
position:absolute;
top:${props => props.ycoordinate}px;
left:${props => props.xcoordinate}px;
font-size:12px;
border-radius:2px;
`;
const PopUpElements = styled.p`
color:black;
margin-bottom:0px;
padding-left:4px;
padding-right:4px;
&:hover {
cursor:pointer;
}
`;
const theme = 'dark';
interface RequestRateChartProps extends RouteComponentProps<any> {
data : metricItem[],
}
interface RequestRateChart {
chartRef: any;
}
class RequestRateChart extends React.Component<RequestRateChartProps>{
constructor(props: RequestRateChartProps) {
super(props);
this.chartRef = React.createRef();
}
state = {
xcoordinate:0,
ycoordinate:0,
showpopUp:false,
// graphInfo:{}
}
onClickhandler = async(e:any,event:any) => {
var firstPoint;
if(this.chartRef){
firstPoint = this.chartRef.current.chartInstance.getElementAtEvent(e)[0];
}
if (firstPoint)
{// PNOTE - TODO - Is await needed in this expression?
await this.setState({
xcoordinate:e.offsetX+20,
ycoordinate:e.offsetY,
showpopUp:true,
// graphInfo:{...event}
})
}
}
gotoTracesHandler=()=>{
this.props.history.push('/traces')
}
gotoAlertsHandler=()=>{
this.props.history.push('/service-map')
// PNOTE - Keeping service map for now, will replace with alerts when alert page is made
}
options_charts: ChartOptions = {
onClick: this.onClickhandler,
maintainAspectRatio: true,
responsive: true,
title: {
display: true,
text: 'Requests per sec',
fontSize: 20,
position:'top',
padding: 2,
fontFamily: 'Arial',
fontStyle: 'regular',
fontColor:theme === 'dark'? 'rgb(200, 200, 200)':'rgb(20, 20, 20)' ,
},
legend: {
display: true,
position: 'bottom',
align: 'center',
labels: {
fontColor:theme === 'dark'? 'rgb(200, 200, 200)':'rgb(20, 20, 20)' ,
fontSize: 10,
boxWidth : 10,
usePointStyle : true,
}
},
tooltips: {
mode: 'label',
bodyFontSize: 12,
titleFontSize: 12,
callbacks: {
label: function(tooltipItem, data) {
if (typeof(tooltipItem.yLabel) === 'number')
{
return data.datasets![tooltipItem.datasetIndex!].label +' : '+ tooltipItem.yLabel.toFixed(2);
}
else
{
return '';
}
},
},
},
scales: {
yAxes: [
{
stacked: false,
ticks: {
beginAtZero: false,
fontSize: 10,
autoSkip: true,
maxTicksLimit: 6,
},
gridLines: {
// You can change the color, the dash effect, the main axe color, etc.
borderDash: [1, 4],
color: "#D3D3D3",
lineWidth: 0.25,
}
},
],
xAxes: [{
type: 'time',
// time: {
// unit: 'second'
// },
distribution:'linear',
ticks: {
beginAtZero: false,
fontSize: 10,
autoSkip: true,
maxTicksLimit: 10,
},
// gridLines: false, --> not a valid option
}]
},
}
GraphTracePopUp = () => {
if (this.state.showpopUp){
return(
<ChartPopUpUnique xcoordinate={this.state.xcoordinate} ycoordinate={this.state.ycoordinate}>
<PopUpElements onClick={this.gotoTracesHandler}>View Traces</PopUpElements>
<PopUpElements onClick={this.gotoAlertsHandler}>Set Alerts</PopUpElements>
</ChartPopUpUnique>
)
}
else
return null;
}
render(){
const ndata = this.props.data;
const data_chartJS = (canvas:any) => {
const ctx = canvas.getContext("2d");
const gradient = ctx.createLinearGradient(0, 0, 0, 100);
gradient.addColorStop(0, 'rgba(250,174,50,1)');
gradient.addColorStop(1, 'rgba(250,174,50,1)');
return{labels: ndata.map(s => new Date(s.timestamp/1000000)),
datasets: [{
label: 'Requests per sec',
data: ndata.map(s => s.callRate),
pointRadius: 0.5,
borderColor: 'rgba(250,174,50,1)', // Can also add transparency in border color
borderWidth: 2,
},
]}
};
return(
<div>
{this.GraphTracePopUp()}
<ChartJSLine ref={this.chartRef} data={data_chartJS} options={this.options_charts} />
</div>
);
}
}
export default withRouter(RequestRateChart);


@@ -1,103 +0,0 @@
import React,{useEffect} from 'react';
import { Tabs, Card, Row, Col} from 'antd';
import { connect } from 'react-redux';
import { useParams, RouteComponentProps } from "react-router-dom";
import { withRouter } from "react-router";
import { getServicesMetrics, metricItem, getTopEndpoints, topEndpointListItem, GlobalTime, updateTimeInterval } from '../../actions';
import { StoreState } from '../../reducers'
import LatencyLineChart from "./LatencyLineChart"
import RequestRateChart from './RequestRateChart'
import ErrorRateChart from './ErrorRateChart'
import TopEndpointsTable from './TopEndpointsTable';
const { TabPane } = Tabs;
interface ServicesMetricsProps extends RouteComponentProps<any>{
serviceMetrics: metricItem[],
getServicesMetrics: Function,
topEndpointsList: topEndpointListItem[],
getTopEndpoints: Function,
globalTime: GlobalTime,
updateTimeInterval: Function,
}
const _ServiceMetrics = (props: ServicesMetricsProps) => {
const params = useParams<{ servicename?: string; }>();
useEffect( () => {
props.getServicesMetrics(params.servicename,props.globalTime);
props.getTopEndpoints(params.servicename,props.globalTime);
}, [props.globalTime,params.servicename]);
const onTracePopupClick = (timestamp:number) => {
props.updateTimeInterval('custom',[(timestamp/1000000)-5*60*1000,(timestamp/1000000)]) // updateTimeInterval takes the range in ms; start the window 5 min before the selected time
props.history.push('/traces')
}
return (
<Tabs defaultActiveKey="1">
<TabPane tab="Application Metrics" key="1">
<Row gutter={32} style={{ margin: 20 }}>
<Col span={12} >
<Card bodyStyle={{padding:10}}>
<LatencyLineChart data={props.serviceMetrics} popupClickHandler={onTracePopupClick} />
</Card>
</Col>
<Col span={12}>
<Card bodyStyle={{padding:10}}>
<RequestRateChart data={props.serviceMetrics} />
</Card>
</Col>
</Row>
<Row gutter={32} style={{ margin: 20 }}>
<Col span={12}>
<Card bodyStyle={{padding:10}}>
<ErrorRateChart data={props.serviceMetrics} />
</Card>
</Col>
<Col span={12}>
<Card bodyStyle={{padding:10}}>
<TopEndpointsTable data={props.topEndpointsList} />
</Card>
</Col>
</Row>
</TabPane>
<TabPane tab="External Calls" key="2">
<div style={{ margin: 20 }}> Coming Soon... </div>
<div className="container" style={{ display: 'flex', flexFlow: 'column wrap' }}>
<div className='row'>
<div className='col-md-6 col-sm-12 col-12'>
{/* <ChartJSLineChart data={this.state.graphData} /> */}
</div>
<div className='col-md-6 col-sm-12 col-12'>
{/* <ChartJSLineChart data={this.state.graphData} /> */}
</div>
</div>
</div>
</TabPane>
</Tabs>
);
}
const mapStateToProps = (state: StoreState): { serviceMetrics: metricItem[], topEndpointsList: topEndpointListItem[],globalTime: GlobalTime} => {
return { serviceMetrics : state.serviceMetrics, topEndpointsList: state.topEndpointsList, globalTime:state.globalTime};
};
export const ServiceMetrics = withRouter(connect(mapStateToProps, {
getServicesMetrics: getServicesMetrics,
getTopEndpoints: getTopEndpoints,
updateTimeInterval: updateTimeInterval,
})(_ServiceMetrics));
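`onTracePopupClick` derives a custom interval ending at the clicked point and starting five minutes earlier, converting the timestamp from nanoseconds to the milliseconds `updateTimeInterval` expects. As a pure function (a sketch — `traceWindow` is a hypothetical name):

```typescript
const FIVE_MIN_MS = 5 * 60 * 1000;

// [minTime, maxTime] in ms for a chart point clicked at
// `timestampNs` (nanoseconds): the 5-minute window ending there.
function traceWindow(timestampNs: number): [number, number] {
  const endMs = timestampNs / 1_000_000;
  return [endMs - FIVE_MIN_MS, endMs];
}
```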


@@ -1 +0,0 @@
export { ServiceMetrics as default } from './ServiceMetrics';


@@ -1,90 +0,0 @@
import React, {useEffect} from 'react';
import { useLocation } from "react-router-dom";
import { NavLink } from 'react-router-dom'
import { Table } from 'antd';
import styled from 'styled-components';
import { connect } from 'react-redux';
import { getServicesList, GlobalTime, servicesListItem } from '../../actions';
import { StoreState } from '../../reducers'
interface ServicesTableProps {
servicesList: servicesListItem[],
getServicesList: Function,
globalTime: GlobalTime,
}
const Wrapper = styled.div`
padding-top:40px;
padding-bottom:40px;
padding-left:40px;
padding-right:40px;
.ant-table table { font-size: 12px; };
.ant-table tfoot>tr>td, .ant-table tfoot>tr>th, .ant-table-tbody>tr>td, .ant-table-thead>tr>th { padding: 10px; };
`;
const columns = [
{
title: 'Application',
dataIndex: 'serviceName',
key: 'serviceName',
render: (text :string) => <NavLink style={{textTransform:'capitalize'}} to={'/application/' + text}><strong>{text}</strong></NavLink>,
},
{
title: 'P99 latency (in ms)',
dataIndex: 'p99',
key: 'p99',
sorter: (a:any, b:any) => a.p99 - b.p99,
// sortDirections: ['descend', 'ascend'],
render: (value: number) => (value/1000000).toFixed(2),
},
{
title: 'Error Rate (in %)',
dataIndex: 'errorRate',
key: 'errorRate',
sorter: (a:any, b:any) => a.errorRate - b.errorRate,
// sortDirections: ['descend', 'ascend'],
render: (value: number) => (value*100).toFixed(2),
},
{
title: 'Requests Per Second',
dataIndex: 'callRate',
key: 'callRate',
sorter: (a:any, b:any) => a.callRate - b.callRate,
// sortDirections: ['descend', 'ascend'],
render: (value: number) => value.toFixed(2),
},
];
const _ServicesTable = (props: ServicesTableProps) => {
const search = useLocation().search;
const time_interval = new URLSearchParams(search).get('time');
useEffect( () => {
props.getServicesList(props.globalTime);
}, [props.globalTime]);
return(
<Wrapper>
<Table dataSource={props.servicesList} columns={columns} pagination={false} />
</Wrapper>
);
}
const mapStateToProps = (state: StoreState): { servicesList: servicesListItem[], globalTime: GlobalTime } => {
return { servicesList : state.servicesList, globalTime:state.globalTime};
};
export const ServicesTable = connect(mapStateToProps, {
getServicesList: getServicesList,
})(_ServicesTable);


@@ -1 +0,0 @@
export { ServicesTable as default } from './ServicesTable';


@@ -1,73 +0,0 @@
import React from 'react';
import { NavLink } from 'react-router-dom';
import { Table } from 'antd'
import styled from 'styled-components';
import { topEndpointListItem } from '../../actions/metrics';
const Wrapper = styled.div`
padding-top:10px;
padding-bottom:10px;
padding-left:20px;
padding-right:20px;
.ant-table table { font-size: 12px; };
.ant-table tfoot>tr>td, .ant-table tfoot>tr>th, .ant-table-tbody>tr>td, .ant-table-thead>tr>th { padding: 10px; };
`;
interface TopEndpointsTableProps {
data : topEndpointListItem[],
}
const TopEndpointsTable = (props: TopEndpointsTableProps) => {
const columns: any = [
{
title: 'Name',
dataIndex: 'name',
key: 'name',
render: (text :string) => <NavLink to={'/' + text}>{text}</NavLink>,
},
{
title: 'P50 (in ms)',
dataIndex: 'p50',
key: 'p50',
sorter: (a:any, b:any) => a.p50 - b.p50,
// sortDirections: ['descend', 'ascend'],
render: (value: number) => (value/1000000).toFixed(2),
},
{
title: 'P90 (in ms)',
dataIndex: 'p90',
key: 'p90',
sorter: (a:any, b:any) => a.p90 - b.p90,
// sortDirections: ['descend', 'ascend'],
render: (value: number) => (value/1000000).toFixed(2),
},
{
title: 'P99 (in ms)',
dataIndex: 'p99',
key: 'p99',
sorter: (a:any, b:any) => a.p99 - b.p99,
// sortDirections: ['descend', 'ascend'],
render: (value: number) => (value/1000000).toFixed(2),
},
{
title: 'Number of Calls',
dataIndex: 'numCalls',
key: 'numCalls',
sorter: (a:any, b:any) => a.numCalls - b.numCalls,
},
];
return(
<Wrapper>
<h6>Top Endpoints</h6>
<Table dataSource={props.data} columns={columns} pagination={false} />
</Wrapper>
)
}
export default TopEndpointsTable;


@@ -1,71 +0,0 @@
import React from "react";
// import {useState} from "react";
import Graph from "react-graph-vis";
// import { graphEvents } from "react-graph-vis";
//PNOTE - types of react-graph-vis defined in typings folder.
//How is it imported directly?
// type definition for service graph - https://github.com/crubier/react-graph-vis/issues/80
// Set shapes - https://visjs.github.io/vis-network/docs/network/nodes.html#
// https://github.com/crubier/react-graph-vis/issues/93
const graph = {
nodes: [
{ id: 1, label: "Catalogue", shape: "box", color: "green",border: "black",size: 100 },
{ id: 2, label: "Users", shape: "box", color: "#FFFF00" },
{ id: 3, label: "Payment App", shape: "box", color: "#FB7E81" },
{ id: 4, label: "My Sql", shape: "box", size: 10, color: "#7BE141" },
{ id: 5, label: "Redis-db", shape: "box", color: "#6E6EFD" },
],
edges: [
{from:1,to:2,color: { color: "red" },size:{size:20}},
{from:2,to:3,color: { color: "red" }},
{from:1,to:3,color: { color: "red" }},
{from:3,to:4,color: { color: "red" }},
{from:3,to:5,color: { color: "red" }},
]
};
const options = {
layout: {
hierarchical: true
},
edges: {
color: "#000000"
},
height: "500px"
};
// const events = {
// select: function(event:any) { //PNOTE - TO DO - Get rid of any type
// var { nodes, edges } = event;
// }
// };
const ServiceGraph = () => {
// const [network, setNetwork] = useState(null);
return (
<React.Fragment>
<div> Updated Service Graph module coming soon...</div>
<Graph
graph={graph}
options={options}
// events={events}
// getNetwork={network => {
// // if you want access to vis.js network api you can set the state in a parent component using this property
// }}
/>
</React.Fragment>
);
}
export default ServiceGraph;


@@ -1,19 +0,0 @@
import React from 'react';
import ServiceGraph from './ServiceGraph'
const ServiceMap = () => {
return (
<div> Service Map module coming soon...
{/* <ServiceGraph /> */}
</div>
);
}
export default ServiceMap;


@@ -1,77 +0,0 @@
import React from 'react';
import { Card, Tag } from 'antd';
import { connect } from 'react-redux';
import { StoreState } from '../../reducers'
import { TagItem, TraceFilters, updateTraceFilters } from '../../actions';
interface FilterStateDisplayProps {
traceFilters: TraceFilters,
updateTraceFilters: Function,
}
const _FilterStateDisplay = (props: FilterStateDisplayProps) => {
function handleCloseTag(value:string) {
if (value==='service')
props.updateTraceFilters({...props.traceFilters,service:''})
if (value==='operation')
props.updateTraceFilters({...props.traceFilters,operation:''})
if (value==='maxLatency')
props.updateTraceFilters({...props.traceFilters,latency:{'max':'','min':props.traceFilters.latency?.min}})
if (value==='minLatency')
props.updateTraceFilters({...props.traceFilters,latency:{'min':'','max':props.traceFilters.latency?.max}})
}
function handleCloseTagElement(item:TagItem){
props.updateTraceFilters({...props.traceFilters,tags:props.traceFilters.tags?.filter(elem => elem !== item)})
}
return(
<Card style={{padding: 6, marginTop: 10, marginBottom: 10}} bodyStyle={{padding: 6}}>
{(props.traceFilters.service===''||props.traceFilters.service===undefined)? null:
<Tag style={{fontSize:14, padding: 8}} closable
onClose={e => {handleCloseTag('service');}}>
service:{props.traceFilters.service}
</Tag> }
{(props.traceFilters.operation===''||props.traceFilters.operation===undefined)? null:
<Tag style={{fontSize:14, padding: 8}} closable
onClose={e => {handleCloseTag('operation');}}>
operation:{props.traceFilters.operation}
</Tag> }
{props.traceFilters.latency===undefined||props.traceFilters.latency?.min===''? null:
<Tag style={{fontSize:14, padding: 8}} closable
onClose={e => {handleCloseTag('minLatency');}}>
minLatency:{(parseInt(props.traceFilters.latency!.min)/1000000).toString()}ms
</Tag> }
{props.traceFilters.latency===undefined||props.traceFilters.latency?.max===''? null:
<Tag style={{fontSize:14, padding: 8}} closable
onClose={e => {handleCloseTag('maxLatency');}}>
maxLatency:{(parseInt(props.traceFilters.latency!.max)/1000000).toString()}ms
</Tag> }
{props.traceFilters.tags === undefined? null: props.traceFilters.tags.map( item => (
<Tag key={item.key} style={{fontSize:14, padding: 8}} closable
onClose={e => {handleCloseTagElement(item);}}>
{item.key} {item.operator} {item.value}
</Tag>))}
</Card>
);
}
const mapStateToProps = (state: StoreState): { traceFilters: TraceFilters } => {
return { traceFilters : state.traceFilters };
};
export const FilterStateDisplay = connect(mapStateToProps,
{
updateTraceFilters: updateTraceFilters,
})(_FilterStateDisplay);


@@ -1,66 +0,0 @@
import React from 'react';
import { Modal, Form, InputNumber, Col, Row} from 'antd';
import { Store } from 'antd/lib/form/interface';
interface LatencyModalFormProps {
visible: boolean;
onCreate: (values: Store) => void; //Store is defined in antd forms library
onCancel: () => void;
}
const LatencyModalForm: React.FC<LatencyModalFormProps> = ({
visible,
onCreate,
onCancel,
}) => {
const [form] = Form.useForm();
return (
<Modal
visible={visible}
title="Choose min and max values of Latency"
okText="Apply"
cancelText="Cancel"
onCancel={onCancel}
onOk={() => {
form
.validateFields()
.then(values => {
form.resetFields();
onCreate(values); // giving error for values
})
.catch(info => {
console.log('Validate Failed:', info);
});
}}
>
<Form
form={form}
layout="horizontal"
name="form_in_modal"
initialValues={{ min: '100', max:'500' }}
>
<Row>
{/* <Input.Group compact> */}
<Col span={12}>
<Form.Item
name="min"
label="Min (in ms)"
// rules={[{ required: true, message: 'Please input the title of collection!' }]}
>
<InputNumber />
</Form.Item>
</Col>
<Col span={12}>
<Form.Item name="max" label="Max (in ms)">
<InputNumber />
</Form.Item>
</Col>
</Row>
{/* </Input.Group> */}
</Form>
</Modal>
);
};
export default LatencyModalForm;


@@ -1,46 +0,0 @@
import React from 'react';
import {Card,Tabs} from 'antd';
const { TabPane } = Tabs;
interface spanTagItem {
key:string;
type:string;
value:string;
}
interface SelectedSpanDetailsProps {
clickedSpanTags: spanTagItem[]
}
const SelectedSpanDetails = (props: SelectedSpanDetailsProps) => {
const callback = (key:any) => {
}
return (
<Card style={{ height: 320 }} >
<Tabs defaultActiveKey="1" onChange={callback}>
<TabPane tab="Tags" key="1">
<strong> Details for selected Span </strong>
{props.clickedSpanTags.map((tags,index)=>(
<li key={index} style={{color:'grey',fontSize:'13px',listStyle:'none'}}>
<span className='mr-1'>{tags.key}</span>:
<span className='ml-1'>{tags.key==='error'?"true":tags.value}</span>
</li>))}
</TabPane>
<TabPane tab="Errors" key="2">
{props.clickedSpanTags.filter(tags=>tags.key==='error').map(
(error, index) => <div className='ml-5' key={index}>
<p style={{color:'grey',fontSize:'10px'}}>
<span className='mr-1'>{error.key}</span>:
<span className='ml-1'>true</span></p>
</div>)}
</TabPane>
</Tabs>
</Card>
);
}
export default SelectedSpanDetails;


@@ -1,221 +0,0 @@
import React, {useState,useEffect} from 'react';
import GenericVisualizations from '../metrics/GenericVisualization'
import {Select, Card, Space, Form} from 'antd';
import { connect } from 'react-redux';
import { StoreState } from '../../reducers'
import {customMetricsItem, getFilteredTraceMetrics, GlobalTime, TraceFilters} from '../../actions';
const { Option } = Select;
const entity = [
{
title: 'Calls',
key:'calls',
dataindex:'calls'
},
{
title: 'Duration',
key:'duration',
dataindex:'duration'
},
{
title: 'Error',
key:'error',
dataindex:'error'
},
{
title: 'Status Code',
key:'status_code',
dataindex:'status_code'
},
];
const aggregation_options = [
{
linked_entity: 'calls',
default_selected:{title:'Count', dataindex:'count'},
options_available: [ {title:'Count', dataindex:'count'}, {title:'Rate (per sec)', dataindex:'rate_per_sec'}]
},
{
linked_entity: 'duration',
default_selected:{title:'p99', dataindex:'p99'},
// options_available: [ {title:'Avg', dataindex:'avg'}, {title:'Max', dataindex:'max'},{title:'Min', dataindex:'min'}, {title:'p50', dataindex:'p50'},{title:'p90', dataindex:'p90'}, {title:'p95', dataindex:'p95'}]
options_available: [ {title:'p50', dataindex:'p50'},{title:'p90', dataindex:'p90'}, {title:'p99', dataindex:'p99'}]
},
{
linked_entity: 'error',
default_selected:{title:'Count', dataindex:'count'},
options_available: [ {title:'Count', dataindex:'count'}, {title:'Rate (per sec)', dataindex:'rate_per_sec'}]
},
{
linked_entity: 'status_code',
default_selected: {title:'Count', dataindex:'count'},
options_available: [ {title:'Count', dataindex:'count'}]
},
];
interface TraceCustomVisualizationsProps {
filteredTraceMetrics: customMetricsItem[],
globalTime: GlobalTime,
getFilteredTraceMetrics: Function,
traceFilters: TraceFilters,
}
const _TraceCustomVisualizations = (props: TraceCustomVisualizationsProps) => {
const [selectedEntity, setSelectedEntity] = useState('calls');
const [selectedAggOption, setSelectedAggOption] = useState('count');
const [selectedStep, setSelectedStep] = useState('60');
// Step should be multiples of 60, 60 -> 1 min
useEffect( () => {
let request_string= 'service='+props.traceFilters.service+
'&operation='+props.traceFilters.operation+
'&maxDuration='+props.traceFilters.latency?.max+
'&minDuration='+props.traceFilters.latency?.min
if(props.traceFilters.tags)
request_string=request_string+'&tags='+encodeURIComponent(JSON.stringify(props.traceFilters.tags));
if(selectedEntity)
request_string=request_string+'&dimension='+selectedEntity.toLowerCase();
if(selectedAggOption)
request_string=request_string+'&aggregation_option='+selectedAggOption.toLowerCase();
if(selectedStep)
request_string=request_string+'&step='+selectedStep;
props.getFilteredTraceMetrics(request_string, props.globalTime)
}, [selectedEntity,selectedAggOption,props.traceFilters, props.globalTime ]);
//Custom metrics API called if time, tracefilters, selected entity or agg option changes
const [form] = Form.useForm();
function handleChange(value:string) {
// console.log(value);
}
function handleFinish(value:string) {
// console.log(value);
}
// PNOTE - Can also use 'coordinate' option in antd Select for implementing this - https://ant.design/components/select/
const handleFormValuesChange = (changedValues:any) => {
const formFieldName = Object.keys(changedValues)[0];
if (formFieldName === 'entity') {
const temp_entity = aggregation_options.filter((item) => item.linked_entity === changedValues[formFieldName])[0];
form.setFieldsValue( {
agg_options : temp_entity.default_selected.title,
// PNOTE - TO DO Check if this has the same behaviour as selecting an option?
})
let temp = form.getFieldsValue(['agg_options','entity']);
setSelectedEntity(temp.entity);
setSelectedAggOption(temp.agg_options);
//form.setFieldsValue({ agg_options: aggregation_options.filter( item => item.linked_entity === selectedEntity )[0] }); //reset product selection
// PNOTE - https://stackoverflow.com/questions/64377293/update-select-option-list-based-on-other-select-field-selection-ant-design
}
if (formFieldName === 'agg_options') {
setSelectedAggOption(changedValues[formFieldName]);
}
}
return (
<Card>
{/* <Space direction="vertical"> */}
<div>Custom Visualizations</div>
<Form
form={form}
onFinish={handleFinish}
onValuesChange={handleFormValuesChange}
initialValues={{ agg_options: 'Count', chart_style:'line', interval:'5m', group_by:'none' }}
>
<Space>
<Form.Item name="entity">
<Select defaultValue={selectedEntity} style={{ width: 120 }} allowClear>
{entity.map((item) => (
<Option key={item.key} value={item.dataindex}>
{item.title}
</Option>
)
)
}
</Select>
</Form.Item>
<Form.Item name="agg_options">
<Select style={{ width: 120 }} allowClear>
{ aggregation_options.filter((item) => item.linked_entity === selectedEntity)[0].options_available
.map((item) => (
<Option key={item.dataindex} value={item.dataindex}>
{item.title}
</Option>
))
}
</Select>
</Form.Item>
<Form.Item name="chart_style">
<Select style={{ width: 120 }} onChange={handleChange} allowClear>
<Option value="line">Line Chart</Option>
<Option value="bar">Bar Chart</Option>
<Option value="area">Area Chart</Option>
</Select>
</Form.Item>
<Form.Item name="interval">
<Select style={{ width: 120 }} onChange={handleChange} allowClear>
<Option value="1m">1 min</Option>
<Option value="5m">5 min</Option>
<Option value="30m">30 min</Option>
</Select>
</Form.Item>
{/* Need heading for each option */}
<Form.Item name="group_by">
<Select style={{ width: 120 }} onChange={handleChange} allowClear>
<Option value="none">Group By</Option>
<Option value="status">Status Code</Option>
<Option value="protocol">Protocol</Option>
</Select>
</Form.Item>
</Space>
</Form>
<GenericVisualizations chartType='line' data={props.filteredTraceMetrics}/>
{/* This component should take bar or line as an input */}
</Card>
);
}
const mapStateToProps = (state: StoreState): { filteredTraceMetrics: customMetricsItem[] , globalTime: GlobalTime, traceFilters: TraceFilters} => {
return { filteredTraceMetrics : state.filteredTraceMetrics, globalTime: state.globalTime,traceFilters:state.traceFilters };
};
export const TraceCustomVisualizations = connect(mapStateToProps, {
getFilteredTraceMetrics: getFilteredTraceMetrics,
})(_TraceCustomVisualizations);


@@ -1,22 +0,0 @@
import React from 'react';
import {TraceCustomVisualizations} from './TraceCustomVisualizations';
import { TraceFilter } from './TraceFilter';
import { TraceList } from './TraceList';
const TraceDetail = () => {
return (
<div>
<TraceFilter />
<TraceCustomVisualizations />
<TraceList />
</div>
);
}
export default TraceDetail;


@@ -1,293 +0,0 @@
import React,{useEffect, useState} from 'react';
import { Select, Button, Input, Form, AutoComplete} from 'antd';
import { connect } from 'react-redux';
import { Store } from 'antd/lib/form/interface';
import styled from 'styled-components';
import { updateTraceFilters, fetchTraces, TraceFilters, GlobalTime } from '../../actions';
import { StoreState } from '../../reducers';
import LatencyModalForm from './LatencyModalForm';
import {FilterStateDisplay} from './FilterStateDisplay';
import FormItem from 'antd/lib/form/FormItem';
import metricsAPI from '../../api/metricsAPI';
const { Option } = Select;
const InfoWrapper = styled.div`
padding-top:10px;
font-style:italic;
font-size: 12px;
`;
interface TraceFilterProps {
traceFilters: TraceFilters,
globalTime: GlobalTime,
updateTraceFilters: Function,
fetchTraces: Function,
}
interface TagKeyOptionItem {
"tagKeys": string;
"tagCount": number;
}
const _TraceFilter = (props: TraceFilterProps) => {
const [serviceList, setServiceList] = useState<string[]>([]);
const [operationList, setOperationsList] = useState<string[]>([]);
const [tagKeyOptions, setTagKeyOptions] = useState<TagKeyOptionItem[]>([]);
useEffect( () => {
metricsAPI.get<string[]>('services/list').then(response => {
setServiceList( response.data );
});
}, []);
useEffect( () => {
let request_string='service='+props.traceFilters.service+
'&operation='+props.traceFilters.operation+
'&maxDuration='+props.traceFilters.latency?.max+
'&minDuration='+props.traceFilters.latency?.min
if(props.traceFilters.tags)
request_string=request_string+'&tags='+encodeURIComponent(JSON.stringify(props.traceFilters.tags));
props.fetchTraces(props.globalTime, request_string)
}, [props.traceFilters,props.globalTime]);
useEffect ( () => {
let latencyButtonText = 'Latency';
if (props.traceFilters.latency?.min === '' && props.traceFilters.latency?.max !== '')
latencyButtonText = 'Latency<'+(parseInt(props.traceFilters.latency!.max)/1000000).toString()+'ms';
else if (props.traceFilters.latency?.min !== '' && props.traceFilters.latency?.max === '')
latencyButtonText = 'Latency>'+(parseInt(props.traceFilters.latency!.min)/1000000).toString()+'ms';
else if ( props.traceFilters.latency !== undefined && props.traceFilters.latency.min !== '' && props.traceFilters.latency.max !== '')
latencyButtonText = (parseInt(props.traceFilters.latency.min)/1000000).toString()+'ms <Latency<'+(parseInt(props.traceFilters.latency.max)/1000000).toString()+'ms';
form_basefilter.setFieldsValue({latency:latencyButtonText ,})
}, [props.traceFilters.latency])
useEffect ( () => {
form_basefilter.setFieldsValue({service: props.traceFilters.service,})
}, [props.traceFilters.service])
useEffect ( () => {
form_basefilter.setFieldsValue({operation: props.traceFilters.operation,})
}, [props.traceFilters.operation])
const [modalVisible, setModalVisible] = useState(false);
const [loading] = useState(false);
const [tagKeyValueApplied, setTagKeyValueApplied]=useState(['']);
const [latencyFilterValues, setLatencyFilterValues]=useState({min:'',max:''})
const [form] = Form.useForm();
const [form_basefilter] = Form.useForm();
function handleChange(value:string) {
console.log(value);
}
function handleChangeOperation(value:string) {
props.updateTraceFilters({...props.traceFilters,operation:value})
}
function handleChangeService(value:string) {
let service_request='/service/'+value+'/operations';
metricsAPI.get<string[]>(service_request).then(response => {
// form_basefilter.resetFields(['operation',])
setOperationsList( response.data );
});
let tagkeyoptions_request='tags?service='+value;
metricsAPI.get<TagKeyOptionItem[]>(tagkeyoptions_request).then(response => {
setTagKeyOptions( response.data );
});
props.updateTraceFilters({...props.traceFilters,service:value})
}
const onLatencyButtonClick = () => {
setModalVisible(true);
}
const onLatencyModalApply = (values: Store) => {
setModalVisible(false);
props.updateTraceFilters({...props.traceFilters,latency:{min:values.min?(parseInt(values.min)*1000000).toString():"", max:values.max?(parseInt(values.max)*1000000).toString():""}})
}
const onTagFormSubmit = (values:any) => {
let request_tags= 'service=frontend&tags='+encodeURIComponent(JSON.stringify([{"key":values.tag_key,"value":values.tag_value,"operator":values.operator}]))
if (props.traceFilters.tags){ // If there are existing tag filters present
props.updateTraceFilters(
{
service:props.traceFilters.service,
operation:props.traceFilters.operation,
latency:props.traceFilters.latency,
tags:[...props.traceFilters.tags, {'key':values.tag_key,'value':values.tag_value,'operator':values.operator}]
});
}
else
{
props.updateTraceFilters(
{
service:props.traceFilters.service,
operation:props.traceFilters.operation,
latency:props.traceFilters.latency,
tags:[ {'key':values.tag_key,'value':values.tag_value,'operator':values.operator}]
});
}
form.resetFields();
}
const onTagClose = (value:string) => {
setTagKeyValueApplied(tagKeyValueApplied.filter( e => (e !== value)));
}
// For autocomplete
//Setting value when autocomplete field is changed
const onChangeTagKey = (data: string) => {
form.setFieldsValue({ tag_key: data });
};
const dataSource = ['status:200'];
const children = [];
for (let i = 0; i < dataSource.length; i++) {
children.push(<Option value={dataSource[i]} key={dataSource[i]}>{dataSource[i]}</Option>);
}
// PNOTE - Remove any
const handleApplyFilterForm = (values:any) => {
let request_params: string ='';
if (typeof values.service !== 'undefined' && typeof values.operation !== 'undefined')
{
request_params = 'service='+values.service+'&operation='+values.operation;
}
else if (typeof values.service === 'undefined' && typeof values.operation !== 'undefined')
{
request_params = 'operation='+values.operation;
}
else if (typeof values.service !== 'undefined' && typeof values.operation === 'undefined')
{
request_params = 'service='+values.service;
}
request_params=request_params+'&minDuration='+latencyFilterValues.min+'&maxDuration='+latencyFilterValues.max;
setTagKeyValueApplied(tagKeyValueApplied => [...tagKeyValueApplied, 'service eq'+values.service, 'operation eq '+values.operation, 'maxduration eq '+ (parseInt(latencyFilterValues.max)/1000000).toString(), 'minduration eq '+(parseInt(latencyFilterValues.min)/1000000).toString()]);
props.updateTraceFilters({'service':values.service,'operation':values.operation,'latency':latencyFilterValues})
}
return (
<div>
<div>Filter Traces</div>
{/* <div>{JSON.stringify(props.traceFilters)}</div> */}
<Form form={form_basefilter} layout='inline' onFinish={handleApplyFilterForm} initialValues={{ service:'', operation:'',latency:'Latency',}} style={{marginTop: 10, marginBottom:10}}>
<FormItem rules={[{ required: true }]} name='service'>
<Select showSearch style={{ width: 180 }} onChange={handleChangeService} placeholder='Select Service' allowClear>
{serviceList.map( s => <Option value={s} key={s}>{s}</Option>)}
</Select>
</FormItem>
<FormItem name='operation'>
<Select showSearch style={{ width: 180 }} onChange={handleChangeOperation} placeholder='Select Operation' allowClear>
{operationList.map( item => <Option value={item} key={item}>{item}</Option>)}
</Select>
</FormItem>
<FormItem name='latency'>
<Input style={{ width: 200 }} type='button' onClick={onLatencyButtonClick}/>
</FormItem>
{/* <FormItem>
<Button type="primary" htmlType="submit">Apply Filters</Button>
</FormItem> */}
</Form>
<FilterStateDisplay />
{/* // What will be the empty state of card when there is no Tag , it should show something */}
<InfoWrapper>Select Service to get Tag suggestions </InfoWrapper>
<Form form={form} layout='inline' onFinish={onTagFormSubmit} initialValues={{operator:'equals'}} style={{marginTop: 10, marginBottom:10}}>
<FormItem rules={[{ required: true }]} name='tag_key'>
<AutoComplete
options={tagKeyOptions.map((s) => { return ({'value' : s.tagKeys}) })}
style={{ width: 200, textAlign: 'center' }}
// onSelect={onSelect}
// onSearch={onSearch}
onChange={onChangeTagKey}
filterOption={(inputValue, option) =>
option!.value.toUpperCase().indexOf(inputValue.toUpperCase()) !== -1
}
placeholder="Tag Key"
/>
</FormItem>
<FormItem name='operator'>
<Select style={{ width: 120, textAlign: 'center' }}>
<Option value="equals">EQUAL</Option>
<Option value="contains">CONTAINS</Option>
</Select>
</FormItem>
<FormItem rules={[{ required: true }]} name='tag_value'>
<Input style={{ width: 160, textAlign: 'center',}} placeholder="Tag Value" />
</FormItem>
<FormItem>
<Button type="primary" htmlType="submit"> Apply Tag Filter </Button>
</FormItem>
</Form>
<LatencyModalForm
visible={modalVisible}
onCreate={onLatencyModalApply}
onCancel={() => {
setModalVisible(false);
}}
/>
</div>
);
}
const mapStateToProps = (state: StoreState): { traceFilters: TraceFilters, globalTime: GlobalTime } => {
return { traceFilters: state.traceFilters, globalTime: state.globalTime };
};
export const TraceFilter = connect(mapStateToProps, {
updateTraceFilters: updateTraceFilters,
fetchTraces: fetchTraces,
})(_TraceFilter);


@@ -1,108 +0,0 @@
import React, { useEffect, useState } from 'react'
import { useParams } from "react-router-dom";
import { flamegraph } from 'd3-flame-graph'
import { connect } from 'react-redux';
import { Card, Button, Row, Col, Space } from 'antd';
import * as d3 from 'd3';
import * as d3Tip from 'd3-tip';
//import * as d3Tip from 'd3-tip';
// PNOTE - uninstall @types/d3-tip. issues with importing d3-tip https://github.com/Caged/d3-tip/issues/181
import './TraceGraph.css'
import { spanToTreeUtil } from '../../utils/spanToTree'
import { fetchTraceItem , spansWSameTraceIDResponse } from '../../actions';
import { StoreState } from '../../reducers'
import { TraceGraphColumn } from './TraceGraphColumn'
import SelectedSpanDetails from './SelectedSpanDetails'
interface TraceGraphProps {
traceItem: spansWSameTraceIDResponse ,
fetchTraceItem: Function,
}
const _TraceGraph = (props: TraceGraphProps) => {
const params = useParams<{ id?: string; }>();
const [clickedSpanTags,setClickedSpanTags]=useState([])
useEffect( () => {
props.fetchTraceItem(params.id);
}, []);
useEffect( () => {
if (props.traceItem[0].events.length>0)
{
const tree = spanToTreeUtil(props.traceItem[0].events);
d3.select("#chart").datum(tree).call(chart);
}
},[props.traceItem]);
// if this monitoring of props.traceItem.data is removed then zoom on click doesn't work
// Doesn't work if only do initial check, works if monitor an element - as it may get updated in sometime
const tip = d3Tip.default().attr('class', 'd3-tip').html(function(d:any) { return d.data.name+'<br>duration: '+d.data.value});
const onClick = (z:any) => {
setClickedSpanTags(z.data.tags);
console.log(`Clicked on ${z.data.name}, id: "${z.id}"`);
}
const chart = flamegraph()
.width(640)
.cellHeight(18)
.transitionDuration(500)
.minFrameSize(5)
.sort(true)
.inverted(true)
.tooltip(tip)
.elided(false)
.onClick(onClick)
// .title("Trace Flame graph")
.differential(false)
.selfValue(true); //sets span width based on value - which is mapped to duration
const resetZoom = () => {
chart.resetZoom();
}
return (
<Row gutter={{ xs: 8, sm: 16, md: 24, lg: 32 }}>
<Col md={8} sm={24} >
<TraceGraphColumn />
</Col>
<Col md={16} sm={24} >
{/* <Card style={{ width: 640 }}> */}
<Space direction="vertical" size='middle' >
<Card bodyStyle={{padding: 80, }} style={{ height: 320, }}>
<div>Trace Graph component ID is {params.id} </div>
<Button type="primary" onClick={resetZoom}>Reset Zoom</Button>
<div id="chart" style={{ fontSize: 12 }}></div>
</Card>
<SelectedSpanDetails clickedSpanTags={clickedSpanTags}/>
</Space>
</Col>
</Row>
);
}
const mapStateToProps = (state: StoreState): { traceItem: spansWSameTraceIDResponse } => {
return { traceItem: state.traceItem };
};
export const TraceGraph = connect(mapStateToProps, {
fetchTraceItem: fetchTraceItem,
})(_TraceGraph);


@@ -1,73 +0,0 @@
import React from 'react';
import { connect } from 'react-redux';
import { Table } from 'antd'
import { traceResponseNew, pushDStree } from '../../actions';
import { StoreState } from '../../reducers'
interface TraceGraphColumnProps {
traces: traceResponseNew,
}
interface TableDataSourceItem {
key: string;
operationName: string;
startTime: number;
duration: number;
}
const _TraceGraphColumn = (props: TraceGraphColumnProps) => {
const columns: any = [
{
title: 'Start Time (UTC Time)',
dataIndex: 'startTime',
key: 'startTime',
sorter: (a:any, b:any) => a.startTime - b.startTime,
sortDirections: ['descend', 'ascend'],
render: (value: number) => (new Date(Math.round(value/1000))).toUTCString()
},
{
title: 'Duration (in ms)',
dataIndex: 'duration',
key: 'duration',
sorter: (a:any, b:any) => a.duration - b.duration,
sortDirections: ['descend', 'ascend'],
render: (value: number) => (value/1000000).toFixed(2),
},
{
title: 'Operation',
dataIndex: 'operationName',
key: 'operationName',
},
];
let dataSource :TableDataSourceItem[] = [];
if (props.traces[0].events.length > 0) {
props.traces[0].events.map((item: (number|string|string[]|pushDStree[])[], index ) => {
if (typeof item[0] === 'number' && typeof item[4] === 'string' && typeof item[6] === 'string' && typeof item[1] === 'string' && typeof item[2] === 'string' )
dataSource.push({startTime: item[0], operationName: item[4] , duration:parseInt(item[6]), key:index.toString()});
});
}
return (
<div>
<Table dataSource={dataSource} columns={columns} size="middle"/>
</div>
);
}
const mapStateToProps = (state: StoreState): { traces: traceResponseNew } => {
return { traces : state.traces };
};
export const TraceGraphColumn = connect(mapStateToProps)(_TraceGraphColumn);


@@ -1,116 +0,0 @@
import React, { useEffect } from 'react';
import { connect } from 'react-redux';
import { NavLink } from 'react-router-dom';
import { Table } from 'antd'
import { traceResponseNew, fetchTraces, pushDStree } from '../../actions';
import { StoreState } from '../../reducers'
interface TraceListProps {
traces: traceResponseNew,
fetchTraces: Function,
}
interface TableDataSourceItem {
key: string;
spanid: string;
traceid: string;
operationName: string;
startTime: number;
duration: number;
}
const _TraceList = (props: TraceListProps) => {
// PNOTE (TO DO) - Currently this use of useEffect gives warning. May need to memoise fetchtraces - https://stackoverflow.com/questions/55840294/how-to-fix-missing-dependency-warning-when-using-useeffect-react-hook
useEffect( () => {
props.fetchTraces();
}, []);
// PNOTE - code snippet -
// renderList(): JSX.Element[] {
// return this.props.todos.map((todo: Todo) => {
// return (
// <div onClick={() => this.onTodoClick(todo.id)} key={todo.id}>
// {todo.title}
// </div>
// );
// });
// }
const columns: any = [
{
title: 'Start Time (UTC Time)',
dataIndex: 'startTime',
key: 'startTime',
sorter: (a:any, b:any) => a.startTime - b.startTime,
sortDirections: ['descend', 'ascend'],
render: (value: number) => (new Date(Math.round(value))).toUTCString()
// new Date() assumes input in milliseconds. Start Time stamp returned by druid api for span list is in ms
},
{
title: 'Duration (in ms)',
dataIndex: 'duration',
key: 'duration',
sorter: (a:any, b:any) => a.duration - b.duration,
sortDirections: ['descend', 'ascend'],
render: (value: number) => (value/1000000).toFixed(2),
},
{
title: 'Operation',
dataIndex: 'operationName',
key: 'operationName',
},
{
title: 'TraceID',
dataIndex: 'traceid',
key: 'traceid',
render: (text :string) => <NavLink to={'/traces/' + text}>{text.slice(-16)}</NavLink>,
//only last 16 chars have traceID, druid makes it 32 by adding zeros
},
];
let dataSource :TableDataSourceItem[] = [];
const renderTraces = () => {
if (typeof props.traces[0]!== 'undefined' && props.traces[0].events.length > 0) {
//PNOTE - Template literal should be wrapped in curly braces for it to be evaluated
props.traces[0].events.map((item: (number|string|string[]|pushDStree[])[], index ) => {
if (typeof item[0] === 'number' && typeof item[4] === 'string' && typeof item[6] === 'string' && typeof item[1] === 'string' && typeof item[2] === 'string' )
dataSource.push({startTime: item[0], operationName: item[4] , duration:parseInt(item[6]), spanid:item[1], traceid:item[2], key:index.toString()});
});
//antd table in typescript - https://codesandbox.io/s/react-typescript-669cv
return <Table dataSource={dataSource} columns={columns} size="middle"/>;
} else
{
return <div> No spans found for given filter!</div>
}
};// end of renderTraces
  return (
    <div>
      <div>List of traces with spanID</div>
      <div>{renderTraces()}</div>
    </div>
  );
}

const mapStateToProps = (state: StoreState): { traces: traceResponseNew } => {
  return { traces: state.traces };
};

export const TraceList = connect(mapStateToProps, {
  fetchTraces: fetchTraces,
})(_TraceList);
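The column renderers above encode two druid quirks: span durations arrive in nanoseconds while the table displays milliseconds, and trace IDs are zero-padded to 32 characters even though only the last 16 are significant. A minimal sketch of those conversions in isolation (the helper names are ours, not from the codebase):

```typescript
// Hypothetical helpers mirroring the column `render` logic above.

// druid reports span duration in nanoseconds; the table shows ms with 2 decimals
const durationToMs = (ns: number): string => (ns / 1000000).toFixed(2);

// druid zero-pads trace IDs to 32 chars; only the last 16 are the real ID
const shortTraceId = (padded: string): string => padded.slice(-16);

// start time already arrives in milliseconds, so Date() can take it directly
const startTimeUtc = (ms: number): string => new Date(Math.round(ms)).toUTCString();

durationToMs(2500000); // 2500000 ns -> "2.50" ms
shortTraceId('0000000000000000abcdef0123456789'); // "abcdef0123456789"
```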

View File

@@ -1,80 +0,0 @@
import React, {useEffect} from 'react';
import { Bar } from 'react-chartjs-2'
import { Card } from 'antd'
import { connect } from 'react-redux';
import { getUsageData, GlobalTime, usageDataItem } from '../../actions';
import { StoreState } from '../../reducers'
interface UsageExplorerProps {
  usageData: usageDataItem[];
  getUsageData: Function;
  globalTime: GlobalTime;
}
const _UsageExplorer = (props: UsageExplorerProps) => {
  useEffect(() => {
    props.getUsageData(props.globalTime);
  }, [props.globalTime]);

  const data = {
    // timestamps arrive in nanoseconds; divide by 1e6 so Date() gets ms
    labels: props.usageData.map((s) => new Date(s.timestamp / 1000000)),
    datasets: [
      {
        label: 'Span Count',
        data: props.usageData.map((s) => s.count),
        backgroundColor: 'rgba(255, 99, 132, 0.2)',
        borderColor: 'rgba(255, 99, 132, 1)',
        borderWidth: 2,
      },
    ],
  };

  const options = {
    scales: {
      yAxes: [
        {
          ticks: {
            beginAtZero: true,
            fontSize: 10,
          },
        },
      ],
      xAxes: [
        {
          type: 'time',
          // distribution: 'linear', // Bar graph doesn't take 'linear' distribution type?
          ticks: {
            beginAtZero: true,
            fontSize: 10,
          },
        },
      ],
    },
    legend: {
      display: false,
    },
  };

  return (
    <React.Fragment>
      {/* PNOTE - TODO - keep this in a responsive row/column layout */}
      <Card style={{ width: '50%', margin: 20 }} bodyStyle={{ padding: 20 }}>
        <Bar data={data} options={options} />
      </Card>
    </React.Fragment>
  );
};
const mapStateToProps = (
  state: StoreState,
): { usageData: usageDataItem[]; globalTime: GlobalTime } => {
  // note: the reducer key is spelled `usageDate` in the store
  return { usageData: state.usageDate, globalTime: state.globalTime };
};

export const UsageExplorer = connect(mapStateToProps, {
  getUsageData: getUsageData,
})(_UsageExplorer);
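The `labels` line above divides the nanosecond timestamps by 1e6 before handing them to `new Date()`, which expects milliseconds. The same conversion in isolation (the function name is ours, not from the codebase):

```typescript
// Convert a nanosecond timestamp (as returned by the usage API) to a JS Date.
// Date() interprets its numeric argument as milliseconds since the epoch.
const nsToDate = (ns: number): Date => new Date(ns / 1000000);

// 2021-05-02T00:00:00Z expressed in nanoseconds
nsToDate(1619913600000000000).toISOString(); // "2021-05-02T00:00:00.000Z"
```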

View File

@@ -1 +0,0 @@
export { UsageExplorer as default } from './UsageExplorer';

View File

@@ -0,0 +1,7 @@
import ROUTES from "./routes";
export const WITHOUT_SESSION_PATH = ["/redirect"];
export const AUTH0_REDIRECT_PATH = "/redirect";
export const DEFAULT_AUTH0_APP_REDIRECTION_PATH = ROUTES.APPLICATION;

View File

@@ -0,0 +1 @@
export const IS_LOGGED_IN = "isLoggedIn";

View File

@@ -0,0 +1,3 @@
export const ENVIRONMENT = {
baseURL: "",
};

View File

@@ -0,0 +1,3 @@
export enum LOCAL_STORAGE {
METRICS_TIME_IN_DURATION = "metricsTimeDuration",
}

View File

@@ -0,0 +1 @@
export const SKIP_ONBOARDING = "skip_onboarding";

View File

@@ -0,0 +1,8 @@
export enum METRICS_PAGE_QUERY_PARAM {
interval = "interval",
startTime = "startTime",
endTime = "endTime",
service = "service",
error = "error",
operation = "operation",
}
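These enum values are the query-string keys the metrics pages read and write. A hedged sketch of how such keys can be serialized with the standard `URLSearchParams` API (the enum is inlined and the concrete values are made up for illustration):

```typescript
// Assumed subset of the enum above, inlined so the snippet is self-contained.
enum METRICS_PAGE_QUERY_PARAM {
  service = 'service',
  interval = 'interval',
  error = 'error',
}

// Build a query string using the enum members as keys.
const query = new URLSearchParams({
  [METRICS_PAGE_QUERY_PARAM.service]: 'frontend',
  [METRICS_PAGE_QUERY_PARAM.interval]: '5m',
  [METRICS_PAGE_QUERY_PARAM.error]: 'true',
});

query.toString(); // "service=frontend&interval=5m&error=true"
```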

View File

@@ -0,0 +1,13 @@
const ROUTES = {
SIGN_UP: "/signup",
SERVICE_METRICS: "/application/:servicename",
SERVICE_MAP: "/service-map",
TRACES: "/traces",
TRACE_GRAPH: "/traces/:id",
SETTINGS: "/settings",
INSTRUMENTATION: "/add-instrumentation",
USAGE_EXPLORER: "/usage-explorer",
APPLICATION: "/application",
};
export default ROUTES;
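Routes like `SERVICE_METRICS` and `TRACE_GRAPH` use react-router `:param` placeholders. A minimal, hypothetical helper (not part of the codebase) showing how a concrete link can be built from such a template:

```typescript
// Subset of the ROUTES map above, inlined so the snippet is self-contained.
const ROUTES = {
  SERVICE_METRICS: '/application/:servicename',
  TRACE_GRAPH: '/traces/:id',
};

// Replace each `:name` placeholder with its URL-encoded value.
const buildPath = (template: string, params: Record<string, string>): string =>
  Object.entries(params).reduce(
    (path, [key, value]) => path.replace(`:${key}`, encodeURIComponent(value)),
    template,
  );

buildPath(ROUTES.SERVICE_METRICS, { servicename: 'frontend' }); // "/application/frontend"
buildPath(ROUTES.TRACE_GRAPH, { id: 'abcdef0123456789' }); // "/traces/abcdef0123456789"
```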

View File

@@ -0,0 +1,39 @@
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="utf-8" />
<title data-react-helmet="true">Open source Observability platform | SigNoz</title>
<meta data-react-helmet="true" property="og:title" content="Open source Observability platform | SigNoz">
<meta data-react-helmet="true" name="description" content="SigNoz is an opensource observability platform to help you find issues in your deployed applications &amp; solve them quickly. It provides an integrated UI for metrics and traces with deep filtering and aggregation to pin down specific issues very quickly. Built on Kafka and Druid, it is designed to handle enterprise scale.">
<meta data-react-helmet="true" property="og:description" content="SigNoz is an opensource observability platform to help you find issues in your deployed applications &amp; solve them quickly. It provides an integrated UI for metrics and traces with deep filtering and aggregation to pin down specific issues very quickly. Built on Kafka and Druid, it is designed to handle enterprise scale.">
<meta data-react-helmet="true" property="og:image" content="https://signoz.io/img/HeroShot-3.jpg">
<meta data-react-helmet="true" name="twitter:image" content="https://signoz.io/img/HeroShot-3.jpg">
<meta data-react-helmet="true" name="twitter:image:alt" content="Image for Open source Observability platform | SigNoz">
<meta data-react-helmet="true" name="twitter:card" content="summary_large_image">
<meta data-react-helmet="true" name="docusaurus_locale" content="en">
<meta data-react-helmet="true" name="docusaurus_tag" content="default">
<link data-react-helmet="true" rel="shortcut icon" href="/img/favicon.ico">
<link
rel="stylesheet"
href="https://stackpath.bootstrapcdn.com/bootstrap/4.3.1/css/bootstrap.min.css"
integrity="sha384-ggOyR0iXCbMQv3Xipma34MD+dH/1fQ784/j6cY/iJTQUOhcWr7x9JvoRxT2MZw1T"
crossorigin="anonymous"
/>
<!--
manifest.json provides metadata used when your web app is installed on a
user's mobile device or desktop. See https://developers.google.com/web/fundamentals/web-app-manifest/
-->
<!--
  It will be replaced with the URL of the `public` folder during the build.
  Only files inside the `public` folder can be referenced from the HTML, and
  such references work correctly both with client-side routing and a non-root
  public URL. Learn how to configure a non-root public URL by running
  `npm run build`.
-->
</head>
<body>
<noscript>You need to enable JavaScript to run this app.</noscript>
<div id="root"></div>
<!--
This HTML file is a template.
If you open it directly in the browser, you will see an empty page.
You can add webfonts, meta tags, or analytics to this file.
The build step will place the bundled scripts into the <body> tag.
To begin the development, run `npm start` or `yarn start`.
To create a production bundle, use `npm run build` or `yarn build`.
-->
</body>
</html>

View File

@@ -1,34 +1,22 @@
import React from 'react';
import ReactDOM from 'react-dom';
import { Provider } from 'react-redux';
import { createStore, applyMiddleware } from 'redux';
import React from "react";
import ReactDOM from "react-dom";
import { Provider } from "react-redux";
import { ThemeSwitcherProvider } from "react-css-theme-switcher";
import thunk from 'redux-thunk';
// import { NavLink, BrowserRouter as Router, Route, Switch } from 'react-router-dom';
import AppWrapper from './components/AppWrapper';
import './assets/index.css';
import { reducers } from './reducers';
// import Signup from './components/Signup';
const store = createStore(reducers, applyMiddleware(thunk))
const themes = {
dark: `${process.env.PUBLIC_URL}/dark-theme.css`,
light: `${process.env.PUBLIC_URL}/light-theme.css`,
};
import store from "Src/store";
import AppWrapper from "Src/modules/AppWrapper";
import "Src/assets/index.css";
import { BrowserRouter as Router } from "react-router-dom";
import themes from "Src/themes";
ReactDOM.render(
<Provider store={store}>
<React.StrictMode>
<ThemeSwitcherProvider themeMap={themes} defaultTheme="dark">
<AppWrapper />
{/* <App /> */}
</ThemeSwitcherProvider>
</React.StrictMode>
</Provider>,
document.querySelector('#root')
);
<Provider store={store}>
<React.StrictMode>
<ThemeSwitcherProvider themeMap={themes} defaultTheme="dark">
<Router basename="/">
<AppWrapper />
</Router>
</ThemeSwitcherProvider>
</React.StrictMode>
</Provider>,
document.querySelector("#root"),
);

Some files were not shown because too many files have changed in this diff.