Describe the bug
The current Helm configuration for the CPU limit of robusta-forwarder allowed the pod to consume nearly all available CPU on the node.
To Reproduce
N/A
Expected behavior
The robusta-forwarder pod should stay within its specified CPU limits.
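One way to enforce this is to set explicit CPU requests and limits for the forwarder in the Helm values. The sketch below is an assumption, not the chart's confirmed schema: robusta-forwarder is the kubewatch-based deployment, so its resources are assumed to live under a `kubewatch.resources` block — verify the key names against the chart's own values.yaml before applying.

```yaml
# values.yaml override for the Robusta chart.
# Key names are assumptions; check the chart's values.yaml.
kubewatch:
  resources:
    requests:
      cpu: 50m        # healthy forwarder usage observed below is ~29m
      memory: 512Mi
    limits:
      cpu: 500m       # hard cap so a runaway forwarder cannot starve the node
      memory: 1Gi
```

Applied with something like `helm upgrade robusta robusta/robusta -f values.yaml -n robusta`, the kubelet would then throttle the container at the CPU limit instead of letting it saturate the node.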
Screenshots
Terminal output: the robusta-forwarder pod was consuming 53400m CPU with the node at 98% CPU; after deleting the pod, the replacement settled at 29m and the node dropped to 2%.
```console
$ oc adm top nodes
NAME                               CPU(cores)   CPU%   MEMORY(bytes)   MEMORY%
worker7.copper.cp.xxxxx.xxxx.com   42648m       98%    34248Mi         14%

$ oc adm top po -n robusta
NAME                                 CPU(cores)   MEMORY(bytes)
robusta-forwarder-6ddb7758f7-xm42p   53400m       502Mi
robusta-runner-6cb648c696-44sqp      9m           838Mi

$ oc delete -n robusta po robusta-forwarder-6ddb7758f7-xm42p
pod "robusta-forwarder-6ddb7758f7-xm42p" deleted

$ oc get po -n robusta
NAME                                 READY   STATUS    RESTARTS      AGE
robusta-forwarder-6ddb7758f7-b2l69   1/1     Running   0             36s
robusta-runner-6cb648c696-44sqp      2/2     Running   2 (31h ago)   3d22h

$ oc version
Client Version: 4.15.15
Kustomize Version: v5.0.4-0.20230601165947-6ce0bf390ce3
Server Version: 4.14.36
Kubernetes Version: v1.27.16+03a907c

$ oc get po -n robusta
NAME                                 READY   STATUS    RESTARTS       AGE
robusta-forwarder-6ddb7758f7-b2l69   1/1     Running   0              17h
robusta-runner-6cb648c696-44sqp      2/2     Running   2 (2d1h ago)   4d16h

$ oc adm top po -n robusta
NAME                                 CPU(cores)   MEMORY(bytes)
robusta-forwarder-6ddb7758f7-b2l69   29m          286Mi
robusta-runner-6cb648c696-44sqp      880m         991Mi

$ oc adm top nodes
NAME                               CPU(cores)   CPU%   MEMORY(bytes)   MEMORY%
worker7.copper.cp.xxxxx.xxxx.com   1284m        2%     34086Mi         14%
```
Environment Info (please complete the following information):
```
Client Version: 4.15.15
Kustomize Version: v5.0.4-0.20230601165947-6ce0bf390ce3
Server Version: 4.14.36
Kubernetes Version: v1.27.16+03a907c
```