Communication requirements
CAST AI Deployments and DaemonSets
For the CAST AI components to communicate with the CAST AI SaaS, allow outbound access to the following endpoint:
api.cast.ai:443
Use CAST AI components behind a proxy
If your company uses a proxy to access resources on the public network (such as api.cast.ai:443), you need to configure the proxy variables for the CAST AI components.
Example: castai-agent deployment on a GKE cluster:
```yaml
...
containers:
  - env:
      - name: API_URL
        value: api.cast.ai
      - name: PROVIDER
        value: gke
      - name: MONITOR_METADATA
        value: /agent-metadata/metadata
      - name: PPROF_PORT
        value: "6060"
      - name: HTTP_PROXY
        value: "http://<proxyaddress>:<port-if-needed>"
      - name: HTTPS_PROXY
        value: "https://<proxyaddress>:<port-if-needed>"
      - name: NO_PROXY
        value: "localhost,<pod cidr>,<svc cidr>,*.cluster.local,googleapis.com,metadata.google.internal"
    envFrom:
      - secretRef:
          name: castai-agent
    image: us-docker.pkg.dev/castai-hub/library/agent:v0.48.1
...
```
Add the environment variables HTTP_PROXY, HTTPS_PROXY, and NO_PROXY to the castai-agent deployment.
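If you prefer to patch the running deployment rather than edit the manifest, you can set the same variables with kubectl. This is a sketch only: the namespace, proxy address, and CIDR ranges below are placeholders; substitute the values for your environment.

```shell
# Sketch: add the proxy variables to the running castai-agent deployment.
# Namespace, proxy address, and CIDRs are placeholders.
kubectl set env deployment/castai-agent -n castai-agent \
  HTTP_PROXY="http://proxy.example.com:8080" \
  HTTPS_PROXY="http://proxy.example.com:8080" \
  NO_PROXY="localhost,10.0.0.0/8,10.96.0.0/12,*.cluster.local,googleapis.com,metadata.google.internal"
```

kubectl set env triggers a rolling restart of the deployment, so the agent picks up the new variables immediately.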
Make sure that NO_PROXY is set correctly for your environment, so that Kubernetes-internal traffic is not sent to the external proxy!
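Also note that NO_PROXY matching rules differ between languages and tools (for example, CIDR and wildcard entries are not understood by every client), so it is worth verifying the value with the client you actually use. A minimal sketch using Python's standard library, with placeholder proxy and CIDR values:

```python
# Sketch: check how Python's urllib interprets a NO_PROXY value.
# The proxy address and CIDR below are placeholders.
import os
import urllib.request

os.environ["HTTPS_PROXY"] = "http://proxy.example.com:8080"
# Note: urllib does plain suffix/exact matching, so the CIDR entry
# "10.0.0.0/8" is NOT expanded; some other clients do understand CIDRs.
os.environ["NO_PROXY"] = "localhost,10.0.0.0/8,.cluster.local"

proxies = urllib.request.getproxies_environment()
print(proxies["https"])  # http://proxy.example.com:8080

# localhost is listed in NO_PROXY, so it bypasses the proxy;
# api.cast.ai is not listed, so it goes through the proxy.
print(bool(urllib.request.proxy_bypass_environment("localhost", proxies)))    # True
print(bool(urllib.request.proxy_bypass_environment("api.cast.ai", proxies)))  # False
```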
Container Registry to pull images
us-docker.pkg.dev/castai-hub:443
Helm charts
castai.github.io:443
objects.githubusercontent.com:443
Node binaries
https://storage.googleapis.com/castai-node-components/
Node startup logs upload
This includes kubelet logs, driver errors, and similar node startup diagnostics.
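To confirm that all of the endpoints above are reachable through your proxy, you can run a quick connectivity check from a node or a pod in the cluster. A sketch, assuming curl is available; the proxy address is a placeholder:

```shell
# Sketch: probe each required endpoint through the proxy. Any HTTP status
# code (even 4xx) proves the connection through the proxy succeeded.
export HTTPS_PROXY="http://proxy.example.com:8080"  # placeholder
for host in api.cast.ai us-docker.pkg.dev castai.github.io \
            objects.githubusercontent.com storage.googleapis.com; do
  curl -sS -o /dev/null --max-time 10 -w "%{http_code}  ${host}\n" "https://${host}/" \
    || echo "FAILED  ${host}"
done
```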