
Installing Odigos on Kubernetes

Last updated November 1, 2023

Introduction:

Kubernetes has become the de facto standard for container orchestration, powering modern cloud-native applications. To ensure these applications are observable and performant, integrating tools like Odigos is essential. This article provides a step-by-step guide on installing Odigos on a Kubernetes cluster, ensuring seamless observability for your containerized applications.

Installation Steps:

  1. Prerequisites:
  • Ensure you have kubectl installed and configured to communicate with your Kubernetes cluster.
  • Have the Odigos CLI tool installed on your local machine. (A quick check for both is sketched after this list.)
  2. Download the Odigos Kubernetes Manifest:
  • Fetch the latest Odigos manifest for Kubernetes: wget https://odigos.io/kubernetes-manifest.yaml
  3. Configure Odigos Settings:
  • Edit the kubernetes-manifest.yaml file to specify your Odigos API key and any other configuration settings specific to your environment.
  4. Deploy Odigos to Kubernetes:
  • Apply the manifest to your Kubernetes cluster: kubectl apply -f kubernetes-manifest.yaml
  5. Verify the Installation:
  • Check the status of the Odigos pods to ensure they are running: kubectl get pods -n odigos-namespace
  • You should see the Odigos pods in a Running state. (A combined deploy-and-verify sketch follows this list.)
  6. Integrate with Your Applications:
  • Label the pods or deployments you wish to monitor with Odigos: kubectl label pods <POD_NAME> odigos-instrumentation=true (see the labeling note after this list)
  7. Access the Odigos Dashboard:
  • Navigate to the Odigos web interface to view the telemetry data from your Kubernetes applications. You'll see metrics, traces, and logs from the labeled pods. (A port-forward sketch for reaching the UI locally follows this list.)
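A quick way to confirm the prerequisites in step 1 before going further. The kubectl commands below are standard; the odigos version subcommand is assumed to be available once the Odigos CLI is installed, so adjust if your CLI build differs:

  # confirm kubectl is installed and can reach your cluster
  kubectl version --client
  kubectl cluster-info
  # confirm the Odigos CLI is on your PATH (assumed subcommand)
  odigos version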
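Steps 4 and 5 can be run back to back as a single check. The namespace name odigos-namespace is taken from this article; if your manifest creates a different namespace, substitute it here:

  # deploy the edited manifest
  kubectl apply -f kubernetes-manifest.yaml
  # block until every Odigos pod reports Ready, or give up after two minutes
  kubectl wait --for=condition=Ready pods --all -n odigos-namespace --timeout=120s
  # list the pods; each should show STATUS Running
  kubectl get pods -n odigos-namespace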
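A note on step 6: labeling a pod directly works, but pods managed by a Deployment are replaced on restart and the new pods will not carry the label. One way to make the label persist, sketched below with a placeholder Deployment name, is to patch the Deployment's pod template so every replica inherits it:

  # label a single, existing pod (replace <POD_NAME> with a real pod name)
  kubectl label pods <POD_NAME> odigos-instrumentation=true
  # or add the label to a Deployment's pod template so new pods inherit it
  kubectl patch deployment <DEPLOYMENT_NAME> \
    -p '{"spec":{"template":{"metadata":{"labels":{"odigos-instrumentation":"true"}}}}}'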
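Step 7 assumes you can reach the Odigos web interface. If it is not exposed outside the cluster, a port-forward is one way to open it locally. The service name odigos-ui and port 3000 below are assumptions — list the services in the Odigos namespace first and use whatever name and port you actually find:

  # find the UI service and the port it listens on
  kubectl get svc -n odigos-namespace
  # forward the (assumed) UI service to localhost:3000 and open it in a browser
  kubectl port-forward svc/odigos-ui -n odigos-namespace 3000:3000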

Conclusion:

Installing Odigos on Kubernetes is a straightforward process that brings the power of comprehensive observability to your containerized applications. With Odigos monitoring your Kubernetes workloads, you can gain deep insights into application performance, troubleshoot issues faster, and ensure optimal user experiences.
