Nebius AI Cloud integrates with several third-party tools for running and orchestrating AI workloads on Nebius AI Cloud infrastructure. These integrations let you configure the infrastructure flexibly to meet your key objectives: for example, you can train and deploy AI models, run workloads on virtual machines, or launch parallel jobs.

Software compatibility and runtimes

Learn what you can run in Nebius AI Cloud and how

Anyscale

Scale AI workloads with Anyscale deployed on a Managed Service for Kubernetes® cluster

dstack

Install dstack and orchestrate AI workloads

Run:ai

Optimize your GPU resources for ML/AI workloads by using Run:ai and Managed Kubernetes

SkyPilot

Run, manage, and scale AI workloads on Nebius AI Cloud by using SkyPilot

mpirun

Configure a Compute GPU cluster and run NCCL tests with mpirun
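As a rough illustration of the mpirun workflow above, here is a minimal sketch of launching the NCCL all_reduce benchmark across two GPU nodes. The hostfile location, nccl-tests build path, and node/GPU counts are all assumptions for illustration, not Nebius defaults:

```shell
# Sketch: run the NCCL all_reduce benchmark on 2 nodes x 8 GPUs (16 ranks).
# Assumes Open MPI, a hostfile listing both nodes, and nccl-tests built
# at ~/nccl-tests -- adjust all paths and counts for your cluster.
mpirun -np 16 \
  -hostfile ~/hostfile \
  --map-by ppr:8:node \
  ~/nccl-tests/build/all_reduce_perf \
  -b 512M -e 8G -f 2 -g 1
```

Here `-b` and `-e` set the start and end message sizes for the sweep, `-f` is the size multiplication factor between steps, and `-g` is the number of GPUs driven per rank.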