Artificial Intelligence and Machine Learning With Docker
AI/ML accelerated

Faster and more secure AI/ML development
Faster time to code
For more than a decade, developers have relied on Docker to accelerate the setup and deployment of their development environments. Modern AI/ML applications are complex; Docker streamlines environment setup so developers can spend their time on innovation instead of configuration.
Hundreds of AI/ML models & images
Hundreds of AI/ML images are available on Docker Hub. Verified images from industry-leading AI/ML tools, such as PyTorch, TensorFlow, and Jupyter, provide trusted and tested content that gives AI/ML practitioners a strong starting point.
Reproducibility
AI/ML models require consistent environments to produce reproducible results. Docker lets teams ensure that their models and environments are identical across every deployment.
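As a minimal sketch of this idea, a team might pin its training image to an immutable digest so every build uses byte-for-byte the same environment (the base image tag and file names here are illustrative, and the digest is a placeholder to be copied from Docker Hub or `docker images --digests`):

```dockerfile
# Pin the base image by digest, not just by tag, so rebuilds stay
# reproducible even if the tag is later repointed to a new image.
# (<digest> is a placeholder; substitute the real sha256 value.)
FROM pytorch/pytorch:2.3.0-cuda12.1-cudnn8-runtime@sha256:<digest>

WORKDIR /app

# Freeze Python dependencies from a lock file for the same reason.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY train.py .
CMD ["python", "train.py"]
```

Pinning by digest rather than tag is what makes the environment identical for each deployment: a tag like `latest` can move, but a digest always resolves to the same image.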
Secure by default
Trusted content, enhanced isolation, registry access management, and Docker Scout all work to deliver a secure environment to developer teams.
Featured AI/ML repositories on Docker Hub
AI/ML on Docker Hub
Docker Hub is home to thousands of images covering the most popular AI/ML models and frameworks. With Docker Hub, developers can find and rapidly deploy environments in a consistent and secure fashion.
Docker Hub is a collaboration tool as well as a marketplace for community developers, open source contributors, and independent software vendors (ISVs) to distribute their code publicly.
Visit Hub
Hugging Face
With Docker and Hugging Face, developers can launch and deploy complex ML apps in minutes. With the support for Docker on Hugging Face Spaces, you can create custom apps by simply writing a Dockerfile.
Spaces also come with pre-defined templates of popular open source projects for members who want to get their end-to-end project into production in just a few clicks.
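As a rough sketch (the file names and app framework are illustrative, not prescribed by Spaces), a Docker-based Space needs little more than a Dockerfile that starts a server on the port Spaces expects, 7860 by default:

```dockerfile
FROM python:3.11-slim

WORKDIR /app

COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY app.py .

# Hugging Face Spaces routes traffic to port 7860 by default;
# the app inside must listen on this port.
EXPOSE 7860
CMD ["python", "app.py"]
```

Pushing this Dockerfile alongside the app code to a Space repository is all it takes for Spaces to build and serve the container.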
Hugging Face on Hub
DataStax
Docker and Kaskada give ML practitioners a declarative language designed specifically for feature engineering over event-based data.
Docker provides a reproducible development environment and an ecosystem of tools. Kaskada enables sharing of machine learning ‘features as code’ throughout the ML lifecycle — from training models locally to maintaining real-time features in production.
DataStax on Hub
Why Docker Chose OCI Artifacts for AI Model Packaging
Explore why Docker chose OCI artifacts to package and share AI models. Standardized, flexible, and made for developers.
Read now
Behind the scenes: How we designed Docker Model Runner and what’s next
Get an inside look at Docker’s first major step into the AI development space and a preview of what’s ahead for Model Runner.
Read now
How to Build, Run, and Package AI Models Locally with Docker Model Runner
As a Senior DevOps Engineer and Docker Captain, I’ve helped build AI systems for everything from retail personalization to medical imaging. One truth stands out: AI capabilities are core to modern infrastructure. This guide will show you how to run and package local AI models with Docker Model Runner — a lightweight, developer-friendly tool…
Read now