Docker Compose Adds LLM ‘models’ Support and Cloud Offload for Agentic Apps
LLM Models Configuration in Compose
Docker Compose 2.38.0 adds a new top-level `models` element, enabling developers to define large language models directly within the Compose file. Services can now specify model settings, such as model name, version, and resource requirements, alongside containers, simplifying the local development and deployment of agentic applications without separate ML orchestration tooling (docs.docker.com).
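For illustration, a minimal Compose file using the new element might look like the sketch below. This follows the documented `models` syntax; the service name, image, and the `ai/smollm2` model reference are placeholders, not values prescribed by the announcement.

```yaml
services:
  chat-app:
    image: my-chat-app:latest   # hypothetical application image
    models:
      - llm                     # short syntax: wires the model defined below into this service

models:
  llm:
    model: ai/smollm2           # OCI model artifact, pulled and served by Docker Model Runner
    context_size: 4096          # optional tuning attribute documented for the models element
```

With the short syntax, Compose injects the model's endpoint and name into the service environment, so application code needs no hard-coded inference URL.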
Unified AI Agent Workflows
Developers can declare agents and Model Context Protocol (MCP)–compatible tools in `docker-compose.yml`, then launch entire agentic stacks with a single `docker compose up`. Compose integrates out of the box with popular AI frameworks (LangGraph, Embabel, Vercel AI SDK, Spring AI, CrewAI, Google's ADK, and Agno), making it straightforward to prototype and iterate on intelligent workflows using familiar YAML definitions (sdtimes.com).
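As a sketch of what such a stack can look like, the file below pairs an agent service with Docker's MCP Gateway image; the gateway flags, port, endpoint path, and the `ai/qwen3` model reference are illustrative assumptions rather than details from the article.

```yaml
services:
  agent:
    build: .                              # agent code built on LangGraph, CrewAI, etc.
    environment:
      MCP_GATEWAY_URL: http://mcp-gateway:8811/sse  # assumed SSE endpoint of the gateway
    depends_on:
      - mcp-gateway
    models:
      - llm                               # bind the model declared below into the agent

  mcp-gateway:
    image: docker/mcp-gateway:latest      # Docker's MCP Gateway
    command:
      - --transport=sse                   # illustrative flags
      - --servers=duckduckgo              # expose an MCP tool server to the agent

models:
  llm:
    model: ai/qwen3                       # placeholder model artifact
```

A single `docker compose up` then starts the model, the tool gateway, and the agent together.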
Docker Offload for Cloud-Scale Execution
Alongside local support, Docker Offload lets teams shift compute-intensive AI workloads to cloud environments (e.g., Google Cloud Run, Azure Container Apps) using the same Compose CLI. This hybrid model enables seamless scaling: developers build and test locally, then leverage managed cloud resources for production inference, all without changing Compose definitions (sdtimes.com).
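In practice the switch is a CLI-level toggle rather than a file change. A hedged sketch of the workflow, assuming Docker Desktop's `docker offload` commands:

```bash
docker offload start   # route subsequent builds and runs to cloud capacity
docker compose up      # the identical Compose file, now executing remotely
docker offload stop    # return to local execution
```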
Source: Docker Compose gets new features for building and running agents - SD Times