TensorFlow 2.16.1 + CUDA 12.3

Generate a production-ready Dockerfile with verified compatibility

Recommended Driver: >=545.23.08

Configuration Summary

Framework: TensorFlow 2.16.1
CUDA Version: 12.3
Python Support: 3.9, 3.10, 3.11
Min Driver: >=545.23.08

Note: CUDA installs via pip (tensorflow[and-cuda]); no separate system CUDA toolkit is required.

Install Command
pip install tensorflow[and-cuda]==2.16.1
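
After installation, a quick sanity check confirms the version and GPU visibility. This is a minimal sketch; run it in the environment where the package was installed:

import tensorflow as tf

print(tf.__version__)                          # expect 2.16.1
print(tf.config.list_physical_devices("GPU"))  # one entry per visible GPU; empty list on CPU-only hosts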

What's in TensorFlow 2.16.1

  • Security patches and bug fixes
  • Stable Keras 3 transition
  • Improved tf.data pipeline performance (see the example pipeline below)
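
To illustrate the last point, here is a minimal tf.data input pipeline sketch; the in-memory tensors and batch size are placeholders for your own data source:

import tensorflow as tf

AUTOTUNE = tf.data.AUTOTUNE

# Placeholder data; swap in your real dataset source (files, TFRecords, etc.)
features = tf.random.uniform((1024, 32))
labels = tf.random.uniform((1024,), maxval=10, dtype=tf.int32)

dataset = (
    tf.data.Dataset.from_tensor_slices((features, labels))
    .shuffle(buffer_size=1024)
    .batch(64)
    .prefetch(AUTOTUNE)  # overlaps input preparation with training steps
)

for batch_features, batch_labels in dataset.take(1):
    print(batch_features.shape, batch_labels.shape)  # (64, 32) (64,)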

Best For

Use Cases

  • Enterprise deployments requiring stability
  • Data-intensive training pipelines

CUDA 12.3 Advantages

  • Modern data center GPUs (A100, A10G)
  • Good balance of features and stability
  • Cloud platform compatibility

Generate Dockerfile

Configuration

Local GPU or CPU environment

CUDA installs via pip (tensorflow[and-cuda])

Requires NVIDIA Driver >=545.23.08
Dockerfile
# syntax=docker/dockerfile:1
# ^ Required for BuildKit cache mounts and advanced features

# Generated by DockerFit (https://tools.eastondev.com/docker)
# TensorFlow 2.16.1 + CUDA 12.3 | Python 3.10
# Multi-stage build for optimized image size

# ==============================================================================
# Stage 1: Builder - Install dependencies and compile
# ==============================================================================
FROM python:3.10-slim-bookworm AS builder

# Build arguments
ARG DEBIAN_FRONTEND=noninteractive

# Environment variables
ENV PYTHONUNBUFFERED=1
ENV PYTHONDONTWRITEBYTECODE=1

# Create virtual environment
ENV VIRTUAL_ENV=/opt/venv
RUN python -m venv $VIRTUAL_ENV
ENV PATH="$VIRTUAL_ENV/bin:$PATH"

# Upgrade pip
RUN pip install --no-cache-dir --upgrade pip setuptools wheel

# Install TensorFlow with pip CUDA packages (no system CUDA needed)
# This installs CUDA/cuDNN via pip, avoiding a duplicate system CUDA dependency
RUN --mount=type=cache,target=/root/.cache/pip \
    pip install tensorflow[and-cuda]==2.16.1

# Install project dependencies
COPY requirements.txt .
RUN --mount=type=cache,target=/root/.cache/pip \
    pip install -r requirements.txt

# ==============================================================================
# Stage 2: Runtime - Minimal production image
# ==============================================================================
FROM python:3.10-slim-bookworm AS runtime

# Labels
LABEL maintainer="Generated by DockerFit"
LABEL version="2.16.1"
LABEL description="TensorFlow 2.16.1 + CUDA 12.3"

# Environment variables
ENV PYTHONUNBUFFERED=1
ENV PYTHONDONTWRITEBYTECODE=1
ENV NVIDIA_VISIBLE_DEVICES=all
ENV NVIDIA_DRIVER_CAPABILITIES=compute,utility

# Create non-root user for security
ARG USERNAME=appuser
ARG USER_UID=1000
ARG USER_GID=$USER_UID
RUN groupadd --gid $USER_GID $USERNAME \
    && useradd --uid $USER_UID --gid $USER_GID -m $USERNAME

# Copy virtual environment from builder
COPY --from=builder --chown=$USERNAME:$USERNAME /opt/venv /opt/venv
ENV VIRTUAL_ENV=/opt/venv
ENV PATH="$VIRTUAL_ENV/bin:$PATH"

# Set working directory
WORKDIR /app

# Copy application code
COPY --chown=$USERNAME:$USERNAME . .

# Switch to non-root user
USER $USERNAME

# Expose port
EXPOSE 8000

# Default command
CMD ["python", "main.py"]
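
The final CMD assumes a main.py in the build context. A hypothetical placeholder entrypoint (replace it with your application code) that reports GPU visibility on startup could look like this:

# main.py - hypothetical placeholder entrypoint; replace with your application code.
import tensorflow as tf


def main() -> None:
    gpus = tf.config.list_physical_devices("GPU")
    print(f"TensorFlow {tf.__version__} started; visible GPUs: {gpus}")
    if not gpus:
        print("No GPU detected - check the NVIDIA driver and the --gpus flag on docker run.")


if __name__ == "__main__":
    main()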

Frequently Asked Questions

What NVIDIA driver version do I need?

For TensorFlow 2.16.1 with CUDA 12.3, you need NVIDIA driver version 545.23.08 or higher.

Run nvidia-smi to check your current driver version.

How do I install TensorFlow with CUDA support?

TensorFlow 2.16.1 uses the following installation command:

pip install tensorflow[and-cuda]==2.16.1

Since TensorFlow 2.15, the CUDA and cuDNN libraries are installed as pip dependencies through the [and-cuda] extra, so no separate system CUDA toolkit is required.

How do I verify GPU access in the container?

After building your image, run:

docker run --gpus all your-image python -c "import tensorflow as tf; print(tf.config.list_physical_devices('GPU'))"

This should show available GPU devices.
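
To also confirm which CUDA and cuDNN versions the pip wheels ship, you can run a short check inside the container. This is a sketch; the keys returned by tf.sysconfig.get_build_info() can vary between builds:

import tensorflow as tf

build = tf.sysconfig.get_build_info()
# On the CUDA wheels of 2.16.1 this reports the bundled CUDA/cuDNN versions.
print("CUDA: ", build.get("cuda_version"))
print("cuDNN:", build.get("cudnn_version"))
print("GPUs: ", tf.config.list_physical_devices("GPU"))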