CLI Reference

The Distil CLI is a command-line tool for fine-tuning compact language models using the distil labs platform. It enables you to train specialized models up to 70x smaller than teacher models while maintaining comparable accuracy, without requiring ML expertise.

Installation

Install the Distil CLI with a single command:

```shell
curl -fsSL https://cli-assets.distillabs.ai/install.sh | sh
```

Supported Platforms

The Distil CLI supports the following operating systems:

| Platform | Supported |
| --- | --- |
| Linux (x86_64) | Yes |
| macOS (Intel) | Yes |
| macOS (Apple Silicon) | Yes |
| Windows | No |

Windows users can run the Distil CLI through WSL (Windows Subsystem for Linux) or access the platform via our REST API.

Claude Skill

Use our Claude Skill to train models directly from Claude Code or Claude.ai. The skill teaches Claude how to guide you through the entire training workflow.

Installation

Claude Code:

```
/plugin marketplace add https://github.com/distil-labs/distil-cli-skill
/plugin install distil-cli@distil-cli-skill
```

Claude.ai / Claude Desktop:

  1. Download the skill as a ZIP archive from its GitHub page.
  2. Go to claude.ai → Settings → Capabilities → Skills.
  3. Click “Upload skill” and select the ZIP file.
  4. Toggle the skill ON.

Capabilities

| Environment | What Claude Can Do |
| --- | --- |
| Claude Code | Full end-to-end workflow: task selection, data preparation, running CLI commands, training, and deployment |
| Claude.ai | Task selection and data preparation: helps you choose the right task type and create data files. You run CLI commands yourself. |

Usage

Once installed, ask Claude to help you train a model:

“Help me train a classification model for customer support intent detection”

Claude will guide you through creating a model, preparing data files, uploading data, running teacher evaluation, training, and deployment.
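Using the commands documented later in this reference, that end-to-end workflow looks roughly like the following sketch (`<model-id>` is a placeholder for the ID returned by `create`):

```
# 1. Create a model and note the returned ID
distil model create support-intent-classifier

# 2. Upload your prepared data files
distil model upload-data <model-id> --data ./data

# 3. Validate that the teacher model can solve the task
distil model run-teacher-evaluation <model-id>
distil model teacher-evaluation <model-id>   # check status and results

# 4. Distill knowledge into a compact model
distil model run-training <model-id>
distil model training <model-id>             # check status and results

# 5. Download the trained model as an Ollama-ready package
distil model download <model-id>
```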

Model Identifiers

When working with the Distil platform, you’ll encounter several types of identifiers. Understanding these is key to navigating the CLI effectively.

Model Name

The model name is a human-readable identifier you choose when creating a model. It helps you organize and recognize your models.

```shell
distil model create my-qa-model
```

Model names should be descriptive of the task or project (e.g., customer-support-classifier, product-qa-bot).

Model ID

The model ID is a unique identifier automatically assigned when you create a model. This is the primary identifier used in most CLI commands.

```shell
# Create a model and receive its ID
distil model create my-model-name
# Output: Model created with ID: d64ee301-76d2-4f06-8e7d-398e40c0d7de

# Use the model ID in subsequent commands
distil model show d64ee301-76d2-4f06-8e7d-398e40c0d7de
```
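If you are scripting model creation, the ID can be captured from the output line shown above. A minimal sketch, using the example output string rather than a live CLI call:

```shell
# Example output line from `distil model create` (format as shown above)
output="Model created with ID: d64ee301-76d2-4f06-8e7d-398e40c0d7de"

# Strip everything up to the last ": " to isolate the ID
model_id="${output##*: }"
echo "$model_id"
```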

You can find your model IDs by listing all models:

```shell
distil model show
```

Component IDs

Each model tracks multiple components through the training workflow. These component IDs are automatically managed but useful for debugging and API integration:

| Component | Description |
| --- | --- |
| Upload ID | Identifies a specific data upload. Created when you run `upload-data`. A model can have multiple uploads if you iterate on your data. |
| Teacher Evaluation ID | Identifies a teacher evaluation run. Created when you run `run-teacher-evaluation`. Shows how well the teacher model performs on your task. |
| Training ID | Identifies a training job. Created when you run `run-training`. Tracks the distillation process that creates your small model. |

View all component IDs for a model with:

```shell
distil model show <model-id>
```

CLI Commands

Authentication

| Command | Description |
| --- | --- |
| `distil login` | Authenticate with the distil labs platform. Opens a browser for login. |
| `distil register` | Create a new distil labs account. |
| `distil whoami` | Display the currently authenticated user. |
| `distil logout` | Log out from the platform and clear credentials. |

Model Management

| Command | Description |
| --- | --- |
| `distil model create <name>` | Create a new model with the specified name. Returns the model ID. |
| `distil model show` | List all your models with their IDs, names, and status. |
| `distil model show <model-id>` | Show detailed information about a specific model, including all component IDs. |

Data Upload

| Command | Description |
| --- | --- |
| `distil model upload-data <model-id> --data <directory>` | Upload all data files from a directory. Expects standard filenames (`job_description.json`, `train.csv`, `test.csv`, `config.yaml`). |
| `distil model upload-data <model-id> --job-description <file> --train <file> --test <file> [--config <file>] [--unstructured <file>]` | Upload data files individually with explicit paths. |
| `distil model download-data <model-id>` | Download the uploaded data files for a model. |
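The two upload styles above, sketched with a placeholder model ID:

```
# Upload everything from a directory using the standard filenames
distil model upload-data <model-id> --data ./my-data

# Or pass each file explicitly
distil model upload-data <model-id> \
  --job-description job_description.json \
  --train train.csv \
  --test test.csv \
  --config config.yaml
```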

Teacher Evaluation

| Command | Description |
| --- | --- |
| `distil model run-teacher-evaluation <model-id>` | Start a teacher evaluation to validate that a large model can solve your task. |
| `distil model teacher-evaluation <model-id>` | Check the status and results of the teacher evaluation. |

Training

| Command | Description |
| --- | --- |
| `distil model run-training <model-id>` | Start training to distill knowledge into a compact model. |
| `distil model training <model-id>` | Check the status and results of the training job. |

Model Download

| Command | Description |
| --- | --- |
| `distil model download <model-id>` | Download your trained model as an Ollama-ready package. |

Global Options

| Option | Description |
| --- | --- |
| `--output json` | Output results in JSON format for scripting and automation. |
| `--help` | Display help information for any command. |
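A sketch of consuming `--output json` from a script. The payload below is illustrative only: the field names (`id`, `name`, `status`) are assumptions, not the documented schema, and the command output is stubbed into a variable rather than captured from a live CLI call:

```shell
# Stand-in for `distil model show --output json`; fields are assumed, not documented
payload='[{"id": "d64ee301-76d2-4f06-8e7d-398e40c0d7de", "name": "my-qa-model", "status": "trained"}]'

# Extract the IDs of trained models with python3 (no extra tools required)
echo "$payload" | python3 -c 'import json, sys
models = json.load(sys.stdin)
for m in models:
    if m["status"] == "trained":
        print(m["id"])'
```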

Next Steps