Welcome to distil labs

Distil labs provides a platform for training task-specific small language models (SLMs) with just a prompt and a few dozen examples. The platform handles the complex machine learning behind the scenes, so you can focus on your use case instead of managing large datasets and infrastructure.
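
To make "a few dozen examples" concrete, here is a hypothetical sketch of a training-data folder for a ticket-classification task. The folder name, file name, and field names below are illustrative assumptions, not a required format; see Data preparation for what the platform actually expects.

# Hypothetical data layout (illustrative only; see Data preparation for the real format)
mkdir -p my-data-folder
cat > my-data-folder/examples.jsonl <<'EOF'
{"input": "Refund request for order #1234", "output": "billing"}
{"input": "App crashes when I open settings", "output": "bug"}
{"input": "How do I reset my password?", "output": "account"}
EOF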

Getting started

Install the Distil CLI:

curl -fsSL https://cli-assets.distillabs.ai/install.sh | sh

Minimal example

# Log in (if you don't have an account, use `distil register`)
distil login

# Create a model for your specific task
distil model create my-first-model
# Output: Model created with ID: <model-id>

# Upload your data (see Data preparation for details)
distil model upload-data <model-id> --data ./my-data-folder

# Train a small model to solve your task as well as an LLM can
distil model run-training <model-id>

# Download the trained model
distil model download <model-id>

That’s it! Your trained model is ready for local deployment. You can also use our Claude Skill to train models directly from Claude Code or Claude.ai.
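
As a minimal sketch of local deployment, assuming the download yields a Hugging Face-compatible model directory (check the platform docs for the actual output format), you could serve the model with an off-the-shelf runtime such as vLLM and query it over its OpenAI-compatible API:

# Assumption: the downloaded model lives in ./my-first-model as a Hugging Face-compatible directory
pip install vllm
vllm serve ./my-first-model --port 8000

# In another terminal, send a test request
curl http://localhost:8000/v1/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "./my-first-model", "prompt": "Refund request for order #1234", "max_tokens": 32}'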

Next steps

Ready to build your own specialized models? Continue to our How to train your SLM guide or explore detailed tutorials.