Google Colab Training
Runs in Google Colab. Use the generated notebook with your saved BBoxML settings.

Train your BBoxML dataset with Google Colab

Prepare a dataset version, download the generated notebook, run the training cells, and export the finished model artifact.

Use this after you have labeled enough images to start training.

Workflow at a glance

1. Prepare version
2. Download notebook
3. Run training
4. Export model

What BBoxML configures for you

The notebook already includes your chosen dataset source, training model, output destination, and export artifact.

Step-by-step guide

Follow these steps in order the first time. After that, the same loop becomes much faster.

Step 1: Prepare a clean export version

Start in the Export workspace. The notebook uses the exact version you save.

What to do

  1. Open Export and choose a version.
  2. Check format, size, and split settings.
  3. Save before downloading the notebook.
Step 2: Configure the notebook before download

Choose how the notebook should behave before you download it.

What to do

  1. Open the Colab Notebook popup.
  2. Pick a training model or custom checkpoint.
  3. Choose dataset source and output destination.
  4. Choose the final artifact to export.
Step 3: Pick the right source and destination

Choose the simplest storage path for your workflow. Google Drive is safer if you want to avoid losing the final export.

What to do

  1. Upload ZIP is best for quick, small runs.
  2. Google Drive works better for larger or repeat runs.
  3. Drive output keeps results in one predictable place.
  4. Warning: if you choose browser download only and the browser blocks the download while you are away, you can lose the exported result from that run. Google Drive output is safer.
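To see why Drive output is safer: a mounted Drive folder persists after the Colab runtime disconnects, while a browser download that never starts is simply gone. The generated notebook's Drive option handles this for you, but if you ever need to rescue a result manually, a small copy step is all it takes. A minimal sketch (the function name and both paths are placeholders, not BBoxML's actual code):

```python
import shutil
from pathlib import Path

def save_to_drive(artifact: str, drive_dir: str) -> Path:
    """Copy a finished artifact into a Drive-backed folder so it
    survives a closed tab or a disconnected runtime."""
    dest = Path(drive_dir)
    dest.mkdir(parents=True, exist_ok=True)
    target = dest / Path(artifact).name
    shutil.copy2(artifact, target)  # preserves file metadata too
    return target

# In Colab, a mounted Drive appears under /content/drive/MyDrive/...
# Example call (placeholder paths for your own run):
# save_to_drive("runs/train/weights/best.pt",
#               "/content/drive/MyDrive/bboxml-exports")
```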
Step 4: Run the setup cells in order

Use the first notebook cells to confirm the saved settings.

What to do

  1. Open the notebook in Google Colab.
  2. Run the config and summary cells first.
  3. Mount Drive if the notebook asks for it.
  4. Then run setup and dataset prep.
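The notebook's prep cells handle the dataset layout automatically, but if you want to confirm the prep finished cleanly, you can paste a quick sanity check into a scratch cell. This sketch assumes the standard YOLO layout of sibling `images/` and `labels/` folders per split (a convention of YOLO datasets, not anything BBoxML-specific):

```python
from pathlib import Path

def check_yolo_split(split_dir: str) -> tuple[int, int]:
    """Return (image count, images with a matching label file)
    for one split directory in the standard YOLO layout."""
    images = sorted((Path(split_dir) / "images").glob("*"))
    labels_dir = Path(split_dir) / "labels"
    matched = sum(
        1 for img in images if (labels_dir / f"{img.stem}.txt").exists()
    )
    return len(images), matched

# Example (placeholder path): check_yolo_split("datasets/my-export/train")
# A healthy split returns equal counts, e.g. (120, 120).
```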
Step 5: Understand what the notebook is doing

The notebook always trains with a YOLO checkpoint.

What to do

  1. YOLO exports train directly.
  2. COCO and Pascal VOC are converted first.
  3. The training step stays consistent after conversion.
  4. Smaller checkpoints train faster than larger ones.
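The COCO-to-YOLO conversion mentioned above comes down to a simple coordinate change: COCO stores boxes as `[x_min, y_min, width, height]` in pixels, while YOLO labels use `[x_center, y_center, width, height]` normalized to the image size. A rough sketch of the math (illustrative only, not BBoxML's actual converter):

```python
def coco_to_yolo_bbox(bbox, img_w, img_h):
    """Convert one COCO box [x_min, y_min, w, h] (pixels)
    to YOLO format [x_center, y_center, w, h] (normalized 0-1)."""
    x, y, w, h = bbox
    return [
        (x + w / 2) / img_w,   # x center, normalized
        (y + h / 2) / img_h,   # y center, normalized
        w / img_w,             # width, normalized
        h / img_h,             # height, normalized
    ]

# A 100x50 box at (50, 100) in a 640x640 image:
# coco_to_yolo_bbox([50, 100, 100, 50], 640, 640)
# → [0.15625, 0.1953125, 0.15625, 0.078125]
```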
Step 6: Export the final trained model

The last cell exports the artifact you chose in BBoxML.

What to do

  1. Use best.pt for the simplest result.
  2. Use a weights zip for best.pt and last.pt together.
  3. Use a full run zip for logs and predictions.
  4. Use TFLite for lighter deployment after training.
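To make the artifact options concrete, a weights zip is essentially just `best.pt` and `last.pt` bundled into one archive. A minimal sketch of that bundling (the function name and paths are illustrative, not the notebook's actual export cell):

```python
import zipfile
from pathlib import Path

def zip_weights(weights_dir: str, out_zip: str) -> list[str]:
    """Bundle best.pt and last.pt (when present) into one zip,
    mirroring the 'weights zip' style of export."""
    added = []
    with zipfile.ZipFile(out_zip, "w", zipfile.ZIP_DEFLATED) as zf:
        for name in ("best.pt", "last.pt"):
            path = Path(weights_dir) / name
            if path.exists():
                zf.write(path, arcname=name)  # store flat, no parent dirs
                added.append(name)
    return added

# Example call (placeholder paths):
# zip_weights("runs/train/weights", "weights.zip")
```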

Tips to keep it easy

These are the easiest ways to avoid friction on early training runs.

Start with the smallest successful loop

For the first run, use Upload ZIP, a smaller checkpoint, and `best.pt`.

Use Drive for repeat workflows

For repeat runs, switch both source and output to Google Drive.

Expect some startup time in Colab

Some startup delay is normal while Colab spins up the runtime.

Common questions

Do COCO Detection or Pascal VOC train directly as their own model families?

No. They are dataset formats. The notebook converts them to YOLO training layout first.

Should I use Upload ZIP or Google Drive ZIP?

Use Upload ZIP for quick starts. Use Google Drive ZIP for larger or repeat runs.

Is TensorFlow Lite the model being trained?

No. The notebook trains from a `.pt` checkpoint first, then exports the result to `.tflite`.

Ready to try it?

Prepare a dataset version in BBoxML, then use the Colab Notebook button to generate a notebook that already matches your workflow.