How to share Jupyter/Colab ML projects that actually run

ML projects · Sharing · Approx. 7 min read

You finally get your notebook working. You send it to a friend, guide, or recruiter… and they reply that it doesn’t run.

Sharing ML work as “here’s my notebook” almost never works cleanly. The environment, data paths, and run instructions are all locked in your head.

In this article, we’ll walk through what breaks when you share notebooks, how to improve things with a few simple practices, and how NoteCapsule can help you ship ML projects that people can actually run.

Why your notebook breaks on other machines

Most broken shares come down to four issues:

  1. Hardcoded, machine-specific file paths.
  2. Missing or unpinned dependencies.
  3. Cells that only work when run in a particular, non-top-to-bottom order.
  4. No instructions for where the data lives or how to run the project.

A good test: if you restart the kernel and click “Run all” on your own machine, does it succeed from top to bottom? If not, it probably won’t work for anyone else either.

Baseline: make your notebook more shareable

Before bringing in any tools, a few simple habits will dramatically improve shareability:

1. Use relative paths inside the project

Instead of:

data = pd.read_csv("C:/Users/me/Desktop/dataset/train.csv")

Prefer:

data = pd.read_csv("./data/train.csv")

And keep a project structure like:

project/
  notebook.ipynb
  data/
    train.csv
    test.csv
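One way to make the relative-path habit more robust is to build every path from a single constant with pathlib and fail early with a clear message. This is a minimal sketch; the `DATA_DIR` constant and `data_path` helper are illustrative names, not part of any tool:

```python
from pathlib import Path

# Single place to update if the project layout ever changes.
DATA_DIR = Path("data")

def data_path(filename):
    """Build a path under ./data/ and fail early with a clear message."""
    path = DATA_DIR / filename
    if not path.exists():
        raise FileNotFoundError(
            f"Expected {path} - did you put the dataset in ./data/?"
        )
    return path

# Usage (assuming pandas is imported elsewhere as pd):
# train = pd.read_csv(data_path("train.csv"))
```

A missing file now produces one readable error instead of a pandas traceback deep in the notebook.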

2. Capture dependencies in a file

At minimum, run:

pip freeze > requirements.txt

Or for Colab, list what you explicitly installed:

!pip install pandas scikit-learn matplotlib
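Note that pip freeze captures every package in the environment, including dozens of transitive dependencies. A leaner alternative is to pin only the libraries the notebook actually imports. A small helper like this can generate those lines (the function name and the package list are illustrative):

```python
from importlib.metadata import PackageNotFoundError, version

# Top-level libraries this notebook actually imports (example list).
TOP_LEVEL = ["pandas", "scikit-learn", "matplotlib"]

def pinned_requirements(packages):
    """Return requirements.txt-style lines for the given packages."""
    lines = []
    for pkg in packages:
        try:
            lines.append(f"{pkg}=={version(pkg)}")
        except PackageNotFoundError:
            lines.append(f"# {pkg}: not installed in this environment")
    return lines

print("\n".join(pinned_requirements(TOP_LEVEL)))
```

Paste the output into requirements.txt; the recipient then installs with pip install -r requirements.txt.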

3. Add a tiny “README” section at the top

Even inside the notebook, start with a markdown cell that answers:

  1. What does this project do?
  2. Where does the data come from, and where should it live?
  3. Which libraries are needed, and how do you install them?
  4. In what order should the cells be run?
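For example, the first cell of a Titanic-baseline notebook might be a markdown cell along these lines (the contents are illustrative):

```markdown
# Titanic baseline

- **What it does:** trains a simple baseline classifier on the Titanic dataset.
- **Data:** place `train.csv` and `test.csv` in `./data/`.
- **Setup:** `pip install -r requirements.txt`
- **Run:** restart the kernel and run all cells top to bottom.
```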

Going further with NoteCapsule Capsules

NoteCapsule sits on top of that baseline and helps you package everything into a reusable project snapshot – a Capsule.

Each Capsule includes:

  1. The notebook itself (notebook.ipynb).
  2. A suggested dependencies file (requirements_suggested.txt).
  3. A manifest of your data files (data_manifest.json).
  4. A README template and Capsule metadata.

To create one, install the package and call create_capsule:

!pip install notebookcapsule -q

from notebookcapsule import create_capsule

create_capsule(
    name="kaggle_titanic_baseline",
    data_dirs=["./data"]
)

This creates a self-contained folder under ./capsules/ that you can zip and send:

capsules/
  2025-11-24_kaggle_titanic_baseline/
    notebook.ipynb
    requirements_suggested.txt
    data_manifest.json
    README_template.md
    capsule_meta.json

What it’s like for the person receiving your Capsule

Imagine you’re sending this to your guide or a recruiter. They would:

  1. Unzip the Capsule into a folder on their machine.
  2. Create a virtual environment (optional but good practice).
  3. Install dependencies from requirements_suggested.txt (plus any manual tweaks).
  4. Make sure the data files listed in data_manifest.json exist in the expected ./data/ folders.
  5. Open notebook.ipynb and run it from top to bottom.

They’re not guessing where the data lives or which libraries to install – it’s all in the Capsule.
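Step 4 can even be checked automatically. Assuming data_manifest.json is a simple JSON list of relative file paths (an assumption for illustration; the real Capsule format may differ), a quick verification script might look like:

```python
import json
from pathlib import Path

def check_manifest(manifest_path="data_manifest.json"):
    """Return the manifest entries whose files are missing on disk.

    Assumes the manifest is a JSON list of relative paths,
    e.g. ["data/train.csv", "data/test.csv"] - the real
    Capsule format may differ.
    """
    listed = json.loads(Path(manifest_path).read_text())
    return [p for p in listed if not Path(p).exists()]

# missing = check_manifest()
# if missing:
#     print("Missing data files:", missing)
```

An empty result means every listed data file is in place and the notebook is ready to run.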

Using Capsules with Google Colab

If you’re primarily on Colab, a practical flow is:

  1. Mount Drive in your Colab notebook.
  2. Keep your project under a single Drive folder (not scattered paths).
  3. Use create_capsule(...) with data_dirs pointing to your project’s data folders.
  4. Export the Capsule with export_capsule("name") if you want a zip file.

from notebookcapsule import export_capsule

export_capsule("kaggle_titanic_baseline")
# → capsule_kaggle_titanic_baseline_2025-11-24.zip

You can then share that zip via Drive or email. The recipient can work locally or in their own Colab environment.

Summary: from “here’s my notebook” to “here’s my project”

With relative paths, a dependencies file, and a short README cell, your notebook already travels much better; a Capsule packages all of that into a single folder you can hand over. Want your next ML project to run cleanly for others?

NoteCapsule is built for notebook-heavy students and researchers who want to ship projects that guides, reviewers, and recruiters can run without fighting errors.

Get early access

Join the early access list on the homepage and we’ll send you setup instructions and a sample Capsule to try.