
    How to Install Kimi K2 on Paperspace Gradient (DigitalOcean) – Full Setup Guide


    Why Use Paperspace Gradient?

    Paperspace Gradient is a powerful platform by DigitalOcean that allows you to:

    • Access high-end GPUs (A100, RTX 4090, etc.)

    • Launch notebooks or containers with minimal setup

    • Easily install and run open-source LLMs like Kimi K2

    • Benefit from JupyterLab, volume persistence and SSH access

    Prerequisites

    • Paperspace account: https://www.paperspace.com 

    • Hugging Face account (accept the Kimi K2 model license; a login sketch follows this list)

    • Basic Python familiarity (Jupyter or CLI)
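
    Once huggingface_hub is installed (Step 2 below), you can authenticate from Python so any gated downloads work. This is a minimal sketch and assumes you have already created an access token in your Hugging Face account settings:

    from huggingface_hub import login

    # Prompts for a Hugging Face access token; only needed if the
    # Kimi-K2-Instruct repository is gated behind the accepted license.
    login()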

    Step 1: Log In and Create a Notebook

    1. Go to https://www.paperspace.com/gradient 

    2. Click on “Create Notebook”

    3. Select Base Container:

    • Recommended: PyTorch + CUDA 12 or Transformers + GPU

    4. Choose machine type:

    • Use an A100, RTX 3090, or RTX 5000

    5. Name your notebook (e.g., Kimi-K2-LLM) and click “Create Notebook”

    Step 2: Update and Install Dependencies

    Open the Jupyter notebook terminal and run:

    sudo apt update && sudo apt install -y git-lfs
    pip install torch torchvision transformers accelerate huggingface_hub
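
    Before downloading a large model, it is worth confirming that PyTorch can see the GPU. A quick check like the following can help (a minimal sketch; exact versions depend on the container image you picked):

    import torch
    import transformers

    print("PyTorch:", torch.__version__)
    print("Transformers:", transformers.__version__)
    print("CUDA available:", torch.cuda.is_available())
    if torch.cuda.is_available():
        print("GPU:", torch.cuda.get_device_name(0))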

    Step 3: Clone Kimi K2 Model

    git lfs install
    git clone https://huggingface.co/moonshotai/Kimi-K2-Instruct
    cd Kimi-K2-Instruct

    Make sure to accept the model license at: https://huggingface.co/moonshotai/Kimi-K2-Instruct 
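
    If you prefer to stay in Python instead of using git, the huggingface_hub package installed in Step 2 can download the same repository. This is a sketch: the local_dir value is just an example path, and it assumes you are logged in as described in the prerequisites.

    from huggingface_hub import snapshot_download

    # Downloads the full Kimi-K2-Instruct repository; assumes the model
    # license has been accepted on Hugging Face.
    local_path = snapshot_download(
        repo_id="moonshotai/Kimi-K2-Instruct",
        local_dir="Kimi-K2-Instruct",  # example target directory
    )
    print("Model files downloaded to:", local_path)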

    Step 4: Load Kimi K2 Model (Notebook Cell)

    from transformers import AutoModelForCausalLM, AutoTokenizer
    import torch

    model_id = "moonshotai/Kimi-K2-Instruct"

    tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        trust_remote_code=True,
        torch_dtype=torch.float16
    ).cuda()

    prompt = "Explain generative AI to a 10-year-old."
    inputs = tokenizer(prompt, return_tensors="pt").to("cuda")
    outputs = model.generate(**inputs, max_new_tokens=200)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))
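
    Kimi K2 is a very large model, so forcing every weight onto a single GPU with .cuda() may run out of memory. A common alternative, shown here as a sketch rather than the official loading recipe, is to let accelerate place the weights with device_map="auto":

    from transformers import AutoModelForCausalLM, AutoTokenizer
    import torch

    model_id = "moonshotai/Kimi-K2-Instruct"
    tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)

    # device_map="auto" (via the accelerate integration) spreads layers across
    # the available GPU(s) and CPU memory instead of a single device.
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        trust_remote_code=True,
        torch_dtype=torch.float16,
        device_map="auto",
    )

    prompt = "Explain generative AI to a 10-year-old."
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=200)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))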

    Step 5 (Optional): Use CLI Instead of Notebook

    1. Switch to terminal

    2. Create and run a Python script (e.g., app.py; a minimal example is sketched below)

    3. Launch inference from the command line
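
    A minimal app.py along these lines might look like the following sketch, which reuses the same loading code; the command-line prompt handling is just an example:

    # app.py - minimal command-line inference script (example sketch)
    import sys
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "moonshotai/Kimi-K2-Instruct"

    tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        trust_remote_code=True,
        torch_dtype=torch.float16,
        device_map="auto",
    )

    # Take the prompt from the command line, falling back to a default.
    prompt = sys.argv[1] if len(sys.argv) > 1 else "Explain generative AI to a 10-year-old."
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=200)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))

    Run it from the terminal with: python app.py "Your prompt here"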

    Bonus: Save Your Workspace

    Enable Persistent Storage to retain the following (a small copy example is sketched after this list):

    • Model weights

    • Logs

    • Generated outputs

    • Python scripts
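
    For example, copying important artifacts into the persistent volume keeps them across notebook restarts. The sketch below assumes the volume is mounted at /storage, which is common on Gradient notebooks; adjust the path if your workspace mounts it elsewhere:

    import shutil
    from pathlib import Path

    # Assumes /storage is the persistent volume in this Gradient notebook.
    persistent_dir = Path("/storage/kimi-k2")
    persistent_dir.mkdir(parents=True, exist_ok=True)

    # Example: keep the inference script and a generated output across restarts.
    shutil.copy("app.py", persistent_dir / "app.py")
    (persistent_dir / "last_output.txt").write_text("model output goes here")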

    Final Thoughts

    Paperspace Gradient is one of the easiest ways to run Kimi K2 in a notebook interface with GPU support. Whether you’re building AI applications, fine-tuning models, or testing prompts, this setup is ideal for developers, researchers, and solo builders.

    Need enterprise-grade deployment or DevOps help? Contact our expert AI Squad at OneClick IT Consultancy and let’s build something powerful together.
