Labs & Sandbox

Hands-on coding environments with GPU access, 500+ datasets, and weekly challenges.

Guided Notebooks

Step-by-step interactive notebooks with instructions and checkpoints.

120 available

GPU

Free Sandbox

Open coding environment with GPU access. Build whatever you want.

GPU

Challenge Labs

Timed problem-solving challenges. Test your skills under pressure.

85 available

GPU

Pair Coding

Real-time collaboration with a peer. Learn together, build together.

API Playground

Compare Claude, GPT, Llama, and more side-by-side. Test prompts across models.

GPU
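The side-by-side idea boils down to fanning one prompt out to several model backends and collecting the replies by name. The sketch below uses stand-in echo callables as the backends; the model names and stub responses are placeholders, not the playground's real API.

```python
# Sketch of side-by-side prompt comparison: run one prompt through
# several model backends and key the results by model name.
# The backends here are stand-in callables; in practice each would
# wrap a real API client.
from typing import Callable, Dict

def compare_models(prompt: str, models: Dict[str, Callable[[str], str]]) -> Dict[str, str]:
    """Send the same prompt to every model and collect replies by name."""
    return {name: fn(prompt) for name, fn in models.items()}

# Placeholder backends that just echo the prompt back.
backends = {
    "claude": lambda p: f"[claude] {p}",
    "gpt": lambda p: f"[gpt] {p}",
    "llama": lambda p: f"[llama] {p}",
}

results = compare_models("Explain attention in one line.", backends)
for name, reply in results.items():
    print(f"{name}: {reply}")
```

Keeping the backends behind a uniform `str -> str` interface is what makes the comparison trivial to extend: adding a model is one more entry in the dict.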

Competition Arena

Kaggle-style ML competitions with leaderboards and prizes.

12 available

Browser-Based IDE

transformer_from_scratch.ipynb
GPU: A10G · Python 3.11
# Cell [1] - Build Multi-Head Attention
import torch
import torch.nn as nn
 
class MultiHeadAttention(nn.Module):
    def __init__(self, d_model, n_heads):
        super().__init__()
        self.n_heads = n_heads
        self.d_k = d_model // n_heads  # per-head dimension
        self.W_q = nn.Linear(d_model, d_model)
        self.W_k = nn.Linear(d_model, d_model)
        self.W_v = nn.Linear(d_model, d_model)
Output
MultiHeadAttention(
  (W_q): Linear(in_features=512, out_features=512, bias=True)
  (W_k): Linear(in_features=512, out_features=512, bias=True)
  (W_v): Linear(in_features=512, out_features=512, bias=True)
)
Parameters: 787,968
GPU Memory: 3.0 MB
AI Tutor

Great start! Your attention dimensions look correct. Next, implement the forward() method using scaled dot-product attention.
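One way to follow the tutor's suggestion is sketched below: a forward() method for the MultiHeadAttention class from the cell above, using scaled dot-product attention. This is one possible solution, not the course's reference implementation; it omits masking, dropout, and an output projection for brevity.

```python
# Sketch: complete MultiHeadAttention with scaled dot-product attention.
# Same __init__ as the notebook cell, plus a possible forward() method.
import math
import torch
import torch.nn as nn

class MultiHeadAttention(nn.Module):
    def __init__(self, d_model, n_heads):
        super().__init__()
        self.n_heads = n_heads
        self.d_k = d_model // n_heads  # per-head dimension
        self.W_q = nn.Linear(d_model, d_model)
        self.W_k = nn.Linear(d_model, d_model)
        self.W_v = nn.Linear(d_model, d_model)

    def forward(self, x):
        B, T, _ = x.shape
        # Project, then split into (B, n_heads, T, d_k)
        q = self.W_q(x).view(B, T, self.n_heads, self.d_k).transpose(1, 2)
        k = self.W_k(x).view(B, T, self.n_heads, self.d_k).transpose(1, 2)
        v = self.W_v(x).view(B, T, self.n_heads, self.d_k).transpose(1, 2)
        # Scaled dot-product attention: softmax(QK^T / sqrt(d_k)) V
        scores = q @ k.transpose(-2, -1) / math.sqrt(self.d_k)
        weights = torch.softmax(scores, dim=-1)
        out = weights @ v
        # Merge heads back to (B, T, d_model)
        return out.transpose(1, 2).contiguous().view(B, T, -1)
```

The division by sqrt(d_k) keeps the dot-product magnitudes from growing with head dimension, which would otherwise push the softmax into regions with vanishing gradients.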

Active Challenges

Updated weekly
Build a Better Chatbot — 48 hours left

RAG · LLMs · NLP
847 joined · Prize: 3 months Pro

Image Classification Sprint — 5 days left

CNN · Vision · PyTorch
1,203 joined · Prize: GPU credits

Optimize the Transformer — 3 days left

Optimization · Transformers · CUDA
342 joined · Prize: Mentor session

Dataset Library

500+ datasets

ImageNet Subset

Vision · 2.1 GB · 100K images

Common Crawl NLP

NLP · 850 MB · 1.2M documents

Financial Fraud

Tabular · 420 MB · 6.3M transactions

Medical Imaging

Vision · 3.4 GB · 50K scans

Customer Reviews

NLP · 180 MB · 500K reviews

IoT Sensor Data

Time Series · 1.1 GB · 10M readings

Speech Commands

Audio · 2.3 GB · 105K clips

E-Commerce Events

Tabular · 670 MB · 4.5M events