Django · React · Vite · Docker · REST API

AIForge

An open platform for sharing, downloading, and running AI agents and machine learning models locally — a developer-first model marketplace.

Problem

The ML community lacks a unified, open platform where developers can publish models and users can discover and run them locally without vendor lock-in. Existing solutions either require cloud execution or lack a cohesive developer experience for model sharing and local deployment.

Solution

AIForge is designed as an open marketplace with a developer-first philosophy. Users can publish models with metadata, download them via CLI or SDK, and execute them locally inside isolated Docker containers:

  • Django REST backend with JWT authentication and model registry
  • React/Vite frontend for browsing, searching, and publishing models
  • Containerized model execution for isolation and reproducibility
  • Python SDK for programmatic model discovery and local execution
  • S3-compatible storage for model artifacts and versioning
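As a sketch of the publishing side, a registry entry for a model might carry metadata like this (the field names and `aiforge/` image-tag convention are illustrative assumptions, not the actual AIForge schema):

```python
from dataclasses import dataclass, field

@dataclass
class ModelEntry:
    """Illustrative registry record for a published model (hypothetical schema)."""
    name: str
    version: str
    format: str              # e.g. "onnx", "pt", "pkl"
    artifact_key: str        # S3 object key for the model weights
    requirements: list[str] = field(default_factory=list)  # per-model Python deps

    @property
    def image_tag(self) -> str:
        # One container image per model version, for isolation and reproducibility
        return f"aiforge/{self.name}:{self.version}"

entry = ModelEntry(
    name="sentiment-classifier",
    version="1.2.0",
    format="onnx",
    artifact_key="models/sentiment-classifier/1.2.0/model.onnx",
    requirements=["onnxruntime>=1.17"],
)
print(entry.image_tag)  # aiforge/sentiment-classifier:1.2.0
```

Keeping the artifact key and dependency list on the registry record is what lets the CLI and SDK resolve a download and build an execution image from metadata alone.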

Architecture


# AIForge Platform Architecture

┌──────────────────────────────────────────────────┐
│                  Frontend (React/Vite)           │
│                                                  │
│   Browse Models ──► Download ──► Run Locally     │
│   Publish Model ──► Upload  ──► Set Metadata     │
└────────────────────────┬─────────────────────────┘
                         │ REST API
                         ▼
┌──────────────────────────────────────────────────┐
│              Backend API (Django)                │
│                                                  │
│  Auth ──► Model Registry ──► Storage ──► Jobs    │
└───────────┬──────────────────────┬───────────────┘
            │                      │
            ▼                      ▼
┌───────────────────┐   ┌──────────────────────┐
│  Model Hosting    │   │  File Storage        │
│  Server           │   │  (S3 / local)        │
│                   │   │                      │
│  Container each   │   │  .pkl, .pt, .onnx    │
│  model version    │   │  model weights       │
└───────────────────┘   └──────────────────────┘
            │
            ▼
┌───────────────────┐
│  Local Execution  │
│  Environment      │
│                   │
│  CLI / Python SDK │
│  Docker runner    │
└───────────────────┘
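The Browse ──► Download ──► Run Locally path in the diagram can be sketched as a registry lookup followed by a local cache resolution (the registry shape and `~/.aiforge/cache` layout are illustrative assumptions):

```python
from pathlib import Path

# Hypothetical sketch of the "Browse -> Download -> Run Locally" path above;
# the registry payload and cache layout are illustrative, not the real code.

REGISTRY = {  # what the Model Registry might return over the REST API
    ("sentiment-classifier", "1.2.0"): {
        "artifact_key": "models/sentiment-classifier/1.2.0/model.onnx",
        "image": "aiforge/sentiment-classifier:1.2.0",
    },
}

def local_cache_path(artifact_key: str, cache_root: str = "~/.aiforge/cache") -> Path:
    """Where the CLI/SDK would place a downloaded artifact before running it."""
    return Path(cache_root).expanduser() / artifact_key

meta = REGISTRY[("sentiment-classifier", "1.2.0")]
print(local_cache_path(meta["artifact_key"]).name)  # model.onnx
```

Caching artifacts under a deterministic path keyed by name and version means repeat runs skip the download entirely.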

Tech Stack

  • Django: Backend API & ORM
  • React + Vite: Frontend marketplace
  • Docker: Model execution isolation
  • PostgreSQL: Registry & user data
  • Django REST: API framework
  • S3: Model artifact storage

Challenges

Local Execution Isolation

Running arbitrary model code safely required containerization with resource limits and network isolation per execution.
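One way to express that sandbox is a `docker run` invocation with hard caps and no network; the flags below are standard Docker options, but the wrapper itself is an illustrative sketch rather than AIForge's actual runner:

```python
# Build the argument list for a sandboxed per-execution container.
# All flags are standard `docker run` options; the wrapper is illustrative.

def sandboxed_run_cmd(image: str, cpus: float = 1.0, memory: str = "2g") -> list[str]:
    return [
        "docker", "run", "--rm",
        "--network", "none",     # no network access from model code
        "--cpus", str(cpus),     # CPU quota per execution
        "--memory", memory,      # hard memory cap
        "--pids-limit", "256",   # bound process count (fork-bomb guard)
        "--read-only",           # immutable root filesystem
        image,
    ]

print(" ".join(sandboxed_run_cmd("aiforge/sentiment-classifier:1.2.0")))
```

Building the command as a list (rather than a shell string) also avoids shell-injection issues when model names come from user input.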

Model Format Diversity

Supporting .pt, .pkl, .onnx and other formats required a unified interface abstraction with format-specific runners.
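The unified-interface idea can be sketched as one abstract runner plus a registry of format-specific implementations keyed by file extension (class and method names here are illustrative, and the ONNX runner is stubbed rather than calling onnxruntime):

```python
from abc import ABC, abstractmethod

class ModelRunner(ABC):
    """Common interface every format-specific runner implements (illustrative)."""
    @abstractmethod
    def load(self, path: str) -> None: ...
    @abstractmethod
    def predict(self, inputs): ...

RUNNERS: dict[str, type[ModelRunner]] = {}

def register(ext: str):
    """Class decorator that maps a file extension to a runner implementation."""
    def wrap(cls):
        RUNNERS[ext] = cls
        return cls
    return wrap

@register(".onnx")
class OnnxRunner(ModelRunner):
    def load(self, path: str) -> None:
        self.path = path          # a real runner would use onnxruntime here
    def predict(self, inputs):
        return {"stub": inputs}   # stubbed inference for the sketch

def runner_for(path: str) -> ModelRunner:
    """Dispatch to the right runner based on the artifact's extension."""
    return RUNNERS[path[path.rfind("."):]]()

print(type(runner_for("weights.onnx")).__name__)  # OnnxRunner
```

Adding support for a new format then only means registering one more class; callers keep using `load`/`predict` unchanged.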

Dependency Management

Each model declares its own Python dependencies, so the platform stores a requirements.txt per model and builds execution images on demand.
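The on-demand build can be sketched as generating a Dockerfile from the model's pinned requirements; the base image, layout, and `runner` entry point are assumptions for illustration:

```python
# Render a per-model Dockerfile from its declared dependencies.
# Base image, paths, and entry point are illustrative assumptions.

def render_dockerfile(requirements: list[str], python: str = "3.11") -> str:
    pins = " ".join(f"'{r}'" for r in requirements)
    return (
        f"FROM python:{python}-slim\n"
        "WORKDIR /app\n"
        f"RUN pip install --no-cache-dir {pins}\n"
        "COPY model/ ./model/\n"
        'CMD ["python", "-m", "runner"]\n'
    )

print(render_dockerfile(["onnxruntime>=1.17", "numpy"]))
```

Because each model version gets an image containing exactly its declared dependencies, two models with conflicting pins never share an environment.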

View on GitHub