# Supabase Storage Setup Guide

This document explains how to configure Supabase Storage for file uploads in InstantSplat, replacing the previous HuggingFace Hub upload functionality.

## Why Supabase Storage?

The HuggingFace Hub upload has been replaced with Supabase Storage to avoid the WebSocket dependency issues encountered with the supabase-py library. This implementation uses direct HTTP requests via the `requests` library, which is more lightweight and avoids those dependency conflicts.

## Setup Instructions

### 1. Create a Supabase Project

1. Go to [supabase.com](https://supabase.com) and create a new project
2. Wait for the project to be provisioned

### 2. Create a Storage Bucket

1. In your Supabase dashboard, navigate to **Storage**
2. Click **New bucket**
3. Create a bucket named `outputs` (or any name you prefer)
4. Choose whether to make it **public** or **private**:
   - **Public**: Files are accessible via public URLs (recommended for this use case)
   - **Private**: Files require authentication to access

### 3. Configure Bucket Policies (if using a public bucket)

For public access, ensure your bucket has appropriate policies:

1. Go to **Storage** > **Policies**
2. For the `outputs` bucket, add a policy for **SELECT** (read) operations:

   ```sql
   -- Allow public read access
   CREATE POLICY "Public read access" ON storage.objects
   FOR SELECT TO public
   USING (bucket_id = 'outputs');
   ```

3. Add a policy for **INSERT** (upload) operations:

   ```sql
   -- Allow authenticated uploads
   CREATE POLICY "Authenticated uploads" ON storage.objects
   FOR INSERT TO authenticated
   WITH CHECK (bucket_id = 'outputs');
   ```

### 4. Get Your Credentials

1. Go to **Settings** > **API** in your Supabase dashboard
2. Copy the following values:
   - **Project URL**: `https://xxxxx.supabase.co`
   - **anon public key**: For client-side uploads (limited permissions)
   - **service_role key**: For server-side uploads (full permissions)

⚠️ **Important**: Use the `service_role` key for server-side applications like this one, as it has full permissions to upload files.

### 5. Set Environment Variables

Set the following environment variables in your deployment environment:

```bash
export SUPABASE_URL="https://your-project-id.supabase.co"
export SUPABASE_KEY="your-service-role-key"
export SUPABASE_BUCKET="outputs"  # Optional, defaults to "outputs"
```

For local development, you can create a `.env` file:

```env
SUPABASE_URL=https://your-project-id.supabase.co
SUPABASE_KEY=your-service-role-key
SUPABASE_BUCKET=outputs
```

### 6. Configure File Size Limits

By default, Supabase Storage has the following limits:

- **Free tier**: Up to 50 MB per file
- **Pro tier**: Up to 5 GB per file (standard uploads)
- **Pro tier with resumable uploads**: Up to 50 GB per file

To increase file size limits:

1. Go to **Storage** > **Settings**
2. Set the **Global file size limit**
3. Optionally set per-bucket limits

For files larger than 6 MB, the implementation includes:

- ✅ Automatic retry logic with exponential backoff
- ✅ 10-minute timeout for large file uploads
- ✅ Direct storage hostname usage for better performance

## Implementation Details

The new implementation uses the Supabase Storage REST API directly:

### Endpoint Structure

```
POST https://{project-id}.storage.supabase.co/storage/v1/object/{bucket}/{path}
```

### Headers

```http
Authorization: Bearer {service_role_key}
apikey: {service_role_key}
Content-Type: {mime-type}
x-upsert: true
```

### Features

- **Direct HTTP uploads**: No Python library dependencies beyond `requests`
- **Retry logic**: Automatic retries with exponential backoff for failed uploads
- **Large file support**: Tested with files up to several hundred MB
- **Performance optimization**: Uses the direct storage hostname for better upload speeds
- **Auto content-type detection**: Automatically sets MIME types for .ply, .glb, and .mp4 files
- **Upsert support**: Automatically overwrites existing files with the same name
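Taken together, the endpoint, headers, and features above amount to a fairly small piece of code. The following is a minimal sketch, assuming only `requests` and the endpoint format shown above; the function name, default retry count, and backoff schedule here are illustrative and not necessarily identical to the `upload_to_supabase_storage` function in `app.py`:

```python
import mimetypes
import time
from pathlib import Path

import requests

# MIME types the guide calls out explicitly; .ply has no registered type,
# so a generic binary type is used for it.
EXPLICIT_TYPES = {".ply": "application/octet-stream",
                  ".glb": "model/gltf-binary",
                  ".mp4": "video/mp4"}


def upload_file(file_path, remote_path, supabase_url, supabase_key,
                bucket_name="outputs", max_retries=3, timeout=600):
    """Upload a local file to Supabase Storage and return its public URL."""
    # Direct storage hostname, as shown under "Endpoint Structure".
    project_id = supabase_url.split("//")[-1].split(".")[0]
    endpoint = (f"https://{project_id}.storage.supabase.co"
                f"/storage/v1/object/{bucket_name}/{remote_path}")

    suffix = Path(file_path).suffix.lower()
    content_type = (EXPLICIT_TYPES.get(suffix)
                    or mimetypes.guess_type(file_path)[0]
                    or "application/octet-stream")

    headers = {
        "Authorization": f"Bearer {supabase_key}",
        "apikey": supabase_key,
        "Content-Type": content_type,
        "x-upsert": "true",  # overwrite an existing object with the same name
    }

    data = Path(file_path).read_bytes()

    # Retry with exponential backoff: wait 1 s, 2 s, 4 s, ... between attempts.
    for attempt in range(max_retries):
        try:
            response = requests.post(endpoint, headers=headers,
                                     data=data, timeout=timeout)
            response.raise_for_status()
            break
        except requests.RequestException:
            if attempt == max_retries - 1:
                raise
            time.sleep(2 ** attempt)

    # Public-bucket URL format; verify it matches your project if files
    # upload but are not accessible (see Troubleshooting).
    return f"{supabase_url}/storage/v1/object/public/{bucket_name}/{remote_path}"
```

Reading the whole file into memory keeps the sketch simple; for the multi-gigabyte files mentioned under file size limits, the resumable-upload path is more appropriate.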
## Testing

To test the upload functionality locally:

```python
import os

from app import upload_to_supabase_storage

url = upload_to_supabase_storage(
    file_path="path/to/your/file.ply",
    remote_path="test/file.ply",
    supabase_url=os.environ.get("SUPABASE_URL"),
    supabase_key=os.environ.get("SUPABASE_KEY"),
    bucket_name="outputs"
)
print(f"Uploaded to: {url}")
```

## Troubleshooting

### Upload fails with 401 Unauthorized

- Check that `SUPABASE_KEY` is set correctly
- Ensure you're using the `service_role` key, not the `anon` key
- Verify the key hasn't been rotated or revoked

### Upload fails with 403 Forbidden

- Check that bucket policies allow INSERT operations
- Verify the bucket exists and is spelled correctly
- Ensure the service role key has appropriate permissions

### Upload times out

- Check your network connection
- Verify the file size is within limits
- Consider increasing the timeout parameter in the upload function

### Files upload but aren't accessible

- If using a public bucket, ensure SELECT policies are configured
- Check that the bucket is set to public in Storage settings
- Verify the public URL format matches your project

## Migration Notes

### Changes from HuggingFace Hub

| Aspect | HF Hub | Supabase Storage |
|--------|--------|------------------|
| Authentication | `HF_TOKEN` | `SUPABASE_URL` + `SUPABASE_KEY` |
| Bucket/Repo | `SPACE_REPO_ID` | `SUPABASE_BUCKET` |
| URL format | `huggingface.co/spaces/...` | `{project}.storage.supabase.co/...` |
| Library | `huggingface_hub` | `requests` (lightweight third-party HTTP library) |

### Environment Variable Migration

**Old (HF Hub):**

```bash
HF_TOKEN=hf_xxx
```

**New (Supabase):**

```bash
SUPABASE_URL=https://xxx.supabase.co
SUPABASE_KEY=eyJxxx
SUPABASE_BUCKET=outputs
```

## Additional Resources

- [Supabase Storage Documentation](https://supabase.com/docs/guides/storage)
- [Storage REST API Reference](https://supabase.com/docs/reference/storage)
- [File Upload Best Practices](https://supabase.com/docs/guides/storage/uploads)
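## Appendix: Minimal Credential Check

The 401 and 403 cases in the Troubleshooting section are easiest to diagnose in isolation. The sketch below is not part of the application; it only reuses the endpoint and headers described under Implementation Details (the object path `smoke-test/hello.txt` is a placeholder) to upload a tiny text file, so credential and policy problems can be reproduced without running the full InstantSplat pipeline:

```python
import os

import requests

SUPABASE_URL = os.environ["SUPABASE_URL"]   # e.g. https://xxxxx.supabase.co
SUPABASE_KEY = os.environ["SUPABASE_KEY"]   # service_role key
BUCKET = os.environ.get("SUPABASE_BUCKET", "outputs")

# Derive the direct storage hostname from the project URL, as described above.
project_id = SUPABASE_URL.split("//")[-1].split(".")[0]
endpoint = (f"https://{project_id}.storage.supabase.co"
            f"/storage/v1/object/{BUCKET}/smoke-test/hello.txt")

response = requests.post(
    endpoint,
    headers={
        "Authorization": f"Bearer {SUPABASE_KEY}",
        "apikey": SUPABASE_KEY,
        "Content-Type": "text/plain",
        "x-upsert": "true",
    },
    data=b"hello from the smoke test",
    timeout=30,
)

# A 2xx status means credentials and policies are fine; 401 or 403 points at
# the corresponding Troubleshooting entries above.
print(response.status_code, response.text)
```

If this request succeeds but the real uploads still fail, the problem is more likely file size limits or network timeouts than credentials.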