facemelter committed
Commit 2508e29 · verified · 1 parent: ff0e97f

Updated .env.example and README.md

Files changed (2):
  1. .env.example +22 -20
  2. README.md +353 -48
.env.example CHANGED
@@ -1,8 +1,7 @@
 
-
-
+## DEPRECATED: was used with ebird_tools.py to disable/enable auth
 # Environment
-ENVIRONMENT=production
+##ENVIRONMENT=production
 
 # ------------------------------------------------
 
@@ -38,29 +37,15 @@ ANTHROPIC_TEMPERATURE=0.0
 BIRD_CLASSIFIER_API_KEY=<secure-key-for-modal-bird-classifier>
 
 # https://.../mcp
-MODAL_MCP_URL=<secure-key-for-modal-mcp>
-
-# ------------------------------------------------
-
-##############################################
-# eBird MCP Server - LEGACY
-##############################################
-# Use true for HF Space (subprocess mode)
-#EBIRD_USE_STDIO=true
+MODAL_MCP_URL=<https://your-modal-server-url/mcp>
 
-# Cornell eBird API
-#EBIRD_API_KEY=
-# REQUIRED for eBird API calls
-#EBIRD_BASE_URL=https://api.ebird.org/v2
-#EBIRD_MCP_AUTH_KEY=<secure_key_for_ebird_auth> # Only needed for HTTP
-#EBIRD_MCP_URL=http://localhost:8000/mcp # Update if eBird server deployed separately
 
 # ------------------------------------------------
 
 ##############################################
-# eBird MCP Server
+# Nuthatch MCP Server -- CURRENT
 ##############################################
-###### NUTHATCH HAIL MARY ######
+# Use true for HF Space (subprocess mode)
 NUTHATCH_USE_STDIO=true
 
 NUTHATCH_API_KEY=<secure-key-for-nuthatch-api>
@@ -72,3 +57,20 @@ NUTHATCH_MCP_URL=http://localhost:8000/mcp # Only for HTTP mode
 DEFAULT_TIMEOUT=15
 RATE_LIMIT_DELAY=1.0
 
+
+# ------------------------------------------------
+
+##############################################
+# eBird MCP Server - BONUS TOOL (not wired)
+# Refer to: <doc> for instructions to integrate this and other tools
+##############################################
+# Use true for HF Space (subprocess mode)
+#EBIRD_USE_STDIO=true
+
+# Cornell eBird API
+#EBIRD_API_KEY=
+# REQUIRED for eBird API calls
+#EBIRD_BASE_URL=https://api.ebird.org/v2
+#EBIRD_MCP_AUTH_KEY=<secure_key_for_ebird_auth> # Only needed for HTTP
+#EBIRD_MCP_URL=http://localhost:8000/mcp # Update if eBird server deployed separately
+
README.md CHANGED
@@ -1,5 +1,5 @@
 ---
-title: BirdScope AI - MCP Agent
 emoji: 🦅
 colorFrom: green
 colorTo: blue
@@ -9,78 +9,383 @@ app_file: app.py
 pinned: false
 ---
 
-# 🦅 BirdScope AI - Streaming MCP Agent
 
-**Real-time bird identification powered by MCP tools!**
 
-Built for the [MCP 1st Birthday Hackathon](https://huggingface.co/MCP-First-Birthday)
 
-## Features
 
-- 🔍 **Image Classification**: Upload bird photos for instant AI identification
-- 🗺️ **Location Discovery**: Find birding hotspots near any location
-- 📊 **Sighting Data**: Get recent observations from eBird API
-- 🤖 **Multi-Provider**: Support for HuggingFace (free credits) & OpenAI
 
-## How to Use
 
-### Option 1: HuggingFace (Recommended for Hackathon)
-1. Get your HuggingFace API key from [Settings → Access Tokens](https://huggingface.co/settings/tokens)
-2. Select "HuggingFace" as your provider in the sidebar
-3. Enter your HF API key in the sidebar
-4. Start chatting! (Uses your $25 hackathon credits)
 
-### Option 2: OpenAI
-1. Select "OpenAI" as your provider in the sidebar
-2. Enter your OpenAI API key in the sidebar
-3. Start chatting!
 
-## Architecture
 
-This Space uses **MCP (Model Context Protocol)** to connect AI agents with:
 
-### Modal MCP Server
-- GPU-powered bird classification
-- ResNet50 model trained on 555 bird species
-- Real-time image processing
 
-### eBird MCP Server
-- 7 tools for bird data discovery
-- Recent sightings, hotspot locations
-- Notable/rare bird alerts
-- Powered by Cornell Lab of Ornithology
 
-## Technology Stack
 
-- **Frontend**: Gradio 6.0 with custom Blocks UI
-- **Agent Framework**: LangGraph with streaming support
-- **MCP Clients**: FastMCP for tool integration
-- **LLM Providers**:
-  - HuggingFace Inference API (Qwen/Qwen3-Coder)
-  - OpenAI (gpt-4o-mini)
 
-## Development
 
-Local testing:
 ```bash
 # Install dependencies
 pip install -r requirements.txt
 
-# Set up environment variables
 cp .env.example .env
-# Edit .env with your API keys
 
-# Run eBird MCP server (in separate terminal)
-python ebird_tools.py --http --port 8000
 
-# Run the app
 python app.py
 ```
 
-**Note**: Both HuggingFace and OpenAI providers work locally and on HF Spaces. Just provide your API key in the sidebar.
 
-## Credits
 
-- **Bird Data**: [eBird](https://ebird.org) by Cornell Lab of Ornithology
 - **MCP Protocol**: [Anthropic Model Context Protocol](https://github.com/anthropics/mcp)
-- **Bird Classifier**: Custom ResNet50 model
 ---
+title: BirdScope AI - MCP Multi-Agent System
 emoji: 🦅
 colorFrom: green
 colorTo: blue
 pinned: false
 ---
 
+# 🦅 BirdScope AI - Multi-Agent Bird Identification System
 
+**AI-powered bird identification with specialized MCP agents**
 
+Built for the [MCP 1st Birthday Hackathon](https://huggingface.co/MCP-1st-Birthday)
 
+---
+
+## 🎯 Overview
+
+BirdScope AI is a production-ready multi-agent system that combines **Modal GPU classification** with the **Nuthatch species database** to provide comprehensive bird identification and exploration. Users can upload photos, search species, explore taxonomic families, and access rich multimedia content (images, audio recordings, conservation data).
+
+**Two Agent Modes:**
+1. **Specialized Subagents (3 Specialists)** - A router orchestrates an image identifier, a species explorer, and a taxonomy specialist
+2. **Audio Finder Agent** - A specialized agent for discovering bird audio recordings
 
+---
+
+## ✨ Features
 
+- 🔍 **Image Classification**: Upload bird photos for instant GPU-powered identification
+- 📸 **Reference Images**: High-quality Unsplash photos for each species
+- 🎵 **Audio Recordings**: Bird calls and songs from xeno-canto.org
+- 🌍 **Conservation Data**: IUCN status and taxonomic information
+- 🧠 **Multi-Agent Architecture**: Specialized agents with focused tool subsets
+- 🔄 **Dual Streaming**: Separate outputs for chat responses and tool execution logs
+- 🤖 **Multi-Provider**: OpenAI (GPT-4), Anthropic (Claude), HuggingFace (Qwen)
 
+---
 
+## 🚀 Quick Start (For Users)
 
+### Option 1: OpenAI (Recommended)
+1. Get your OpenAI API key from [platform.openai.com/api-keys](https://platform.openai.com/api-keys)
+2. Select **OpenAI** as provider in the sidebar
+3. Enter your API key
+4. Model used: `gpt-4o-mini`
 
+### Option 2: Anthropic (Claude)
+1. Get your Anthropic API key from [console.anthropic.com/settings/keys](https://console.anthropic.com/settings/keys)
+2. Select **Anthropic** as provider
+3. Enter your API key
+4. Model used: `claude-sonnet-4-5`
 
+### Option 3: HuggingFace
+⚠️ **Note**: The HuggingFace Inference API has limited function calling support. OpenAI or Anthropic are recommended for full functionality.
 
+---
 
+## 🛠️ Environment Setup (For Developers)
 
+### Prerequisites
 
+- Python 3.11+
+- Modal account (for the GPU classifier)
+- Nuthatch API key
+- LLM API key (OpenAI, Anthropic, or HuggingFace)
+
+---
+
+### 🏠 Local Development Setup
+
+#### Step 1: Clone and Install
 
 ```bash
+cd ~/Desktop/hackathon/hackathon_draft
+
+# Create virtual environment
+python3.11 -m venv .venv
+source .venv/bin/activate  # On Windows: .venv\Scripts\activate
+
 # Install dependencies
 pip install -r requirements.txt
+```
 
+#### Step 2: Configure Environment Variables
+
+Create a `.env` file from the example:
+
+```bash
 cp .env.example .env
+```
+
+Edit `.env` with your API keys:
+
+```bash
+# ================================================
+# REQUIRED: Modal Bird Classifier (GPU)
+# ================================================
+MODAL_MCP_URL=https://your-modal-app--mcp-server.modal.run/mcp
+BIRD_CLASSIFIER_API_KEY=your-modal-api-key-here
+
+# ================================================
+# REQUIRED: Nuthatch Species Database
+# ================================================
+NUTHATCH_API_KEY=your-nuthatch-api-key-here
+NUTHATCH_BASE_URL=https://nuthatch.lastelm.software/v2  # Default, can omit
+
+# Nuthatch Transport Mode (STDIO or HTTP)
+NUTHATCH_USE_STDIO=true  # Recommended for local development
+
+# Only needed if NUTHATCH_USE_STDIO=false:
+# NUTHATCH_MCP_URL=http://localhost:8001/mcp
+# NUTHATCH_MCP_AUTH_KEY=your-auth-key-here
+
+# ================================================
+# LLM Provider (Choose ONE)
+# ================================================
+# OpenAI (Recommended)
+OPENAI_API_KEY=sk-your-openai-key-here
+DEFAULT_OPENAI_MODEL=gpt-4o-mini
+OPENAI_TEMPERATURE=0.0
+
+# OR Anthropic
+# ANTHROPIC_API_KEY=sk-ant-your-anthropic-key-here
+# DEFAULT_ANTHROPIC_MODEL=claude-sonnet-4-5-20250929
+# ANTHROPIC_TEMPERATURE=0.0
+
+# OR HuggingFace (Limited function calling support)
+# HF_API_KEY=hf_your-huggingface-token-here
+# DEFAULT_HF_MODEL=Qwen/Qwen2.5-Coder-32B-Instruct
+# HF_TEMPERATURE=0.1
+```
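
For reference, here is a minimal sketch of how a loader could read and sanity-check these variables (stdlib only; the app's real loader lives in `langgraph_agent/config.py` and may differ):

```python
REQUIRED = ["MODAL_MCP_URL", "BIRD_CLASSIFIER_API_KEY", "NUTHATCH_API_KEY"]

def parse_env_file(text: str) -> dict:
    """Parse simple KEY=VALUE lines, skipping blanks and # comments."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, sep, value = line.partition("=")
        if sep:
            # Drop an inline " # ..." comment, if any, and trim whitespace
            env[key.strip()] = value.split(" #")[0].strip()
    return env

def check_required(env: dict) -> list:
    """Return the names of required settings that are missing or empty."""
    return [k for k in REQUIRED if not env.get(k)]
```

Running `check_required` on a freshly copied `.env.example` quickly shows which placeholders still need real values.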
 
+#### Step 3: Understanding Nuthatch Transport Modes
 
+**STDIO Mode (Recommended for Local):**
+- Nuthatch MCP server runs as a subprocess
+- Automatically started by the app
+- No separate server process needed
+- Set `NUTHATCH_USE_STDIO=true`
+
+**HTTP Mode (Alternative for Local):**
+- Nuthatch MCP server runs as a separate HTTP server
+- Useful for debugging or multiple clients
+- Requires running the server in a separate terminal
+
+To use HTTP mode:
+
+```bash
+# Terminal 1: Run Nuthatch MCP server
+python nuthatch_tools.py --http --port 8001
+
+# Terminal 2: Run the app
+# Set in .env:
+#   NUTHATCH_USE_STDIO=false
+#   NUTHATCH_MCP_URL=http://localhost:8001/mcp
+python app.py
+```
+ ```
161
+
162
+ #### Step 4: Run the App
163
+
164
+ ```bash
165
+ # With STDIO mode (default, easiest):
166
  python app.py
167
+
168
+ # Or using Gradio CLI:
169
+ gradio app.py
170
+ ```
171
+
172
+ App will be available at: `http://127.0.0.1:7860`
173
+
174
+ ---
175
+
176
+ ### ☁️ HuggingFace Spaces Deployment
177
+
178
+ #### Step 1: Create a New Space
179
+
180
+ 1. Go to [huggingface.co/new-space](https://huggingface.co/new-space)
181
+ 2. Choose:
182
+ - **SDK**: Gradio
183
+ - **Hardware**: CPU Basic (free) or CPU Upgrade (faster)
184
+ - **Visibility**: Public or Private
185
+
186
+ #### Step 2: Upload Your Code
187
+
188
+ **Option A: Using `upload_to_space.py` (Recommended)**
189
+
190
+ ```bash
191
+ # 1. Install HuggingFace CLI
192
+ pip install huggingface_hub
193
+
194
+ # 2. Login
195
+ huggingface-cli login
196
+
197
+ # 3. Update upload_to_space.py with your Space name
198
+ # Edit line with repo_id:
199
+ # repo_id="YOUR-USERNAME/YOUR-SPACE-NAME"
200
+
201
+ # 4. Upload
202
+ python upload_to_space.py
203
+ ```
204
+
205
+ **Option B: Using Git**
206
+
207
+ ```bash
208
+ git remote add hf-space https://huggingface.co/spaces/YOUR-USERNAME/YOUR-SPACE-NAME
209
+ git push hf-space main
210
  ```
211
 
212
+ #### Step 3: Configure Secrets in HuggingFace Spaces
213
+
214
+ ⚠️ **CRITICAL**: Spaces use **Secrets**, not `.env` files!
215
+
216
+ Go to your Space β†’ **Settings** β†’ **Variables and secrets**
217
+
218
+ **Add these secrets:**
219
 
220
+ ```bash
221
+ # REQUIRED: Modal Bird Classifier
222
+ MODAL_MCP_URL = https://your-modal-app--mcp-server.modal.run/mcp
223
+ BIRD_CLASSIFIER_API_KEY = your-modal-api-key-here
224
+
225
+ # REQUIRED: Nuthatch Species Database
226
+ NUTHATCH_API_KEY = your-nuthatch-api-key-here
227
+ NUTHATCH_BASE_URL = https://nuthatch.lastelm.software/v2 # Optional
228
+ NUTHATCH_USE_STDIO = true # MUST be "true" for Spaces
229
+
230
+ # OPTIONAL: Backend-provided LLM keys (users can provide their own)
231
+ # Only add if you want to provide default keys:
232
+ # OPENAI_API_KEY = sk-your-key-here
233
+ # ANTHROPIC_API_KEY = sk-ant-your-key-here
234
+ ```
235
 
236
+ **Important Notes:**
237
+ - βœ… **ALWAYS** use `NUTHATCH_USE_STDIO=true` on Spaces (subprocess mode)
238
+ - βœ… HTTP mode not supported on Spaces (port binding restrictions)
239
+ - βœ… Users can provide their own LLM keys via the UI
240
+ - βœ… Environment variables from Spaces **do not** auto-inherit to subprocesses
241
+ - The app explicitly passes `NUTHATCH_API_KEY` and `NUTHATCH_BASE_URL` to the subprocess (see `mcp_clients.py`)
242
+
243
+ #### Step 4: Verify Deployment
244
+
245
+ 1. Wait for Space to build (2-5 minutes)
246
+ 2. Check **Logs** tab for errors
247
+ 3. Try the app - upload a bird photo or ask about species
248
+
249
+ ---
250
+
251
+ ## πŸ“ Project Structure
252
+
253
+ ```
254
+ hackathon_draft/
255
+ β”œβ”€β”€ app.py # Main Gradio app
256
+ β”œβ”€β”€ upload_to_space.py # HF Spaces upload script
257
+ β”œβ”€β”€ requirements.txt # Python dependencies
258
+ β”œβ”€β”€ .env.example # Environment template
259
+ β”œβ”€β”€ langgraph_agent/
260
+ β”‚ β”œβ”€β”€ __init__.py
261
+ β”‚ β”œβ”€β”€ agents.py # Agent factory (single/multi-agent)
262
+ β”‚ β”œβ”€β”€ config.py # Configuration loader
263
+ β”‚ β”œβ”€β”€ mcp_clients.py # MCP client setup
264
+ β”‚ β”œβ”€β”€ subagent_config.py # Agent mode definitions
265
+ β”‚ β”œβ”€β”€ prompts.py # System prompts
266
+ β”‚ └── structured_output.py # Response formatting
267
+ β”œβ”€β”€ nuthatch_tools.py # Nuthatch MCP server
268
+ └── agent_cache.py # Session-based agent caching
269
+ ```
270
+
271
+ ---
272
+
273
+ ## πŸ—οΈ Architecture
274
+
275
+ ### MCP Servers
276
+
277
+ **1. Modal Bird Classifier (GPU)**
278
+ - Hosted on Modal (serverless GPU)
279
+ - ResNet50 trained on 555 bird species
280
+ - Tools: `classify_from_url`, `classify_from_base64`
281
+ - Transport: Streamable HTTP
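
As an illustration, preparing an image for `classify_from_base64` amounts to base64-encoding the raw bytes; the exact payload schema is defined by the Modal server, so treat the argument name below as an assumption:

```python
import base64

def image_to_base64(image_bytes: bytes) -> str:
    """Encode raw image bytes as the ASCII base64 string the classifier tool expects."""
    return base64.b64encode(image_bytes).decode("ascii")

# Hypothetical usage with an MCP client (argument name is an assumption):
# result = await client.call_tool("classify_from_base64",
#                                 {"image_b64": image_to_base64(data)})
```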
+
+**2. Nuthatch Species Database**
+- Species reference API (1000+ birds)
+- Tools: `search_birds`, `get_bird_info`, `get_bird_images`, `get_bird_audio`, `search_by_family`, `filter_by_status`, `get_all_families`
+- Transport: **STDIO** (subprocess on Spaces), STDIO or HTTP (local)
+- Data sources: Unsplash (images), xeno-canto (audio)
+
+### Agent Modes
+
+**Mode 1: Specialized Subagents (3 Specialists)**
+- **Router** orchestrates 3 specialized agents:
+  1. **Image Identifier**: classify images, show reference photos
+  2. **Species Explorer**: search by name, provide multimedia
+  3. **Taxonomy Specialist**: conservation status, family search
+- Each specialist has a focused tool subset
+
+**Mode 2: Audio Finder Agent**
+- Single specialized agent for finding bird audio
+- Tools: `search_birds`, `get_bird_info`, `get_bird_audio`
+- Optimized workflow for xeno-canto recordings
+
303
+ ### Tech Stack
304
+
305
+ - **Frontend**: Gradio 6.0 with custom CSS (cloud/sky theme)
306
+ - **Agent Framework**: LangGraph with streaming
307
+ - **MCP Integration**: FastMCP client library
308
+ - **LLM Support**: OpenAI, Anthropic, HuggingFace
309
+ - **Session Management**: In-memory agent caching
310
+ - **Output Parsing**: LlamaIndex Pydantic + regex (optimized)
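
The in-memory session caching mentioned above can be sketched as a dict keyed by session, provider, and mode (the actual implementation is `agent_cache.py`; this only shows the idea):

```python
class AgentCache:
    """Cache agents per (session, provider, mode) so they are built once per session."""

    def __init__(self):
        self._agents = {}
        self.builds = 0  # how many times a factory actually ran

    def get_or_create(self, session_id, provider, mode, factory):
        key = (session_id, provider, mode)
        if key not in self._agents:
            self._agents[key] = factory()  # build only on a cache miss
            self.builds += 1
        return self._agents[key]
```

Switching provider or agent mode changes the key, so a fresh agent is built while the old one stays cached for the session.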
+
+---
+
+## 🎨 Special Features
+
+### Dual Streaming Output
+- **Chat Panel**: LLM responses with markdown rendering
+- **Tool Log Panel**: Real-time tool execution traces (inputs/outputs)
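
The dual-stream idea can be sketched as a generator that yields paired snapshots for the two panels (event shapes here are assumptions; the app actually consumes LangGraph stream events):

```python
def stream_turn(events):
    """Yield (chat_text, tool_log) snapshots from a stream of agent events.

    Token events grow the chat panel; tool events append to the tool log,
    so the UI can update both outputs from one loop.
    """
    chat, log = "", []
    for kind, payload in events:
        if kind == "token":
            chat += payload
        elif kind == "tool":
            log.append(payload)
        yield chat, "\n".join(log)
```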
+
+### Dynamic Examples
+- Examples change based on the selected agent mode
+- Photo examples always visible
+- Text examples adapt to Audio Finder vs Multi-Agent
+
+### Structured Output
+- Automatic image/audio URL extraction
+- Markdown formatting for media
+- xeno-canto audio links (browser-friendly)
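
The URL extraction step can be sketched with two regexes (patterns are illustrative; `structured_output.py` holds the real parsing):

```python
import re

# Image URLs by file extension; audio URLs by the xeno-canto host.
IMAGE_RE = re.compile(r"https?://\S+\.(?:jpg|jpeg|png|webp)", re.IGNORECASE)
AUDIO_RE = re.compile(r"https?://\S*xeno-canto\.org/\S+", re.IGNORECASE)

def extract_media(text: str) -> dict:
    """Pull image and audio URLs out of an agent reply for markdown rendering."""
    return {
        "images": IMAGE_RE.findall(text),
        "audio": AUDIO_RE.findall(text),
    }
```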
+
+---
+
+## 📝 API Key Sources
+
+| Service | Get Key From | Purpose |
+|---------|--------------|---------|
+| **Modal** | [modal.com](https://modal.com) | GPU bird classifier |
+| **Nuthatch** | [nuthatch.lastelm.software](https://nuthatch.lastelm.software) | Species database |
+| **OpenAI** | [platform.openai.com/api-keys](https://platform.openai.com/api-keys) | LLM (recommended) |
+| **Anthropic** | [console.anthropic.com/settings/keys](https://console.anthropic.com/settings/keys) | LLM (Claude) |
+| **HuggingFace** | [huggingface.co/settings/tokens](https://huggingface.co/settings/tokens) | LLM (limited support) |
+
+---
+
+## 🐛 Troubleshooting
+
+### Space stuck on "Building"
+- Check the **Logs** tab for errors
+- Verify all required secrets are set
+- Try a Factory Reboot (Settings → Factory Reboot)
+
+### "Invalid API key" errors
+- Ensure secrets are set correctly (no quotes needed)
+- Check that secret names match exactly (case-sensitive)
+
+### HuggingFace provider fails with "function calling not supported"
+- The HuggingFace Inference API has limited tool calling
+- Use OpenAI or Anthropic instead
+
+### Nuthatch server not starting (local)
+- Check that `NUTHATCH_API_KEY` is set in `.env`
+- Verify the API key is valid
+- Try STDIO mode: `NUTHATCH_USE_STDIO=true`
+
+### Audio links broken
+- Check that `AUDIO_FINDER_PROMPT` is working
+- Verify xeno-canto URLs include `/download`
+- Check the structured output parsing logs
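
The `/download` check above can be sketched as a small normalizer (hypothetical helper name):

```python
def xc_download_url(url: str) -> str:
    """Ensure a xeno-canto recording URL points at the raw audio file.

    Page URLs like https://xeno-canto.org/12345 render HTML; appending
    /download yields the playable file, per the troubleshooting note above.
    """
    url = url.rstrip("/")
    return url if url.endswith("/download") else url + "/download"
```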
+
+---
+
+## 📚 Documentation
+
+For detailed implementation docs, see:
+- `project_docs/implementation/phase_5_final.md` - Complete agent architecture
+- `project_docs/commands_guide/git_spaces_cheatsheet.md` - Deployment guide
+
+---
+
+## 🏆 Credits
+
+- **Bird Species Data**: [Nuthatch API](https://nuthatch.lastelm.software) by Last Elm Software
+- **Bird Audio**: [xeno-canto.org](https://xeno-canto.org) - Community bird recordings
+- **Reference Images**: [Unsplash](https://unsplash.com) + curated collections
 - **MCP Protocol**: [Anthropic Model Context Protocol](https://github.com/anthropics/mcp)
+- **Hackathon**: [HuggingFace MCP-1st-Birthday](https://huggingface.co/MCP-1st-Birthday)
+
+---
+
+## 📄 License
+
+MIT License - Built for educational and research purposes