Fix Redis connection issues with proper SSL configuration and enhanced error handling
Files changed:
- README.md (+74 -60)
- core/redis_client.py (+16 -20)
- test_redis_connection.py (+57 -0, new file)
README.md (CHANGED)
Your personal AI-powered life coaching assistant.

## Requirements

All requirements are specified in `requirements.txt`. The app automatically handles:
- Streamlit UI
- FastAPI backend (for future expansion)
- Redis connection for persistent memory

## Environment Variables

Configure these in your Hugging Face Space secrets or local `.env` file:

- `OLLAMA_HOST`: Your Ollama server URL (default: ngrok URL)
- `LOCAL_MODEL_NAME`: Default model name (default: mistral)
- `HF_TOKEN`: Hugging Face API token (for Hugging Face models)
- `HF_API_ENDPOINT_URL`: Hugging Face Inference API endpoint
- `USE_FALLBACK`: Whether to use fallback providers (true/false)
- `REDIS_HOST`: Redis server hostname (default: localhost)
- `REDIS_PORT`: Redis server port (default: 6379)
- `REDIS_USERNAME`: Redis username (optional)
- `REDIS_PASSWORD`: Redis password (optional)
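For local development, a minimal sketch of reading these variables in Python; the default values shown here are illustrative assumptions, not necessarily the real defaults in `utils/config.py`:

```python
import os

# Minimal sketch: read the environment variables listed above.
# Defaults below are illustrative assumptions, not the app's actual defaults.
OLLAMA_HOST = os.getenv("OLLAMA_HOST", "http://localhost:11434")
LOCAL_MODEL_NAME = os.getenv("LOCAL_MODEL_NAME", "mistral")
HF_TOKEN = os.getenv("HF_TOKEN")
HF_API_ENDPOINT_URL = os.getenv("HF_API_ENDPOINT_URL")
USE_FALLBACK = os.getenv("USE_FALLBACK", "false").lower() == "true"
REDIS_HOST = os.getenv("REDIS_HOST", "localhost")
REDIS_PORT = int(os.getenv("REDIS_PORT", "6379"))
REDIS_USERNAME = os.getenv("REDIS_USERNAME")  # optional
REDIS_PASSWORD = os.getenv("REDIS_PASSWORD")  # optional
```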
## Provider Details

### Ollama (Primary Local Provider)

Setup:
1. Install Ollama: https://ollama.com/download
2. Pull a model: `ollama pull mistral`
3. Start server: `ollama serve`
4. Configure ngrok: `ngrok http 11434`
5. Set `OLLAMA_HOST` to your ngrok URL (a quick connectivity check is sketched below)

Advantages:
- No cost for inference
- Full control over models
- Fast response times
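To confirm the server is reachable through the configured URL, a minimal connectivity check; this assumes Ollama's standard REST endpoint `/api/tags` and the `requests` package, and is not part of the app's own code:

```python
import os

import requests

# Minimal connectivity check: list the models available on the Ollama server.
# OLLAMA_HOST comes from the environment; the default below is an assumption.
host = os.getenv("OLLAMA_HOST", "http://localhost:11434")
resp = requests.get(f"{host}/api/tags", timeout=10)
resp.raise_for_status()
models = [m["name"] for m in resp.json().get("models", [])]
print(f"Ollama reachable at {host}. Models: {models}")
```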
### Hugging Face Inference API (Fallback)

Current Endpoint: https://zxzbfrlg3ssrk7d9.us-east-1.aws.endpoints.huggingface.cloud

Important Scaling Behavior:
- ⚠️ Scale-to-Zero: The endpoint automatically scales to zero after 15 minutes of inactivity
- ⏱️ Cold Start: Takes approximately 4 minutes to initialize when first requested
- 🔄 Automatic Wake-up: Sending any request will automatically start the endpoint
- 💰 Cost: $0.536/hour while running (not billed when scaled to zero)
- 📍 Location: AWS us-east-1 (Intel Sapphire Rapids, 16 vCPUs, 32 GB RAM)

Handling 503 Errors:
When using the Hugging Face fallback, you may encounter 503 errors initially. This indicates the endpoint is initializing. Simply retry your request after 30-60 seconds, or wait for the initialization to complete (typically 4 minutes); a retry sketch follows this section.

Model: OpenAI GPT OSS 20B (Uncensored variant)
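A minimal sketch of that retry-on-503 behavior; it assumes `HF_API_ENDPOINT_URL` and `HF_TOKEN` are set in the environment, and the JSON payload shape is an assumption that may need adjusting for the deployed model:

```python
import os
import time

import requests

# Minimal sketch: retry a Hugging Face Inference Endpoint request while it cold-starts.
# Requires HF_API_ENDPOINT_URL and HF_TOKEN in the environment.
url = os.environ["HF_API_ENDPOINT_URL"]
headers = {"Authorization": f"Bearer {os.environ['HF_TOKEN']}"}


def query_with_retry(prompt: str, retries: int = 8, wait_seconds: int = 30):
    for _ in range(retries):  # 8 x 30s covers the ~4 minute cold start
        resp = requests.post(url, headers=headers, json={"inputs": prompt}, timeout=60)
        if resp.status_code == 503:  # endpoint still initializing
            time.sleep(wait_seconds)
            continue
        resp.raise_for_status()
        return resp.json()
    raise TimeoutError("Endpoint did not become ready in time")


if __name__ == "__main__":
    print(query_with_retry("Hello!"))
```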
### OpenAI (Alternative Fallback)

Configure with the `OPENAI_API_KEY` environment variable.
## Switching Between Providers

### For Local Development (Windows/Ollama):

1. Install Ollama:
   ```bash
   # Download from https://ollama.com/download/OllamaSetup.exe
   ```
2. Pull and run models:
   ```bash
   ollama pull mistral
   ollama pull llama3
   ollama serve
   ```
3. Start the ngrok tunnel:
   ```bash
   ngrok http 11434
   ```
4. Update environment variables (a quick generation test is sketched after these steps):
   ```
   OLLAMA_HOST=https://your-ngrok-url.ngrok-free.app
   LOCAL_MODEL_NAME=mistral
   USE_FALLBACK=false
   ```
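Once the steps above are done, a quick end-to-end generation test against the Ollama REST API; `/api/generate` is Ollama's standard endpoint, and the fallback host below is an assumption:

```python
import os

import requests

# Minimal sketch: send one prompt through Ollama to confirm the pulled model responds.
host = os.getenv("OLLAMA_HOST", "http://localhost:11434")
model = os.getenv("LOCAL_MODEL_NAME", "mistral")

resp = requests.post(
    f"{host}/api/generate",
    json={"model": model, "prompt": "Say hello in one sentence.", "stream": False},
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])
```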
### For Production Deployment:

The application automatically handles provider fallback:
- Primary: Ollama (via ngrok)
- Secondary: Hugging Face Inference API
- Tertiary: OpenAI (if configured)
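As an illustration of that ordering, a self-contained sketch of a fallback chain; the provider callables here are stand-ins, not the app's actual `core/llm.py` API:

```python
# Illustrative only: try providers in the documented order, falling through on failure.
def generate_with_fallback(prompt, providers):
    """Try each (name, callable) provider in order; return the first successful reply."""
    last_error = None
    for name, call in providers:
        try:
            return call(prompt)
        except Exception as exc:
            last_error = exc
            print(f"{name} failed ({exc}); trying next provider")
    raise RuntimeError("All providers failed") from last_error


def call_ollama(prompt):
    raise ConnectionError("ngrok tunnel not reachable")  # simulate a failing primary


def call_hf_endpoint(prompt):
    return f"[fallback reply to: {prompt}]"  # simulate a healthy secondary


if __name__ == "__main__":
    chain = [("ollama", call_ollama), ("huggingface", call_hf_endpoint)]
    print(generate_with_fallback("Hello", chain))
```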
## Architecture

This application consists of:
- Streamlit frontend (`app.py`)
- Core LLM abstraction (`core/llm.py`)
- Memory management (`core/memory.py`)
- Configuration management (`utils/config.py`)
- API endpoints (in the `api/` directory, for future expansion)

Built with Python, Streamlit, FastAPI, and Redis.
## Troubleshooting Common Issues

503 Errors with Hugging Face Fallback:
- Wait 4 minutes for cold-start initialization
- Retry the request after the endpoint warms up

Ollama Connection Issues:
- Verify `ollama serve` is running locally
- Check the ngrok tunnel status
- Confirm the ngrok URL matches `OLLAMA_HOST`
- Test with `test_ollama_connection.py`
Redis Connection Problems:
- Set `USE_FALLBACK=true` to disable the Redis requirement
- Or configure proper Redis credentials:
  ```
  REDIS_HOST=redis-10296.c245.us-east-1-3.ec2.redns.redis-cloud.com
  REDIS_PORT=10296
  REDIS_USERNAME=default
  REDIS_PASSWORD=your_password_here
  REDIS_DISABLE_SSL=false
  ```
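To check the credentials independently of the app, a minimal sketch using `redis-py` directly; the host and port mirror the example values above and the password is a placeholder:

```python
import redis

# Minimal sketch: verify Redis Cloud credentials directly with redis-py.
# ssl=True with certificate validation matches the SSL path in core/redis_client.py.
client = redis.Redis(
    host="redis-10296.c245.us-east-1-3.ec2.redns.redis-cloud.com",
    port=10296,
    username="default",
    password="your_password_here",  # placeholder
    ssl=True,
    ssl_cert_reqs="required",
    decode_responses=True,
    socket_connect_timeout=15,
)
print("PING ->", client.ping())
```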
Model Not Found:
- Pull the required model: `ollama pull <model-name>`
- Check available models: `ollama list`

Diagnostic Scripts:
- Run `python test_ollama_connection.py` to verify Ollama connectivity.
- Run `python diagnose_ollama.py` for detailed connection diagnostics.
- Run `python test_redis_connection.py` to verify Redis connectivity.
core/redis_client.py (CHANGED)
Updated hunks, shown as they read after this change (unchanged lines between hunks are elided with `# ...`):

```python
logger = logging.getLogger(__name__)


class RedisClient:
    """Enhanced Redis client with proper connection handling"""

    _instance = None
    _redis_client = None

    def __new__(cls):
        if cls._instance is None:
            cls._instance = super(RedisClient, cls).__new__(cls)
        return cls._instance

    def __init__(self):
        if not hasattr(self, '_initialized'):
            self._initialized = True
            self._connect()

    def _connect(self):
        """Establish Redis connection with proper error handling for Redis Cloud"""
        logger.info(f"Attempting Redis connection with:")
        # ...
        logger.info(f"  Username: {'SET' if config.redis_username else 'NOT SET'}")
        logger.info(f"  Password: {'SET' if config.redis_password else 'NOT SET'}")
        logger.info(f"  SSL Disabled: {config.redis_disable_ssl}")

        if not config.redis_host or config.redis_host == "localhost":
            logger.info("Redis not configured, skipping connection")
            return None
        # ... (non-SSL connection attempt; the client constructor call is elided in the diff)
                username=config.redis_username if config.redis_username else "default",
                password=config.redis_password,
                decode_responses=True,
                socket_connect_timeout=15,
                socket_timeout=15,
                health_check_interval=30,
                retry_on_timeout=True
            )
            self._redis_client.ping()
            logger.info("Successfully connected to Redis without SSL")
            return
        # ... (SSL connection attempt; the client constructor call is elided in the diff)
                username=config.redis_username if config.redis_username else "default",
                password=config.redis_password,
                decode_responses=True,
                socket_connect_timeout=15,
                socket_timeout=15,
                ssl=True,
                ssl_cert_reqs='required',  # Proper SSL certificate validation
                health_check_interval=30,
                retry_on_timeout=True
            )
            self._redis_client.ping()
            logger.info("Successfully connected to Redis Cloud with SSL")
            return
        except Exception as e:
            logger.error(f"Redis Cloud SSL connection failed: {e}")
            self._redis_client = None

    def _parse_host_port(self, host_string: str, default_port: int) -> tuple:
        """Parse host and port from host string"""
        if not host_string:
            return "localhost", default_port

        # Remove any whitespace and control characters
        host_string = host_string.strip()
        host_string = re.sub(r'[\r\n\t\0]+', '', host_string)

        # Handle case where port is included in REDIS_HOST (e.g., "host:port")
        if ':' in host_string and not host_string.startswith('['):  # Not IPv6
            # Check if the part after : is a valid port number
            # ...
            except ValueError:
                # Port is not a valid integer, use default
                return host_string, default_port

        return host_string, default_port

    def get_client(self) -> Optional[redis.Redis]:
        """Get Redis client instance"""
        return self._redis_client

    def is_healthy(self) -> bool:
        """Check if Redis connection is healthy"""
        if not self._redis_client:
            # ...
            return True
        except:
            return False

    def reconnect(self):
        """Reconnect to Redis"""
        self._connect()
```
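For context, a short usage sketch of this client; it assumes `core/redis_client.py` exposes a module-level `redis_client` instance, as the new test script below imports:

```python
# Illustrative usage of the singleton client exported by core/redis_client.py.
from core.redis_client import redis_client

client = redis_client.get_client()
if client is not None and redis_client.is_healthy():
    client.set("session:last_seen", "2024-01-01T00:00:00Z")
    print(client.get("session:last_seen"))
else:
    # Connection is down or unconfigured: try to re-establish it.
    redis_client.reconnect()
```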
test_redis_connection.py (ADDED)
```python
import sys
import os
from pathlib import Path

# Add project root to path
project_root = Path(__file__).parent
sys.path.append(str(project_root))

from core.redis_client import redis_client
from utils.config import config

def test_redis_connection():
    """Test Redis connection with current configuration"""
    print("Testing Redis connection...")
    print(f"REDIS_HOST: {config.redis_host}")
    print(f"REDIS_PORT: {config.redis_port}")
    print(f"REDIS_USERNAME: {config.redis_username}")
    print(f"REDIS_DISABLE_SSL: {config.redis_disable_ssl}")

    # Initialize Redis client
    client = redis_client.get_client()

    if client is None:
        print("❌ Redis client is None - connection failed")
        return False

    try:
        # Test ping
        result = client.ping()
        print(f"✅ Ping result: {result}")

        # Test basic set/get
        test_key = "redis_test_key"
        test_value = "redis_test_value"

        client.set(test_key, test_value)
        retrieved_value = client.get(test_key)

        if retrieved_value == test_value:
            print("✅ Set/Get test successful")
            # Clean up
            client.delete(test_key)
        else:
            print("❌ Set/Get test failed")

        return True
    except Exception as e:
        print(f"❌ Redis operation failed: {e}")
        return False

if __name__ == "__main__":
    success = test_redis_connection()
    if success:
        print("\n🎉 Redis connection test passed!")
    else:
        print("\n💥 Redis connection test failed!")
        sys.exit(1)
```