Add DB Backup
parent d159cadacc
commit 3d0bfbca2c
@@ -22,4 +22,10 @@ FRONTEND_URL=http://localhost:5174
AZURE_CLIENT_ID=db244cf5-eb11-4738-a2ea-5b0716c9ec0a
AZURE_CLIENT_SECRET=Zad8Q~qRBxaQq8up0lLXAq4pHzrVM2JFGFJhHaDp
AZURE_TENANT_ID=consumers
AZURE_REDIRECT_URI=http://localhost:8000/auth/azure/callback

# Cloudflare R2 Backup Configuration
R2_ENDPOINT=https://d4704b8c40b2f95b2c7bf7ee4ecc52f8.r2.cloudflarestorage.com
R2_ACCESS_KEY=1997b1e48a337c0dbe1f7552a08631b5
R2_SECRET_KEY=369694e39fedfedb254158c147171f5760de84fa2346d5d5d5a961f1f517dbc6
R2_BUCKET=my-recipes-db-bkp
93 backend/BACKUP_README.md Normal file
@@ -0,0 +1,93 @@
# Database Backup & Restore Scripts

## Overview
Automated database backup system that exports the PostgreSQL database, compresses it with gzip, and uploads it to Cloudflare R2 storage.

## Requirements
```bash
pip install boto3
```

## Configuration
All configuration is stored in the `.env` file:
- `R2_ENDPOINT`: Cloudflare R2 endpoint URL
- `R2_ACCESS_KEY`: R2 API access key
- `R2_SECRET_KEY`: R2 API secret key
- `R2_BUCKET`: R2 bucket name
- Database credentials (`DB_HOST`, `DB_PORT`, `DB_NAME`, `DB_USER`, `DB_PASSWORD`)
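The scripts refuse to start a backup if any of the R2 variables above are missing. A minimal sketch of that fail-fast check (the variable names match the list above; `missing_r2_vars` is a hypothetical helper, not part of the shipped scripts):

```python
import os

# The four R2 variables the backup scripts require (see the list above).
REQUIRED = ["R2_ENDPOINT", "R2_ACCESS_KEY", "R2_SECRET_KEY", "R2_BUCKET"]

def missing_r2_vars(env=os.environ):
    """Return the names of required R2 variables that are unset or empty."""
    return [name for name in REQUIRED if not env.get(name)]
```

Checking the names rather than raising immediately makes the error message actionable: you can report exactly which variables are absent from `.env`.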

## Usage

### Create Backup
```bash
cd backend
python backup_db.py
```

This will:
1. Export the database using `pg_dump`
2. Compress the dump with gzip (typically an 80-90% size reduction)
3. Upload it to R2 with a timestamp
4. List all backups in R2
5. Clean up old local backups (keeping the last 3)
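Step 2 relies on Python's standard-library `gzip`; SQL dumps are highly repetitive text, which is why the size reduction is so large. An illustrative in-memory sketch (the script itself streams file-to-file, but the compression level is the same):

```python
import gzip
import io

def gzip_bytes(data: bytes) -> bytes:
    """Compress a byte string with gzip at the level the script uses (9)."""
    buf = io.BytesIO()
    with gzip.GzipFile(fileobj=buf, mode="wb", compresslevel=9) as f:
        f.write(data)
    return buf.getvalue()
```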

### Restore from Backup
```bash
cd backend
python restore_db.py
```

This will:
1. List all available backups in R2
2. Let you select which backup to restore
3. Download and decompress the backup
4. Restore it to the database (with a confirmation prompt)

**WARNING**: Restore drops all existing tables and recreates them from the backup!

## Automated Backups

### Linux/Mac (Cron)
Add to crontab:
```bash
# Daily backup at 2 AM
0 2 * * * cd /path/to/backend && python backup_db.py >> backup.log 2>&1
```

### Windows (Task Scheduler)
Create a scheduled task:
1. Open Task Scheduler
2. Create Basic Task
3. Name: "Recipe DB Backup"
4. Trigger: Daily at 2:00 AM
5. Action: Start a program
   - Program: `python`
   - Arguments: `backup_db.py`
   - Start in: `C:\path\to\backend`

## Backup File Format
Files are named `recipes_db_YYYYMMDD_HHMMSS.sql.gz`

Example: `recipes_db_20251221_140530.sql.gz`
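Because the timestamp is embedded in the filename, backups can be sorted or filtered without any extra metadata. A sketch of extracting it (`parse_backup_timestamp` is a hypothetical helper, not shipped with the scripts):

```python
from datetime import datetime

def parse_backup_timestamp(filename: str) -> datetime:
    """Extract the datetime encoded in recipes_db_YYYYMMDD_HHMMSS.sql.gz."""
    stamp = filename.removeprefix("recipes_db_").removesuffix(".sql.gz")
    return datetime.strptime(stamp, "%Y%m%d_%H%M%S")
```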

## Storage
- Local backups are stored in `backend/backups/`
- R2 backups are stored in the `my-recipes-db-bkp` bucket
- Local backups are auto-cleaned (the last 3 are kept)
- R2 backups are never auto-deleted (clean up manually if needed)
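Since R2 backups are never auto-deleted, pruning them is a manual step. One way to sketch it: pick the stale keys from a `list_objects_v2` listing, keeping the newest N, mirroring the local cleanup policy. This is only the selection logic; the actual deletion would go through boto3's `delete_object`. `keys_to_prune` is a hypothetical helper, not part of the scripts:

```python
def keys_to_prune(objects, keep_last=3):
    """Given R2 object dicts with 'Key' and 'LastModified', return stale keys.

    Mirrors the local cleanup policy: the newest `keep_last` backups survive.
    """
    ordered = sorted(objects, key=lambda o: o["LastModified"], reverse=True)
    return [o["Key"] for o in ordered[keep_last:]]
```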

## Troubleshooting

### pg_dump not found
Install the PostgreSQL client tools:
- **Windows**: Install PostgreSQL and add it to PATH
- **Linux**: `sudo apt install postgresql-client`
- **Mac**: `brew install postgresql`

### Connection errors
Verify the database credentials in the `.env` file.

### R2 upload errors
- Check the R2 credentials
- Verify the bucket exists
- Ensure the API token has "Edit" permissions
164 backend/BACKUP_SYSTEM_COMPLETE.md Normal file
@@ -0,0 +1,164 @@
# Database Backup System - Complete Setup

## ✅ What's Been Implemented

### 1. **Backend API Endpoints** (Admin Only)
- `POST /admin/backup` - Trigger a manual backup
- `GET /admin/backups` - List all available backups
- `POST /admin/restore?filename=<name>` - Restore from a backup

### 2. **Frontend Admin Panel**
- New "ניהול" (Management) tab in the navigation (visible to admin users only)
- 🛡️ Admin button in the top bar
- Full backup management interface:
  - Create new backups instantly
  - View all backups with dates and sizes
  - Restore from any backup with confirmation

### 3. **Automated Weekly Backups**
- Batch script: `run_backup.bat`
- Full setup guide: `WEEKLY_BACKUP_SETUP.md`
- Configured for Windows Task Scheduler
## 🚀 How to Use

### **Manual Backup (Admin User)**

1. Log in with an admin account
2. Click the 🛡️ "ניהול" button in the top bar (or use the ניהול tab)
3. Click "צור גיבוי חדש" (Create New Backup)
4. The backup is created, compressed, and uploaded to R2
5. See the confirmation toast: "גיבוי נוצר בהצלחה! 📦" (Backup created successfully!)

### **Restore from Backup (Admin User)**

1. Go to the Admin Panel (🛡️ ניהול)
2. View all available backups in the table
3. Click the "שחזר" (Restore) button for the desired backup
4. Confirm the warning (this will delete the current data!)
5. The page will refresh automatically after the restore

### **Setup Weekly Automatic Backups**

Follow the instructions in `WEEKLY_BACKUP_SETUP.md`:

**Quick Steps:**
1. Open Task Scheduler (`Win + R` → `taskschd.msc`)
2. Create Task → "Recipe DB Weekly Backup"
3. Set the trigger: Weekly, Sunday, 2:00 AM
4. Set the action: Run `C:\Path\To\backend\run_backup.bat`
5. Configure it to run even when not logged in
## 📁 Files Created/Modified

### Backend
- ✅ `backup_restore_api.py` - Core backup/restore functions
- ✅ `main.py` - Added admin endpoints
- ✅ `requirements.txt` - Added the boto3 dependency
- ✅ `.env` - Added R2 credentials
- ✅ `run_backup.bat` - Windows batch script for scheduled tasks
- ✅ `BACKUP_README.md` - Complete documentation
- ✅ `WEEKLY_BACKUP_SETUP.md` - Task Scheduler setup guide

### Frontend
- ✅ `backupApi.js` - API calls for backup operations
- ✅ `components/AdminPanel.jsx` - Admin UI component
- ✅ `components/TopBar.jsx` - Added the admin button
- ✅ `App.jsx` - Added the admin view and navigation
- ✅ `App.css` - Added admin panel styles

## 🔐 Security

- **Admin-only access**: All backup endpoints check the `is_admin` flag
- **Non-admin users**: Cannot see the admin button or access backup endpoints
- **403 Forbidden**: Returned if a non-admin tries to access admin endpoints
## 💾 Backup Details

### What's Backed Up
- The complete PostgreSQL database (recipes_db)
- All tables: users, recipes, grocery lists, shares, notifications

### Backup Process
1. Uses `pg_dump` to export the database
2. Compresses with gzip (typically an 80-90% size reduction)
3. Uploads to Cloudflare R2 with a timestamp
4. Filename format: `recipes_db_YYYYMMDD_HHMMSS.sql.gz`
5. Local backups are auto-cleaned (the last 3 are kept)

### Restore Process
1. Downloads from R2
2. Decompresses the file
3. **Drops all existing tables** (CASCADE)
4. Restores from the SQL file
5. Cleans up temporary files
## 🧪 Testing

### Test Manual Backup
```bash
cd backend
python backup_db.py
```

### Test Manual Restore
```bash
cd backend
python restore_db.py
```

### Test via Web UI
1. Log in as admin
2. Navigate to the Admin Panel
3. Click "צור גיבוי חדש" (Create New Backup)
4. Check the R2 bucket for the new file
## ⚠️ Important Notes

1. **Restore is destructive**: It deletes ALL current data
2. **Admin access required**: Set the user's `is_admin = true` in the database
3. **R2 credentials**: Already configured in `.env`
4. **Weekly backups**: Manual setup is required (follow `WEEKLY_BACKUP_SETUP.md`)
5. **PostgreSQL tools**: `pg_dump` and `psql` must be in the system PATH

## 🔧 Troubleshooting

### "Admin access required" error
- Check whether the user has `is_admin = true` in the database
- Run `SELECT username, is_admin FROM users;` in psql

### Backup fails
- Check `backend/backup.log` for errors
- Verify the R2 credentials in `.env`
- Verify the database credentials in `.env`
- Test by running `python backup_db.py` manually

### Can't see the admin button
- Verify the user's `is_admin` flag in the database
- Refresh the page after changing admin status
- Check the browser console for errors

### Scheduled backup doesn't run
- Check Task Scheduler → Task History
- Verify the `run_backup.bat` path is correct
- Check `backend/backup.log` for errors
- Test the batch file manually first
## 📊 What Admins Can Do

✅ Create manual backups anytime
✅ View all backups with dates and sizes
✅ Restore from any backup point
✅ See the backup history in table format
✅ All regular user features (recipes, grocery lists, etc.)

## Next Steps

1. **✅ Test the system**: Create a manual backup from the Admin Panel
2. **📅 Set up weekly backups**: Follow `WEEKLY_BACKUP_SETUP.md`
3. **🔒 Secure admin access**: Only give admin rights to trusted users
4. **📝 Document your backup strategy**: When and how often you back up

---

**Your database is now protected with automated backups! 🎉**
131 backend/WEEKLY_BACKUP_SETUP.md Normal file
@@ -0,0 +1,131 @@
# Weekly Backup Setup - Windows Task Scheduler

This guide walks you through setting up automatic weekly backups of your database.

## Setup Instructions

### 1. Create Batch Script

Create a file `run_backup.bat` in the `backend` folder:

```batch
@echo off
cd /d "%~dp0"
python backup_db.py >> backup.log 2>&1
```

### 2. Open Task Scheduler

1. Press `Win + R`
2. Type `taskschd.msc`
3. Press Enter

### 3. Create New Task

1. Click "Create Task" (not "Create Basic Task")
2. In the **General** tab:
   - Name: `Recipe DB Weekly Backup`
   - Description: `Automatic weekly database backup to Cloudflare R2`
   - Select "Run whether user is logged on or not"
   - Check "Run with highest privileges"

### 4. Configure Trigger

1. Go to the **Triggers** tab
2. Click "New..."
3. Configure:
   - Begin the task: `On a schedule`
   - Settings: `Weekly`
   - Recur every: `1 weeks`
   - Days: Select `Sunday` (or your preferred day)
   - Time: `02:00:00` (2 AM)
   - Check "Enabled"
4. Click OK
### 5. Configure Action

1. Go to the **Actions** tab
2. Click "New..."
3. Configure:
   - Action: `Start a program`
   - Program/script: `C:\Path\To\backend\run_backup.bat`
     *(Replace with your actual path)*
   - Start in: `C:\Path\To\backend\`
     *(Replace with your actual path)*
4. Click OK

### 6. Additional Settings

1. Go to the **Conditions** tab:
   - Uncheck "Start the task only if the computer is on AC power"

2. Go to the **Settings** tab:
   - Check "Run task as soon as possible after a scheduled start is missed"
   - If the task fails, restart every: `10 minutes`
   - Attempt to restart up to: `3 times`

3. Click OK

### 7. Enter Password

- You'll be prompted to enter your Windows password
- This allows the task to run even when you're not logged in
## Verify Setup

### Test the Task

1. In Task Scheduler, find your task
2. Right-click → "Run"
3. Check `backend/backup.log` for the results

### View Scheduled Runs

- In Task Scheduler, select your task
- Check the "History" tab to see past runs

## Troubleshooting

### Task doesn't run

- Check Task Scheduler → Task History for errors
- Verify Python is in the system PATH
- Try running `run_backup.bat` manually first

### No log file created

- Check the file permissions in the backend folder
- Verify the "Start in" path is correct

### Backup fails

- Check `backend/backup.log` for error messages
- Verify the database credentials in `.env`
- Verify the R2 credentials in `.env`
- Test by running `python backup_db.py` manually
## Change Backup Schedule

1. Open Task Scheduler
2. Find "Recipe DB Weekly Backup"
3. Right-click → Properties
4. Go to the Triggers tab
5. Edit the trigger to change the day/time
6. Click OK

## Disable Automatic Backups

1. Open Task Scheduler
2. Find "Recipe DB Weekly Backup"
3. Right-click → Disable

## View Backup Log

Check `backend/backup.log` to see the backup history:

```batch
cd backend
type backup.log
```

Or open it in Notepad.
209 backend/backup_db.py Normal file
@@ -0,0 +1,209 @@
"""
Database backup script for R2 storage.
Exports the PostgreSQL database, compresses it, and uploads it to Cloudflare R2.
"""
import os
import subprocess
import gzip
import shutil
from datetime import datetime
from pathlib import Path
import boto3
from botocore.config import Config
from dotenv import load_dotenv

# Load environment variables
load_dotenv()

# R2 Configuration
R2_ENDPOINT = os.getenv("R2_ENDPOINT")
R2_ACCESS_KEY = os.getenv("R2_ACCESS_KEY")
R2_SECRET_KEY = os.getenv("R2_SECRET_KEY")
R2_BUCKET = os.getenv("R2_BUCKET")

# Database Configuration
DB_HOST = os.getenv("DB_HOST", "localhost")
DB_PORT = os.getenv("DB_PORT", "5432")
DB_NAME = os.getenv("DB_NAME", "recipes_db")
DB_USER = os.getenv("DB_USER", "recipes_user")
DB_PASSWORD = os.getenv("DB_PASSWORD", "recipes_password")

# Backup directory
BACKUP_DIR = Path(__file__).parent / "backups"
BACKUP_DIR.mkdir(exist_ok=True)

def create_db_dump():
    """Create a PostgreSQL database dump."""
    timestamp = datetime.now().strftime("%Y%m%d_%H%M%S")
    dump_file = BACKUP_DIR / f"recipes_db_{timestamp}.sql"

    print(f"Creating database dump: {dump_file}")

    # Set the PGPASSWORD environment variable for pg_dump
    env = os.environ.copy()
    env['PGPASSWORD'] = DB_PASSWORD

    # Run pg_dump
    cmd = [
        "pg_dump",
        "-h", DB_HOST,
        "-p", DB_PORT,
        "-U", DB_USER,
        "-d", DB_NAME,
        "-f", str(dump_file),
        "--no-owner",  # Don't include ownership commands
        "--no-acl",    # Don't include access privileges
    ]

    try:
        subprocess.run(cmd, env=env, check=True, capture_output=True, text=True)
        print(f"✓ Database dump created: {dump_file}")
        return dump_file
    except subprocess.CalledProcessError as e:
        print(f"✗ Error creating database dump: {e.stderr}")
        raise

def compress_file(file_path):
    """Compress a file using gzip."""
    compressed_file = Path(str(file_path) + ".gz")

    print(f"Compressing {file_path.name}...")

    # Record the original size before the file is deleted, so the
    # compression ratio below is computed against the real size
    original_size = file_path.stat().st_size

    with open(file_path, 'rb') as f_in:
        with gzip.open(compressed_file, 'wb', compresslevel=9) as f_out:
            shutil.copyfileobj(f_in, f_out)

    # Remove the uncompressed file
    file_path.unlink()

    # Report the compression ratio
    compressed_size = compressed_file.stat().st_size
    ratio = (1 - compressed_size / original_size) * 100 if original_size > 0 else 0

    print(f"✓ Compressed to {compressed_file.name}")
    print(f"  Original: {original_size / 1024:.2f} KB")
    print(f"  Compressed: {compressed_size / 1024:.2f} KB")
    print(f"  Ratio: {ratio:.1f}% reduction")

    return compressed_file

def upload_to_r2(file_path):
    """Upload a file to Cloudflare R2."""
    print(f"Uploading {file_path.name} to R2...")

    # Configure an S3 client for R2
    s3_client = boto3.client(
        's3',
        endpoint_url=R2_ENDPOINT,
        aws_access_key_id=R2_ACCESS_KEY,
        aws_secret_access_key=R2_SECRET_KEY,
        config=Config(signature_version='s3v4'),
        region_name='auto'
    )

    # Upload the file
    try:
        s3_client.upload_file(
            str(file_path),
            R2_BUCKET,
            file_path.name,
            ExtraArgs={
                'Metadata': {
                    'backup-date': datetime.now().isoformat(),
                    'db-name': DB_NAME,
                }
            }
        )
        print(f"✓ Uploaded to R2: s3://{R2_BUCKET}/{file_path.name}")
        return True
    except Exception as e:
        print(f"✗ Error uploading to R2: {e}")
        raise

def list_r2_backups():
    """List all backups in the R2 bucket."""
    print(f"\nListing backups in R2 bucket: {R2_BUCKET}")

    s3_client = boto3.client(
        's3',
        endpoint_url=R2_ENDPOINT,
        aws_access_key_id=R2_ACCESS_KEY,
        aws_secret_access_key=R2_SECRET_KEY,
        config=Config(signature_version='s3v4'),
        region_name='auto'
    )

    try:
        response = s3_client.list_objects_v2(Bucket=R2_BUCKET)

        if 'Contents' not in response:
            print("No backups found")
            return

        print(f"\nFound {len(response['Contents'])} backup(s):")
        for obj in sorted(response['Contents'], key=lambda x: x['LastModified'], reverse=True):
            size_mb = obj['Size'] / (1024 * 1024)
            print(f"  - {obj['Key']}")
            print(f"    Size: {size_mb:.2f} MB")
            print(f"    Date: {obj['LastModified']}")

    except Exception as e:
        print(f"✗ Error listing backups: {e}")


def cleanup_old_local_backups(keep_last=3):
    """Keep only the last N local backups."""
    backups = sorted(BACKUP_DIR.glob("*.sql.gz"), key=lambda x: x.stat().st_mtime, reverse=True)

    if len(backups) > keep_last:
        print(f"\nCleaning up old local backups (keeping last {keep_last})...")
        for backup in backups[keep_last:]:
            print(f"  Removing: {backup.name}")
            backup.unlink()

def main():
    """Main backup process."""
    print("=" * 60)
    print("Database Backup to Cloudflare R2")
    print("=" * 60)
    print()

    try:
        # Verify R2 credentials
        if not all([R2_ENDPOINT, R2_ACCESS_KEY, R2_SECRET_KEY, R2_BUCKET]):
            raise ValueError("Missing R2 credentials in environment variables")

        # Create the database dump
        dump_file = create_db_dump()

        # Compress the dump
        compressed_file = compress_file(dump_file)

        # Upload to R2
        upload_to_r2(compressed_file)

        # List all backups
        list_r2_backups()

        # Clean up old local backups
        cleanup_old_local_backups(keep_last=3)

        print("\n" + "=" * 60)
        print("✓ Backup completed successfully!")
        print("=" * 60)

    except Exception as e:
        print("\n" + "=" * 60)
        print(f"✗ Backup failed: {e}")
        print("=" * 60)
        raise


if __name__ == "__main__":
    main()
245 backend/backup_restore_api.py Normal file
@@ -0,0 +1,245 @@
"""
Backup and Restore API endpoints for database management.
Admin-only access required.
"""
import os
import subprocess
import gzip
import shutil
from datetime import datetime
from typing import List
import boto3
from botocore.exceptions import ClientError
from dotenv import load_dotenv

load_dotenv()


def get_r2_client():
    """Get a configured R2 client."""
    return boto3.client(
        's3',
        endpoint_url=os.getenv('R2_ENDPOINT'),
        aws_access_key_id=os.getenv('R2_ACCESS_KEY'),
        aws_secret_access_key=os.getenv('R2_SECRET_KEY'),
        region_name='auto'
    )

def create_db_dump() -> str:
    """Create a database dump file."""
    timestamp = datetime.now().strftime('%Y%m%d_%H%M%S')
    backup_dir = os.path.join(os.path.dirname(__file__), 'backups')
    os.makedirs(backup_dir, exist_ok=True)

    dump_file = os.path.join(backup_dir, f'recipes_db_{timestamp}.sql')

    db_host = os.getenv('DB_HOST', 'localhost')
    db_port = os.getenv('DB_PORT', '5432')
    db_name = os.getenv('DB_NAME', 'recipes_db')
    db_user = os.getenv('DB_USER', 'postgres')
    db_password = os.getenv('DB_PASSWORD', 'postgres')

    env = os.environ.copy()
    env['PGPASSWORD'] = db_password

    cmd = [
        'pg_dump',
        '-h', db_host,
        '-p', db_port,
        '-U', db_user,
        '-d', db_name,
        '--no-owner',
        '--no-acl',
        '-f', dump_file
    ]

    result = subprocess.run(cmd, env=env, capture_output=True, text=True)

    if result.returncode != 0:
        raise Exception(f"pg_dump failed: {result.stderr}")

    return dump_file


def compress_file(file_path: str) -> str:
    """Compress a file with gzip."""
    compressed_path = f"{file_path}.gz"

    with open(file_path, 'rb') as f_in:
        with gzip.open(compressed_path, 'wb', compresslevel=9) as f_out:
            shutil.copyfileobj(f_in, f_out)

    os.remove(file_path)
    return compressed_path

def upload_to_r2(file_path: str) -> str:
    """Upload a file to R2."""
    s3_client = get_r2_client()
    bucket_name = os.getenv('R2_BUCKET')
    file_name = os.path.basename(file_path)

    try:
        s3_client.upload_file(file_path, bucket_name, file_name)
        return file_name
    except ClientError as e:
        raise Exception(f"R2 upload failed: {str(e)}")


def list_r2_backups() -> List[dict]:
    """List all backups in R2."""
    s3_client = get_r2_client()
    bucket_name = os.getenv('R2_BUCKET')

    try:
        response = s3_client.list_objects_v2(Bucket=bucket_name)

        if 'Contents' not in response:
            return []

        backups = []
        for obj in response['Contents']:
            backups.append({
                'filename': obj['Key'],
                'size': obj['Size'],
                'last_modified': obj['LastModified'].isoformat()
            })

        backups.sort(key=lambda x: x['last_modified'], reverse=True)
        return backups

    except ClientError as e:
        raise Exception(f"Failed to list R2 backups: {str(e)}")

def download_from_r2(filename: str) -> str:
    """Download a backup from R2."""
    s3_client = get_r2_client()
    bucket_name = os.getenv('R2_BUCKET')

    backup_dir = os.path.join(os.path.dirname(__file__), 'backups')
    os.makedirs(backup_dir, exist_ok=True)

    local_path = os.path.join(backup_dir, filename)

    try:
        s3_client.download_file(bucket_name, filename, local_path)
        return local_path
    except ClientError as e:
        raise Exception(f"R2 download failed: {str(e)}")


def decompress_file(compressed_path: str) -> str:
    """Decompress a gzipped file."""
    if not compressed_path.endswith('.gz'):
        raise ValueError("File must be gzipped (.gz)")

    decompressed_path = compressed_path[:-3]

    with gzip.open(compressed_path, 'rb') as f_in:
        with open(decompressed_path, 'wb') as f_out:
            shutil.copyfileobj(f_in, f_out)

    return decompressed_path

def restore_database(sql_file: str) -> None:
    """Restore the database from an SQL file."""
    db_host = os.getenv('DB_HOST', 'localhost')
    db_port = os.getenv('DB_PORT', '5432')
    db_name = os.getenv('DB_NAME', 'recipes_db')
    db_user = os.getenv('DB_USER', 'postgres')
    db_password = os.getenv('DB_PASSWORD', 'postgres')

    env = os.environ.copy()
    env['PGPASSWORD'] = db_password

    # Drop all tables first
    drop_cmd = [
        'psql',
        '-h', db_host,
        '-p', db_port,
        '-U', db_user,
        '-d', db_name,
        '-c', 'DROP SCHEMA public CASCADE; CREATE SCHEMA public;'
    ]

    drop_result = subprocess.run(drop_cmd, env=env, capture_output=True, text=True)

    if drop_result.returncode != 0:
        raise Exception(f"Failed to drop schema: {drop_result.stderr}")

    # Restore from the backup
    restore_cmd = [
        'psql',
        '-h', db_host,
        '-p', db_port,
        '-U', db_user,
        '-d', db_name,
        '-f', sql_file
    ]

    restore_result = subprocess.run(restore_cmd, env=env, capture_output=True, text=True)

    if restore_result.returncode != 0:
        raise Exception(f"Database restore failed: {restore_result.stderr}")

def perform_backup() -> dict:
    """Perform the complete backup process."""
    try:
        # Create the dump
        dump_file = create_db_dump()

        # Compress
        compressed_file = compress_file(dump_file)

        # Upload to R2
        r2_filename = upload_to_r2(compressed_file)

        # Get the file size
        file_size = os.path.getsize(compressed_file)

        # Clean up the local file
        os.remove(compressed_file)

        return {
            'success': True,
            'filename': r2_filename,
            'size': file_size,
            'timestamp': datetime.now().isoformat()
        }
    except Exception as e:
        return {
            'success': False,
            'error': str(e)
        }


def perform_restore(filename: str) -> dict:
    """Perform the complete restore process."""
    try:
        # Download from R2
        compressed_file = download_from_r2(filename)

        # Decompress
        sql_file = decompress_file(compressed_file)

        # Restore the database
        restore_database(sql_file)

        # Clean up
        os.remove(compressed_file)
        os.remove(sql_file)

        return {
            'success': True,
            'filename': filename,
            'timestamp': datetime.now().isoformat()
        }
    except Exception as e:
        return {
            'success': False,
            'error': str(e)
        }
@@ -56,6 +56,12 @@ from notification_db_utils import (
    delete_notification,
)

from backup_restore_api import (
    perform_backup,
    perform_restore,
    list_r2_backups,
)

from email_utils import (
    generate_verification_code,
    send_verification_email,
@@ -1153,5 +1159,53 @@ def delete_notification_endpoint(
    return {"message": "Notification deleted"}


# ===== Backup & Restore Endpoints (Admin Only) =====

@app.post("/admin/backup")
def trigger_backup(
    current_user: dict = Depends(get_current_user)
):
    """Trigger a manual database backup (admin only)."""
    if not current_user.get("is_admin", False):
        raise HTTPException(status_code=403, detail="Admin access required")

    result = perform_backup()
    if not result['success']:
        raise HTTPException(status_code=500, detail=result['error'])

    return result


@app.get("/admin/backups")
def list_backups(
    current_user: dict = Depends(get_current_user)
):
    """List all available backups (admin only)."""
    if not current_user.get("is_admin", False):
        raise HTTPException(status_code=403, detail="Admin access required")

    try:
        backups = list_r2_backups()
        return {"backups": backups}
    except Exception as e:
        raise HTTPException(status_code=500, detail=str(e))


@app.post("/admin/restore")
def trigger_restore(
    filename: str,
    current_user: dict = Depends(get_current_user)
):
    """Restore the database from a backup (admin only)."""
    if not current_user.get("is_admin", False):
        raise HTTPException(status_code=403, detail="Admin access required")

    result = perform_restore(filename)
    if not result['success']:
        raise HTTPException(status_code=500, detail=result['error'])

    return result


if __name__ == "__main__":
    uvicorn.run("main:app", host="0.0.0.0", port=8000, reload=True)
@@ -20,3 +20,6 @@ aiosmtplib==3.0.2
authlib==1.3.0
httpx==0.27.0
itsdangerous==2.1.2

# Backup to R2
boto3==1.34.17
218 backend/restore_db.py Normal file
@@ -0,0 +1,218 @@
"""
Database restore script from R2 storage.
Downloads a compressed backup from R2 and restores it to PostgreSQL.
"""
import os
import subprocess
import gzip
from pathlib import Path
import boto3
from botocore.config import Config
from dotenv import load_dotenv

# Load environment variables
load_dotenv()

# R2 Configuration
R2_ENDPOINT = os.getenv("R2_ENDPOINT")
R2_ACCESS_KEY = os.getenv("R2_ACCESS_KEY")
R2_SECRET_KEY = os.getenv("R2_SECRET_KEY")
R2_BUCKET = os.getenv("R2_BUCKET")

# Database Configuration
DB_HOST = os.getenv("DB_HOST", "localhost")
DB_PORT = os.getenv("DB_PORT", "5432")
DB_NAME = os.getenv("DB_NAME", "recipes_db")
DB_USER = os.getenv("DB_USER", "recipes_user")
DB_PASSWORD = os.getenv("DB_PASSWORD", "recipes_password")

# Restore directory
RESTORE_DIR = Path(__file__).parent / "restores"
RESTORE_DIR.mkdir(exist_ok=True)

def list_r2_backups():
    """List all available backups in R2."""
    s3_client = boto3.client(
        's3',
        endpoint_url=R2_ENDPOINT,
        aws_access_key_id=R2_ACCESS_KEY,
        aws_secret_access_key=R2_SECRET_KEY,
        config=Config(signature_version='s3v4'),
        region_name='auto'
    )

    try:
        response = s3_client.list_objects_v2(Bucket=R2_BUCKET)

        if 'Contents' not in response:
            return []

        backups = sorted(response['Contents'], key=lambda x: x['LastModified'], reverse=True)
        return backups

    except Exception as e:
        print(f"✗ Error listing backups: {e}")
        return []

def download_from_r2(backup_name):
    """Download a backup file from R2."""
    local_file = RESTORE_DIR / backup_name

    print(f"Downloading {backup_name} from R2...")

    s3_client = boto3.client(
        's3',
        endpoint_url=R2_ENDPOINT,
        aws_access_key_id=R2_ACCESS_KEY,
        aws_secret_access_key=R2_SECRET_KEY,
        config=Config(signature_version='s3v4'),
        region_name='auto'
    )

    try:
        s3_client.download_file(R2_BUCKET, backup_name, str(local_file))
        size_mb = local_file.stat().st_size / (1024 * 1024)
        print(f"✓ Downloaded: {local_file.name} ({size_mb:.2f} MB)")
        return local_file
    except Exception as e:
        print(f"✗ Error downloading from R2: {e}")
        raise

def decompress_file(compressed_file):
|
||||
"""Decompress gzip file"""
|
||||
decompressed_file = Path(str(compressed_file).replace('.gz', ''))
|
||||
|
||||
print(f"Decompressing {compressed_file.name}...")
|
||||
|
||||
with gzip.open(compressed_file, 'rb') as f_in:
|
||||
with open(decompressed_file, 'wb') as f_out:
|
||||
f_out.write(f_in.read())
|
||||
|
||||
compressed_size = compressed_file.stat().st_size
|
||||
decompressed_size = decompressed_file.stat().st_size
|
||||
|
||||
print(f"✓ Decompressed to {decompressed_file.name}")
|
||||
print(f" Compressed: {compressed_size / 1024:.2f} KB")
|
||||
print(f" Decompressed: {decompressed_size / 1024:.2f} KB")
|
||||
|
||||
return decompressed_file
|
||||
|
||||
|
||||
def restore_database(sql_file):
|
||||
"""Restore PostgreSQL database from SQL file"""
|
||||
print(f"\nRestoring database from {sql_file.name}...")
|
||||
print("WARNING: This will overwrite the current database!")
|
||||
|
||||
response = input("Are you sure you want to continue? (yes/no): ")
|
||||
if response.lower() != 'yes':
|
||||
print("Restore cancelled")
|
||||
return False
|
||||
|
||||
# Set PGPASSWORD environment variable
|
||||
env = os.environ.copy()
|
||||
env['PGPASSWORD'] = DB_PASSWORD
|
||||
|
||||
# Drop and recreate database (optional, comment out if you want to merge)
|
||||
print("Dropping existing tables...")
|
||||
drop_cmd = [
|
||||
"psql",
|
||||
"-h", DB_HOST,
|
||||
"-p", DB_PORT,
|
||||
"-U", DB_USER,
|
||||
"-d", DB_NAME,
|
||||
"-c", "DROP SCHEMA public CASCADE; CREATE SCHEMA public;"
|
||||
]
|
||||
|
||||
try:
|
||||
subprocess.run(drop_cmd, env=env, check=True, capture_output=True, text=True)
|
||||
except subprocess.CalledProcessError as e:
|
||||
print(f"Warning: Could not drop schema: {e.stderr}")
|
||||
|
||||
# Restore from backup
|
||||
print("Restoring database...")
|
||||
restore_cmd = [
|
||||
"psql",
|
||||
"-h", DB_HOST,
|
||||
"-p", DB_PORT,
|
||||
"-U", DB_USER,
|
||||
"-d", DB_NAME,
|
||||
"-f", str(sql_file)
|
||||
]
|
||||
|
||||
try:
|
||||
subprocess.run(restore_cmd, env=env, check=True, capture_output=True, text=True)
|
||||
print("✓ Database restored successfully!")
|
||||
return True
|
||||
except subprocess.CalledProcessError as e:
|
||||
print(f"✗ Error restoring database: {e.stderr}")
|
||||
raise
|
||||
|
||||
|
||||
def main():
|
||||
"""Main restore process"""
|
||||
print("=" * 60)
|
||||
print("Database Restore from Cloudflare R2")
|
||||
print("=" * 60)
|
||||
print()
|
||||
|
||||
try:
|
||||
# Verify R2 credentials
|
||||
if not all([R2_ENDPOINT, R2_ACCESS_KEY, R2_SECRET_KEY, R2_BUCKET]):
|
||||
raise ValueError("Missing R2 credentials in environment variables")
|
||||
|
||||
# List available backups
|
||||
print("Available backups:")
|
||||
backups = list_r2_backups()
|
||||
|
||||
if not backups:
|
||||
print("No backups found in R2")
|
||||
return
|
||||
|
||||
for i, backup in enumerate(backups, 1):
|
||||
size_mb = backup['Size'] / (1024 * 1024)
|
||||
print(f"{i}. {backup['Key']}")
|
||||
print(f" Size: {size_mb:.2f} MB, Date: {backup['LastModified']}")
|
||||
print()
|
||||
|
||||
# Select backup
|
||||
choice = input(f"Select backup to restore (1-{len(backups)}) or 'q' to quit: ")
|
||||
|
||||
if choice.lower() == 'q':
|
||||
print("Restore cancelled")
|
||||
return
|
||||
|
||||
try:
|
||||
backup_index = int(choice) - 1
|
||||
if backup_index < 0 or backup_index >= len(backups):
|
||||
raise ValueError()
|
||||
except ValueError:
|
||||
print("Invalid selection")
|
||||
return
|
||||
|
||||
selected_backup = backups[backup_index]['Key']
|
||||
|
||||
# Download backup
|
||||
compressed_file = download_from_r2(selected_backup)
|
||||
|
||||
# Decompress backup
|
||||
sql_file = decompress_file(compressed_file)
|
||||
|
||||
# Restore database
|
||||
restore_database(sql_file)
|
||||
|
||||
print("\n" + "=" * 60)
|
||||
print("✓ Restore completed successfully!")
|
||||
print("=" * 60)
|
||||
|
||||
except Exception as e:
|
||||
print("\n" + "=" * 60)
|
||||
print(f"✗ Restore failed: {e}")
|
||||
print("=" * 60)
|
||||
raise
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
main()
|
||||
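One caveat with `decompress_file` in the restore script above: it reads the entire compressed dump into memory (`f_in.read()`) before writing it out. For large dumps, a streamed copy with `shutil.copyfileobj` keeps memory use flat. A minimal sketch of that alternative (the function name is mine, not part of the commit):

```python
import gzip
import shutil
from pathlib import Path


def decompress_streamed(compressed_file: Path) -> Path:
    """Decompress a .gz file chunk by chunk instead of all at once."""
    decompressed_file = compressed_file.with_suffix("")  # drop the trailing .gz
    with gzip.open(compressed_file, "rb") as f_in, open(decompressed_file, "wb") as f_out:
        shutil.copyfileobj(f_in, f_out)  # copies in fixed-size buffers
    return decompressed_file
```

`copyfileobj` defaults to 64 KiB buffers, so peak memory stays constant regardless of dump size.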
1496	backend/restores/recipes_db_20251221_030046.sql	Normal file
File diff suppressed because one or more lines are too long
BIN	backend/restores/recipes_db_20251221_030046.sql.gz	Normal file
Binary file not shown.
3	backend/run_backup.bat	Normal file
@@ -0,0 +1,3 @@
@echo off
cd /d "%~dp0"
python backup_db.py >> backup.log 2>&1
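`run_backup.bat` above is written to be triggered by an external scheduler (e.g. Windows Task Scheduler), appending output to `backup.log`. Where no OS scheduler is available, a stdlib-only Python loop can stand in. This is a sketch under the assumption of a daily 03:00 run; the run time and invocation are mine, not part of the commit:

```python
import datetime
import subprocess
import time


def seconds_until(hour: int, minute: int = 0) -> float:
    """Seconds from now until the next occurrence of hour:minute."""
    now = datetime.datetime.now()
    target = now.replace(hour=hour, minute=minute, second=0, microsecond=0)
    if target <= now:
        target += datetime.timedelta(days=1)  # already passed today; use tomorrow
    return (target - now).total_seconds()


def run_daily_backup() -> None:
    """Sleep until 03:00 each day, then invoke the backup script."""
    while True:
        time.sleep(seconds_until(3))
        subprocess.run(["python", "backup_db.py"], check=False)
```

An OS scheduler is still preferable in production, since it survives reboots; this loop does not.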
@@ -1978,3 +1978,232 @@ html {
   margin-left: 0.5rem;
 }
+
+/* Admin Panel */
+.admin-view {
+  width: 100%;
+  max-width: 1600px;
+  margin: 0 auto;
+  padding: 2rem;
+}
+
+.admin-panel {
+  background: var(--panel-bg);
+  border-radius: 12px;
+  padding: 2rem;
+  box-shadow: var(--shadow-md);
+}
+
+.admin-header {
+  display: flex;
+  justify-content: space-between;
+  align-items: center;
+  margin-bottom: 2rem;
+  flex-wrap: wrap;
+  gap: 1rem;
+}
+
+.admin-header h2 {
+  margin: 0;
+  color: var(--text-main);
+}
+
+.backups-list {
+  margin-top: 1.5rem;
+}
+
+.backups-table {
+  width: 100%;
+  table-layout: auto;
+  border-collapse: collapse;
+  background: var(--surface-bg);
+  border-radius: 8px;
+  overflow: hidden;
+}
+
+.backups-table thead {
+  background: var(--border-subtle);
+}
+
+.backups-table th,
+.backups-table td {
+  padding: 0.75rem;
+  text-align: right;
+  border-bottom: 1px solid var(--border-subtle);
+  white-space: nowrap;
+}
+
+.backups-table th {
+  font-weight: 600;
+  color: var(--text-main);
+  font-size: 0.9rem;
+}
+
+.backups-table td {
+  color: var(--text-muted);
+}
+
+/* Column widths */
+.backups-table .col-filename,
+.backups-table td.filename {
+  width: auto;
+}
+
+.backups-table .col-date,
+.backups-table td.date {
+  width: auto;
+}
+
+.backups-table .col-size,
+.backups-table td.size {
+  width: auto;
+}
+
+.backups-table .col-actions,
+.backups-table td.actions {
+  width: auto;
+  text-align: center;
+}
+
+.backups-table td.filename {
+  color: var(--text-main);
+  font-family: 'Courier New', monospace;
+  font-size: 0.85rem;
+}
+
+.backups-table tbody tr:last-child td {
+  border-bottom: none;
+}
+
+.backups-table tbody tr:hover {
+  background: var(--hover-bg);
+}
+
+.btn.small {
+  padding: 0.4rem 0.8rem;
+  font-size: 0.85rem;
+}
+
+.btn.danger {
+  background: #dc2626;
+  color: white;
+}
+
+.btn.danger:hover {
+  background: #b91c1c;
+}
+
+.restore-warning {
+  text-align: center;
+  padding: 1rem;
+}
+
+.restore-warning p {
+  margin: 0.5rem 0;
+  color: var(--text-main);
+}
+
+.restore-warning .filename-highlight {
+  font-family: 'Courier New', monospace;
+  color: var(--accent);
+  font-weight: 600;
+  margin: 1rem 0;
+}
+
+/* Restore Progress */
+.restore-progress {
+  text-align: center;
+  padding: 2rem 1rem;
+}
+
+.progress-bar-container {
+  width: 100%;
+  height: 8px;
+  background: var(--border-subtle);
+  border-radius: 999px;
+  overflow: hidden;
+  margin: 1.5rem 0;
+}
+
+.progress-bar-fill {
+  height: 100%;
+  background: linear-gradient(90deg, var(--accent), #10b981);
+  border-radius: 999px;
+  animation: progressAnimation 2s ease-in-out infinite;
+}
+
+@keyframes progressAnimation {
+  0% {
+    width: 0%;
+    opacity: 0.6;
+  }
+  50% {
+    width: 70%;
+    opacity: 1;
+  }
+  100% {
+    width: 100%;
+    opacity: 0.8;
+  }
+}
+
+.progress-text {
+  margin: 0.5rem 0;
+  color: var(--text-main);
+  font-weight: 500;
+}
+
+.progress-text-muted {
+  margin: 1rem 0 0 0;
+  color: var(--text-muted);
+  font-size: 0.9rem;
+}
+
+.loading,
+.empty-state {
+  text-align: center;
+  padding: 2rem;
+  color: var(--text-muted);
+}
+
+@media (max-width: 768px) {
+  .admin-view {
+    padding: 1rem;
+  }
+
+  .admin-panel {
+    padding: 1rem;
+  }
+
+  .backups-table {
+    font-size: 0.8rem;
+  }
+
+  .backups-table th,
+  .backups-table td {
+    padding: 0.5rem;
+  }
+
+  .backups-table td.filename {
+    font-size: 0.75rem;
+  }
+
+  .backups-table .col-filename,
+  .backups-table td.filename {
+    width: 35%;
+  }
+
+  .backups-table .col-date,
+  .backups-table td.date {
+    width: 32%;
+  }
+
+  .backups-table .col-size,
+  .backups-table td.size {
+    width: 18%;
+  }
+
+  .backups-table .col-actions,
+  .backups-table td.actions {
+    width: 15%;
+  }
+}
@@ -7,6 +7,7 @@ import RecipeDetails from "./components/RecipeDetails";
 import RecipeFormDrawer from "./components/RecipeFormDrawer";
 import GroceryLists from "./components/GroceryLists";
 import PinnedGroceryLists from "./components/PinnedGroceryLists";
+import AdminPanel from "./components/AdminPanel";
 import Modal from "./components/Modal";
 import ToastContainer from "./components/ToastContainer";
 import ThemeToggle from "./components/ThemeToggle";
@@ -28,7 +29,7 @@ function App() {
     } catch {
       return "recipes";
     }
-  }); // "recipes" or "grocery-lists"
+  }); // "recipes", "grocery-lists", or "admin"

   const [selectedGroceryListId, setSelectedGroceryListId] = useState(null);

@@ -402,6 +403,7 @@ function App() {
             setCurrentView("grocery-lists");
             setSelectedGroceryListId(listId);
           }}
+          onAdminClick={() => setCurrentView("admin")}
         />
       )}

@@ -458,11 +460,23 @@ function App() {
           >
            🛒 רשימות קניות
           </button>
+          {user?.is_admin && (
+            <button
+              className={`nav-tab ${currentView === "admin" ? "active" : ""}`}
+              onClick={() => setCurrentView("admin")}
+            >
+              🛡️ ניהול
+            </button>
+          )}
         </nav>
       )}

       <main className="layout">
-        {currentView === "grocery-lists" ? (
+        {currentView === "admin" ? (
+          <div className="admin-view">
+            <AdminPanel onShowToast={addToast} />
+          </div>
+        ) : currentView === "grocery-lists" ? (
           <GroceryLists
             user={user}
             onShowToast={addToast}
66	frontend/src/backupApi.js	Normal file
@@ -0,0 +1,66 @@
import { getToken } from './authApi';

const API_BASE_URL = window.ENV?.API_URL || 'http://localhost:8000';

/**
 * Trigger a manual database backup (admin only)
 */
export async function triggerBackup() {
  const token = getToken();
  const response = await fetch(`${API_BASE_URL}/admin/backup`, {
    method: 'POST',
    headers: {
      'Authorization': `Bearer ${token}`,
      'Content-Type': 'application/json',
    },
  });

  if (!response.ok) {
    const error = await response.json();
    throw new Error(error.detail || 'Failed to create backup');
  }

  return response.json();
}

/**
 * List all available backups (admin only)
 */
export async function listBackups() {
  const token = getToken();
  const response = await fetch(`${API_BASE_URL}/admin/backups`, {
    method: 'GET',
    headers: {
      'Authorization': `Bearer ${token}`,
      'Content-Type': 'application/json',
    },
  });

  if (!response.ok) {
    const error = await response.json();
    throw new Error(error.detail || 'Failed to list backups');
  }

  return response.json();
}

/**
 * Restore database from a backup (admin only)
 */
export async function restoreBackup(filename) {
  const token = getToken();
  const response = await fetch(`${API_BASE_URL}/admin/restore?filename=${encodeURIComponent(filename)}`, {
    method: 'POST',
    headers: {
      'Authorization': `Bearer ${token}`,
      'Content-Type': 'application/json',
    },
  });

  if (!response.ok) {
    const error = await response.json();
    throw new Error(error.detail || 'Failed to restore backup');
  }

  return response.json();
}
177	frontend/src/components/AdminPanel.jsx	Normal file
@@ -0,0 +1,177 @@
import { useState, useEffect } from 'react';
import { triggerBackup, listBackups, restoreBackup } from '../backupApi';
import Modal from './Modal';

function AdminPanel({ onShowToast }) {
  const [backups, setBackups] = useState([]);
  const [loading, setLoading] = useState(false);
  const [restoreModal, setRestoreModal] = useState({ isOpen: false, filename: '' });
  const [restoring, setRestoring] = useState(false);

  useEffect(() => {
    loadBackups();
  }, []);

  const loadBackups = async () => {
    try {
      setLoading(true);
      const data = await listBackups();
      setBackups(data.backups || []);
    } catch (error) {
      onShowToast(error.message || 'שגיאה בטעינת גיבויים', 'error');
    } finally {
      setLoading(false);
    }
  };

  const handleCreateBackup = async () => {
    try {
      setLoading(true);
      await triggerBackup();
      onShowToast('גיבוי נוצר בהצלחה! 📦', 'success');
      loadBackups(); // Refresh list
    } catch (error) {
      onShowToast(error.message || 'שגיאה ביצירת גיבוי', 'error');
    } finally {
      setLoading(false);
    }
  };

  const handleRestoreClick = (filename) => {
    setRestoreModal({ isOpen: true, filename });
  };

  const handleRestoreConfirm = async () => {
    console.log('Restore confirm clicked, filename:', restoreModal.filename);
    setRestoreModal({ isOpen: false, filename: '' });
    setRestoring(true);

    try {
      console.log('Starting restore...');
      const result = await restoreBackup(restoreModal.filename);
      console.log('Restore result:', result);
      onShowToast('שחזור הושלם בהצלחה! ♻️ מרענן את הדף...', 'success');

      // Refresh page after 2 seconds to reload all data
      setTimeout(() => {
        window.location.reload();
      }, 2000);
    } catch (error) {
      console.error('Restore error:', error);
      onShowToast(error.message || 'שגיאה בשחזור גיבוי', 'error');
      setRestoring(false);
    }
  };

  const formatBytes = (bytes) => {
    if (bytes === 0) return '0 Bytes';
    const k = 1024;
    const sizes = ['Bytes', 'KB', 'MB', 'GB'];
    const i = Math.floor(Math.log(bytes) / Math.log(k));
    return Math.round(bytes / Math.pow(k, i) * 100) / 100 + ' ' + sizes[i];
  };

  const formatDate = (isoString) => {
    const date = new Date(isoString);
    return date.toLocaleString('he-IL', {
      year: 'numeric',
      month: '2-digit',
      day: '2-digit',
      hour: '2-digit',
      minute: '2-digit'
    });
  };

  return (
    <div className="admin-panel">
      <div className="admin-header">
        <h2>ניהול גיבויים 🛡️</h2>
        <button
          className="btn primary"
          onClick={handleCreateBackup}
          disabled={loading}
        >
          {loading ? 'יוצר גיבוי...' : 'צור גיבוי חדש'}
        </button>
      </div>

      {loading && backups.length === 0 ? (
        <div className="loading">טוען גיבויים...</div>
      ) : backups.length === 0 ? (
        <div className="empty-state">אין גיבויים זמינים</div>
      ) : (
        <div className="backups-list">
          <table className="backups-table">
            <thead>
              <tr>
                <th className="col-filename">קובץ</th>
                <th className="col-date">תאריך</th>
                <th className="col-size">גודל</th>
                <th className="col-actions">פעולות</th>
              </tr>
            </thead>
            <tbody>
              {backups.map((backup) => (
                <tr key={backup.filename}>
                  <td className="filename">{backup.filename}</td>
                  <td className="date">{formatDate(backup.last_modified)}</td>
                  <td className="size">{formatBytes(backup.size)}</td>
                  <td className="actions">
                    <button
                      className="btn ghost small"
                      onClick={() => handleRestoreClick(backup.filename)}
                      disabled={loading}
                    >
                      שחזר
                    </button>
                  </td>
                </tr>
              ))}
            </tbody>
          </table>
        </div>
      )}

      <Modal
        isOpen={restoreModal.isOpen || restoring}
        onClose={() => !restoring && setRestoreModal({ isOpen: false, filename: '' })}
        title={restoring ? "⏳ משחזר גיבוי..." : "⚠️ אישור שחזור גיבוי"}
      >
        {restoring ? (
          <div className="restore-progress">
            <div className="progress-bar-container">
              <div className="progress-bar-fill"></div>
            </div>
            <p className="progress-text">מוריד גיבוי...</p>
            <p className="progress-text">משחזר מסד נתונים...</p>
            <p className="progress-text-muted">אנא המתן, התהליך עשוי לקחת מספר דקות</p>
          </div>
        ) : (
          <div className="restore-warning">
            <p>פעולה זו תמחק את כל הנתונים הנוכחיים!</p>
            <p>האם אתה בטוח שברצונך לשחזר מהגיבוי:</p>
            <p className="filename-highlight">{restoreModal.filename}</p>
            <div className="modal-actions">
              <button
                className="btn ghost"
                onClick={() => setRestoreModal({ isOpen: false, filename: '' })}
                disabled={loading}
              >
                ביטול
              </button>
              <button
                className="btn danger"
                onClick={handleRestoreConfirm}
                disabled={loading}
              >
                שחזר
              </button>
            </div>
          </div>
        )}
      </Modal>
    </div>
  );
}

export default AdminPanel;
@@ -1,4 +1,4 @@
-function Modal({ isOpen, title, message, onConfirm, onCancel, confirmText = "מחק", cancelText = "ביטול", isDangerous = false }) {
+function Modal({ isOpen, title, message, onConfirm, onCancel, confirmText = "מחק", cancelText = "ביטול", isDangerous = false, children }) {
   if (!isOpen) return null;

   return (
@@ -8,19 +8,21 @@ function Modal({ isOpen, title, message, onConfirm, onCancel, confirmText = "מ
         <h2>{title}</h2>
       </header>
       <div className="modal-body">
-        {message}
+        {children || message}
       </div>
-      <footer className="modal-footer">
-        <button className="btn ghost" onClick={onCancel}>
-          {cancelText}
-        </button>
-        <button
-          className={`btn ${isDangerous ? "danger" : "primary"}`}
-          onClick={onConfirm}
-        >
-          {confirmText}
-        </button>
-      </footer>
+      {!children && (
+        <footer className="modal-footer">
+          <button className="btn ghost" onClick={onCancel}>
+            {cancelText}
+          </button>
+          <button
+            className={`btn ${isDangerous ? "danger" : "primary"}`}
+            onClick={onConfirm}
+          >
+            {confirmText}
+          </button>
+        </footer>
+      )}
     </div>
   </div>
 );
@@ -1,6 +1,6 @@
 import NotificationBell from "./NotificationBell";

-function TopBar({ onAddClick, user, onLogout, onShowToast, onNotificationClick }) {
+function TopBar({ onAddClick, user, onLogout, onShowToast, onNotificationClick, onAdminClick }) {
   return (
     <header className="topbar">
       <div className="topbar-left">
@@ -15,6 +15,12 @@ function TopBar({ onAddClick, user, onLogout, onShowToast, onNotificationClick }

       <div className="topbar-actions">
         {user && <NotificationBell onShowToast={onShowToast} onNotificationClick={onNotificationClick} />}
+        {user?.is_admin && (
+          <button className="btn ghost btn-mobile-compact" onClick={onAdminClick}>
+            <span className="btn-text-desktop">🛡️ ניהול</span>
+            <span className="btn-text-mobile">🛡️</span>
+          </button>
+        )}
         {user && (
           <button className="btn primary btn-mobile-compact" onClick={onAddClick}>
             <span className="btn-text-desktop">+ מתכון חדש</span>