Add Backup to develop and prod

dvirlabs 2025-12-21 04:19:17 +02:00
parent 41f4a31602
commit 7fd437e561
12 changed files with 828 additions and 8 deletions


@@ -28,4 +28,12 @@ AZURE_REDIRECT_URI=http://localhost:8000/auth/azure/callback
R2_ENDPOINT=https://d4704b8c40b2f95b2c7bf7ee4ecc52f8.r2.cloudflarestorage.com
R2_ACCESS_KEY=1997b1e48a337c0dbe1f7552a08631b5
R2_SECRET_KEY=369694e39fedfedb254158c147171f5760de84fa2346d5d5d5a961f1f517dbc6
# Buckets are auto-selected based on environment (FRONTEND_URL)
# Dev: my-recipes-db-bkp-dev
# Prod: my-recipes-db-bkp-prod
# Automatic Backup Schedule
# Options: test (every 1 minute), daily, weekly, disabled
# For testing: BACKUP_INTERVAL=test
# For production: BACKUP_INTERVAL=weekly
BACKUP_INTERVAL=weekly

backend/AUTOMATIC_BACKUP.md Normal file

@@ -0,0 +1,150 @@
# Automatic Backup System
## ✅ Setup Complete!
The backend now automatically runs backups when the application starts. No cron setup needed!
## How It Works
When you start the backend:
```bash
cd backend
uvicorn main:app --reload
```
The backup scheduler starts automatically and you'll see:
```
⏰ Backup scheduler started: EVERY 1 MINUTE (testing mode)
⚠️ WARNING: Test mode active! Change BACKUP_INTERVAL to 'weekly' for production
```
## Testing (Current Setting)
**Currently set to: `BACKUP_INTERVAL=test`**
- Runs **every 1 minute**
- Check backend console logs for backup status
- Check R2 bucket for new files
**Expected console output:**
```
[2025-12-21 15:30:45] INFO: Starting scheduled backup...
[2025-12-21 15:30:53] INFO: ✅ Scheduled backup completed: recipes_db_20251221_153045.sql.gz
```
## Change to Production Schedule
After testing works, update `.env`:
```env
# Change from:
BACKUP_INTERVAL=test
# To one of these:
BACKUP_INTERVAL=weekly # Sunday at 2:00 AM (recommended)
BACKUP_INTERVAL=daily # Every day at 2:00 AM
BACKUP_INTERVAL=disabled # Turn off automatic backups
```
Then restart the backend:
```bash
# Stop current server (Ctrl+C)
# Start again
uvicorn main:app --reload
```
You'll see:
```
⏰ Backup scheduler started: WEEKLY on Sundays at 2:00 AM
✅ Backup scheduler is running
```
## Available Options
| Setting | Description | When it runs |
|---------|-------------|--------------|
| `test` | Testing mode | Every 1 minute |
| `daily` | Daily backups | Every day at 2:00 AM |
| `weekly` | Weekly backups | Sundays at 2:00 AM |
| `disabled` | No automatic backups | Never (manual only) |
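The options in the table above can be sketched as a small normalization helper. This is illustrative only (`resolve_backup_interval` is a hypothetical name, not part of the codebase); it mirrors the scheduler's behavior of lower-casing the value and falling back to `weekly` for unknown settings.

```python
import os

# Hypothetical helper mirroring the options table: normalize BACKUP_INTERVAL
# to one of the four supported settings, falling back to 'weekly' for
# unknown values, as the real scheduler does.
VALID_INTERVALS = {'test', 'daily', 'weekly', 'disabled'}

def resolve_backup_interval(default: str = 'weekly') -> str:
    value = os.getenv('BACKUP_INTERVAL', default).strip().lower()
    return value if value in VALID_INTERVALS else default

os.environ['BACKUP_INTERVAL'] = 'Weekly'   # case-insensitive
print(resolve_backup_interval())           # → weekly
os.environ['BACKUP_INTERVAL'] = 'hourly'   # unknown value → default
print(resolve_backup_interval())           # → weekly
```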
## Manual Backup Still Available
Admin users can still trigger manual backups from the Admin Panel in the UI, regardless of the automatic schedule.
## Monitoring
### Check if scheduler is running
Look for these messages in backend console when starting:
```
⏰ Backup scheduler started: ...
✅ Backup scheduler is running
```
### Watch backup logs in real-time
The scheduled backups show in your backend console:
```
[2025-12-21 15:30:45] INFO: Starting scheduled backup...
[2025-12-21 15:30:53] INFO: ✅ Scheduled backup completed: recipes_db_20251221_153045.sql.gz
```
### Verify backups are created
- Check R2 bucket in Cloudflare dashboard
- Look for files: `recipes_db_YYYYMMDD_HHMMSS.sql.gz`
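If you want to spot-check backups programmatically, the timestamp can be parsed straight out of the object key. A minimal sketch, assuming the `recipes_db_YYYYMMDD_HHMMSS.sql.gz` naming shown above (`backup_timestamp` is an illustrative helper, not part of the codebase):

```python
import re
from datetime import datetime

# Parse the timestamp out of a backup key like
# recipes_db_20251221_153045.sql.gz; returns None for non-backup keys.
BACKUP_RE = re.compile(r'^recipes_db_(\d{8}_\d{6})\.sql\.gz$')

def backup_timestamp(filename: str):
    m = BACKUP_RE.match(filename)
    if not m:
        return None
    return datetime.strptime(m.group(1), '%Y%m%d_%H%M%S')

print(backup_timestamp('recipes_db_20251221_153045.sql.gz'))
# → 2025-12-21 15:30:45
```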
## Production Deployment
When deploying to production:
1. **Update `.env`:**
```env
BACKUP_INTERVAL=weekly
```
2. **Keep the backend running:**
- Use systemd, Docker, or another process manager
- The scheduler only runs while the backend is running
3. **Using Docker:**
```dockerfile
# In your Dockerfile or docker-compose.yml
# No additional cron setup needed!
# The app handles scheduling internally
```
## Troubleshooting
### "Backup scheduler is DISABLED"
- Check `.env` has `BACKUP_INTERVAL` set
- Make sure it is not set to `disabled`
### No backups running in test mode
- Check backend console for error messages
- Verify R2 credentials in `.env`
- Verify database credentials in `.env`
- Check that `APScheduler` is installed: `pip install APScheduler`
### Backups not running at scheduled time
- Backend must be running 24/7
- Use systemd or docker in production
- Check server timezone matches expected schedule
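A quick way to check the timezone pitfall above: APScheduler's cron triggers fire in the server's local time by default, so print both local and UTC time to confirm that "2:00 AM" means what you expect on this machine.

```python
from datetime import datetime, timezone

# Compare the server's local time and offset against UTC so you can
# verify what "2:00 AM" will actually correspond to.
local_now = datetime.now().astimezone()
print('Local time:', local_now, '| tz:', local_now.tzinfo)
print('UTC time:  ', datetime.now(timezone.utc))
```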
### Want to disable automatic backups
```env
BACKUP_INTERVAL=disabled
```
## Benefits Over Cron
- **No external setup** - Works immediately when backend starts
- **Cross-platform** - Works on Windows, Linux, Docker
- **Easy testing** - Just change one env variable
- **Logs in console** - See backup status in backend logs
- **No permissions issues** - Runs with same permissions as backend
## Next Steps
1. **Test now**: Start backend and wait 1-2 minutes
2. **Verify**: Check console logs and R2 bucket
3. **Switch to weekly**: Change `.env` to `BACKUP_INTERVAL=weekly`
4. **Restart backend**: The new schedule takes effect

backend/CRON_SETUP.md Normal file

@@ -0,0 +1,226 @@
# Automated Backup with Cron (Linux/Production)
## Quick Setup
### Option 1: Automated Setup (Recommended)
```bash
cd backend
chmod +x setup_cron.sh
./setup_cron.sh
```
Then select:
- **Option 1**: Every 1 minute (for testing)
- **Option 2**: Weekly (Sunday 2 AM) - after testing works
### Option 2: Manual Setup
#### 1. Make script executable
```bash
cd backend
chmod +x run_backup.sh
```
#### 2. Edit crontab
```bash
crontab -e
```
#### 3. Add one of these lines:
**For Testing (every 1 minute):**
```bash
* * * * * cd /path/to/backend && ./run_backup.sh
```
**For Production (weekly - Sunday 2 AM):**
```bash
0 2 * * 0 cd /path/to/backend && ./run_backup.sh
```
**For Daily (2 AM):**
```bash
0 2 * * * cd /path/to/backend && ./run_backup.sh
```
Replace `/path/to/backend` with your actual path, e.g.:
```bash
* * * * * cd /home/user/my-recipes/backend && ./run_backup.sh
```
#### 4. Save and exit
- Press `Ctrl+X`, then `Y`, then `Enter` (if using nano)
- Or `:wq` (if using vim)
## Verify It's Working
### 1. Check if cron job is installed
```bash
crontab -l | grep backup
```
### 2. Wait 2-3 minutes (for 1-minute test)
### 3. Check the log
```bash
cd backend
tail -f backup.log
```
Expected output:
```
[2025-12-21 15:30:45] Starting backup...
[2025-12-21 15:30:47] Creating database dump...
[2025-12-21 15:30:49] Compressing file...
[2025-12-21 15:30:51] Uploading to R2...
[2025-12-21 15:30:53] ✅ Backup completed: recipes_db_20251221_153045.sql.gz
```
### 4. Check R2 bucket
- Should see new backup files appearing
- Files named: `recipes_db_YYYYMMDD_HHMMSS.sql.gz`
## Change from Testing to Weekly
### Method 1: Using setup script
```bash
cd backend
./setup_cron.sh
```
Select option 2 (Weekly)
### Method 2: Manual edit
```bash
crontab -e
```
Change this line:
```bash
* * * * * cd /path/to/backend && ./run_backup.sh
```
To this:
```bash
0 2 * * 0 cd /path/to/backend && ./run_backup.sh
```
Save and exit.
## Cron Schedule Reference
```
* * * * * command
│ │ │ │ │
│ │ │ │ └─── Day of week (0-7, 0 and 7 are Sunday)
│ │ │ └───── Month (1-12)
│ │ └─────── Day of month (1-31)
│ └───────── Hour (0-23)
└─────────── Minute (0-59)
```
**Examples:**
- `* * * * *` - Every minute
- `0 2 * * *` - Daily at 2:00 AM
- `0 2 * * 0` - Weekly on Sunday at 2:00 AM
- `0 2 * * 1` - Weekly on Monday at 2:00 AM
- `0 */6 * * *` - Every 6 hours
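To make the weekly entry concrete, here is a small stdlib sketch of what `0 2 * * 0` means in practice: given "now", find the next Sunday at 02:00 local time (`next_sunday_2am` is an illustrative helper, not something cron itself provides).

```python
from datetime import datetime, timedelta

def next_sunday_2am(now: datetime) -> datetime:
    """Next occurrence of Sunday 02:00 strictly after `now`."""
    target = now.replace(hour=2, minute=0, second=0, microsecond=0)
    # Python weekday(): Monday=0 ... Sunday=6
    days_ahead = (6 - now.weekday()) % 7
    target += timedelta(days=days_ahead)
    if target <= now:
        target += timedelta(days=7)
    return target

print(next_sunday_2am(datetime(2025, 12, 21, 15, 30)))  # a Sunday afternoon
# → 2025-12-28 02:00:00
```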
## Troubleshooting
### Cron job not running
**1. Check cron service is running:**
```bash
sudo systemctl status cron
# or
sudo systemctl status crond
```
**2. Check cron logs:**
```bash
# Ubuntu/Debian
grep CRON /var/log/syslog
# CentOS/RHEL
tail -f /var/log/cron
```
**3. Test script manually:**
```bash
cd backend
./run_backup.sh
cat backup.log
```
### No backup.log file
**Check permissions:**
```bash
ls -la run_backup.sh
# Should be: -rwxr-xr-x
chmod +x run_backup.sh
```
**Test Python script:**
```bash
cd backend
python3 backup_db.py
```
### Script runs but backup fails
**Check backup.log for errors:**
```bash
cat backup.log
```
Common issues:
- Database credentials wrong (check `.env`)
- R2 credentials wrong (check `.env`)
- `pg_dump` not installed: `sudo apt install postgresql-client`
- Python packages missing: `pip install boto3`
## Remove Cron Job
```bash
crontab -e
```
Delete the line with `run_backup.sh`, save and exit.
## Docker/Container Environment
If running in Docker, add to your docker-compose.yml or Dockerfile:
```yaml
# docker-compose.yml
services:
backend:
# ... other config
command: >
sh -c "
echo '0 2 * * 0 cd /app && python backup_db.py >> backup.log 2>&1' | crontab - &&
crond -f -l 2
"
```
Or use a separate container with a cron image.
## Production Recommendations
1. **Use weekly backups** - Daily can consume too much storage
2. **Monitor logs** - Set up log monitoring/alerts
3. **Test restore** - Periodically test restoring from backups
4. **Set up retention** - Automatically delete old backups (not implemented yet)
5. **Use separate backup server** - Don't backup to same server as database
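Since retention (recommendation 4) is not implemented yet, the selection logic could look roughly like this. This is a hypothetical sketch: `expired_backups` is an invented helper, the filename format is assumed from these docs, and actual deletion would still need to go through the R2 client.

```python
import re
from datetime import datetime, timedelta

# Filename format assumed from the docs: recipes_db_YYYYMMDD_HHMMSS.sql.gz
PATTERN = re.compile(r'recipes_db_(\d{8}_\d{6})\.sql\.gz$')

def expired_backups(keys: list, now: datetime, keep_days: int = 90) -> list:
    """Return the backup keys older than `keep_days`, as deletion candidates."""
    cutoff = now - timedelta(days=keep_days)
    expired = []
    for key in keys:
        m = PATTERN.search(key)
        if m and datetime.strptime(m.group(1), '%Y%m%d_%H%M%S') < cutoff:
            expired.append(key)
    return expired

keys = ['recipes_db_20250101_020000.sql.gz', 'recipes_db_20251214_020000.sql.gz']
print(expired_backups(keys, now=datetime(2025, 12, 21)))
# → ['recipes_db_20250101_020000.sql.gz']
```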
## Success Checklist
- ✅ `run_backup.sh` is executable
- ✅ Cron job is installed (`crontab -l` shows it)
- ✅ Test run completed successfully
- ✅ `backup.log` shows successful backup
- ✅ R2 bucket contains backup files
- ✅ Changed from 1-minute to weekly schedule


@@ -0,0 +1,142 @@
# Test Scheduled Backup - 1 Minute Setup
## Quick Test Setup (1 Minute Interval)
### Step 1: Open Task Scheduler
1. Press `Win + R`
2. Type `taskschd.msc`
3. Press Enter
### Step 2: Create Test Task
1. Click **"Create Task"** (not Basic Task)
2. **General Tab:**
- Name: `Recipe DB Backup TEST`
- Description: `Test backup every 1 minute`
- Select "Run whether user is logged on or not"
- Check "Run with highest privileges"
### Step 3: Set 1-Minute Trigger
1. Go to **Triggers** tab
2. Click "New..."
3. Configure:
- Begin the task: `On a schedule`
- Settings: `Daily`
- Recur every: `1 days`
- Start: Set to current time (e.g., if it's 3:00 PM, set to 3:00 PM)
- Check "Repeat task every": `1 minute`
- For a duration of: `1 hour`
- Check "Enabled"
4. Click OK
### Step 4: Set Action
1. Go to **Actions** tab
2. Click "New..."
3. Configure:
- Action: `Start a program`
- Program/script: Browse to `run_backup.bat` or paste full path:
```
C:\Users\dvirl\OneDrive\Desktop\gitea\my-recipes\backend\run_backup.bat
```
- Start in:
```
C:\Users\dvirl\OneDrive\Desktop\gitea\my-recipes\backend
```
4. Click OK
### Step 5: Settings
1. Go to **Conditions** tab:
- Uncheck "Start the task only if the computer is on AC power"
2. Go to **Settings** tab:
- Check "Run task as soon as possible after a scheduled start is missed"
- Check "If the task fails, restart every: 1 minutes"
- Attempt to restart up to: `3 times`
- Check "Stop the task if it runs longer than: 30 minutes"
3. Click OK
4. Enter your Windows password when prompted
### Step 6: Monitor Test Results
**Check the backup log:**
```cmd
cd C:\Users\dvirl\OneDrive\Desktop\gitea\my-recipes\backend
type backup.log
```
**Or open in Notepad:**
- Navigate to `backend\backup.log`
- Should see new entries every minute
**Check R2 bucket:**
- Login to Cloudflare Dashboard
- Go to R2 → my-recipes-db-bkp
- Should see new backup files appearing every minute
### Expected Log Output
```
[2025-12-21 15:30:45] Starting backup...
[2025-12-21 15:30:47] Creating database dump...
[2025-12-21 15:30:49] Compressing file...
[2025-12-21 15:30:51] Uploading to R2...
[2025-12-21 15:30:53] Backup completed: recipes_db_20251221_153045.sql.gz
[2025-12-21 15:30:53] Size: 2.5 MB
```
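The "Size: 2.5 MB" line above is a human-readable byte count. The exact formatting used by `run_backup.bat` is an assumption; this just shows one common way to render it:

```python
def human_size(num_bytes: int) -> str:
    """Render a byte count like the 'Size: 2.5 MB' log line."""
    size = float(num_bytes)
    for unit in ('B', 'KB', 'MB', 'GB'):
        if size < 1024:
            return f'{size:.1f} {unit}'
        size /= 1024
    return f'{size:.1f} TB'

print(human_size(2_621_440))  # → 2.5 MB
```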
### Verify It's Working
Wait 2-3 minutes and check:
1. ✅ `backend\backup.log` has multiple entries
2. ✅ R2 bucket has new backup files
3. ✅ Task Scheduler shows "Last Run Result: (0x0)" = Success
### If It Works - Convert to Weekly
1. Open Task Scheduler
2. Find "Recipe DB Backup TEST"
3. Right-click → Properties
4. Go to **Triggers** tab
5. Edit the trigger:
- Settings: Change to `Weekly`
- Recur every: `1 weeks`
- Days: Select `Sunday` (or your preferred day)
- Time: `02:00:00` (2 AM)
- **Uncheck** "Repeat task every"
6. Click OK
7. **Rename task**: Right-click → Rename → "Recipe DB Weekly Backup"
### Troubleshooting
**No backup.log file:**
- Task might not be running
- Check Task Scheduler History tab
- Run `run_backup.bat` manually first
**backup.log shows errors:**
- Check if Python is in PATH
- Verify database credentials in `.env`
- Verify R2 credentials in `.env`
**Task shows "Could not start":**
- Verify the paths are correct
- Make sure you entered Windows password
- Try "Run" button in Task Scheduler manually
**Want to stop test:**
- Right-click task → Disable
- Or delete the task
### Manual Test First
Before setting up Task Scheduler, test manually:
```cmd
cd C:\Users\dvirl\OneDrive\Desktop\gitea\my-recipes\backend
run_backup.bat
```
Check if:
1. backup.log is created
2. Backup appears in R2
3. No errors in log
If manual test works, Task Scheduler will work too!


@@ -15,6 +15,20 @@ from dotenv import load_dotenv
load_dotenv()

def get_environment() -> str:
    """Detect environment based on FRONTEND_URL"""
    frontend_url = os.getenv('FRONTEND_URL', 'http://localhost:5174')
    if 'myrecipes.dvirlabs.com' in frontend_url or 'my-recipes.dvirlabs.com' in frontend_url:
        return 'prod'
    return 'dev'

def get_r2_bucket() -> str:
    """Get R2 bucket name based on environment"""
    env = get_environment()
    return f'my-recipes-db-bkp-{env}'

def get_r2_client():
    """Get configured R2 client"""
    return boto3.client(
@@ -29,10 +43,11 @@ def get_r2_client():
def create_db_dump() -> str:
    """Create a database dump file"""
    timestamp = datetime.now().strftime('%Y%m%d_%H%M%S')
    env = get_environment()
    backup_dir = os.path.join(os.path.dirname(__file__), 'backups')
    os.makedirs(backup_dir, exist_ok=True)
    dump_file = os.path.join(backup_dir, f'recipes_db_{env}_{timestamp}.sql')

    db_host = os.getenv('DB_HOST', 'localhost')
    db_port = os.getenv('DB_PORT', '5432')
@@ -77,7 +92,7 @@ def compress_file(file_path: str) -> str:
def upload_to_r2(file_path: str) -> str:
    """Upload file to R2"""
    s3_client = get_r2_client()
    bucket_name = get_r2_bucket()
    file_name = os.path.basename(file_path)

    try:
@@ -90,7 +105,7 @@ def upload_to_r2(file_path: str) -> str:
def list_r2_backups() -> List[dict]:
    """List all backups in R2"""
    s3_client = get_r2_client()
    bucket_name = get_r2_bucket()

    try:
        response = s3_client.list_objects_v2(Bucket=bucket_name)
@@ -116,7 +131,7 @@ def list_r2_backups() -> List[dict]:
def download_from_r2(filename: str) -> str:
    """Download a backup from R2"""
    s3_client = get_r2_client()
    bucket_name = get_r2_bucket()
    backup_dir = os.path.join(os.path.dirname(__file__), 'backups')
    os.makedirs(backup_dir, exist_ok=True)

backend/backup_scheduler.py Normal file

@@ -0,0 +1,103 @@
"""
Automatic backup scheduler
Runs database backups on a schedule without needing cron
"""
import os
from datetime import datetime
from apscheduler.schedulers.background import BackgroundScheduler
from backup_restore_api import perform_backup
import logging

# Configure logging
logging.basicConfig(
    level=logging.INFO,
    format='[%(asctime)s] %(levelname)s: %(message)s',
    datefmt='%Y-%m-%d %H:%M:%S'
)
logger = logging.getLogger(__name__)

scheduler = BackgroundScheduler()


def scheduled_backup_job():
    """Job that runs on schedule to perform backup"""
    logger.info("Starting scheduled backup...")
    try:
        result = perform_backup()
        if result['success']:
            logger.info(f"✅ Scheduled backup completed: {result['filename']}")
        else:
            logger.error(f"❌ Scheduled backup failed: {result.get('error', 'Unknown error')}")
    except Exception as e:
        logger.error(f"❌ Scheduled backup exception: {str(e)}")


def start_backup_scheduler():
    """Start the backup scheduler based on environment configuration"""
    # Get backup interval from environment (default: weekly)
    backup_interval = os.getenv('BACKUP_INTERVAL', 'weekly').lower()

    if backup_interval == 'disabled':
        logger.info("⏸️ Automatic backups are DISABLED")
        return

    if backup_interval in ('test', '1minute'):
        # Test mode - every 1 minute
        scheduler.add_job(
            scheduled_backup_job,
            'interval',
            minutes=1,
            id='backup_job',
            replace_existing=True
        )
        logger.info("⏰ Backup scheduler started: EVERY 1 MINUTE (testing mode)")
        logger.warning("⚠️ WARNING: Test mode active! Change BACKUP_INTERVAL to 'weekly' for production")
    elif backup_interval == 'daily':
        # Daily at 2 AM
        scheduler.add_job(
            scheduled_backup_job,
            'cron',
            hour=2,
            minute=0,
            id='backup_job',
            replace_existing=True
        )
        logger.info("⏰ Backup scheduler started: DAILY at 2:00 AM")
    elif backup_interval == 'weekly':
        # Weekly - Sunday at 2 AM
        scheduler.add_job(
            scheduled_backup_job,
            'cron',
            day_of_week='sun',
            hour=2,
            minute=0,
            id='backup_job',
            replace_existing=True
        )
        logger.info("⏰ Backup scheduler started: WEEKLY on Sundays at 2:00 AM")
    else:
        logger.warning(f"⚠️ Unknown BACKUP_INTERVAL: {backup_interval}, defaulting to weekly")
        scheduler.add_job(
            scheduled_backup_job,
            'cron',
            day_of_week='sun',
            hour=2,
            minute=0,
            id='backup_job',
            replace_existing=True
        )
        logger.info("⏰ Backup scheduler started: WEEKLY on Sundays at 2:00 AM")

    scheduler.start()
    logger.info("✅ Backup scheduler is running")


def stop_backup_scheduler():
    """Stop the backup scheduler"""
    if scheduler.running:
        scheduler.shutdown()
        logger.info("🛑 Backup scheduler stopped")


@@ -60,8 +60,12 @@ from backup_restore_api import (
    perform_backup,
    perform_restore,
    list_r2_backups,
    get_environment,
    get_r2_bucket,
)

from backup_scheduler import start_backup_scheduler, stop_backup_scheduler

from email_utils import (
    generate_verification_code,
    send_verification_email,
@@ -249,6 +253,19 @@ app.add_middleware(
)

# ===== Startup Event: Start Backup Scheduler =====
@app.on_event("startup")
async def startup_event():
    """Start the automatic backup scheduler when the app starts"""
    start_backup_scheduler()


@app.on_event("shutdown")
async def shutdown_event():
    """Stop the backup scheduler when the app shuts down"""
    stop_backup_scheduler()


@app.get("/recipes", response_model=List[Recipe])
def list_recipes():
    rows = list_recipes_db()
@@ -1186,7 +1203,13 @@ def list_backups(
    try:
        backups = list_r2_backups()
        environment = get_environment()
        bucket = get_r2_bucket()
        return {
            "backups": backups,
            "environment": environment,
            "bucket": bucket
        }
    except Exception as e:
        raise HTTPException(status_code=500, detail=str(e))


@@ -23,3 +23,4 @@ itsdangerous==2.1.2
# Backup to R2
boto3==1.34.17
APScheduler==3.10.4

backend/run_backup.sh Normal file

@@ -0,0 +1,8 @@
#!/bin/bash
# Cron job wrapper for automated database backups

# Navigate to backend directory
cd "$(dirname "$0")"

# Run backup script
python3 backup_db.py >> backup.log 2>&1

backend/setup_cron.sh Normal file

@@ -0,0 +1,97 @@
#!/bin/bash
# Automated Backup Cron Setup Script
# This script sets up automated backups for production

SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
BACKUP_SCRIPT="$SCRIPT_DIR/run_backup.sh"

echo "🛡️ Database Backup Cron Setup"
echo "================================"
echo ""

# Make backup script executable
chmod +x "$BACKUP_SCRIPT"
echo "✅ Made run_backup.sh executable"

# Get current crontab
crontab -l > /tmp/current_cron 2>/dev/null || touch /tmp/current_cron

# Check if backup job already exists
if grep -q "run_backup.sh" /tmp/current_cron; then
    echo "⚠️ Backup cron job already exists!"
    echo ""
    echo "Current backup schedule:"
    grep "run_backup.sh" /tmp/current_cron
    echo ""
    read -p "Do you want to replace it? (y/n): " replace
    if [ "$replace" != "y" ]; then
        echo "Cancelled."
        exit 0
    fi
    # Remove existing backup job
    grep -v "run_backup.sh" /tmp/current_cron > /tmp/new_cron
else
    cp /tmp/current_cron /tmp/new_cron
fi

echo ""
echo "Select backup schedule:"
echo "1) Every 1 minute (for testing)"
echo "2) Weekly (Sunday at 2:00 AM)"
echo "3) Daily (at 2:00 AM)"
read -p "Enter choice (1-3): " choice

case $choice in
    1)
        # Every 1 minute for testing
        echo "* * * * * cd $SCRIPT_DIR && ./run_backup.sh" >> /tmp/new_cron
        echo "✅ Set to run EVERY 1 MINUTE (testing only!)"
        echo "⚠️ WARNING: This will create many backups. Change to weekly after testing!"
        ;;
    2)
        # Weekly - Sunday at 2 AM
        echo "0 2 * * 0 cd $SCRIPT_DIR && ./run_backup.sh" >> /tmp/new_cron
        echo "✅ Set to run WEEKLY on Sundays at 2:00 AM"
        ;;
    3)
        # Daily at 2 AM
        echo "0 2 * * * cd $SCRIPT_DIR && ./run_backup.sh" >> /tmp/new_cron
        echo "✅ Set to run DAILY at 2:00 AM"
        ;;
    *)
        echo "❌ Invalid choice"
        exit 1
        ;;
esac

# Install new crontab
crontab /tmp/new_cron
rm /tmp/current_cron /tmp/new_cron

echo ""
echo "✅ Cron job installed successfully!"
echo ""
echo "Current crontab:"
crontab -l | grep "run_backup.sh"
echo ""
echo "📝 Logs will be written to: $SCRIPT_DIR/backup.log"
echo ""
echo "To view logs:"
echo "  tail -f $SCRIPT_DIR/backup.log"
echo ""
echo "To remove cron job:"
echo "  crontab -e"
echo "  (then delete the line with run_backup.sh)"
echo ""

if [ "$choice" = "1" ]; then
    echo "⚠️ TESTING MODE: Backup runs every minute"
    echo "Wait 2-3 minutes and check:"
    echo "  1. tail -f $SCRIPT_DIR/backup.log"
    echo "  2. Check R2 bucket for new files"
    echo ""
    echo "To change to weekly schedule after testing:"
    echo "  ./setup_cron.sh"
    echo "  (select option 2)"
fi


@@ -2003,10 +2003,45 @@ html {
}

.admin-header h2 {
  margin: 0 0 0.5rem 0;
  color: var(--text-main);
}

.environment-info {
  display: flex;
  align-items: center;
  gap: 0.75rem;
  margin-top: 0.5rem;
}

.env-badge {
  display: inline-block;
  padding: 0.3rem 0.75rem;
  border-radius: 999px;
  font-size: 0.75rem;
  font-weight: 600;
  letter-spacing: 0.5px;
}

.env-badge.dev {
  background: rgba(34, 197, 94, 0.15);
  color: #22c55e;
}

.env-badge.prod {
  background: rgba(239, 68, 68, 0.15);
  color: #ef4444;
}

.bucket-name {
  font-family: 'Courier New', monospace;
  font-size: 0.85rem;
  color: var(--text-muted);
  background: var(--border-subtle);
  padding: 0.3rem 0.75rem;
  border-radius: 6px;
}

.backups-list {
  margin-top: 1.5rem;
}


@@ -7,6 +7,8 @@ function AdminPanel({ onShowToast }) {
  const [loading, setLoading] = useState(false);
  const [restoreModal, setRestoreModal] = useState({ isOpen: false, filename: '' });
  const [restoring, setRestoring] = useState(false);
  const [environment, setEnvironment] = useState('dev');
  const [bucket, setBucket] = useState('');

  useEffect(() => {
    loadBackups();
@@ -17,6 +19,8 @@ function AdminPanel({ onShowToast }) {
      setLoading(true);
      const data = await listBackups();
      setBackups(data.backups || []);
      setEnvironment(data.environment || 'dev');
      setBucket(data.bucket || '');
    } catch (error) {
      onShowToast(error.message || 'שגיאה בטעינת גיבויים', 'error');
    } finally {
@@ -85,7 +89,15 @@ function AdminPanel({ onShowToast }) {
  return (
    <div className="admin-panel">
      <div className="admin-header">
        <div>
          <h2>ניהול גיבויים 🛡</h2>
          <div className="environment-info">
            <span className={`env-badge ${environment}`}>
              {environment === 'prod' ? '🔴 PRODUCTION' : '🟢 DEVELOPMENT'}
            </span>
            <span className="bucket-name">{bucket}</span>
          </div>
        </div>
        <button
          className="btn primary"
          onClick={handleCreateBackup}