# Implementation Plan: Database + File Backup System

## Features to Implement

### 1. Database Backup Enhancements

#### 1.1 Auto-delete backup file after Google Drive upload

- **Location**: `app/Services/BackupService.php::uploadToGoogleDrive()`
- **Change**: After a successful upload, delete the local SQL file
- **Implementation**:

```php
$fileId = $googleDriveService->uploadFile($localPath, $filename);
if ($fileId) {
    unlink($localPath); // Delete local file after successful upload
}
```

- **Status**: SIMPLE - 2 lines of code

#### 1.2 Compress SQL backups before upload

- **Location**: `app/Services/BackupService.php::backupDatabase()`
- **Changes**:
  1. Add a new method: `compressBackup($filePath)` → returns the gzipped file path
  2. After mysqldump, compress the SQL file
  3. Upload the `.sql.gz` instead of the `.sql`
  4. Delete the original `.sql` after compression
- **Implementation**:

```php
// Create backup
$this->executeMysqlDump($dbConfig, $localPath); // mydb_2025-12-03.sql

// Compress
$compressedPath = $this->compressBackup($localPath); // mydb_2025-12-03.sql.gz

// Upload compressed version
$fileId = $this->uploadToGoogleDrive($compressedPath, basename($compressedPath));
```

- **Status**: MEDIUM - compression + error handling

---
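The `compressBackup()` method proposed in 1.2 could look like the following sketch. The chunk size and gzip level are illustrative assumptions, not decisions the plan has made:

```php
// Sketch of the proposed compressBackup() helper: gzip a dump file,
// delete the original, and return the compressed path.
function compressBackup(string $filePath): string
{
    $compressedPath = $filePath . '.gz';

    $in  = fopen($filePath, 'rb');
    $out = gzopen($compressedPath, 'wb9'); // level 9 = best compression

    // Stream in chunks so a multi-gigabyte dump never has to fit in memory.
    while (!feof($in)) {
        gzwrite($out, fread($in, 512 * 1024));
    }

    fclose($in);
    gzclose($out);

    unlink($filePath); // drop the uncompressed .sql once the .gz exists

    return $compressedPath;
}
```

Streaming through `gzopen()` keeps memory flat regardless of dump size, unlike a `gzencode(file_get_contents(...))` one-liner.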
### 2. File/Folder Backup System

#### 2.1 New Database Table: `file_backup_configurations`

```sql
CREATE TABLE file_backup_configurations (
    id BIGINT PRIMARY KEY,
    name VARCHAR(255),                        -- "Backup Website Files"
    source_path VARCHAR(500),                 -- "/home/user/website"
    include_dirs JSON,                        -- ["public_html", "config"]
    exclude_dirs JSON,                        -- ["node_modules", ".git"]
    include_extensions JSON,                  -- ["php", "js", "css"] or null for all
    exclude_extensions JSON,                  -- ["log", "tmp"] or null
    compress BOOLEAN DEFAULT true,            -- Compress before backup
    delete_after_upload BOOLEAN DEFAULT true, -- Delete local copy after upload
    google_drive_folder_id VARCHAR(255),      -- Optional: specific Drive folder
    is_active BOOLEAN DEFAULT true,
    last_backup_at TIMESTAMP,
    timestamps                                -- created_at / updated_at
);
```

#### 2.2 New Database Table: `file_backup_schedules`

```sql
CREATE TABLE file_backup_schedules (
    id BIGINT PRIMARY KEY,
    file_backup_configuration_id BIGINT,
    frequency ENUM('hourly', 'daily', 'weekly', 'monthly'),
    time VARCHAR(255),          -- HH:MM format
    day_of_week VARCHAR(255),   -- For weekly
    day_of_month VARCHAR(255),  -- For monthly
    is_active BOOLEAN DEFAULT true,
    last_run_at TIMESTAMP,
    next_run_at TIMESTAMP,
    timestamps,                 -- created_at / updated_at
    FOREIGN KEY (file_backup_configuration_id)
        REFERENCES file_backup_configurations(id) ON DELETE CASCADE
);
```

#### 2.3 New Database Table: `file_backup_histories`

```sql
CREATE TABLE file_backup_histories (
    id BIGINT PRIMARY KEY,
    file_backup_configuration_id BIGINT,
    backup_name VARCHAR(255),          -- "website_2025-12-03_10-45.tar.gz"
    backup_path VARCHAR(500),          -- Local file path
    google_drive_file_id VARCHAR(255), -- Drive file ID if uploaded
    total_files INT,                   -- Number of files backed up
    compressed_size BIGINT,            -- Size after compression
    uncompressed_size BIGINT,          -- Size before compression
    status ENUM('success', 'failed'),
    error_message TEXT,
    started_at TIMESTAMP,
    completed_at TIMESTAMP,
    timestamps,                        -- created_at / updated_at
    FOREIGN KEY (file_backup_configuration_id)
        REFERENCES file_backup_configurations(id) ON DELETE CASCADE
);
```

---
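The four include/exclude JSON columns in `file_backup_configurations` combine per file roughly as sketched below. The function name and argument shapes are assumptions for illustration; a `null` include list means "everything", and exclusions always win:

```php
// Decide whether one file belongs in the backup, given a configuration
// row shaped like the table above (JSON columns decoded to PHP arrays).
function shouldBackup(string $relativePath, array $config): bool
{
    $topDir = explode('/', $relativePath)[0];
    $ext    = strtolower(pathinfo($relativePath, PATHINFO_EXTENSION));

    if (!empty($config['exclude_dirs']) && in_array($topDir, $config['exclude_dirs'], true)) {
        return false; // whole directory excluded, e.g. node_modules
    }
    if (!empty($config['include_dirs']) && !in_array($topDir, $config['include_dirs'], true)) {
        return false; // an include list is set and this dir is not on it
    }
    if (!empty($config['exclude_extensions']) && in_array($ext, $config['exclude_extensions'], true)) {
        return false; // e.g. .log, .tmp
    }
    if (!empty($config['include_extensions']) && !in_array($ext, $config['include_extensions'], true)) {
        return false; // an extension allowlist is set and this is not on it
    }
    return true;
}
```

This only inspects the top-level directory; whether nested paths should also match `exclude_dirs` is one of the semantics the real `FileBackupService` will have to pin down.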
### 3. Implementation Steps

#### Phase 1: Database Backup Improvements (PRIORITY: HIGH)

1. ✓ Add compression method to `BackupService`
2. ✓ Modify `backupDatabase()` to compress SQL files
3. ✓ Add auto-delete after successful Google Drive upload
4. ✓ Update backup filename format (add `.gz` extension)
5. ✓ Test with real database

**Effort**: 2-3 hours | **Priority**: HIGH

#### Phase 2: File Backup Infrastructure (PRIORITY: HIGH)

1. Create migrations for 3 new tables
2. Create models:
   - `FileBackupConfiguration` (with relationships)
   - `FileBackupSchedule` (with relationships)
   - `FileBackupHistory` (with relationships)
3. Create `FileBackupService` with methods:
   - `backupFilesAndFolders($config)` → creates tar.gz
   - `selectiveBackup($sourcePath, $includeDirs, $excludeDirs)` → only back up the specified dirs
   - `uploadFileBackup($localPath, $filename)` → upload and optionally delete
4. Create `FileBackupController` for CRUD operations
5. Add routes for file backup management

**Effort**: 4-5 hours | **Priority**: HIGH

#### Phase 3: File Backup Scheduling (PRIORITY: MEDIUM)

1. Create `FileBackupCommand` artisan command
2. Add to scheduler in `console.php`
3. Update `RunScheduledBackups` command to handle file backups too
4. Test scheduling with different frequencies

**Effort**: 2 hours | **Priority**: MEDIUM

#### Phase 4: UI/Views (PRIORITY: MEDIUM)

1. Create views for file backup configuration
2. Create views for file backup history
3. Create views for file backup schedules
4. Add dashboard widgets for file backups
5. Add buttons for manual file backup

**Effort**: 3-4 hours | **Priority**: MEDIUM

---

## Detailed Architecture

### File Backup Process Flow

```
User clicks "Backup Files"
        ↓
FileBackupController::fileBackupNow($config)
        ↓
FileBackupService::backupFilesAndFolders($config)
 ├─ Read source path
 ├─ Filter by include/exclude dirs
 ├─ Filter by include/exclude extensions
 ├─ Create tar archive with selected files
 ├─ If compress=true, compress to .tar.gz
 └─ Record in file_backup_histories
        ↓
FileBackupController::uploadFileBackup()
 ├─ Call GoogleDriveService::uploadFile()
 ├─ Update history with Drive file_id
 ├─ If delete_after_upload=true, delete local file
 └─ Return success/error
        ↓
Update file_backup_histories with final status
```

### Selective Directory Backup Example

User configures:

```
Source Path: /home/user/website
Include Dirs: ["public_html", "config", "app"]
Exclude Dirs: ["node_modules", "vendor"]
Exclude Extensions: ["log", "tmp", "cache"]
```

Result: backs up ONLY:

- `/home/user/website/public_html/*` (excluding .log, .tmp, .cache files)
- `/home/user/website/config/*` (excluding .log, .tmp, .cache files)
- `/home/user/website/app/*` (excluding .log, .tmp, .cache files)

NOT backed up:

- `/home/user/website/node_modules/` (entire directory excluded)
- `/home/user/website/vendor/` (entire directory excluded)

### Compression Benefits

- Database backup: a 100MB SQL dump typically gzips to roughly 10MB (~90% reduction)
- File backup: 500MB of mixed files typically shrinks to roughly 100MB as .tar.gz (~80% reduction)
- Google Drive quota savings
- Faster uploads

---

## Technical Considerations

### 1. Large File Handling

- For files > 100MB, consider chunked (resumable) uploads to Google Drive
- Add progress tracking for large backups
- Implement timeout handling for long-running backups

### 2. Directory Traversal Security

- Validate paths to prevent `../../../etc/passwd` attacks
- Use `realpath()` to resolve symlinks and `..` segments
- Whitelist allowed base paths
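The traversal checks above can be sketched in a few lines. The function name and the idea of a single whitelisted base directory are assumptions; `realpath()` does the heavy lifting:

```php
// Resolve a user-supplied source path and confirm it stays inside an
// allowed base directory; returns the canonical path, or null on rejection.
function validateSourcePath(string $requested, string $allowedBase): ?string
{
    $real = realpath($requested);   // resolves symlinks and ../ segments
    $base = realpath($allowedBase);

    if ($real === false || $base === false) {
        return null;                // path (or base) does not exist
    }

    // Require the resolved path to sit at or under the whitelisted base.
    if ($real !== $base && !str_starts_with($real, $base . DIRECTORY_SEPARATOR)) {
        return null;
    }

    return $real;
}
```

Comparing resolved paths (rather than the raw request string) is what defeats both `..` sequences and symlinks pointing outside the base.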
### 3. Compression Format

- Use `.tar.gz` for files (standard, portable, good compression)
- Use `.gz` for SQL (standard, already common)
- Alternative: `.zip` if Windows users need compatibility

### 4. Disk Space Management

- Check available disk space before backing up
- Warn if the local backup dir exceeds a threshold
- Implement cleanup of old backups (configurable retention)

### 5. Error Handling

- Handle permission-denied errors gracefully
- Handle files being deleted mid-backup
- Handle Google Drive API errors
- Provide detailed error messages in history

---

## Database Schema Migrations

### Migration 1: Create FileBackupConfiguration

```php
Schema::create('file_backup_configurations', function (Blueprint $table) {
    $table->id();
    $table->string('name');
    $table->string('source_path');
    $table->json('include_dirs')->nullable();
    $table->json('exclude_dirs')->nullable();
    $table->json('include_extensions')->nullable();
    $table->json('exclude_extensions')->nullable();
    $table->boolean('compress')->default(true);
    $table->boolean('delete_after_upload')->default(true);
    $table->string('google_drive_folder_id')->nullable();
    $table->boolean('is_active')->default(true);
    $table->timestamp('last_backup_at')->nullable();
    $table->timestamps();
});
```

### Migration 2: Create FileBackupSchedule

```php
Schema::create('file_backup_schedules', function (Blueprint $table) {
    $table->id();
    $table->foreignId('file_backup_configuration_id')
        ->constrained('file_backup_configurations')
        ->onDelete('cascade');
    $table->enum('frequency', ['hourly', 'daily', 'weekly', 'monthly']);
    $table->string('time')->nullable();
    $table->string('day_of_week')->nullable();
    $table->string('day_of_month')->nullable();
    $table->boolean('is_active')->default(true);
    $table->timestamp('last_run_at')->nullable();
    $table->timestamp('next_run_at')->nullable();
    $table->timestamps();
});
```

### Migration 3: Create FileBackupHistory

```php
Schema::create('file_backup_histories', function (Blueprint $table) {
    $table->id();
    $table->foreignId('file_backup_configuration_id')
        ->constrained('file_backup_configurations')
        ->onDelete('cascade');
    $table->string('backup_name');
    $table->string('backup_path');
    $table->string('google_drive_file_id')->nullable();
    $table->integer('total_files')->default(0);
    $table->bigInteger('compressed_size')->nullable();
    $table->bigInteger('uncompressed_size')->nullable();
    $table->enum('status', ['success', 'failed']);
    $table->text('error_message')->nullable();
    $table->timestamp('started_at');
    $table->timestamp('completed_at')->nullable();
    $table->timestamps();
});
```

---

## Models

### FileBackupConfiguration Model

```php
class FileBackupConfiguration extends Model
{
    protected $fillable = [
        'name', 'source_path', 'include_dirs', 'exclude_dirs',
        'include_extensions', 'exclude_extensions', 'compress',
        'delete_after_upload', 'google_drive_folder_id', 'is_active',
    ];

    protected $casts = [
        'include_dirs' => 'array',
        'exclude_dirs' => 'array',
        'include_extensions' => 'array',
        'exclude_extensions' => 'array',
        'is_active' => 'boolean',
        'compress' => 'boolean',
        'delete_after_upload' => 'boolean',
    ];

    public function schedules()
    {
        return $this->hasMany(FileBackupSchedule::class);
    }

    public function histories()
    {
        return $this->hasMany(FileBackupHistory::class);
    }
}
```

---

## Implementation Order (Recommended)

1. **Week 1 - Database Backup Compression**
   - Add compression to `BackupService`
   - Add auto-delete after upload
   - Test thoroughly
2. **Week 2 - File Backup Migrations & Models**
   - Create 3 migrations
   - Create 3 models with relationships
   - Create `FileBackupService` (core logic)
3. **Week 3 - File Backup Controller & Routes**
   - Create `FileBackupController`
   - Add routes
   - Manual file backup functionality
4. **Week 4 - File Backup Scheduling**
   - Create file backup command
   - Add to scheduler
   - Test with different frequencies
5. **Week 5 - UI & Polish**
   - Create views
   - Dashboard widgets
   - Error handling & edge cases
   - Documentation

---

## Questions Before Starting
1. **Compression format**: Use `.tar.gz` for files, or `.zip`?
2. **Storage location**: Keep backups in the same `storage/app/backups/`, or a separate `storage/app/file-backups/`?
3. **Permissions**: How should files the user cannot read be handled?
4. **Symlinks**: Follow symlinks or skip them?
5. **Large files**: Is there a maximum file size to back up? Should large files be streamed?

---

## Summary

| Feature | Complexity | Time | Priority |
|---------|-----------|------|----------|
| Database backup compression | LOW | 1h | HIGH |
| Auto-delete after upload | LOW | 30m | HIGH |
| File backup infrastructure | HIGH | 8-10h | HIGH |
| File backup scheduling | MEDIUM | 3h | MEDIUM |
| UI/Views | MEDIUM | 4h | MEDIUM |
| **TOTAL** | - | **18-22h** | - |

**MVP (Minimum Viable Product)**: Database compression + auto-delete + basic file backup = ~6-8 hours
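For the MVP's "basic file backup", the archive step of `backupFilesAndFolders()` can be sketched with PHP's built-in `PharData` (the function name and paths are illustrative; filtering and error handling omitted):

```php
// Build <name>.tar from a source directory, then gzip it to <name>.tar.gz.
// PharData can create tar archives even with phar.readonly=1, since that
// restriction only applies to executable .phar archives.
function createTarGz(string $sourceDir, string $tarPath): string
{
    $phar = new PharData($tarPath);          // e.g. storage/app/file-backups/site.tar
    $phar->buildFromDirectory($sourceDir);   // adds every file under $sourceDir
    $phar->compress(Phar::GZ);               // writes site.tar.gz next to the .tar
    return $tarPath . '.gz';                 // caller may delete the intermediate .tar
}
```

A real implementation would feed `buildFromIterator()` a filtered iterator instead, so the include/exclude rules from `file_backup_configurations` apply before anything is archived.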