1 Commits

Author: safayavatsal
Commit: a43c0c43db
feat: Add comprehensive fine-tuning framework with adapter layers
- Implement WhisperAdapter class for efficient fine-tuning
- Add AdaptedWhisperModel with selective parameter freezing
- Create FineTuningDataset for data preparation
- Include WhisperFineTuner main training class
- Support adapter saving/loading functionality
- Address GitHub Discussions #64, #759 fine-tuning requests

Features:
- Parameter-efficient fine-tuning using adapter layers
- Flexible target module selection
- Integrated training pipeline with validation
- Compatible with all Whisper model sizes
- Memory-efficient training approach
Date: 2025-10-19 23:47:14 +05:30
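The adapter approach described in this commit could be sketched roughly as below. This is a minimal, hypothetical PyTorch illustration, not the repository's actual code: the class name `WhisperAdapter` mirrors the commit message, but the bottleneck dimension, GELU activation, zero-initialization, and the name-based freezing rule in `freeze_base_model` are all assumptions.

```python
import torch
import torch.nn as nn


class WhisperAdapter(nn.Module):
    """Bottleneck adapter: down-project, nonlinearity, up-project, residual add.

    Hypothetical sketch of the adapter layer named in the commit; the real
    implementation's dimensions and placement may differ.
    """

    def __init__(self, hidden_dim: int, bottleneck_dim: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_dim, bottleneck_dim)
        self.act = nn.GELU()
        self.up = nn.Linear(bottleneck_dim, hidden_dim)
        # Zero-init the up-projection so the adapter starts as an identity
        # map and training begins from the frozen base model's behavior.
        nn.init.zeros_(self.up.weight)
        nn.init.zeros_(self.up.bias)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.up(self.act(self.down(x)))


def freeze_base_model(model: nn.Module) -> None:
    """Selective parameter freezing: only adapter parameters stay trainable.

    Assumes adapter submodules have "adapter" in their parameter names.
    """
    for name, param in model.named_parameters():
        param.requires_grad = "adapter" in name


# Demo: a toy block with one frozen base layer and one trainable adapter.
block = nn.Sequential()
block.add_module("proj", nn.Linear(8, 8))
block.add_module("adapter", WhisperAdapter(8, bottleneck_dim=4))
freeze_base_model(block)

# Adapter saving/loading would then be plain state_dict serialization,
# e.g. torch.save(block.adapter.state_dict(), "adapter.pt").
```

Because only the small down/up projections receive gradients, this is what makes the training memory-efficient: optimizer state is kept for a fraction of the model's parameters while the base weights stay frozen.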