safayavatsal a43c0c43db feat: Add comprehensive fine-tuning framework with adapter layers
- Implement WhisperAdapter class for efficient fine-tuning (see the adapter sketch below)
- Add AdaptedWhisperModel with selective parameter freezing
- Create FineTuningDataset for data preparation
- Include WhisperFineTuner main training class
- Support adapter saving/loading functionality
- Address GitHub Discussions #64, #759 fine-tuning requests
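A minimal sketch of the adapter pattern this commit implements (illustrative only: constructor arguments such as `bottleneck_dim` and the zero-initialization detail are assumptions, not necessarily the exact API added here):

```python
import torch
import torch.nn as nn

class WhisperAdapter(nn.Module):
    """Bottleneck adapter: down-project, non-linearity, up-project, residual add."""

    def __init__(self, d_model: int, bottleneck_dim: int = 64):
        super().__init__()
        self.down = nn.Linear(d_model, bottleneck_dim)
        self.act = nn.GELU()
        self.up = nn.Linear(bottleneck_dim, d_model)
        # Zero-init the up-projection so the adapter starts as an identity
        # mapping and does not perturb the pretrained Whisper weights.
        nn.init.zeros_(self.up.weight)
        nn.init.zeros_(self.up.bias)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.up(self.act(self.down(x)))
```

Each insertion point adds roughly 2 * d_model * bottleneck_dim trainable weights, so only a small fraction of the model's parameters ever need gradients.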

Features:
- Parameter-efficient fine-tuning using adapter layers
- Flexible target module selection
- Integrated training pipeline with validation
- Compatible with all Whisper model sizes
- Memory-efficient training: only adapter parameters receive gradients and optimizer state (see the freezing sketch below)
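A hedged sketch of the selective-freezing side, reusing the `WhisperAdapter` class above. The helper name `add_adapters_and_freeze` and the forward-hook wiring are illustrative assumptions; attribute paths follow the openai/whisper model layout, and the commit's actual classes may wire this differently.

```python
import torch
import torch.nn as nn
import whisper

def add_adapters_and_freeze(model: nn.Module, bottleneck_dim: int = 64) -> nn.ModuleList:
    """Freeze every pretrained weight, then attach one adapter per residual
    block via a forward hook so only adapter parameters receive gradients."""
    for p in model.parameters():
        p.requires_grad = False

    ref = next(model.parameters())  # match the pretrained device/dtype
    adapters = nn.ModuleList()
    for block in list(model.encoder.blocks) + list(model.decoder.blocks):
        adapter = WhisperAdapter(block.attn.query.in_features, bottleneck_dim)
        adapter = adapter.to(device=ref.device, dtype=ref.dtype)
        # A forward hook's return value replaces the block output.
        block.register_forward_hook(lambda _m, _i, out, a=adapter: a(out))
        adapters.append(adapter)
    return adapters

# Usage: works for any Whisper size ("tiny" ... "large"); only the adapters,
# typically a few percent of total parameters, are handed to the optimizer.
model = whisper.load_model("tiny")
adapters = add_adapters_and_freeze(model)
optimizer = torch.optim.AdamW(adapters.parameters(), lr=1e-4)
```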
2025-10-19 23:47:14 +05:30