This project, Fine-Tuning-using-ResNet50, demonstrates a comprehensive approach to classifying flower images using a pre-trained ResNet50 model.
While it does not reach 95%+ accuracy, the model achieves around 88% validation accuracy without significant overfitting, making it a practical solution for flower classification.
Introduction
ResNet50 is known for its deep residual learning framework, which uses skip (residual) connections to mitigate vanishing gradients and make very deep networks trainable.
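To make the skip-connection idea concrete, here is a minimal residual block in PyTorch. This is an illustrative sketch of the general pattern, not the exact bottleneck block ResNet50 uses (which adds 1x1 convolutions and channel expansion):

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Minimal residual block: output = ReLU(F(x) + x)."""

    def __init__(self, channels: int):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        # Skip connection: the identity path gives gradients a direct route
        # backward, which is what counters vanishing gradients at depth.
        return self.relu(out + x)

block = ResidualBlock(16)
y = block(torch.randn(2, 16, 8, 8))  # spatial size and channels are preserved
```

Because `F(x)` only has to learn a residual correction to the identity, stacking many such blocks stays trainable where a plain deep stack would degrade.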
In this project, the 102 Flower Dataset is used to showcase how effective fine-tuning can be, even with moderate hardware resources.
Objective
Data Preparation: Load and preprocess flower images (102 categories).
Model Definition: Modify the ResNet50 model to handle multi-class classification.
Training & Validation: Optimize accuracy while preventing overfitting.
Evaluation: Assess the model's performance on a validation set.
Methodology
Data Loading & Preprocessing: Split dataset into training, validation, and test sets. Apply transformations (resizing, normalization, augmentation).
Model Customization: Replace the final fully connected layer of ResNet50 to match the number of flower categories. Incorporate dropout for regularization.
Training & Optimization: Use a learning rate scheduler, define a suitable loss function (cross-entropy for multi-class classification), and use automatic mixed precision (AMP) to reduce GPU memory usage and speed up training.
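A single training epoch combining these pieces could be sketched as below. The structure (cross-entropy loss, per-epoch scheduler step, AMP via autocast plus gradient scaling) follows the step described above, but the function name and exact arrangement are illustrative:

```python
import torch
import torch.nn as nn

def train_one_epoch(model, loader, optimizer, scheduler, scaler, device):
    """One epoch of AMP training; returns the mean batch loss.

    On CUDA, autocast runs the forward pass in reduced precision,
    cutting GPU memory use; on CPU it is disabled and the loop
    falls back to full precision.
    """
    criterion = nn.CrossEntropyLoss()
    use_amp = device.type == "cuda"
    model.train()
    total_loss = 0.0
    for images, labels in loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        with torch.autocast(device_type=device.type, enabled=use_amp):
            loss = criterion(model(images), labels)
        scaler.scale(loss).backward()  # scale loss to avoid fp16 underflow
        scaler.step(optimizer)
        scaler.update()
        total_loss += loss.item()
    scheduler.step()  # assumes a per-epoch scheduler such as StepLR
    return total_loss / len(loader)
```

A `torch.cuda.amp.GradScaler` instance is created once outside the loop (with `enabled=False` on CPU) and reused across epochs so its scale factor can adapt over training.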
Evaluation & Early Stopping: Track validation performance, apply early stopping to prevent overfitting, and save the best model.
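The evaluation loop with early stopping and best-model checkpointing can be sketched as follows; the `fit` helper, its `evaluate` callback, and the patience value are hypothetical names and settings for illustration:

```python
import copy

def fit(model, evaluate, epochs, patience=5):
    """Track validation accuracy, keep the best weights, stop on plateau.

    `evaluate` is a callable returning validation accuracy for the
    current epoch; `patience` is how many epochs without improvement
    to tolerate before stopping (an assumed default).
    """
    best_acc = 0.0
    best_state = None
    epochs_without_improvement = 0
    for epoch in range(epochs):
        # ... run one training epoch here ...
        acc = evaluate()
        if acc > best_acc:
            best_acc = acc
            epochs_without_improvement = 0
            # Checkpoint the best weights so the final model is the
            # best validation performer, not the last epoch's.
            best_state = copy.deepcopy(model.state_dict())
        else:
            epochs_without_improvement += 1
            if epochs_without_improvement >= patience:
                break  # early stop: validation accuracy has plateaued
    return best_acc, best_state
```

Saving `best_state` (e.g. with `torch.save`) rather than the final weights is what prevents a late, overfit epoch from being deployed.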