This repository contains my answers to all homework assignments and my final project work from the course. Each assignment and project phase is documented in detail.
- **Homework 1: Topics Covered** (a brief illustrative sketch follows this list)
  - Multi-Layer Perceptrons (MLPs)
  - Various Activation Functions
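A minimal Keras sketch of the MLP topic above, included for quick reference; the 784-dimensional input, layer widths, and 10 output classes are assumptions for illustration, not the assignment's actual setup. It rebuilds the same architecture with different hidden activation functions so they can be compared.

```python
# Minimal sketch (assumed: 784-dim inputs, 10 classes): an MLP whose hidden
# activation can be swapped to compare activation functions.
import tensorflow as tf

def build_mlp(activation="relu", input_dim=784, num_classes=10):
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=(input_dim,)),
        tf.keras.layers.Dense(256, activation=activation),          # hidden layer 1
        tf.keras.layers.Dense(128, activation=activation),          # hidden layer 2
        tf.keras.layers.Dense(num_classes, activation="softmax"),   # class probabilities
    ])

# Compare a few activation functions by rebuilding the same architecture.
for act in ["relu", "tanh", "sigmoid"]:
    model = build_mlp(activation=act)
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
```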
- **Homework 2: Topics Covered** (see the sketch after this list)
  - Overfitting
  - Dropout Layer
  - Regularization Methods
  - Transfer Learning
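A hedged sketch of how dropout, L2 regularization, and transfer learning can be combined in Keras to fight overfitting; the MobileNetV2 base, 224x224 input size, and binary output are assumptions chosen for illustration, not necessarily what the assignment used.

```python
# Minimal sketch (assumed: 224x224 RGB images, binary task): transfer learning
# with a frozen pretrained base, plus Dropout and L2 regularization.
import tensorflow as tf

base = tf.keras.applications.MobileNetV2(include_top=False, weights="imagenet",
                                         input_shape=(224, 224, 3), pooling="avg")
base.trainable = False  # transfer learning: freeze the pretrained feature extractor

model = tf.keras.Sequential([
    base,
    tf.keras.layers.Dropout(0.5),  # dropout layer to reduce overfitting
    tf.keras.layers.Dense(64, activation="relu",
                          kernel_regularizer=tf.keras.regularizers.l2(1e-4)),  # L2 regularization
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```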
- **Homework 3: Topics Covered** (an example sketch follows this list)
  - Optimizers
  - Learning Rate Schedules
  - Convolutional Neural Networks (CNNs)
  - Convolutional Layers
  - Inception Modules
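A small Keras sketch tying several of these topics together: one Inception-style module built from parallel convolution branches, compiled with the Adam optimizer and an exponential learning rate schedule. The 32x32 input shape, filter counts, and class count are assumptions, and a full Inception module would normally also use 1x1 reduction convolutions before the larger kernels.

```python
# Minimal sketch (assumed: 32x32 RGB inputs, 10 classes): an Inception-style module
# with parallel 1x1 / 3x3 / 5x5 convolutions and pooling, concatenated channel-wise.
import tensorflow as tf
from tensorflow.keras import layers

inputs = tf.keras.Input(shape=(32, 32, 3))
x = layers.Conv2D(32, 3, padding="same", activation="relu")(inputs)

# Inception-style module: parallel branches, then channel concatenation.
b1 = layers.Conv2D(16, 1, padding="same", activation="relu")(x)
b2 = layers.Conv2D(16, 3, padding="same", activation="relu")(x)
b3 = layers.Conv2D(16, 5, padding="same", activation="relu")(x)
b4 = layers.MaxPooling2D(3, strides=1, padding="same")(x)
x = layers.Concatenate()([b1, b2, b3, b4])

x = layers.GlobalAveragePooling2D()(x)
outputs = layers.Dense(10, activation="softmax")(x)
model = tf.keras.Model(inputs, outputs)

# Learning rate schedule passed directly to the optimizer.
schedule = tf.keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate=1e-3, decay_steps=1000, decay_rate=0.9)
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=schedule),
              loss="sparse_categorical_crossentropy", metrics=["accuracy"])
```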
- **Homework 4: Topics Covered** (a KerasTuner sketch follows this list)
  - CNN Networks using KerasTuner
  - Pooling Layers
  - Recurrent Neural Networks (RNNs)
  - Long Short-Term Memory Networks (LSTMs)
  - Batch Normalization
  - Gradient-weighted Class Activation Mapping (Grad-CAM)
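A minimal KerasTuner sketch for the first item above: searching the filter count and dropout rate of a small CNN that also uses batch normalization and pooling. The 28x28 grayscale input, the search space, and the trial budget are assumptions, not the assignment's actual configuration.

```python
# Minimal sketch (assumed: keras_tuner installed, 28x28 grayscale inputs, 10 classes):
# tuning a small CNN's hyperparameters with KerasTuner.
import tensorflow as tf
import keras_tuner as kt

def build_model(hp):
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(28, 28, 1)),
        tf.keras.layers.Conv2D(hp.Int("filters", 16, 64, step=16), 3, activation="relu"),
        tf.keras.layers.BatchNormalization(),   # batch normalization
        tf.keras.layers.MaxPooling2D(),         # pooling layer
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dropout(hp.Float("dropout", 0.0, 0.5, step=0.1)),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

tuner = kt.RandomSearch(build_model, objective="val_accuracy", max_trials=5)
# tuner.search(x_train, y_train, validation_split=0.2, epochs=3)   # x_train/y_train assumed
# best_model = tuner.get_best_models(1)[0]
```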
- **Homework 5: Topics Covered** (an attention sketch follows this list)
  - Different Types of RNNs
  - Backpropagation Formulas
  - Attention Mechanism
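A short sketch of one common form of the attention mechanism, scaled dot-product attention, computing softmax(QK^T / sqrt(d_k))V; the tensor shapes and toy inputs below are assumptions for illustration only.

```python
# Minimal sketch: scaled dot-product attention over a batch of sequences.
import tensorflow as tf

def scaled_dot_product_attention(q, k, v):
    d_k = tf.cast(tf.shape(k)[-1], tf.float32)
    scores = tf.matmul(q, k, transpose_b=True) / tf.sqrt(d_k)  # (batch, q_len, k_len)
    weights = tf.nn.softmax(scores, axis=-1)                   # attention weights over keys
    return tf.matmul(weights, v), weights                      # weighted sum of the values

# Toy usage: 2 sequences, 5 time steps, 8-dimensional features (all assumed).
q = k = v = tf.random.normal((2, 5, 8))
context, attn = scaled_dot_product_attention(q, k, v)
print(context.shape, attn.shape)  # (2, 5, 8) (2, 5, 5)
```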
- **Homework 6: Topics Covered** (a metrics sketch follows this list)
  - Evaluation Metrics: Accuracy, Recall, Precision, and F1 Score
  - Generative Adversarial Networks (GANs)
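A short scikit-learn sketch of the listed evaluation metrics on a toy binary example; the labels below are made up purely for illustration.

```python
# Minimal sketch: accuracy, precision, recall, and F1 on toy binary labels.
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

y_true = [0, 1, 1, 0, 1, 1]   # assumed ground-truth labels
y_pred = [0, 1, 0, 0, 1, 1]   # assumed model predictions

print("accuracy :", accuracy_score(y_true, y_pred))
print("precision:", precision_score(y_true, y_pred))
print("recall   :", recall_score(y_true, y_pred))
print("f1       :", f1_score(y_true, y_pred))
```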
Each homework folder contains the respective Jupyter notebooks with my solutions and detailed explanations.
- **Final Project** (a rough architecture sketch follows this list)
  - Implemented a multi-label classification model using the ARMAN-EMO dataset.
  - Developed a multi-modal classification model combining ResNet for feature extraction with a customized MLP for classification.
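A rough, assumption-heavy sketch of what the multi-modal model could look like in Keras: ResNet50 as a frozen feature extractor for the image input, fused with a second input and passed to an MLP head. The second modality's size, the class count, and the head widths are placeholders, not the project's actual configuration.

```python
# Rough sketch only (assumed: image input plus a generic 128-dim feature-vector
# modality, 5 classes): ResNet features concatenated with a second input, MLP head.
import tensorflow as tf
from tensorflow.keras import layers

resnet = tf.keras.applications.ResNet50(include_top=False, weights="imagenet",
                                        input_shape=(224, 224, 3), pooling="avg")
resnet.trainable = False  # use ResNet purely for feature extraction

image_in = tf.keras.Input(shape=(224, 224, 3), name="image")
other_in = tf.keras.Input(shape=(128,), name="other_modality")  # assumed second modality

img_feat = resnet(image_in)
x = layers.Concatenate()([img_feat, other_in])      # fuse the two modalities
x = layers.Dense(256, activation="relu")(x)         # customized MLP head
x = layers.Dropout(0.3)(x)
outputs = layers.Dense(5, activation="softmax")(x)  # assumed number of classes

model = tf.keras.Model([image_in, other_in], outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
```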
This repository also includes the instructor's slides in both Persian and English, which were used throughout the course for lectures and explanations.