Administrative information
Title | Model Compression
Duration | 150 min
Module | C
Lesson Type | Practical
Focus | Technical - Future AI
Topic | Advances in ML Models through an HC Lens - A Result-Oriented Study
Keywords
model compression, pruning, quantization, knowledge distillation
Learning Goals
- Understand how to implement model compression techniques
- Grasp the advantages of pruning, quantization, and knowledge distillation
- Become familiar with a high-level framework such as TensorFlow
Expected Preparation
Learning Events to be Completed Before
Obligatory for Students
- Basic understanding of model compression concepts and techniques
- Basic understanding of how the performance of machine and deep learning models can be evaluated (e.g. accuracy, precision, recall, F-score)
- Knowledge of the Python programming language
Optional for Students
- Knowledge of the TensorFlow framework
References and Background for Students
- Knowledge of machine learning and neural network theory
Recommended for Teachers
- Recall knowledge of the TensorFlow framework and Python programming language
- Provide a practical view of the implementations needed to apply model compression techniques
- Propose pop-up quizzes
Lesson materials
The materials of this learning event are available under CC BY-NC-SA 4.0.
Instructions for Teachers
- Give a brief overview of TensorFlow 2.x
- Use Google Colab as the working Jupyter Notebook environment for the practical application
- Students should keep to the time allocated for each task
- Tasks 1 to 3 should be completed before the remaining tasks are assigned
Outline
| Duration | Description | Concepts | Activity | Material |
|---|---|---|---|---|
| 0-10 min | Introduction to the tools used and how to get hands-on right away | Tools introduction | Introduction to the main tools | |
| 10-80 min | [Task 1 - Task 3] Training a model, and then what? How to apply pruning and quantization to working models and compare their performance (see the first sketch after this table) | Pruning & Quantization | Practical session and working examples | Colab Notebook |
| 80-140 min | [Task 4 - Task 6] When can knowledge distillation be useful? How to distill knowledge from a teacher to a student model (see the second sketch after this table) | Knowledge Distillation | Practical session and working examples | Colab Notebook |
| 140-150 min | Conclusion, questions and answers | Summary | Conclusions | |
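
To give students a concrete starting point for Tasks 1-3, the sketch below shows one possible way to prune and then quantize a trained Keras model. It is a minimal illustration, assuming TensorFlow 2.x and the tensorflow-model-optimization package; the toy architecture, the 50% constant-sparsity schedule, and the choice of post-training (dynamic-range) quantization via TensorFlow Lite are illustrative assumptions, not the canonical lesson solution.

```python
# Minimal pruning + quantization sketch, assuming TensorFlow 2.x and
# tensorflow-model-optimization (pip install tensorflow-model-optimization).
import tensorflow as tf
import tensorflow_model_optimization as tfmot

# Toy dense classifier standing in for the "working model" trained earlier.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(128, activation="relu", input_shape=(784,)),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Wrap the model so that 50% of each layer's weights are pruned away
# (illustrative schedule, not prescribed by the lesson).
pruned = tfmot.sparsity.keras.prune_low_magnitude(
    model,
    pruning_schedule=tfmot.sparsity.keras.ConstantSparsity(0.5, begin_step=0),
)
pruned.compile(optimizer="adam",
               loss="sparse_categorical_crossentropy",
               metrics=["accuracy"])
# Pruning masks are updated during (re)training via a dedicated callback:
# pruned.fit(x_train, y_train, epochs=2,
#            callbacks=[tfmot.sparsity.keras.UpdatePruningStep()])

# Remove the pruning wrappers, then apply post-training (dynamic-range)
# quantization by converting to TensorFlow Lite.
final = tfmot.sparsity.keras.strip_pruning(pruned)
converter = tf.lite.TFLiteConverter.from_keras_model(final)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()
# len(tflite_model) gives the compressed size in bytes; evaluate the pruned
# model on held-out data to compare accuracy against the baseline.
```

Comparing the byte size of the converted model and its held-out accuracy against the uncompressed baseline mirrors the performance comparison the task description asks for.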
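For Tasks 4-6, the following sketch outlines the common teacher-student distillation recipe in Keras: a frozen teacher provides temperature-softened logits, and the student minimizes a weighted mix of the hard-label cross-entropy and a distillation term. The `Distiller` class is a custom construction for this sketch, the temperature (3.0) and alpha (0.1) are illustrative assumptions, and `teacher_model` / `student_model` are hypothetical pre-built models that output raw logits.

```python
# Minimal knowledge-distillation sketch in TensorFlow 2.x / Keras.
import tensorflow as tf

def distillation_loss(teacher_logits, student_logits, temperature=3.0):
    # Cross-entropy between softened teacher and student distributions
    # (equivalent to KL divergence up to a constant in the teacher).
    t = tf.nn.softmax(teacher_logits / temperature)
    log_s = tf.nn.log_softmax(student_logits / temperature)
    return -tf.reduce_mean(tf.reduce_sum(t * log_s, axis=-1)) * temperature ** 2

class Distiller(tf.keras.Model):
    def __init__(self, teacher, student, alpha=0.1, temperature=3.0):
        super().__init__()
        self.teacher, self.student = teacher, student
        self.alpha, self.temperature = alpha, temperature
        self.hard_loss = tf.keras.losses.SparseCategoricalCrossentropy(
            from_logits=True)

    def train_step(self, data):
        x, y = data
        teacher_logits = self.teacher(x, training=False)  # teacher stays frozen
        with tf.GradientTape() as tape:
            student_logits = self.student(x, training=True)
            loss = (self.alpha * self.hard_loss(y, student_logits)
                    + (1.0 - self.alpha) * distillation_loss(
                        teacher_logits, student_logits, self.temperature))
        # Only the student's weights are updated.
        grads = tape.gradient(loss, self.student.trainable_variables)
        self.optimizer.apply_gradients(
            zip(grads, self.student.trainable_variables))
        return {"loss": loss}

# Hypothetical usage with pre-built teacher_model and student_model:
# distiller = Distiller(teacher_model, student_model)
# distiller.compile(optimizer="adam")
# distiller.fit(x_train, y_train, epochs=3)
```

Scaling the soft loss by the squared temperature is the standard correction that keeps the gradient magnitudes of the two loss terms comparable as the temperature changes.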
More information
Click here for an overview of all lesson plans of the Master Human-Centred AI
Please visit the home page of the HCAIM consortium
Acknowledgements
The Human-Centered AI Masters programme was co-financed by the Connecting Europe Facility of the European Union under Grant №CEF-TC-2020-1 Digital Skills 2020-EU-IA-0068.
The materials of this learning event are available under CC BY-NC-SA 4.0.
The HCAIM consortium consists of three excellence centres, three SMEs and four universities.