Administrative information
Title | Regularization Techniques
Duration | 60 min
Module | B
Lesson Type | Tutorial
Focus | Technical - Deep Learning
Topic | Regularization Techniques
Keywords
Regularization, Callbacks, Grid Search
Learning Goals
- Examine weight initializers
- Investigate bias
- Apply dropout and noise
- Implement callbacks
- Understand and implement a grid search
- Apply non-traditional techniques for reducing overfitting
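The core ideas behind three of these goals (dropout, input noise, and an early-stopping callback) can be sketched in plain NumPy before turning to the Keras versions covered in the lesson materials. This is a minimal illustration only; the function names are hypothetical and not part of any framework API.

```python
import numpy as np

def inverted_dropout(activations, rate, rng):
    """Zero out roughly a fraction `rate` of activations and rescale the
    survivors by 1/keep_prob, so the expected activation is unchanged."""
    keep_prob = 1.0 - rate
    mask = rng.random(activations.shape) < keep_prob
    return activations * mask / keep_prob

def add_gaussian_noise(inputs, stddev, rng):
    """Additive Gaussian noise: a simple data-side regularizer."""
    return inputs + rng.normal(0.0, stddev, size=inputs.shape)

def should_stop_early(val_losses, patience):
    """Early-stopping rule: stop once the best validation loss is
    more than `patience` epochs old."""
    best_epoch = int(np.argmin(val_losses))
    return len(val_losses) - 1 - best_epoch >= patience

rng = np.random.default_rng(0)
x = np.ones((4, 8))
dropped = inverted_dropout(x, rate=0.5, rng=rng)   # entries are 0.0 or 2.0
noisy = add_gaussian_noise(x, stddev=0.1, rng=rng)
```

In Keras the same ideas correspond to the `Dropout` and `GaussianNoise` layers and the `EarlyStopping` callback, which the tutorial applies directly.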
Expected Preparation
Learning Events to be Completed Before
None.
Obligatory for Students
None.
Optional for Students
None.
References and background for students:
- John D. Kelleher and Brian Mac Namee. (2018), Fundamentals of Machine Learning for Predictive Data Analytics, MIT Press.
- Michael Nielsen. (2015), Neural Networks and Deep Learning, 1st ed., Determination Press, San Francisco, CA, USA.
- Charu C. Aggarwal. (2018), Neural Networks and Deep Learning, 1st ed., Springer.
- Antonio Gulli and Sujit Pal. Deep Learning with Keras, Packt. [ISBN: 9781787128422]
Recommended for Teachers
None.
Lesson Materials
The materials of this learning event are available under CC BY-NC-SA 4.0.
- Students will be expected to demonstrate and apply fundamental techniques and algorithms for the prediction and training of neural networks, including investigations of model bias and model regularisation. They should compare, contrast and demonstrate appropriate hyper-parameters for training artificial neural networks, with an emphasis on reproducibility (generalisability), transparency and interpretation. They will use deep learning frameworks to implement deep learning models for classification and/or regression tasks, with a strong focus on evaluating the models for both performance and ethical considerations, resulting in transparent and explainable AI. This practical requires you to develop the most suitable AI model, where every suitable method for model development should be employed to ensure the best generalisation.
- The tutorial is based on the following dataset [[1]] where the details of the attributes are contained in the lesson materials.
- The tutorial will require the following headings, where each heading should have (where appropriate) a rationale and description explaining the approach and its findings, code that executes the work, and in some cases a visual aid to support your findings. Use standard Jupyter notebook markdown to create headings and sections:
- Introduction
- Opening the dataset, Brief data exploration and data pre-processing
- Model exploration to determine network topology
- Hyperparameter investigation (learning parameters and optimization)
- Most appropriate model selection
- Grid search
- Final Model presentation and performance evaluation
- Please note that the grid search will take some time; it may therefore be completed outside of the tutorial time, or the tutorial can be conducted in a flipped-classroom mode of delivery.
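The grid search step above can be sketched with scikit-learn's `GridSearchCV`. The tutorial itself uses a Keras model and the dataset referenced in the lesson materials; here, an `MLPClassifier` on a synthetic dataset is a stand-in so the sketch stays self-contained and quick to run, with the L2 penalty `alpha` and the topology as the searched hyper-parameters.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.neural_network import MLPClassifier

# Toy stand-in data; in the tutorial the dataset from the lesson
# materials would be loaded and pre-processed here instead.
X, y = make_classification(n_samples=200, n_features=10, random_state=42)

# A deliberately small grid: regularisation strength and network topology.
# Real grids grow multiplicatively, which is why the search is slow.
param_grid = {
    "alpha": [1e-4, 1e-2],
    "hidden_layer_sizes": [(8,), (16,)],
}

search = GridSearchCV(
    MLPClassifier(max_iter=500, random_state=42),
    param_grid,
    cv=3,  # 3-fold cross-validation keeps the run short
)
search.fit(X, y)
print(search.best_params_)
```

Every combination is trained `cv` times (here 2 × 2 × 3 = 12 fits), which is why the note above recommends running the real search in advance.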
Outline
Time schedule
Duration (Min) | Description
10 | Overview of the practical; importing the dataset and basic pre-processing
10 | Models to explore topologies
20 | Hyperparameter investigation with regularisation techniques
10 | Grid search (note: this should be done in advance, either by the lecturer or by students in flipped mode, as it can take significant time to run live)
5 | Final model discussion
More information
Click here for an overview of all lesson plans of the Master Human-Centred AI.
Please visit the home page of the HCAIM consortium.
Acknowledgements
The Human-Centered AI Masters programme was co-financed by the Connecting Europe Facility of the European Union under Grant №CEF-TC-2020-1 Digital Skills 2020-EU-IA-0068.
The HCAIM consortium consists of three excellence centres, three SMEs and four universities.