Title | Forward propagation |
Duration | 60 min |
Module | B |
Lesson Type | Lecture |
Focus | Technical - Deep Learning |
Topic | Forward pass |
Forward pass, Loss,
None.
None.
References and background for students:
None.
The materials of this learning event are available under CC BY-NC-SA 4.0.
This lecture introduces students to the fundamentals of forward propagation in an artificial neural network, covering network topology (weights, synapses, activation functions) and loss functions. Students will then be able to perform a forward pass by pen and paper, in Python using only the NumPy library (for matrix manipulation), and finally in Keras as part of the tutorial associated with this learning event. This builds a fundamental understanding of which activation functions suit specific problem contexts and how activation functions differ in computational complexity. The lecture examines the output-layer activation function and the corresponding loss function for use cases such as binomial classification, regression and multi-class classification.
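The pen-and-paper forward pass described above can be sketched in NumPy. This is an illustrative example only: the layer sizes, weight values and choice of tanh for the hidden layer are assumptions, not values from the lecture materials.

```python
import numpy as np

# A minimal forward pass through one hidden layer, mirroring the
# pen-and-paper exercise. All sizes and values are illustrative.

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)

x = np.array([0.5, -1.2, 3.0])    # input vector (3 features)
W1 = rng.normal(size=(4, 3))      # hidden layer: 4 neurons
b1 = np.zeros(4)
W2 = rng.normal(size=(1, 4))      # output layer: 1 neuron
b2 = np.zeros(1)

h = np.tanh(W1 @ x + b1)          # hidden activation (tanh, assumed)
y_hat = sigmoid(W2 @ h + b2)      # sigmoid output for binomial classification
```

Each layer is just a matrix-vector product followed by an element-wise non-linearity, which is why the lecture treats matrices as the natural notation for the forward pass.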
Note:
Duration (Min) | Description |
---|---|
10 | Definition of Neural Network Components |
15 | Weights and activation functions (sigmoid, tanh and ReLU) |
15 | Loss functions (regression, binomial classification and multi-class classification) |
15 | Using matrices for a forward pass |
5 | Recap of the forward pass |
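The pairing of output activation and loss function for the three use cases in the timeline can be sketched as follows. The concrete numbers are illustrative assumptions, not examples from the lecture.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    e = np.exp(z - z.max())       # shift for numerical stability
    return e / e.sum()

# Binomial classification: sigmoid output + binary cross-entropy
y, p = 1.0, sigmoid(2.0)
bce = -(y * np.log(p) + (1 - y) * np.log(1 - p))

# Regression: linear (identity) output + squared error
target, prediction = 3.0, 2.5
mse = (target - prediction) ** 2

# Multi-class classification: softmax output + categorical cross-entropy
logits = np.array([2.0, 1.0, 0.1])
probs = softmax(logits)
cce = -np.log(probs[0])           # assuming the true class is index 0
```

Softmax outputs sum to one, which is what lets categorical cross-entropy interpret them as class probabilities.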
The Human-Centered AI Masters programme was co-financed by the Connecting Europe Facility of the European Union Under Grant №CEF-TC-2020-1 Digital Skills 2020-EU-IA-0068.
The HCAIM consortium consists of three excellence centres, three SMEs and four universities.