Course, academic year 2022/2023
  
Neural Networks - AM445004
Title: Neural Networks
Guaranteed by: Department of Computing and Control Engineering (445)
Actual: from 2022
Semester: summer
Points: summer s.:5
E-Credits: summer s.:5
Examination process: summer s.:
Hours per week, examination: summer s.:2/2, C+Ex [HT]
Capacity: unknown / unknown (unknown)
Min. number of students: unlimited
Language: English
Teaching methods: full-time
Level:  
For type: Master's (post-Bachelor)
Additional information: https://moodle.vscht.cz/enrol/index.php?id=55
Guarantor: Cejnar Pavel RNDr. Mgr. Ph.D.
Incompatibility: M445004
Interchangeability: M445004
Is incompatible with: M445004
Is interchangeable with: M445004
Annotation
Last update: Cejnar Pavel RNDr. Mgr. Ph.D. (14.06.2022)
The course focuses on understanding commonly used neural network architectures suitable for various types of problems and data. Lectures cover the necessary theory but concentrate mainly on the practical aspects of neural network design. In seminars, students train the designed neural network models and optimize them further.
Aim of the course
Last update: Cejnar Pavel RNDr. Mgr. Ph.D. (14.06.2022)

Students will be able to:

(i) select the appropriate neural network architecture for the selected data type

(ii) design the neural network model and select the appropriate optimization algorithm for training

Literature
Last update: Cejnar Pavel RNDr. Mgr. Ph.D. (14.06.2022)

Goodfellow, I., Bengio, Y., Courville, A.: Deep Learning. MIT Press, 2016. http://www.deeplearningbook.org

Learning resources
Last update: Cejnar Pavel RNDr. Mgr. Ph.D. (14.06.2022)

https://moodle.vscht.cz/enrol/index.php?id=55

Syllabus
Last update: Cejnar Pavel RNDr. Mgr. Ph.D. (14.06.2022)

Feed-forward neural networks

  • basic architectures and activation functions
  • optimization algorithms for training
  • selection of hyperparameters
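
A minimal illustrative sketch of these topics, assuming PyTorch as the Python environment; the layer sizes, activation functions, optimizer and learning rate below are arbitrary examples for illustration, not course requirements:

    import torch
    import torch.nn as nn

    # A small feed-forward network: two hidden layers with ReLU activations.
    model = nn.Sequential(
        nn.Linear(784, 128),   # 784 inputs, e.g. a flattened 28x28 image (example only)
        nn.ReLU(),
        nn.Linear(128, 64),
        nn.ReLU(),
        nn.Linear(64, 10),     # 10 output classes (example only)
    )

    # Optimizer and loss; the learning rate is a typical hyperparameter to tune.
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()

    # One training step on a random batch, to show the update loop.
    x = torch.randn(32, 784)
    y = torch.randint(0, 10, (32,))
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()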

Regularization of neural network models

  • commonly used regularization techniques - dropout, label-smoothing
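
A brief sketch of the two techniques mentioned above, again assuming PyTorch; the dropout probability and smoothing factor are example values only:

    import torch
    import torch.nn as nn

    # Dropout randomly zeroes activations during training to reduce overfitting.
    model = nn.Sequential(
        nn.Linear(784, 128),
        nn.ReLU(),
        nn.Dropout(p=0.5),      # the dropout probability is a hyperparameter; 0.5 is an example
        nn.Linear(128, 10),
    )

    # Label smoothing softens the one-hot targets inside the cross-entropy loss.
    loss_fn = nn.CrossEntropyLoss(label_smoothing=0.1)

    x = torch.randn(32, 784)
    y = torch.randint(0, 10, (32,))
    model.train()               # dropout is active only in training mode
    loss = loss_fn(model(x), y)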

Convolutional neural networks

  • convolution layers, normalization
  • architectures suitable for deep convolutional neural networks
  • pre-training and fine-tuning of deep neural networks
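
An illustrative sketch of a small convolutional model with batch normalization, plus the usual fine-tuning idea of freezing earlier layers; assumes PyTorch, with all sizes chosen as arbitrary examples:

    import torch
    import torch.nn as nn

    # Convolution + batch normalization + ReLU + pooling blocks, then a linear classifier.
    model = nn.Sequential(
        nn.Conv2d(3, 16, kernel_size=3, padding=1),
        nn.BatchNorm2d(16),
        nn.ReLU(),
        nn.MaxPool2d(2),
        nn.Conv2d(16, 32, kernel_size=3, padding=1),
        nn.BatchNorm2d(32),
        nn.ReLU(),
        nn.AdaptiveAvgPool2d(1),    # global average pooling
        nn.Flatten(),
        nn.Linear(32, 10),          # 10 classes (example only)
    )

    out = model(torch.randn(8, 3, 32, 32))   # a batch of eight 32x32 RGB images

    # Fine-tuning sketch: treat the earlier layers as pre-trained, freeze them,
    # and train only the final classification layer.
    for p in model[:-1].parameters():
        p.requires_grad = False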

Recurrent neural networks

  • basic recurrent networks and problems of their training
  • LSTM, GRU
  • bidirectional and deep recurrent networks
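
A short sketch of the recurrent layers named above, assuming PyTorch; sequence length, feature size and hidden size are example values:

    import torch
    import torch.nn as nn

    # LSTM and GRU layers; stacking (num_layers) and the bidirectional flag
    # give the deep and bidirectional variants.
    lstm = nn.LSTM(input_size=16, hidden_size=32, num_layers=2,
                   bidirectional=True, batch_first=True)
    gru = nn.GRU(input_size=16, hidden_size=32, batch_first=True)

    x = torch.randn(8, 20, 16)      # batch of 8 sequences, 20 steps, 16 features each
    out, (h, c) = lstm(x)           # out has shape (8, 20, 64) due to bidirectionality
    out_gru, h_gru = gru(x)         # out_gru has shape (8, 20, 32)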

Transformer architecture
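
A minimal sketch of a Transformer encoder built from PyTorch's standard layers (multi-head self-attention followed by a position-wise feed-forward network); the model dimensions are example values:

    import torch
    import torch.nn as nn

    layer = nn.TransformerEncoderLayer(d_model=64, nhead=4, dim_feedforward=128,
                                       batch_first=True)
    encoder = nn.TransformerEncoder(layer, num_layers=2)

    x = torch.randn(8, 20, 64)      # batch of 8 sequences, 20 tokens, embedding size 64
    out = encoder(x)                # same shape as the input: (8, 20, 64)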

Design and optimization of neural networks in various environments - Python, MATLAB

Entry requirements
Last update: Cejnar Pavel RNDr. Mgr. Ph.D. (14.06.2022)

Basic programming skills in Python or MATLAB (at least one of the two) are advisable.

Course completion requirements
Last update: Cejnar Pavel RNDr. Mgr. Ph.D. (14.06.2022)

Students pass the practicals by submitting a sufficient number of assignments, i.e. by obtaining the required number of points, including bonus points. Assignments are announced regularly throughout the semester, and students can choose which assignments to work on in order to obtain the necessary number of points. The written exam consists of questions selected at random from a previously announced set of exam questions. The exam grade can be improved, or the exam replaced altogether, by submitting an extended number of assignments (obtaining the extended number of points).

Teaching methods
Activity                                                                   Credits   Hours
Attendance at lectures                                                     1         28
Preparation for lectures, seminars, laboratories, excursion or practice    1         28
Work on an individual project                                              1         28
Preparation for the exam and taking it                                     1         28
Attendance at seminars                                                     1         28
Total                                                                      5 / 5     140 / 140
 