
A course is the basic teaching unit; it is designed as a medium through which a student acquires the comprehensive knowledge and skills indispensable in the given field. A course guarantor is responsible for the factual content of the course.
For each course, there is a department responsible for its organisation. The person responsible for timetabling in that department sets the teaching schedule and assigns an instructor and/or an examiner to each class.
The expected time demands of a course are expressed by the course attribute extent of teaching. For example, extent = 2+2 indicates two teaching hours of lectures and two teaching hours of seminars (labs) per week.
At the end of each semester, the course instructor has to evaluate the extent to which a student has acquired the expected knowledge and skills. The type of this evaluation is indicated by the attribute completion: a course can be completed by an assessment only ('pouze zápočet'), by a graded assessment ('klasifikovaný zápočet'), by an examination only ('pouze zkouška'), or by an assessment and an examination ('zápočet a zkouška').
The difficulty of a given course is measured by the number of ECTS credits.
A course is in session (i.e., teaching takes place) during a semester. Each course is offered either in the winter ('zimní') or the summer ('letní') semester of an academic year. Exceptionally, a course might be offered in both semesters.
The subject matter of a course is described in various texts.
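To make the attribute structure described above concrete, here is a minimal sketch of how such a course record could be modelled in Python. All class and field names are hypothetical illustrations for this page; the catalogue does not prescribe any schema:

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class Completion(Enum):
    """The four completion types listed above (attribute 'completion')."""
    ASSESSMENT = "Z"               # 'pouze zápočet'
    GRADED_ASSESSMENT = "KZ"       # 'klasifikovaný zápočet'
    EXAMINATION = "ZK"             # 'pouze zkouška'
    ASSESSMENT_AND_EXAM = "Z,ZK"   # 'zápočet a zkouška'

@dataclass
class Course:
    """One course record; field names are illustrative only."""
    code: str                # e.g. "MI-TNN"
    name: str
    department: str          # department responsible for the organisation
    extent: str              # e.g. "2+2": lecture + seminar hours per week
    completion: Completion
    credits: int             # ECTS credits measuring difficulty
    semester: str            # "Z" (winter), "L" (summer), or both
    instructor: Optional[str] = None  # left blank in the entry below

course = Course(
    code="MI-TNN",
    name="Theory of Neural Networks",
    department="18105",
    extent="1P+1C",
    completion=Completion.ASSESSMENT_AND_EXAM,
    credits=4,
    semester="L",
)
```

The Completion values mirror the abbreviations used in the catalogue header below (cf. "Completion: Z,ZK").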

MI-TNN Theory of Neural Networks
Instructor:
Department: 18105
Extent of teaching: 1P+1C
Completion: Z,ZK
Credits: 4
Semester: L

Annotation:
In this course, we study neural networks from the point of view of the theory of function approximation and from the point of view of probability theory. At first, we recall basic concepts pertaining to artificial neural networks, such as neurons and the connections between them, types of neurons from the point of view of signal transmission, network topology, somatic and synaptic mappings, network training, and the role of time in neural networks. In connection with network topology, we get acquainted with its transformation into a canonical topology, and in connection with somatic and synaptic mappings, with their composition into the mappings computed by the network. Finally, in connection with training, we pay attention to the problem of overtraining and to the fact that training is actually a specific optimization task, recalling the most typical objective functions and the most important optimization methods employed for neural network training. We will see the meaning of all these concepts in the context of common kinds of feedforward neural networks.

Within the topic of the approximation approach to neural networks, we first notice the connection of neural networks to expressing functions of many variables using functions of fewer variables (the Kolmogorov theorem, the Vituškin theorem). Afterwards, we will see how the universal approximation capacity of neural networks can be mathematically formalized as the density of the sets of mappings computed by neural networks in important Banach spaces of functions, in particular in the spaces of continuous functions, spaces of functions integrable with respect to a finite measure, spaces of functions with continuous derivatives, and Sobolev spaces.

Within the topic of the probabilistic approach, we first get acquainted with training based on expectation and training based on a random sample, and with the probabilistic assumptions about training data under which those two kinds of training can be employed. We will see how an estimate of the conditional expectation of network outputs conditioned on its inputs can be obtained using expectation-based learning. We recall the strong and the weak law of large numbers and get acquainted with an analogy of the strong law of large numbers for neural networks and with the assumptions for its validity. Finally, we recall the central limit theorem, get acquainted with its analogy for neural networks, with the assumptions for its validity, and with the hypothesis tests based on it. We will see how those tests can be employed to search for the topology of the network.
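As a concrete illustration of two notions from the annotation (training as an optimization task with a mean-squared-error objective, and the approximation capacity of a shallow network), the following NumPy sketch trains a one-hidden-layer tanh network by gradient descent to approximate a continuous function. The target function, network width, learning rate, and step count are arbitrary choices for illustration and are not taken from the course materials:

```python
import numpy as np

rng = np.random.default_rng(0)

# Training sample: a continuous target function on [-1, 1].
x = np.linspace(-1.0, 1.0, 200).reshape(-1, 1)
y = np.sin(3.0 * x)

# One hidden layer of 20 tanh neurons: the synaptic mapping is affine,
# the somatic mapping is tanh; the width of 20 is an arbitrary choice.
W1 = rng.normal(scale=1.0, size=(1, 20))
b1 = np.zeros(20)
W2 = rng.normal(scale=0.1, size=(20, 1))
b2 = np.zeros(1)
lr = 0.05  # gradient-descent step size (arbitrary)

for step in range(5000):
    # Forward pass: composition of synaptic and somatic mappings.
    h = np.tanh(x @ W1 + b1)        # hidden activations, shape (200, 20)
    y_hat = h @ W2 + b2             # network output, shape (200, 1)
    err = y_hat - y

    # Backward pass: gradients of the mean-squared-error objective
    # (the constant factor 2 is absorbed into the learning rate).
    n = len(x)
    dW2 = h.T @ err / n
    db2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1.0 - h**2)    # tanh'(a) = 1 - tanh(a)^2
    dW1 = x.T @ dh / n
    db1 = dh.mean(axis=0)

    # Gradient-descent update: training as a specific optimization task.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

# A small final error illustrates the approximation capacity in practice.
print(f"final MSE: {float((err**2).mean()):.5f}")
```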

Lecture syllabus:
1,2: Basic concepts of artificial neural networks.
3,4: Artificial neural networks from the point of view of function approximation theory.
5,6: Artificial neural networks from the point of view of probability theory.

Seminar syllabus:
1: Introduction to Matlab.
2: Survey of Python neural network libraries.
3,4: Shallow neural networks in Matlab.
5,6: Deep neural networks in Matlab.

Literature:
[1] M.T. Hagan, H.B. Demuth, and M.H. Beale. Neural Network Design. PWS Publishing, 1996.
[2] T. Hastie, R. Tibshirani, and J. Friedman. The Elements of Statistical Learning. Springer, 2001.
[3] M.H. Beale, M.T. Hagan, and H.B. Demuth. Deep Learning Toolbox User's Guide, Version 12. MathWorks, 2018.
[4] H. White. Artificial Neural Networks: Approximation and Learning Theory. Blackwell Publishers, 1992.

Requirements:

Course information and study materials can be found at https://courses.fit.cvut.cz/MI-TNN/

The course is also part of the following Study plans:
Study Plan       Study Branch/Specialization                  Role  Recommended semester
MI-ZI.2016       Knowledge Engineering                        V     None
MI-ZI.2018       Knowledge Engineering                        V     None
MI-SP-TI.2016    System Programming                           V     None
MI-SP-SP.2016    System Programming                           V     None
MI-SPOL.2016     Unspecified Branch/Specialisation of Study   V     None
MI-WSI-WI.2016   Web and Software Engineering                 V     None
MI-WSI-SI.2016   Web and Software Engineering                 V     None
MI-WSI-ISM.2016  Web and Software Engineering                 V     None
MI-NPVS.2016     Design and Programming of Embedded Systems   V     None
MI-PSS.2016      Computer Systems and Networks                V     None
MI-PB.2016       Computer Security                            V     None
NI-TI.2018       Computer Science                             V     2

