Wed, 13.12.2023 13:00

Dissertation Colloquium - Anton Ponomarchuk: Towards an understanding of different classes of ReLU neural networks

Dissertation colloquium of the working group "Symbolic Computation"

Anton Ponomarchuk, RICAM

Wednesday, December 13, 13:30 - 14:30
SP2 416-2

Towards an understanding of different classes of ReLU neural networks

Abstract: The universal approximation theorems show that, under some additional hypotheses, neural networks with a single hidden layer can approximate any continuous function to arbitrary precision. While a single-layer network suffices to learn smooth functions, open questions remain about networks with more than one hidden layer. What is the relation between the class of functions represented by one hidden layer and those represented by two or more hidden layers? How does a given neural network architecture constrain the topology of its decision boundary? A better understanding of the mathematical foundations (and function classes) has implications for the statistical and algorithmic aspects of learning with neural networks. This talk will present recent research on these questions, the limitations of current approaches, and possible alternatives.
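As a small illustration of the single-hidden-layer approximation property mentioned in the abstract (a sketch, not material from the talk): the snippet below builds a one-hidden-layer ReLU network with randomly drawn hidden weights and fits only the output layer by least squares to approximate a smooth target function. The width (200 units), target function, and random seed are arbitrary choices for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(z):
    # ReLU activation: max(z, 0), applied elementwise
    return np.maximum(z, 0.0)

# Smooth target function on [-pi, pi]
x = np.linspace(-np.pi, np.pi, 400)[:, None]
y = np.sin(x).ravel()

# Hidden layer: 200 ReLU units with random weights and biases
W = rng.normal(size=(1, 200))
b = rng.normal(size=200)
H = relu(x @ W + b)  # hidden-layer features, shape (400, 200)

# Fit the output weights by least squares
c, *_ = np.linalg.lstsq(H, y, rcond=None)
approx = H @ c

max_err = np.max(np.abs(approx - y))
print(f"max abs error: {max_err:.4f}")
```

Even with random (untrained) hidden weights, the fitted network approximates the target closely, which is in the spirit of the universal approximation theorems; training the hidden layer as well would only improve the fit.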