Deep Neural Network Approximation via Function Compositions

Dr. Shijun ZHANG
Date & Time
14 Dec 2023 (Thu) | 10:00 AM - 11:00 AM
Venue
Online via Zoom
Registration Link: https://cityu.zoom.us/meeting/register/tJAsf--urjsqHtRrjIq8LPssJwBVloPmBGNs

ABSTRACT

Deep neural networks have transformed areas such as image recognition and healthcare, yet their theoretical foundations remain only partially understood. This presentation explores the theory of deep neural networks, focusing on their approximation capabilities. We begin by quantifying the approximation errors of ReLU networks for several classes of target functions, including (Lipschitz) continuous functions, polynomials, and smooth functions. Using arguments based on the Vapnik-Chervonenkis (VC) dimension, we show that these error bounds are nearly optimal. To achieve better accuracy, we propose several approaches: introducing new activation functions, pre-specifying certain parameters, and sharing parameters. In particular, we construct a simple and computable activation function, named EUAF, with which a fixed-size network can approximate continuous functions to arbitrary accuracy. Finally, motivated by the widespread use of ReLU, we examine its connections to other activation functions and extend our approximation results, originally established for ReLU networks, to a broad family of activations including Sigmoid, ELU, and GELU.
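
The theme of approximation via function compositions can be illustrated with a classical construction from the deep approximation literature (often attributed to Yarotsky, 2017): composing a piecewise-linear "hat" function built from three ReLUs with itself yields an approximation of x^2 on [0, 1] whose error decays exponentially in the number of compositions, i.e. in depth. The sketch below is illustrative only and is not claimed to be the speaker's exact construction; the function names are ours.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def hat(x):
    # Piecewise-linear "hat" g(x): 2x on [0, 1/2], 2(1 - x) on [1/2, 1],
    # 0 elsewhere, written as a sum of three ReLUs (one hidden layer).
    return 2.0 * relu(x) - 4.0 * relu(x - 0.5) + 2.0 * relu(x - 1.0)

def approx_square(x, m):
    # f_m(x) = x - sum_{s=1}^{m} g_s(x) / 4**s, where g_s is the s-fold
    # composition of the hat function; each composition adds one layer.
    # On [0, 1], |x**2 - f_m(x)| <= 4**-(m + 1).
    x = np.asarray(x, dtype=float)
    out = x.copy()
    g = x
    for s in range(1, m + 1):
        g = hat(g)
        out = out - g / 4.0**s
    return out

xs = np.linspace(0.0, 1.0, 1001)
for m in (1, 3, 5):
    err = np.max(np.abs(xs**2 - approx_square(xs, m)))
    print(f"m={m}: max error {err:.2e} (bound {4.0**-(m + 1):.2e})")
```

For m compositions this reproduces the piecewise-linear interpolant of x^2 at the dyadic points k/2^m, so the printed errors stay at or just below the 4^-(m+1) bound.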
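The abstract does not give the formula for EUAF. In the related paper by Shen, Yang, and Zhang, "Deep Network Approximation: Achieving Arbitrary Accuracy with Fixed Number of Neurons" (JMLR, 2022), the activation combines a periodic triangle wave on [0, +inf) with a softsign-type tail on (-inf, 0). The sketch below follows that definition as best recalled; treat the exact formula as an assumption rather than an authoritative statement. It only evaluates the function: the arbitrary-accuracy theorem asserts the existence of suitable parameters for a fixed-size network, which a snippet cannot demonstrate.

```python
import numpy as np

def euaf(x):
    # EUAF as recalled from Shen-Yang-Zhang (JMLR 2022); formula is an
    # assumption. Triangle wave of period 2 on [0, inf): 0 at even
    # integers, 1 at odd integers; softsign x / (|x| + 1) on (-inf, 0).
    x = np.asarray(x, dtype=float)
    tri = np.abs(x - 2.0 * np.floor((x + 1.0) / 2.0))
    soft = x / (np.abs(x) + 1.0)
    return np.where(x >= 0.0, tri, soft)

print(euaf(np.array([-2.0, -0.5, 0.0, 0.5, 1.0, 1.5, 2.0])))
# approx: [-0.667, -0.333, 0.0, 0.5, 1.0, 0.5, 0.0]
```

Roughly speaking, the periodic piece is what distinguishes EUAF from standard activations: periodicity lets a single neuron "fold" the real line infinitely many times, which is the mechanism behind achieving arbitrary accuracy with a fixed number of neurons.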