At the invitation of Associate Professor Peng Li of the School of Mathematics and Statistics, Professor Haizhang Zhang of the School of Mathematics (Zhuhai), Sun Yat-sen University, will give an online academic talk from 15:00 to 16:00 on Friday, November 11, 2022.
Title: Uniform Convergence of Deep Neural Networks with Contractive Activation Functions and Poolings
Tencent Meeting ID: 897-121-032
Abstract: Deep neural networks, as a powerful system for representing high dimensional complex functions, play a key role in deep learning. Convergence of deep neural networks is a fundamental issue in building the mathematical foundation for deep learning. Existing research on this subject has only studied neural networks with the Rectified Linear Unit (ReLU) activation function, and the important pooling strategy was not considered either. The methods in those studies made use of the piecewise linearity of ReLU. On the other hand, there are over forty activation functions commonly used in artificial neural networks, many of which are nonlinear. In this paper, we study the convergence of deep neural networks as the depth tends to infinity for general activation functions, which cover most of the activation functions commonly used in artificial neural networks. Pooling is also studied. Specifically, for contractive activation functions such as the logistic sigmoid function, we establish uniform and exponential convergence of the associated deep neural networks. Such convergence is not affected when max pooling or average pooling is added to the structure of the network. We also adapt the matrix-vector multiplication approach developed in our recent papers to prove that the major condition there is still sufficient for neural networks defined by nonexpansive activation functions to converge.
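To make the contraction mechanism in the abstract concrete, here is a minimal numerical sketch, not the speaker's construction: the width d, the weight scaling, and the use of a single repeated (tied-weight) layer are illustrative assumptions. Because the logistic sigmoid has derivative at most 1/4, a layer x -> sigmoid(Wx + b) with spectral norm ||W|| < 4 is a contraction, so the outputs of deeper and deeper networks form a Cauchy sequence and converge.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8                                    # width of every hidden layer (assumption)
W = rng.standard_normal((d, d))
W *= 2.0 / np.linalg.norm(W, 2)          # spectral norm 2, so Lipschitz constant <= 2/4 = 0.5
b = rng.standard_normal(d)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def layer(x):
    # One layer x -> sigmoid(Wx + b); since sigmoid' <= 1/4, this map
    # contracts distances by at least a factor ||W||_2 / 4 = 0.5.
    return sigmoid(W @ x + b)

x = rng.standard_normal(d)               # an arbitrary network input
prev = layer(x)                          # depth-1 output
for depth in range(2, 12):
    out = layer(prev)
    # Gap between the depth-k and depth-(k-1) outputs: it shrinks
    # geometrically, so the infinitely deep limit exists.
    print(f"depth {depth:2d}: gap {np.linalg.norm(out - prev):.2e}")
    prev = out
```

The successive gaps drop by roughly a factor of one half per layer, illustrating the exponential rate stated in the abstract; the talk's results allow the weights to vary from layer to layer, which this tied-weight sketch does not capture.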
About the speaker
Haizhang Zhang is a professor at the School of Mathematics (Zhuhai), Sun Yat-sen University. His research interests include learning theory, applied harmonic analysis, and function approximation. His representative results include a Weierstrass approximation theorem for reproducing kernels and the theory of reproducing kernel Banach spaces, which he pioneered internationally. A psychological classification method based on reproducing kernel Banach spaces was included in the New Handbook of Mathematical Psychology published by Cambridge University Press. He has published original work in Journal of Machine Learning Research, Applied and Computational Harmonic Analysis, Neural Networks, Neural Computation, Neurocomputing, Journal of Approximation Theory, and other journals, with his most highly cited single paper receiving over 200 citations by others. He has directed four national grants, including an Excellent Young Scientists Fund. For details, see his homepage: https://mathzh.sysu.edu.cn/zh-hans/teacher/106
Gansu Applied Mathematics Center
Key Laboratory of Applied Mathematics and Complex Systems of Gansu Provincial Universities
Cuiying College
School of Mathematics and Statistics
November 3, 2022