Enhanced Expressive Power and Fast Training of Neural Networks by Random Projections
Speaker: Associate Professor Jian-Feng Cai
Hong Kong University of Science and Technology
Title: Enhanced Expressive Power and Fast Training of Neural Networks by Random Projections
Time: March 18, 2019, 10:00
Venue: Room 105, Laboratory Building
Abstract:
Random projections can perform dimension reduction efficiently for datasets with nonlinear low-dimensional structures. One well-known example is that random matrices embed sparse vectors into a low-dimensional subspace nearly isometrically, known as the restricted isometry property in compressed sensing. In this talk, we explore some applications of random projections in deep neural networks. We analyze the expressive power of fully connected neural networks when the input data are sparse vectors or form a low-dimensional smooth manifold. We prove that the number of neurons required to approximate a Lipschitz function with a prescribed precision depends on the sparsity or the dimension of the manifold, and only weakly on the dimension of the input vector. The key to our proof is that random projections stably embed the set of sparse vectors or a low-dimensional smooth manifold into a low-dimensional subspace. Based on this fact, we also propose some new neural network models in which, at each layer, the input is first projected onto a low-dimensional subspace by a random projection, and then the standard linear connection and non-linear activation are applied. In this way, the number of parameters in the neural network is significantly reduced, so training can be accelerated without much performance loss.
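The layer structure described in the abstract can be sketched in NumPy as below. This is an illustrative sketch only, not the speaker's implementation: the Gaussian projection matrix, the projection dimension, the ReLU activation, and the weight initialization are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_projection_layer(x, proj_dim, out_dim, rng):
    """One layer of the kind of model sketched in the abstract (hypothetical):
    a fixed (untrained) random projection to proj_dim, followed by a standard
    trainable linear connection to out_dim and a non-linear activation."""
    d = x.shape[-1]
    # Fixed Gaussian random projection, scaled so it is nearly isometric
    # on sparse / low-dimensional inputs. It is NOT a trainable parameter.
    P = rng.standard_normal((d, proj_dim)) / np.sqrt(proj_dim)
    # Trainable weights and bias (randomly initialized here for illustration).
    W = rng.standard_normal((proj_dim, out_dim)) * 0.1
    b = np.zeros(out_dim)
    z = x @ P                            # project onto low-dimensional subspace
    return np.maximum(z @ W + b, 0.0)    # linear connection + ReLU

# A sparse high-dimensional input: ambient dimension 1000, only 5 nonzeros.
x = np.zeros(1000)
x[rng.choice(1000, size=5, replace=False)] = rng.standard_normal(5)
y = random_projection_layer(x, proj_dim=20, out_dim=32, rng=rng)
print(y.shape)  # (32,)
```

The parameter saving the abstract mentions shows up directly here: a standard dense layer mapping the 1000-dimensional input to 32 units would train 1000 × 32 weights, while this layer trains only 20 × 32 weights, since the 1000 × 20 projection is fixed.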
Speaker Bio:
Jian-Feng Cai received his bachelor's degree from Fudan University and his Ph.D. from The Chinese University of Hong Kong. He subsequently held positions as a postdoctoral fellow, visiting assistant professor, and assistant professor at the National University of Singapore, the University of California, Los Angeles, and the University of Iowa, and is currently an Associate Professor in the Department of Mathematics at the Hong Kong University of Science and Technology. His research interests are mathematical methods in imaging science and data science, drawing on numerical linear algebra, optimization, computational harmonic analysis, approximation theory, and probability. He has published more than 40 papers in leading mathematics journals, including J. Amer. Math. Soc., Appl. Comput. Harmon. Anal., SIAM J. Optim., SIAM J. Sci. Comput., Math. Comp., and Numer. Math. He was named a Highly Cited Researcher in 2017 and 2018.

