Mathematical Theory and Applications ›› 2016, Vol. 36 ›› Issue (4): 92-105.



Sample Learning Based on λ-increasing Functions

Li Jingjing, Tian Dagang   

  1. Business School, University of Shanghai for Science and Technology, Shanghai 200093, China
  • Online: 2016-12-30 Published: 2020-09-27


Abstract: How to determine the structure of a neural network has long been a difficult problem in both theoretical research and practical application. Drawing on recent results of Vugar E. Ismailov, this paper studies how a neural network learns a given set of sample points. The results show that with a λ-strictly increasing activation function, any given sample set can be learned using only two hidden-layer nodes. The differences between using the usual Sigmoid function and a λ-strictly increasing function as the activation function in the hidden layer are discussed as well.
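The architecture the abstract refers to can be illustrated with a minimal sketch: a network with a single hidden layer of exactly two nodes. Note that the λ-strictly increasing activation of Ismailov's construction is defined in the cited work and is not reproduced here; this sketch substitutes an ordinary sigmoid, so it can only approximately fit a sample set by gradient descent rather than interpolate it exactly. The sample points, learning rate, and iteration count below are illustrative choices, not taken from the paper.

```python
import numpy as np

def sigmoid(z):
    """Standard logistic sigmoid, the usual hidden-layer activation."""
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)

# A small illustrative sample set (x_i, y_i) to be learned.
X = np.array([-1.0, 0.0, 1.0])
Y = np.array([0.2, 0.9, 0.4])

# Exactly two hidden nodes: input weights w1, biases b1, output weights w2.
w1 = rng.normal(size=2)
b1 = rng.normal(size=2)
w2 = rng.normal(size=2)
b2 = 0.0

def forward(x):
    """Two-hidden-node network: f(x) = w2 . sigmoid(w1 x + b1) + b2."""
    h = sigmoid(np.outer(x, w1) + b1)   # shape (n, 2)
    return h @ w2 + b2

def loss():
    """Mean squared error over the sample set."""
    return float(np.mean((forward(X) - Y) ** 2))

initial = loss()
lr = 0.5
for _ in range(2000):
    # Forward pass.
    h = sigmoid(np.outer(X, w1) + b1)
    pred = h @ w2 + b2
    # Backward pass: analytic gradients of the MSE for this tiny net.
    g = 2.0 * (pred - Y) / len(X)        # dL/dpred, shape (n,)
    dz = np.outer(g, w2) * h * (1 - h)   # gradient at hidden pre-activations
    w2 -= lr * (h.T @ g)
    b2 -= lr * g.sum()
    w1 -= lr * (X @ dz)
    b1 -= lr * dz.sum(axis=0)
```

With a sigmoid activation the fit improves but need not be exact; the paper's point is that replacing the sigmoid with a λ-strictly increasing function makes exact learning of any given sample set possible with the same two hidden nodes.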

Key words: Neural network, Neural network structure, λ-increasing function, Sigmoid function