Progressive framework for deep neural networks: from linear to non-linear
  • ISSN: 1005-8885
  • Journal: The Journal of China Universities of Posts and Telecommunications (《中国邮电高校学报:英文版》)
  • Date: 0
  • Classification: TN [Electronics and Telecommunications]
  • Author affiliations: [1] School of Information and Communication Engineering, Beijing University of Posts and Telecommunications, Beijing 100876, China; [2] Beijing Key Laboratory of Network System and Network Culture, Beijing University of Posts and Telecommunications, Beijing 100876, China
  • Funding: This work was supported by the National Natural Science Foundation of China (61471049, 61372169, 61532018), and the Postgraduate Innovation Fund of SICE, BUPT, 2015.

Abstract:

We propose a novel progressive framework to optimize deep neural networks. The idea is to combine the stability of linear methods with the ability of deep learning methods to learn complex, abstract internal representations. We insert a linear loss layer between the input layer and the first hidden non-linear layer of a traditional deep model. The loss objective for optimization is a weighted sum of the linear loss of the newly added layer and the non-linear loss of the final output layer. We modify the model structure of deep canonical correlation analysis (DCCA), i.e., adding a third semantic view to regularize text and image pairs, and embed the structure into our framework for cross-modal retrieval tasks such as text-to-image and image-to-text search. The experimental results show that the modified model outperforms similar state-of-the-art approaches on the NUS-WIDE dataset from the National University of Singapore. To validate the generalization ability of our framework, we apply it to RankNet, a ranking model optimized by stochastic gradient descent. Our method outperforms RankNet and converges more quickly, which indicates that our progressive framework can provide a better and faster solution for deep neural networks.
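The weighted-loss mechanism described in the abstract can be illustrated with a short sketch. The code below is not the authors' implementation: the class name ProgressiveMLP, the weight alpha, the layer sizes, the use of a regression loss, and the exact attachment point of the linear branch are all illustrative assumptions; it only shows the general pattern of combining a linear loss near the input with the non-linear loss at the output layer.

```python
# Minimal sketch (assumed details, not the paper's code) of a deep model with an
# auxiliary linear loss branch attached at the first affine layer; training
# minimizes a weighted sum of the linear loss and the non-linear output loss.
import torch
import torch.nn as nn

class ProgressiveMLP(nn.Module):
    def __init__(self, in_dim, hidden_dim, out_dim):
        super().__init__()
        self.first = nn.Linear(in_dim, hidden_dim)         # first (affine) hidden layer
        self.linear_head = nn.Linear(hidden_dim, out_dim)   # added linear loss layer
        self.rest = nn.Sequential(                           # remaining non-linear stack
            nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, out_dim),
        )

    def forward(self, x):
        h = self.first(x)
        # Return both the linear-branch prediction and the deep prediction.
        return self.linear_head(h), self.rest(h)

def progressive_loss(model, x, y, alpha=0.5):
    """Weighted sum of the linear-branch loss and the non-linear output-layer loss."""
    criterion = nn.MSELoss()
    y_lin, y_deep = model(x)
    return alpha * criterion(y_lin, y) + (1.0 - alpha) * criterion(y_deep, y)

# Usage sketch: one stochastic gradient descent step on random data.
model = ProgressiveMLP(in_dim=16, hidden_dim=32, out_dim=1)
opt = torch.optim.SGD(model.parameters(), lr=1e-2)
x, y = torch.randn(8, 16), torch.randn(8, 1)
loss = progressive_loss(model, x, y, alpha=0.5)
opt.zero_grad()
loss.backward()
opt.step()
```

In the paper's setting the same weighted-loss idea is applied to DCCA (with a third semantic view) and to RankNet; the regression losses above merely stand in for those objectives.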

Journal information
  • The Journal of China Universities of Posts and Telecommunications (《中国邮电高校学报:英文版》)
  • Supervised by: Ministry of Education
  • Sponsored by: Beijing University of Posts and Telecommunications, Nanjing University of Posts and Telecommunications, Chongqing University of Posts and Telecommunications, Xi'an University of Posts and Telecommunications, Changchun Institute of Posts and Telecommunications, and Shijiazhuang Posts and Telecommunications Technical College
  • Editor-in-chief: LU Yinghua
  • Address: P.O. Box 231, Beijing (Beijing University of Posts and Telecommunications)
  • Postal code: 100704
  • Email: jchupt@bupt.edu.cn
  • Telephone: 010-62282493
  • ISSN: 1005-8885
  • CN: 11-3486/TN
  • Postal distribution code: 2-629
  • Awards:
  • Indexed in: Abstract Journal of VINITI (Russia), Index Copernicus (Poland), Scopus (Netherlands), Ei Compendex (USA), Cambridge Scientific Abstracts (USA), INSPEC (UK)
  • Citations: 127