
Seven Must-Read Recent Deep Neural Network Papers | IJCAI 2020

AMiner AI
2020-09-28
Digest: the IJCAI 2020 paper series!


The AMiner platform, developed by the Department of Computer Science at Tsinghua University, is fully independently owned Chinese intellectual property. The platform contains a scientific knowledge graph of over 230 million academic papers/patents and 136 million scholars, and provides professional science-intelligence services such as scholar evaluation, expert discovery, intelligent reviewer assignment, and academic maps. Online since 2006, the system has attracted visits from over 10 million unique IPs across 220 countries/regions worldwide, with 2.3 million data downloads and over 11 million annual visits, making it an important data and experimental platform for academic search and social network mining research.

AMiner:https://www.aminer.cn/


Introduction: The International Joint Conference on Artificial Intelligence (IJCAI) is one of the premier academic conferences in artificial intelligence. It was originally held in odd-numbered years and has been held annually since 2016. Due to the pandemic, IJCAI 2020 will be held in Japan on January 5-10, 2021.


According to the AMiner-IJCAI 2020 word cloud, 小脉 found that representation learning, graph neural networks, deep reinforcement learning, and deep neural networks are among this year's hottest topics, attracting wide attention. Today 小脉 shares seven IJCAI 2020 papers on deep neural networks.

1. Paper: TRP: Trained Rank Pruning for Efficient Deep Neural Networks

Link: https://www.aminer.cn/pub/5eabf34391e011664ffd2878?conf=ijcai2020

Summary:

  • The authors implement the TRP scheme on NVIDIA 1080 Ti GPUs. For training on CIFAR-10, they start with a base learning rate of 0.1, train for 164 epochs, and decay the rate by a factor of 10 at the 82nd and 122nd epochs.

  • For ImageNet, the authors directly fine-tune the model with the TRP scheme from the pretrained baseline, using a learning rate of 0.0001 for 10 epochs.

  • The authors adopt the retrained data-independent decomposition as the basic method.
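The CIFAR-10 schedule described above (base rate 0.1, decayed by 10x at the 82nd and 122nd epochs) can be sketched as a small helper; the function name is illustrative, not from the paper:

```python
def step_lr(epoch, base_lr=0.1, milestones=(82, 122), factor=0.1):
    """Step learning-rate schedule: start at base_lr and multiply
    by `factor` each time a milestone epoch has been reached."""
    lr = base_lr
    for m in milestones:
        if epoch >= m:
            lr *= factor
    return lr
```

This mirrors the common multi-step decay policy (e.g. PyTorch's MultiStepLR) that the reported settings correspond to.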


2. Paper: Closing the Generalization Gap of Adaptive Gradient Methods in Training Deep Neural Networks

Link: https://www.aminer.cn/pub/5e67654091e011e0d179097c?conf=ijcai2020

Summary:

  • The authors empirically evaluate the proposed algorithm by training various modern deep learning models and testing them on several standard benchmarks. They show that for the nonconvex loss functions of deep learning, the proposed algorithm still enjoys a fast convergence rate, while its generalization performance is as good as SGD with momentum and much better than that of existing adaptive gradient methods such as Adam and Amsgrad.

  • The reported training-loss curves compare SGD-Momentum, Adam, Amsgrad, AdamW, Yogi, AdaBound, and Padam.
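The Padam family compared above interpolates between SGD with momentum and Adam by raising the adaptive denominator to a partial power. As a hedged sketch of one such scalar update (hyperparameters are illustrative, and the paper's method additionally differs in details such as an Amsgrad-style maximum on the second moment):

```python
def padam_step(w, g, m, v, t, lr=0.1, b1=0.9, b2=0.999, p=0.125, eps=1e-8):
    """One partially adaptive (Padam-style) update on a scalar weight.

    The adaptive denominator is raised to a partial power p in [0, 1/2]:
    p -> 0 approaches SGD with momentum, p = 1/2 recovers an Adam-like step.
    """
    m = b1 * m + (1 - b1) * g          # first-moment (momentum) estimate
    v = b2 * v + (1 - b2) * g * g      # second-moment estimate
    m_hat = m / (1 - b1 ** t)          # bias corrections
    v_hat = v / (1 - b2 ** t)
    w = w - lr * m_hat / (v_hat + eps) ** p
    return w, m, v
```

Iterating this step on f(w) = w^2 (gradient g = 2w) steadily shrinks |w| toward the minimizer at 0.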


3. Paper: Multi-scale Two-way Deep Neural Network for Stock Trend Prediction

Link: https://www.aminer.cn/pub/5ef96b048806af6ef277224d?conf=ijcai2020

Summary:

  • 5.1 Datasets and Settings: The authors test the model on two datasets: FI-2010 [Ntakaris et al., 2018] and CSI-2016.

  • FI-2010 is the first publicly available benchmark dataset of high-frequency Limit Order Book (LOB) data.

  • It comprises approximately 4.5 million events for 5 stocks over 10 consecutive days.

  • Every 10 non-overlapping events are officially represented as a 144-D feature vector.

  • Experimental settings on this dataset are as follows.
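The grouping convention above (every 10 non-overlapping events yielding one feature vector) can be sketched as follows; the actual 144-D feature extraction is part of the FI-2010 release and is not reproduced here, so the windows are simply stacked:

```python
import numpy as np

def group_events(events, group_size=10):
    """Split a stream of LOB events (rows) into non-overlapping
    windows of `group_size` events, dropping any trailing remainder.
    Each window corresponds to one feature vector in FI-2010."""
    n = len(events) // group_size * group_size
    return events[:n].reshape(-1, group_size, events.shape[1])
```

For example, 25 events with 4 raw features each produce 2 complete windows of 10 events, with the last 5 events discarded.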



4. Paper: HyperNews: Simultaneous News Recommendation and Active-Time Prediction via a Double-Task Deep Neural Network

Link: https://www.aminer.cn/pub/5ef96b048806af6ef27721b0?conf=ijcai2020

Summary:

  • Fig. 1a shows the overall framework of the proposed method, HyperNews, which can simultaneously conduct news recommendation and active-time prediction in the manner of multitask learning.

  • The two tasks share the same inputs, i.e., news and user attributes including ‘news-title’, ‘news-content’, ‘news-category’, ‘publish-time’, ‘click-time’, and ‘active-time’.
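A minimal sketch of such a double-task design, assuming a shared encoder feeding two task-specific heads (all names, sizes, and random weights below are illustrative, not the paper's architecture):

```python
import numpy as np

rng = np.random.default_rng(0)

# Shared input features (e.g. encoded news/user attributes).
x = rng.normal(size=(8, 16))          # batch of 8 samples, 16-D input

# One shared encoder, two task-specific heads.
W_shared = rng.normal(size=(16, 32))
W_click = rng.normal(size=(32, 1))    # recommendation head (click score)
W_time = rng.normal(size=(32, 1))     # active-time regression head

h = np.maximum(x @ W_shared, 0.0)     # shared ReLU representation
click_score = 1.0 / (1.0 + np.exp(-(h @ W_click)))  # probability in (0, 1)
active_time = h @ W_time              # unbounded regression output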


5. Paper: Direct Quantization for Training Highly Accurate Low Bit-width Deep Neural Networks

Link: https://www.aminer.cn/pub/5ef96b048806af6ef2772109?conf=ijcai2020

Summary:

  • Experimental results demonstrate that the proposed method achieves state-of-the-art image classification accuracy compared with other low bit-width networks on the CIFAR-100 and ImageNet datasets, using AlexNet, ResNet, and MobileNetV2 architectures.

  • The authors' method is 0.3% lower in Top-1 accuracy than QIL for AlexNet, while achieving 0.3% higher accuracy than QIL for ResNet-18.

  • In comparison with Group-Net (4 bases) on ResNet-18, the method is 0.3% lower in Top-1 accuracy.
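As a generic illustration of what low bit-width means here (this is plain uniform quantization of weights, not the paper's direct quantization training procedure):

```python
import numpy as np

def quantize_uniform(w, bits=2):
    """Uniformly quantize values in [-1, 1] onto the 2**bits
    evenly spaced levels representable at the given bit-width."""
    levels = 2 ** bits - 1
    w = np.clip(w, -1.0, 1.0)
    return np.round((w + 1.0) / 2.0 * levels) / levels * 2.0 - 1.0
```

At 2 bits, every weight collapses onto one of only four values ({-1, -1/3, 1/3, 1}), which is why retaining accuracy at such bit-widths is the central challenge the paper addresses.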


6. Paper: DeepView: Visualizing Classification Boundaries of Deep Neural Networks as Scatter Plots Using Discriminative Dimensionality Reduction

Link: https://www.aminer.cn/pub/5ef96b048806af6ef2772107?conf=ijcai2020

Summary:

  • In this work the authors propose DeepView, to the best of their knowledge the first algorithm able to visualize a smooth two-dimensional manifold of the decision function of a deep neural network trained on high-dimensional data such as natural images.

  • For this purpose, the authors adopt a mathematically precise formulation of discriminative dimensionality reduction (DiDi) together with a matching choice of inverse dimensionality reduction.

  • The authors believe the presented approach can provide insights into trained models and contribute to improving them, e.g. by revealing areas where data is lacking.
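The core visualization step can be sketched as follows, assuming some inverse dimensionality reduction is available (here it is supplied by the caller as `inverse_dr`; DeepView's actual DiDi-based mappings are not reproduced):

```python
import numpy as np

def decision_map(classify, inverse_dr, xlim, ylim, res=50):
    """Sample a regular 2-D grid in the projected space, map each grid
    point back to data space with `inverse_dr`, and record the
    classifier's prediction so the grid can be rendered as coloured
    decision regions behind a scatter plot."""
    xs = np.linspace(*xlim, res)
    ys = np.linspace(*ylim, res)
    grid = np.array([[x, y] for y in ys for x in xs])
    return classify(inverse_dr(grid)).reshape(res, res)
```

With a toy inverse mapping and classifier this already yields a label grid whose boundaries trace the classifier's decision function in the 2-D view.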


7. Paper: Spectral Pruning: Compressing Deep Neural Networks via Spectral Analysis and its Generalization Error

Link: https://www.aminer.cn/pub/5ef96b048806af6ef277217c?conf=ijcai2020

Summary:

  • The authors proposed a simple pruning algorithm for compressing a network and gave its approximation and generalization error bounds using the degrees of freedom.

  • Unlike the existing compression-based generalization error analyses, this analysis is compatible with a practically useful method and further gives a tighter intrinsic dimensionality bound.

  • The proposed algorithm is easy to implement and requires only linear-algebraic operations.

  • The numerical experiments showed that compression ability is related to the eigenvalue distribution, and that the algorithm performs favorably compared to existing methods.
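As an illustration of how such pruning can need only linear algebra, a covariance-spectrum-based selection of hidden units might look like this (the scoring rule below is a hypothetical stand-in, not the paper's exact criterion):

```python
import numpy as np

def prune_by_spectrum(H, keep):
    """Rank hidden units via the eigen-spectrum of their activation
    covariance and return the indices of the `keep` units with the
    largest energy along the top eigen-directions."""
    C = H.T @ H / len(H)                     # activation covariance
    eigvals, eigvecs = np.linalg.eigh(C)     # ascending eigenvalues
    # energy of each unit within the top-`keep` eigen-directions
    scores = (eigvecs[:, -keep:] ** 2 * eigvals[-keep:]).sum(axis=1)
    return np.argsort(scores)[-keep:]
```

Units whose activations carry most of the spectral energy are kept, which matches the paper's observation that compressibility is governed by the eigenvalue distribution.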


Bonus!!!

Scan the QR code to search for more deep neural network papers.


For more great IJCAI 2020 papers, please visit:

https://www.aminer.cn/conf/ijcai2020/papers

Add "小脉" on WeChat and leave the message "IJCAI 2020" to join the IJCAI 2020 discussion group and chat face-to-face with IJCAI 2020 paper authors!

Read the original article for more IJCAI 2020 conference papers!
[Disclaimer] Content sourced from the Internet.