[Fri 26 Jun 2015 ~ Thu 2 Jul 2015] Deep Learning in arxiv

Natural Neural Networks

Another impressive piece of work from Google DeepMind.

The Projected Natural Gradient Descent algorithm (PRONG) outperforms SGD; the performance boost offered by batch normalization (BN) is cited as evidence that such whitening-style reparameterizations help.
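The BN connection can be illustrated with a minimal NumPy sketch of a batch-normalization pass over a mini-batch; the function and variable names here are illustrative, not taken from the paper:

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    # Normalize each feature over the mini-batch (zero mean, unit variance),
    # then apply a learned per-feature affine transform (gamma, beta).
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta

# Toy mini-batch: 4 examples, 3 features
x = np.array([[1.0, 2.0,  3.0],
              [2.0, 4.0,  6.0],
              [3.0, 6.0,  9.0],
              [4.0, 8.0, 12.0]])
out = batch_norm(x, gamma=np.ones(3), beta=np.zeros(3))
print(out.mean(axis=0))  # ~0 per feature
print(out.std(axis=0))   # ~1 per feature
```

PRONG generalizes this idea: instead of only standardizing each unit independently, it periodically whitens layer inputs so that the gradient steps approximate natural gradient descent.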

Deep Convolutional Matching

DEEP-PLANT: PLANT IDENTIFICATION WITH CONVOLUTIONAL NEURAL NETWORKS

Supervised Learning of Semantics-Preserving Hashing via Deep Neural Networks for Large-Scale Image Search

AttentionNet: Aggregating Weak Directions for Accurate Object Detection

