
Resource search results

  1. H.265/HEVC standard white paper (January 2013)

  2. The next-generation video compression standard. The H.265 coding standard was only ratified this year; I could not find the white paper anywhere for days, so I am uploading it now, since quite a few people are probably looking for it too. Table of contents: CONTENTS Page Abstract i 0 Introduction 1 0.1 General 1 0.2 Prologue 1 0.3 Purpose 1 0.4 Applications 1 0.5 Publication and versions of this Specification 1 0.6 Profiles, tiers and level
  3. Category: Other

    • Published: 2013-04-13
    • File size: 4194304 bytes
    • Uploader: newthinker_wei
  1. Nine landmark papers in deep learning

  2. A Fast Learning Algorithm for Deep Belief Nets (2006) - the first paper to propose layer-wise greedy pretraining, opening up the field of deep learning. Layer-wise pretrained Restricted Boltzmann Machines (RBMs) are stacked to form a Deep Belief Network (DBN), with labels added when training the top-level RBM; the whole DBN is then fine-tuned (a minimal sketch of this layer-wise pretraining follows this entry). On MNI
  3. Category: Other

    • Published: 2013-12-15
    • File size: 3145728 bytes
    • Uploader: luoyun614
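A minimal sketch of the layer-wise greedy pretraining described in the entry above, assuming a small NumPy RBM trained with one-step contrastive divergence (CD-1) and stacked layer by layer. The layer sizes, learning rate, epoch count, and random binary inputs are illustrative assumptions, not the paper's reference setup; the supervised fine-tuning of the resulting DBN is only indicated in a comment.

    # Layer-wise greedy pretraining sketch: train one RBM per layer with CD-1,
    # then feed its hidden activations to the next RBM (assumed toy setup).
    import numpy as np

    rng = np.random.default_rng(0)

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    class RBM:
        def __init__(self, n_visible, n_hidden, lr=0.1):
            self.W = 0.01 * rng.standard_normal((n_visible, n_hidden))
            self.b_v = np.zeros(n_visible)   # visible bias
            self.b_h = np.zeros(n_hidden)    # hidden bias
            self.lr = lr

        def hidden_probs(self, v):
            return sigmoid(v @ self.W + self.b_h)

        def visible_probs(self, h):
            return sigmoid(h @ self.W.T + self.b_v)

        def cd1_step(self, v0):
            h0 = self.hidden_probs(v0)                        # positive phase
            h0_sample = (rng.random(h0.shape) < h0).astype(float)
            v1 = self.visible_probs(h0_sample)                # one Gibbs step
            h1 = self.hidden_probs(v1)
            self.W   += self.lr * (v0.T @ h0 - v1.T @ h1) / len(v0)
            self.b_v += self.lr * (v0 - v1).mean(axis=0)
            self.b_h += self.lr * (h0 - h1).mean(axis=0)

    data = (rng.random((256, 784)) < 0.1).astype(float)       # stand-in binary inputs
    layer_sizes = [784, 500, 200]                             # assumed architecture
    rbms, layer_input = [], data
    for n_vis, n_hid in zip(layer_sizes[:-1], layer_sizes[1:]):
        rbm = RBM(n_vis, n_hid)
        for epoch in range(5):                                # a few epochs, greedily per layer
            rbm.cd1_step(layer_input)
        rbms.append(rbm)
        layer_input = rbm.hidden_probs(layer_input)           # activations feed the next layer
    # The stacked RBMs form a DBN; supervised fine-tuning of the whole stack
    # (e.g. backprop with labels at the top) would follow, as the entry notes.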
  1. Greedy Layer-Wise Training of Deep Networks

  2. Greedy Layer-Wise Training of Deep Networks
  3. Category: Other

    • Published: 2014-08-22
    • File size: 118784 bytes
    • Uploader: honghf123
  1. Nine landmark papers in deep learning

  2. A Fast Learning Algorithm for Deep Belief Nets (2006) - the first paper to propose layer-wise greedy pretraining, opening up the field of deep learning. Layer-wise pretrained Restricted Boltzmann Machines (RBMs) are stacked to form a Deep Belief Network (DBN), with labels added when training the top-level RBM; the whole DBN is then fine-tuned. On MNI
  3. Category: Professional Guidance

    • Published: 2014-09-17
    • File size: 3145728 bytes
    • Uploader: hgj3804278
  1. Three papers that founded deep learning theory

  2. The three papers that established the theory of deep learning, including A fast learning algorithm for deep belief nets, Efficient Learning of Sparse Representations with an Energy-Based Model, and Greedy Layer-Wise Training of Deep Networks.
  3. Category: Lecture Notes

    • Published: 2014-11-23
    • File size: 901120 bytes
    • Uploader: yangkequn
  1. Greedy Layer-Wise Training

  2. Complexity theory of circuits strongly suggests that deep architectures can be much more efficient (sometimes exponentially so) than shallow architectures in terms of the computational elements required to represent some functions. Deep multi-layer neural netwo
  3. Category: Professional Guidance

    • Published: 2015-05-08
    • File size: 317440 bytes
    • Uploader: millelee
  1. Deep Learning Methods and Applications

  2. Contents: 1 Introduction (198); 1.1 Definitions and background (198); 1.2 Organization of this monograph (202); 2 Some Historical Context of Deep Learning (205); 3 Three Classes of Deep Learning Networks (214); 3
  3. Category: Professional Guidance

    • Published: 2015-05-20
    • File size: 8388608 bytes
    • Uploader: lengwuqin
  1. Deep Learning Tutorial

  2. Contents: 1 LICENSE (1); 2 Deep Learning Tutorials (3); 3 Getting Started (5); 3.1 Download (5); 3.2 Datasets
  3. Category: Professional Guidance

    • Published: 2015-05-20
    • File size: 1048576 bytes
    • Uploader: lengwuqin
  1. Learning Deep Architectures for AI--content

  2. Contents: 1 Introduction (2); 1.1 How do We Train Deep Architectures? (5); 1.2 Intermediate Representations: Sharing Features and Abstractions Across Tasks (7); 1.3 Desiderata for Learning AI (10); 1.4 Outline of the Paper (11); 2 Theoretical Advantages of Deep Arc
  3. Category: Professional Guidance

    • Published: 2015-05-20
    • File size: 1048576 bytes
    • Uploader: lengwuqin
  1. Nine landmark articles on deep learning (DL)

  2. Some landmark deep learning papers. A Fast Learning Algorithm for Deep Belief Nets (2006) - the first paper to propose layer-wise greedy pretraining, opening up the field of deep learning. Layer-wise pretrained Restricted Boltzmann Machines (RBMs) are stacked to form a Deep Belief Network (DBN), with labels added when training the top-level RBM; afterwards the whole
  3. Category: Professional Guidance

    • Published: 2015-09-14
    • File size: 3145728 bytes
    • Uploader: zyf19930610
  1. 100篇之外深度学习.zip (deep learning papers beyond the top 100)

  2. New papers (from within the last 6 months): Batch Renormalization: Towards Reducing Minibatch Dependence in Batch-Normalized Models, S. Ioffe. Wasserstein GAN, M. Arjovsky et al. Understanding deep learning requires rethinking generalization, C. Zhang et al. [pdf] Older papers (before 2012): An
  3. Category: Other

    • Published: 2017-02-22
    • File size: 35651584 bytes
    • Uploader: oscer2016
  1. Deep learning Methods and Applications

  2. A major work on deep learning and its applications by Li Deng of Microsoft, focused mainly on speech. Table of Contents: Chapter 1 Introduction (5); 1.1 Definitions and Background
  3. Category: Deep Learning

    • Published: 2017-12-22
    • File size: 3145728 bytes
    • Uploader: wangdq_1989
  1. k7 SRIO reference example

  2. Core name: Xilinx LogiCORE Serial RapidIO. Version: 5.5. Release Date: April 19, 2010. This document contains the following sections: 1. Introduction 2. New Features 3. Su
  3. Category: Hardware Development

    • Published: 2018-07-27
    • File size: 433152 bytes
    • Uploader: cleve_mfr
  1. COMMON LAYER INTERFACE (CLI).pdf

  2. The Common Layer Interface (CLI) is a universal format for the input of geometry data to model fabrication systems based on layer manufacturing technologies (LMT). It is suitable for systems using layer-wise photo-curing of resin, sintering or bindi
  3. Category: Manufacturing

    • Published: 2018-10-19
    • File size: 217088 bytes
    • Uploader: qq_43462040
  1. TensorFlow.js Machine Learning for the Web and Beyond.pdf

  2. TensorFlow.js is Google's JavaScript library built on TensorFlow. It is convenient for developers who work in JS and can also support edge computing in the future. TensorFlow.js: Machine Learning for the Web and Beyond; acceleration, notably TensorFire (Kwok et al., 2017), Propel; Layers API, which provides higher-level model buildin
  3. Category: Machine Learning

    • Published: 2019-07-15
    • File size: 581632 bytes
    • Uploader: nicholaskong
  1. Learning Lightweight Lane Detection CNNs by Self Attention Distillation.pdf

  2. Training deep models for lane detection is challenging due to the very subtle and sparse supervisory signals inherent in lane annotations. Without learning from much richer context, these models often fail in challenging scenarios, e.g., severe o
  3. Category: Deep Learning

    • Published: 2020-04-20
    • File size: 3145728 bytes
    • Uploader: iOrigin
  1. Algorithms for hyper-parameter optimization

  2. Algorithms for hyper-parameter optimization.pdf, a technical paper describing the TPE procedure used in Bayesian optimization (a minimal usage sketch of TPE follows this entry). The contribution of this work is two novel strategies for approximating f by modeling H: a hierarchical Gaussian Process and a tree-structured Parzen estimator. These are described in
  3. Category: Other

    • Published: 2019-09-03
    • File size: 274432 bytes
    • Uploader: yangtao_whut
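A minimal usage sketch of the tree-structured Parzen estimator (TPE) referenced in the entry above, using the hyperopt library as one widely available implementation; the library choice and the toy quadratic objective are assumptions, not something the entry specifies.

    # TPE sketch: minimize a toy 1-D objective; TPE models the densities of
    # good vs. bad trials and proposes the next point by expected improvement.
    from hyperopt import fmin, tpe, hp, Trials

    def objective(x):
        return (x - 2.0) ** 2              # any scalar loss works here

    space = hp.uniform("x", -10.0, 10.0)   # prior over the hyper-parameter
    trials = Trials()
    best = fmin(fn=objective, space=space, algo=tpe.suggest,
                max_evals=100, trials=trials)
    print(best)                            # e.g. {'x': 1.98...}, near the minimum at x = 2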
  1. On the Origin of Deep Learning (深度学习的起源.pdf)

  2. 深度学习的起源.pdf: ON THE ORIGIN OF DEEP LEARNING. Table 1: Major milestones that will be covered in this paper. Year / Contributor / Contribution: 300 BC, Aristotle, introduced Associationism, started the history of humanity's attempts to understand the brain; 1873, Alexander
  3. Category: Other

    • Published: 2019-07-03
    • File size: 5242880 bytes
    • Uploader: python_gogogo
  1. Multi-layer convolutional spiking neural networks (多层卷积脉冲神经网络.pdf)

  2. Spiking neural networks (SNNs) have advantages over traditional, non-spiking networks with respect to bio-realism, potential for low-power hardware implementations, and theoretical computing power. However, in practice, spiking networks with multi
  3. Category: Deep Learning

    • Published: 2020-07-27
    • File size: 1048576 bytes
    • Uploader: mfysxf
  1. Model pruning study notes 4 - Layer-wise Pruning and Auto-tuning of Layer-wise Learning Rates

  2. Layer-wise Pruning and Auto-tuning of Layer-wise Learning Rates in Fine-tuning of Deep Networks. This pruning paper came out just last month, from a team at Seoul National University. Paper download: https://arxiv.org/abs/2002.06048. Layer-wise pruning + AutoLR: layer-wise pruning and automatic tuning of layer-wise learning rates during fine-tuning of deep networks. The method prunes layer by layer and automatically adjusts the per-layer learning rates to improve fine-tuning performance and reduce network complexity (a minimal sketch of per-layer learning rates follows this entry). Abstract
  3. Category: Other

    • Published: 2021-01-07
    • File size: 38912 bytes
    • Uploader: weixin_38651468
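A minimal sketch of the per-layer learning-rate mechanism that the notes above build on, using standard PyTorch parameter groups. This does not implement the paper's AutoLR tuning rule or its pruning criterion; the model, layer split, and learning-rate values are illustrative assumptions.

    # Assign a different learning rate to each layer group during fine-tuning.
    import torch
    import torch.nn as nn

    model = nn.Sequential(                     # stand-in for a pretrained backbone + head
        nn.Linear(128, 64), nn.ReLU(),
        nn.Linear(64, 32), nn.ReLU(),
        nn.Linear(32, 10),
    )

    param_groups = [
        {"params": model[0].parameters(), "lr": 1e-4},   # lower layer: small lr
        {"params": model[2].parameters(), "lr": 5e-4},   # middle layer
        {"params": model[4].parameters(), "lr": 1e-2},   # task head: larger lr
    ]
    optimizer = torch.optim.SGD(param_groups, lr=1e-3, momentum=0.9)  # lr here is only a default

    x, y = torch.randn(16, 128), torch.randint(0, 10, (16,))
    loss = nn.CrossEntropyLoss()(model(x), y)
    loss.backward()
    optimizer.step()                           # each group is updated with its own learning rate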