
Resource list

  1. Natural-Logarithm-Rectified Activation Function in Convolutional Neural Networks

     Activation functions play a key role in providing remarkable performance in deep neural networks, and the rectified linear unit (ReLU) is one of the most widely used activation functions. Various new activation functions and improvements on ReLU have …

     Category: Other

    • Date posted: 2021-02-08
    • File size: 589,824 bytes
    • Uploader: weixin_38636655
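The abstract above mentions ReLU; the paper's title suggests a variant that composes the rectifier with a natural logarithm. A minimal sketch of that idea, assuming the form f(x) = ln(a · max(0, x) + 1); the exact definition used in the paper, and the names `relu`, `nl_relu`, and the scale parameter `a`, are assumptions not stated in this listing:

```python
import math

def relu(x):
    # Standard rectified linear unit: max(0, x).
    return max(0.0, x)

def nl_relu(x, a=1.0):
    # Assumed natural-logarithm-rectified unit: ln(a * max(0, x) + 1).
    # Negative inputs map to 0 exactly as in ReLU; positive inputs
    # are compressed logarithmically, with `a` controlling the scale.
    return math.log(a * relu(x) + 1.0)
```

Under this form, nl_relu(0) = 0 and the function grows without bound but ever more slowly, which is the usual motivation for log-based rectifiers: taming large activations while keeping the ReLU-style hard zero for negative inputs.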