File name:
Batched Sparse Matrix Multiplication for Accelerating Graph Convolutional Networks (PPT)
Development tool:
File size: 2 MB
Downloads: 0
Uploaded: 2019-08-03
Description: Batched Sparse Matrix Multiplication for Accelerating Graph Convolutional Networks (a PDF version of the author's presentation slides).

Formulation of Graph Convolution
Y = AXW, where A is the adjacency matrix (sparse) and X is the feature matrix (dense). Each (batch, channel) pair combines one dense MatMul with one SpMM:

GraphConvolution(A, X, W, bias)
  for b ← 0 to batchsize
    do for ch ← 0 to channel
      do U ← MatMul(X[b], W[ch])
         B ← Add(bias[ch], U)
         Y[b][ch] ← SpMM(A[b], B)
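The batched formulation above can be sketched in plain Python with SciPy sparse matrices. This is a minimal sketch, not the authors' implementation; the function name and the concrete shapes (batchsize, channel, feature sizes) are illustrative assumptions.

```python
import numpy as np
from scipy.sparse import random as sparse_random

def graph_convolution(A, X, W, bias):
    """Batched graph convolution: Y[b][ch] = A[b] @ (X[b] @ W[ch] + bias[ch]).

    A    : list of sparse adjacency matrices, one per batch element (n x n)
    X    : dense feature tensor, shape (batchsize, n, f_in)
    W    : dense weight tensor, shape (channel, f_in, f_out)
    bias : dense bias tensor, shape (channel, f_out)
    """
    batchsize, channel = len(A), W.shape[0]
    n, f_out = X.shape[1], W.shape[2]
    Y = np.zeros((batchsize, channel, n, f_out))
    for b in range(batchsize):
        for ch in range(channel):
            U = X[b] @ W[ch]     # dense MatMul
            B = U + bias[ch]     # add bias (broadcast over rows)
            Y[b, ch] = A[b] @ B  # SpMM with the sparse adjacency matrix
    return Y

# usage with small random inputs (shapes chosen arbitrarily)
rng = np.random.default_rng(0)
A = [sparse_random(5, 5, density=0.3, format="csr", random_state=0)
     for _ in range(2)]
X = rng.standard_normal((2, 5, 4))
W = rng.standard_normal((3, 4, 6))
bias = rng.standard_normal((3, 6))
Y = graph_convolution(A, X, W, bias)
```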
Sub-Warp-Assigned (SWA) SpMM for COO
Assign a sub-warp of subWarp ← min 2^t s.t. nB ≤ 2^t threads to each non-zero element of the input sparse matrix
Reduce instructions for memory access to the same non-zero element
Atomic addition to the output matrix with atomicAdd()
SWA_SpMM(C, A, B, subWarp)
  // set matrix C to 0
  i ← threadId
  nzid ← i / subWarp
  rid ← idxA[nzid * 2]
  cid ← idxA[nzid * 2 + 1]
  val ← valuesA[nzid]
  for j ← (i % subWarp) to nB by subWarp
    do atomicAdd(C[rid][j], val * B[cid][j])
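A sequential Python emulation of the COO kernel above may clarify the thread-to-work mapping. This is a sketch under the assumption that threads are simulated by an outer loop, so atomicAdd becomes a plain in-place addition (there is no concurrency here); the function name is hypothetical.

```python
import numpy as np

def swa_spmm_coo(idxA, valuesA, B, sub_warp, mA):
    """Emulate SWA SpMM for COO: each group of `sub_warp` consecutive thread
    IDs handles one non-zero (nzid = i / subWarp), and thread i covers the
    output columns (i % subWarp), (i % subWarp) + subWarp, ...

    idxA    : flat array [row0, col0, row1, col1, ...] of non-zero coordinates
    valuesA : non-zero values of the sparse matrix A
    B       : dense input matrix, shape (n, nB)
    mA      : number of rows of A
    """
    nB = B.shape[1]
    nnz = len(valuesA)
    C = np.zeros((mA, nB))           # set matrix C to 0
    for i in range(nnz * sub_warp):  # one iteration per simulated thread
        nzid = i // sub_warp
        rid = idxA[nzid * 2]
        cid = idxA[nzid * 2 + 1]
        val = valuesA[nzid]
        for j in range(i % sub_warp, nB, sub_warp):
            C[rid][j] += val * B[cid][j]  # atomicAdd on the GPU
    return C

# usage: a small 2x3 sparse matrix in COO form
A_dense = np.array([[0., 2., 0.],
                    [1., 0., 3.]])
rows, cols = np.nonzero(A_dense)
idxA = np.column_stack([rows, cols]).ravel()  # [row0, col0, row1, col1, ...]
valuesA = A_dense[rows, cols]
B = np.arange(12.).reshape(3, 4)
C = swa_spmm_coo(idxA, valuesA, B, sub_warp=2, mA=2)
```

Because different non-zeros in the same row map to different sub-warps, two threads can target the same output element, which is why the real kernel needs atomicAdd.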
Sub-Warp-Assigned (SWA) SpMM for CSR
Assign a sub-warp to each row of the input sparse matrix
Reduce instructions for memory access to the same non-zero element
Atomic-free addition to the output matrix

SWA_SpMM_CSR(C, A, B, subWarp)
  // set matrix C to 0
  i ← threadId
  rid ← i / subWarp
  for nzid ← rptA[rid] to rptA[rid + 1]
    do cid ← colidsA[nzid]
       val ← valuesA[nzid]
       for j ← (i % subWarp) to nB by subWarp
         do C[rid][j] ← C[rid][j] + val * B[cid][j]
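The CSR variant can be emulated the same way; a sketch, again with threads simulated by a loop and a hypothetical function name. Since a whole row of A belongs to one sub-warp and each thread owns a disjoint set of output columns, every C[rid][j] has exactly one writer, so no atomics are needed:

```python
import numpy as np

def swa_spmm_csr(rptA, colidsA, valuesA, B, sub_warp):
    """Emulate SWA SpMM for CSR: a sub-warp of `sub_warp` threads is assigned
    to each row of A (rid = i / subWarp); thread i accumulates the columns
    (i % subWarp), (i % subWarp) + subWarp, ... of output row rid.

    rptA    : CSR row pointers, length (rows of A) + 1
    colidsA : CSR column indices of the non-zeros
    valuesA : CSR non-zero values
    B       : dense input matrix, shape (n, nB)
    """
    nB = B.shape[1]
    mA = len(rptA) - 1
    C = np.zeros((mA, nB))          # set matrix C to 0
    for i in range(mA * sub_warp):  # one iteration per simulated thread
        rid = i // sub_warp
        for nzid in range(rptA[rid], rptA[rid + 1]):
            cid = colidsA[nzid]
            val = valuesA[nzid]
            for j in range(i % sub_warp, nB, sub_warp):
                C[rid][j] += val * B[cid][j]  # plain, atomic-free addition
    return C

# usage: the matrix [[0, 2, 0], [1, 0, 3]] in CSR form
rptA = [0, 1, 3]
colidsA = [1, 0, 2]
valuesA = [2., 1., 3.]
B = np.arange(12.).reshape(3, 4)
C = swa_spmm_csr(rptA, colidsA, valuesA, B, sub_warp=2)
```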
Efficient Use of Shared Memory
Utilize shared memory for the output matrix: (a) for small matrices, (b) cache blocking
Reduces the overhead of the CUDA kernel launch for initializing the output matrix
Hardware support for atomic operations on shared memory
Cache blocking optimization for larger inputs:
  Divide the output matrix along the columns
  A larger output matrix can then be placed in shared memory
  Also improves the locality of memory access to the input dense matrix
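The column-wise cache blocking described above can be sketched sequentially as follows. The per-block output buffer stands in for the shared-memory tile; the function name and the block width are assumptions for illustration:

```python
import numpy as np

def blocked_spmm_csr(rptA, colidsA, valuesA, B, block_cols):
    """SpMM over CSR input, computed one column block of the output at a time.

    On a GPU, C_block would live in shared memory: dividing the output along
    the columns lets a larger output tile fit on-chip, and each pass touches
    only a narrow column slice of the dense input B, improving locality.
    """
    mA, nB = len(rptA) - 1, B.shape[1]
    C = np.empty((mA, nB))
    for start in range(0, nB, block_cols):
        end = min(start + block_cols, nB)
        C_block = np.zeros((mA, end - start))  # the "shared memory" tile
        for rid in range(mA):
            for nzid in range(rptA[rid], rptA[rid + 1]):
                C_block[rid] += valuesA[nzid] * B[colidsA[nzid], start:end]
        C[:, start:end] = C_block              # flush tile to global memory
    return C

# usage: the matrix [[0, 2, 0], [1, 0, 3]] in CSR form, nB = 4 split as 3 + 1
rptA = [0, 1, 3]
colidsA = [1, 0, 2]
valuesA = [2., 1., 3.]
B = np.arange(12.).reshape(3, 4)
C = blocked_spmm_csr(rptA, colidsA, valuesA, B, block_cols=3)
```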
[Figure: threads read the sparse input matrix and the dense input matrix from global memory; the dense output matrix is staged in shared memory on each SM]