Preface
Graph neural networks (GNNs) are a powerful class of neural networks that operate on graph-structured data. They learn node representations (embeddings) by aggregating information from each node's local neighborhood, a concept known in the graph representation learning literature as "message passing".
Author | Ebrahim Pichka

Messages (embeddings) are passed between the nodes of the graph through multiple GNN layers. Each node aggregates the messages from its neighbors to update its own representation. This process is repeated across layers, allowing nodes to obtain representations that encode richer information about the graph. Major GNN variants include GraphSAGE [2] and the Graph Convolutional Network [3].
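As a generic sketch of this idea (informal notation, not tied to any particular GNN variant), one round of message passing for node i can be written as:

```latex
% informal sketch: one message-passing layer for node i
h_i^{(l+1)} = \mathrm{UPDATE}\Big( h_i^{(l)},\; \mathrm{AGGREGATE}\big(\{\, h_j^{(l)} : j \in \mathcal{N}(i) \,\}\big) \Big)
```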
Graph Attention Networks (GAT) [1] are a special class of GNNs whose main improvement lies in how messages are passed. They introduce a learnable attention mechanism that assigns a weight to every source-target node pair, so that when aggregating messages from its local neighborhood a node can decide which neighbors are more important, instead of aggregating information from all neighbors with equal weight.

Graph attention networks outperform many other GNN models on tasks such as node classification, link prediction, and graph classification. They have also demonstrated state-of-the-art performance on several benchmark graph datasets.
In this article, we will walk through the key parts of the original "Graph Attention Networks" paper (by Veličković et al.) and implement the concepts it proposes in PyTorch, in order to get a better grasp of the GAT method.
Introduction of the Paper
After an extensive review of existing approaches in the graph representation learning literature in Section 1, "Introduction", the paper introduces the Graph Attention Network (GAT).

The paper then compares its approach with some existing methods and points out their general similarities and differences. This is a common format for papers, so we will not dwell on it here.
The GAT Architecture
This section is the main part of the paper, giving a detailed account of the architecture of the graph attention network. To set the stage, assume the proposed architecture operates on a graph with N nodes, V = {v_i}, i = 1, ..., N, where each node i is represented by a feature vector h_i of F elements, and arbitrary edges may exist between nodes.

The authors first describe a single graph attention layer and how it operates (since it is the basic building block of a graph attention network). In general, a single GAT layer takes as input a graph with given node embeddings (representations), propagates information to the local neighbor nodes, and outputs updated node representations.

As mentioned above, all input node feature vectors h_i of a GAT layer are first linearly transformed (i.e., multiplied by a weight matrix W). In PyTorch, this is usually done as follows:

```python
import torch
from torch import nn

# in_features -> F and out_feature -> F'
in_features = ...
out_feature = ...

# instantiate the learnable weight matrix W (F x F')
W = nn.Parameter(torch.empty(size=(in_features, out_feature)))

# Initialize the weight matrix W
nn.init.xavier_normal_(W)

# multiply W and h (h is the input feature matrix of all nodes -> N x F)
h_transformed = torch.mm(h, W)
```
Having obtained the transformed version of the input node features (embeddings), let us first jump ahead to see and understand what the final goal of a GAT layer is.
As described in the paper, at the end of a graph attention layer, for each node i we need to obtain a new feature vector from its neighborhood that is more structure- and context-aware.
This is done by computing a weighted sum of the neighboring nodes' features, followed by a nonlinear activation function σ. In the graph ML literature, this weighted sum is also known as the "aggregation" step of a general GNN layer.
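In the paper's notation, the updated embedding of node i is:

```latex
h_i' = \sigma\!\left( \sum_{j \in \mathcal{N}_i} \alpha_{ij} \, \mathbf{W} h_j \right)
```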
These weights α_ij ∈ [0, 1] are learned and computed through an attention mechanism that expresses the importance of the features of node i's neighbor j during message passing and aggregation.

For every pair of a node i and one of its neighbors j, these attention weights α_ij are computed as follows:
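Restating the paper's normalization:

```latex
\alpha_{ij} = \mathrm{softmax}_j(e_{ij}) = \frac{\exp(e_{ij})}{\sum_{k \in \mathcal{N}_i} \exp(e_{ik})}
```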

where e_ij is the attention score. After applying the softmax function, all weights lie in the interval [0, 1] and sum to 1. The attention score e_ij between each node i and its neighbor j ∈ N_i is now computed by an attention function a(...), as follows:
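Restating the paper's scoring function:

```latex
e_{ij} = \mathrm{LeakyReLU}\!\left( \mathbf{a}^{\top} \left[ \mathbf{W} h_i \,\Vert\, \mathbf{W} h_j \right] \right)
```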

In the expression above, || denotes the concatenation of the two transformed node embeddings, and a is a learnable parameter vector (the attention parameters) of size 2F' (twice the size of the transformed embeddings). aᵀ is the transpose of a, so the whole expression aᵀ[Wh_i || Wh_j] is the dot (inner) product between "a" and the concatenation of the transformed embeddings.
The whole operation is illustrated below:

In PyTorch we take a slightly different approach: it is more efficient to compute e_ij for all pairs of nodes and then keep only the scores that correspond to existing edges. To compute all e_ij:
```python
# instantiate the learnable attention parameter vector `a`
a = nn.Parameter(torch.empty(size=(2 * out_feature, 1)))
# Initialize the parameter vector `a`
nn.init.xavier_normal_(a)

leakyrelu = nn.LeakyReLU(0.2)  # negative slope of 0.2, as in the paper

# we obtained `h_transformed` in the previous code snippet
# calculating the dot product of all node embeddings
# and the first half of the attention vector parameters (corresponding to neighbor messages)
source_scores = torch.matmul(h_transformed, a[:out_feature, :])

# calculating the dot product of all node embeddings
# and the second half of the attention vector parameters (corresponding to target nodes)
target_scores = torch.matmul(h_transformed, a[out_feature:, :])

# broadcast add: (N, 1) + (1, N) -> (N, N)
e = source_scores + target_scores.T
e = leakyrelu(e)
```
The last part of the snippet (# broadcast add) adds all pairwise source and target scores, producing an N x N matrix that contains all e_ij scores (shown in the figure below).

So far we have acted as if the graph were fully connected and computed attention scores between all possible node pairs. In most cases, however, the graph is not fully connected. To address this, after applying the LeakyReLU activation to the attention scores, the scores are masked based on the existing edges of the graph, meaning we only keep the scores that correspond to existing edges.
This is done by assigning a large negative score (approximating -∞) to the elements of the score matrix between nodes that are not connected by an edge, so that their corresponding attention weights become zero after the softmax (remember the attention masks we discussed in earlier posts? This is exactly the same idea).
The attention mask here is implemented using the graph's adjacency matrix. The adjacency matrix is an N x N matrix with a 1 at row i, column j if there is an edge between nodes i and j, and 0 elsewhere. So we create the mask by assigning -∞ to the zero elements of the adjacency matrix and 0 elsewhere, add the mask to the score matrix, and then apply the softmax over its rows.
```python
import torch.nn.functional as F

connectivity_mask = -9e16 * torch.ones_like(e)

# adj_mat is the N by N adjacency matrix
e = torch.where(adj_mat > 0, e, connectivity_mask)  # masked attention scores

# attention coefficients are computed as a softmax over each row
# (i.e., over the neighbors j of each node i)
attention = F.softmax(e, dim=-1)
```
Finally, following the paper, after obtaining the attention scores and masking them with the existing edges, we obtain the attention weights α_ij by applying a softmax over the rows of the score matrix.

The complete process is visualized in the figure below:

The last step is to compute the weighted sum of the node embeddings:
```python
# final node embeddings are computed as a weighted average of the features of their neighbors
h_prime = torch.matmul(attention, h_transformed)
```
The above is the workflow and principle of a single attention head. The paper also introduces the concept of multi-head attention, in which all of these operations are carried out through multiple parallel streams of computation.

The multi-head attention and aggregation process is illustrated in the figure below:

Multi-head attention (with K = 3 heads) by node 1 on its neighborhood; different arrow styles and colors denote independent attention computations. The aggregated features from each head are concatenated or averaged to obtain h'_1.
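Written out, the paper's multi-head aggregation is: hidden layers concatenate the K heads, while the final (prediction) layer averages them before the nonlinearity:

```latex
% concatenation of K heads (hidden layers)
h_i' = \big\Vert_{k=1}^{K} \, \sigma\!\left( \sum_{j \in \mathcal{N}_i} \alpha_{ij}^{k} \, \mathbf{W}^{k} h_j \right)

% averaging over K heads (final prediction layer)
h_i' = \sigma\!\left( \frac{1}{K} \sum_{k=1}^{K} \sum_{j \in \mathcal{N}_i} \alpha_{ij}^{k} \, \mathbf{W}^{k} h_j \right)
```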
To encapsulate the implementation in a more concise, modular form (as a PyTorch module) and to incorporate multi-head attention, the whole graph attention layer is implemented as follows:
```python
import torch
from torch import nn
import torch.nn.functional as F

################################
###  GAT LAYER DEFINITION   ###
################################

class GraphAttentionLayer(nn.Module):

    def __init__(self, in_features: int, out_features: int,
                 n_heads: int, concat: bool = False, dropout: float = 0.4,
                 leaky_relu_slope: float = 0.2):
        super(GraphAttentionLayer, self).__init__()

        self.n_heads = n_heads  # Number of attention heads
        self.concat = concat    # whether to concatenate the final attention heads
        self.dropout = dropout  # Dropout rate

        if concat:  # concatenating the attention heads
            self.out_features = out_features  # Number of output features per node
            assert out_features % n_heads == 0  # Ensure that out_features is a multiple of n_heads
            self.n_hidden = out_features // n_heads
        else:  # averaging output over the attention heads (used in the main paper)
            self.n_hidden = out_features

        # A shared linear transformation, parametrized by the weight matrix W, is applied to every node
        # Initialize the weight matrix W
        self.W = nn.Parameter(torch.empty(size=(in_features, self.n_hidden * n_heads)))

        # Initialize the attention weights a
        self.a = nn.Parameter(torch.empty(size=(n_heads, 2 * self.n_hidden, 1)))

        self.leakyrelu = nn.LeakyReLU(leaky_relu_slope)  # LeakyReLU activation function
        self.softmax = nn.Softmax(dim=1)  # softmax activation function for the attention coefficients

        self.reset_parameters()  # Reset the parameters

    def reset_parameters(self):
        nn.init.xavier_normal_(self.W)
        nn.init.xavier_normal_(self.a)

    def _get_attention_scores(self, h_transformed: torch.Tensor):
        source_scores = torch.matmul(h_transformed, self.a[:, :self.n_hidden, :])
        target_scores = torch.matmul(h_transformed, self.a[:, self.n_hidden:, :])

        # broadcast add
        # (n_heads, n_nodes, 1) + (n_heads, 1, n_nodes) = (n_heads, n_nodes, n_nodes)
        e = source_scores + target_scores.mT
        return self.leakyrelu(e)

    def forward(self, h: torch.Tensor, adj_mat: torch.Tensor):
        n_nodes = h.shape[0]

        # Apply the linear transformation to the node features -> W h
        # output shape (n_nodes, n_hidden * n_heads)
        h_transformed = torch.mm(h, self.W)
        h_transformed = F.dropout(h_transformed, self.dropout, training=self.training)

        # splitting the heads by reshaping the tensor and putting the heads dim first
        # output shape (n_heads, n_nodes, n_hidden)
        h_transformed = h_transformed.view(n_nodes, self.n_heads, self.n_hidden).permute(1, 0, 2)

        # getting the attention scores
        # output shape (n_heads, n_nodes, n_nodes)
        e = self._get_attention_scores(h_transformed)

        # Set the attention score for non-existent edges to -9e16 (MASKING NON-EXISTENT EDGES)
        connectivity_mask = -9e16 * torch.ones_like(e)
        e = torch.where(adj_mat > 0, e, connectivity_mask)  # masked attention scores

        # attention coefficients are computed as a softmax over each row
        # (i.e., over the neighbors j of each node i)
        attention = F.softmax(e, dim=-1)
        attention = F.dropout(attention, self.dropout, training=self.training)

        # final node embeddings are computed as a weighted average of the features of their neighbors
        h_prime = torch.matmul(attention, h_transformed)

        # concatenating/averaging the attention heads
        # output shape (n_nodes, out_features)
        if self.concat:
            h_prime = h_prime.permute(1, 0, 2).contiguous().view(n_nodes, self.out_features)
        else:
            h_prime = h_prime.mean(dim=0)

        return h_prime
```
Finally, we put all of the code above together into a complete GAT model:
```python
class GAT(nn.Module):

    def __init__(self,
                 in_features,
                 n_hidden,
                 n_heads,
                 num_classes,
                 concat=False,
                 dropout=0.4,
                 leaky_relu_slope=0.2):
        super(GAT, self).__init__()

        # Define the Graph Attention layers
        self.gat1 = GraphAttentionLayer(
            in_features=in_features, out_features=n_hidden, n_heads=n_heads,
            concat=concat, dropout=dropout, leaky_relu_slope=leaky_relu_slope
        )
        self.gat2 = GraphAttentionLayer(
            in_features=n_hidden, out_features=num_classes, n_heads=1,
            concat=False, dropout=dropout, leaky_relu_slope=leaky_relu_slope
        )

    def forward(self, input_tensor: torch.Tensor, adj_mat: torch.Tensor):
        # Apply the first Graph Attention layer
        x = self.gat1(input_tensor, adj_mat)
        x = F.elu(x)  # Apply the ELU activation function to the output of the first layer

        # Apply the second Graph Attention layer
        x = self.gat2(x, adj_mat)

        return F.softmax(x, dim=1)  # Apply the softmax activation function
```
Comparison with Other Methods
The authors compare GATs with several other existing GNN methods/architectures:
- Because GATs can compute the attention weights and perform the local aggregation in parallel, they are computationally more efficient than some existing methods.
- GATs can assign different importance to a node's neighbors when aggregating messages, which enables a leap in model capacity and improves interpretability.
- GAT operates on the whole neighborhood of a node (it does not require sampling from the neighborhood) and does not assume any ordering within it.
- By setting the pseudo-coordinate function to u(x, y) = f(x) || f(y), where f(x) denotes the (possibly MLP-transformed) features of node x and || is concatenation, and the weight function to w_j(u) = softmax(MLP(u)), GAT can be reformulated as a particular instance of MoNet (Monti et al., 2016).
Benchmarks
In the third section of the paper, the authors describe the benchmarks, datasets, and tasks used to evaluate GAT, and then present their evaluation results.
The benchmark datasets used in the paper fall into two types of tasks: transductive and inductive.
Inductive learning: a supervised learning task in which the model is trained only on a set of labeled training examples, and the trained model is then evaluated and tested on examples that were entirely unobserved during training. This is the type of learning known as ordinary supervised learning.
Transductive learning: in this type of task, all of the data, including the training, validation, and test instances, are used during training, but at each stage the model only has access to the corresponding set of labels. That is, during training the model is optimized only on the loss produced by the training instances and their labels, while the test and validation features are still used for message passing, mainly because of the structural and contextual information present in those examples. A minimal sketch of this masked-loss training loop is shown after this paragraph.
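As an illustration only (this loop is not from the paper), one way to train the GAT model transductively on a single graph is to run the forward pass over all nodes but compute the loss only over a training mask; `h`, `adj_mat`, `labels`, and `train_mask` are assumed to be given, and the optimizer settings roughly follow the paper's Cora setup:

```python
import torch
import torch.nn.functional as F

# assumed given: h (N x F features), adj_mat (N x N), labels (N,), train_mask (N,) boolean
model = GAT(in_features=h.shape[1], n_hidden=16, n_heads=4,
            num_classes=int(labels.max()) + 1)
optimizer = torch.optim.Adam(model.parameters(), lr=0.005, weight_decay=5e-4)

model.train()
for epoch in range(200):
    optimizer.zero_grad()
    out = model(h, adj_mat)  # message passing uses *all* nodes (transductive)
    # the loss is computed only on the labeled training nodes;
    # the model outputs probabilities, so take the log for NLL
    loss = F.nll_loss(out[train_mask].log(), labels[train_mask])
    loss.backward()
    optimizer.step()
```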
The paper evaluates GATs on four benchmark datasets, three of which correspond to transductive learning and one of which serves as an inductive learning task.
The transductive learning datasets, namely the Cora, Citeseer, and Pubmed (Sen et al., 2008) datasets, are all citation graphs in which the nodes are published documents, the edges (connections) are the citations between them, and the node features are elements of a bag-of-words representation of the documents.
The inductive learning dataset is a protein-protein interaction (PPI) dataset containing graphs of different human tissues (Zitnik & Leskovec, 2017). A detailed description of the datasets is given below:

The authors report the following performance on the four benchmarks, showing the results of GATs compared with existing GNN methods.


Summary
By reading through this article and trying out the code, we hope you have gained a solid understanding of how GATs work and how to apply them in practical scenarios.
The complete code for this article is available here:
References
\[1\] Graph Attention Networks (2017), Petar Veličković, Guillem Cucurull, Arantxa Casanova, Adriana Romero, Pietro Liò, Yoshua Bengio. arXiv:1710.10903v3
\[2\] Inductive Representation Learning on Large Graphs (2017), William L. Hamilton, Rex Ying, Jure Leskovec. arXiv:1706.02216v4
\[3\] Semi-Supervised Classification with Graph Convolutional Networks (2016), Thomas N. Kipf, Max Welling. arXiv:1609.02907v4
"https://link.zhihu.com/?target=http%3A//mp.weixin.qq.com/s%3F__biz%3DMzkyMDE2OTA3Mw%3D%3D%26mid%3D2247512430%26idx%3D2%26sn%3Dff27fdc6eb74ec38f128d110f6d552d9%26chksm%3Dc1940e70f6e3876699e85558b46758c21195135b8085376b65a04fca88744642afe47e9fce41%26scene%3D21%23wechat_redirect") [比Meta「分割一切AI」更全能!港科大版图像分割AI来了:实现更强粒度和语义功能](https://link.juejin.cn?target=https%3A%2F%2Flink.zhihu.com%2F%3Ftarget%3Dhttp%253A%2F%2Fmp.weixin.qq.com%2Fs%253F__biz%253DMzkyMDE2OTA3Mw%253D%253D%2526mid%253D2247512430%2526idx%253D3%2526sn%253De45f12d6604d4fb5edf2860c40d4bf9c%2526chksm%253Dc1940e70f6e387665c87e88df792e4835778974b88156ec52f5e422c150667ad33600fb820ef%2526scene%253D21%2523wechat_redirect "https://link.zhihu.com/?target=http%3A//mp.weixin.qq.com/s%3F__biz%3DMzkyMDE2OTA3Mw%3D%3D%26mid%3D2247512430%26idx%3D3%26sn%3De45f12d6604d4fb5edf2860c40d4bf9c%26chksm%3Dc1940e70f6e387665c87e88df792e4835778974b88156ec52f5e422c150667ad33600fb820ef%26scene%3D21%23wechat_redirect") [Segment Anything项目整理汇总](https://link.juejin.cn?target=https%3A%2F%2Flink.zhihu.com%2F%3Ftarget%3Dhttp%253A%2F%2Fmp.weixin.qq.com%2Fs%253F__biz%253DMzkyMDE2OTA3Mw%253D%253D%2526mid%253D2247512327%2526idx%253D1%2526sn%253D75a63eebe08b0b404a3ca7a9ecc85e4a%2526chksm%253Dc1940e19f6e3870fe88f236ab1eed9563b11638cda89f51459e45c0e32b21d50b958e157a586%2526scene%253D21%2523wechat_redirect "https://link.zhihu.com/?target=http%3A//mp.weixin.qq.com/s%3F__biz%3DMzkyMDE2OTA3Mw%3D%3D%26mid%3D2247512327%26idx%3D1%26sn%3D75a63eebe08b0b404a3ca7a9ecc85e4a%26chksm%3Dc1940e19f6e3870fe88f236ab1eed9563b11638cda89f51459e45c0e32b21d50b958e157a586%26scene%3D21%23wechat_redirect") [Meta Segment Anything会让CV没前途吗?](https://link.juejin.cn?target=https%3A%2F%2Flink.zhihu.com%2F%3Ftarget%3Dhttp%253A%2F%2Fmp.weixin.qq.com%2Fs%253F__biz%253DMzkyMDE2OTA3Mw%253D%253D%2526mid%253D2247512327%2526idx%253D2%2526sn%253D5c537d415ef9d973d8432e4a459b5373%2526chksm%253Dc1940e19f6e3870f079851c9bcf6f6b08847ad5479feb73ee49d1fb5013fddb41d959cc030ce%2526scene%253D21%2523wechat_redirect "https://link.zhihu.com/?target=http%3A//mp.weixin.qq.com/s%3F__biz%3DMzkyMDE2OTA3Mw%3D%3D%26mid%3D2247512327%26idx%3D2%26sn%3D5c537d415ef9d973d8432e4a459b5373%26chksm%3Dc1940e19f6e3870f079851c9bcf6f6b08847ad5479feb73ee49d1fb5013fddb41d959cc030ce%26scene%3D21%23wechat_redirect") [CVPR'2023年AQTC挑战赛第一名解决方案:以功能-交互为中心的时空视觉语言对齐方法](https://link.juejin.cn?target=https%3A%2F%2Flink.zhihu.com%2F%3Ftarget%3Dhttp%253A%2F%2Fmp.weixin.qq.com%2Fs%253F__biz%253DMzkyMDE2OTA3Mw%253D%253D%2526mid%253D2247512327%2526idx%253D3%2526sn%253D5b0c46e2204e7ab3df8ded5b3ad911d7%2526chksm%253Dc1940e19f6e3870fb9d9539b62e088b9066c7a5ff04fac3cda08ab56e26af175ad7cb1a39e96%2526scene%253D21%2523wechat_redirect "https://link.zhihu.com/?target=http%3A//mp.weixin.qq.com/s%3F__biz%3DMzkyMDE2OTA3Mw%3D%3D%26mid%3D2247512327%26idx%3D3%26sn%3D5b0c46e2204e7ab3df8ded5b3ad911d7%26chksm%3Dc1940e19f6e3870fb9d9539b62e088b9066c7a5ff04fac3cda08ab56e26af175ad7cb1a39e96%26scene%3D21%23wechat_redirect") [6万字!30个方向130篇 \| CVPR 2023 最全 AIGC 论文汇总](https://link.juejin.cn?target=https%3A%2F%2Flink.zhihu.com%2F%3Ftarget%3Dhttp%253A%2F%2Fmp.weixin.qq.com%2Fs%253F__biz%253DMzkyMDE2OTA3Mw%253D%253D%2526mid%253D2247512236%2526idx%253D1%2526sn%253Da403001f38ac8472e6788bfe1d522708%2526chksm%253Dc1940fb2f6e386a42517b30f7d4365512d5677a927e9425c21202d1a79606c1c485403a5647b%2526scene%253D21%2523wechat_redirect 
"https://link.zhihu.com/?target=http%3A//mp.weixin.qq.com/s%3F__biz%3DMzkyMDE2OTA3Mw%3D%3D%26mid%3D2247512236%26idx%3D1%26sn%3Da403001f38ac8472e6788bfe1d522708%26chksm%3Dc1940fb2f6e386a42517b30f7d4365512d5677a927e9425c21202d1a79606c1c485403a5647b%26scene%3D21%23wechat_redirect") [知识蒸馏最新进展](https://link.juejin.cn?target=https%3A%2F%2Flink.zhihu.com%2F%3Ftarget%3Dhttp%253A%2F%2Fmp.weixin.qq.com%2Fs%253F__biz%253DMzkyMDE2OTA3Mw%253D%253D%2526mid%253D2247512069%2526idx%253D1%2526sn%253Dd8a81fff38b45989e619c4434d0a0aec%2526chksm%253Dc1940f1bf6e3860d5a1f563e4a2240fc74bd3717f1aa8b00908dc32e56ed8766a0d8522d2d19%2526scene%253D21%2523wechat_redirect "https://link.zhihu.com/?target=http%3A//mp.weixin.qq.com/s%3F__biz%3DMzkyMDE2OTA3Mw%3D%3D%26mid%3D2247512069%26idx%3D1%26sn%3Dd8a81fff38b45989e619c4434d0a0aec%26chksm%3Dc1940f1bf6e3860d5a1f563e4a2240fc74bd3717f1aa8b00908dc32e56ed8766a0d8522d2d19%26scene%3D21%23wechat_redirect") [ICCV2023 | 当尺度感知调制遇上Transformer,会碰撞出怎样的火花?](https://link.juejin.cn?target=https%3A%2F%2Flink.zhihu.com%2F%3Ftarget%3Dhttp%253A%2F%2Fmp.weixin.qq.com%2Fs%253F__biz%253DMzkyMDE2OTA3Mw%253D%253D%2526mid%253D2247512400%2526idx%253D1%2526sn%253D9f5c980ab009ff1e1b23d9f9023e6d64%2526chksm%253Dc1940e4ef6e387585137285bfdd86931f4cb4851ce22028fe407f5f8c69a5f2320a5d460eb96%2526scene%253D21%2523wechat_redirect "https://link.zhihu.com/?target=http%3A//mp.weixin.qq.com/s%3F__biz%3DMzkyMDE2OTA3Mw%3D%3D%26mid%3D2247512400%26idx%3D1%26sn%3D9f5c980ab009ff1e1b23d9f9023e6d64%26chksm%3Dc1940e4ef6e387585137285bfdd86931f4cb4851ce22028fe407f5f8c69a5f2320a5d460eb96%26scene%3D21%23wechat_redirect") [CVPR 2023 \| 完全无监督的视频物体分割 RCF](https://link.juejin.cn?target=https%3A%2F%2Flink.zhihu.com%2F%3Ftarget%3Dhttp%253A%2F%2Fmp.weixin.qq.com%2Fs%253F__biz%253DMzkyMDE2OTA3Mw%253D%253D%2526mid%253D2247512400%2526idx%253D2%2526sn%253D31963dd77c0f6981e5351b869466c386%2526chksm%253Dc1940e4ef6e38758f5312b5ddfa746584de29a4a3e439a0d9490b35134acf1115d79fe41dabe%2526scene%253D21%2523wechat_redirect "https://link.zhihu.com/?target=http%3A//mp.weixin.qq.com/s%3F__biz%3DMzkyMDE2OTA3Mw%3D%3D%26mid%3D2247512400%26idx%3D2%26sn%3D31963dd77c0f6981e5351b869466c386%26chksm%3Dc1940e4ef6e38758f5312b5ddfa746584de29a4a3e439a0d9490b35134acf1115d79fe41dabe%26scene%3D21%23wechat_redirect") [新加坡国立大学提出最新优化器:CAME,大模型训练成本降低近一半!](https://link.juejin.cn?target=https%3A%2F%2Flink.zhihu.com%2F%3Ftarget%3Dhttp%253A%2F%2Fmp.weixin.qq.com%2Fs%253F__biz%253DMzkyMDE2OTA3Mw%253D%253D%2526mid%253D2247512400%2526idx%253D3%2526sn%253Dcc70a3dc2808935790941858487e4a4d%2526chksm%253Dc1940e4ef6e38758390db31bb68fca51eac5dc7f6b77b7d4d3c34f2e768f812f718d3c00b6df%2526scene%253D21%2523wechat_redirect "https://link.zhihu.com/?target=http%3A//mp.weixin.qq.com/s%3F__biz%3DMzkyMDE2OTA3Mw%3D%3D%26mid%3D2247512400%26idx%3D3%26sn%3Dcc70a3dc2808935790941858487e4a4d%26chksm%3Dc1940e4ef6e38758390db31bb68fca51eac5dc7f6b77b7d4d3c34f2e768f812f718d3c00b6df%26scene%3D21%23wechat_redirect") [SegNetr来啦 \| 超越UNeXit/U-Net/U-Net++/SegNet,精度更高模型更小的UNet家族](https://link.juejin.cn?target=https%3A%2F%2Flink.zhihu.com%2F%3Ftarget%3Dhttp%253A%2F%2Fmp.weixin.qq.com%2Fs%253F__biz%253DMzkyMDE2OTA3Mw%253D%253D%2526mid%253D2247512236%2526idx%253D2%2526sn%253Dee66b03f2b91cc54be58ee2f14e6d968%2526chksm%253Dc1940fb2f6e386a45729c6dd7c2aaf776be3ce34986ea8616d985ea1404ef34deacd16af02f7%2526scene%253D21%2523wechat_redirect 
"https://link.zhihu.com/?target=http%3A//mp.weixin.qq.com/s%3F__biz%3DMzkyMDE2OTA3Mw%3D%3D%26mid%3D2247512236%26idx%3D2%26sn%3Dee66b03f2b91cc54be58ee2f14e6d968%26chksm%3Dc1940fb2f6e386a45729c6dd7c2aaf776be3ce34986ea8616d985ea1404ef34deacd16af02f7%26scene%3D21%23wechat_redirect") [ReID专栏(二)多尺度设计与应用](https://link.juejin.cn?target=https%3A%2F%2Flink.zhihu.com%2F%3Ftarget%3Dhttp%253A%2F%2Fmp.weixin.qq.com%2Fs%253F__biz%253DMzkyMDE2OTA3Mw%253D%253D%2526mid%253D2247508492%2526idx%253D2%2526sn%253D6a8b0d481f5e8c36bdfb3d655abeef1e%2526chksm%253Dc1941d12f6e39404be39bfe2c467e1101033ee1b19a12bff0d04d6388860c0523c0cbf6a775c%2526scene%253D21%2523wechat_redirect "https://link.zhihu.com/?target=http%3A//mp.weixin.qq.com/s%3F__biz%3DMzkyMDE2OTA3Mw%3D%3D%26mid%3D2247508492%26idx%3D2%26sn%3D6a8b0d481f5e8c36bdfb3d655abeef1e%26chksm%3Dc1941d12f6e39404be39bfe2c467e1101033ee1b19a12bff0d04d6388860c0523c0cbf6a775c%26scene%3D21%23wechat_redirect") [ReID专栏(一) 任务与数据集概述](https://link.juejin.cn?target=https%3A%2F%2Flink.zhihu.com%2F%3Ftarget%3Dhttp%253A%2F%2Fmp.weixin.qq.com%2Fs%253F__biz%253DMzkyMDE2OTA3Mw%253D%253D%2526mid%253D2247508384%2526idx%253D2%2526sn%253D4ed74ba8f39f8bb7141f92763aed3ca7%2526chksm%253Dc1941ebef6e397a8debea9d2b20a6cfcfef6f75b35064631dfd91afb5af1a2795e8a2d703dfd%2526scene%253D21%2523wechat_redirect "https://link.zhihu.com/?target=http%3A//mp.weixin.qq.com/s%3F__biz%3DMzkyMDE2OTA3Mw%3D%3D%26mid%3D2247508384%26idx%3D2%26sn%3D4ed74ba8f39f8bb7141f92763aed3ca7%26chksm%3Dc1941ebef6e397a8debea9d2b20a6cfcfef6f75b35064631dfd91afb5af1a2795e8a2d703dfd%26scene%3D21%23wechat_redirect") [libtorch教程(三)简单模型搭建](https://link.juejin.cn?target=https%3A%2F%2Flink.zhihu.com%2F%3Ftarget%3Dhttp%253A%2F%2Fmp.weixin.qq.com%2Fs%253F__biz%253DMzkyMDE2OTA3Mw%253D%253D%2526mid%253D2247508230%2526idx%253D2%2526sn%253D45d9a7ad9d71c5036677306adcf55323%2526chksm%253Dc1941e18f6e3970e6552801057c8a06ac0c498e6f27ed1f2d2cbf637f2f814e9a51173f2a2f8%2526scene%253D21%2523wechat_redirect "https://link.zhihu.com/?target=http%3A//mp.weixin.qq.com/s%3F__biz%3DMzkyMDE2OTA3Mw%3D%3D%26mid%3D2247508230%26idx%3D2%26sn%3D45d9a7ad9d71c5036677306adcf55323%26chksm%3Dc1941e18f6e3970e6552801057c8a06ac0c498e6f27ed1f2d2cbf637f2f814e9a51173f2a2f8%26scene%3D21%23wechat_redirect") [libtorch教程(二)张量的常规操作](https://link.juejin.cn?target=https%3A%2F%2Flink.zhihu.com%2F%3Ftarget%3Dhttp%253A%2F%2Fmp.weixin.qq.com%2Fs%253F__biz%253DMzkyMDE2OTA3Mw%253D%253D%2526mid%253D2247508348%2526idx%253D2%2526sn%253D204dc600871bb45420e5bee2b2cbddb5%2526chksm%253Dc1941e62f6e39774a07e139e9d0ea31d218ff12794ed8ccdbd90ff765ef088902529d01bc5b6%2526scene%253D21%2523wechat_redirect "https://link.zhihu.com/?target=http%3A//mp.weixin.qq.com/s%3F__biz%3DMzkyMDE2OTA3Mw%3D%3D%26mid%3D2247508348%26idx%3D2%26sn%3D204dc600871bb45420e5bee2b2cbddb5%26chksm%3Dc1941e62f6e39774a07e139e9d0ea31d218ff12794ed8ccdbd90ff765ef088902529d01bc5b6%26scene%3D21%23wechat_redirect") [libtorch教程(一)开发环境搭建:VS+libtorch和Qt+libtorch](https://link.juejin.cn?target=https%3A%2F%2Flink.zhihu.com%2F%3Ftarget%3Dhttp%253A%2F%2Fmp.weixin.qq.com%2Fs%253F__biz%253DMzkyMDE2OTA3Mw%253D%253D%2526mid%253D2247508194%2526idx%253D2%2526sn%253D1a2252850a06715266dcd5dbe45c8f39%2526chksm%253Dc1941ffcf6e396ea60d1056e3f875279dc0490b5c34101ccc2e1086a60d10709732d00c96259%2526scene%253D21%2523wechat_redirect 
"https://link.zhihu.com/?target=http%3A//mp.weixin.qq.com/s%3F__biz%3DMzkyMDE2OTA3Mw%3D%3D%26mid%3D2247508194%26idx%3D2%26sn%3D1a2252850a06715266dcd5dbe45c8f39%26chksm%3Dc1941ffcf6e396ea60d1056e3f875279dc0490b5c34101ccc2e1086a60d10709732d00c96259%26scene%3D21%23wechat_redirect") [NeRF与三维重建专栏(三)nerf_pl源码部分解读与colmap、cuda算子使用](https://link.juejin.cn?target=https%3A%2F%2Flink.zhihu.com%2F%3Ftarget%3Dhttp%253A%2F%2Fmp.weixin.qq.com%2Fs%253F__biz%253DMzkyMDE2OTA3Mw%253D%253D%2526mid%253D2247508142%2526idx%253D2%2526sn%253Df5073dec9e6915c35900f9e20b5d61e5%2526chksm%253Dc1941fb0f6e396a65570341cfe053e0fb9bf76304caf4a33ab2c557ccb40aef1db40fcf307fe%2526scene%253D21%2523wechat_redirect "https://link.zhihu.com/?target=http%3A//mp.weixin.qq.com/s%3F__biz%3DMzkyMDE2OTA3Mw%3D%3D%26mid%3D2247508142%26idx%3D2%26sn%3Df5073dec9e6915c35900f9e20b5d61e5%26chksm%3Dc1941fb0f6e396a65570341cfe053e0fb9bf76304caf4a33ab2c557ccb40aef1db40fcf307fe%26scene%3D21%23wechat_redirect") [NeRF与三维重建专栏(二)NeRF原文解读与体渲染物理模型](https://link.juejin.cn?target=https%3A%2F%2Flink.zhihu.com%2F%3Ftarget%3Dhttp%253A%2F%2Fmp.weixin.qq.com%2Fs%253F__biz%253DMzkyMDE2OTA3Mw%253D%253D%2526mid%253D2247508127%2526idx%253D2%2526sn%253D988a43a1a2b77c28a1fd537679852ea2%2526chksm%253Dc1941f81f6e3969726b0096fb71d53fb2ec2a19dc4af72d47d578194a33e495d31c0a835bba6%2526scene%253D21%2523wechat_redirect "https://link.zhihu.com/?target=http%3A//mp.weixin.qq.com/s%3F__biz%3DMzkyMDE2OTA3Mw%3D%3D%26mid%3D2247508127%26idx%3D2%26sn%3D988a43a1a2b77c28a1fd537679852ea2%26chksm%3Dc1941f81f6e3969726b0096fb71d53fb2ec2a19dc4af72d47d578194a33e495d31c0a835bba6%26scene%3D21%23wechat_redirect") [NeRF与三维重建专栏(一)领域背景、难点与数据集介绍](https://link.juejin.cn?target=https%3A%2F%2Flink.zhihu.com%2F%3Ftarget%3Dhttp%253A%2F%2Fmp.weixin.qq.com%2Fs%253F__biz%253DMzkyMDE2OTA3Mw%253D%253D%2526mid%253D2247508075%2526idx%253D3%2526sn%253D863180b4c9f0df23e8476a97a4b7c4a4%2526chksm%253Dc1941f75f6e39663959a828934d340837dc511ff1f1345de3a3fca5b7f05577a811a6ed80893%2526scene%253D21%2523wechat_redirect "https://link.zhihu.com/?target=http%3A//mp.weixin.qq.com/s%3F__biz%3DMzkyMDE2OTA3Mw%3D%3D%26mid%3D2247508075%26idx%3D3%26sn%3D863180b4c9f0df23e8476a97a4b7c4a4%26chksm%3Dc1941f75f6e39663959a828934d340837dc511ff1f1345de3a3fca5b7f05577a811a6ed80893%26scene%3D21%23wechat_redirect") [异常检测专栏(三)传统的异常检测算法------上](https://link.juejin.cn?target=https%3A%2F%2Flink.zhihu.com%2F%3Ftarget%3Dhttp%253A%2F%2Fmp.weixin.qq.com%2Fs%253F__biz%253DMzkyMDE2OTA3Mw%253D%253D%2526mid%253D2247508096%2526idx%253D2%2526sn%253D47aff8b9480b53c153963189f654ce44%2526chksm%253Dc1941f9ef6e39688e5e818fbbe07d3e3d5a0066539b56c8b66cb926452b14aabb72a75e56b9b%2526scene%253D21%2523wechat_redirect "https://link.zhihu.com/?target=http%3A//mp.weixin.qq.com/s%3F__biz%3DMzkyMDE2OTA3Mw%3D%3D%26mid%3D2247508096%26idx%3D2%26sn%3D47aff8b9480b53c153963189f654ce44%26chksm%3Dc1941f9ef6e39688e5e818fbbe07d3e3d5a0066539b56c8b66cb926452b14aabb72a75e56b9b%26scene%3D21%23wechat_redirect") [异常检测专栏(二):评价指标及常用数据集](https://link.juejin.cn?target=https%3A%2F%2Flink.zhihu.com%2F%3Ftarget%3Dhttp%253A%2F%2Fmp.weixin.qq.com%2Fs%253F__biz%253DMzkyMDE2OTA3Mw%253D%253D%2526mid%253D2247508090%2526idx%253D2%2526sn%253D297ae517a789efbd291a6d0e75fb6982%2526chksm%253Dc1941f64f6e39672d18b978b4ead75d073359c4ceab213ad35b46c6179f1749ac0b4fe2e5567%2526scene%253D21%2523wechat_redirect 
"https://link.zhihu.com/?target=http%3A//mp.weixin.qq.com/s%3F__biz%3DMzkyMDE2OTA3Mw%3D%3D%26mid%3D2247508090%26idx%3D2%26sn%3D297ae517a789efbd291a6d0e75fb6982%26chksm%3Dc1941f64f6e39672d18b978b4ead75d073359c4ceab213ad35b46c6179f1749ac0b4fe2e5567%26scene%3D21%23wechat_redirect") [异常检测专栏(一)异常检测概述](https://link.juejin.cn?target=https%3A%2F%2Flink.zhihu.com%2F%3Ftarget%3Dhttp%253A%2F%2Fmp.weixin.qq.com%2Fs%253F__biz%253DMzkyMDE2OTA3Mw%253D%253D%2526mid%253D2247508055%2526idx%253D2%2526sn%253Daaf85683054d72e754a48e87e1dc9564%2526chksm%253Dc1941f49f6e3965fb7a28e8392bdbbd6307ffa7e56836e3eb3ecec84b7e00cdcb42ca1b53a5f%2526scene%253D21%2523wechat_redirect "https://link.zhihu.com/?target=http%3A//mp.weixin.qq.com/s%3F__biz%3DMzkyMDE2OTA3Mw%3D%3D%26mid%3D2247508055%26idx%3D2%26sn%3Daaf85683054d72e754a48e87e1dc9564%26chksm%3Dc1941f49f6e3965fb7a28e8392bdbbd6307ffa7e56836e3eb3ecec84b7e00cdcb42ca1b53a5f%26scene%3D21%23wechat_redirect") [BEV专栏(二)从BEVFormer看BEV流程(下篇)](https://link.juejin.cn?target=https%3A%2F%2Flink.zhihu.com%2F%3Ftarget%3Dhttp%253A%2F%2Fmp.weixin.qq.com%2Fs%253F__biz%253DMzkyMDE2OTA3Mw%253D%253D%2526mid%253D2247507667%2526idx%253D2%2526sn%253D4c0583db7670bf6170b58afd9d558d56%2526chksm%253Dc19461cdf6e3e8dba8fafac55916264b3eded8e173ed99cd14ca3822a91bc12a6951ca78511a%2526scene%253D21%2523wechat_redirect "https://link.zhihu.com/?target=http%3A//mp.weixin.qq.com/s%3F__biz%3DMzkyMDE2OTA3Mw%3D%3D%26mid%3D2247507667%26idx%3D2%26sn%3D4c0583db7670bf6170b58afd9d558d56%26chksm%3Dc19461cdf6e3e8dba8fafac55916264b3eded8e173ed99cd14ca3822a91bc12a6951ca78511a%26scene%3D21%23wechat_redirect") [BEV专栏(一)从BEVFormer深入探究BEV流程(上篇)](https://link.juejin.cn?target=https%3A%2F%2Flink.zhihu.com%2F%3Ftarget%3Dhttp%253A%2F%2Fmp.weixin.qq.com%2Fs%253F__biz%253DMzkyMDE2OTA3Mw%253D%253D%2526mid%253D2247507628%2526idx%253D2%2526sn%253D8b01002bbd23fd99689e7cf64357855e%2526chksm%253Dc19461b2f6e3e8a4dcbcaccb6292a8b9795fcfb53978297dec44cf1b3ae19fd2cd46a947eb6d%2526scene%253D21%2523wechat_redirect "https://link.zhihu.com/?target=http%3A//mp.weixin.qq.com/s%3F__biz%3DMzkyMDE2OTA3Mw%3D%3D%26mid%3D2247507628%26idx%3D2%26sn%3D8b01002bbd23fd99689e7cf64357855e%26chksm%3Dc19461b2f6e3e8a4dcbcaccb6292a8b9795fcfb53978297dec44cf1b3ae19fd2cd46a947eb6d%26scene%3D21%23wechat_redirect") [可见光遥感图像目标检测(三)文字场景检测之Arbitrary](https://link.juejin.cn?target=https%3A%2F%2Flink.zhihu.com%2F%3Ftarget%3Dhttp%253A%2F%2Fmp.weixin.qq.com%2Fs%253F__biz%253DMzkyMDE2OTA3Mw%253D%253D%2526mid%253D2247507602%2526idx%253D2%2526sn%253D54d1e8c276de96840122c0ce11edbfd3%2526chksm%253Dc194618cf6e3e89ab6051fb1de39149abadf4f509f11f03612a197342be3de5b2088a1d81346%2526scene%253D21%2523wechat_redirect "https://link.zhihu.com/?target=http%3A//mp.weixin.qq.com/s%3F__biz%3DMzkyMDE2OTA3Mw%3D%3D%26mid%3D2247507602%26idx%3D2%26sn%3D54d1e8c276de96840122c0ce11edbfd3%26chksm%3Dc194618cf6e3e89ab6051fb1de39149abadf4f509f11f03612a197342be3de5b2088a1d81346%26scene%3D21%23wechat_redirect") [可见光遥感目标检测(二)主要难点与研究方法概述](https://link.juejin.cn?target=https%3A%2F%2Flink.zhihu.com%2F%3Ftarget%3Dhttp%253A%2F%2Fmp.weixin.qq.com%2Fs%253F__biz%253DMzkyMDE2OTA3Mw%253D%253D%2526mid%253D2247507526%2526idx%253D2%2526sn%253D42d330d6ba74d49a2995098a37f62f54%2526chksm%253Dc1946158f6e3e84e46308d1b4b5854ae9c5f763158a83704eda18ecd6f852c6ec9907ee1e764%2526scene%253D21%2523wechat_redirect 
"https://link.zhihu.com/?target=http%3A//mp.weixin.qq.com/s%3F__biz%3DMzkyMDE2OTA3Mw%3D%3D%26mid%3D2247507526%26idx%3D2%26sn%3D42d330d6ba74d49a2995098a37f62f54%26chksm%3Dc1946158f6e3e84e46308d1b4b5854ae9c5f763158a83704eda18ecd6f852c6ec9907ee1e764%26scene%3D21%23wechat_redirect") [可见光遥感目标检测(一)任务概要介绍](https://link.juejin.cn?target=https%3A%2F%2Flink.zhihu.com%2F%3Ftarget%3Dhttp%253A%2F%2Fmp.weixin.qq.com%2Fs%253F__biz%253DMzkyMDE2OTA3Mw%253D%253D%2526mid%253D2247507453%2526idx%253D2%2526sn%253D53ebd358ecea9ea7fcce5ff058d22701%2526chksm%253Dc19462e3f6e3ebf54a3297d7933b8b97c8a6c84fb9720ef3a89a3d5a32620dca460386c8ed90%2526scene%253D21%2523wechat_redirect "https://link.zhihu.com/?target=http%3A//mp.weixin.qq.com/s%3F__biz%3DMzkyMDE2OTA3Mw%3D%3D%26mid%3D2247507453%26idx%3D2%26sn%3D53ebd358ecea9ea7fcce5ff058d22701%26chksm%3Dc19462e3f6e3ebf54a3297d7933b8b97c8a6c84fb9720ef3a89a3d5a32620dca460386c8ed90%26scene%3D21%23wechat_redirect") [TensorRT教程(三)TensorRT的安装教程](https://link.juejin.cn?target=https%3A%2F%2Flink.zhihu.com%2F%3Ftarget%3Dhttp%253A%2F%2Fmp.weixin.qq.com%2Fs%253F__biz%253DMzkyMDE2OTA3Mw%253D%253D%2526mid%253D2247501495%2526idx%253D3%2526sn%253Df064fb3334ff73ef704b6c04f978d1e4%2526chksm%253Dc19479a9f6e3f0bf1f405552d9df661c83318cbddeea83ee22af03920aaa3856d67a4d2d4ef3%2526scene%253D21%2523wechat_redirect "https://link.zhihu.com/?target=http%3A//mp.weixin.qq.com/s%3F__biz%3DMzkyMDE2OTA3Mw%3D%3D%26mid%3D2247501495%26idx%3D3%26sn%3Df064fb3334ff73ef704b6c04f978d1e4%26chksm%3Dc19479a9f6e3f0bf1f405552d9df661c83318cbddeea83ee22af03920aaa3856d67a4d2d4ef3%26scene%3D21%23wechat_redirect") [TensorRT教程(二)TensorRT进阶介绍](https://link.juejin.cn?target=https%3A%2F%2Flink.zhihu.com%2F%3Ftarget%3Dhttp%253A%2F%2Fmp.weixin.qq.com%2Fs%253F__biz%253DMzkyMDE2OTA3Mw%253D%253D%2526mid%253D2247501201%2526idx%253D2%2526sn%253D778bf7ac2261d1b1e51bcfffca69fb72%2526chksm%253Dc1947a8ff6e3f399c0614d2c91f6cd2b6749006991edb38827171066533cad9d95b3557d9ddc%2526scene%253D21%2523wechat_redirect "https://link.zhihu.com/?target=http%3A//mp.weixin.qq.com/s%3F__biz%3DMzkyMDE2OTA3Mw%3D%3D%26mid%3D2247501201%26idx%3D2%26sn%3D778bf7ac2261d1b1e51bcfffca69fb72%26chksm%3Dc1947a8ff6e3f399c0614d2c91f6cd2b6749006991edb38827171066533cad9d95b3557d9ddc%26scene%3D21%23wechat_redirect") [TensorRT教程(一)初次介绍TensorRT](https://link.juejin.cn?target=https%3A%2F%2Flink.zhihu.com%2F%3Ftarget%3Dhttp%253A%2F%2Fmp.weixin.qq.com%2Fs%253F__biz%253DMzkyMDE2OTA3Mw%253D%253D%2526mid%253D2247500842%2526idx%253D1%2526sn%253Dcd4d093a3d3d32d5a8274899d8d06bed%2526chksm%253Dc1947b34f6e3f22238b60b61cbdc0ea2ab9a71e7ba160ccf0d3420e14b702807258b807c232e%2526scene%253D21%2523wechat_redirect "https://link.zhihu.com/?target=http%3A//mp.weixin.qq.com/s%3F__biz%3DMzkyMDE2OTA3Mw%3D%3D%26mid%3D2247500842%26idx%3D1%26sn%3Dcd4d093a3d3d32d5a8274899d8d06bed%26chksm%3Dc1947b34f6e3f22238b60b61cbdc0ea2ab9a71e7ba160ccf0d3420e14b702807258b807c232e%26scene%3D21%23wechat_redirect") [AI最全资料汇总 \| 基础入门、技术前沿、工业应用、部署框架、实战教程学习](https://link.juejin.cn?target=https%3A%2F%2Flink.zhihu.com%2F%3Ftarget%3Dhttp%253A%2F%2Fmp.weixin.qq.com%2Fs%253F__biz%253DMzkyMDE2OTA3Mw%253D%253D%2526mid%253D2247506021%2526idx%253D1%2526sn%253Df046fe191ba09ad3f9f3b1250f1e0b7e%2526chksm%253Dc194677bf6e3ee6da255900685f07d6a0cd76945947b63803c1197dc57a6d47c478864fc01d3%2523rd 
"https://link.zhihu.com/?target=http%3A//mp.weixin.qq.com/s%3F__biz%3DMzkyMDE2OTA3Mw%3D%3D%26mid%3D2247506021%26idx%3D1%26sn%3Df046fe191ba09ad3f9f3b1250f1e0b7e%26chksm%3Dc194677bf6e3ee6da255900685f07d6a0cd76945947b63803c1197dc57a6d47c478864fc01d3%23rd") [计算机视觉入门1v3辅导班](https://link.juejin.cn?target=https%3A%2F%2Flink.zhihu.com%2F%3Ftarget%3Dhttp%253A%2F%2Fmp.weixin.qq.com%2Fs%253F__biz%253DMzkyMDE2OTA3Mw%253D%253D%2526mid%253D2247502028%2526idx%253D2%2526sn%253D6ff9f1b3154e6698a20629cb72511754%2526chksm%253Dc19477d2f6e3fec44bc3dc6063248e95846343378ab0053fd9430f13f3bf511ce511a1f6caf5%2526scene%253D21%2523wechat_redirect "https://link.zhihu.com/?target=http%3A//mp.weixin.qq.com/s%3F__biz%3DMzkyMDE2OTA3Mw%3D%3D%26mid%3D2247502028%26idx%3D2%26sn%3D6ff9f1b3154e6698a20629cb72511754%26chksm%3Dc19477d2f6e3fec44bc3dc6063248e95846343378ab0053fd9430f13f3bf511ce511a1f6caf5%26scene%3D21%23wechat_redirect") [计算机视觉交流群](https://link.juejin.cn?target=https%3A%2F%2Flink.zhihu.com%2F%3Ftarget%3Dhttp%253A%2F%2Fmp.weixin.qq.com%2Fs%253F__biz%253DMzkyMDE2OTA3Mw%253D%253D%2526mid%253D2247502028%2526idx%253D3%2526sn%253Dff6cc11a4acd7f5e53b3c935f83c8221%2526chksm%253Dc19477d2f6e3fec4efea4fdf1bbd981a2926fa84e3abb51a7699a9b06e826441a2280e9a1d5d%2526scene%253D21%2523wechat_redirect "https://link.zhihu.com/?target=http%3A//mp.weixin.qq.com/s%3F__biz%3DMzkyMDE2OTA3Mw%3D%3D%26mid%3D2247502028%26idx%3D3%26sn%3Dff6cc11a4acd7f5e53b3c935f83c8221%26chksm%3Dc19477d2f6e3fec4efea4fdf1bbd981a2926fa84e3abb51a7699a9b06e826441a2280e9a1d5d%26scene%3D21%23wechat_redirect") [聊聊计算机视觉入门](https://link.juejin.cn?target=https%3A%2F%2Flink.zhihu.com%2F%3Ftarget%3Dhttp%253A%2F%2Fmp.weixin.qq.com%2Fs%253F__biz%253DMzkyMDE2OTA3Mw%253D%253D%2526mid%253D2247501535%2526idx%253D1%2526sn%253D75739f6624d715c8fefba1d9f6ff09c6%2526chksm%253Dc19479c1f6e3f0d73482ccef787092f5bb3bfece2f1358f48d2b9cd8a341680c791540ba4946%2526scene%253D21%2523wechat_redirect "https://link.zhihu.com/?target=http%3A//mp.weixin.qq.com/s%3F__biz%3DMzkyMDE2OTA3Mw%3D%3D%26mid%3D2247501535%26idx%3D1%26sn%3D75739f6624d715c8fefba1d9f6ff09c6%26chksm%3Dc19479c1f6e3f0d73482ccef787092f5bb3bfece2f1358f48d2b9cd8a341680c791540ba4946%26scene%3D21%23wechat_redirect")