BriefGPT.xyz
Jul, 2023
Latent Graph Attention for Enhanced Spatial Context
Ayush Singh, Yash Bhambhu, Himanshu Buckchash, Deepak K. Gupta, Dilip K. Prasad
TL;DR
This paper introduces Latent Graph Attention (LGA), a computationally efficient and modular framework for incorporating global context into existing architectures. In particular, it brings the performance of small architectures close to that of larger ones, making lightweight architectures more practical for edge devices with limited compute and energy budgets. Adding the LGA module improves performance on three applications: transparent object segmentation, image restoration for dehazing, and optical flow estimation.
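The core idea (routing per-pixel features through a small set of latent graph nodes so that every location receives a global summary at low cost) can be sketched as follows. This is an illustrative reconstruction, not the paper's exact architecture: the function name `latent_graph_attention`, the two-stage pool/push attention, the residual update, and the shapes are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def latent_graph_attention(feats, latent, scale=None):
    """One round of attention through a small set of latent nodes (sketch).

    feats:  (N, C) flattened spatial features (N = H*W pixels)
    latent: (K, C) latent graph nodes, with K much smaller than N
    Returns an array of the same shape as feats, enriched with global context.
    """
    C = feats.shape[1]
    scale = scale if scale is not None else 1.0 / np.sqrt(C)
    # Pixels -> latent nodes: each latent node attends over the whole image,
    # so cost is O(N*K) instead of the O(N^2) of dense self-attention.
    pool = softmax(latent @ feats.T * scale, axis=-1)   # (K, N)
    z = pool @ feats                                    # (K, C) global summaries
    # Latent nodes -> pixels: broadcast the global summaries back.
    push = softmax(feats @ z.T * scale, axis=-1)        # (N, K)
    return feats + push @ z                             # residual update

feats = rng.standard_normal((64, 16))   # e.g. an 8x8 feature map with 16 channels
latent = rng.standard_normal((8, 16))   # 8 latent nodes
out = latent_graph_attention(feats, latent)
print(out.shape)  # (64, 16)
```

Because the latent node count K is fixed and small, the module stays cheap enough to bolt onto a lightweight backbone, which is the regime the TL;DR describes.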
Abstract
Global contexts in images are quite valuable in image-to-image translation problems. Conventional attention-based and graph-based models capture the global context to a large extent, however, these are computatio…