May, 2022
ReLU Fields: The Little Non-linearity That Could
Animesh Karnewar, Tobias Ritschel, Oliver Wang, Niloy J. Mitra
TL;DR
With a minimal change to grid-based representations — introducing a fixed non-linearity (ReLU) — combined with coarse-to-fine optimization, the method enables fast reconstruction and rendering; evaluated on radiance fields and occupancy fields, it retains the high-fidelity results of MLPs while being substantially faster.
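The core idea summarized above — store raw values on a grid, interpolate them, then apply a fixed ReLU to the interpolated result — can be sketched in one dimension. This is an illustrative toy, not the paper's implementation; the function and variable names are assumptions.

```python
import numpy as np

def relu_field_sample(grid, x):
    """Sample a 1-D ReLU field: linearly interpolate the raw grid
    values, then apply a fixed ReLU to the interpolated result."""
    # grid: raw (possibly negative) values stored at integer lattice points
    # x: continuous query coordinate in [0, len(grid) - 1]
    i0 = int(np.floor(x))
    i1 = min(i0 + 1, len(grid) - 1)
    t = x - i0
    raw = (1.0 - t) * grid[i0] + t * grid[i1]  # linear interpolation first
    return max(raw, 0.0)                       # ReLU applied afterwards

# Allowing negative stored values lets the zero crossing fall between
# lattice points, so the field can be sharper than the grid resolution.
grid = np.array([-1.0, -0.25, 0.5, 1.0])
print(relu_field_sample(grid, 1.5))  # interpolates -0.25 and 0.5 -> 0.125
print(relu_field_sample(grid, 0.5))  # interpolated value is negative -> 0.0
```

Applying the ReLU after interpolation (rather than clamping the stored values) is what distinguishes this from a plain grid: the stored values can go negative, placing the field's zero set between lattice points.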
Abstract
In many recent works, multi-layer perceptrons (MLPs) have been shown to be suitable for modeling complex spatially-varying functions including images and 3D scenes. Although the MLPs are able to represent complex