BriefGPT.xyz
Feb, 2019
A Memoization Framework for Scaling Submodular Optimization to Large Scale Problems
Rishabh Iyer, Jeff Bilmes
TL;DR
Uses a pre-computed complexity model and memoization to speed up large-scale submodular optimization. The approach applies to many constrained and unconstrained submodular maximization, minimization, and difference-of-submodular minimization problems, and shows significant speedups in data subset selection and summarization.
Abstract
We are motivated by large scale submodular optimization problems, where standard algorithms that treat the submodular functions in the \emph{value oracle model} do not scale. In this paper, we present a model called the \emph{pre-computational complexity model}.
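To make the memoization idea concrete, here is a minimal sketch on the facility-location function f(S) = Σᵢ maxⱼ∈S sim[i][j], a standard submodular function. The function and variable names are illustrative assumptions, not the paper's API: the point is that caching the per-item statistic cur_max[i] = maxⱼ∈S sim[i][j] lets each marginal gain be evaluated in O(n) instead of re-querying the value oracle from scratch.

```python
def greedy_naive(sim, k):
    """Greedy maximization in the value oracle model: f(S) is
    recomputed from scratch for every candidate (O(n * |S|) per gain)."""
    n = len(sim)

    def f(S):
        return sum(max((sim[i][j] for j in S), default=0.0) for i in range(n))

    S = []
    for _ in range(k):
        best = max((j for j in range(n) if j not in S),
                   key=lambda j: f(S + [j]) - f(S))
        S.append(best)
    return S, f(S)


def greedy_memoized(sim, k):
    """Same greedy selection, but memoizes cur_max[i] = max_{j in S} sim[i][j],
    the precomputed statistic for the current set S, so each marginal gain
    costs O(n) regardless of |S|."""
    n = len(sim)
    cur_max = [0.0] * n  # memoized statistics for the current set S
    S = []
    for _ in range(k):
        best, best_gain = None, -1.0
        for j in range(n):
            if j in S:
                continue
            # Marginal gain of adding j, computed from the cached statistics.
            gain = sum(max(0.0, sim[i][j] - cur_max[i]) for i in range(n))
            if gain > best_gain:
                best, best_gain = j, gain
        S.append(best)
        # Update the memoized statistics after committing to `best`.
        cur_max = [max(cur_max[i], sim[i][best]) for i in range(n)]
    return S, sum(cur_max)
```

Both routines return the same set and value; the memoized variant just avoids the repeated oracle evaluations, which is the source of the speedups the abstract reports.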