Bregman Proximal Viewpoint on Neural Operators

A.-R. Mezidi, J. Patracone, S. Salzo, A. Habrard, M. Pontil, R. Emonet and M. Sebban
Under review, 2024

Abstract

We present several advances on neural operators by viewing the action of operator layers as the minimizers of Bregman-regularized optimization problems over Banach function spaces. The proposed framework makes it possible to interpret the activation operators as Bregman proximity operators mapping from dual to primal space. This novel viewpoint is general enough to recover classical neural operators as well as a new variant, coined Bregman neural operators, which also includes a skip-like connection. Numerical experiments support the added benefits of the Bregman variant of Fourier neural operators for training deeper and more accurate models.
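The "activations as Bregman proximity operators from dual to primal space" viewpoint can be illustrated with a classical fact (a hedged sketch, not the paper's construction): the sigmoid is the inverse of the gradient of the Fermi-Dirac entropy phi(x) = x log x + (1 - x) log(1 - x) on (0, 1), i.e., it is the Bregman proximity operator of the zero function for this Legendre function, mapping a dual point u back to the primal space:

```python
import numpy as np

def grad_phi(x):
    # Gradient of the Fermi-Dirac (binary) entropy: the map from primal to dual space.
    return np.log(x / (1.0 - x))

def sigmoid(u):
    # Candidate Bregman proximity operator of 0: maps dual space back to primal (0, 1).
    return 1.0 / (1.0 + np.exp(-u))

u = np.linspace(-5.0, 5.0, 11)   # points in the dual space
x = sigmoid(u)                   # images in the primal space (0, 1)

# Optimality condition of argmin_x phi(x) - <u, x> is grad_phi(x*) = u,
# so sigmoid is indeed (grad phi)^{-1}, the Bregman prox of the zero function.
assert np.allclose(grad_phi(x), u)
```

Other standard activations (e.g., softmax via the negative entropy on the simplex) arise the same way by swapping the Legendre function.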
