Abstract:
Neural operator learning aims to approximate mappings between infinite-dimensional function spaces from data, offering a promising alternative to costly numerical PDE solvers. However, current approaches face an inherent trade-off between flexibility and generalization: spectral-based methods generalize well but lack local adaptability, whereas attention-based methods are flexible but prone to overfitting with limited data. Furthermore, extending neural operators to three-dimensional (3D) inverse design introduces additional difficulties, as the design space grows exponentially and geometry is tightly coupled with the induced physical field. This article presents a physics-aware neural operator framework comprising three components. The holistic physics mixer (HPM) constructs an adaptive spectral basis conditioned on local physical states via a learnable coupling function, unifying the spectral and attention-based paradigms within a single transform. Physics-state residual learning (PSRL) reformulates the learning target from full input-output mappings to residuals between nearby physical states, leveraging the Lipschitz continuity of stable systems to implicitly augment limited training data without introducing non-physical samples. For 3D inverse design, a physics-geometry variational autoencoder (PG-VAE) jointly encodes geometry and physical fields into a compact latent space, on which a two-stage optimization procedure generates designs from random initialization without geometric templates. Extensive experiments on standard PDE benchmarks and aerodynamic shape optimization tasks show consistent improvements over existing methods in prediction accuracy and data efficiency.