Physics-Informed Neural Networks (PINNs) struggle with imbalanced training losses, especially when some sample points incur
extremely high losses. This can destabilize the optimization process, making
it difficult to find the correct descent direction during training. In this paper,
we propose a progressive learning approach based on anomaly points awareness
to improve the optimization process of PINNs. Our approach comprises two primary steps: the detection of anomalous data points and the update of the training set.
Anomaly points are identified by utilizing an upper bound calculated from the mean
and standard deviation of the feedforward losses of all training data. In the absence
of anomalies, the parameters of the PINN are optimized using the default training data; once anomalies are detected, however, a progressive exclusion method
aligned with the network's learning pattern is introduced to exclude potentially unfavorable data points from the training set. In addition, intermittent detection is
employed, rather than performing anomaly detection in each iteration, to balance
performance and efficiency. Extensive experimental results demonstrate that the
proposed method leads to substantial improvement in approximation accuracy when
solving typical benchmark partial differential equations. The code is accessible at
https://github.com/JcLimath/Anomaly-Aware-PINN.
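The anomaly criterion described above, an upper bound computed from the mean and standard deviation of the per-point feedforward losses, can be sketched as follows. This is a minimal illustration, not the paper's implementation; the multiplier `k` and the hard-exclusion policy are assumptions.

```python
import numpy as np

def detect_anomalies(losses, k=1.5):
    """Flag training points whose loss exceeds mean + k * std.

    The multiplier k is an assumed hyperparameter; the paper's exact
    bound and exclusion schedule may differ.
    """
    losses = np.asarray(losses, dtype=float)
    upper = losses.mean() + k * losses.std()
    return losses > upper

def update_training_set(points, losses, k=1.5):
    """Exclude anomalous points from the training set (simplified:
    the paper excludes points progressively, not all at once)."""
    mask = detect_anomalies(losses, k)
    return points[~mask]

# Example: one point with an extremely high loss is flagged.
losses = np.array([0.10, 0.20, 0.15, 50.0, 0.12])
print(detect_anomalies(losses))  # [False False False  True False]
```

In practice such a check would run only every few iterations, consistent with the intermittent detection the abstract describes, to keep the overhead low.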