Appendix: Technology#
AI concepts useful to understand the Ansys SimAI Pro application#
The following sections compile articles and references that offer theoretical and practical insight into deep learning and associated technologies. These resources are intended for readers who wish to deepen their understanding and expertise in this field.
Deep Learning#
Deep Learning represents the fundamental technology of the Ansys SimAI Pro application.
Highly influential article providing an authoritative introduction to the field: LeCun, Y., Bengio, Y. & Hinton, G. Deep learning. Nature 521, 436-444 (2015)
Comprehensive textbook delving into both fundamental concepts and advanced topics: Ian Goodfellow, Yoshua Bengio, and Aaron Courville, Deep Learning (Cambridge (EE. UU.): MIT Press, 2016)
Point Cloud#
A point cloud is an unordered set of points that represents geometries or scenes, typically in three dimensions. Point clouds are widely used as a simplified data representation of CAD models. The following are two landmark publications that pioneered deep learning methods for point cloud data.
R. Q. Charles, H. Su, M. Kaichun and L. J. Guibas, “PointNet: Deep Learning on Point Sets for 3D Classification and Segmentation,” 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Honolulu, HI, USA, 2017, pp. 77-85, doi: 10.1109/CVPR.2017.16.
Charles R. Qi, Li Yi, Hao Su, and Leonidas J. Guibas. 2017. PointNet++: Deep hierarchical feature learning on point sets in a metric space. In Proceedings of the 31st International Conference on Neural Information Processing Systems (NIPS’17). Curran Associates Inc., Red Hook, NY, USA, 5105-5114.
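The core idea behind PointNet, that a global feature extracted from a point cloud must be invariant to the ordering of the points, can be sketched in a few lines. The layer size and random data below are illustrative stand-ins, not part of any real model:

```python
import numpy as np

# A point cloud is an unordered (N, 3) array of xyz coordinates.
rng = np.random.default_rng(0)
cloud = rng.normal(size=(1024, 3))

def pointnet_style_feature(points, weights):
    """Shared per-point layer followed by a symmetric max-pool (PointNet idea)."""
    per_point = np.maximum(points @ weights, 0.0)  # same ReLU layer for every point
    return per_point.max(axis=0)                   # order-invariant pooling

w = rng.normal(size=(3, 64))                       # illustrative weights
feat = pointnet_style_feature(cloud, w)

# Permuting the points leaves the global feature unchanged.
perm = rng.permutation(len(cloud))
assert np.allclose(feat, pointnet_style_feature(cloud[perm], w))
```

The max-pool is the symmetric function that makes the representation independent of point order, which is why point clouds need no canonical ordering as input.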
Graph#
A graph is an effective representation for arbitrary meshes, which makes graph neural networks (GNNs) a popular choice for learning on CAD models. This section provides two accessible review publications and two key algorithms for graph learning, followed by a research work on GNNs for mesh-based physics simulation.
Sanchez-Lengeling, B., Reif, E., Pearce, A., & Wiltschko, A. B. (2021). A gentle introduction to graph neural networks. Distill, 6(9), e33.
Daigavane, A., Ravindran, B., & Aggarwal, G. (2021). Understanding convolutions on graphs. Distill, 6(9), e32.
Kipf, T. N., & Welling, M. (2016). Semi-supervised classification with graph convolutional networks. arXiv preprint arXiv:1609.02907.
Veličković, P., Cucurull, G., Casanova, A., Romero, A., Lio, P., & Bengio, Y. (2017). Graph attention networks. arXiv preprint arXiv:1710.10903.
Pfaff, T., Fortunato, M., Sanchez-Gonzalez, A., & Battaglia, P. W. (2020). Learning mesh-based simulation with graph networks. arXiv preprint arXiv:2010.03409.
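As an illustration of graph learning on a mesh, the propagation rule of a graph convolutional network (Kipf & Welling, cited above) can be sketched as follows. The toy adjacency matrix and feature sizes are made up for the example:

```python
import numpy as np

# Toy mesh graph: 4 nodes, undirected edges as an adjacency matrix.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

def gcn_layer(A, H, W):
    """One GCN layer: H' = ReLU(D^-1/2 (A + I) D^-1/2 H W)."""
    A_hat = A + np.eye(len(A))                 # add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(1))   # symmetric degree normalization
    A_norm = A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    return np.maximum(A_norm @ H @ W, 0.0)

rng = np.random.default_rng(0)
H = rng.normal(size=(4, 8))    # node features (e.g. per-vertex quantities)
W = rng.normal(size=(8, 16))   # learnable weights (random here, untrained)
H_next = gcn_layer(A, H, W)    # updated node features, shape (4, 16)
```

Each layer mixes every node's features with those of its mesh neighbors, so stacking layers propagates information across progressively larger neighborhoods of the mesh.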
Physics-Informed Neural Networks#
Physics-Informed Neural Networks (PINNs) are data-efficient methods for solving partial differential equations that explicitly integrate physical laws into the training of the deep learning model.
Raissi, M., Perdikaris, P., & Karniadakis, G. E. (2019). Physics-informed neural networks: A deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations. Journal of Computational physics, 378, 686-707.
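The idea of integrating a physical law into the loss can be illustrated with a toy problem, the ODE du/dx = u with u(0) = 1 (exact solution exp(x)). Real PINNs compute the PDE residual with automatic differentiation; this dependency-free sketch substitutes finite differences, and the cubic "model" stands in for a neural network:

```python
import numpy as np

def model(x, theta):
    """Tiny surrogate u(x; theta): a cubic polynomial stands in for a network."""
    return theta[0] + theta[1] * x + theta[2] * x**2 + theta[3] * x**3

def pinn_loss(theta, x_col):
    """Physics residual + boundary-data term, the PINN loss structure."""
    u = model(x_col, theta)
    dx = x_col[1] - x_col[0]
    du = np.gradient(u, dx)                         # stand-in for autodiff
    residual = du - u                               # physics term: du/dx - u = 0
    boundary = model(np.array([0.0]), theta) - 1.0  # data term: u(0) = 1
    return np.mean(residual**2) + np.mean(boundary**2)

x_col = np.linspace(0.0, 1.0, 50)                   # collocation points
theta_exact = np.array([1.0, 1.0, 0.5, 1/6])        # Taylor coefficients of exp(x)
loss = pinn_loss(theta_exact, x_col)                # small: only truncation error remains
```

Minimizing such a loss over theta drives the model toward a function that both fits the boundary data and satisfies the governing equation, which is what makes PINNs data-efficient.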
Fourier Neural Operator (FNO)#
The FNO is a breakthrough work on neural operators. It learns solution operators of physical systems by parameterizing the learned transformation in Fourier space, providing a fast approach to solving partial differential equations.
Li, Z., Kovachki, N., Azizzadenesheli, K., Liu, B., Bhattacharya, K., Stuart, A., & Anandkumar, A. (2020). Fourier neural operator for parametric partial differential equations. arXiv preprint arXiv:2010.08895.
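The heart of the FNO, a convolution performed in Fourier space by scaling a truncated set of modes with learned complex weights, can be sketched in one dimension. The grid size, number of kept modes, and random weights below are illustrative:

```python
import numpy as np

def spectral_conv_1d(u, weights, n_modes):
    """FNO-style spectral layer: FFT, scale the lowest Fourier modes by
    learned complex weights, zero the rest, inverse FFT."""
    u_hat = np.fft.rfft(u)
    out_hat = np.zeros_like(u_hat)
    out_hat[:n_modes] = u_hat[:n_modes] * weights  # learned mode mixing
    return np.fft.irfft(out_hat, n=len(u))

rng = np.random.default_rng(0)
x = np.linspace(0, 2 * np.pi, 128, endpoint=False)
u = np.sin(x) + 0.5 * np.sin(3 * x)                # input function sampled on a grid
w = rng.normal(size=16) + 1j * rng.normal(size=16) # 16 kept modes (untrained here)
v = spectral_conv_1d(u, w, n_modes=16)             # transformed function, same grid
```

Because the learned weights act on Fourier modes rather than grid points, the same layer can in principle be evaluated on grids of different resolutions, one of the FNO's signature properties.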
DeepONet#
DeepONet is another pioneering neural operator work for learning differential equations. It leverages deep learning architectures to approximate operators, that is, mappings between functions.
Lu, L., Jin, P., & Karniadakis, G. E. (2019). DeepONet: Learning nonlinear operators for identifying differential equations based on the universal approximation theorem of operators. arXiv preprint arXiv:1910.03193.
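DeepONet's factorization, a branch network encoding the input function sampled at fixed sensor locations and a trunk network encoding the query coordinate, combined by a dot product, can be sketched as follows. The layer widths and random weights are illustrative, untrained stand-ins:

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(x, W1, W2):
    """Two-layer ReLU network used for both branch and trunk."""
    return np.maximum(x @ W1, 0.0) @ W2

m, p = 32, 16  # number of sensors, latent dimension (illustrative)
# Branch net: encodes the input function u sampled at m fixed sensors.
Wb1, Wb2 = rng.normal(size=(m, 64)), rng.normal(size=(64, p))
# Trunk net: encodes the query coordinate y.
Wt1, Wt2 = rng.normal(size=(1, 64)), rng.normal(size=(64, p))

def deeponet(u_sensors, y):
    """G(u)(y) ≈ <branch(u), trunk(y)>, the DeepONet factorization."""
    b = mlp(u_sensors, Wb1, Wb2)       # latent code of the input function, (p,)
    t = mlp(np.array([y]), Wt1, Wt2)   # latent code of the query point, (p,)
    return float(b @ t)

u = np.sin(np.linspace(0, 1, m))       # input function evaluated at the sensors
out = deeponet(u, 0.5)                 # operator output at query point y = 0.5
```

Once trained, the same branch code can be reused to evaluate the output function at any query coordinate, which is what makes the factorization an operator approximation rather than a point-to-point regression.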
Deep Learning Model Training Process#
The Ansys SimAI Pro application and its design reflect the key stages of AI model training.
The generic model architecture of the Ansys SimAI Pro application uses a combination of deep learning neural networks to capture various physics scales. Instead of storing data points explicitly, the platform learns a continuous function, allowing it to generalize to new geometries and conditions. Here is a concise breakdown of the Ansys SimAI Pro application's training process:
Architecture and Dataset: The deep learning architecture (neural network) and a dataset are used to train the model.
Dataset splitting: The dataset you selected is split into two subsets: a training subset and a test subset. The training subset is used to train the model. The test subset is reserved for assessing the model's performance once it is built; it is never seen by the model during training.
Preprocessing: The data is preprocessed so that it is standardized for efficient model input and aligned with a canonical internal format.
Weight initialization: The model’s weights (the parameters that the model adjusts during training) are initialized.
Training using backpropagation: The model is fed the training data in progressive batches. As each batch passes through the network, the model makes predictions. Using the backpropagation algorithm, the errors between predicted and actual values are computed, and the weights are then updated to minimize the error and reach the best approximation possible. This process is repeated over a number of iterations.
Evaluation: Once training is complete, the model is tested on the test subset to measure its accuracy and generality. This provides a final evaluation and helps understand how well the model generalizes to unseen data. This information is available in the Model Evaluation Report.
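The steps above (splitting, weight initialization, batched backpropagation, evaluation on held-out data) can be sketched with a toy linear model; none of the values below reflect the SimAI application's actual architecture or settings:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic dataset: inputs X and targets y stand in for geometries and
# simulation outputs. The 80/20 split, learning rate, batch size, and
# linear model are all illustrative.
X = rng.normal(size=(256, 4))
true_w = np.array([1.0, -2.0, 0.5, 3.0])
y = X @ true_w

# Dataset splitting: the test subset is held out and never trained on.
idx = rng.permutation(len(X))
n_train = int(0.8 * len(X))
Xtr, ytr = X[idx[:n_train]], y[idx[:n_train]]
Xte, yte = X[idx[n_train:]], y[idx[n_train:]]

# Weight initialization.
w = rng.normal(size=4) * 0.01
lr, batch_size = 0.1, 32

# Training using backpropagation (here, the gradient of a linear model).
for epoch in range(50):                    # repeated iterations
    for start in range(0, len(Xtr), batch_size):
        xb = Xtr[start:start + batch_size]
        yb = ytr[start:start + batch_size]
        err = xb @ w - yb                  # error between predicted and actual
        grad = xb.T @ err / len(xb)        # backpropagated gradient of the MSE
        w -= lr * grad                     # weight update minimizing the error

# Evaluation: measure accuracy on the held-out test subset.
test_mse = float(np.mean((Xte @ w - yte) ** 2))
```

Because the test rows never enter the update loop, the final `test_mse` is an honest estimate of how the fitted weights generalize, mirroring the role of the Model Evaluation Report described above.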
For more information regarding our product’s workflow or our AI models, see the related links section.