Abstract
During decay heat removal in pebble-bed high-temperature gas-cooled reactors, particle-scale radiative heat transfer between spheres is difficult to model and simulate with traditional approaches. Artificial intelligence (AI) provides a new way to treat such dense granular systems. A machine learning model was developed to predict the obstructed view factor between all possible pebble pairs in a large-scale nuclear pebble bed. The view factor dataset is built by randomly generating sphere positions and computing the corresponding view factors with a CUDA-parallelized thermal ray tracing method. Regression models are trained with the gradient boosting decision tree (GBDT) method in XGBoost for cases of 2–10 spheres. It is shown that model performance improves greatly without overfitting when more trees are added rather than making each tree deeper, reaching R² scores greater than 0.999. For engineering application, the trained XGBoost models are applied to predict view factors in a large-scale nuclear pebble bed during decay heat removal. The transient numerical results show that, with thermal radiation as the only heat transfer mechanism, the bed takes about 10 h to reach its maximum temperature of 1520 °C, which remains below the design upper limit.
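As a rough illustration of the regression setup summarized above, the following Python sketch trains an XGBoost GBDT regressor on a view-factor-style dataset. The feature layout (relative coordinates of blocking spheres), the synthetic placeholder data, and the hyperparameters are assumptions for illustration only, not the authors' actual configuration; in the paper the targets come from CUDA-parallelized thermal ray tracing.

    # Minimal sketch (not the authors' code): XGBoost regression for an
    # obstructed view factor predicted from relative sphere positions.
    import numpy as np
    import xgboost as xgb
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import r2_score

    rng = np.random.default_rng(0)

    # Placeholder dataset: each row holds coordinates of blocking spheres
    # relative to an emitter-receiver pair; the target is the view factor.
    n_samples, n_features = 10_000, 3 * 8      # e.g. up to 8 blocking spheres (assumed)
    X = rng.uniform(-5.0, 5.0, size=(n_samples, n_features))
    y = rng.uniform(0.0, 0.25, size=n_samples)  # synthetic stand-in targets

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=0
    )

    # Many relatively shallow trees rather than a few deep ones, following the
    # abstract's finding that adding trees improves accuracy without overfitting.
    model = xgb.XGBRegressor(
        n_estimators=2000,   # "more trees"
        max_depth=6,         # keep each tree shallow
        learning_rate=0.05,
        subsample=0.8,
        tree_method="hist",
    )
    model.fit(X_train, y_train)

    print("R2 score:", r2_score(y_test, model.predict(X_test)))

With real ray-traced data in place of the random arrays, the same script reports the R² metric used in the abstract to judge whether the ensemble generalizes without overfitting.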