Demonstration code for decentralized Gaussian process (GP) training [1,2] and prediction [3]. For further details, see the papers on decentralized GP training [1,2] and prediction [3]; presentations of both the training and the prediction paper are also available.
The code implements:
- Centralized Generalized Analytical Proximal GP Training (gapx-GP) [1]
- Decentralized Consensus GP Training (DEC-c-GP) [1]
- Decentralized Analytical Proximal GP Training (DEC-apx-GP) [2]
- Decentralized Generalized Analytical Proximal GP Training (DEC-gapx-GP) [1]
- Decentralized Nested Pointwise Aggregation of Experts (DEC-NPAE) [3]
- Distributed Nested Pointwise Aggregation of Experts (DIST-NPAE) [3]
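The decentralized training schemes above rely on neighbor-to-neighbor communication to reach network-wide agreement on the GP hyperparameters. As a minimal sketch of the underlying idea only (not the consensus or proximal updates of [1,2]), the following NumPy example runs a generic average-consensus iteration with Metropolis weights; the ring graph, the hyperparameter values, and the iteration count are illustrative assumptions.

```python
import numpy as np

# Ring network of 4 agents; each agent communicates only with its neighbors.
neighbors = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
n = len(neighbors)

# Metropolis weights give a doubly stochastic mixing matrix W,
# so repeated local averaging x <- W x converges to the network average.
W = np.zeros((n, n))
for i, Ni in neighbors.items():
    for j in Ni:
        W[i, j] = 1.0 / (1 + max(len(Ni), len(neighbors[j])))
    W[i, i] = 1.0 - W[i].sum()

# Each agent holds a local estimate of two GP log-hyperparameters
# (illustrative values, e.g., log length-scale and log noise variance).
theta = np.array([[0.8, -2.1], [1.2, -1.9], [0.9, -2.3], [1.1, -1.7]])
target = theta.mean(axis=0)      # what centralized averaging would give

x = theta.copy()
for _ in range(200):             # decentralized iterations, neighbor-only comms
    x = W @ x                    # every agent's row converges to `target`
```

Each multiplication by W uses only information from immediate neighbors, which is why iterations of this kind serve as the communication primitive inside decentralized training schemes.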
The source code for the factorized training methods and the centralized NPAE [4] is available in the GRBCM [5] GitHub repository.
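For intuition on the aggregation step, here is a minimal NumPy sketch of centralized nested pointwise aggregation in the spirit of [4]: each local expert's kriging prediction is itself treated as an observation of the latent function, and the expert predictions are combined through their cross-covariances. The squared-exponential kernel, the hyperparameters, the synthetic data, and the partitioning are illustrative assumptions, not the repository's implementation.

```python
import numpy as np

def rbf(A, B, ls=1.0, sf=1.0):
    # squared-exponential kernel (assumed hyperparameters, for illustration)
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return sf**2 * np.exp(-0.5 * d2 / ls**2)

rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, (60, 2))
y = np.sin(X[:, 0]) + np.cos(X[:, 1]) + 0.05 * rng.standard_normal(60)
sn2 = 0.05**2                                     # noise variance
parts = [(X[i::3], y[i::3]) for i in range(3)]    # 3 disjoint experts
xs = np.array([[0.3, -0.4]])                      # one test point

M = len(parts)
mu = np.zeros(M)        # expert means M_i(x)
kA = np.zeros(M)        # Cov(M_i(x), f(x))
KA = np.zeros((M, M))   # Cov(M_i(x), M_j(x))
alpha = []
for i, (Xi, yi) in enumerate(parts):
    Ki = rbf(Xi, Xi) + sn2 * np.eye(len(yi))
    ai = np.linalg.solve(Ki, rbf(Xi, xs))[:, 0]   # (K_i + sn2 I)^{-1} k_i(x)
    alpha.append((Xi, ai))
    mu[i] = ai @ yi                               # local kriging mean
    kA[i] = ai @ rbf(Xi, xs)[:, 0]
for i, (Xi, ai) in enumerate(alpha):
    for j, (Xj, aj) in enumerate(alpha):
        Cij = rbf(Xi, Xj) + (sn2 * np.eye(len(ai)) if i == j else 0.0)
        KA[i, j] = ai @ Cij @ aj

# Aggregate the experts as if their predictions were observations of f(x).
w = np.linalg.solve(KA, kA)
mean = w @ mu                      # aggregated predictive mean
var = rbf(xs, xs)[0, 0] - w @ kA   # aggregated predictive variance
```

Because the aggregation weights come from the exact cross-covariances between experts, the combined variance stays between zero and the prior variance, unlike naive averaging of expert predictions.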
Execute:
- For decentralized GP training: demo_2D_train.m
- For decentralized GP prediction: demo_2D_pred.m
[1] G. P. Kontoudis and D. J. Stilwell, “Scalable, Federated Gaussian Process Training for Decentralized Multi-Agent Systems,” IEEE Access, 2024.
[2] G. P. Kontoudis and D. J. Stilwell, “Decentralized Federated Learning using Gaussian Processes,” in IEEE International Symposium on Multi-Robot and Multi-Agent Systems (MRS), 2023.
[3] G. P. Kontoudis and D. J. Stilwell, “Decentralized Nested Gaussian Processes for Multi-Robot Systems,” in IEEE International Conference on Robotics and Automation (ICRA), 2021.
[4] D. Rullière, N. Durrande, F. Bachoc, and C. Chevalier, “Nested Kriging predictions for datasets with a large number of observations,” Statistics and Computing, 2018.
[5] H. Liu, J. Cai, Y. Wang, and Y. S. Ong, “Generalized robust Bayesian committee machine for large-scale Gaussian process regression,” in International Conference on Machine Learning (ICML), 2018.
If you encounter any problems, please open a GitHub issue.