Automation in Construction, 2025
As-built BIM reconstruction plays a significant role in urban renewal and building digitization but currently suffers from low efficiency. Scan-to-BIM aims to improve reconstruction efficiency but lacks domain-specific, large-scale datasets and accurate, multi-dimensional benchmark metrics. These deficiencies further impede the evaluation and training of scan-to-BIM methods.
To address these challenges, this paper proposes BIMNet, an IFC-based large-scale point-cloud-to-BIM dataset, and a set of metrics that reflect the quality and issues of reconstructed models from both geometric and topological perspectives. Experiments demonstrate that BIMNet enhances the evaluation and training of scan-to-BIM methods during the critical processes of reconstruction and segmentation. This research contributes to the data foundation and metric system for deep-learning-based scan-to-BIM methods. In the future, BIMNet will not only facilitate the development of scan-to-BIM but also contribute to the advancement of smart cities and AI-driven technologies beyond scan-to-BIM.
Features:
- Real-world point clouds: Filtered and cleaned raw point clouds originating from Matterport3D.
- IFC-based semantic annotation: 14-category point cloud semantics defined according to IFC (Industry Foundation Classes); a hypothetical loading sketch follows this list.
- Dedicated BIM models: A high-fidelity BIM model manually created in Revit for each scan.
- Domain-specific benchmark: Novel geometric and topological benchmark for scan-to-BIM.
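As a quick orientation, the sketch below shows how a single labeled scan might be loaded with NumPy. The file names, the XYZRGB array layout, and the integer label encoding are assumptions made for illustration only; the released dataset and the paper define the actual data layout.

```python
# Minimal sketch of loading one labeled scan with NumPy.
# The file names, the (N, 6) XYZRGB layout, and the integer label encoding
# below are assumptions made for illustration; see the released dataset
# and the paper for the actual data layout.
import numpy as np

points = np.load("scene_0001_points.npy")  # hypothetical file name, assumed shape (N, 6)
labels = np.load("scene_0001_labels.npy")  # hypothetical file name, assumed shape (N,)

assert points.shape[0] == labels.shape[0]
xyz, rgb = points[:, :3], points[:, 3:6]

# Count points per category (14 IFC-based classes; names/order not assumed here).
counts = np.bincount(labels.astype(np.int64), minlength=14)
for cls_id, n in enumerate(counts):
    print(f"class {cls_id:2d}: {n} points")
```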
We demonstrate some of the dataset construction processes here; for more details, please refer to the paper. You are also welcome to visit our homepage to preview data samples.
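The geometric and topological benchmark metrics themselves are defined in the paper and are not reproduced here. Purely as a generic stand-in for the segmentation side of evaluation, a per-class IoU over the 14 categories could be computed as below (random labels are used as hypothetical placeholders for ground truth and predictions):

```python
# Generic per-class IoU for 14-category point cloud semantic segmentation.
# This is NOT the geometric/topological benchmark defined in the paper;
# it is a standard segmentation metric shown only for illustration.
import numpy as np

def per_class_iou(gt: np.ndarray, pred: np.ndarray, num_classes: int = 14) -> np.ndarray:
    """Return one IoU value per class (NaN for classes absent from both arrays)."""
    ious = np.full(num_classes, np.nan)
    for c in range(num_classes):
        gt_c, pred_c = gt == c, pred == c
        union = np.logical_or(gt_c, pred_c).sum()
        if union > 0:
            ious[c] = np.logical_and(gt_c, pred_c).sum() / union
    return ious

# Random labels stand in for ground truth and a model's prediction.
rng = np.random.default_rng(0)
gt = rng.integers(0, 14, size=100_000)
pred = rng.integers(0, 14, size=100_000)
ious = per_class_iou(gt, pred)
print("per-class IoU:", ious)
print("mIoU:", np.nanmean(ious))
```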
Figure 1: Workflow for constructing the BIMNet dataset.
Figure 2: Manual modeling procedure. (a) Component measurement in the section view. (b) Component selection and placement in the plan view. (c) Reconstructed BIM model. (d) RGB point cloud. (e) Components modeled strictly according to the raw point cloud.
Figure 3: Automated point cloud semantic annotation refinement.
Our dataset is based on Matterport3D, so you should first request access to the Matterport3D dataset: fill out and sign the Terms of Use agreement form and send it to matterport3d@googlegroups.com. Once your request is approved, forward their reply email to us at thubimnet@outlook.com to get access to our dataset.
If you find BIMNet useful in your research, please cite our work:
@article{liu2025dataset,
title = {Dataset and benchmark for as-built {BIM} reconstruction from real-world point cloud},
journal = {Automation in Construction},
volume = {173},
pages = {106096},
year = {2025},
issn = {0926-5805},
doi = {10.1016/j.autcon.2025.106096},
url = {https://www.sciencedirect.com/science/article/pii/S0926580525001360},
author = {Yudong Liu and Han Huang and Ge Gao and Ziyi Ke and Shengtao Li and Ming Gu}
}
The original data from the Matterport3D dataset is released under its Terms of Use agreement. Our part of the dataset is released under the MIT License.
Please contact us at thubimnet@outlook.com if you have any questions.
Copyright © 2025 CBIMS Research Group.