This project provides technical support for bridge card games by detecting the suit, rank, position, and orientation of playing cards. It is built on the YOLOv7 framework, with data collected from a variety of scenarios, and we developed data-synthesis methods to further improve accuracy. The final model achieves an mAP@0.5 of 0.959.
1 University of Melbourne, 2 University of Western Australia, 3 Monash University
Authors are grouped by institution.
Each label consists of two parts: the rank and suit of the card, followed by the angle. For example, 2H 81 means the card is the 2 of Hearts and the angle between the card and the table edge closest to it is 81 degrees. B means the back of a card. For more information, please see the Report.
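Below is a minimal sketch of how such a label string could be parsed. The `parse_label` helper and `Card` tuple are illustrative assumptions, not part of this repository.

```python
from typing import NamedTuple, Optional

SUITS = {"H": "Hearts", "D": "Diamonds", "S": "Spades", "C": "Clubs"}

class Card(NamedTuple):
    rank: Optional[str]  # "2"-"10", "J", "Q", "K", "A"; None for a card back
    suit: Optional[str]  # full suit name; None for a card back
    angle: int           # degrees between the card and the nearest table edge

def parse_label(label: str) -> Card:
    """Parse a label such as '2H 81' or 'B 45' (hypothetical helper)."""
    code, angle = label.split()
    if code == "B":  # "B" marks the back of a card
        return Card(None, None, int(angle))
    # The last character is the suit letter; everything before it is the rank.
    return Card(code[:-1], SUITS[code[-1]], int(angle))

print(parse_label("2H 81"))  # Card(rank='2', suit='Hearts', angle=81)
```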
To run the scripts, first set up the environment. Please check requirements.txt and download multi-object-tracker:
```bash
pip install -r requirements.txt
cd multi-object-tracker
pip install -e .
```
Run the scripts according to the following steps:
- In the card_detect project, specify save_dir in video_split.py and run it to extract the screenshots of the video into a directory ⟨D⟩ (a frame-handling sketch follows this list).
- In the card_detect project, run:

  ```bash
  python detect.py --weights card_detection.pt --source ⟨D⟩ --name ⟨N⟩ --save-txt --nosave --save-conf
  ```
- Find the corresponding txt results in card_detect/runs/detect/⟨N⟩/labels (a label-reading sketch follows this list).
- Copy ⟨D⟩ and the ⟨N⟩/labels directory above into the orientation_detect project, and rename labels to ⟨D⟩_res. For example, if ⟨D⟩ is test_video, then ⟨D⟩_res is test_video_res.
- In the seg_table_infer project, run:
  ```bash
  python -u segment/predict.py --weights orientation_detection.pt --source ⟨D⟩ --name ⟨N⟩
  ```
- Find the image results in orientation_detect/runs/predict-seg/⟨N⟩/vis.
- In the orientation_detect project, specify the image folder in merge_images.py and run it to obtain the final video result (see the frame-handling sketch after this list).
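For post-processing, here is a minimal sketch of reading the txt files written by detect.py with --save-txt --save-conf (referenced in the list above). It assumes YOLOv7's standard output format, where each line holds a class index, a normalized box, and a confidence; the example path is hypothetical.

```python
from pathlib import Path

def read_detections(label_dir: str):
    """Yield (frame, class_id, x_center, y_center, width, height, conf) per line."""
    for txt in sorted(Path(label_dir).glob("*.txt")):
        for line in txt.read_text().splitlines():
            cls, xc, yc, w, h, conf = line.split()
            yield (txt.stem, int(cls), float(xc), float(yc),
                   float(w), float(h), float(conf))

# Example usage; replace the run name with your own ⟨N⟩.
for det in read_detections("card_detect/runs/detect/test_run/labels"):
    print(det)
```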
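And a minimal sketch of the frame handling that video_split.py and merge_images.py are expected to perform, using OpenCV. The frame naming, fps, and codec below are assumptions, not the repository's actual settings.

```python
import glob
import os

import cv2

def split_video(video_path: str, save_dir: str) -> None:
    """Write every frame of video_path into save_dir as numbered JPEGs."""
    os.makedirs(save_dir, exist_ok=True)
    cap = cv2.VideoCapture(video_path)
    idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:  # end of video
            break
        cv2.imwrite(os.path.join(save_dir, f"{idx:06d}.jpg"), frame)
        idx += 1
    cap.release()

def merge_frames(image_dir: str, out_path: str, fps: float = 30.0) -> None:
    """Stitch the images in image_dir (sorted by name) back into a video."""
    paths = sorted(glob.glob(os.path.join(image_dir, "*.jpg")))
    h, w = cv2.imread(paths[0]).shape[:2]
    writer = cv2.VideoWriter(out_path, cv2.VideoWriter_fourcc(*"mp4v"), fps, (w, h))
    for p in paths:
        writer.write(cv2.imread(p))
    writer.release()
```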
This project uses the code implementation of YOLOv7; we appreciate their great work. If you are looking for more applications, please refer to their repository.