Inference Issue with Object Detection Models Created Using VertexAI #151
Labels
comp:model
Model related issues
Hardware:USB Accelerator
Coral USB Accelerator issues
subtype:windows
Windows build/installation issues
type:bug
Bug
type:performance
Performance issues
Description
When performing object detection inference with TFLite models created in Vertex AI and compiled for the Edge TPU, the program halts during the call to pycoral.utils.edgetpu.run_inference(interpreter, input_data).
I use the term "halts" because no error message is shown in the command prompt.
Previously, models created with Vertex AI worked without issues, so I suspect that something has changed on the Vertex AI side. Inspecting the model in Netron, I found that the "description" field, which used to read TOCO Converted. Model built using AutoML Vision, now reads MLIR Converted. Model built using AutoML Vision.
I would like to continue performing object detection inference using the EdgeTPU, even after this model change. I would greatly appreciate any advice or solutions to resolve this issue. Thank you in advance.
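For reference, a minimal sketch of the inference flow that hangs, assuming a standard pycoral detection pipeline; the model and image file names are hypothetical placeholders:

```python
# Minimal repro sketch (hypothetical file names) for the hang in
# run_inference() with a Vertex AI AutoML Edge TPU detection model.
from PIL import Image
from pycoral.adapters import common, detect
from pycoral.utils.edgetpu import make_interpreter, run_inference

MODEL_PATH = "model_edgetpu.tflite"  # hypothetical: Vertex AI AutoML export
IMAGE_PATH = "test.jpg"              # hypothetical: any test image

interpreter = make_interpreter(MODEL_PATH)
interpreter.allocate_tensors()

# Resize the image to the model's expected input size.
size = common.input_size(interpreter)
image = Image.open(IMAGE_PATH).convert("RGB").resize(size, Image.LANCZOS)

# With the new "MLIR Converted" models the process hangs on this call
# and prints nothing; older "TOCO Converted" models returned normally.
run_inference(interpreter, image.tobytes())

for obj in detect.get_objects(interpreter, score_threshold=0.5):
    print(obj.id, obj.score, obj.bbox)
```

This requires the pycoral runtime and a connected USB Accelerator, so it only reproduces on real hardware.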
Issue Type
Bug, Performance
Operating System
Windows 10
Coral Device
USB Accelerator
Other Devices
No response
Programming Language
Python 3.9
Relevant Log Output
No response