Intel Corporation
OpenVINO Tokenizers extension
A scalable inference server for models optimized with OpenVINO™
Run Generative AI models with a simple C++/Python API using OpenVINO Runtime
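A minimal sketch of that Python API, assuming a model has already been converted to OpenVINO format and saved under a local directory (the "model_dir" path, prompt, and token limit below are placeholders for illustration):

```python
import openvino_genai

# Load a text-generation pipeline from a directory containing an
# OpenVINO-converted model ("model_dir" is a placeholder path) on CPU.
pipe = openvino_genai.LLMPipeline("model_dir", "CPU")

# Generate a completion; max_new_tokens caps the response length.
print(pipe.generate("What is OpenVINO?", max_new_tokens=100))
```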
📚 Jupyter notebook tutorials for OpenVINO™
Neural Network Compression Framework for enhanced OpenVINO™ inference
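A hedged sketch of NNCF's post-training quantization flow; the model path, calibration samples, and input shape here are placeholder assumptions, not taken from the repository:

```python
import numpy as np
import nncf
import openvino as ov

# Read a floating-point OpenVINO IR model ("model.xml" is a placeholder path).
model = ov.Core().read_model("model.xml")

# Wrap a small calibration set; random tensors stand in for real samples here.
samples = [np.random.rand(1, 3, 224, 224).astype(np.float32) for _ in range(8)]
calibration_dataset = nncf.Dataset(samples)

# Apply 8-bit post-training quantization and save the compressed model.
quantized_model = nncf.quantize(model, calibration_dataset)
ov.save_model(quantized_model, "model_int8.xml")
```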
OpenVINO™ is an open source toolkit for optimizing and deploying AI inference
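A minimal inference sketch with the OpenVINO Python runtime; the IR path, target device, and input shape are assumptions chosen for illustration:

```python
import numpy as np
import openvino as ov

# Compile an IR model for a target device ("model.xml" and "CPU" are placeholders).
core = ov.Core()
compiled = core.compile_model("model.xml", "CPU")

# Run a single inference on a random tensor shaped like a typical image input.
input_tensor = np.random.rand(1, 3, 224, 224).astype(np.float32)
results = compiled(input_tensor)

# Results are keyed by output port; print the shape of the first output.
print(next(iter(results.values())).shape)
```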
oneAPI Threading Building Blocks (oneTBB)
oneAPI DPC++ Library (oneDPL) https://software.intel.com/content/www/us/en/develop/tools/oneapi/components/dpc-library.html
Mirror kept for legacy purposes. Moved to https://github.com/llvm/llvm-project