I’m a PhD student at the University of Queensland 🎓, deeply immersed in the fascinating world of neural networks 🤖—a constantly evolving field that pushes me to think outside the box every single day!
My research focuses on neural network verification (NNV) 🧠💪. I’m passionate about ensuring these powerful models are robust and reliable, regardless of the conditions or inputs they encounter.
Want to know more about me? Visit my website: zhongkuima.github.io
I’ve worked on several exciting projects related to neural networks and model security, some of which have been published in top-tier conferences:
- AIM - “Model Modulation with Logits Redistribution” (WWW’25)
- GRAB - “Uncovering Gradient Inversion Risks in Practical Language Model Training” (CCS’24)
- CoreLocker - “CORELOCKER: Neuron-level Usage Control” (S&P’24)
- WraLU - “ReLU Hull Approximation” (POPL’24)
- PdD - “Formalizing Robustness Against Character-Level Perturbations for Neural Network Language Models” (ICFEM’23)
I’m grateful to and honored to work with my friends and collaborators, including Xinguo Feng and Zihan Wang. You can find more of their work on their Google Scholar profiles.
I’m currently working on some exciting tools that I’m thrilled to share with you:
- shapeonnx: A tool to infer the tensor shapes of an ONNX model when the official shape-inference tool lets you down. It’s simple yet powerful, and it helps you understand the dimensions of your model’s inputs, outputs, and intermediate tensors (see the first sketch after this list)! 📏
- slimonnx: A tool to optimize and simplify your ONNX models by removing redundant operations and resolving version issues. It makes ONNX files cleaner, more efficient, and ready for action! 🚀
- torchonnx: A tool for converting ONNX models to PyTorch models (.pth for parameters, .py for structure). It’s simple, lightweight, and designed for seamless model conversion 🔄.
- torchvnnlib: A tool to convert VNN-LIB files (.vnnlib) to PyTorch tensors (.pth files) for efficient neural network verification. Take full advantage of the PyTorch ecosystem! 🚀
- propdag: A bound propagation framework for neural network verification. It supports any DAG (Directed Acyclic Graph) structure and covers both forward and backward propagation patterns, so researchers can focus on their algorithms without worrying about the underlying computation graph (a toy bound-propagation example follows this list)! 💪
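
For context on the kind of shape information shapeonnx recovers, here is a minimal sketch using the official `onnx` shape-inference API (the model path is just a placeholder); shapeonnx is meant for the cases where this route fails or reports incomplete shapes:

```python
import onnx
from onnx import shape_inference

# Load an ONNX model (placeholder path) and run the official shape inference.
model = onnx.load("model.onnx")
inferred = shape_inference.infer_shapes(model)

# Print the inferred shape of every intermediate tensor recorded in the graph.
# Symbolic dimensions show up as their dim_param names (e.g. "batch").
for value_info in inferred.graph.value_info:
    dims = [d.dim_value if d.HasField("dim_value") else d.dim_param
            for d in value_info.type.tensor_type.shape.dim]
    print(value_info.name, dims)
```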
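
And to give a flavour of the bound propagation that propdag orchestrates (this is not propdag’s actual API), here is a toy interval bound propagation step through a single linear layer in PyTorch; the layer sizes and perturbation radius are made up for illustration:

```python
import torch

# Toy interval bound propagation through one affine layer: given input bounds
# [l, u], work with the center c = (l + u) / 2 and radius r = (u - l) / 2,
# then propagate them as  c' = W c + b  and  r' = |W| r.
torch.manual_seed(0)
W = torch.randn(3, 4)   # hypothetical layer weights (3 outputs, 4 inputs)
b = torch.randn(3)      # hypothetical layer bias

x = torch.randn(4)      # nominal input
eps = 0.1               # illustrative perturbation radius
l, u = x - eps, x + eps # input lower/upper bounds

c, r = (l + u) / 2, (u - l) / 2
c_out = W @ c + b
r_out = W.abs() @ r

lower, upper = c_out - r_out, c_out + r_out
print("output lower bounds:", lower)
print("output upper bounds:", upper)
```

A real verifier chains steps like this (plus activation relaxations such as ReLU hulls) through an entire computation graph, which is exactly the bookkeeping propdag is designed to take off your hands.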
I’m always open to collaboration and contributions! If you’re interested in working together or have ideas for these or new projects, feel free to reach out. I love brainstorming and bouncing ideas around! 💡
Thanks so much for visiting my GitHub! Let’s innovate, collaborate, and make AI even better together! ⭐