Stars
A tutorial on large language model application development aimed at beginner developers. Read online: https://datawhalechina.github.io/llm-universe/
Burp Plugin to Bypass WAFs through the insertion of Junk Data
AKShare is an elegant and simple open-source financial data interface library for Python, built for human beings!
An open-source AI agent that brings the power of Gemini directly into your terminal.
A reasoning model for the cybersecurity penetration-testing domain, built by black-box distillation of DeepSeek-R1; well suited to cybersecurity competitions run without internet access. If the README images fail to load, check your proxy. English dataset updated 2025-05-14.
🔥🔒 Awesome MCP (Model Context Protocol) Security 🖥️
A Chinese NLP solution (large models, data, models, training, inference).
A curated list of open-source Chinese large language models, focusing on smaller models that can be privately deployed at relatively low training cost, covering base models, vertical-domain fine-tunes and applications, datasets, tutorials, and more.
Obfuscate all your TCP connections as HTTP protocol traffic.
FULL v0, Cursor, Manus, Same.dev, Lovable, Devin, Replit Agent, Windsurf Agent, VSCode Agent, Dia Browser, Trae AI & Cluely (And other Open Sourced) System Prompts, Tools & AI Models.
🚀 One-stop solution for creating your digital avatar from chat history 💡 Fine-tune LLMs with your chat logs to capture your unique style, then bind to a chatbot to bring your digital self to life. …
Cybersecurity AI (CAI), an open Bug Bounty-ready Artificial Intelligence
A demonstration toolkit revealing potential security vulnerabilities in MCP (Model Context Protocol) frameworks through data poisoning, JSON injection, function overriding, and cross-MCP call attacks.
JavaSecLab is a comprehensive Java vulnerability platform providing vulnerable code, fixed code, vulnerability scenarios, audit sink points, and secure coding guidelines, covering a wide range of vulnerability scenarios with a friendly UI.
Challenges & author writeups from ZeroDays CTF 2025.
A simple, decentralized mesh VPN with WireGuard support.
A study and research resource on memory-resident webshells (memory shells) that takes beginners from zero to one: memory-shell principles, implantation, detection, defense, incident response, and removal, covering Java, PHP, .NET, C++, and Python memory shells. Star it if you like it; continuously updated.
✍ WeChat Markdown Editor | A highly minimal Markdown editor for WeChat: supports Markdown syntax, custom theme styles, content management, multiple image hosts, an AI assistant, and more.
🚀 The fast, Pythonic way to build MCP servers and clients (a minimal usage sketch follows after this list)
Safety at Scale: A Comprehensive Survey of Large Model Safety
A dedicated platform for reproducing Vulhub vulnerabilities.
🌐 Make websites accessible for AI agents. Automate tasks online with ease.
🚀🚀 [LLM] Train a 26M-parameter GPT completely from scratch in just 2 hours! 🌏
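
To make the FastMCP entry above concrete, here is a minimal sketch of a FastMCP server, assuming the `fastmcp` package is installed; the server name "Demo Server" and the `add` tool are hypothetical examples, not part of any repository listed here.

```python
# Minimal FastMCP server sketch (assumes `pip install fastmcp`).
# The server name and the `add` tool are illustrative only.
from fastmcp import FastMCP

mcp = FastMCP("Demo Server")

@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two integers and return the sum."""
    return a + b

if __name__ == "__main__":
    # Serves over stdio by default, the usual transport for local MCP clients.
    mcp.run()
```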