GitHub - Chen-zexi/vllm-cli: A command-line interface tool for serving LLMs using vLLM.
Agentic AI for writing code - mpfaffenberger/code_puppy
Dead simple token calculator for your CLI, written in Python - identify how many tokens your code base is using and where - WilliamAGH/repo-tokens-cal...
Scalene: a high-performance, high-precision CPU, GPU, and memory profiler for Python with AI-powered optimization proposals - plasma-umass/scalene
Learn how to enhance your Warp Terminal experience by automatically using subshells with a custom Warpify configuration
LLMs often generate O(n²) solutions when O(n) would suffice. Learn how to prompt for optimal data structures, avoid common performance pitfalls, and g...
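A minimal sketch of the pitfall that article describes (the function names here are illustrative, not from the article): deduplicating a list with list membership is O(n²) because `x not in seen` scans linearly for every element, while a set makes each membership check an average O(1) hash lookup.

```python
def dedupe_quadratic(items):
    # O(n^2): `x not in seen` is a linear scan over a list,
    # performed once per element -- a common LLM-generated pattern.
    seen = []
    for x in items:
        if x not in seen:
            seen.append(x)
    return seen


def dedupe_linear(items):
    # O(n): set membership is an average O(1) hash lookup.
    # A separate list preserves first-seen order.
    seen = set()
    out = []
    for x in items:
        if x not in seen:
            seen.add(x)
            out.append(x)
    return out
```

Both return the same result; only the data structure backing the membership test changes the asymptotic cost.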