Repositories list
    • FlashInfer: Kernel Library for LLM Serving
      Cuda
      Apache License 2.0
      Updated Dec 4, 2024
    • vllm
      A high-throughput and memory-efficient inference and serving engine for LLMs
      Python
      Apache License 2.0
      Updated Dec 4, 2024
    • Mooncake
      Mooncake is the serving platform for Kimi, a leading LLM service provided by Moonshot AI.
      C++
      Apache License 2.0
      Updated Dec 4, 2024
    • A Flexible Framework for Experiencing Cutting-edge LLM Inference Optimizations
      Python
      Apache License 2.0
      Updated Nov 14, 2024