Mirror of https://github.com/remsky/Kokoro-FastAPI.git
Synced 2025-08-05 16:48:53 +00:00
Update README.md

Parent: 8f0150a577
Commit: 7711c32fc2

1 changed file with 4 additions and 6 deletions
README.md

@@ -5,19 +5,17 @@
 # <sub><sub>_`FastKoko`_ </sub></sub>
 
 []()
 []()
-[](https://huggingface.co/hexgrad/Kokoro-82M/tree/c3b0d86e2a980e027ef71c28819ea02e351c2667) [](https://huggingface.co/spaces/Remsky/Kokoro-TTS-Zero) [](https://www.buymeacoffee.com/remsky)
+[](https://huggingface.co/hexgrad/Kokoro-82M/tree/c3b0d86e2a980e027ef71c28819ea02e351c2667) [](https://huggingface.co/spaces/Remsky/Kokoro-TTS-Zero)
 
 Dockerized FastAPI wrapper for [Kokoro-82M](https://huggingface.co/hexgrad/Kokoro-82M) text-to-speech model
 - OpenAI-compatible Speech endpoint, with inline voice combination functionality
 - NVIDIA GPU accelerated or CPU Onnx inference
 - very fast generation time
-  - 100x+ real time speed via HF A100
-  - 35-50x+ real time speed via 4060Ti
+  - 35x-100x+ real time speed via 4060Ti+
   - 5x+ real time speed via M3 Pro CPU
 - streaming support w/ variable chunking to control latency & artifacts
-- simple audio generation web ui utility
-- (new) phoneme endpoints for conversion and generation
+- phoneme, simple audio generation web ui utility
+- Runs on an 80mb-300mb model (CUDA container + 5gb on disk due to drivers)
 
 ## Quick Start
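The "OpenAI-compatible Speech endpoint" mentioned in the README means a client can target the server the same way it would target OpenAI's `/v1/audio/speech` route. A minimal sketch of building such a request is below; the port `8880`, model name `kokoro`, and voice names like `af_bella` are assumptions drawn from typical Kokoro-FastAPI setups, not confirmed by this diff.

```python
# Hedged sketch: building a request for an OpenAI-compatible
# /v1/audio/speech endpoint. Port 8880, model "kokoro", and the
# voice names are assumptions, not taken from this commit.
import json
import urllib.request


def build_speech_request(text: str,
                         voice: str = "af_bella",
                         base_url: str = "http://localhost:8880/v1"):
    """Return a (Request, payload) pair for the speech endpoint.

    Inline voice combination (per the README feature list) would be
    expressed by joining voice names, e.g. "af_bella+af_sky".
    """
    payload = {
        "model": "kokoro",        # assumed model identifier
        "input": text,
        "voice": voice,
        "response_format": "mp3",
    }
    req = urllib.request.Request(
        f"{base_url}/audio/speech",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    return req, payload


if __name__ == "__main__":
    # Sending the request (urllib.request.urlopen(req)) would return
    # audio bytes; here we only construct it.
    req, payload = build_speech_request("Hello world", voice="af_bella+af_sky")
    print(req.get_full_url())
```

Because the request shape mirrors OpenAI's, the official `openai` Python client could likely also be pointed at the server by overriding its `base_url`, though that is not shown in this diff.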