Update spacing in the readme

jteijema 2025-01-14 16:39:17 +01:00
parent 55ea0db7df
commit aefd525c89


@@ -24,8 +24,8 @@ Dockerized FastAPI wrapper for [Kokoro-82M](https://huggingface.co/hexgrad/Kokor
The service can be accessed through either the API endpoints or the Gradio web interface.
1. Install prerequisites:
   - Install [Docker Desktop](https://www.docker.com/products/docker-desktop/)
   - Clone the repository:
     ```bash
     git clone https://github.com/remsky/Kokoro-FastAPI.git
     cd Kokoro-FastAPI
@@ -33,17 +33,17 @@ The service can be accessed through either the API endpoints or the Gradio web i
2. Start the service:
   - Using Docker Compose (Full setup including UI):
     ```bash
     cd docker/gpu # OR
     # cd docker/cpu # Run this or the above
     docker compose up --build
     ```
     Once started:
     - The API will be available at http://localhost:8880
     - The UI can be accessed at http://localhost:7860
   - OR run the API alone using Docker (model + voice packs baked in):
     ```bash
     docker run -p 8880:8880 ghcr.io/remsky/kokoro-fastapi-cpu:latest # CPU
     docker run --gpus all -p 8880:8880 ghcr.io/remsky/kokoro-fastapi-gpu:latest # Nvidia GPU
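     # Sketch only, assuming FastAPI's default /docs route is enabled:
     # quick check that the API container is answering; expects "200" on success.
     curl -s -o /dev/null -w "%{http_code}\n" http://localhost:8880/docs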
@@ -53,7 +53,7 @@ The service can be accessed through either the API endpoints or the Gradio web i
3. Running the UI Docker Service:
   - If you only want to run the Gradio web interface separately and connect it to an existing API service:
     ```bash
     docker run -p 7860:7860 \
       -e API_HOST=<api-hostname-or-ip> \
@@ -61,9 +61,9 @@ The service can be accessed through either the API endpoints or the Gradio web i
       ghcr.io/remsky/kokoro-fastapi-ui:v0.1.0
     ```
   - Replace `<api-hostname-or-ip>` with:
     - `kokoro-tts` if the UI container is running in the same Docker Compose setup.
     - `localhost` if the API is running on your local machine.
4. Run locally as an OpenAI-Compatible Speech Endpoint
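   A minimal sketch of what an OpenAI-compatible speech request looks like, assuming the standard `/v1/audio/speech` route on port 8880; the `model` and `voice` values below are placeholders and may differ in this project:
   ```bash
   # Sketch only: assumes the OpenAI-style /v1/audio/speech route is exposed;
   # "kokoro" and "af_bella" are placeholder model/voice names.
   curl -s -X POST http://localhost:8880/v1/audio/speech \
     -H "Content-Type: application/json" \
     -d '{"model": "kokoro", "input": "Hello from Kokoro!", "voice": "af_bella"}' \
     -o output.mp3
   ```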