Merge pull request #1 from kmac/add-openrouter-support
feat(providers): added openrouter support
commit 0d7f83d4fb
7 changed files with 108 additions and 0 deletions
@@ -78,6 +78,7 @@ There are mainly 2 ways of installing Perplexica - With Docker, Without Docker.
- `OPENAI`: Your OpenAI API key. **You only need to fill this if you wish to use OpenAI's models**.
- `OLLAMA`: Your Ollama API URL. You should enter it as `http://host.docker.internal:PORT_NUMBER`. If you installed Ollama on port 11434, use `http://host.docker.internal:11434`. For other ports, adjust accordingly. **You need to fill this if you wish to use Ollama's models instead of OpenAI's**.
- `GROQ`: Your Groq API key. **You only need to fill this if you wish to use Groq's hosted models**.
- `OPENROUTER`: Your OpenRouter API key. **You only need to fill this if you wish to use models via OpenRouter** (see the example snippet after this list).
- `ANTHROPIC`: Your Anthropic API key. **You only need to fill this if you wish to use Anthropic models**.
- `GEMINI`: Your Gemini API key. **You only need to fill this if you wish to use Google's models**.
- `DEEPSEEK`: Your DeepSeek API key. **You only need to fill this if you wish to use DeepSeek models**.
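
For orientation, here is a sketch of how these fields might look once filled in. The section and key names (`[API_KEYS]`, `[API_ENDPOINTS]`) follow the general shape of Perplexica's `sample.config.toml` but are illustrative assumptions, not a copy of the shipped file; check the `sample.config.toml` in your checkout for the authoritative layout.

```toml
# Illustrative sketch only — section and key names are assumptions,
# consult sample.config.toml for the exact layout in your version.

[API_KEYS]
OPENAI = "sk-..."          # only if you use OpenAI models
GROQ = ""                  # only if you use Groq's hosted models
OPENROUTER = "sk-or-..."   # only if you use models via OpenRouter
ANTHROPIC = ""             # only if you use Anthropic models
GEMINI = ""                # only if you use Google's models
DEEPSEEK = ""              # only if you use DeepSeek models

[API_ENDPOINTS]
# Ollama running on the host machine, reached from inside the Docker container:
OLLAMA = "http://host.docker.internal:11434"
```

Leave any provider you do not use empty; only the providers you fill in will be available in the model selector.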