OpenRouter
How to Integrate OpenRouter with Jan
OpenRouter is a service that aggregates AI models behind a single API. Developers can use it to interact with a variety of large language models, generative image models, and generative 3D object models.
To connect Jan to remote Large Language Models (LLMs) through OpenRouter, follow the steps below:
Step 1: Configure OpenRouter API key
- Find your API key on the OpenRouter API Keys page.
- Set the OpenRouter API key in the `~/jan/engines/openai.json` file.
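As a sketch, the engine file might look like the following. The `full_url` and `api_key` field names are assumptions based on Jan's OpenAI engine settings, and the key value is a placeholder; check the existing `openai.json` in your installation for the exact schema:

```json
{
  "full_url": "https://openrouter.ai/api/v1/chat/completions",
  "api_key": "sk-or-your-key-here"
}
```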
Step 2: Model Configuration
- Navigate to the `~/jan/models` directory.
- Create a new folder named `openrouter-(modelname)`, for example `openrouter-dolphin-mixtral-8x7b`.
- Inside the folder, create a `model.json` file with the following settings:
  - Set the `id` property to the model ID obtained from OpenRouter.
  - Set the `format` property to `api`.
  - Set the `engine` property to `openai`.
  - Ensure the `state` property is set to `ready`.
`~/jan/models/openrouter-dolphin-mixtral-8x7b/model.json`:

```json
{
  "sources": [
    {
      "filename": "openrouter",
      "url": "https://openrouter.ai/"
    }
  ],
  "id": "cognitivecomputations/dolphin-mixtral-8x7b",
  "object": "model",
  "name": "Dolphin 2.6 Mixtral 8x7B",
  "version": "1.0",
  "description": "This is a 16k context fine-tune of Mixtral-8x7b. It excels in coding tasks due to extensive training with coding data and is known for its obedience, although it lacks DPO tuning. The model is uncensored and is stripped of alignment and bias. It requires an external alignment layer for ethical use. Users are cautioned to use this highly compliant model responsibly, as detailed in a blog post about uncensored models at erichartford.com/uncensored-models.",
  "format": "api",
  "settings": {},
  "parameters": {},
  "metadata": {
    "tags": ["General", "Big Context Length"]
  },
  "engine": "openai"
}
```
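The folder-and-file steps above can also be sketched as a small Python helper. The `create_openrouter_model` function below is illustrative, not part of Jan; it simply creates the `openrouter-(modelname)` folder and writes a minimal `model.json` matching the example in this guide.

```python
import json
import os

def create_openrouter_model(models_dir, model_id, name):
    # Folder name follows the openrouter-(modelname) convention from Step 2.
    folder = os.path.join(models_dir, "openrouter-" + model_id.split("/")[-1])
    os.makedirs(folder, exist_ok=True)
    config = {
        "sources": [{"filename": "openrouter", "url": "https://openrouter.ai/"}],
        "id": model_id,       # model ID obtained from OpenRouter
        "object": "model",
        "name": name,
        "version": "1.0",
        "format": "api",      # remote API model, not a local file
        "settings": {},
        "parameters": {},
        "metadata": {"tags": ["General"]},
        "engine": "openai",   # routed through Jan's OpenAI-compatible engine
    }
    path = os.path.join(folder, "model.json")
    with open(path, "w") as f:
        json.dump(config, f, indent=2)
    return path

path = create_openrouter_model(
    os.path.expanduser("~/jan/models"),
    "cognitivecomputations/dolphin-mixtral-8x7b",
    "Dolphin 2.6 Mixtral 8x7B",
)
```

Running the script produces the same layout shown above; any other model ID listed on openrouter.ai can be substituted.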
note
For more details regarding the `model.json` settings and parameters fields, please see here.
Step 3: Start the Model
- Restart Jan and navigate to the Hub.
- Locate your model and click the Use button.