Pick a route or let the strategy decide. Every reply shows which model, which provider, and what Portkey saw.
Setup
This demo routes chat requests through a local Portkey Gateway to any of five LLM providers: Groq, Mistral, Cerebras, OpenRouter, and Vertex AI (Gemini). API keys are read from .env by the local proxy and never reach the browser.
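A sketch of what the .env might contain. The variable names below are illustrative, not copied from the repo; the authoritative names are in .env.example, and only the providers you fill in will appear as configured in Settings.

```shell
# Hypothetical .env sketch -- check .env.example for the real variable names.
# Leave a line blank to skip that provider.
GROQ_API_KEY=...
MISTRAL_API_KEY=...
CEREBRAS_API_KEY=...
OPENROUTER_API_KEY=...
VERTEX_API_KEY=...
```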
1. Copy .env.example to .env and fill in keys for any of the five providers.
2. In one terminal, run npx @portkey-ai/gateway (the gateway listens on localhost:8787).
3. In another terminal, run node proxy.js (serves the page on localhost:3000).
4. Open Settings to confirm which providers are configured, then send a message.
To see fallback in action, enable Simulate outage and set Strategy = fallback.
To see loadbalance distribution, set Strategy = loadbalance, send several messages, and watch the per-provider counts in the top bar update.
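The two strategies above map to Portkey's routing config, which the page sends to the local gateway with each request. A minimal sketch of that shape, assuming the gateway's documented strategy/targets format (buildConfig is a hypothetical helper, not part of this demo's source):

```javascript
// Build the routing config the gateway expects: a strategy mode plus an
// ordered list of targets. "fallback" tries targets in order, moving on only
// when one fails; "loadbalance" spreads requests across all targets.
function buildConfig(mode, providers) {
  return {
    strategy: { mode },
    targets: providers.map((provider) => ({ provider })),
  };
}

// With fallback, Groq handles every request and Mistral is used only on error.
const header = JSON.stringify(buildConfig("fallback", ["groq", "mistral"]));
console.log(header);
// → {"strategy":{"mode":"fallback"},"targets":[{"provider":"groq"},{"provider":"mistral"}]}
```

Switching the first argument to "loadbalance" is all the demo's Strategy dropdown effectively changes; the provider counts in the top bar then reflect how the gateway distributed the requests.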