
Product

Developers

Solutions

Resources

Pricing

Schedule Demo

Log In →




CONTROL PANEL


FOR AI APPS

With Portkey's AI Gateway, Guardrails, and Observability Suite,
thousands of teams ship reliable, cost-efficient, and fast apps.

Start your free trial →

View the Docs





We're Open Source!


Trusted by leading companies and teams like




MONITOR COSTS, QUALITY, AND LATENCY

Get insights from 40+ metrics and debug with detailed logs and traces.

Observability Suite

→


ROUTE TO 200+ LLMS, RELIABLY

Call any LLM through a single endpoint and set up fallbacks, load balancing,
retries, caching, and canary tests effortlessly.

AI Gateway

→
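The fallback routing described above can be sketched as a gateway config. The shape below (`strategy.mode`, `targets`, `override_params`) is an assumption based on Portkey's config format, not a verbatim copy — check the gateway docs for the exact schema:

```javascript
// Hypothetical sketch of an AI Gateway routing config with fallback:
// if the first target errors out, the request is retried on the next one.
const routingConfig = {
  strategy: { mode: 'fallback' }, // try targets in order until one succeeds
  targets: [
    { provider: 'openai', override_params: { model: 'gpt-4' } },
    { provider: 'azure-openai', override_params: { model: 'gpt-4' } },
  ],
};

console.log(routingConfig.targets.length); // number of fallback targets
```

The same `targets` list with a `loadbalance` mode would spread traffic instead of failing over, which is why the strategy lives in config rather than application code.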




BUILD AND DEPLOY EFFECTIVE PROMPTS

Ditch git—collaboratively develop the best prompts and deploy them from a single
place.

Prompt Playground

→


ENFORCE RELIABLE LLM BEHAVIOUR WITH GUARDRAILS

LLMs are unpredictable. With Portkey, you can synchronously run Guardrails on
your requests and route them with precision.

Guardrails

→




PUT YOUR AGENTS IN PROD

Portkey integrates with Langchain, CrewAI, Autogen and other major agent
frameworks, and makes your agent workflows production-ready.

Agents

→


 * "It's the simplest tool I've found for managing prompts and getting
   insights from our AI models. With Portkey, we can easily understand how
   our models are performing and make improvements. It gives us confidence
   to put things in production and scale our business without breaking a
   sweat. Plus, the team at Portkey is always responsive and helpful."
   
   
   Pablo Pazos
   
   Building Barkibu - AI-first pet health insurance


 * "While developing Albus, we were handling more than 10,000 questions daily.
   The challenges of managing costs, latency, and rate-limiting on OpenAI were
   becoming overwhelming; it was then that Portkey intervened and provided
   invaluable support through their analytics and semantic caching solutions."
   
   
   Kartik M
   
   Building Albus - RAG-based community support


 * "Thank you for the good work you guys are doing. I don't want to work
   without Portkey anymore."
   
   
   Luka Breitig
   
   Building The Happy Beavers - AI Content Generation


 * Daniel Yuabov
   
   Building Autoeasy


 * "We are really big fans of what you folks deliver. For us, Portkey is
   OpenAI."
   
   
   Deepanshu S
   
   Building in stealth


 * "We are using Portkey in staging and production; it works really well so
   far. With reporting and observability being so bad on OpenAI and Azure,
   Portkey really helps us get visibility into how and where we are using GPT,
   which becomes a problem as you start using it at scale within a company and
   product."
   
   
   Swapan R
   
   Building Haptik.ai


 * "I really like Portkey so far. All the queries I make are saved in Portkey
   out of the box. Integration is super quick if you're using OpenAI directly,
   and the basic analytics around costs are also useful. OpenAI fails often,
   and I can set Portkey to fall back to Azure as well."
   
   
   Siddharth Bulia
   
   Building in stealth





INTEGRATE
IN A MINUTE


Works with OpenAI and other AI providers out of the box. Natively integrated
with LangChain, LlamaIndex, and more.

View Integration Docs →

Node.js

Python

OpenAI JS

OpenAI Py

cURL

import Portkey from 'portkey-ai';

const portkey = new Portkey();

const chat = await portkey.chat.completions.create({
  messages: [{ role: 'user', content: 'Say this is a test' }],
  model: 'gpt-4',
});

console.log(chat.choices);





BUILD YOUR AI APP'S CONTROL PANEL NOW

Schedule your demo →

No Credit Card Needed

SOC2 Certified

Fanatical Support


30% FASTER LAUNCH


With a full-stack ops platform, focus on building your world-domination app.
Or something nice.


99.99% UPTIME


We maintain strict uptime SLAs to ensure that you don't go down. When we're
down, we pay you back.


40MS LATENCY


Cloudflare Workers enable our blazing-fast APIs with <40ms latencies. We won't
slow you down.


100% COMMITMENT


We've built & scaled LLM systems for over 3 years. We want to partner with you
and make your app win.


FAQ


GOT QUESTIONS?

If you have any other questions, please get in touch at hello@portkey.ai.


HOW DOES PORTKEY WORK?



You can integrate Portkey by replacing the OpenAI API base path in your app
with Portkey's API endpoint. Portkey then routes all your requests to OpenAI,
giving you visibility into and control over everything that's happening. You
can unlock additional value by managing your prompts & parameters in a single
place.
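As an illustration, the base-path swap can look like a plain `fetch` call pointed at the gateway. The `PORTKEY_BASE_URL` value and the `x-portkey-*` header names below are assumptions drawn from Portkey's docs; verify them against the current integration guide:

```javascript
// Minimal sketch of pointing an OpenAI-style request at Portkey's gateway.
// The endpoint and header names here are assumptions, not authoritative.
const PORTKEY_BASE_URL = 'https://api.portkey.ai/v1';

function portkeyHeaders(portkeyApiKey, provider, providerApiKey) {
  return {
    'Content-Type': 'application/json',
    Authorization: `Bearer ${providerApiKey}`, // your provider (e.g. OpenAI) key
    'x-portkey-api-key': portkeyApiKey,        // assumed header name
    'x-portkey-provider': provider,            // e.g. 'openai'
  };
}

// Usage (makes a real network call, so it needs valid keys):
// const res = await fetch(`${PORTKEY_BASE_URL}/chat/completions`, {
//   method: 'POST',
//   headers: portkeyHeaders(process.env.PORTKEY_API_KEY, 'openai', process.env.OPENAI_API_KEY),
//   body: JSON.stringify({ model: 'gpt-4', messages: [{ role: 'user', content: 'Hello' }] }),
// });
```

Because only the base URL and headers change, the rest of your existing OpenAI request code stays as-is.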


HOW DO YOU STORE MY DATA?



Portkey is ISO 27001 and SOC 2 certified, and GDPR compliant. We follow best
practices for securing our services, data storage, and retrieval. All your
data is encrypted in transit and at rest. For enterprises, we offer managed
hosting to deploy Portkey inside private clouds. If you'd like to discuss
these options, drop us a note at hello@portkey.ai.


WILL THIS SLOW DOWN MY APP?



No. We actively benchmark to check for any additional latency introduced by
Portkey. With built-in smart caching, automatic failover, and edge compute
layers, your users might even notice an overall improvement in your app
experience.

Portkey.ai powers your team with seamless AI integrations for smarter decisions.



Products

Observability

AI Gateway

Guardrails

Prompt Management

Agents

Security & Compliance

What’s New

Developers

Documentation

GitHub (6k ⭐️)

Community

Changelog

API Status

Resources

Blog

Community (2k 🙋‍♂️)

AI Grants Finder

Events Calendar

Company

Pricing

Privacy Policy

Cookie Policy

Terms of Service

DPA



© 2024 Portkey, Inc. All rights reserved

HIPAA Compliant

GDPR
















