
Announcing our Open Source AI Gateway!




Features

Pricing

Docs

Blog

Resources

Schedule Demo

Log In →


CONTROL PANEL FOR AI APPS

With Portkey's Observability Suite and AI Gateway,
hundreds of teams ship reliable, cost-efficient, and fast apps.

Start your free trial →

View the Docs









MONITOR COSTS, QUALITY, AND LATENCY

Get insights from 40+ metrics and debug with detailed logs and traces.

Observability Suite →
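As an illustration only (not shown on this page), here is a minimal sketch of tagging a gateway request with a trace ID and custom metadata so it can be filtered later in logs and traces. The gateway URL and every x-portkey-* header name are assumptions based on our reading of Portkey's docs and may differ from your version.

// Hypothetical sketch: tag a request with a trace ID and metadata so it can
// be found in the Observability Suite. Header names are assumptions.
const response = await fetch('https://api.portkey.ai/v1/chat/completions', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'Authorization': `Bearer ${process.env.OPENAI_API_KEY}`,
    'x-portkey-api-key': process.env.PORTKEY_API_KEY ?? '',    // assumed header
    'x-portkey-provider': 'openai',                             // assumed header
    'x-portkey-trace-id': 'checkout-flow-1234',                 // assumed header
    'x-portkey-metadata': JSON.stringify({ _user: 'user-42' }), // assumed header
  },
  body: JSON.stringify({
    model: 'gpt-4',
    messages: [{ role: 'user', content: 'Say this is a test' }],
  }),
});
console.log(await response.json());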


ROUTE TO 100+ LLMS, RELIABLY

Call any LLM with a single endpoint and set up fallbacks, load balancing,
retries, caching, and canary tests effortlessly.

AI Gateway →
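As a rough sketch of how this could look in practice: a routing config that falls back from one provider to another with retries, attached when the client is created. The config field names (strategy, retry, targets, virtual_key) and the constructor options follow our understanding of Portkey's gateway config schema and should be treated as assumptions; the key names are placeholders.

import Portkey from 'portkey-ai';

// Hypothetical gateway config: try OpenAI first, fall back to a backup
// provider, retrying up to 3 times. Field names are assumed, keys are placeholders.
const routingConfig = {
  strategy: { mode: 'fallback' },
  retry: { attempts: 3 },
  targets: [
    { virtual_key: 'openai-prod-key' },        // placeholder virtual key
    { virtual_key: 'azure-openai-backup-key' } // placeholder virtual key
  ],
};

const portkey = new Portkey({
  apiKey: process.env.PORTKEY_API_KEY, // assumed constructor option
  config: routingConfig,               // assumed constructor option
});

const chat = await portkey.chat.completions.create({
  model: 'gpt-4',
  messages: [{ role: 'user', content: 'Say this is a test' }],
});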




BUILD AND DEPLOY EFFECTIVE PROMPTS

Ditch git—collaboratively develop the best prompts and deploy them from a single
place.

Prompt Playground →
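For illustration, a hedged sketch of calling a prompt after it has been deployed from the playground. The prompts.completions.create method and the promptID/variables parameters are assumptions about the SDK surface rather than something stated on this page, and the prompt ID is made up; check the docs for exact names.

import Portkey from 'portkey-ai';

const portkey = new Portkey({ apiKey: process.env.PORTKEY_API_KEY });

// Hypothetical call to a prompt deployed from the playground.
// Method and field names are assumed; 'pp-welcome-email-001' is a made-up ID.
const completion = await portkey.prompts.completions.create({
  promptID: 'pp-welcome-email-001',
  variables: { customer_name: 'Ada', product: 'Portkey' },
});

console.log(completion);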


EVALUATE OUTPUTS WITH AI AND HUMAN FEEDBACK

Collect and track feedback from users. Set up tests to auto-judge outputs and
find what's not working, in real time.
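A minimal sketch of what wiring user feedback back to a request might look like; the feedback.create method and the traceID/value fields are assumptions about the SDK, not something shown on this page.

import Portkey from 'portkey-ai';

const portkey = new Portkey({ apiKey: process.env.PORTKEY_API_KEY });

// Hypothetical: attach a thumbs-up from the user to the trace ID that was
// sent with the original request. Method and field names are assumed.
await portkey.feedback.create({
  traceID: 'checkout-flow-1234', // same ID used when making the request
  value: 1,                      // e.g. 1 for thumbs-up, -1 for thumbs-down
});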




 * DANIEL YUABOV
   
   Building Autoeasy


 * "It's the simplest tool I've found for managing prompts and getting insights from our AI models. With Portkey, we can easily understand how our models are performing and make improvements. It gives us confidence to put things in production and scale our business without breaking a sweat. Plus, the team at Portkey is always responsive and helpful."
   
   
   PABLO PAZOS
   
   Building Barkibu - AI-first pet health insurance


 * "While developing Albus, we were handling more than 10,000 questions daily. The challenges of managing costs, latency, and rate-limiting on OpenAI were becoming overwhelming; it was then that Portkey intervened and provided invaluable support through their analytics and semantic caching solutions."
   
   
   KARTIK M
   
   Building Albus - RAG based community support


 * "Thank you for the good work you guys are doing. I don't want to work without Portkey anymore."
   
   
   LUKA BREITIG
   
   Building The Happy Beavers - AI Content Generation


 * "We are really big fans of what you folks deliver. For us, Portkey is OpenAI."
   
   
   DEEPANSHU S
   
   Building in stealth


 * "We are using Portkey in staging and production, works really well so far. With reporting and observability being so bad on OpenAI and Azure, Portkey really helps get visibility into how and where we are using GPT which becomes a problem as you start using it at scale within a company and product."
   
   
   SWAPAN R
   
   Building Haptik.ai


 * "I really like Portkey so far. I am getting all the queries I made saved in Portkey out of box. Integration is super quick if using OpenAI directly. And basic analytics around costing is also useful. OpenAI fails often and I can set Portkey to fallback to Azure as well."
   
   
   SIDDHARTH BULIA
   
   Building in Stealth




INTEGRATE IN A MINUTE


Works with OpenAI and other AI providers out of the box. Natively integrated
with LangChain, LlamaIndex and more.

View Integration Docs →

Node.js

Python

OpenAI JS

OpenAI Py

cURL

import Portkey from 'portkey-ai';

// Create a Portkey client and send a chat completion through the gateway
const portkey = new Portkey();

const chat = await portkey.chat.completions.create({
  messages: [{ role: 'user', content: 'Say this is a test' }],
  model: 'gpt-4',
});

console.log(chat.choices);





BUILD YOUR AI APP'S CONTROL PANEL NOW

Schedule your demo →

No Credit Card Needed

SOC2 Certified

Fanatical Support


30% FASTER LAUNCH


With a full-stack ops platform, focus on building your world-domination app. Or, something nice.


99.99% UPTIME


We maintain strict uptime SLAs to ensure that you don't go down. When we're down, we pay you back.


40MS LATENCY PROXIES


Cloudflare Workers enable our blazing fast APIs with <40ms latencies. We won't slow you down.


100% COMMITMENT


We've built & scaled LLM systems for over 3 years. We want to partner and make your app win.


FAQ


GOT QUESTIONS?

If you have any other questions, please get in touch at hello@portkey.ai


HOW DOES PORTKEY WORK?



You can integrate Portkey by replacing the OpenAI API base path in your app with
Portkey's API endpoint. Portkey then routes all your requests to OpenAI, giving
you control over everything that's happening. You can then unlock additional
value by managing your prompts & parameters in a single place.
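For example, with the OpenAI Node SDK this base-path swap is roughly the following sketch; the gateway URL and the x-portkey-* header names are assumptions drawn from Portkey's docs rather than from this page.

import OpenAI from 'openai';

// Point the existing OpenAI client at Portkey's gateway instead of
// api.openai.com. URL and header names are assumed, not taken from this page.
const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
  baseURL: 'https://api.portkey.ai/v1',
  defaultHeaders: {
    'x-portkey-api-key': process.env.PORTKEY_API_KEY ?? '',
    'x-portkey-provider': 'openai',
  },
});

const chat = await openai.chat.completions.create({
  model: 'gpt-4',
  messages: [{ role: 'user', content: 'Say this is a test' }],
});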


HOW DO YOU STORE MY DATA?



Portkey is ISO 27001 and SOC 2 certified. We're also GDPR compliant. We follow
best practices for securing our services and for data storage and retrieval.
All your data is encrypted in transit and at rest. For enterprises, we offer
managed hosting to deploy Portkey inside private clouds. If you'd like to
discuss these options, drop us a note at hello@portkey.ai


WILL THIS SLOW DOWN MY APP?



No, we actively benchmark to check for any additional latency introduced by
Portkey. With built-in smart caching, automatic failover, and edge compute
layers, your users might even notice an overall improvement in your app
experience.
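As an illustration of the caching mentioned above, a gateway config that serves semantically similar requests from cache; the cache block, its mode/max_age fields, and the constructor options are assumptions about the config schema, and the virtual key is a placeholder.

import Portkey from 'portkey-ai';

// Hypothetical: enable semantic caching so near-duplicate requests are served
// from cache for up to an hour. Config field names are assumed.
const portkey = new Portkey({
  apiKey: process.env.PORTKEY_API_KEY, // assumed constructor option
  config: {
    cache: { mode: 'semantic', max_age: 3600 },
    targets: [{ virtual_key: 'openai-prod-key' }], // placeholder virtual key
  },
});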

Status Page

Privacy Policy

Cookie Policy

Terms of Service

DPA

© 2024 Portkey, Inc



