Announcements
Stay informed about Portkey's newest releases, features, and improvements
New Update: Get a Detailed View of Request and Response Tokens!
New Feature · Announcement
Portkey just upgraded its Tokens Used Chart! 🚀
You can now see a detailed breakdown of Request Tokens and Response Tokens used for each interaction. Stay on top of your token usage like never before!
Portkey at AWS re:Invent
Community
Portkey's founders Rohit & Ayush will be at AWS re:Invent in Las Vegas next week. If you're attending the event, we'd love to catch up and chat about LLMs, Bedrock, Gateways, Governance, Observability and more!
We're hosting a small practitioners' meetup on the sidelines. Please join us: https://lu.ma/jyreb3gd
Weekly AI Engineering Hours
Announcement
We're launching weekly community calls to help you get the most out of Portkey and learn from other AI engineers building in production.
What to expect:
- Deep dives into new Portkey features
- Real implementation stories from the community
- Production best practices and challenges
- Live Q&A
Every Friday, 8 AM PT | Register here
Announcing: prompt.new
Announcement
We're thrilled to announce the launch of "prompt.new": your new go-to destination for building, testing, and refining AI prompts across 1600+ LLMs.
Here’s why prompt.new is a game-changer:
💡 Instant Access to 1600+ Models – No setup, no hassle. Start testing right away.
⚡ Switch Models in a Click – Compare outputs from multiple providers seamlessly.
🛠️ JSON Mode – Perfect for building production-ready prompts.
🤝 Versioning & Collaboration – Keep track of every tweak and iteration. Work with your team in real time.
No integrations. No setup. Just fast, intuitive prompt testing at your fingertips.
Try prompt.new today. With Portkey, building with AI has never been this simple.
Launching Prompt Labels
New Feature · Announcement
We're excited to introduce Prompt Labels to Portkey! 🎉
With this new feature, you can now add labels to different versions of your prompts—such as staging, dev, and production—to help you organize and manage your prompts more effectively.
Labels provide greater flexibility, allowing you to easily identify and categorize prompts based on their environment or version, making it easier to track changes.
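If you consume saved prompts through the Portkey SDK, labels can also make it explicit which environment's version a request runs. Below is a minimal sketch assuming the Python SDK's prompts API and an @label suffix on the prompt ID; the prompt ID, variable name, and suffix syntax are illustrative, so check the Prompts docs for the exact convention.

```python
from portkey_ai import Portkey

portkey = Portkey(api_key="PORTKEY_API_KEY")  # placeholder key

# Hypothetical prompt ID; "@production" pins whichever version carries that label.
completion = portkey.prompts.completions.create(
    prompt_id="pp-support-bot-123@production",
    variables={"customer_query": "How do I rotate my API keys?"},
)

print(completion.choices[0].message.content)
```

With this pattern, promoting a new prompt version to production should be a label change in the dashboard rather than a code change.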
🚨 OpenAI Model Deprecation Alert!
Announcement
OpenAI is shutting down GPT-4 Vision Preview models on December 6, 2024.
If you've been using:
• gpt-4-vision-preview
• gpt-4-1106-vision-preview
It's time to migrate to GPT-4o.
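For most integrations the switch is a one-line change, since GPT-4o accepts the same mixed text-and-image messages. Here is a minimal sketch with the OpenAI Python SDK (the image URL is a placeholder); if you route the call through Portkey, the request body stays the same.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o",  # was: "gpt-4-vision-preview" or "gpt-4-1106-vision-preview"
    messages=[{
        "role": "user",
        "content": [
            {"type": "text", "text": "Describe this image in one sentence."},
            {"type": "image_url", "image_url": {"url": "https://example.com/sample.png"}},  # placeholder URL
        ],
    }],
)

print(response.choices[0].message.content)
```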
🚀 Lessons from Building a Multi-Billion Scale AI Gateway
From 0 to 2Bn+ requests on Portkey’s AI Gateway!
Listen to Rohit Agarwal as he shares lessons from building a billion-scale AI Gateway, in conversation with Shaw Talebi from The Data Entrepreneurs.
Why do 75% of AI projects get stuck in the proof-of-concept stage? Rohit shares how Portkey is helping 600+ teams overcome the production challenges of AI deployment.
✨ Portkey Now Supports Controlled Generations on Google Vertex AI
Announcement
With Pydantic & Zod support, you can now ensure Gemini models return responses that exactly match the JSON schema you define. No more unexpected outputs! 🎯
Currently Supported Models:
➡️ Gemini 1.5 Pro
➡️ Gemini 1.5 Flash
Link to Docs
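For reference, here is a minimal sketch of one way to call controlled generations from Python: the OpenAI SDK pointed at Portkey's gateway, with a Pydantic model supplied as the response format. The keys, virtual key, and Recipe schema are placeholders; see the linked docs for the exact setup.

```python
from openai import OpenAI
from portkey_ai import PORTKEY_GATEWAY_URL, createHeaders
from pydantic import BaseModel


class Recipe(BaseModel):
    name: str
    ingredients: list[str]
    steps: list[str]


client = OpenAI(
    api_key="dummy",  # provider auth is handled by the Portkey virtual key
    base_url=PORTKEY_GATEWAY_URL,
    default_headers=createHeaders(
        api_key="PORTKEY_API_KEY",         # placeholder Portkey API key
        virtual_key="VERTEX_VIRTUAL_KEY",  # placeholder Vertex AI virtual key
    ),
)

# The SDK turns the Pydantic model into a JSON schema, and the gateway
# enforces it on the Gemini response.
completion = client.beta.chat.completions.parse(
    model="gemini-1.5-pro",
    messages=[{"role": "user", "content": "Give me a simple pasta recipe."}],
    response_format=Recipe,
)

print(completion.choices[0].message.parsed)
```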
Launching Prompt Folders
New Feature
This was a long-standing demand from many users: the ability to organize prompt templates inside folders.
This is available to EVERY org now!
- Create multiple folders as you need
- Move any prompt from one folder to another
Try it out now!