Provider updates on Portkey ⚙️
We’ve rolled out several upgrades across major providers to make your LLM integrations faster, more flexible, and production-ready:
OpenAI & Azure OpenAI: Added support for streaming audio transcription requests (example below).
Vertex AI: Added custom model support for batch inference.
Added support for the input_audio parameter to enable richer context in requests (example after this list).
Added support for the anthropic-beta header (shown in the same example).
AWS Bedrock: Added global profile support for Bedrock models.
Added name field mapping for /messages endpoint document objects.
Streamlined the token counting endpoint to use invoke mode for better compatibility.
Added pass-through parameters for Bedrock batch create endpoints.
Azure:
Improved Azure Entra token caching for better performance.
Added pricing support for image editing models across OpenAI, Azure OpenAI, and Azure Foundry providers.
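For the streaming transcription update above, here is a minimal sketch of a request routed through the gateway, assuming the OpenAI Python SDK pointed at Portkey's base URL; the header names, model, and streaming event type are illustrative, so adapt them to your own setup:

```python
from openai import OpenAI

# Route OpenAI calls through the Portkey gateway (keys and header names are
# placeholders; use the values from your own Portkey workspace).
client = OpenAI(
    api_key="UNUSED",  # the provider key is resolved by Portkey
    base_url="https://api.portkey.ai/v1",
    default_headers={
        "x-portkey-api-key": "<PORTKEY_API_KEY>",
        "x-portkey-virtual-key": "<OPENAI_VIRTUAL_KEY>",
    },
)

with open("meeting.wav", "rb") as audio_file:
    stream = client.audio.transcriptions.create(
        model="gpt-4o-mini-transcribe",  # any transcription model you have enabled
        file=audio_file,
        stream=True,  # receive transcript text as it is produced
    )
    for event in stream:
        # Incremental text arrives as delta events; the final event carries the full text.
        if event.type == "transcript.text.delta":
            print(event.delta, end="", flush=True)
```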
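For the two Vertex AI request-level additions, the sketch below shows both through the same OpenAI-compatible surface, assuming a Vertex AI virtual key; the model names, header names, and the beta flag are placeholders rather than documented values:

```python
import base64

from openai import OpenAI

# One OpenAI-compatible client pointed at the Portkey gateway (placeholder keys).
client = OpenAI(
    api_key="UNUSED",  # Vertex credentials are resolved by Portkey
    base_url="https://api.portkey.ai/v1",
    default_headers={
        "x-portkey-api-key": "<PORTKEY_API_KEY>",
        "x-portkey-virtual-key": "<VERTEX_AI_VIRTUAL_KEY>",
    },
)

# 1) input_audio: attach base64-encoded audio as a content part of a chat message.
with open("question.wav", "rb") as f:
    audio_b64 = base64.b64encode(f.read()).decode()

audio_response = client.chat.completions.create(
    model="gemini-2.0-flash",  # placeholder Vertex AI model name
    messages=[{
        "role": "user",
        "content": [
            {"type": "text", "text": "Summarize this recording."},
            {"type": "input_audio", "input_audio": {"data": audio_b64, "format": "wav"}},
        ],
    }],
)
print(audio_response.choices[0].message.content)

# 2) anthropic-beta: forward the header on a request to an Anthropic model on Vertex.
beta_response = client.chat.completions.create(
    model="claude-3-7-sonnet@20250219",  # placeholder Anthropic-on-Vertex model id
    messages=[{"role": "user", "content": "Hello!"}],
    extra_headers={"anthropic-beta": "prompt-caching-2024-07-31"},  # illustrative beta flag
)
print(beta_response.choices[0].message.content)
```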