Releases: CartoDB/litellm

v1.79.1-carto.1.0.20

09 Jan 23:17
8e58bb1


🚀 CARTO Release for LiteLLM v1.79.1

Base Version: LiteLLM v1.79.1

🐛 Bug Fixes

  • Responses API Session Handler: Fixed NameError crash when Redis cache misses occur. The code referenced an undefined _ENTERPRISE_ResponsesSessionHandler class that was renamed in upstream LiteLLM. Now uses the correct ResponsesSessionHandler class name.

  • Azure Responses API: Fixed 404 errors when using Azure OpenAI with deployment-specific paths in api_base. The Responses API URL construction now correctly strips deployment paths (e.g., /openai/deployments/gpt-4o/chat/completions), since the Responses API uses /openai/responses and specifies the model in the request body (see the sketch after this list).

  • Snowflake Responses API Tool Calling: Fixed tool calling support for Snowflake provider, including tool_choice transformation, streaming handler for tool_use responses, and proper message format with content and content_list fields.

  • Snowflake Conversation History: Fixed the "content or contentList must be provided" error when using previous_response_id to continue conversations. Empty assistant messages (created when Claude only calls tools) are now filtered out of the conversation history loaded from Redis or spend logs (see the sketch after this list).

  • Snowflake PAT Authentication: Added support for Personal Access Token (PAT) authentication, with automatic pat/ prefix detection and stripping and the correct PROGRAMMATIC_ACCESS_TOKEN header type (see the sketch after this list).

  • Snowflake Claude Streaming: Added a custom streaming handler for Claude models to handle the missing 'created' field in streaming responses (see the sketch after this list).

  • Prisma Offline Binaries: Prisma binaries are now downloaded at build time instead of runtime, fixing egress traffic issues in security-restricted environments (cherry-picked from upstream #17695).
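
As a rough illustration of the Azure fix, the sketch below shows the kind of path stripping described: everything from /openai/deployments/... onward is dropped so the Responses API route can be built from the resource root. The helper name and example URL are illustrative, not the actual LiteLLM code.

```python
import re

def strip_azure_deployment_path(api_base: str) -> str:
    """Illustrative helper: reduce a deployment-specific Azure api_base to the
    resource root so /openai/responses can be appended and the model sent in
    the request body."""
    # e.g. https://my-resource.openai.azure.com/openai/deployments/gpt-4o/chat/completions
    #   -> https://my-resource.openai.azure.com
    return re.sub(r"/openai/deployments/.*$", "", api_base.rstrip("/"))
```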
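
The conversation-history fix amounts to filtering assistant turns that carry no content before replaying them to Snowflake. A minimal sketch, assuming messages are plain dicts and using the content/content_list field names mentioned above:

```python
def drop_empty_assistant_messages(messages: list[dict]) -> list[dict]:
    """Illustrative filter: remove assistant turns with no content (e.g. turns
    where Claude only emitted tool calls), which Snowflake rejects."""
    kept = []
    for msg in messages:
        if msg.get("role") == "assistant" and not (
            msg.get("content") or msg.get("content_list")
        ):
            continue  # skip empty assistant turns replayed from session history
        kept.append(msg)
    return kept
```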
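
For the PAT change, the sketch below shows the intended shape of the logic: detect the pat/ prefix, strip it, and send the PROGRAMMATIC_ACCESS_TOKEN token type. The exact header names are assumptions based on Snowflake's token-type convention, not copied from the CARTO code.

```python
def build_snowflake_auth_headers(token: str) -> dict:
    # Illustrative sketch only; header names and the non-PAT fallback are assumptions.
    if token.startswith("pat/"):
        return {
            "Authorization": f"Bearer {token[len('pat/'):]}",
            "X-Snowflake-Authorization-Token-Type": "PROGRAMMATIC_ACCESS_TOKEN",
        }
    # Other credentials keep their existing token type (e.g. OAuth).
    return {
        "Authorization": f"Bearer {token}",
        "X-Snowflake-Authorization-Token-Type": "OAUTH",
    }
```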
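
The Claude streaming fix boils down to defaulting the 'created' timestamp when a chunk omits it. A hypothetical normalization step (the function name is illustrative):

```python
import time

def normalize_claude_stream_chunk(chunk: dict, model: str) -> dict:
    """Illustrative: Claude chunks from Snowflake may omit 'created', which the
    OpenAI-style response objects expect, so default it to the current time."""
    chunk.setdefault("created", int(time.time()))
    chunk.setdefault("model", model)
    return chunk
```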

✨ Features

No new features in this release.

🔒 Security

No CARTO-specific security changes in this release.

⚠️ Breaking Changes

No breaking changes in this release.


This release includes 7 bug fixes focused on Responses API, Snowflake provider, and Azure integration. Infrastructure and CI changes are not listed.

v1.79.1-carto.1.0.18

11 Dec 19:46
6ca4677


🚀 CARTO Release for LiteLLM v1.79.1

Base Version: LiteLLM v1.79.1

🐛 Bug Fixes

  • Fixed malformed JSON in streaming tool call arguments with Gemini models: Resolved intermittent JSON parse errors ("Unexpected non-whitespace character after JSON") that occurred when using Gemini models with the Responses API streaming and tool calls. The issue was caused by Gemini streaming sending duplicate/overlapping JSON chunks. Now uses Python's built-in JSONDecoder.raw_decode() to properly extract valid JSON objects from streaming responses, ensuring reliable tool call parsing.
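
In practice the fix relies on json.JSONDecoder.raw_decode, which parses one complete JSON value from a buffer and reports where it ended, so duplicated or concatenated chunks no longer break json.loads. A simplified sketch; the function name and buffering strategy are illustrative, not the actual streaming handler:

```python
import json

_decoder = json.JSONDecoder()

def extract_json_objects(buffer: str) -> tuple[list, str]:
    """Pull complete JSON values out of a streaming buffer that may contain
    concatenated or duplicated chunks; return the parsed values plus any
    trailing partial data to carry into the next chunk."""
    objects, idx = [], 0
    while idx < len(buffer):
        # Skip separators/noise between values.
        while idx < len(buffer) and buffer[idx] not in "{[":
            idx += 1
        if idx >= len(buffer):
            break
        try:
            obj, end = _decoder.raw_decode(buffer, idx)
        except json.JSONDecodeError:
            break  # incomplete value; wait for more stream data
        objects.append(obj)
        idx = end
    return objects, buffer[idx:]
```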

This release includes 1 product-relevant change. Infrastructure and CI changes are not listed here.

v1.79.1-carto.1.0.17

10 Dec 23:08
9ee08fe


🚀 CARTO Release for LiteLLM v1.79.1

Base Version: LiteLLM v1.79.1 (synced from v1.78.5)

🔄 Upstream Sync

This release includes a major upstream sync from LiteLLM v1.78.5 to v1.79.1, incorporating 250+ upstream commits.

Upstream Releases Included:

Compare Upstream Changes:

Key Upstream Improvements:

  • Provider Enhancements: Added OpenAI video generation/retrieval support, Cohere v2/chat API support, enhanced Bedrock inference profile handling, and Lasso API v3 upgrade
  • Guardrails & Security: Added Model Armor guardrails with during_call support, improved message redaction controls via LiteLLM-Disable-Message-Redaction header, and enhanced IBM Guardrails configuration
  • Observability: Fixed Langfuse OTEL integration for tool usage messages, resolved duplicate trace issues, and improved OpenTelemetry header handling
  • Cost Tracking: Added cost tracking for Vertex AI passthrough batch jobs and Titan image generations
  • Bug Fixes: Fixed MLFlow tag handling, Azure AI vector stores, managed files error handling, Bedrock health checks, and Redis cluster hash tags
  • For full upstream changes, see the comparison link above

Sync Strategy:

  • ✅ Preserved all CARTO customizations
  • ✅ Accepted all upstream core improvements
  • ✅ Resolved conflicts favoring CARTO infrastructure + upstream functionality

✨ CARTO-Specific Features

  • Snowflake PAT Authentication: Added support for Snowflake PAT (Personal Access Token) authentication with automatic pat/ prefix detection and PROGRAMMATIC_ACCESS_TOKEN header handling
  • Claude Streaming for Snowflake: Implemented a custom streaming handler for Claude models on Snowflake that handles the missing 'created' field in responses

🐛 CARTO-Specific Bug Fixes

  • Vertex AI Metadata Handling: Fixed a crash when the LiteLLM proxy encounters null metadata in Vertex AI provider responses (cherry-picked from upstream, with CARTO-specific provider-aware labels filtering preserved; see the sketch after this list)
  • Docker Python Version Control: Switched Docker base image from python:latest-dev to wolfi-base to prevent Python 3.14 compatibility issues with native extension packages. Added dynamic Python version detection and proper Prisma CLI support in runtime
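
The metadata fix is essentially a None guard before reading provider metadata. A minimal sketch under that assumption; the function name is illustrative:

```python
def get_metadata_labels(litellm_params: dict) -> dict:
    """Illustrative guard: 'metadata' can legitimately arrive as None on
    Vertex AI requests, so normalize it to an empty dict before deriving
    anything (such as labels) from it."""
    metadata = litellm_params.get("metadata") or {}
    return {str(key): str(value) for key, value in metadata.items()}
```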

🔒 Security

No CARTO-specific security changes. All upstream security fixes included.

⚠️ Breaking Changes

No CARTO-specific breaking changes. See upstream release notes for upstream breaking changes.


This release includes upstream sync (250+ commits) + 3 CARTO-specific product changes. Infrastructure and CI changes are not listed.

carto-v1.79.1-1.74.336

18 Nov 16:22
00e06b8


🚀 CARTO Release for LiteLLM v1.79.1

Base Version: LiteLLM v1.79.1 (synced from v1.75.2)

🔄 Upstream Sync Details

This release includes a major upstream sync that brings CARTO's fork from LiteLLM v1.75.2 to v1.79.1.

Upstream Releases Included:

Compare Upstream Changes:

Sync Strategy:

  • ✅ Preserved all CARTO customizations (workflows, Docker configs, documentation)
  • ✅ Accepted all upstream core LiteLLM improvements
  • ✅ Resolved conflicts favoring CARTO infrastructure + upstream functionality
  • ✅ Updated model pricing data (from ~17K to ~24K lines, adding support for 6,435 more models)

Key Conflict Resolutions:

  • model_prices_and_context_window_backup.json - Kept upstream version (more up-to-date model data)
  • CARTO workflows - Preserved all carto_*.yaml and carto-*.yml files
  • CARTO documentation - Preserved all CARTO_*.md files

✨ Features

  • Enhanced Model Support: Synced with upstream's expanded model catalog, adding support for 6,435+ additional model configurations
  • Vertex AI Labels Support: Enhanced the Vertex AI integration to support custom labels in requests. Labels can now be passed directly or automatically converted from metadata fields, enabling better request tracking and organization in Google Cloud (see the sketch after this list)
  • Docker Image Improvements: Database view creation script (create_views.py) is now included in the base Docker image, simplifying downstream image builds and deployments
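
For the labels feature, the sketch below shows one way labels could be passed directly or derived from metadata. The key selection and lowercasing are assumptions for illustration (Vertex AI requires lowercase label keys and values), not the actual CARTO logic.

```python
def resolve_vertex_labels(optional_params: dict, metadata: dict | None) -> dict:
    """Prefer explicitly provided labels; otherwise derive them from request
    metadata so calls can be tracked and organized in Google Cloud."""
    labels = optional_params.get("labels")
    if labels:
        return labels
    return {
        str(key).lower(): str(value).lower()
        for key, value in (metadata or {}).items()
        if isinstance(value, (str, int, float, bool))
    }
```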

🐛 Bug Fixes

  • Vertex AI Metadata Handling: Fixed crash when LiteLLM proxy encounters null metadata in Vertex AI provider responses (cherry-picked from upstream)
  • Provider-Aware Labels Filtering: Corrected labels field handling to ensure proper provider-specific behavior: Google AI Studio (Gemini) endpoints now correctly exclude the labels field while Vertex AI endpoints include it, preventing request failures (see the sketch after this list)
  • Prisma CLI Path Resolution: Fixed Prisma CLI path detection in Docker containers for offline mode operation. The proxy now correctly locates pre-cached Prisma binaries, eliminating runtime npm download failures (403 errors) in air-gapped environments
  • Database View Creation: Improved error handling for database view creation in containerized environments. The script now gracefully handles missing databases, connection issues, and non-PostgreSQL databases without showing unnecessary errors
  • Upstream Compatibility: Includes all bug fixes from LiteLLM v1.78.5, v1.79.0, and v1.79.1 stable releases
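
The provider-aware filtering fix can be pictured as a small guard on the outgoing parameters: the labels field is kept for Vertex AI requests but dropped for Google AI Studio, which rejects it. The provider string 'gemini' follows LiteLLM's usual naming but should be read as an assumption; the function is a sketch rather than the shipped code.

```python
def filter_labels_for_provider(optional_params: dict, custom_llm_provider: str) -> dict:
    """Keep 'labels' for Vertex AI requests; drop it for Google AI Studio
    ('gemini') requests, which fail when the field is present."""
    if custom_llm_provider == "gemini":
        return {k: v for k, v in optional_params.items() if k != "labels"}
    return optional_params
```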

🔒 Security

No CARTO-specific security changes in this release. All upstream security fixes from v1.75.2 → v1.79.1 are included.

⚠️ Breaking Changes

No breaking changes in CARTO customizations. Breaking changes from upstream (if any) are documented in upstream release notes.

📝 Version Note

⚠️ Note on versioning: The version number 1.74.336 was auto-calculated by semver-generator from commit history. Future releases will use a corrected versioning scheme starting from 1.0.0 for each new upstream base version.


This release includes upstream sync and 7 CARTO-specific changes. Infrastructure and CI changes are not listed here.

PRs Included:

  • #26 - Upstream sync conflict resolution
  • #30 - Intelligent release notes with Claude Code AI filtering

carto-v1.75.2-1.74.334

14 Nov 20:21
41441b9


🚀 CARTO Release

Base Version: LiteLLM v1.75.2
Previous Tag: carto-v1.75.2-1.74.299

Changes Since Last Release

🐛 Bug Fixes

41441b9 fix: cherry-pick upstream metadata None check [sc-521238] (#28)
4d1b516 fix: remove leftover monitor job code from resolver workflow
b643ae7 fix: correct workflow names in CI fixer trigger
4b092cb Fix workflow dispatch capability
8089cce Fix workflow dispatch capability
a0a9daa Fix workflow dispatch capability
d26c233 Fix workflow dispatch capability
2ff496e Fix workflow dispatch issue
08fad73 Fix workflow dispatch issue
4b313ae fix: use git reset instead of merge for upstream sync
9a75cea Add suffix 'CARTO' in custom workflows
55706c1 fix: auto-resolve conflicts in main branch sync with upstream
da1cdda Fix indentation
e6e381b refactor: convert to pure bash workflow with correct branch strategy

✨ Features

8b2452a feat: create separate CI fixer workflow with upstream test check
8621cf8 feat: add CI monitoring and auto-fix to conflict resolver
6dcceee feat: add GitHub Actions output grouping and improve Slack notifications
5a15c18 feat: add Claude Code automated upstream sync

🔧 Chores & Maintenance

📦 All Commits

41441b9 fix: cherry-pick upstream metadata None check [sc-521238] (#28)
2a91d7d Improve CARTO_CLAUDE.md to always search for upstream solution
4d1b516 fix: remove leftover monitor job code from resolver workflow
b643ae7 fix: correct workflow names in CI fixer trigger
8fd7e35 Add changes to sync-resolver
8b2452a feat: create separate CI fixer workflow with upstream test check
8621cf8 feat: add CI monitoring and auto-fix to conflict resolver
2891ab9 Optimize workflow...
712d674 Set max turns for claude to 250
3428dea remove timeout from claude action
4b092cb Fix workflow dispatch capability
8089cce Fix workflow dispatch capability
a0a9daa Fix workflow dispatch capability
d26c233 Fix workflow dispatch capability
2ff496e Fix workflow dispatch issue
08fad73 Fix workflow dispatch issue
10297aa add actors
1a8520e Improve Claude conflict resolution workflow
b5efffb Add Claude conflict resolution workflow
72bbb1f refactor: use merge-first strategy with reset fallback
4b313ae fix: use git reset instead of merge for upstream sync
542804e refactor: decouple PR creation from upstream sync and improve merge strategy
9a75cea Add suffix 'CARTO' in custom workflows
683f8f8 add CARTOFANTE token in all steps
20891b5 Revert PR-automerge approach + add Cartofante token
ab28f94 Add permissions
55706c1 fix: auto-resolve conflicts in main branch sync with upstream
6dcceee feat: add GitHub Actions output grouping and improve Slack notifications
ef367d2 Remove docs for the release process
da1cdda Fix indentation
e6e381b refactor: convert to pure bash workflow with correct branch strategy
63b9936 Add changes
f6eb010 refactor: replace Claude Code sync with bash-based workflow
5a15c18 feat: add Claude Code automated upstream sync

carto-v1.75.2-1.74.299

28 Oct 15:34
60f9b2c


🚀 CARTO Release

Base Version: LiteLLM v1.75.2
Previous Tag: carto-v1.75.2-1.74.297

Changes Since Last Release

🐛 Bug Fixes

5748d6d Fix: Vertex AI Gemini labels field provider-aware filtering (BerriAI#14563)

✨ Features

🔧 Chores & Maintenance

📦 All Commits

5748d6d Fix: Vertex AI Gemini labels field provider-aware filtering (BerriAI#14563)
ec3c01c Optional labels field in Vertex AI request

carto-v1.75.2-1.74.297

20 Oct 18:49
8ea1bba


🎉 First CARTO Release for LiteLLM v1.75.2

Changes

912d5f3 feat: Improve create_views.py error handling for containers
98771e7 feat: Include create_views.py in base Docker image
b2a56a2 refactor: Simplify Prisma configuration and reduce complexity
45a37d5 fix: Correct Prisma CLI installation path in Docker build
aca7a78 fix: Use correct Prisma CLI path in offline mode
f941488 Modify Utils
6c97a6c Modify Utils
5e6f214 Fix issue with dependencies in image build
4cdfeed Fix issue with dependencies in image build
cdc3f3f Fix issue with dependencies in image build
2d12e24 Fix issue with dependencies in image build
5c29e21 Update prisma binaries download approach
b507b82 Updated docker/Dockerfile.non_root
9b66311 change PRISMA_NODEENV_CACHE_DIR
37e96d8 change carto-workflows naming
1078447 New version with Modify prisma-client in python
ab673c9 New version with Modify prisma-client in python
649342a Modify prisma-client in python to use the env
828ead9 Set Prisma binaries directory
d2d1375 Fix prisma binaries error again