refactor: create tensorzero-providers crate (step 1) by AntoineToussaint · Pull Request #7305 · tensorzero/tensorzero · GitHub
refactor: create tensorzero-providers crate (step 1)#7305

Open
AntoineToussaint wants to merge 4 commits into pr2-inference-traits-to-shared from pr3-providers-step1
Conversation

@AntoineToussaint
Member

Summary

Stacks on #7303. Creates the tensorzero-providers crate and moves the first batch of providers, i.e. those with no dependency on openai/.

Moved (11 modules):

  • Foundation: helpers, helpers_thinking_block, chat_completions
  • AWS: aws_common, aws_bedrock, aws_sagemaker
  • Anthropic family: anthropic, gcp_vertex_anthropic
  • GCP Gemini family: gcp_vertex_gemini, google_ai_studio_gemini

Still in core (to be moved in step 2): openai/, azure, deepseek, fireworks, groq, hyperbolic, mistral, openrouter, sglang, tgi, together, vllm, xai

Test plan

  • cargo check --all-targets --all-features
  • cargo clippy --all-targets --all-features -- -D warnings
  • CI green

🤖 Generated with Claude Code

AntoineToussaint and others added 4 commits April 14, 2026 15:53
Move GCP credential resolution and `BatchConfig` building out of the GCP
provider constructors and into `crate::model` callers. The provider
constructors now take pre-resolved `GCPVertexCredentials` and a fully
built `Option<BatchConfig>` directly, so the GCP provider modules no
longer depend on `crate::config::provider_types::*` or
`crate::model_table::{GCPVertexAnthropicKind, GCPVertexGeminiKind,
ProviderTypeDefaultCredentials}`.

This removes the last blocker preventing extraction of `providers/`
into a standalone `tensorzero-providers` crate.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
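The constructor change above can be sketched as follows. This is an illustrative stand-in, not the actual tensorzero code: `GcpCredentials`, `BatchConfig`, and `GeminiProvider` here are simplified placeholders for the real `GCPVertexCredentials`, `BatchConfig`, and GCP provider types. The point is the inversion: the caller resolves credentials and builds the batch config, and the provider constructor just receives finished values, so the provider module needs no config or model-table imports.

```rust
// Hypothetical pre-resolved credential value, built by the caller
// (in the real code, crate::model resolves GCPVertexCredentials).
#[derive(Debug, Clone)]
struct GcpCredentials(String);

#[derive(Debug, Clone)]
struct BatchConfig {
    max_batch_size: usize,
}

struct GeminiProvider {
    credentials: GcpCredentials,
    batch: Option<BatchConfig>,
}

impl GeminiProvider {
    // The provider no longer knows *how* credentials or batch config are
    // produced; it only stores the fully built values it is handed.
    fn new(credentials: GcpCredentials, batch: Option<BatchConfig>) -> Self {
        Self { credentials, batch }
    }
}

fn main() {
    // The caller (formerly the constructor itself) performs resolution.
    let creds = GcpCredentials("token-from-env".to_string());
    let provider = GeminiProvider::new(creds, Some(BatchConfig { max_batch_size: 8 }));
    assert_eq!(provider.batch.as_ref().unwrap().max_batch_size, 8);
    println!("provider ready: {:?}", provider.credentials);
}
```

Because the constructor's signature now mentions only types the providers crate can own or depend on, the module's dependency edges to `crate::config` and `crate::model_table` disappear, which is exactly the "last blocker" the message refers to.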
Move shared types and utilities out of tensorzero-core into
tensorzero-inference-types, in preparation for extracting
crates/tensorzero-core/src/providers/ into a standalone
tensorzero-providers crate in a follow-up PR.

Moved (re-exported from core for backward compatibility):

- `InferenceProvider`, `WrappedProvider`, `TensorZeroEventError`
  from `tensorzero-core/src/inference/mod.rs` to a new
  `tensorzero-inference-types::provider_trait` module. The trait
  is the linchpin every provider implements; relocating it lets
  the providers crate live without depending on tensorzero-core.

- `BatchRequestRow` and `UnparsedBatchRequestRow` (plus their
  `sqlx::FromRow` impl for `PgRow`) into
  `tensorzero-inference-types`. The orphan rule means the FromRow
  impl had to follow the type.

- `deprecation_warning`, `is_mock_mode`, `get_mock_provider_api_base`
  into a new `tensorzero-inference-types::utils` module.
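The orphan-rule point in the second bullet can be illustrated with a minimal sketch (stand-in names; the real code involves `sqlx::FromRow` and `sqlx::postgres::PgRow`). Rust's coherence rules only allow `impl Trait for Type` in a crate that defines either the trait or the type, so once `BatchRequestRow` moved out of tensorzero-core, its `FromRow` impl could not stay behind: at the old location both the trait and the type would be foreign.

```rust
// Stand-in for a trait defined in an external crate (like sqlx::FromRow).
trait FromRow {
    fn from_row(raw: &str) -> Self;
}

// The type that moved into tensorzero-inference-types...
struct BatchRequestRow {
    id: String,
}

// ...so this impl must live in the same crate as the type. If the impl
// stayed in the old crate, neither the trait nor the type would be local
// there, and the compiler would reject it (error E0117, the orphan rule).
impl FromRow for BatchRequestRow {
    fn from_row(raw: &str) -> Self {
        BatchRequestRow { id: raw.to_string() }
    }
}

fn main() {
    let row = BatchRequestRow::from_row("req-42");
    assert_eq!(row.id, "req-42");
}
```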

Also exposes `tensorzero-inference-types::serde_helpers` and adds
`deserialize_json_string` to it (needed by the moved BatchRequestRow).

Adds `async-trait` and `reqwest-sse-stream` as deps of
`tensorzero-inference-types`.

No behavior change. `cargo check --all-targets --all-features`
and `cargo clippy --all-targets --all-features -- -D warnings`
both pass.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
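The "re-exported from core for backward compatibility" mechanism above is the standard `pub use` pattern. A minimal sketch, with modules standing in for the two crates (names are illustrative): the item's new home is the inference-types crate, and the old crate republishes it so downstream `use` paths keep compiling unchanged.

```rust
// Stand-in for the new tensorzero-inference-types crate.
mod tensorzero_inference_types {
    // New home of the trait.
    pub trait InferenceProvider {
        fn name(&self) -> &'static str;
    }
}

// Stand-in for tensorzero-core: re-export so the old path still resolves.
mod tensorzero_core {
    pub use super::tensorzero_inference_types::InferenceProvider;
}

// Downstream code written against the old path is unaffected.
struct Dummy;

impl tensorzero_core::InferenceProvider for Dummy {
    fn name(&self) -> &'static str {
        "dummy"
    }
}

fn main() {
    use tensorzero_core::InferenceProvider;
    assert_eq!(Dummy.name(), "dummy");
}
```

This is what makes the move a non-breaking "no behavior change" refactor: callers of the core crate see the same public items at the same paths, even though the definitions now live one crate lower in the dependency graph.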
…-openai providers)

Create the `tensorzero-providers` crate and move the first batch of
provider files that have no dependency on `openai/`:

Moved:
- helpers.rs, helpers_thinking_block.rs, chat_completions.rs (foundation)
- aws_common.rs, aws_bedrock.rs, aws_sagemaker.rs
- anthropic.rs, gcp_vertex_anthropic.rs
- gcp_vertex_gemini/, google_ai_studio_gemini.rs

Remaining in core (depend on openai/ or core test infra):
- openai/, azure, deepseek, fireworks, groq, hyperbolic, mistral,
  openrouter, sglang, tgi, together, vllm, xai, dummy, test_helpers

Also:
- GCPVertexGeminiSupervisedRow refactored to own Vec<FunctionToolDef>
- Extension trait for building supervised rows from LazyRenderedSample
- From<&FunctionTool> for FunctionToolDef added in core
- Pre-commit exclusion for providers gcp_vertex_gemini test key
- warn_inference_parameter_not_supported + serialize_or_log added to
  tensorzero_inference_types::utils
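The extension-trait item in the list above can be sketched as follows (hypothetical names loosely mirroring `LazyRenderedSample` and the supervised-row types; not the real API). An extension trait lets the providers crate add a method to a type owned by core without core knowing anything about providers, which keeps the dependency pointing in one direction.

```rust
// Stand-ins for a core-owned sample type and a provider-side row type.
struct RenderedSample {
    prompt: String,
    completion: String,
}

struct SupervisedRow {
    text: String,
}

// Defined in the providers crate: adds a method to the foreign sample
// type. This is allowed because the *trait* is local at the impl site.
trait SupervisedRowExt {
    fn to_supervised_row(&self) -> SupervisedRow;
}

impl SupervisedRowExt for RenderedSample {
    fn to_supervised_row(&self) -> SupervisedRow {
        SupervisedRow {
            text: format!("{} => {}", self.prompt, self.completion),
        }
    }
}

fn main() {
    let sample = RenderedSample {
        prompt: "2+2".into(),
        completion: "4".into(),
    };
    // Callers just bring SupervisedRowExt into scope to use the method.
    assert_eq!(sample.to_supervised_row().text, "2+2 => 4");
}
```

The same one-way-dependency idea explains the `From<&FunctionTool> for FunctionToolDef` impl staying in core: core defines both types there, so the conversion lives where the orphan rule requires it.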

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>