Merged

fix(litellm): Avoid double span exits when streaming #5933

Merge branch 'master' into webb/litellm/close-spans
ecd3718
@sentry/warden / warden: code-review completed Apr 10, 2026 in 2m 3s

1 issue

code-review: Found 1 issue (1 medium)

Medium

Test may fail if openai is not installed, since the `streaming_chat_completions_model_response` fixture uses openai types - `tests/integrations/litellm/test_litellm.py:238`

The `streaming_chat_completions_model_response` fixture in `tests/conftest.py` uses `openai.types.chat.ChatCompletionChunk` without guarding against `openai` being `None` (it is conditionally imported). The test relies on this fixture, but the test file has no `pytest.importorskip("openai")` or similar guard for this dependency. If `openai` is not installed when the litellm tests run, the fixture will raise at runtime as soon as it touches `openai.types`.
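The fix the finding points at is a module-level guard such as `pytest.importorskip("openai")` at the top of the test file. Below is a minimal, dependency-free sketch of that guard pattern using only the standard library; the helper name `importorskip_sketch` is hypothetical and stands in for pytest's real `importorskip`, which skips the test instead of returning `None`.

```python
import importlib
import importlib.util


def importorskip_sketch(module_name):
    """Return the imported module, or None when it is not installed.

    A dependency-free sketch of what pytest.importorskip("openai") does:
    detect the optional dependency up front, before any fixture touches
    it, instead of crashing inside the fixture body. (pytest would call
    pytest.skip(...) here rather than return None.)
    """
    if importlib.util.find_spec(module_name) is None:
        return None
    return importlib.import_module(module_name)
```

In the actual test file, a single `openai = pytest.importorskip("openai")` at module import time would make pytest skip every test in the module when the package is absent, so the fixture never evaluates `openai.types`.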


Duration: 2m 2s · Tokens: 653.5k in / 4.6k out · Cost: $1.04 (+extraction: $0.00)

Annotations

Check warning on line 238 in tests/integrations/litellm/test_litellm.py

@sentry-warden / warden: code-review
