Add support for optional api_key configuration for declarative openai-engine providers#9202
Open
monroewilliams wants to merge 3 commits into
Conversation
…mlx declarative providers.

These providers all set requires_auth=false since auth is optional for local inference, but due to logic elsewhere in the code, openai-engine based providers' API keys could not be configured through the UI and would not be sent to the endpoint unless requires_auth was set to true. To fix this, also correct the logic in several places so that declarative providers based on the openai engine which define a nonempty api_key_env and have requires_auth=false allow the key to be set through the UI and will send the key to the endpoint.

Changes:
- lmstudio.json: Set api_key_env to LMSTUDIO_API_KEY
- llama_swap.json: Set api_key_env to LLAMA_SWAP_API_KEY
- omlx.json: Set api_key_env to OMLX_API_KEY
- declarative_providers.rs: Update llama_swap api_key_env test assertion
- provider_registry.rs: Expose api_key_env config in UI regardless of requires_auth setting
- openai.rs: Read API key when configured, regardless of requires_auth. For the requires_auth=false case:
  - handle NotFound as not an error (the user didn't provide the optional api key)
  - warn on keyring errors instead of failing (the api key isn't needed, so this isn't fatal)
- inventory/mod.rs: Include api_key_env in identity hash so model cache invalidates when the key changes, regardless of the value of requires_auth.
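The requires_auth=false error handling described above can be sketched roughly as follows. The error type and function names here are illustrative, not the actual openai.rs code: NotFound simply means the optional key was never provided, while other keyring failures are logged as warnings rather than propagated.

```rust
/// Illustrative error type standing in for keyring errors
/// (not the crate's real type).
#[derive(Debug)]
enum SecretError {
    NotFound,
    Backend(String),
}

/// Hypothetical sketch: resolve an optional API key when
/// `requires_auth == false`. `NotFound` just means the user did not
/// provide the optional key; other errors are non-fatal warnings.
fn optional_api_key(
    lookup: impl Fn(&str) -> Result<String, SecretError>,
    env_name: &str,
) -> Option<String> {
    match lookup(env_name) {
        Ok(key) => Some(key),
        Err(SecretError::NotFound) => None, // optional key simply absent
        Err(e) => {
            // The key isn't required, so a keyring failure is not fatal.
            eprintln!("warning: failed to read {env_name}: {e:?}");
            None
        }
    }
}

fn main() {
    let key = optional_api_key(|_| Ok("secret".to_string()), "LMSTUDIO_API_KEY");
    assert_eq!(key, Some("secret".to_string()));

    let missing = optional_api_key(|_| Err(SecretError::NotFound), "LMSTUDIO_API_KEY");
    assert_eq!(missing, None);
}
```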
…t tests

The api_key resolution logic in from_custom_config was hard to test deterministically because it called Config::global() directly. Extract it into a pure static method resolve_api_key that takes a closure for secret lookup, making it testable with mocked secrets.

Add 6 unit tests covering all branches: empty env, missing key (required vs optional), present key, and other errors (required vs optional).

Co-Authored-By: Qwen3.6-35B-A3B
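The testability refactor can be sketched like this (signatures and branch details are assumptions, not the real resolve_api_key in openai.rs): taking the secret lookup as a closure lets tests inject mocked secrets instead of calling Config::global().

```rust
/// Illustrative stand-in for keyring error cases (not the real type).
#[derive(Debug, PartialEq)]
enum SecretError {
    NotFound,
    Backend(String),
}

/// Hypothetical sketch of a pure `resolve_api_key`: every branch depends
/// only on the arguments, so tests can pass a mocked `lookup` closure.
fn resolve_api_key(
    api_key_env: &str,
    requires_auth: bool,
    lookup: impl Fn(&str) -> Result<String, SecretError>,
) -> Result<Option<String>, SecretError> {
    if api_key_env.is_empty() {
        return Ok(None); // no key configured for this provider
    }
    match lookup(api_key_env) {
        Ok(key) => Ok(Some(key)),
        // A missing key is only an error when auth is required.
        Err(SecretError::NotFound) if !requires_auth => Ok(None),
        Err(e) if !requires_auth => {
            // Key is optional, so other lookup failures are warnings.
            eprintln!("warning: secret lookup failed: {e:?}");
            Ok(None)
        }
        Err(e) => Err(e),
    }
}

fn main() {
    // Mocked secrets make every branch deterministic.
    assert_eq!(resolve_api_key("", false, |_| unreachable!()), Ok(None));
    assert_eq!(
        resolve_api_key("K", false, |_| Err(SecretError::NotFound)),
        Ok(None)
    );
    assert_eq!(
        resolve_api_key("K", true, |_| Err(SecretError::NotFound)),
        Err(SecretError::NotFound)
    );
    assert_eq!(
        resolve_api_key("K", true, |_| Ok("v".to_string())),
        Ok(Some("v".to_string()))
    );
}
```

The same closure slot that tests fill with a mock is filled in production with the real keyring lookup, which is the whole point of the extraction.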
19bed4f to 7db99f6 (Compare)
Contributor · Author
@DOsinga: here's the next one from my stack.
Summary
Adds a nonempty api_key_env configuration to the lmstudio, llama_swap, and omlx declarative providers.
These providers all set requires_auth=false already (since auth is optional for local inference), but logic elsewhere in the code coupled api_key_env to requires_auth, causing two issues for providers where api_key_env is non-empty and requires_auth=false: the API key could not be configured through the UI, and it would not be sent to the endpoint. The PR fixes both of these issues by adjusting the logic to decouple api_key_env from requires_auth and handle this case correctly.
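The decoupling can be reduced to one condition. This is a minimal sketch with a hypothetical helper name (the real check lives in provider_registry.rs): the key field should be exposed whenever the provider declares a non-empty api_key_env, not only when requires_auth is true.

```rust
/// Hypothetical helper illustrating the decoupled gating logic:
/// show the API-key field whenever the provider declares a non-empty
/// `api_key_env`, regardless of `requires_auth`.
fn should_expose_api_key(requires_auth: bool, api_key_env: &str) -> bool {
    requires_auth || !api_key_env.is_empty()
}

fn main() {
    // Old behavior gated only on `requires_auth`, so this case was hidden:
    assert!(should_expose_api_key(false, "LMSTUDIO_API_KEY"));
    // Providers declaring no key env still expose nothing extra:
    assert!(!should_expose_api_key(false, ""));
    // Required auth keeps exposing the field as before:
    assert!(should_expose_api_key(true, ""));
}
```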
Changes:
lmstudio.json: Set api_key_env to LMSTUDIO_API_KEY
llama_swap.json: Set api_key_env to LLAMA_SWAP_API_KEY
omlx.json: Set api_key_env to OMLX_API_KEY
declarative_providers.rs: Update llama_swap api_key_env test assertion
provider_registry.rs: Expose the api_key_env config in UI regardless of requires_auth setting
openai.rs: Read the API key when configured, regardless of requires_auth. For the requires_auth=false case: handle NotFound as not an error (the user didn't provide the optional API key), and warn on keyring errors instead of failing (the key isn't needed, so this isn't fatal)
inventory/mod.rs: Include api_key_env in the identity hash so the model cache invalidates when the key changes, regardless of the value of requires_auth.
Testing
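The cache-invalidation behavior from the inventory/mod.rs change can be sanity-checked with std's hasher. Field names here are illustrative, not the real identity struct: once api_key_env participates in the hash, changing it yields a different identity, so cached models are rebuilt.

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

/// Illustrative provider identity; the real struct in inventory/mod.rs differs.
#[derive(Hash)]
struct ProviderIdentity<'a> {
    name: &'a str,
    base_url: &'a str,
    // Included regardless of `requires_auth`, so a key change
    // invalidates cached models even for optional-auth providers.
    api_key_env: &'a str,
}

fn identity_hash(id: &ProviderIdentity) -> u64 {
    let mut h = DefaultHasher::new();
    id.hash(&mut h);
    h.finish()
}

fn main() {
    let before = ProviderIdentity {
        name: "lmstudio",
        base_url: "http://localhost:1234",
        api_key_env: "",
    };
    let after = ProviderIdentity {
        name: "lmstudio",
        base_url: "http://localhost:1234",
        api_key_env: "LMSTUDIO_API_KEY",
    };
    // Different api_key_env -> different identity -> cache invalidated.
    assert_ne!(identity_hash(&before), identity_hash(&after));
}
```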
Related Issues
N/A
Screenshots/Demos (for UX changes)
Before:


After:

