feat: add MiniMax as first-class LLM provider #955
Open
octo-patch wants to merge 1 commit into crestalnetwork:main from
Conversation
Add native MiniMax LLM provider support via OpenAI-compatible API, enabling direct API access without OpenRouter proxy overhead.

- Add `MINIMAX` enum to `LLMProvider` with `is_configured`/`display_name`
- Add `MiniMaxLLM` class using `ChatOpenAI` with `api.minimax.io/v1`
- Add `MiniMax-M2.7` and `MiniMax-M2.7-highspeed` to model CSV
- Register MiniMax in `create_llm_model` factory
- Update `llm_picker` to prefer native MiniMax when API key is set
- Add `MINIMAX_API_KEY` to config and `.env.example`
- Add 30 unit and integration tests (all passing)
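The enum additions described above could look roughly like the following sketch. This is illustrative only: the names `MINIMAX_BASE_URL` and the `api_keys` parameter are assumptions for this example, and the real `LLMProvider` enum lives in `intentkit/models/llm.py` with its own signatures.

```python
from enum import Enum

# Assumed constant for this sketch; the PR wires ChatOpenAI to this URL.
MINIMAX_BASE_URL = "https://api.minimax.io/v1"


class LLMProvider(str, Enum):
    """Simplified stand-in for the PR's provider enum."""

    OPENAI = "openai"
    OPENROUTER = "openrouter"
    MINIMAX = "minimax"

    @property
    def display_name(self) -> str:
        names = {
            LLMProvider.OPENAI: "OpenAI",
            LLMProvider.OPENROUTER: "OpenRouter",
            LLMProvider.MINIMAX: "MiniMax",
        }
        return names[self]

    def is_configured(self, api_keys: dict) -> bool:
        # MiniMax counts as configured when MINIMAX_API_KEY is present.
        key_names = {
            LLMProvider.OPENAI: "OPENAI_API_KEY",
            LLMProvider.OPENROUTER: "OPENROUTER_API_KEY",
            LLMProvider.MINIMAX: "MINIMAX_API_KEY",
        }
        return bool(api_keys.get(key_names[self]))
```

Passing the key/config mapping in explicitly (rather than reading `os.environ` inside the enum) keeps the sketch testable; the real implementation may read from intentkit's config object instead.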
Summary
Add MiniMax as a native LLM provider, enabling direct API access via MiniMax's OpenAI-compatible endpoint (`api.minimax.io/v1`) without requiring OpenRouter as an intermediary.

Changes

- `intentkit/models/llm.py`: Add `MINIMAX` to `LLMProvider` enum with `is_configured`/`display_name` support; add `MiniMaxLLM` class (extends `LLMModel`, uses `ChatOpenAI` with the MiniMax base URL); register in `create_llm_model()` factory
- `intentkit/models/llm.csv`: Add `MiniMax-M2.7` (intelligence=5, 204K context) and `MiniMax-M2.7-highspeed` (speed=5, cost-efficient) native model entries
- `intentkit/models/llm_picker.py`: Update `pick_default_model()` and `pick_summarize_model()` to prefer native MiniMax when `MINIMAX_API_KEY` is configured, with graceful fallback to OpenRouter
- `intentkit/config/config.py`: Add `MINIMAX_API_KEY` environment variable
- `.env.example`: Add `MINIMAX_API_KEY=` entry

Models
Backward Compatibility
- Existing OpenRouter model routes (`minimax/minimax-m2.7`, `minimax/minimax-m2-her`) remain unchanged
- Native MiniMax is preferred only when `MINIMAX_API_KEY` is set

Test Plan
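The preference-with-fallback behavior in `llm_picker` can be sketched as follows. The model IDs and the `env` parameter are illustrative assumptions; the actual `pick_default_model()` in `intentkit/models/llm_picker.py` considers more providers and configuration state.

```python
import os

# Assumed model IDs for this sketch, matching the PR description:
# a native entry from llm.csv and its OpenRouter-routed counterpart.
NATIVE_MINIMAX_MODEL = "MiniMax-M2.7"
OPENROUTER_MINIMAX_MODEL = "minimax/minimax-m2.7"


def pick_default_model(env=None):
    """Prefer native MiniMax when its key is set; otherwise fall back
    to the OpenRouter route, preserving pre-PR behavior."""
    env = os.environ if env is None else env
    if env.get("MINIMAX_API_KEY"):
        return NATIVE_MINIMAX_MODEL
    return OPENROUTER_MINIMAX_MODEL
```

Because the fallback branch returns the same OpenRouter route used before this PR, deployments without `MINIMAX_API_KEY` see no behavior change.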
- `test_llm.py`: Model loading/filtering with MiniMax key, coexistence with OpenRouter
- `test_llm_picker.py`: Priority ordering with native MiniMax, fallback behavior
- `test_minimax_llm.py`: Provider enum, LLM class (base URL, API key, temperature, model override, params), factory routing, model info

Co-Authored-By: Octopus <liyuan851277048@icloud.com>
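In the spirit of `test_minimax_llm.py`, a base-URL wiring test might look like this sketch. The `MiniMaxLLM` stub below is a self-contained stand-in, not the PR's actual class (which wraps `ChatOpenAI`); it only illustrates the kind of assertions the test plan describes.

```python
# Assumed constant for this sketch; matches the endpoint named in the PR.
MINIMAX_BASE_URL = "https://api.minimax.io/v1"


class MiniMaxLLM:
    """Illustrative stub; the real class extends LLMModel in
    intentkit/models/llm.py and delegates to ChatOpenAI."""

    def __init__(self, model: str, api_key: str, temperature: float = 0.7):
        self.model = model
        self.api_key = api_key
        self.temperature = temperature
        self.base_url = MINIMAX_BASE_URL


def test_minimax_base_url():
    llm = MiniMaxLLM(model="MiniMax-M2.7", api_key="test-key")
    assert llm.base_url == "https://api.minimax.io/v1"
    assert llm.model == "MiniMax-M2.7"
    assert llm.temperature == 0.7  # default carried through
```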