feat(chat_v2): add MiniMax AI as LLM provider #101
Open
octo-patch wants to merge 1 commit into ruc-datalab:main from
Conversation
Add MiniMax AI (https://www.minimaxi.com) as a third model provider alongside Local (vLLM) and HeyWhale API. This allows users to leverage cloud-hosted MiniMax models (M2.7) without running a local model service.

Backend changes:
- Add _iter_minimax_stream() using OpenAI-compatible SDK (see the sketch below)
- Extend build_chat_runtime_config() with minimax provider routing
- Add MiniMax-specific temperature clamping (0.01-1.0)

Frontend changes:
- Add MiniMax AI option in Model Provider dropdown
- Add MiniMax API Key input field with persistence
- Wire model name and API key into chat request

Tests:
- 27 unit tests covering temperature clamping, config building, stream mocking, and provider routing
- 3 integration tests hitting the live MiniMax API
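For context, a minimal sketch of a streaming iterator over MiniMax's OpenAI-compatible endpoint is included below. The function name matches this PR; the base URL handling, default model name, and argument names are assumptions for illustration, not the actual implementation.

```python
# Hedged sketch of _iter_minimax_stream(); base URL, default model,
# and argument names are illustrative assumptions.
from typing import Iterator

from openai import OpenAI  # MiniMax exposes an OpenAI-compatible endpoint


def _iter_minimax_stream(
    messages: list[dict],
    api_key: str,
    api_base: str,                 # assumed: MiniMax OpenAI-compatible base URL
    model: str = "MiniMax-M2",     # hypothetical default model name
    temperature: float = 0.7,
) -> Iterator[str]:
    """Yield incremental text chunks from a MiniMax chat completion stream."""
    client = OpenAI(api_key=api_key, base_url=api_base)
    stream = client.chat.completions.create(
        model=model,
        messages=messages,
        temperature=temperature,
        stream=True,
    )
    for chunk in stream:
        if chunk.choices and chunk.choices[0].delta.content:
            yield chunk.choices[0].delta.content
```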
Collaborator
Is the API base
Author
Thanks for trying it out @LIUyizheSDU! The correct API base URL for MiniMax is:

Note: It's

Let me know if you run into any other issues!
Summary
Changes
Backend (demo/chat_v2/backend_app/services/chat.py):
- _iter_minimax_stream(): new streaming function using the OpenAI-compatible SDK with the MiniMax base URL
- _normalize_minimax_temperature(): clamps temperature to the MiniMax-accepted range [0.01, 1.0] (see the sketch after this changes list)
- build_chat_runtime_config() extended with minimax provider routing and auto-defaults for API base and model
- bot_stream() dispatch routes the minimax provider to the new stream iterator

Frontend (demo/chat_v2/frontend/components/three-panel-interface.tsx):
- MiniMax AI option in the Model Provider dropdown
- MiniMax API Key input field with persistence
- Model name and API key wired into the chat request

Documentation (README.md, README_ZH.md):
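A minimal sketch of the clamping helper and the provider routing listed above. The function names follow the PR summary; the config shape, placeholder base URL, and default model name are assumptions for illustration.

```python
# Hedged sketch of the backend helpers named in the summary; config keys,
# placeholder URL, and defaults are illustrative assumptions.
def _normalize_minimax_temperature(temperature: float) -> float:
    """Clamp the requested temperature into MiniMax's accepted range [0.01, 1.0]."""
    return min(max(temperature, 0.01), 1.0)


def build_chat_runtime_config(provider: str, api_key: str, model: str | None,
                              api_base: str | None, temperature: float) -> dict:
    """Assemble per-request settings; the minimax branch gets auto-defaults and clamping."""
    if provider == "minimax":
        return {
            "provider": "minimax",
            "api_key": api_key,
            "api_base": api_base or "https://<minimax-api-base>/v1",  # placeholder, not the real URL
            "model": model or "MiniMax-M2",                           # hypothetical default
            "temperature": _normalize_minimax_temperature(temperature),
        }
    # existing Local (vLLM) / HeyWhale branches elided
    ...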
README.md,README_ZH.md):Test plan