
[ci] change test whl into python 312 and use test images#4513

Open
zhulinJulia24 wants to merge 27 commits into InternLM:main from zhulinJulia24:async_result_auto

Conversation

@zhulinJulia24
Collaborator

Thanks for your contribution; we appreciate it a lot. The following instructions will make your pull request healthier and help it receive feedback more easily. If you do not understand some items, don't worry: just open the pull request and ask the maintainers for help.

Motivation

Please describe the motivation of this PR and the goal you want to achieve through this PR.

Modification

Please briefly describe what modification is made in this PR.

BC-breaking (Optional)

Does the modification introduce changes that break the backward compatibility of downstream repositories?
If so, please describe how it breaks compatibility and how downstream projects should modify their code to stay compatible with this PR.

Use cases (Optional)

If this PR introduces a new feature, it is better to list some use cases here and update the documentation accordingly.

Checklist

  1. Pre-commit or other linting tools are used to fix potential lint issues.
  2. The modification is covered by complete unit tests. If not, please add more unit tests to ensure correctness.
  3. If the modification depends on a newer version of downstream projects, this PR should be tested with all supported versions of those projects.
  4. The documentation has been modified accordingly, e.g. docstrings or example tutorials.

Copilot AI review requested due to automatic review settings April 9, 2026 08:26
Contributor

Copilot AI left a comment


Pull request overview

This PR updates the CI/e2e test setup to build and test Python 3.12 wheels, switches several workflows to configurable "test" Docker images, and adjusts parts of the autotest suite/configs to fit the new environment.

Changes:

  • Update GitHub Actions workflows to build artifacts for py312, download the matching artifact name, and add docker_tag (and result_tag for benchmark) inputs for container selection/reporting.
  • Adjust autotest configs/model lists and quantization model selection logic (AWQ/GPTQ/W8A8) to better align with supported models/environments.
  • Modify/remove certain RESTful tests and relax/adjust a few assertions and prompt cases.

Reviewed changes

Copilot reviewed 18 out of 18 changed files in this pull request and generated 6 comments.

Show a summary per file
File Description
autotest/utils/pipeline_chat.py Tweaks pipeline output assertions for intermediate elements.
autotest/utils/config_utils.py Refactors quantization model list selection across backends/types.
autotest/tools/restful/test_restful_chat_hf_turbomind_llm.py Removes reasoning/tools RESTful test coverage; keeps core chat/logprobs coverage.
autotest/tools/restful/test_restful_chat_hf_pytorch_llm.py Removes reasoning/tools RESTful test coverage; keeps core chat + distributed/speculative coverage hooks.
autotest/tools/common_case_config.py Updates models and gates speculative decoding RESTful cases by TEST_ENV.
autotest/prompt_case.yml Expands memory-test keyword expectations.
autotest/interface/restful/test_restful_generate.py Updates stop_token_ids test payload for additional token id.
autotest/interface/restful/test_restful_completions_v1.py Loosens batch prompt-order assertions to tolerate output variance.
autotest/interface/restful/test_restful_chat_completions_v1.py Slightly relaxes token-length bounds for ignore-eos streaming.
autotest/config_5080_legacy.yml Adjusts quantization allow/deny lists for legacy 5080 environment.
.github/workflows/mllm_api_eval.yml Switches wheel build/artifact to py312.
.github/workflows/evaluate.yml Switches wheel build/artifact to py312 and adds configurable container tag.
.github/workflows/docker_nightly.yml Updates inner-registry nightly tag to nightly-test-cu12.8.
.github/workflows/daily_ete_test.yml Switches wheel build/artifact to py312, adds configurable container tag, adjusts cleanup/installs.
.github/workflows/daily_ete_test_5080.yml Switches wheel build/artifact to py312, adds configurable container tag, adjusts installs.
.github/workflows/daily_ete_test_3090.yml Switches wheel build/artifact to py312, adds configurable container tag, updates container image reference.
.github/workflows/benchmark.yml Switches wheel build/artifact to py312, adds configurable container tag and Feishu upload path via result_tag.
.github/workflows/api_eval.yml Switches wheel build/artifact to py312.
Comments suppressed due to low confidence (2)

.github/workflows/api_eval.yml:65

  • The build matrix was switched to py312, but this workflow installs the produced wheel inside openmmlab/lmdeploy:latest-cu12.8 containers and still uses COV_PARAM pointing at a Python 3.10 site-packages path. Please confirm the container image is Python 3.12-compatible (otherwise cp312 wheel install will fail) and update COV_PARAM (or compute it dynamically) to match the runtime Python version.
  linux-build:
    if: ${{github.event_name == 'schedule' || (!cancelled() && !inputs.offline_mode)}}
    strategy:
      matrix:
        pyver: [py312]
    runs-on: ubuntu-latest
    env:
      PYTHON_VERSION: ${{ matrix.pyver }}
      PLAT_NAME: manylinux2014_x86_64
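The reviewer's suggestion to compute COV_PARAM dynamically rather than hardcoding a `python3.10` site-packages path could be sketched as follows. This is an illustrative snippet, not code from the workflow; it assumes COV_PARAM embeds the interpreter's site-packages directory:

```python
import sysconfig

# Derive the site-packages path from the running interpreter so a
# coverage parameter never hardcodes "python3.10" when the container
# actually ships Python 3.12. The "purelib" key is the standard
# location for pure-Python packages.
site_packages = sysconfig.get_paths()["purelib"]
print(site_packages)
```

In a workflow step this could be captured into an environment variable (e.g. via `python3 -c "import sysconfig; print(sysconfig.get_paths()['purelib'])"`) so the path tracks whatever Python the container provides.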

.github/workflows/mllm_api_eval.yml:66

  • The build matrix was switched to py312, but this workflow installs the produced wheel inside openmmlab/lmdeploy:latest-cu12.8 containers and still uses COV_PARAM pointing at a Python 3.10 site-packages path. Please confirm the container image is Python 3.12-compatible (otherwise cp312 wheel install will fail) and update COV_PARAM (or compute it dynamically) to match the runtime Python version.
jobs:
  linux-build:
    if: ${{ !cancelled() }}
    strategy:
      matrix:
        pyver: [py312]
    runs-on: ubuntu-latest
    env:
      PYTHON_VERSION: ${{ matrix.pyver }}
      PLAT_NAME: manylinux2014_x86_64
      DOCKER_TAG: cuda12.8


    result &= len(output.token_ids) > 0
else:
-    result &= len(output.text) > 0
+    result &= len(output.text) >= 0

Copilot AI Apr 9, 2026


len(output.text) >= 0 is a no-op (always true for any non-None string) and significantly weakens this assertion compared to the previous > 0 check. If the intent is to allow empty chunks (e.g., for streaming), consider checking output.text is not None (or isinstance(output.text, str)) and gating the non-empty requirement based on is_stream / is_last rather than removing the validation entirely.

Suggested change:

-    result &= len(output.text) >= 0
+    result &= isinstance(output.text, str)
+    if not is_stream:
+        result &= len(output.text) > 0
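To see why the flagged assertion is a no-op and how the suggested gating restores a meaningful check, here is a standalone sketch (the `check` helper is a hypothetical illustration, not repository code):

```python
# len(...) >= 0 can never fail for a string, so it validates nothing.
# Gating the non-empty requirement on a stream flag, as the suggestion
# does, makes the assertion meaningful again.
def check(text, is_stream):
    ok = isinstance(text, str)
    if not is_stream:
        ok = ok and len(text) > 0
    return ok

assert len("") >= 0                    # the removed check: vacuously true
assert check("", is_stream=True)       # empty chunk allowed mid-stream
assert not check("", is_stream=False)  # rejected for a final output
```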

Comment on lines 941 to +949
 def test_stop_token_ids(self):
     print(f'\n[Model: {self.model_name}] Running stop_token_ids test')
-    payload = {'prompt': 'Once upon a time', 'max_tokens': 500, 'stop_token_ids': [11, 281], 'stream': False}
+    # 27974 for internlm3,
+    payload = {
+        'prompt': 'Once upon a time',
+        'max_tokens': 500,
+        'stop_token_ids': [11, 281, 128131],
+        'stream': False
+    }

Copilot AI Apr 9, 2026


The stop_token_ids payload was updated, but the inline comment and the log message below still reference the old list ([11, 281]) and a different InternLM3 token id (27974). This makes failures hard to diagnose. Please update the comment and the printed/error strings to reflect the actual stop_token_ids being sent (including the new id).
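One way to keep the log text in sync, as the reviewer suggests, is to derive the message from the payload itself instead of hardcoding it. A minimal sketch (the payload values mirror the diff above; the `msg` string is hypothetical):

```python
# Build diagnostics from the payload so the log/error text can never
# drift out of sync with the stop_token_ids actually sent.
payload = {
    'prompt': 'Once upon a time',
    'max_tokens': 500,
    'stop_token_ids': [11, 281, 128131],
    'stream': False,
}
msg = f"stop_token_ids test sent {payload['stop_token_ids']}"
print(msg)
```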
