feat: add GLM model detection for LM Studio and OpenAI-compatible providers #11077
+566
−6
Related GitHub Issue
Closes: #11071
Description
This PR addresses the questions raised in issue #11071 about GLM model detection and sharing Z.ai optimizations with LM Studio and OpenAI-compatible endpoints.
Question 1: How do you detect whether LM Studio or the OpenAI-compatible endpoint is serving a GLM model?
This PR introduces an `isGlmModel()` utility function that detects GLM models by checking whether the model ID contains common GLM patterns:
- `glm-` prefix (official naming: glm-4, glm-4.5, glm-4.7)
- `glm4` (compact naming without a dash)
- `chatglm` (older ChatGLM models)

Question 2: Is there a way so that when improvements and fixes are made to how Roo Code communicates with GLM models on Z.ai, they will also become available to LM Studio and OpenAI-compatible endpoints running GLM models?
Yes! This PR enables automatic application of Z.ai optimizations for GLM models detected on LM Studio and OpenAI-compatible endpoints:
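As a rough sketch of the detection utility described here (the names `isGlmModel`, `getGlmModelOptions`, and `GlmModelOptions` come from the PR; the exact implementation may differ):

```typescript
// Hypothetical sketch, not the PR's verbatim code.
export interface GlmModelOptions {
	// Merge tool-result text blocks into a single string, mirroring
	// Z.ai's convertToZAiFormat() behavior.
	mergeToolResultText: boolean
	// Whether the provider should allow parallel tool calls.
	parallelToolCalls: boolean
}

export function isGlmModel(modelId: string): boolean {
	const id = modelId.toLowerCase()
	// Matches "glm-" (glm-4, glm-4.5), "glm4" (compact), and "chatglm".
	return id.includes("glm-") || id.includes("glm4") || id.includes("chatglm")
}

export function getGlmModelOptions(modelId: string): GlmModelOptions | undefined {
	if (!isGlmModel(modelId)) return undefined
	return { mergeToolResultText: true, parallelToolCalls: false }
}
```

Matching on lowercased substrings keeps the check tolerant of vendor-specific IDs such as `lmstudio-community/glm-4-9b` or `THUDM/chatglm3-6b`.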
Key Implementation Details
Added `src/api/providers/utils/model-detection.ts` with:
- `isGlmModel(modelId)`: Detects GLM models from the model ID
- `getGlmModelOptions(modelId)`: Returns GLM-specific configuration options

Updated `LmStudioHandler` to detect GLM models and apply optimizations.
Updated `BaseOpenAiCompatibleProvider` to detect GLM models and apply optimizations.

Test Procedure
Unit tests for model detection utility:
`cd src && npx vitest run api/providers/utils/__tests__/model-detection.spec.ts`

Manual testing (for users with GLM models):
Pre-Submission Checklist
Documentation Updates
Additional Notes
This implementation follows the same pattern used by Z.ai for GLM model handling. The key insight is that Z.ai uses `convertToZAiFormat()` with `mergeToolResultText: true`, which is functionally equivalent to using `convertToOpenAiMessages()` with the same option. This PR enables the option automatically when a GLM model is detected.

Future Z.ai improvements for GLM models can be extended to third-party providers by:
- Updating the detection patterns in `isGlmModel()` if needed
- Extending the `GlmModelOptions` interface

Feedback and guidance are welcome!
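To illustrate how a provider could fold the GLM options into an OpenAI-compatible request, here is a hypothetical sketch (the `ChatParams` shape and `applyGlmOptions` helper are illustrative, not the PR's actual code):

```typescript
// Hypothetical sketch of applying GLM options in a provider.
type ChatParams = {
	model: string
	messages: unknown[]
	parallel_tool_calls?: boolean
}

function isGlmModel(modelId: string): boolean {
	const id = modelId.toLowerCase()
	return id.includes("glm-") || id.includes("glm4") || id.includes("chatglm")
}

function applyGlmOptions(modelId: string, params: ChatParams): ChatParams {
	if (!isGlmModel(modelId)) return params
	// Mirror the Z.ai path: disable parallel tool calls for GLM models.
	// (mergeToolResultText would instead be passed to the message converter.)
	return { ...params, parallel_tool_calls: false }
}
```

Centralizing the option logic this way means a future Z.ai fix only needs to change one function for LM Studio and OpenAI-compatible endpoints to pick it up.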
Important
Adds GLM model detection and Z.ai optimizations for LM Studio and OpenAI-compatible providers, with tests for model detection.
- Adds `isGlmModel()` and `detectGlmModel()` in `glm-model-detection.ts` to identify GLM models by model ID patterns.
- Integrates detection into `BaseOpenAiCompatibleProvider` and `LmStudioHandler`.
- Disables `parallel_tool_calls` and enables `mergeToolResultText` for GLM models.
- Updates `BaseOpenAiCompatibleProvider` to detect GLM models and apply optimizations during construction and stream creation.
- Updates `LmStudioHandler` to detect GLM models and apply optimizations during message creation.
- Adds `logGlmDetection()` for logging detection results.
- Adds `glm-model-detection.spec.ts` to verify GLM model detection and configuration.