fix(ollama): add logging and diagnostics for debugging connection issues #11055
Related GitHub Issue
Closes: #11049
Description
This PR attempts to address Issue #11049 where local Ollama models appear unresponsive. The key changes add logging and diagnostics to help users and developers understand what is happening when Ollama requests fail or models are not appearing.
Key Implementation Details:
- **Enhanced `parseOllamaModel` return type:** Now returns both `modelInfo` and `filteredReason` to provide context when models are filtered out due to missing tool support.
- **Logging for filtered models:** When models are filtered out due to a missing `tools` capability, a warning is logged explaining which models were filtered and why. This helps users understand why their models might not be appearing in the dropdown.
- **Request logging in `NativeOllamaHandler`:** Added logging when Ollama requests are initiated, showing the model ID and base URL being used.
- **Warning for models not in capability list:** When a selected model is not found in the list of tool-capable models, a warning is logged suggesting the user check whether their Ollama version reports capabilities.
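The changed return shape can be sketched roughly as below. This is a minimal illustration, not the actual Roo Code implementation: the field names beyond `modelInfo` and `filteredReason`, the `ModelInfo` shape, and the wording of the reason string are all assumptions.

```typescript
// Hypothetical sketch of parseOllamaModel returning context for filtered models.
// Only `modelInfo` and `filteredReason` come from the PR; everything else is illustrative.
interface ModelInfo {
	contextWindow: number
	supportsTools: boolean
}

interface ParsedOllamaModel {
	modelInfo: ModelInfo | null
	filteredReason?: string
}

// Minimal stand-in for the fields of an Ollama model listing used here.
interface OllamaModelData {
	capabilities?: string[]
	context_length?: number
}

function parseOllamaModel(name: string, data: OllamaModelData): ParsedOllamaModel {
	const supportsTools = data.capabilities?.includes("tools") ?? false
	if (!supportsTools) {
		// Instead of silently dropping the model, report why it was filtered.
		return {
			modelInfo: null,
			filteredReason: `Model "${name}" filtered out: no "tools" capability reported`,
		}
	}
	return {
		modelInfo: { contextWindow: data.context_length ?? 4096, supportsTools },
	}
}

const ok = parseOllamaModel("llama3.1", { capabilities: ["tools"], context_length: 8192 })
const filtered = parseOllamaModel("old-model", { capabilities: [] })
console.log(ok.modelInfo?.contextWindow) // 8192
console.log(filtered.filteredReason)
```

A caller can then log `filteredReason` for every filtered model in one place, which is the behavior the warning described above relies on.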
The issue symptoms (the model appears but produces no response and no server activity) are consistent with more than one underlying cause, which is exactly the situation the new diagnostics are meant to disambiguate.
These logging improvements will help diagnose similar issues in the future.
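The request-logging pattern described above can be sketched as follows. This is an illustrative stand-in, not the actual `NativeOllamaHandler` code: the function name, `Logger` type, and log message format are assumptions.

```typescript
// Hypothetical sketch: log an Ollama request up front (so a hang is still
// visible) and surface connection failures instead of failing silently.
type Logger = (msg: string) => void

function withRequestDiagnostics<T>(
	baseUrl: string,
	modelId: string,
	doRequest: () => T,
	log: Logger = console.log,
): T {
	// Log before the request starts; if nothing ever comes back, this line
	// is the evidence that a request was attempted at all.
	log(`Ollama request: model=${modelId} baseUrl=${baseUrl}`)
	try {
		return doRequest()
	} catch (err) {
		log(`Ollama request failed for ${modelId} at ${baseUrl}: ${String(err)}`)
		throw err
	}
}

// Demo: collect logs from one success and one simulated connection error.
const logs: string[] = []
withRequestDiagnostics("http://localhost:11434", "llama3.1", () => "ok", (m) => logs.push(m))
try {
	withRequestDiagnostics(
		"http://localhost:11434",
		"llama3.1",
		() => {
			throw new Error("ECONNREFUSED")
		},
		(m) => logs.push(m),
	)
} catch {
	// expected: the error is rethrown after being logged
}
console.log(logs.length) // 3
```

The key design point is that the failure path rethrows after logging, so existing error handling is unchanged while the diagnostics become visible.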
Test Procedure
- `cd src && npx vitest run api/providers/fetchers/__tests__/ollama.test.ts` (all 13 tests pass)
- `cd src && npx vitest run api/providers/__tests__/native-ollama.spec.ts` (all 15 tests pass)

To manually verify:
Pre-Submission Checklist
Documentation Updates
Additional Notes
This PR adds diagnostic logging only. It does not change the core behavior of Ollama model filtering or request handling. The logging will appear in the VS Code Output panel under "Roo Code" and in the developer console, helping users understand why models may not be appearing or responding.
Feedback and guidance are welcome!
Important
Enhance logging and diagnostics for Ollama model connection issues by updating `parseOllamaModel` and adding detailed logs in `NativeOllamaHandler`:
- `parseOllamaModel` now returns `modelInfo` and `filteredReason` to provide context for filtered models.
- Logging added in `ollama.ts` and `native-ollama.ts`.
- Updated `ollama.test.ts` to check for `filteredReason` and ensure correct logging behavior.
- Logging in `getOllamaModels` for filtered models and connection issues.