
Conversation

roomote bot (Contributor) commented Jan 28, 2026

Related GitHub Issue

Closes: #11049

Description

This PR addresses Issue #11049, in which local Ollama models appear unresponsive. It adds logging and diagnostics to help users and developers understand what is happening when Ollama requests fail or models do not appear.

Key Implementation Details:

  1. Enhanced parseOllamaModel return type: Now returns both modelInfo and filteredReason to provide context when models are filtered out due to missing tool support.

  2. Logging for filtered models: When models are filtered out due to missing tools capability, a warning is logged explaining which models were filtered and why. This helps users understand why their models might not be appearing in the dropdown.

  3. Request logging in NativeOllamaHandler: Added logging when Ollama requests are initiated, showing the model ID and base URL being used.

  4. Warning for models not in capability list: When a selected model is not found in the list of tool-capable models, a warning is logged suggesting the user check if their Ollama version reports capabilities.

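The enhanced return type described in item 1 can be sketched as follows. This is an illustrative sketch only: the interface names, fields, and the capability check are assumptions based on this PR's description, not the actual Roo Code source.

```typescript
// Sketch only: names and shapes are assumptions, not the real implementation.
interface ModelInfo {
	contextWindow: number
	supportsTools: boolean
}

interface ParsedOllamaModel {
	modelInfo: ModelInfo | null
	// Set when the model is excluded from the dropdown, e.g. missing "tools".
	filteredReason?: string
}

function parseOllamaModel(raw: { name: string; capabilities?: string[] }): ParsedOllamaModel {
	const supportsTools = raw.capabilities?.includes("tools") ?? false
	if (!supportsTools) {
		return {
			modelInfo: null,
			filteredReason: `Model "${raw.name}" does not report the "tools" capability`,
		}
	}
	// Placeholder modelInfo; the real parser derives this from the Ollama response.
	return { modelInfo: { contextWindow: 4096, supportsTools: true } }
}
```

Returning a `filteredReason` alongside `modelInfo`, rather than silently dropping the model, is what lets the caller log why a model is absent from the dropdown.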
The issue symptoms (model appears but no response and no server activity) suggest one of the following:

  • The model lacks tool support and was filtered but somehow still selected
  • There is a connectivity issue being swallowed silently
  • The Ollama version does not report capabilities

These logging improvements will help diagnose similar issues in the future.

Test Procedure

  1. Ran existing Ollama unit tests: cd src && npx vitest run api/providers/fetchers/__tests__/ollama.test.ts - all 13 tests pass
  2. Ran NativeOllamaHandler tests: cd src && npx vitest run api/providers/__tests__/native-ollama.spec.ts - all 15 tests pass
  3. Type checking passes across all packages
  4. Linting passes

To manually verify:

  1. Configure Ollama with a model that does not have tool support
  2. Check the VS Code Output panel for Roo Code logs; you should see warnings about filtered models
  3. Try to use a model and observe the request logging in the output

Pre-Submission Checklist

  • Issue Linked: This PR is linked to an approved GitHub Issue (see "Related GitHub Issue" above).
  • Scope: My changes are focused on the linked issue (one major feature/fix per PR).
  • Self-Review: I have performed a thorough self-review of my code.
  • Testing: New and/or updated tests have been added to cover my changes (if applicable).
  • Documentation Impact: I have considered if my changes require documentation updates (see "Documentation Updates" section below).
  • Contribution Guidelines: I have read and agree to the Contributor Guidelines.

Documentation Updates

  • No documentation updates are required.

Additional Notes

This PR adds diagnostic logging only. It does not change the core behavior of Ollama model filtering or request handling. The logging will appear in the VS Code Output panel under "Roo Code" and in the developer console, helping users understand why models may not be appearing or responding.

Feedback and guidance are welcome!


Important

Enhance logging and diagnostics for Ollama model connection issues by updating parseOllamaModel and adding detailed logs in NativeOllamaHandler.

  • Behavior:
    • parseOllamaModel now returns modelInfo and filteredReason to provide context for filtered models.
    • Logs warnings for models filtered due to missing 'tools' capability in ollama.ts.
    • Logs request initiation details and warnings for models not in the capability list in native-ollama.ts.
  • Tests:
    • Updated tests in ollama.test.ts to check for filteredReason and ensure correct logging behavior.
  • Misc:
    • Added console warnings in getOllamaModels for filtered models and connection issues.

This description was created by Ellipsis for 9869b06.

…ion issues

- Add logging when models are filtered out due to missing tool support
- Add warning when selected model is not in the tool-capable models list
- Update parseOllamaModel to return filteredReason for better diagnostics
- Log request info (model ID, base URL) when starting Ollama requests

This helps diagnose issues like #11049 where Ollama models appear
unresponsive. The logging output will show:
- Which models are being filtered and why
- When a model may not support native tool calling
- Request details for debugging connectivity issues
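The request-logging and capability-warning steps listed above can be sketched as one helper. This is a hedged sketch: the function, its parameters, and the message wording are assumptions for illustration, not the actual NativeOllamaHandler code.

```typescript
// Sketch only: hypothetical request logging, not the real handler.
function logOllamaRequest(modelId: string, baseUrl: string, toolCapableModels: Set<string>): string {
	// Log the request details up front so connectivity issues are visible.
	console.log(`Starting Ollama request: model=${modelId} baseUrl=${baseUrl}`)
	if (!toolCapableModels.has(modelId)) {
		console.warn(
			`Model "${modelId}" is not in the tool-capable model list; ` +
				`check whether your Ollama version reports model capabilities.`,
		)
		return "warned"
	}
	return "ok"
}
```

Emitting the base URL alongside the model ID matters for the "connectivity issue swallowed silently" case: a wrong or unreachable URL shows up immediately in the log.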
roomote bot (Contributor, Author) commented Jan 28, 2026


Review complete. No issues found.

The changes add diagnostic logging to help debug Ollama connection issues as described in #11049. The implementation is well-scoped, properly tested, and consistent with existing codebase patterns.




Projects

Status: Triage

Development

Successfully merging this pull request may close these issues.

Local Ollama Models Not Responsive
