
fix: compute Ollama Origin header from baseUrl to fix Forbidden error (#2843)

Merged: ComputelessComputer merged 3 commits into main from devin/1767677038-fix-ollama-forbidden on Jan 7, 2026
Conversation

@devin-ai-integration (Contributor)

Summary

Changes the Ollama Origin header from a hardcoded http://localhost to a value computed dynamically from the configured baseUrl. The previous fix (#2178) hardcoded the Origin, which does not work in all Ollama configurations (e.g., when "Expose Ollama to network" is enabled, or when the base URL uses 127.0.0.1 instead of localhost).

The fix strips the /v1 suffix from the baseUrl and extracts the origin, so http://127.0.0.1:11434/v1 becomes http://127.0.0.1:11434.
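A minimal sketch of that computation on the TypeScript side (the function name and exact stripping logic are assumptions for illustration, not the PR's actual code):

```typescript
// Hypothetical helper illustrating the Origin computation this PR describes.
// baseUrl is the user-configured Ollama endpoint, e.g. "http://127.0.0.1:11434/v1".
function computeOllamaOrigin(baseUrl: string): string {
  // Strip a trailing "/v1" (with or without a trailing slash), as described above.
  const trimmed = baseUrl.replace(/\/v1\/?$/, "");
  // URL#origin yields scheme://host:port and drops any remaining path,
  // so "http://127.0.0.1:11434/v1" maps to "http://127.0.0.1:11434".
  return new URL(trimmed).origin;
}
```

Note that `new URL()` throws a TypeError on a malformed baseUrl, which is exactly the edge case called out in the review checklist below; a real implementation would want to surface that as a clear configuration error rather than an unhandled exception.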

Fixes #2123

Review & Testing Checklist for Human

  • Test with Ollama using http://127.0.0.1:11434/v1 as base URL - verify health check passes
  • Test with Ollama using http://localhost:11434/v1 as base URL - verify this still works (regression check)
  • Test with "Expose Ollama to network" enabled in Ollama settings
  • Verify edge cases: what happens if baseUrl is malformed or missing /v1 suffix?

Recommended test plan:

  1. Install Ollama locally
  2. Select "Ollama" provider in Hyprnote Intelligence settings
  3. Toggle "Expose Ollama to network" in Ollama settings
  4. Verify the connection health check shows success (green checkmark) instead of "Connection failed: Forbidden"

Notes


netlify Bot commented Jan 6, 2026

Deploy Preview for hyprnote canceled.

🔨 Latest commit: 0b6bb82
🔍 Latest deploy log: https://app.netlify.com/projects/hyprnote/deploys/695dc171abcf150008f69590

@devin-ai-integration (Contributor, Author)

🤖 Devin AI Engineer

I'll be helping with this pull request! Here's what you should know:

✅ I will automatically:

  • Address comments on this PR that start with 'DevinAI' or '@devin'.
  • Look at CI failures and help fix them

Note: I can only respond to comments from users who have write access to this repository.

⚙️ Control Options:

  • Disable automatic comment and CI monitoring


netlify Bot commented Jan 6, 2026

Deploy Preview for howto-fix-macos-audio-selection canceled.

🔨 Latest commit: 0b6bb82
🔍 Latest deploy log: https://app.netlify.com/projects/howto-fix-macos-audio-selection/deploys/695dc1714fad260008d074b0


netlify Bot commented Jan 6, 2026

Deploy Preview for hyprnote-storybook canceled.

🔨 Latest commit: 0b6bb82
🔍 Latest deploy log: https://app.netlify.com/projects/hyprnote-storybook/deploys/695dc171abcf150008f69595

ComputelessComputer force-pushed the devin/1767677038-fix-ollama-forbidden branch from 3622f36 to 5503a0b on January 6, 2026 15:06
devin-ai-integration (Bot) and others added 3 commits on January 7, 2026 11:14
The previous fix hardcoded Origin to 'http://localhost', but this doesn't
work in all Ollama configurations (e.g., when 'Expose Ollama to network'
is enabled or when using 127.0.0.1 instead of localhost).

This change computes the Origin header from the configured baseUrl,
stripping the /v1 suffix to get the correct origin. This ensures the
Origin header matches what Ollama expects regardless of the host
configuration.

Fixes #2123

Co-Authored-By: john@hyprnote.com <john@hyprnote.com>
Add the "unsafe-headers" feature to tauri-plugin-http in the desktop Tauri Cargo.toml to allow sending the Origin header. This change addresses warnings about the Origin header being skipped by fetch due to it being a forbidden header, and follows the suggestion to enable the feature when keeping the header is desired.
Allow the tauri HTTP plugin to send forbidden origin headers by enabling
its unsafe-headers feature where appropriate. The workspace-level
Cargo.toml enables the feature to avoid the fetch-spec warning while the
app-specific Cargo.toml keeps the workspace dependency (no extra feature
there) to prevent duplicate feature specification and maintain workspace
consistency.
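The split the commit message describes might look roughly like this (the version number and exact manifest layout are assumptions, not the repository's actual files):

```toml
# Workspace-level Cargo.toml: declare the feature once for the whole workspace.
[workspace.dependencies]
tauri-plugin-http = { version = "2", features = ["unsafe-headers"] }
```

```toml
# App-level (desktop Tauri) Cargo.toml: inherit the workspace dependency
# without re-specifying features, avoiding duplicate feature declarations.
[dependencies]
tauri-plugin-http = { workspace = true }
```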
ComputelessComputer force-pushed the devin/1767677038-fix-ollama-forbidden branch from 5503a0b to 0b6bb82 on January 7, 2026 02:14
github-project-automation (Bot) moved this to Backlog in Char on Jan 7, 2026
ComputelessComputer moved this from Backlog to In review in Char on Jan 7, 2026
ComputelessComputer merged commit f86b7b9 into main on Jan 7, 2026
26 checks passed
ComputelessComputer deleted the devin/1767677038-fix-ollama-forbidden branch on January 7, 2026 02:24
github-project-automation (Bot) moved this from In review to Done in Char on Jan 7, 2026

Projects

No open projects
Status: Done

Development

Successfully merging this pull request may close these issues.

Ollama is forbidden

1 participant