
Ollama setup trouble. #5160

@qwerty108109


Question

This time around, while setting up opencode with Ollama for offline use, I am having a lot of trouble.
This is what my configuration file looks like:

{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "ollama": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Ollama (PC1)",
      "options": {
        "baseURL": "http://localhost:11434/v1"
      },
      "models": {
        "ministral-3:3b": {
          "name": "ministral-3:3b"
        }
      }  
    }  
  }  
}  
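As a quick sanity check on the pasted configuration (a minimal sketch, not opencode's own validation logic), the JSON can be parsed to confirm the provider key, base URL, and model IDs are where the schema expects them:

```python
import json

# The config from the issue, pasted verbatim.
config = json.loads("""
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "ollama": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Ollama (PC1)",
      "options": {
        "baseURL": "http://localhost:11434/v1"
      },
      "models": {
        "ministral-3:3b": {
          "name": "ministral-3:3b"
        }
      }
    }
  }
}
""")

ollama = config["provider"]["ollama"]
print(ollama["options"]["baseURL"])   # endpoint opencode should call
print(list(ollama["models"].keys()))  # model IDs expected to show up in /models
```

The JSON itself parses cleanly, so if the models still do not appear, the problem is more likely the config file's location or the Ollama server not being reachable at that baseURL.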

Whenever I run the /models command, it does not list my Ollama models.

I am currently using Debian 13 with opencode version 1.0.134 and Ollama version 0.13.1.
