
Add ollama support#135

Open
rajo69 wants to merge 2 commits into VectifyAI:main from rajo69:add-ollama-support

Conversation


rajo69 commented Mar 3, 2026

Fixes #115

What this does

  • Adds --base-url CLI flag for custom OpenAI-compatible API endpoints
  • Threads base_url through all LLM call functions in page_index.py, page_index_md.py, and utils.py
  • Adds base_url: null to config.yaml defaults (no behaviour change for existing users)
  • Fixes tiktoken to fall back to cl100k_base for non-OpenAI model names (previously crashed with KeyError)

Usage with Ollama

pageindex --pdf_path doc.pdf --model llama3.1 --base-url http://localhost:11434/v1

Works with any OpenAI-compatible provider

AWS Bedrock, Azure OpenAI, LM Studio, Mistral, etc. — anything that exposes an OpenAI-compatible endpoint.

rajo69 added 2 commits March 3, 2026 00:40
- Add pyproject.toml with package metadata, dependencies, and CLI entry point
- Extract CLI logic into pageindex/cli.py with a proper main() function
- Simplify run_pageindex.py to delegate to pageindex.cli:main
- Add build artifacts to .gitignore

Closes VectifyAI#103
- Add --base-url CLI flag for custom API endpoints
- Thread base_url through all LLM call functions in page_index.py,
  page_index_md.py, and utils.py
- Add base_url to config.yaml defaults
- Fix tiktoken fallback to cl100k_base for non-OpenAI model names

Closes VectifyAI#115

Usage with Ollama:
  pageindex --pdf_path doc.pdf --model llama3.1 --base-url http://localhost:11434/v1
Author

rajo69 commented Mar 3, 2026

Just flagging that PR #135 takes a very similar approach; both were independently developed and are functionally equivalent.

Happy to close this in favour of #135 if that's easier to review, or to address any feedback here if you'd prefer this
one. Either way, glad to see Ollama support landing in the project!



Development

Successfully merging this pull request may close these issues.

[Feature] ollama
