
Added Ollama support for inference with local LLM models #110

Open
wants to merge 1 commit into main

Conversation

devparanjay

Summary

Added support for Ollama so that local LLM models can be used.
FOSS LLMs are a necessity for the future of open-source artificial intelligence. This is an attempt to add that functionality to Sirji (although the work was done on Windows, while Sirji currently only supports Mac).

This is a reimplementation of an older PR #30 to account for Sirji's switch to being a VSCode extension.

What are the specific steps to test this change?

  1. Set the environment variables in Sirji's settings to select "ollama" as the model provider, and set the name of the model you want Sirji to use.
  2. Run Sirji as usual; it should then work with Ollama. An Ollama server must be running in the background (started per the official Ollama instructions) with the relevant model already downloaded.
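For reference, a minimal sketch of how a client could talk to a locally running Ollama server. The endpoint and payload shape follow Ollama's documented REST API (`/api/chat` on the default port 11434); the helper function names are illustrative, not Sirji's actual code:

```python
import json
import urllib.request

# Default local Ollama endpoint (assumes `ollama serve` is running).
OLLAMA_URL = "http://localhost:11434/api/chat"

def build_chat_payload(model: str, prompt: str) -> dict:
    # Request body per Ollama's /api/chat endpoint;
    # stream=False asks for a single JSON response instead of a stream.
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }

def chat(model: str, prompt: str) -> str:
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_chat_payload(model, prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["message"]["content"]
```

Calling `chat("llama3", "Hello")` would then return the model's reply, provided the model has been pulled with `ollama pull llama3` beforehand.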

What kind of change does this PR introduce?

  • Bug fix
  • New feature
  • Refactor
  • Documentation
  • Build-related changes
  • Other

Make sure the PR fulfills these requirements:

  • It includes a) the existing issue ID being resolved, b) a convincing reason for adding this feature, or c) a clear description of the bug it resolves
  • The changelog is updated
  • Related documentation has been updated
  • Related tests have been updated

Other information:
I wasn't able to test this since I use a Windows machine.
I use Ruff for formatting in my IDE, which explains the formatting changes in the code.

@kedarchandrayan
Member

Thanks, @devparanjay, for your interest in Sirji.

I see that you haven't tested the code because you have a Windows machine. We will soon remove the OS restriction.

I reviewed the changes and have the following comments:

  1. The changes in the agents/sirji_agents/researcher folder will most likely cause issues, as that code relies on the OpenAI Assistants API. Please revert your changes in that folder.
  2. Sirji is a monorepo with an agents folder containing the code for the PyPI package - sirji-agents. Your changes to the following files look good:
    • agents/sirji_agents/llm/model_providers/factory.py
    • agents/sirji_agents/llm/model_providers/ollama.py
  3. We need to publish the PyPI package before it can be used in the VS Code extension. Therefore, the changes in sirji/vscode-extension cannot be merged at the same time.

I would suggest limiting the scope of this PR to introducing a new model provider, i.e., the changes in the two files mentioned above.
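To illustrate the suggested scope, a provider factory along these lines is what the two files imply. This is a hypothetical sketch, not Sirji's actual implementation: the class and function names are placeholders standing in for factory.py and ollama.py:

```python
class OpenAIProvider:
    """Placeholder for the existing OpenAI-backed provider."""
    def __init__(self, model_name: str):
        self.model_name = model_name

class OllamaProvider:
    """Placeholder for the new provider added in model_providers/ollama.py."""
    def __init__(self, model_name: str):
        self.model_name = model_name

def get_model_provider(provider: str, model_name: str):
    # Hypothetical dispatch mirroring model_providers/factory.py:
    # the configured provider string selects the concrete class.
    providers = {
        "openai": OpenAIProvider,
        "ollama": OllamaProvider,
    }
    try:
        return providers[provider](model_name)
    except KeyError:
        raise ValueError(f"Unknown model provider: {provider}")
```

Keeping the PR to a change of this shape lets the sirji-agents package be published first, with the VS Code extension wiring following in a later PR.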

Also, once we remove the OS restriction, you can come back and make the VS Code extension changes, as those will need thorough testing.

Thanks! Happy coding in Sirji!
