
feat: implement auto_unload for ollama backend. (#38)
* feat: implement auto_unload for ollama backend.

* docs(ollama): change the documentation to the default behaviour.
Davidyz authored Jan 6, 2025
1 parent fd464ec commit ff2d683
Showing 2 changed files with 10 additions and 1 deletion.
2 changes: 2 additions & 0 deletions README.md
@@ -181,6 +181,8 @@ cmp_ai:setup({
  provider = 'Ollama',
  provider_options = {
    model = 'codellama:7b-code',
    auto_unload = false, -- Set to true to automatically unload the model when
                         -- exiting nvim.
  },
  notify = true,
  notify_callback = function(msg)
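For context (not part of the diff): a minimal sketch of the new option in use, following the setup block the README already shows. The `cmp_ai` module name and the model are taken from that block; with `auto_unload = true`, the plugin asks Ollama to unload the model when Neovim exits.

  cmp_ai:setup({
    provider = 'Ollama',
    provider_options = {
      model = 'codellama:7b-code',
      auto_unload = true, -- unload the Ollama model when exiting nvim
    },
  })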
9 changes: 8 additions & 1 deletion lua/cmp_ai/backends/ollama.lua
@@ -13,7 +13,14 @@ function Ollama:new(o)
      temperature = 0.2,
    },
  })

  if self.params.auto_unload then
    vim.api.nvim_create_autocmd('VimLeave', {
      callback = function()
        self:Get(self.params.base_url, {}, { model = self.params.model, keep_alive = 0 }, function() end)
      end,
      group = vim.api.nvim_create_augroup('CmpAIOllama', { clear = true }),
    })
  end
  return o
end

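For context (not part of the diff): the autocmd above works because a generate request with `keep_alive = 0` tells the Ollama server to unload the model as soon as the request finishes. Below is a rough standalone sketch of the same idea, assuming Ollama's default endpoint at http://localhost:11434 and using curl via vim.fn.system instead of the plugin's own Get helper; the augroup name is hypothetical.

  vim.api.nvim_create_autocmd('VimLeave', {
    group = vim.api.nvim_create_augroup('OllamaUnloadSketch', { clear = true }),
    callback = function()
      -- keep_alive = 0 asks the Ollama server to unload the model immediately
      vim.fn.system({
        'curl', '-s', 'http://localhost:11434/api/generate',
        '-d', vim.json.encode({ model = 'codellama:7b-code', keep_alive = 0 }),
      })
    end,
  })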
