ollama: implement ollama with llama3 command #188
Conversation
// Start the network call in a goroutine
go func() {
	err := runOllamaWorkspace(t, model, ollamaStore)
Let's copy over the async function from devplane collections. It's really useful for this kind of thing without having to wrangle goroutines manually.
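For context, the helper the reviewer refers to typically looks something like the sketch below: a generic `Async`/`Await` pair that runs a function in a goroutine and hands back a handle you can await later. The names `Async`, `Await`, and `AsyncResult` are assumptions for illustration; the actual devplane implementation may differ.

```go
package main

import "fmt"

// result pairs a value with its error for transport over a channel.
type result[T any] struct {
	value T
	err   error
}

// AsyncResult is a handle to a computation running in a goroutine.
type AsyncResult[T any] struct {
	ch chan result[T]
}

// Async runs fn in a goroutine and returns a handle to await later.
// The buffered channel lets the goroutine finish even if no one awaits.
func Async[T any](fn func() (T, error)) *AsyncResult[T] {
	r := &AsyncResult[T]{ch: make(chan result[T], 1)}
	go func() {
		v, err := fn()
		r.ch <- result[T]{value: v, err: err}
	}()
	return r
}

// Await blocks until the goroutine completes and returns its result.
func (r *AsyncResult[T]) Await() (T, error) {
	res := <-r.ch
	return res.value, res.err
}

func main() {
	// Kick off the "network call" without manual channel plumbing.
	handle := Async(func() (int, error) {
		return 21 * 2, nil
	})
	v, err := handle.Await()
	fmt.Println(v, err)
}
```

With a helper like this, call sites such as `runOllamaWorkspace` or `CreateWorkspace` can be wrapped in `Async(...)` and awaited after the spinner starts, instead of each site declaring its own goroutine and result channel.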
pkg/cmd/ollama/ollama.go
Outdated
s := t.NewSpinner()

go func() {
	w, err = ollamaStore.CreateWorkspace(org.ID, cwOptions)
Same comment here about async/await.
pkg/cmd/ollama/ollama.go
Outdated
lf, err = ollamaStore.BuildVerbContainer(w.ID, verbYaml)
if err != nil {
	// ideally log something here?
	verbCh <- nil
Same comment here about async/await.
This PR adds support for launching an ollama server with llama3 through the CLI. Additionally, I updated the Workspace type to support verb and cloudflare fields.
Ticket: ENG-3758