At the moment the LLMs and their associated inference / embeddings code live in class-specific implementations. I'm not sure this stays DRY once you start catering for different architectures (e.g. Dolly v2).

Consider refactoring around interfaces using a

`config: AutoConfig = AutoConfig.from_pretrained(path_or_repo)`

type approach. This might allow scaling to new model types without heavy configuration on the user's side or massive amounts of boilerplate rewrites of architecture-specific implementations.
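Eg. stub — a minimal sketch of what the interface-driven approach could look like. Everything here (`InferenceBackend`, the registry, the backend class) is a hypothetical illustration, not existing code; in the real refactor the registry key would come from `AutoConfig.from_pretrained(path_or_repo).model_type` rather than being passed explicitly.

```python
from typing import Callable, Dict, List

class InferenceBackend:
    """Common interface every architecture-specific backend implements."""
    def generate(self, prompt: str) -> str:
        raise NotImplementedError
    def embed(self, text: str) -> List[float]:
        raise NotImplementedError

# Registry mapping a config's model_type string to a backend factory,
# so adding a new architecture means one new class + one register() call.
_REGISTRY: Dict[str, Callable[[], InferenceBackend]] = {}

def register(model_type: str):
    def wrap(cls):
        _REGISTRY[model_type] = cls
        return cls
    return wrap

@register("gpt_neox")  # Dolly v2 is GPTNeoX-based, so it would resolve here
class GPTNeoXBackend(InferenceBackend):
    def generate(self, prompt: str) -> str:
        return f"[gpt_neox] {prompt}"  # placeholder for real inference
    def embed(self, text: str) -> List[float]:
        return [0.0]  # placeholder for real embeddings

def load_backend(model_type: str) -> InferenceBackend:
    # In practice: model_type = AutoConfig.from_pretrained(path_or_repo).model_type
    try:
        return _REGISTRY[model_type]()
    except KeyError:
        raise ValueError(f"No backend registered for {model_type!r}")
```

The user-facing surface then shrinks to `load_backend(...)`, and architecture differences stay behind the `InferenceBackend` interface.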