Prompt chaining, 'import' support for servers, and more
This release accumulates many improvements and new features, and should break nothing from previous versions.
- Implemented option to provide custom CA and Client Cert for TLS.
- Increased thread check frequency to reduce lag.
- Added Python prompt example (the funny one).
- Default user settings are a bit more verbose, for guidance.
- Specified Gitea server endpoint for creating Issues.
- Added 'visible' key to prompt specification.
- Improved prompt filtering per available context and endpoints.
- Implemented 'prepend' command.
- Response template is more sophisticated, supporting more use cases.
- PromptInputs now supports text_from_prompt and list_from_prompt.
- Implemented a stack to support prompt chaining.
- Implemented run_in() to clean up the code, and moved pid/eid into kwargs.
- Added Python type hints to ease development of the plugin.
- AssistantAISettings now uses classes for Server, Prompt, Endpoint and PromptInput.
- Added the ability to process 'import' statements in user servers.
- Lots of bug fixes on how settings are loaded.
- Added options to send query string arguments.
- Added assistant_ai_dump command for debugging processed settings.
- Providing more context to prompts: file, path, lines, ...
- Prompt arguments are all passed in kwargs.
- Improved prompt and status bar progress icons.
- Improved code readability.
- Updated README.md with better overall explanation.
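The stack-based prompt chaining mentioned above can be sketched roughly as follows. This is a minimal illustration, not the plugin's actual API: the `Prompt` class, `next_prompt` field, and `run_chain` function are all hypothetical names, and the template call stands in for a real endpoint request.

```python
# Hypothetical sketch of prompt chaining driven by an explicit stack.
# All names here (Prompt, next_prompt, run_chain) are illustrative only.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Prompt:
    name: str
    template: str
    next_prompt: Optional["Prompt"] = None  # chained follow-up prompt, if any


def run_chain(prompt: Prompt, text: str) -> str:
    """Run a prompt, then any chained prompts, feeding each result forward."""
    stack: List[Prompt] = [prompt]
    while stack:
        current = stack.pop()
        # Stand-in for sending the filled template to an endpoint.
        text = current.template.format(text=text)
        if current.next_prompt:
            stack.append(current.next_prompt)
    return text
```

With a two-step chain such as `Prompt("summarize", "Summary of: {text}", Prompt("translate", "Translated: {text}"))`, the second prompt receives the first prompt's output, which is the essential property a chaining stack provides.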
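Sending query string arguments, also listed above, amounts to merging extra parameters into an endpoint URL. A hedged sketch using only the standard library; the function name and the shape of the arguments are assumptions, not the plugin's settings schema.

```python
# Illustrative helper for appending query string arguments to an endpoint URL.
# build_url is a hypothetical name, not part of the plugin's real code.
from urllib.parse import urlencode, urlsplit, urlunsplit


def build_url(base: str, query_args: dict) -> str:
    """Merge extra query string arguments into a base endpoint URL."""
    parts = urlsplit(base)
    extra = urlencode(query_args)
    # Preserve any query string already present on the base URL.
    query = "&".join(p for p in (parts.query, extra) if p)
    return urlunsplit((parts.scheme, parts.netloc, parts.path, query, parts.fragment))
```

For example, `build_url("https://api.example.com/v1/complete", {"api-version": "2023-05"})` yields the endpoint URL with `?api-version=2023-05` appended.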