Merge branch 'arc53:main' into feature/mobile-responsive
varundhand authored Oct 26, 2023
2 parents de780bb + ebe84dd commit 8a4ac04
Showing 13 changed files with 409 additions and 59 deletions.
2 changes: 1 addition & 1 deletion README.md
Original file line number Diff line number Diff line change
@@ -72,7 +72,7 @@ If you don't have enough resources to run it, you can use bitsnbytes to quantize

- Scripts - Script that creates similarity search index for other libraries.

- Frontend - Frontend uses Vite and React.
- Frontend - Frontend uses <a href="https://vitejs.dev/">Vite</a> and <a href="https://react.dev/">React</a>.

## QuickStart

51 changes: 51 additions & 0 deletions application/parser/file/openapi3_parser.py
@@ -0,0 +1,51 @@
from urllib.parse import urlparse

from openapi_parser import parse

try:
from application.parser.file.base_parser import BaseParser
except ModuleNotFoundError:
from base_parser import BaseParser


class OpenAPI3Parser(BaseParser):
def init_parser(self) -> None:
return super().init_parser()

def get_base_urls(self, urls):
base_urls = []
for i in urls:
parsed_url = urlparse(i)
base_url = parsed_url.scheme + "://" + parsed_url.netloc
if base_url not in base_urls:
base_urls.append(base_url)
return base_urls

def get_info_from_paths(self, path):
info = ""
if path.operations:
for operation in path.operations:
info += (
f"\n{operation.method.value}="
f"{operation.responses[0].description}"
)
return info

def parse_file(self, file_path):
data = parse(file_path)
results = ""
base_urls = self.get_base_urls(link.url for link in data.servers)
base_urls = ",".join([base_url for base_url in base_urls])
results += f"Base URL:{base_urls}\n"
i = 1
for path in data.paths:
info = self.get_info_from_paths(path)
results += (
f"Path{i}: {path.url}\n"
f"description: {path.description}\n"
f"parameters: {path.parameters}\nmethods: {info}\n"
)
i += 1
with open("results.txt", "w") as f:
f.write(results)
return results
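The `get_base_urls` helper above reduces each server URL to its scheme and host, deduplicating while preserving order. A standalone sketch of that same logic using only the standard library (independent of the parser class, with illustrative example URLs):

```python
from urllib.parse import urlparse


def get_base_urls(urls):
    """Reduce each URL to scheme://netloc, keeping first-seen order."""
    base_urls = []
    for url in urls:
        parsed = urlparse(url)
        base = parsed.scheme + "://" + parsed.netloc
        if base not in base_urls:
            base_urls.append(base)
    return base_urls


servers = [
    "https://api.example.com/v1",
    "https://api.example.com/v2",     # same host, different path
    "http://staging.example.com/v1",  # different scheme and host
]
print(get_base_urls(servers))
# → ['https://api.example.com', 'http://staging.example.com']
```

Note that URLs differing only in path collapse to one entry, while a scheme change (`http` vs `https`) yields a distinct base URL.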
1 change: 1 addition & 0 deletions application/requirements.txt
@@ -57,6 +57,7 @@ nltk==3.8.1
numcodecs==0.11.0
numpy==1.24.2
openai==0.27.8
openapi3-parser==1.1.14
packaging==23.0
pathos==0.3.0
Pillow==10.0.1
4 changes: 4 additions & 0 deletions docs/pages/Deploying/_meta.json
@@ -6,5 +6,9 @@
"Quickstart": {
"title": "⚡️Quickstart",
"href": "/Deploying/Quickstart"
},
"Railway-Deploying": {
"title": "🚂Deploying on Railway",
"href": "/Deploying/Railway-Deploying"
}
}
54 changes: 30 additions & 24 deletions docs/pages/Guides/How-to-use-different-LLM.md
@@ -1,36 +1,42 @@
Fortunately, there are many providers for LLMs, and some of them can even be run locally.
# Setting Up Local Language Models for Your App

There are two models used in the app:
1. Embeddings.
2. Text generation.
Your app relies on two essential models: Embeddings and Text Generation. While OpenAI's default models work seamlessly, you have the flexibility to switch providers or even run the models locally.

By default, we use OpenAI's models, but if you want to change it or even run it locally, it's very simple!
## Step 1: Configure Environment Variables

### Go to .env file or set environment variables:
Navigate to the `.env` file or set the following environment variables:

`LLM_NAME=<your Text generation>`
```env
LLM_NAME=<your Text Generation model>
API_KEY=<API key for Text Generation>
EMBEDDINGS_NAME=<LLM for Embeddings>
EMBEDDINGS_KEY=<API key for Embeddings>
VITE_API_STREAMING=<true or false>
```
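A minimal sketch of reading these settings from the environment in Python — the variable names come from the block above, but the fallback defaults here are illustrative assumptions, not the application's actual defaults:

```python
import os

# Fallback values are illustrative only; the real application may differ.
llm_name = os.environ.get("LLM_NAME", "openai")
api_key = os.environ.get("API_KEY")  # may be None if users supply their own key
embeddings_name = os.environ.get("EMBEDDINGS_NAME", "openai_text-embedding-ada-002")
embeddings_key = os.environ.get("EMBEDDINGS_KEY")
streaming = os.environ.get("VITE_API_STREAMING", "false").lower() == "true"

print(llm_name, embeddings_name, streaming)
```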

`API_KEY=<api_key for Text generation>`
You can omit the keys if users provide their own. Ensure you set `LLM_NAME` and `EMBEDDINGS_NAME`.

`EMBEDDINGS_NAME=<llm for embeddings>`
## Step 2: Choose Your Models

`EMBEDDINGS_KEY=<api_key for embeddings>`
**Options for `LLM_NAME`:**
- openai
- manifest
- cohere
- Arc53/docsgpt-14b
- Arc53/docsgpt-7b-falcon
- llama.cpp

`VITE_API_STREAMING=<true or false (true if using openai, false for all others)>`
**Options for `EMBEDDINGS_NAME`:**
- openai_text-embedding-ada-002
- huggingface_sentence-transformers/all-mpnet-base-v2
- huggingface_hkunlp/instructor-large
- cohere_medium

You don't need to provide keys if you are happy with users providing theirs, so make sure you set `LLM_NAME` and `EMBEDDINGS_NAME`.
If using Llama, set `EMBEDDINGS_NAME` to `huggingface_sentence-transformers/all-mpnet-base-v2`. Download the required model and place it in the `models/` folder.

Options:
LLM_NAME (openai, manifest, cohere, Arc53/docsgpt-14b, Arc53/docsgpt-7b-falcon, llama.cpp)
EMBEDDINGS_NAME (openai_text-embedding-ada-002, huggingface_sentence-transformers/all-mpnet-base-v2, huggingface_hkunlp/instructor-large, cohere_medium)
Alternatively, for local Llama setup, run `setup.sh` and choose option 1. The script handles the DocsGPT model addition.

If using Llama, set the `EMBEDDINGS_NAME` to `huggingface_sentence-transformers/all-mpnet-base-v2` and be sure to download [this model](https://d3dg1063dc54p9.cloudfront.net/models/docsgpt-7b-f16.gguf) into the `models/` folder: `https://d3dg1063dc54p9.cloudfront.net/models/docsgpt-7b-f16.gguf`.
## Step 3: Local Hosting for Privacy

Alternatively, if you wish to run Llama locally, you can run `setup.sh` and choose option 1 when prompted. You do not need to manually add the DocsGPT model mentioned above to your `models/` folder if you use `setup.sh`, as the script will manage that step for you.

That's it!

### Hosting everything locally and privately (for using our optimised open-source models)
If you are working with critical data and don't want anything to leave your premises.

Make sure you set `SELF_HOSTED_MODEL` as true in your `.env` variable, and for your `LLM_NAME`, you can use anything that is on Hugging Face.
If working with sensitive data, host everything locally by setting `SELF_HOSTED_MODEL` to true in your `.env`. For `LLM_NAME`, use any model available on Hugging Face.
That's it! Your app is now configured for local and private hosting, ensuring optimal security for critical data.
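As an illustration, a fully self-hosted `.env` might combine the settings above like this — the model names are examples drawn from the option lists earlier in this guide, not recommendations:

```env
SELF_HOSTED_MODEL=true
LLM_NAME=Arc53/docsgpt-7b-falcon
EMBEDDINGS_NAME=huggingface_sentence-transformers/all-mpnet-base-v2
VITE_API_STREAMING=false
```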
2 changes: 1 addition & 1 deletion frontend/src/Navigation.tsx
@@ -172,7 +172,7 @@ export default function Navigation({ navOpen, setNavOpen }: NavigationProps) {
<>
{!navOpen && (
<button
className="duration-25 absolute top-3 left-3 z-20 hidden transition-all md:block"
className="duration-25 absolute sticky top-3 left-3 z-20 hidden transition-all md:block"
onClick={() => {
setNavOpen(!navOpen);
}}
4 changes: 1 addition & 3 deletions frontend/src/assets/checkmark.svg
129 changes: 99 additions & 30 deletions frontend/src/conversation/ConversationBubble.tsx
@@ -37,8 +37,13 @@ const ConversationBubble = forwardRef<
// Reset copied to false after a few seconds
setTimeout(() => {
setCopied(false);
}, 2000);
}, 3000);
};
const [isCopyHovered, setIsCopyHovered] = useState(false);
const [isLikeHovered, setIsLikeHovered] = useState(false);
const [isDislikeHovered, setIsDislikeHovered] = useState(false);
const [isLikeClicked, setIsLikeClicked] = useState(false);
const [isDislikeClicked, setIsDislikeClicked] = useState(false);

let bubble;

@@ -55,7 +60,10 @@
);
} else {
bubble = (
<div ref={ref} className={`flex self-start ${className} group flex-col`}>
<div
ref={ref}
className={`flex self-start ${className} group flex-col pr-20`}
>
<div className="flex self-start">
<Avatar className="mt-2 text-2xl" avatar="🦖"></Avatar>
<div
@@ -146,52 +154,113 @@
)}
</div>
<div
className={`relative mr-2 flex items-center justify-center md:invisible ${
className={`relative mr-5 flex items-center justify-center md:invisible ${
type !== 'ERROR' ? 'group-hover:md:visible' : ''
}`}
>
{copied ? (
<Checkmark className="absolute left-2 top-4" />
) : (
<Copy
className={`absolute left-2 top-4 cursor-pointer fill-gray-4000 hover:stroke-gray-4000`}
onClick={() => {
handleCopyClick(message);
<div className="absolute left-2 top-4">
<div
className="flex items-center justify-center rounded-full p-2"
style={{
backgroundColor: isCopyHovered ? '#EEEEEE' : '#ffffff',
}}
></Copy>
)}
>
{copied ? (
<Checkmark
className="cursor-pointer stroke-green-2000"
onMouseEnter={() => setIsCopyHovered(true)}
onMouseLeave={() => setIsCopyHovered(false)}
/>
) : (
<Copy
className={`cursor-pointer fill-none`}
onClick={() => {
handleCopyClick(message);
}}
onMouseEnter={() => setIsCopyHovered(true)}
onMouseLeave={() => setIsCopyHovered(false)}
></Copy>
)}
</div>
</div>
</div>
<div
className={`relative mr-2 flex items-center justify-center md:invisible ${
className={`relative mr-5 flex items-center justify-center ${
!isLikeClicked ? 'md:invisible' : ''
} ${
feedback === 'LIKE' || type !== 'ERROR'
? 'group-hover:md:visible'
: ''
}`}
>
<Like
className={`absolute left-6 top-4 cursor-pointer ${
feedback === 'LIKE'
? 'fill-purple-30 stroke-purple-30'
: 'fill-none stroke-gray-4000 hover:fill-gray-4000'
}`}
onClick={() => handleFeedback?.('LIKE')}
></Like>
<div className="absolute left-6 top-4">
<div
className="flex items-center justify-center rounded-full p-2"
style={{
backgroundColor: isLikeHovered
? isLikeClicked
? 'rgba(125, 84, 209, 0.3)'
: '#EEEEEE'
: isLikeClicked
? 'rgba(125, 84, 209, 0.3)'
: '#ffffff',
}}
>
<Like
className={`cursor-pointer ${
isLikeClicked || (feedback === 'LIKE')
? 'fill-white-3000 stroke-purple-30'
: 'fill-none stroke-gray-4000'
}`}
onClick={() => {
handleFeedback?.('LIKE');
setIsLikeClicked(true);
setIsDislikeClicked(false);
}}
onMouseEnter={() => setIsLikeHovered(true)}
onMouseLeave={() => setIsLikeHovered(false)}
></Like>
</div>
</div>
</div>
<div
className={`relative mr-10 flex items-center justify-center md:invisible ${
className={`mr-13 relative flex items-center justify-center ${
!isDislikeClicked ? 'md:invisible' : ''
} ${
feedback === 'DISLIKE' || type !== 'ERROR'
? 'group-hover:md:visible'
: ''
}`}
>
<Dislike
className={`absolute left-10 top-4 cursor-pointer ${
feedback === 'DISLIKE'
? 'fill-red-2000 stroke-red-2000'
: 'fill-none stroke-gray-4000 hover:fill-gray-4000'
}`}
onClick={() => handleFeedback?.('DISLIKE')}
></Dislike>
<div className="absolute left-10 top-4">
<div
className="flex items-center justify-center rounded-full p-2"
style={{
backgroundColor: isDislikeHovered
? isDislikeClicked
? 'rgba(248, 113, 113, 0.3)'
: '#EEEEEE'
: isDislikeClicked
? 'rgba(248, 113, 113, 0.3)'
: '#ffffff',
}}
>
<Dislike
className={`cursor-pointer ${
isDislikeClicked || (feedback === 'DISLIKE')
? 'fill-white-3000 stroke-red-2000'
: 'fill-none stroke-gray-4000'
}`}
onClick={() => {
handleFeedback?.('DISLIKE');
setIsDislikeClicked(true);
setIsLikeClicked(false);
}}
onMouseEnter={() => setIsDislikeHovered(true)}
onMouseLeave={() => setIsDislikeHovered(false)}
></Dislike>
</div>
</div>
</div>
</div>

3 changes: 3 additions & 0 deletions frontend/tailwind.config.cjs
@@ -28,6 +28,9 @@ module.exports = {
'purple-30': '#7D54D1',
'blue-4000': 'rgba(0, 125, 255, 0.36)',
'blue-5000': 'rgba(0, 125, 255)',
'green-2000': '#0FFF50',
'light-gray': '#edeef0',
'white-3000': '#ffffff',
},
},
},
51 changes: 51 additions & 0 deletions scripts/parser/file/openapi3_parser.py
@@ -0,0 +1,51 @@
from urllib.parse import urlparse

from openapi_parser import parse

try:
from scripts.parser.file.base_parser import BaseParser
except ModuleNotFoundError:
from base_parser import BaseParser


class OpenAPI3Parser(BaseParser):
def init_parser(self) -> None:
return super().init_parser()

def get_base_urls(self, urls):
base_urls = []
for i in urls:
parsed_url = urlparse(i)
base_url = parsed_url.scheme + "://" + parsed_url.netloc
if base_url not in base_urls:
base_urls.append(base_url)
return base_urls

def get_info_from_paths(self, path):
info = ""
if path.operations:
for operation in path.operations:
info += (
f"\n{operation.method.value}="
f"{operation.responses[0].description}"
)
return info

def parse_file(self, file_path):
data = parse(file_path)
results = ""
base_urls = self.get_base_urls(link.url for link in data.servers)
base_urls = ",".join([base_url for base_url in base_urls])
results += f"Base URL:{base_urls}\n"
i = 1
for path in data.paths:
info = self.get_info_from_paths(path)
results += (
f"Path{i}: {path.url}\n"
f"description: {path.description}\n"
f"parameters: {path.parameters}\nmethods: {info}\n"
)
i += 1
with open("results.txt", "w") as f:
f.write(results)
return results