
Using "llm-wrap" external service #42

Draft — wants to merge 23 commits into main

Commits (23)
610aa5f
Switching "Epic Demo" to "Epic, UW Medicine Epic (production)", same
mcjustin Oct 16, 2024
e92651a
Brings relevant changes from https://github.com/uwcirg/shl-ips/pull/38
mcjustin Oct 16, 2024
ae3263e
Rename AI to LLM
mcjustin Oct 16, 2024
7941761
More renaming...
mcjustin Oct 16, 2024
1a39eb3
Switch from internal /api/llm-chat to external service (hard-coded fo…
mcjustin Oct 16, 2024
322a5ec
Update llmChat.js - rename "llm-chat" subdomain to "llm-service"
mcjustin Oct 22, 2024
44c561b
Update llmChat.js - use port 5000 for llm-service
mcjustin Oct 22, 2024
3f178bd
Update llmChat.js - revert to calling llm-service w/out a port number
mcjustin Oct 22, 2024
b7cafc3
Update llmChat.js: maintain back-and-forth w/ LLM for context.
mcjustin Oct 23, 2024
f7fd67a
Fix reading of response from llm-service
mcjustin Oct 29, 2024
98b78ab
Another fix for handling response from llm-service...
mcjustin Oct 29, 2024
4b1aa55
Insert user queries into UI; label llm & user queries in UI.
mcjustin Oct 29, 2024
a81ed7f
Add token counts to llm message history
mcjustin Nov 12, 2024
4d60d9b
Format chat display a little bit; don't send prompt if it would be an…
mcjustin Nov 12, 2024
447d1e1
llm chat history now rendered in an html table.
mcjustin Nov 12, 2024
d26ef47
Refine llm history table
mcjustin Nov 12, 2024
4160e78
Improve labelling of LLM chat history table.
mcjustin Nov 12, 2024
47934b2
Add AI cost to llm chat history
mcjustin Nov 14, 2024
64a3e22
Fix syntax error in last commit
mcjustin Nov 14, 2024
fa29583
fix another syntax error
mcjustin Nov 14, 2024
bda7541
finesse display of llm cost
mcjustin Nov 14, 2024
5c1faea
finesse llm cost display
mcjustin Nov 14, 2024
d36a2a7
finesse llm cost info
mcjustin Nov 14, 2024
7 changes: 5 additions & 2 deletions src/lib/config.ts
@@ -24,8 +24,11 @@ export const SOF_HOSTS = [
// },
{
id: "epic",
name: "Epic Demo",
url: "https://fhir.epic.com/interconnect-fhir-oauth/api/FHIR/R4",
//name: "Epic Demo",
// url: "https://fhir.epic.com/interconnect-fhir-oauth/api/FHIR/R4",
name: "Epic, UW Medicine Epic (production)",
// per https://open.epic.com/MyApps/Endpoints :
url: "https://fhir.epic.medical.washington.edu/FHIR-Proxy/api/FHIR/R4",
clientId: import.meta.env.VITE_EPIC_CLIENT_ID,
note: "<a href='https://fhir.epic.com/Documentation?docId=testpatients' target='_blank' rel='noreferrer'>Test patient credentials <Icon name='box-arrow-up-right' /></a>"
},
109 changes: 109 additions & 0 deletions static/ips/assets/js/llmChat.js
@@ -0,0 +1,109 @@
let fhirResources = null;
let messages = [];

function initLLMChat(resources) {
fhirResources = resources;
const llmChatContent = document.getElementById('llm-chat-content');
const chatInput = document.getElementById('chat-input');
const sendMessageButton = document.getElementById('send-message');

sendMessageButton.addEventListener('click', sendMessage);
chatInput.addEventListener('keypress', (e) => {
if (e.key === 'Enter') sendMessage();
});
}

function insertMessageIntoUi(role, userMessage) {
const chatMessages = document.getElementById('chat-messages');

// Create a new table row for the user message
const row = document.createElement('tr');

// Create cells for the request
const requestCell = document.createElement('td');
requestCell.textContent = userMessage;

// Create empty cells for response and tokens
const responseCell = document.createElement('td');
const promptTokensCell = document.createElement('td');
const completionTokensCell = document.createElement('td');
const costCell = document.createElement('td');

// Append cells to the row
row.appendChild(requestCell);
row.appendChild(responseCell);
row.appendChild(promptTokensCell);
row.appendChild(completionTokensCell);
row.appendChild(costCell);

// Append the row to the chat messages table
chatMessages.appendChild(row);

// Return the row for later updates
return row;
}

// Update the sendMessage function to use the new insertMessageIntoUi
async function sendMessage() {
const chatInput = document.getElementById('chat-input');
const userMessage = chatInput.value.trim();
if (userMessage.length === 0) return;

// Append the FHIR resources as the first message
if (messages.length === 0) {
messages.push({
role: "user",
content: [{ type: "text", text: JSON.stringify(fhirResources) }]
});
}

// Append the user message
messages.push({
role: "user",
content: [{ type: "text", text: userMessage }]
});

// Insert the user message into the UI and get the row reference
const row = insertMessageIntoUi('user', userMessage);

chatInput.value = '';

try {
// FIXME config for this url...
const response = await fetch('https://llm-service.fl.mcjustin.dev.cirg.uw.edu/api/chat', {
method: 'POST',
headers: {
'Content-Type': 'application/json',
},
body: JSON.stringify({ messages: messages }), // Send the messages array
});

if (!response.ok) {
throw new Error('Failed to get LLM response');
}

const data = await response.json();
// Append the assistant's response
messages.push({
role: "assistant",
content: [{ type: "text", text: data.content }]
});

const promptTokens = data.prompt_tokens;
const completionTokens = data.completion_tokens;
const costInput = parseInt(promptTokens) * 0.15 / 1000000;
const costOutput = parseInt(completionTokens) * 0.6 / 1000000;
const cost = costInput + costOutput;

// Update the existing row with the response and token counts
row.cells[1].textContent = data.content; // Response
row.cells[2].textContent = promptTokens; // Prompt Tokens
row.cells[3].textContent = completionTokens; // Completion Tokens
row.cells[4].textContent = costInput.toString().substring(0,7) + " + " + costOutput.toString().substring(0,7) + " = " + cost.toString().substring(0,7);
} catch (error) {
console.error('Error sending message to LLM:', error);
row.cells[1].textContent = 'Failed to get a response. Please try again.'; // Update response cell with error message
}
}

export { initLLMChat };
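For reference, the dollar figure written into the last table cell can be sketched on its own. The per-token rates below are assumptions read off the constants in this diff (US$0.15 per million prompt tokens, US$0.60 per million completion tokens — GPT-4o-mini-style pricing), and `estimateCost` is a hypothetical helper, not part of the PR:

```javascript
// Standalone sketch of the cost math in sendMessage() above.
// Rates are assumptions taken from the constants in the diff;
// check them against the model actually served by llm-service.
function estimateCost(promptTokens, completionTokens) {
  const costInput = (promptTokens * 0.15) / 1e6;   // prompt-side cost
  const costOutput = (completionTokens * 0.6) / 1e6; // completion-side cost
  return { costInput, costOutput, total: costInput + costOutput };
}

// A 100k-token prompt with a 20k-token completion costs about $0.027.
const example = estimateCost(100000, 20000);
console.log(example.total.toFixed(6)); // prints "0.027000"
```

Formatting with `toFixed(6)` rather than the diff's `toString().substring(0, 7)` also sidesteps mangled output for very small costs, which JavaScript renders in exponential notation (e.g. `1.5e-7`).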
10 changes: 10 additions & 0 deletions static/ips/assets/js/renderIPS.js
@@ -1,4 +1,5 @@
import config from "./config.js";
import { initLLMChat } from "./llmChat.js";
var { pdfjsLib } = globalThis;
pdfjsLib.GlobalWorkerOptions.workerSrc = '//mozilla.github.io/pdf.js/build/pdf.worker.mjs';

@@ -21,6 +22,7 @@ $(document).ready(function () {
$('#content').show();
$('#FhirDropdown').on('click', () => updateDisplayMode('Entries'));
$('#NarrativeDropdown').on('click', () => updateDisplayMode('Text'));
$('#LlmChatDropdown').on('click', () => updateDisplayMode('LlmChat'));
});

function loadSample() {
@@ -43,11 +45,18 @@ function updateDisplayMode(displayMode) {
newText = 'App Interpretation';
} else if (displayMode == 'Text') {
newText = 'Generated Text';
} else if (displayMode == 'LlmChat') {
newText = 'LLM Chat';
}
if (newText) {
mode = displayMode
dropdown.html(newText);
}

// Show/hide content based on selected mode
//$('#rendered-ips').toggle(mode !== 'LlmChat');
$('#llm-chat-content').toggle(mode === 'LlmChat');

shlContents.forEach((e, i) => {
update(e, i);
});
@@ -234,6 +243,7 @@ function addTab(name, id) {
// Primary function to traverse the Bundle and get data
// Calls the render function to display contents
function update(ips, index) {
initLLMChat(ips);
sectionCount = 0;
$(`.output${index}`).html("");
$("#renderMessage").hide();
28 changes: 26 additions & 2 deletions static/ips/index.html
@@ -28,7 +28,8 @@
<!-- IPS Rendering -->
<script type="module" src="/ips/assets/js/renderIPS.js"></script>
<script type="module" src="/ips/assets/js/retreiveIPS.js"></script>


<script type="module" src="/ips/assets/js/llmChat.js"></script>

<title>IPS Viewer</title>
</head>
@@ -97,6 +98,9 @@
<a id="NarrativeDropdown" class="dropdown-item" style="cursor:pointer">
Display text generated by IPS source
</a>
<a id="LlmChatDropdown" class="dropdown-item" style="cursor:pointer">
Chat with a large language model (LLM) about your health
</a>
</div>
</div>
<div class="text-info" id="renderMessage" style="display: none;">
@@ -113,6 +117,26 @@
</div>
<span class="loader"></span>
</div>

<div id="llm-chat-content" style="display: none; margin: 30px;">
<table id="chat-messages" style="width: 100%; border-collapse: collapse;">
<thead>
<tr>
<th>Your request</th>
<th>LLM Chat Response</th>
<th>Prompt Tokens</th>
<th>Response Tokens</th>
<th>Cost in US$ (prompt + response = total)</th>
</tr>
</thead>
<tbody>
<!-- Messages will be appended here -->
</tbody>
</table>
<input type="text" id="chat-input" placeholder="Ask a large language model about your health...">
<button id="send-message">Send to LLM</button>
</div>

<div id="rendered-ips" class="tab-content" style="margin: 30px; margin-right: 0px;">
</div>

@@ -137,4 +161,4 @@
</div>
</body>

</html>
</html>