Commit

better package desc for npm
geckse committed May 12, 2023
1 parent b21ec2f commit d9c3a1d
Showing 2 changed files with 16 additions and 9 deletions.
13 changes: 7 additions & 6 deletions README.md
@@ -1,16 +1,14 @@
![Banner image](https://user-images.githubusercontent.com/10284570/173569848-c624317f-42b1-45a6-ab09-f0ea3c247648.png)

# n8n-nodes-gpt-tokenizer
# Work with BPE Tokens in n8n with the GPT-Tokenizer Node

# Work with BPE Tokens in n8n

This community package contains a node to work with BPE Tokens such as OpenAI's GPT models use under the hood.
As a matter of fact this node works just fine with the OpenAI Node.
This community package contains a node to work with BPE Tokens such as OpenAI's GPT models use under the hood. As a matter of fact this node works just fine with the OpenAI Node.

You can:
* Encode a string into BPE Tokens (may be cool for custom training)
* Decode an array of BPE Tokens back to a string (for funzies?)
* Determine a string's token length before submitting to the OpenAI API
* Calculate costs before submitting to OpenAI API
* Split a text into chunks which match exactly a definable Token Limit

[n8n](https://n8n.io/) is a [fair-code licensed](https://docs.n8n.io/reference/license/) workflow automation platform.
@@ -27,7 +25,7 @@ You can:
| ------------- | ------------- | ------------- |
| Encode | Encode a string into BPE Tokens. Returns an array of Tokens. | - |
| Decode | Decode an array of BPE Tokens into a string. Returns a string. | - |
| Count Tokens | Count the tokens a string produces. Return a the number of tokens. | - |
| Count Tokens | Count the tokens a string produces. Return the number of tokens. | - |
| Check Token Limit | Whether a given string exceeds a defined Token Limit. Returns a boolean. | Optional: throw an error if the Token Limit is exceeded. |
| Slice to Max Token Limit | Slice the string into blocks which match exactly the provided token limit. Returns an array of strings. | - |
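
For orientation, here is a minimal TypeScript sketch of how the operations above typically map onto a BPE tokenizer library. The `gpt-tokenizer` import and its `encode`/`decode` signatures are an assumption for illustration, not taken from this commit, and the helper functions are hypothetical rather than the node's actual implementation.

```typescript
// Hedged sketch: the 'gpt-tokenizer' package and its encode/decode API are
// assumed here for illustration; the n8n node's internals may differ.
import { encode, decode } from 'gpt-tokenizer';

// Encode: string -> array of BPE token ids
const tokens: number[] = encode('Hello n8n!');

// Decode: array of BPE token ids -> string
const text: string = decode(tokens);

// Count Tokens: the length of the encoded array
const tokenCount: number = tokens.length;

// Check Token Limit: returns true if the limit is exceeded,
// optionally throwing an error instead (hypothetical helper).
function checkTokenLimit(input: string, limit: number, throwOnExceed = false): boolean {
  const exceeded = encode(input).length > limit;
  if (exceeded && throwOnExceed) {
    throw new Error(`Token limit of ${limit} exceeded`);
  }
  return exceeded;
}

// Slice to Max Token Limit: naive chunking into blocks of at most
// `limit` tokens each, decoded back to strings (hypothetical helper).
function sliceToMaxTokenLimit(input: string, limit: number): string[] {
  const ids = encode(input);
  const chunks: string[] = [];
  for (let i = 0; i < ids.length; i += limit) {
    chunks.push(decode(ids.slice(i, i + limit)));
  }
  return chunks;
}
```
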

@@ -48,5 +46,8 @@ I hope you are enjoying these nodes. If you are in need of a smooth automation, s

## Version History

### 0.1.1
- just polishing the npm release

### 0.1.0
- initial release
12 changes: 9 additions & 3 deletions package.json
@@ -1,9 +1,15 @@
{
"name": "n8n-nodes-gpt-tokenizer",
"version": "0.1.0",
"description": "A n8n node to Tokenize the Input into byte pair sequence of integers in similar fashion as GPT models of OpenAI does",
"version": "0.1.1",
"description": "n8n node for working with BPE Tokens with OpenAI's GPT models in mind.",
"keywords": [
"n8n-community-node-package"
"n8n-community-node-package",
"n8n",
"nodemation",
"n8n-node",
"gpt",
"bpe",
"tokenizer"
],
"license": "MIT",
"homepage": "",
