Releases: mgonzs13/llama_ros

4.0.3

16 Oct 09:21
  • new XTC sampling added
  • new system_prompt param
  • llama.cpp b3923

This version does not compile due to errors in the vendor CMakeLists
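XTC ("exclude top choices") sampling comes from llama.cpp: with some probability, it removes every high-probability token except the least probable one among them, pushing generation away from the most obvious continuations. A rough sketch of the filtering step in plain Python (parameter names are illustrative, not the actual llama_ros/llama.cpp API):

```python
import random

def xtc_filter(probs, threshold=0.1, xtc_probability=0.5, rng=random):
    """Sketch of XTC filtering over token probabilities sorted
    in descending order. Returns the (renormalized) filtered list."""
    # Only apply the filter with probability xtc_probability.
    if rng.random() >= xtc_probability:
        return probs
    # Indices of the "top choices": tokens at or above the threshold.
    above = [i for i, p in enumerate(probs) if p >= threshold]
    if len(above) < 2:
        return probs  # fewer than two top choices, nothing to exclude
    # Keep only the least probable top choice and everything below the
    # threshold, then renormalize the remaining distribution.
    keep = probs[above[-1]:]
    total = sum(keep)
    return [p / total for p in keep]
```

With the default `rng`, the filter fires randomly per sampling step; passing a deterministic `rng` makes the behavior reproducible for testing.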

4.0.2

11 Oct 11:36
  • common prefix added for llama.cpp commons
  • llama.cpp b3906

This version does not compile due to errors in the vendor CMakeLists

4.0.1

07 Oct 07:24
  • llama_rag_demo fixed
  • llama.cpp b3889

4.0.0

03 Oct 11:48
  • reranking added
  • separate LLM, embedding, and reranking models
  • new services (reranking and detokenize)
  • models for reranking and embeddings added
  • vicuna prompt added
  • llama namespace removed from LlamaClientNode
  • full demo with LLM + chat template + RAG + reranking + stream
  • README:
    • model shards example added
    • reranking langchain and demo added
    • embedding demo added
    • minor fixes
  • langchain reranking added
  • langchain upgraded to 0.3
  • llama.cpp b3870
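The reranking added in this release scores candidate documents against a query and keeps the most relevant ones, which is what slots between retrieval and the LLM in the RAG demo. A minimal sketch of the idea in plain Python (the `score_fn` and `overlap` helpers are hypothetical stand-ins, not the llama_ros service or LangChain API):

```python
def rerank(query, documents, score_fn, top_k=3):
    """Score each document against the query and return the top_k
    documents ordered from most to least relevant."""
    scored = [(score_fn(query, doc), doc) for doc in documents]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [doc for _, doc in scored[:top_k]]

def overlap(query, doc):
    # Toy relevance score: shared words between query and document.
    # A real reranker model computes a learned relevance score instead.
    return len(set(query.lower().split()) & set(doc.lower().split()))
```

In the actual stack, the scoring step is served by a dedicated reranking model through the new ROS 2 service rather than a local function.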

3.9.2

26 Sep 08:17
  • chat_llama_ros added to README
  • model shard files download added
  • llama.cpp b3827

3.9.1

21 Sep 16:30
  • qwen2 updated to qwen2.5
  • llama.cpp b3799

3.9.0

15 Sep 18:19
  • new sampling from llama.cpp
  • grammar functions removed
  • n_remain removed
  • threadpool added
  • llama.cpp b3756

3.8.3

03 Sep 11:00
  • fixed stop when n_remain is 0
  • llama.cpp updated

3.8.2

30 Aug 10:10
  • ChatLlamaROS stream fix
  • ChatLlamaROS demo video added
  • Fix passing image as data

3.8.1

30 Aug 09:39
  • llama.cpp updated
  • new cpuparams