GitXplorer

nio-llm

public
2 stars
0 forks
0 issues

Commits

List of commits on branch master.
Verified
0f312a0a70819073e380ad286792c9ed29fb5004

⚡️ use async openai methods + create async task for the typing loop

Laurent2916 committed a year ago
Verified
8eda4825d9540b7e45bffb4852579647a2b379c8

🔥 simplify the README

Laurent2916 committed a year ago
Verified
12080ad3a56216f4641b5d2cf56e380f1823fb18

✨ make history_size configurable

Laurent2916 committed a year ago
Verified
ca22fe640f210b033e47125807e704928c99efdf

🔧 change the default preprompt, add `concise` keyword

Laurent2916 committed a year ago
Verified
2d91052d6edbe491e0a41ca4576afbf151168f5d

🐛 preemptively check if `rel_type` is in `event.source["content"]["m.relates_to"]`

Laurent2916 committed a year ago
Verified
904dde744f15226eaf6e43192c57ca941b5a781b

♻️ big refactor, use llama server and openai python library

Laurent2916 committed a year ago

README

The README file for this repository.

Nio LLM


Your own little LLM in your Matrix chatroom.

Usage

This project is split into two parts: the client and the server.

The server simply downloads an LLM and starts a llama-cpp-python server (which mimics an OpenAI API server).

The client connects to the Matrix server and queries the llama-cpp-python server to generate Matrix messages.
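As a rough sketch of the server half: llama-cpp-python ships an OpenAI-compatible HTTP server that can be launched from the command line. The model path and port below are placeholders, not values taken from this repository:

```shell
# Install llama-cpp-python with its optional server extras
pip install 'llama-cpp-python[server]'

# Start the OpenAI-compatible server on a local GGUF model
# (model path and port are placeholders)
python -m llama_cpp.server --model ./models/model.gguf --port 8000
```

The client can then treat `http://localhost:8000/v1` as if it were the OpenAI API endpoint.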

Special thanks