
nio-llm

public
2 stars
0 forks
0 issues

Commits

List of commits on branch master.
Verified
95c751ea303364f523204eed06d8fb650710582c

🐛 temporarily encapsulate message_callback's inside logic inside a try/catch

Laurent2916 committed a year ago
Verified
f92a20b2c58b024b529485da8be9a6eef24af664

🔧 change default typing_loop's `sleep_time` to 15 seconds

Laurent2916 committed a year ago
Verified
8dac964d2d8dea5ff16ec4190ebb1fbea648a235

🔧 change default history_size

Laurent2916 committed a year ago
Verified
d7a14fd4eefb8c1098a222dcba5aab8756ca44fb

✨ modify matrix message content to format mentions and newlines

Laurent2916 committed a year ago
Verified
10c7513addcf0e2b1cd9ed1961a7bb6450d67fee

✨ update read receipt when message history is updated

Laurent2916 committed a year ago
Verified
5b5a18d73b9b90f1fa110d8fde63c8be0ee1628b

🎨 parametrize typing_loop with `typing_loop`

Laurent2916 committed a year ago

README

The README file for this repository.

Nio LLM

Badges: GitHub · Code style: black · Ruff

Your own little LLM in your Matrix chatroom.

Usage

This project is split into two parts: the client and the server.

The server simply downloads an LLM and starts a llama-cpp-python server, which exposes an OpenAI-compatible API.
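
For reference, a llama-cpp-python server is typically started along these lines (the model path, host, and port below are illustrative placeholders, not values taken from this repository):

    pip install "llama-cpp-python[server]"
    python -m llama_cpp.server --model ./models/model.gguf --host 127.0.0.1 --port 8000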

The client connects to the Matrix homeserver and queries the llama-cpp-python server to generate Matrix messages.
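
A minimal sketch of what the client side looks like, assuming matrix-nio for the Matrix connection and the OpenAI-compatible endpoint served by llama-cpp-python. The homeserver URL, user ID, password, model name, and port are placeholders for illustration, not values from this repository:

    import asyncio

    from nio import AsyncClient, MatrixRoom, RoomMessageText
    from openai import AsyncOpenAI

    # OpenAI-compatible endpoint exposed by llama-cpp-python (placeholder URL/model).
    llm = AsyncOpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")


    async def main() -> None:
        # Placeholder homeserver and credentials.
        client = AsyncClient("https://matrix.example.org", "@bot:example.org")
        await client.login("password")

        async def on_message(room: MatrixRoom, event: RoomMessageText) -> None:
            # Ignore the bot's own messages to avoid replying to itself.
            if event.sender == client.user_id:
                return
            # Ask the local LLM for a reply and post it back to the room.
            completion = await llm.chat.completions.create(
                model="local-model",
                messages=[{"role": "user", "content": event.body}],
            )
            await client.room_send(
                room_id=room.room_id,
                message_type="m.room.message",
                content={
                    "msgtype": "m.text",
                    "body": completion.choices[0].message.content,
                },
            )

        client.add_event_callback(on_message, RoomMessageText)
        await client.sync_forever(timeout=30000)


    asyncio.run(main())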

Special thanks