
super-simple-chatui (public, 3 stars, 2 forks, 0 issues)

Commits

List of commits on branch main (all committed by daveschumaker about 8 months ago; all unverified):

  • 94b3a4a7ba87762af38a3e45f05f5e5b96515cb9 - chore: documentation and setup instructions in README
  • 9a344b5e997c07c3789a7f3140aa6cc36ee1a29e - chore: register system prompt modal
  • ef889b426e1f8df7a812db2d73ec6bb1589eee14 - feat: add simplified management for system prompt
  • 4e17603b517873633d92fa96006010e0739519fa - chore: split Header component from App
  • c46557ab2cd400d794c8a73e0a6ecff929551544 - feat: add modal component and manager
  • 2ed71b9e2b64a8fe5c6578d3d1b95accc2aef4e9 - chore: update conversation interface and pass in system prompt

README


Super Simple ChatUI

[Screenshot: Super Simple ChatUI]

Overview

Super Simple ChatUI is a React/TypeScript project that provides a simple and intuitive frontend UI for interacting with a local LLM (Large Language Model) through Ollama. It enables users to interact with their own LLMs locally, ensuring privacy and control over their data.

This project was set up using Vite, which allows for rapid development thanks to features like Hot Module Replacement, TypeScript support, CSS Modules, and more.
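
For illustration, a typical Vite configuration for a React/TypeScript project looks like the sketch below. This is a generic example, not necessarily the exact vite.config.ts used in this repository.

import { defineConfig } from 'vite';
import react from '@vitejs/plugin-react';

// Enables the official React plugin (JSX transform, Fast Refresh / HMR).
export default defineConfig({
  plugins: [react()],
});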

Installation

Prerequisites

  • Node.js (v18.17.0 or later)
  • Ollama

Steps

  1. Clone the repository
> git clone https://github.com/daveschumaker/super-simple-chatui.git
> cd super-simple-chatui
  2. Install dependencies
> npm install
  3. Run the development server
> npm run dev
  4. Access the application by visiting the link shown in your terminal (Vite's default is http://localhost:5173)

Usage

  1. Ensure that Ollama is running on your machine and exposing its API at http://localhost:11434
  2. Interact with the LLM: use the super-simple-chatui interface to send queries to Ollama and receive responses (see the sketch below for what the underlying API call looks like)
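
Under the hood, a chat UI like this talks to Ollama's local HTTP API. The sketch below is a minimal, non-streaming example of such a request; it is not the project's actual code, and the model name ("llama3") is an assumption that depends on which model you have pulled locally.

// Minimal sketch: send one chat request to a local Ollama instance.
async function askOllama(prompt: string, systemPrompt: string): Promise<string> {
  const response = await fetch('http://localhost:11434/api/chat', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      model: 'llama3', // assumption: replace with any model you have pulled via `ollama pull`
      stream: false,   // request a single JSON response instead of a token stream
      messages: [
        { role: 'system', content: systemPrompt },
        { role: 'user', content: prompt },
      ],
    }),
  });
  const data = await response.json();
  return data.message.content; // the assistant's reply text
}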

TODOs

  • Add support for IndexedDB via Dexie (longer-term storage for conversations, system prompts, various settings, etc.); see the sketch after this list
  • Add support for picking from available models via Ollama
  • Add support for chatting with models via the AI Horde
  • Add support for OpenAI's ChatGPT API via API key
  • Write tests! Always with the tests.
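
As a rough sketch of the first TODO item (the database name, table, and fields below are illustrative assumptions, not part of the project), Dexie lets you describe an IndexedDB store for conversations like this:

import Dexie, { type Table } from 'dexie';

interface Conversation {
  id?: number; // auto-incremented primary key
  title: string;
  systemPrompt: string;
  messages: { role: 'system' | 'user' | 'assistant'; content: string }[];
  updatedAt: number;
}

class ChatDatabase extends Dexie {
  conversations!: Table<Conversation, number>;

  constructor() {
    super('super-simple-chatui'); // assumed database name
    this.version(1).stores({
      // Only indexed fields are listed here; other fields are still stored.
      conversations: '++id, title, updatedAt',
    });
  }
}

export const db = new ChatDatabase();

// Usage example:
// await db.conversations.add({ title: 'New chat', systemPrompt: '', messages: [], updatedAt: Date.now() });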

Contributing

Contributions are welcome! Please follow these steps to contribute:

  1. Fork the repository.
  2. Create a new branch (git checkout -b feature-branch).
  3. Make your changes.
  4. Commit your changes (git commit -m 'Add new feature').
  5. Push to the branch (git push origin feature-branch).
  6. Open a pull request.

License

This project is licensed under the MIT License. See the LICENSE file for details.

Acknowledgments