From fb57930c69ac65f0e94c05ebbaa846133d3349e1 Mon Sep 17 00:00:00 2001
From: Hayden Johnson
Date: Wed, 25 Sep 2024 13:14:44 -0700
Subject: [PATCH] Add README.md

---
 README.md | 63 +++++++++++++++++++++++++++++++++++++++++++++++++++++++
 1 file changed, 63 insertions(+)
 create mode 100644 README.md

diff --git a/README.md b/README.md
new file mode 100644
index 0000000..0471a74
--- /dev/null
+++ b/README.md
@@ -0,0 +1,63 @@
+**Assistant**
+===============
+
+A terminal-based chat interface with an intelligent assistant.
+
+**Overview**
+------------
+
+This script provides a simple way to interact with an AI assistant in your terminal. It uses the Ollama API to generate responses and can handle markdown code blocks, syntax highlighting, and shell commands.
+
+**Features**
+------------
+
+* Terminal-based chat interface
+* Supports markdown code blocks
+* Syntax highlighting using Pygments
+* Handles shell commands
+* Can copy highlighted code blocks to the clipboard
+* Allows follow-up questions with the `--follow-up` argument
+* Customizable model and temperature with the `--model` and `--temp` arguments
+
+**Usage**
+---------
+
+### Command Line Arguments
+
+The script supports the following command line arguments:
+
+* `--follow-up`: Ask a follow-up question when piping in context
+* `--copy`: Copy a code block to the clipboard if one appears in the response
+* `--shell`: Output a shell command that does as described
+* `--model`: Specify the model (default: `llama3.1:8b-instruct-q8_0`)
+* `--temp`: Specify the temperature (default: `0.5`)
+* `--host`: Specify the host of the Ollama server
+
+### Piped Input
+
+You can pipe text to the script via standard input. The script reads everything from standard input and treats it as a single question (or as context for a follow-up question when combined with `--follow-up`).
+
+### Interactive Input
+
+Run the script in a terminal without piping anything in to enter interactive mode. Type your questions or commands, and the assistant will respond accordingly.
+
+**Installation**
+----------------
+
+This script requires Python 3 and the Ollama Python library (`ollama`), plus Pygments for syntax highlighting. You can install these dependencies using pip:
+
+```bash
+pip install ollama pygments
+```
+
+You also need a running Ollama server on your local machine. Please refer to the Ollama documentation for setup instructions.
+
+**Running the Script**
+----------------------
+
+Run the script with Python:
+
+```bash
+python assistant.py
+```
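
The piped-versus-interactive split the README describes can be sketched roughly as below. This is an illustrative sketch only: `read_question` is a hypothetical helper, not code taken from `assistant.py`, and the actual script may detect piped input differently.

```python
import sys


def read_question(stream=sys.stdin):
    """Return (question, was_piped).

    If the input stream is not a terminal (i.e. something was piped
    in), treat the entire piped text as a single question; otherwise
    prompt the user interactively.
    """
    if not stream.isatty():
        # Piped input: read everything and treat it as one question.
        return stream.read().strip(), True
    # Interactive mode: prompt for a question on the terminal.
    return input("> ").strip(), False
```

With this shape, `echo "why is the sky blue?" | python assistant.py` would take the piped branch, while a plain `python assistant.py` in a terminal would prompt interactively.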