**Assistant**
===============
A terminal-based chat interface with an intelligent assistant.
**Overview**
------------
This script provides a simple way to interact with an AI assistant in your terminal. It uses the Ollama API to generate responses, renders markdown code blocks with syntax highlighting, and can generate shell commands.
**Features**
------------
* Terminal-based chat interface
* Supports markdown code blocks
* Syntax highlighting using Pygments
* Handles shell commands
* Can copy highlighted code blocks to clipboard
* Allows follow-up questions with `--follow-up` argument
* Customizable model and temperature with `--model` and `--temp` arguments
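The code-block handling above can be sketched roughly as follows. The regex and function name are illustrative, not taken from the script (the fence string is built with string repetition so the example can itself sit inside a markdown fence):

```python
import re

# Fenced-block pattern; illustrative, the script's own parsing may differ.
FENCE = "`" * 3
CODE_BLOCK_RE = re.compile(FENCE + r"(\w*)\n(.*?)" + FENCE, re.DOTALL)

def extract_code_blocks(markdown: str) -> list[tuple[str, str]]:
    """Return (language, code) pairs for every fenced block in the text."""
    return CODE_BLOCK_RE.findall(markdown)

reply = f"Here you go:\n{FENCE}python\nprint('hi')\n{FENCE}\nDone."
print(extract_code_blocks(reply))  # [('python', "print('hi')\n")]
```

Once a block is extracted, its language tag can be handed to a Pygments lexer for highlighting and its body to the clipboard for `--copy`.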
**Usage**
---------
### Command Line Arguments
The script supports the following command line arguments:
* `--follow-up`: Ask a follow-up question about piped-in context
* `--copy`: Copy a code block from the response to the clipboard, if one appears
* `--shell`: Output a shell command that performs the described task
* `--model`: Specify the model (default: `llama3.1:8b-instruct-q8_0`)
* `--temp`: Specify the sampling temperature (default: `0.5`)
* `--host`: Specify the host of the Ollama server
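The argument handling above might be wired up roughly like this. The option names and defaults are taken from the list above; the actual wiring inside the script may differ:

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    """Build a parser for the documented command line arguments."""
    parser = argparse.ArgumentParser(description="Terminal AI assistant")
    parser.add_argument("--follow-up", action="store_true",
                        help="ask a follow-up question about piped-in context")
    parser.add_argument("--copy", action="store_true",
                        help="copy a code block to the clipboard if one appears")
    parser.add_argument("--shell", action="store_true",
                        help="output a shell command for the described task")
    parser.add_argument("--model", default="llama3.1:8b-instruct-q8_0",
                        help="model to use")
    parser.add_argument("--temp", type=float, default=0.5,
                        help="sampling temperature")
    parser.add_argument("--host", default=None,
                        help="host of the Ollama server")
    return parser

# Parse with no arguments to show the documented defaults.
args = build_parser().parse_args([])
print(args.model, args.temp)  # llama3.1:8b-instruct-q8_0 0.5
```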
### Piped Input
You can pipe text into the script. When standard input is not a terminal, everything piped in is read and handled as a single question (combine with `--follow-up` to ask a question about the piped-in context).
### Non-Piped Input
Run the script with nothing piped in to enter interactive mode. Type your questions or commands, and the assistant responds in the terminal.
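The piped-versus-interactive distinction typically comes down to checking whether standard input is a terminal. A minimal sketch (the helper name is illustrative, not the script's actual function):

```python
import io
import sys
from typing import Optional

def read_question(stream=None) -> Optional[str]:
    """Return piped-in text, or None when the stream is an interactive terminal."""
    stream = stream or sys.stdin
    if stream.isatty():
        return None  # interactive mode: prompt the user instead
    return stream.read().strip()

# io.StringIO reports isatty() == False, mimicking piped input.
print(read_question(io.StringIO("What is a monad?\n")))  # What is a monad?
```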
**Installation**
--------------
This script requires Python 3.x, the Ollama Python library (`ollama`), and Pygments for syntax highlighting. You can install these dependencies using pip:
```bash
pip install ollama pygments
```
You also need an Ollama server running on your local machine, with the model you plan to use already pulled. Please refer to the Ollama documentation for setup instructions.
**Running the Script**
----------------------
Run the script with Python:
```bash
python assistant.py
```