**Assistant**
===============
A terminal-based chat interface with an intelligent assistant.
**Overview**
------------
This script provides a simple way to interact with an AI assistant in your terminal. It uses the Ollama API to generate responses and supports markdown code blocks, syntax highlighting, and shell command generation.
**Features**
------------
* Terminal-based chat interface
* Supports markdown code blocks
* Syntax highlighting using Pygments
* Handles shell commands
* Can copy highlighted code blocks to clipboard
* Allows follow-up questions with `--follow-up` argument
* Customizable model and temperature with `--model` and `--temp` arguments
**Usage**
---------
### Command Line Arguments
The script supports the following command line arguments:
* `--follow-up`: Ask a follow-up question when piping in context
* `--copy`: Copy a code block to the clipboard if one appears in the response
* `--shell`: Output a shell command that does what you describe
* `--model`: Specify model (default: `llama3.1:8b-instruct-q8_0`)
* `--temp`: Specify temperature (default: `0.2`)
* `--host`: Specify the host of the Ollama server
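For example, asking for a shell command at a lower temperature might look like this (the exact flag combination is an assumption based on the argument list above; the question itself is typed or piped in):
```bash
# Ask for a shell command from the default model, with a low temperature
python assistant.py --shell --temp 0.1
```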
### Piped Input
You can pipe input to the script; no extra arguments are needed. The script will read from standard input and handle it as a single question.
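For example:
```bash
# A one-off question read from standard input
echo "How do I list open ports on Linux?" | python assistant.py
```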
### Non-Piped Input
Run the script with no arguments to enter interactive mode. Type your questions or commands, and the assistant will respond accordingly.
**Installation**
----------------
You can install the dependencies in a virtual environment using uv:
```bash
uv sync
```
You also need an Ollama server running on your local machine (or reachable via `--host`). Refer to the Ollama documentation for setup instructions.
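A minimal local setup might look like the following, assuming you use the default model shown under `--model` above:
```bash
# Start the Ollama server (skip if it already runs as a system service)
ollama serve &
# Pull the default model used by the script
ollama pull llama3.1:8b-instruct-q8_0
```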
**Running the Script**
----------------------
Run the script with Python:
```bash
python assistant.py
```
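If you installed the dependencies with uv as shown above, you can also run the script inside the managed environment. Pointing at a remote server with `--host` is shown here with an assumed example value:
```bash
# Run in the uv-managed environment, talking to a remote Ollama server
uv run python assistant.py --host http://192.168.1.10:11434
```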