Add README.md
commit fb57930c69 (parent a69b63e4fe)

README.md (new file, 63 lines)
@@ -0,0 +1,63 @@
**Assistant**
===============

A terminal-based chat interface with an intelligent assistant.

**Overview**
------------

This script provides a simple way to interact with an AI assistant in your terminal. It uses the Ollama API to generate responses and can handle markdown code blocks, syntax highlighting, and shell commands.

**Features**
------------

* Terminal-based chat interface
* Supports markdown code blocks
* Syntax highlighting using Pygments
* Handles shell commands
* Can copy highlighted code blocks to the clipboard
* Allows follow-up questions with the `--follow-up` argument
* Customizable model and temperature via the `--model` and `--temp` arguments

**Usage**
---------

### Command Line Arguments

The script supports the following command line arguments:

* `--follow-up`: Ask a follow-up question when piping in context
* `--copy`: Copy a code block to the clipboard if one appears in the response
* `--shell`: Output a shell command that does what you describe
* `--model`: Specify the model to use (default: `llama3.1:8b-instruct-q8_0`)
* `--temp`: Specify the temperature (default: `0.5`)
* `--host`: Specify the host of the Ollama server

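For illustration, here are a few ways these arguments might be used (a sketch; `assistant.py` is the script name used under Running the Script below, and the host URL is a made-up example):

```bash
# Interactive session with the default model but a lower temperature
python assistant.py --temp 0.2

# Pick a specific model
python assistant.py --model llama3.1:8b-instruct-q8_0

# Talk to an Ollama server on another machine
# (the URL format is an assumption; Ollama listens on port 11434 by default)
python assistant.py --host http://192.168.1.50:11434
```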
### Piped Input

You can pipe input to the script; it will read everything from standard input and handle it as a single question. When context is piped in, the `--follow-up` argument lets you ask a follow-up question about it.

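For example (the prompts and file names here are made up):

```bash
# Ask a single question by piping it in
echo "What does the sticky bit on a directory do?" | python assistant.py

# Pipe in some context and ask a follow-up question about it
# (how the follow-up question itself is supplied is not specified in this README)
cat build.log | python assistant.py --follow-up
```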
### Non-Piped Input

Run the script with no arguments (and nothing piped in) to enter interactive mode. Type your questions or commands, and the assistant will respond accordingly.

**Installation**
--------------

This script requires Python 3.x and the Ollama API library (`ollama-api`). You can install it with pip:

```bash
pip install ollama-api
```

The Features section also mentions Pygments for syntax highlighting; if it is not pulled in automatically, you can add it with `pip install Pygments`.
You also need to set up an Ollama server, on your local machine by default or on whatever host you pass with `--host`. Please refer to the Ollama documentation for instructions.

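If you use the Ollama CLI, a minimal local setup might look like this (standard Ollama commands; the model tag matches the script's default):

```bash
# Start the Ollama server (it listens on localhost:11434 by default)
ollama serve

# Download the model the script uses by default
ollama pull llama3.1:8b-instruct-q8_0
```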
**Running the Script**
----------------------

Run the script with Python:

```bash
python assistant.py
```

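Arguments from the Usage section can be added to this invocation. For example, to start a session where the assistant outputs shell commands and copies the highlighted code block to the clipboard (a sketch; it assumes the two flags can be combined in one run):

```bash
python assistant.py --shell --copy
```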