# Assistant
A terminal-based chat interface with an intelligent assistant.
## Overview
This script provides a simple way to interact with an AI assistant in your terminal. It uses the Ollama API to generate responses and can handle markdown code blocks, syntax highlighting, and shell commands.
## Features
- Terminal-based chat interface
- Supports markdown code blocks
- Syntax highlighting using Pygments
- Handles shell commands
- Can copy highlighted code blocks to clipboard
- Allows follow-up questions with the `--follow-up` argument
- Customizable model and temperature with the `--model` and `--temp` arguments
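The clipboard feature operates on fenced markdown blocks in the assistant's replies. A minimal sketch of how a response might be scanned for code blocks (the function name and regex are illustrative, not the script's actual implementation):

```python
import re

# Matches fenced markdown blocks: an optional language tag, then the body.
CODE_BLOCK_RE = re.compile(r"```(\w+)?\n(.*?)```", re.DOTALL)

def extract_code_blocks(markdown: str) -> list[tuple[str, str]]:
    """Return (language, body) pairs for every fenced block in the text."""
    return [(m.group(1) or "", m.group(2)) for m in CODE_BLOCK_RE.finditer(markdown)]

reply = "Use this:\n```python\nprint('hi')\n```\nDone."
blocks = extract_code_blocks(reply)
# A /copy command could then hand blocks[-1][1] to the system clipboard
# (e.g. via pbcopy, xclip, or the pyperclip package).
```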
## Usage
### Command Line Arguments
The script supports the following command line arguments:
- `--follow-up`: Ask a follow-up question when piping in context
- `--copy`: Copy a code block if it appears
- `--shell`: Output a shell command that does as described
- `--model`: Specify the model (default: `llama3.1:8b-instruct-q8_0`)
- `--temp`: Specify the temperature (default: `0.2`)
- `--host`: Specify the host of the Ollama server
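A few example invocations (the prompts and piped files are illustrative):

```shell
# Interactive session with the default model
python assistant.py

# One-off question with a different model and a higher temperature
echo "What does chmod 644 do?" | python assistant.py --model llama3.1:8b-instruct-q8_0 --temp 0.5

# Pipe in context, then ask a follow-up question about it
cat error.log | python assistant.py --follow-up
```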
### Piped Input
You can pipe input into the script from another command. It will read standard input and treat the piped text as a single question.
### Non-Piped Input
Run the script directly (without piping anything in) to enter interactive mode. Type your questions or commands, and the assistant will respond accordingly.
## Installation
This script requires Python 3.x and the Ollama Python library (`ollama`). You can install the dependency using pip:

```shell
pip install ollama
```
You also need to set up an Ollama server on your local machine. Please refer to the Ollama documentation for instructions.
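For instance, with the Ollama CLI installed, pulling the script's default model and starting the server looks like this (the model tag matches the `--model` default above):

```shell
# Download the default model used by the script
ollama pull llama3.1:8b-instruct-q8_0

# Start the Ollama server (listens on localhost:11434 by default)
ollama serve
```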
## Running the Script
Run the script with Python:

```shell
python assistant.py
```