# Assistant
A terminal-based chat interface with an intelligent assistant.
## Overview

This script provides a simple way to interact with an AI assistant in your terminal. It uses the Ollama API to generate responses, renders markdown code blocks with syntax highlighting, and can generate shell commands.
## Features
- Terminal-based chat interface
- Supports markdown code blocks
- Syntax highlighting using Pygments
- Handles shell commands
- Can copy highlighted code blocks to clipboard
- Allows follow-up questions with the `--follow-up` argument
- Customizable model and temperature with the `--model` and `--temp` arguments
## Usage

### Command Line Arguments

The script supports the following command line arguments:
- `--follow-up`: Ask a follow-up question when piping in context
- `--copy`: Copy a code block if it appears
- `--shell`: Output a shell command that does as described
- `--model`: Specify the model (default: `llama3.1:8b-instruct-q8_0`)
- `--temp`: Specify the temperature (default: `0.2`)
- `--host`: Specify the host of the Ollama server
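As a sketch, the flags above might be combined like this (the questions and the remote host address are illustrative; flag names and defaults are taken from the list above):

```shell
# Generate a shell command for a task described on stdin
echo "find all files larger than 100 MB" | python assistant.py --shell

# Use an explicit model and temperature against a remote Ollama server
# (the host value here is just an example address)
echo "explain this error message" | python assistant.py \
    --model llama3.1:8b-instruct-q8_0 --temp 0.5 \
    --host http://192.168.1.50:11434
```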
### Piped Input

You can pipe input to the script through standard input. The script will read everything piped in and handle it as a single question.
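A typical piped-input workflow, sketched below, feeds a file's contents in as the question (the file name is just an example, and exactly how `--follow-up` consumes the piped context is only described briefly above, so treat this as a sketch):

```shell
# Pipe a log file to the assistant as a single question
cat build.log | python assistant.py

# Pipe the same context back in and ask a follow-up question about it
cat build.log | python assistant.py --follow-up
```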
### Non-Piped Input

Run the script without piping any input to enter interactive mode. Type your questions or commands, and the assistant will respond accordingly.
## Installation

You can install the dependencies in a virtual environment using uv:

```shell
uv sync
```
You also need to set up an Ollama server on your local machine. Please refer to the Ollama documentation for instructions.
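Assuming you use the Ollama CLI, pulling the script's default model and checking that the server is up might look like this (the model tag comes from the `--model` default above; 11434 is Ollama's standard port):

```shell
# Download the default model the script expects
ollama pull llama3.1:8b-instruct-q8_0

# Confirm the server is responding (this endpoint lists local models)
curl http://localhost:11434/api/tags
```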
## Running the Script

Run the script with Python:

```shell
python assistant.py
```