A Python script that enables you to chat with an intelligent assistant (LLaMA) in your terminal. The script uses the ollama library to interact with the LLaMA model and provide a user-friendly interface.

Assistant

A terminal-based chat interface with an intelligent assistant.

Overview

This script provides a simple way to interact with an AI assistant in your terminal. It uses the Ollama API to generate responses and can handle markdown code blocks, syntax highlighting, and shell commands.

Features

  • Terminal-based chat interface
  • Supports markdown code blocks
  • Syntax highlighting using Pygments
  • Handles shell commands
  • Can copy highlighted code blocks to clipboard
  • Allows follow-up questions with --follow-up argument
  • Customizable model and temperature with --model and --temp arguments

Usage

Command Line Arguments

The script supports the following command line arguments:

  • --follow-up: Ask a follow-up question about piped-in context
  • --copy: Copy a code block from the response to the clipboard, if one appears
  • --shell: Output a shell command that does what you describe
  • --model: Specify model (default: llama3.1:8b-instruct-q8_0)
  • --temp: Specify temperature (default: 0.2)
  • --host: Specify host of Ollama server
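An argument parser for the flags above might look like the following sketch, using the defaults the README documents (the parser structure is illustrative, not the script's actual code):

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    # Flag names and defaults mirror the documented CLI
    p = argparse.ArgumentParser(description="Terminal chat assistant")
    p.add_argument("--follow-up", help="follow-up question for piped-in context")
    p.add_argument("--copy", action="store_true", help="copy a code block to the clipboard")
    p.add_argument("--shell", action="store_true", help="output a shell command")
    p.add_argument("--model", default="llama3.1:8b-instruct-q8_0")
    p.add_argument("--temp", type=float, default=0.2)
    p.add_argument("--host", default=None, help="host of the Ollama server")
    return p

args = build_parser().parse_args(["--shell", "--temp", "0.7"])
```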

Piped Input

You can pipe input to the script. When standard input is not a terminal, the script reads everything from it and handles it as a single question, optionally combined with a --follow-up question about that context.
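Detecting piped input typically comes down to checking whether stdin is a terminal. A minimal sketch (the function name is hypothetical, not the script's API):

```python
import io
import sys

def read_question(stream=None):
    """Return piped input as one question, or None when stdin is a terminal."""
    stream = stream or sys.stdin
    if stream.isatty():          # interactive terminal: nothing was piped in
        return None
    return stream.read().strip() # piped: consume all of stdin as the question

# Simulate piped input with an in-memory stream (isatty() is False for StringIO)
piped = read_question(io.StringIO("explain this traceback\n"))
```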

Non-Piped Input

Run the script from a terminal, with nothing piped in, to enter interactive mode. Type your questions or commands, and the assistant will respond accordingly.
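The interactive mode is essentially a read-eval-print loop. A minimal sketch, with the input and model-reply callables injected so the loop can be exercised without a live terminal or Ollama server (all names here are illustrative):

```python
def chat_loop(get_input, get_reply, quit_words=("exit", "quit")):
    """Minimal REPL: read a question, get the model's answer, repeat until a quit word."""
    transcript = []
    while True:
        question = get_input()
        if question.strip().lower() in quit_words:
            break
        answer = get_reply(question)
        transcript.append((question, answer))
    return transcript

# Drive the loop with canned I/O instead of a terminal and a real model
questions = iter(["hello", "exit"])
log = chat_loop(lambda: next(questions), lambda q: f"echo: {q}")
```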

Installation

This script requires Python 3.x and the ollama Python library, along with the other packages listed in requirements.txt. You can install the dependencies using pip:

pip install -r requirements.txt

You also need a running Ollama server, either on your local machine (the default) or reachable via the --host argument. Please refer to the Ollama documentation for setup instructions.

Running the Script

Run the script with Python:

python assistant.py