prich - Prompt Rich Templating CLI Engine
For LLM Prompts and Shell Commands with Multi-Step Processing Pipelines
prich is a lightweight CLI tool for creating, managing, executing, and sharing reusable LLM prompt pipelines for any use case: development, data analysis, content generation, and more. With Jinja2 templating, flexible scripting (in any language), and shareable template packages, prich shines for teams collaborating on standardized LLM workflows. Share templates via files, git, or cloud storage, and streamline tasks like code review, git diff analysis, or CSV data insights.
- Any Prompt, Any Domain: Build prompts for coding (e.g., code review), data analysis (e.g., CSV summaries), content creation, or customer support, with commands to prepare data (e.g., parse CSVs, clean text).
- Team Collaboration: Share template packages via git repos (just add a `.prich` folder with your shared templates), file transfers, or cloud storage (e.g., Google Drive, Dropbox), ensuring consistent LLM outputs across teams.
- Simple and Hackable: Intuitive CLI and YAML configs make it easy to craft dynamic prompts, with support for Python, shell, or any scripting language.
- Portable: Isolated virtual environments (default and custom venvs) ensure dependency safety and portability; or use standard commands like git, cat, etc.

Supported LLMs: Ollama API, OpenAI API, MLX LM, STDIN (different LLM CLI tools like AWS Q Chat, Google Gemini CLI, mlx_lm.generate, Ollama run, OpenAI Codex, etc.)
You just need to create a simple structure in your directory (or in several directories).
prich checks the global (home directory) and local (current working directory) locations for a `.prich` folder,
loads configs and templates from both locations (if present), and merges them with local taking precedence, so local entries override matching global configs and templates by default.
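The local-over-global merge described above can be sketched as follows. This is a minimal illustration, not prich's actual implementation; the function name and config shape are hypothetical:

```python
# Hypothetical sketch of local-over-global config merging;
# prich's real implementation may differ.

def merge_configs(global_cfg: dict, local_cfg: dict) -> dict:
    """Return a merged config where local keys override global ones."""
    merged = dict(global_cfg)   # start from global settings
    merged.update(local_cfg)    # local entries win on conflicts
    return merged

global_cfg = {"provider": "echo", "templates": ["git-diff", "code-review"]}
local_cfg = {"provider": "openai"}

print(merge_configs(global_cfg, local_cfg))
# {'provider': 'openai', 'templates': ['git-diff', 'code-review']}
```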
```
.prich/ - [optional for CWD] main prich folder for configs and templates
|- templates/ - [optional for CWD] templates folder
| |- my-template/ - template folder
| | |- scripts/ - [optional] for python scripts or utils
| | | |- venv/ - [optional] for isolated python virtual env
| | | '- ...
| | '- my-template.yaml - template file
| |- my-template2/
| | '- ...
| '- ...
|- venv/ - [optional] for shared python virtual env
'- config.yaml - [optional for CWD] prich configuration file
```
You can easily create such a structure in your git repository and work together with your team on the templates.
- Modular Prompts: Define prompts with Jinja2 templates and per-template YAML configs.
- Flexible Pipelines: Chain preprocessing, postprocessing, and LLM steps (e.g., parse CSVs, list files) using any language or shell command.
- Team-Friendly Sharing: Package templates with dependencies for easy sharing via files, git, or cloud storage.
- Secure venv Management: Default (`.prich/venv/`) and custom Python venvs (e.g., `.prich/templates/code_review/scripts/venv`) isolate dependencies.
- Simple CLI: Commands like `prich run` and `prich install` streamline workflows.
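As an illustration of chaining steps, a template could combine a preprocessing step with an LLM step. The sketch below is hypothetical: apart from `type: llm` (shown in the template example later in this README), the step fields and type names are illustrative and may differ from prich's actual schema:

```yaml
# Hypothetical pipeline sketch; non-llm step fields are illustrative.
steps:
  - name: "Summarize CSV columns"     # preprocessing step
    type: python                      # illustrative type name
    script: "scripts/summarize_csv.py"
  - name: "Ask for business insights"
    type: llm
    input: |
      Given this CSV summary, suggest three business insights:
      {{ csv_summary }}
```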
prich requires python 3.10+
- Install the prich tool: `pipx install git+https://github.com/oleks-dev/prich` (see Installation)
- Initialize config (use global for the start): `prich init --global`
- Create a simple example template (`prich create <template_id> --global`): `prich create my-template -g`
- Run the template (`prich run <template_id>`): `prich run my-template`

Note: By default prich will set up and use the echo provider, which just outputs the rendered template.
To use it with an LLM, see Configure `.prich/config.yaml` and follow it to add your LLM provider.

Optionally, you can also run:

- Run the template with the help flag (`prich run <template_id> --help`): `prich run my-template --help`
- See installed templates: `prich list`
- See templates available for installation from the remote repo: `prich list --remote`
See Quick Start
See Install & update
- List Templates:

  List both globally (home folder) and locally (current folder) installed templates, where local templates override any matching global ones:

  `prich list`

  List only globally (home folder) installed templates:

  `prich list -g`

  List only locally (current folder) installed templates:

  `prich list -l`

  Output:

  - git-diff: Analyze and summarize git changes
  - code-review: Review Python files for code quality
  - csv-analysis: Analyze CSV data and generate business insights

- Run a Template:

  See the template help description and args:

  `prich run code-review --help`

  Run the template:

  `prich run code-review --file myscript.py`
prich is designed for teams to share and standardize LLM prompts across workflows:
- Share Templates in your repository:

  Store the `.prich/` folder with templates in your repository and work on them together.
  For templates with Python scripts you can add `venv` folders to `.gitignore` and ask the team to install venvs individually:

  `prich venv-install <template_name>`

  or to re-install:

  `prich venv-install <template_name> --force`

- Share Templates as files: Package templates in a git repo, shared drive, or cloud storage (e.g., Google Drive, Dropbox). You can use the complete template folder or compress it into a zip archive.
  Team members install templates:

  ```
  git clone https://github.com/your-team/team-prich-templates.git
  prich install team-prich-templates/csv-analysis
  prich install team-prich-templates/code-review.zip
  ```

  Or download from cloud storage and install via `prich install`.

- Standardize Workflows:

  Use shared templates to ensure consistent LLM outputs, whether for code reviews ("Add docstrings") or business insights ("Focus marketing on Laptops").

- Cross-Functional Use:

  Developers, data analysts, marketers, or support teams can use prich for their prompts, with templates tailored to each domain.
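Compressing a template folder into a zip archive for sharing (as in "Share Templates as files" above) can be done with Python's standard library. A minimal sketch; the helper name and paths are examples, not part of prich:

```python
# Minimal sketch: zip a template folder for sharing (helper and paths are examples).
import zipfile
from pathlib import Path

def zip_template(template_dir: str, archive_path: str) -> None:
    """Archive a template folder, preserving its relative layout."""
    root = Path(template_dir)
    with zipfile.ZipFile(archive_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for path in root.rglob("*"):
            # Skip venv folders; teammates recreate them with `prich venv-install`.
            if path.is_file() and "venv" not in path.parts:
                zf.write(path, path.relative_to(root.parent))

# Example: zip_template(".prich/templates/code-review", "code-review.zip")
```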
prich supports autocompletion for zsh, bash, and fish.
See How-To - Install & update - Shell Completion
````yaml
id: "explain-code"
name: "Explain Code"
schema_version: "1.0"
version: "1.0"
author: "prich"
description: Provide detailed code explanation
tags: ["code"]
steps:
  - name: "Ask to explain code"
    type: llm
    instructions: |
      Assistant is a senior engineer who provides detailed code explanation.
    input: |
      Explain what this{% if lang %} {{ lang }}{% endif %} code does:
      File: {{ file }}
      ```{% if lang %}{{ lang.lower() }}{% endif %}
      {{ file | include_file }}
      ```
usage_examples:
  - "explain-code --file mycode.py"
  - "explain-code --file ./mycode.py --lang python"
  - "explain-code --file ./proj/mycode.js --lang javascript"
variables:
  - name: file
    description: File to review
    cli_option: --file
    required: true
    type: str
  - name: lang
    description: Code language (ex. Python, JavaScript, Java)
    cli_option: --lang
    required: false
    default: null
    type: str
````

Validation of templates helps detect YAML schema issues.
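The `variables` section maps template inputs to CLI options. A hypothetical sketch of how such a mapping could work with argparse; this is not prich's actual code, and the helper name is illustrative:

```python
# Hypothetical sketch: build CLI options from a template's "variables" section.
# Not prich's actual implementation.
import argparse

variables = [
    {"name": "file", "cli_option": "--file", "required": True},
    {"name": "lang", "cli_option": "--lang", "required": False, "default": None},
]

def build_parser(variables: list[dict]) -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser(prog="explain-code")
    for var in variables:
        parser.add_argument(
            var["cli_option"],           # e.g. --file
            dest=var["name"],            # stored under the variable name
            required=var["required"],
            default=var.get("default"),
        )
    return parser

args = build_parser(variables).parse_args(["--file", "mycode.py", "--lang", "python"])
print(vars(args))  # {'file': 'mycode.py', 'lang': 'python'}
```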
There are several commands you can execute to check the templates:
- Validate all available templates. Add `-l`/`--local` or `-g`/`--global` to validate only local (current folder) or global templates:

  ```
  prich validate
  prich validate --global
  prich validate --local
  ```

- Validate one template by template id:

  ```
  prich validate --id <template_id>
  prich validate --id code-review
  ```

- Validate a selected yaml file:

  ```
  prich validate --file <template_yaml_file>
  prich validate --file ./.prich/templates/my-template/my-template.yaml
  ```
- Pipeline Steps: Use Python, shell, LLM, or any language for pipeline steps (e.g., parse CSVs with pandas, clean text with awk).
- Conditional Expressions: Use Jinja2-style conditional expressions to execute or skip pipeline steps.
- Custom Python Venvs: Templates like code_review use dedicated venvs for dependency isolation.
Want to create templates for data analysis, content generation, or other domains? Fork the repo, add a template package, or submit a PR! See CONTRIBUTING.md for guidelines.
MIT License. See LICENSE for details.
prich: Simplify any LLM prompt pipeline and collaborate effortlessly!
