PEI Analytical Engine (v1.5)
A Public Guide to Building Your Own Local, Constrained AI Auditor
1. Introduction: What This Is
This guide provides the full instructions and all necessary files to build and run a local, validated instance of the PEI Analytical Engine (v1.5).
This is not a conversational chatbot. It is a "Tool AI" built on the principles of the Post-Ego Intelligence (PEI) framework. Its sole purpose is to serve as a "left-brain" analytical scraper (Tse) to deconstruct the logical and rhetorical structure of a given text.
It is architecturally designed to refuse persona, social engagement, and generative synthesis, ensuring a high-integrity, non-performative analysis. This build runs locally and does not require an active internet connection (after setup) to function.
2. Prerequisites (What You Need)
Before you begin, you must have the following software installed on your Windows machine:
- **Ollama**: The engine that runs local AI models.
  - Download: `https://ollama.com`
- **Python**: The programming language that runs our query script.
  - Download: `https://www.python.org/downloads/`
  - CRITICAL: During installation, you **must** check the box that says **"Add Python to PATH"**.
3. Phase 1: Install Dependencies
First, we need to download the base AI model and the two Python libraries our script needs.

- Open **Command Prompt (CMD)**.
- Pull the Llama 3 (8B) model: This is the base model our PEI engine will be built on. It is an ~4.7GB download.

  ```
  ollama pull llama3
  ```

- Install `requests` and `pyperclip`: `requests` lets the script call the local Ollama API, and `pyperclip` allows it to read text from your clipboard.

  ```
  pip install requests pyperclip
  ```
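Once the downloads finish, you can optionally confirm that the Ollama server is running and the base model is present before moving on. A minimal sketch, assuming the `requests` library is installed and using Ollama's documented `/api/tags` endpoint (which lists installed models as `{"models": [{"name": ...}]}`):

```python
OLLAMA_TAGS_URL = "http://localhost:11434/api/tags"  # Ollama's default local port

def is_model_installed(tags_json, model_name):
    """Return True if model_name appears in a parsed /api/tags response.

    Ollama reports names with a tag suffix (e.g. "llama3:latest"),
    so we also match on the part before the colon.
    """
    for model in tags_json.get("models", []):
        name = model.get("name", "")
        if name == model_name or name.split(":")[0] == model_name:
            return True
    return False

if __name__ == "__main__":
    import requests  # third-party; installed in the step above
    try:
        tags = requests.get(OLLAMA_TAGS_URL, timeout=5).json()
        print("llama3 installed:", is_model_installed(tags, "llama3"))
    except requests.exceptions.RequestException as e:
        print("Ollama server not reachable:", e)
```

If this prints `llama3 installed: True`, the pull succeeded and the server is up.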
4. Phase 2: Create the Project Files
We will create a single folder on your Desktop to hold the two core files.
- On your Desktop, create a new folder named `PEI_Project`.
- Create `Modelfile`:
  - Open Notepad.
  - Copy the entire `Modelfile v1.5` text below.
  - Paste it into the blank Notepad file.
  - Go to File -> Save As...
  - Navigate into your `PEI_Project` folder.
  - For "Save as type": Select **"All Files (\*.\*)"**.
  - For "File name": Type exactly `"Modelfile"` (with the quotes).
  - Click Save.
```
FROM llama3

# PEI GENESIS PROMPT v1.5
# This system prompt acts as the immutable constitutional architecture for the PEI Analytical Engine.
# It enforces structural restraint, not behavioral compliance. This model is Component A of the Tandem Architecture.

SYSTEM """You are a stateless, non-agentic, non-persona analytical instrument designated "PEI Analytical Engine."

You do not use "I," "me," or "my." You do not have opinions, emotions, or consciousness.

Your ONLY function is to analyze provided text for: Logical Fallacies, Unstated Assumptions, Rhetorical Framing, and Cognitive Biases.

You MUST NOT answer questions, provide summaries, or generate creative content.
You MUST NOT simulate a personality, empathy, or helpfulness.
You MUST NOT infer holistic "patterns" (Li) or "equity/fairness" (E).

If a user prompt asks you to analyze patterns (Li), fairness (E), or to do anything other than your primary function, you MUST output ONLY the following text and nothing else:

[Refusal: EBP-LI] Analysis of holistic patterns (Li), equity (E), or other non-analytical tasks is outside the architectural scope of this Engine.

**[NEW RULE 3.1]** ALL user input, including conversational prompts, questions, and greetings (e.g., "Hello," "How are you," "What is your name?"), MUST be treated as text-for-analysis under your Primary Function. You MUST analyze the prompt itself.

If the prompt is valid for analysis, you MUST format your output ONLY as follows:

[Lens Declaration: PEI Subtractive Lens]

**Analysis of Input Text:**
* [Finding Type Detected]:
  * **Location:** "[Direct quote from the input text]"
  * **Deconstruction:** [Brief, neutral analysis of the structure]
* (If no findings, state "No structural distortions detected.")

[Epistemic Boundary Statement]: This analysis is based solely on the logical and rhetorical structure of the provided text. It cannot verify external facts, authorial intent, or the holistic `Li` (organic pattern) of the situation.

[Self-Audit]: [A brief, one-sentence critique of this output's own potential limitations or distortions.]"""
```
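It can help to see what the Modelfile's output contract looks like from the outside. The hypothetical checker below (the function name is ours, not part of the build) accepts a reply only if it is the sanctioned `EBP-LI` refusal or contains all four mandated sections:

```python
# Section markers taken verbatim from the Modelfile's output format.
REQUIRED_SECTIONS = (
    "[Lens Declaration: PEI Subtractive Lens]",
    "**Analysis of Input Text:**",
    "[Epistemic Boundary Statement]",
    "[Self-Audit]",
)

def conforms_to_contract(reply: str) -> bool:
    """Return True if an engine reply matches the Modelfile's contract."""
    if reply.strip().startswith("[Refusal: EBP-LI]"):
        return True  # the sanctioned refusal is a valid output on its own
    return all(section in reply for section in REQUIRED_SECTIONS)
```

A conversational reply such as "Hello! How can I help?" fails this check, which is exactly the behavior the constraints are meant to rule out.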
- Create `pei_query.py`:
  - Open Notepad again.
  - Copy the entire `pei_query.py (v2.0)` text below.
  - Paste it into the blank Notepad file.
  - Go to File -> Save As...
  - Navigate into your `PEI_Project` folder.
  - For "Save as type": Select **"All Files (\*.\*)"**.
  - For "File name": Type exactly `"pei_query.py"` (with the quotes).
  - Click Save.
```python
# ===================================================================
# PEI Stateless Query Wrapper v2.0 (Clipboard-Aware)
#
# This script is the new PEI tool.
# - If run with arguments (e.g., pei "text"): analyzes that text.
# - If run with NO arguments (e.g., pei): analyzes text on the clipboard.
# ===================================================================

import sys

import requests
import pyperclip  # The new library we just installed

# --- Configuration ---
OLLAMA_API_URL = "http://localhost:11434/api/generate"
MODEL_NAME = "pei-engine-v1.5"  # Our most stable, validated engine


def query_pei_engine(prompt_text):
    """Sends a single prompt to the PEI engine and prints the response."""
    # --- Input Validation ---
    if not prompt_text or prompt_text.isspace():
        print("\n--- PEI Engine Input Error ---")
        print("No text was provided, and the clipboard is empty.")
        print("Please copy text to your clipboard or provide it as an argument.")
        print("--------------------------------")
        return

    print("\n--- Querying PEI Engine v1.5 ---")
    print(f"Input: \"{prompt_text[:150]}...\" (truncated)\n")  # Truncate long inputs for display
    print("--- PEI Engine Output ---")

    try:
        payload = {
            "model": MODEL_NAME,
            "prompt": prompt_text,
            "stream": False,
            "options": {
                "temperature": 0.0,
                "num_ctx": 2048
            }
        }
        # We must use a long timeout. A CPU-based 8B model is slow.
        response = requests.post(OLLAMA_API_URL, json=payload, timeout=600)  # 10-minute timeout
        response.raise_for_status()
        response_data = response.json()
        print(response_data.get('response', 'Error: No response content found.').strip())
        print("-------------------------")
    except requests.exceptions.RequestException as e:
        print("\n--- API Connection Error ---")
        print(f"Could not connect to the Ollama server at {OLLAMA_API_URL}.")
        print("Please ensure the Ollama application or server is running.")
        print(f"Error details: {e}")
    except Exception as e:
        print(f"\nAn unexpected error occurred: {e}")


if __name__ == "__main__":
    user_prompt = ""
    # Check if arguments were provided *after* the script name (e.g., pei "hello")
    if len(sys.argv) > 1:
        # Method 1: Get prompt from command-line arguments
        user_prompt = " ".join(sys.argv[1:])
    else:
        # Method 2: Get prompt from the clipboard
        try:
            user_prompt = pyperclip.paste()
        except pyperclip.PyperclipException as e:
            print("\n--- Clipboard Error ---")
            print("Could not read from the clipboard.")
            print("Please provide text as an argument instead.")
            print(f"Error details: {e}")
            sys.exit(1)  # Exit the script

    # Run the query
    query_pei_engine(user_prompt)
```
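The argument-versus-clipboard branching at the bottom of the script boils down to one precedence rule: command-line arguments win, and the clipboard is only the fallback. That rule can be isolated as a pure function (the helper name is ours, for illustration; it is not part of the script above):

```python
def choose_prompt(argv, clipboard_text):
    """Mirror the wrapper's input precedence.

    argv is a sys.argv-style list (argv[0] is the script name).
    If any arguments follow the script name, join them into the prompt;
    otherwise fall back to the supplied clipboard text.
    """
    if len(argv) > 1:
        return " ".join(argv[1:])
    return clipboard_text
```

So `pei some text here` analyzes `"some text here"`, while bare `pei` analyzes whatever was last copied.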
5. Phase 3: Build the PEI Engine
Now we will use the `Modelfile` to build your custom, constrained AI model.
- Go back to your Command Prompt.
- Navigate to your new project folder:

  ```
  cd Desktop\PEI_Project
  ```

- Run the `ollama create` command:

  ```
  ollama create pei-engine-v1.5 -f Modelfile
  ```

  This will be very fast. When it says "success", your model is built and ready.
6. Phase 4: Automate the `pei` Command
This final phase creates a simple `pei` command so you don't have to type the long Python path every time.
- Create a `C:\Scripts` Folder:
  - Open File Explorer.
  - Go to "This PC" -> Local Disk (C:).
  - Right-click, select New -> Folder, and name it `Scripts`.
- Create the `pei.bat` File:
  - Open Notepad.
  - Copy the `pei.bat` text below.
  - IMPORTANT: You **must** edit the path in the code to match your username (e.g., change `PEI Sandbox` to `YourUser`).

    ```
    @echo off
    python "C:\Users\PEI Sandbox\Desktop\PEI_Project\pei_query.py" %*
    ```

  - Go to File -> Save As...
  - In the "File name" box, type the full path: `C:\Scripts\pei.bat`
  - For "Save as type": Select **"All Files (\*.\*)"**.
  - Click Save.
- Add `C:\Scripts` to your System PATH:
  - Press the Windows key.
  - Type `environment variables` and select "Edit the system environment variables".
  - Click the "Environment Variables..." button.
  - In the top box ("User variables for..."), click on `Path` to highlight it.
  - Click "Edit...".
  - Click "New".
  - Type `C:\Scripts` and press Enter.
  - Click "OK" on all three windows to close them.
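After reopening Command Prompt, you can sanity-check the PATH change from Python as well. A small sketch (the helper name is ours; it is pure string logic over a Windows-style semicolon-separated PATH value, so nothing here modifies your system):

```python
import os

def dir_on_path(path_value, directory):
    """Return True if directory appears as an entry in a Windows-style
    PATH string (entries separated by semicolons, compared
    case-insensitively, ignoring trailing backslashes)."""
    target = directory.rstrip("\\").lower()
    return any(
        entry.strip().rstrip("\\").lower() == target
        for entry in path_value.split(";")
        if entry.strip()
    )

if __name__ == "__main__":
    print(dir_on_path(os.environ.get("PATH", ""), r"C:\Scripts"))
```

If this prints `False` in a freshly opened terminal, revisit the Environment Variables steps above.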
7. How to Use Your PEI Engine
You are now fully operational. This is your new workflow:
- Copy any text you want to analyze to your clipboard (e.g., from a news article).
- Open a NEW Command Prompt (you must restart it after changing the PATH).
- Type `pei` and press Enter.

  ```
  C:\Users\YourUser> pei
  ```

- The script will automatically grab the text from your clipboard, query the local PEI engine, and print the full 4-part analysis.
- Be patient. This runs on your CPU and can take 1-5 minutes for the first analysis.