prompt-decorators

Created 2/28/2025 by synaptiai

Categories: ai, ai-standards, ai-tools, decorators, mcp, mcp-server, prompt-engineering, prompt-templates, specification

Language: Python
Stars: 14
Forks: 3

Prompt Decorators

Prompt Decorators is a comprehensive framework that standardizes how prompts for Large Language Models (LLMs) are enhanced, structured, and transformed. This repository contains both the official Prompt Decorators Specification and its complete Python reference implementation.

Documentation | Prompt Decorators Specification

🔍 Overview

What Are Prompt Decorators?

Prompt Decorators introduces a standardized annotation system inspired by software design patterns that allows users to modify LLM behavior through simple, composable "decorators." By prefixing prompts with annotations like +++Reasoning, +++StepByStep, or +++OutputFormat, users can consistently control how AI models process and respond to their requests across different platforms and implementations.
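
For illustration, a decorated prompt using this syntax might look like the following sketch (the decorator names are drawn from the examples above; the exact parameter names and values are defined by the specification and registry):

+++Reasoning(depth=comprehensive)
+++OutputFormat(format=markdown)
Explain how large language models handle long context windows.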

This project addresses the growing complexity of AI interactions by providing:

  1. The Specification: A formal standard that defines decorator syntax, behavior, and extension mechanisms
  2. The Python Implementation: A production-ready reference implementation with comprehensive tooling
  3. MCP Integration: A Model Context Protocol server that enables prompt decorator functionality in tools like Claude Desktop

Key Components

  • 📝 Specification: The formal Prompt Decorators Specification (v1.0) defining the standard
  • 🛠️ Core Framework: A Python implementation with registry-based decorator management
  • 🧩 140+ Decorators: A comprehensive library of pre-built decorators covering reasoning, formatting, and more
  • 🔌 MCP Server: Integration with the Model Context Protocol for use with desktop AI applications
  • 📚 Extensive Documentation: API references, guides, and examples for both users and developers

Background & Motivation

As Large Language Models become increasingly integrated into workflows across industries, the need for standardized, consistent ways to interact with these systems has become apparent. Current prompt engineering approaches are largely ad-hoc, requiring extensive documentation, reinvention, and significant cognitive overhead when switching between systems or use cases.

Prompt Decorators addresses this challenge by providing a systematic approach to modifying AI behavior through simple, composable annotations. Inspired by the Decorator pattern in programming and Python's function decorators, they serve as a layer of abstraction that decouples the core prompt from instructions about how to process and present the response.

Challenges in Prompt Engineering

Current prompt engineering suffers from several limitations:

  • Inconsistency: Instructions vary widely between users, platforms, and models
  • Verbosity: Detailed instructions consume token context that could be used for content
  • Cognitive Overhead: Users must remember or document specific prompting techniques
  • Lack of Composability: Combining different instruction paradigms is cumbersome
  • Undocumented Behavior: Expected model behavior is often implicit rather than explicit

Benefits of Prompt Decorators

Prompt Decorators solves key challenges in prompt engineering:

  • Inconsistency: Provides a standard syntax and behavior across different LLM platforms
  • Verbosity: Replaces lengthy instructions with concise annotations
  • Cognitive Overhead: Simplifies prompt crafting with reusable patterns
  • Lack of Composability: Enables clean combination of multiple instruction paradigms
  • Undocumented Behavior: Explicitly defines expected model responses

Whether you're crafting prompts for specific reasoning patterns, structuring outputs in particular formats, or ensuring consistent responses across different models, Prompt Decorators provides a systematic approach that makes prompt engineering more modular, reusable, and maintainable.

The Prompt Decorators framework addresses these challenges through:

  • Standardization: Common vocabulary and syntax across platforms and models
  • Efficiency: Concise annotations that reduce token consumption
  • Reusability: Consistent behaviors that can be reused across different contexts
  • Composability: Ability to combine decorators for complex interaction patterns
  • Explicit Behavior: Clear documentation of expected model responses
  • Reduced Cognitive Load: Simple annotations instead of lengthy instructions

Key Features

  • 📚 Registry-based decorator management: Centralized registry of decorators with metadata
  • ✅ Parameter validation and type checking: Robust validation of decorator parameters
  • 🔢 Decorator versioning: Support for semantic versioning of decorators
  • 🔄 Compatibility checking: Verification of decorator compatibility
  • 📝 Documentation generation: Automatic generation of documentation for decorators
  • 🧩 Dynamic loading: Runtime decorator loading from definition files
  • 🔍 Runtime decorator discovery: Dynamic discovery and registration of decorators
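
As a minimal sketch of how the registry-based management and runtime discovery listed above might be used from Python, assuming the package exposes a get_available_decorators() helper whose entries carry name and description fields (verify the exact names against the API reference):

import prompt_decorators as pd

# Populate the registry from the bundled JSON decorator definitions.
pd.load_decorator_definitions()

# Enumerate registered decorators. The helper and attribute names below are
# assumptions based on the documented registry API and may differ slightly.
for definition in pd.get_available_decorators():
    print(f"{definition.name}: {definition.description}")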

💡 Implementation Status

The Prompt Decorators project is currently in active development.

You can see how prompt decorators work by testing out the demo or by running the MCP server implementation together with Claude Desktop.

Alternatively, you can use the .cursorrules file in this repository as system instructions in Cursor (or ChatGPT/Claude). Try it out and share your experiences!

Implemented Functionality

  • ✅ Core Decorator Registry: Load decorators from standardized JSON definitions
  • ✅ Decorator Application: Apply decorators to prompts with parameter validation
  • ✅ Sophisticated Transformation: Convert decorator parameters into prompt adjustments
  • ✅ Multiple Input Formats: Support for Python functions, strings, and JSON
  • ✅ Parameter validation and type checking: Robust validation of decorator parameters
  • ✅ Standard Decorators: Implementation of the standard decorators defined in the specification
  • ✅ Extension Framework: Support for domain-specific decorator extensions
  • ✅ Documentation Generation: Automated documentation generation from decorator definitions

For a detailed breakdown of implementation status, see our Implementation Status document.
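
Building on the inline +++ syntax and the string input format listed above, decorators embedded directly in a prompt can also be expanded programmatically. A minimal sketch, assuming the package exposes an apply_dynamic_decorators() function for this purpose (verify the exact name against the installed version):

import prompt_decorators as pd

# Load decorator definitions into the registry.
pd.load_decorator_definitions()

# A prompt carrying its own decorator annotations in the +++ syntax.
prompt = """+++Reasoning(depth=comprehensive)
+++StepByStep
Explain the concept of prompt engineering."""

# apply_dynamic_decorators() is assumed here; it would expand the +++ annotations
# into the corresponding instructions and return the transformed prompt text.
print(pd.apply_dynamic_decorators(prompt))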

Roadmap

The roadmap for this project is outlined in the ROADMAP file.

🚀 Getting Started

Installation

You can install the package from PyPI (https://pypi.org/project/prompt-decorators/):

pip install prompt-decorators

For additional functionality, you can install optional dependencies:

# For Model Context Protocol (MCP) integration
pip install "prompt-decorators[mcp]"

# For development and testing
pip install "prompt-decorators[dev,test]"

# For documentation
pip install "prompt-decorators[docs]"

# For all optional dependencies
pip install "prompt-decorators[all]"

Basic Usage

import prompt_decorators as pd

# Load available decorators
pd.load_decorator_definitions()

# Create a decorator instance
reasoning = pd.create_decorator_instance("Reasoning", depth="comprehensive")

# Apply the decorator to a prompt
prompt = "Explain the concept of prompt engineering."
decorated_prompt = reasoning.apply(prompt)

print(decorated_prompt)
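
Decorators created this way can also be composed by applying them in sequence. A short sketch building on the example above (StepByStep is one of the standard decorators named earlier; it is assumed here that its parameters have usable defaults):

# Compose two decorators by applying them in sequence; `reasoning` and `prompt`
# are reused from the example above.
step_by_step = pd.create_decorator_instance("StepByStep")
composed_prompt = step_by_step.apply(reasoning.apply(prompt))
print(composed_prompt)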

For more detailed examples and usage instructions, please refer to the official documentation: https://synaptiai.github.io/prompt-decorators/

📝 License

This project is licensed under the Apache License, Version 2.0. See the LICENSE file for more information.

🤝 Contributing

Contributions are welcome! Please read the CONTRIBUTING file for guidelines on how to contribute to this project.

🤖 Acknowledgments

This project would not be possible without the contributions of the following individuals and organizations:

Last updated: 3/18/2025

Publisher info

Synapti.ai

A new kind of AI studio
