From Specification to Prompt Structure

Slide Idea 

This slide explains how specifications become executable prompts through systematic structuring across three categories: what is being shown (subject, action, environment), how it is being shown (camera position, framing, movement), and what constrains the result (tone, realism, ethical limits, exclusions). The central claim is that prompts do not add creativity; they formalize and document decisions already made during the specification process.

Key Concepts & Definitions

Prompt Structure

Prompt structure refers to the organization and systematic arrangement of instructions to generative systems into distinct, semantically meaningful components that serve different functional roles in guiding system behavior. Effective prompt structures typically decompose complex requirements into categories such as role definition (what the system should act as), task specification (what to do), constraints (limitations on approach or output), format specification (how to structure output), and context provision (relevant background information). Research on prompt programming shows that structured prompts with explicitly delineated components consistently outperform unstructured narrative prompts because structure reduces ambiguity, enables systematic modification of individual components, and provides clearer guidance about how different instructions relate to overall task goals. The slide exemplifies a three-category structure addressing orthogonal dimensions of visual generation requirements, organizing specifications into logically distinct groups rather than presenting them as undifferentiated text.
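The component decomposition described above can be sketched as a small data structure. This is an illustrative example only, assuming generic component names (role, task, constraints, format, context); it is not a standard API.

```python
# A minimal sketch of a structured prompt: each component has a distinct
# functional role, so any one can be modified without touching the others.
# Component names and example values are illustrative assumptions.
from dataclasses import dataclass


@dataclass
class StructuredPrompt:
    role: str           # what the system should act as
    task: str           # what to do
    constraints: str    # limitations on approach or output
    output_format: str  # how to structure the output
    context: str        # relevant background information

    def render(self) -> str:
        # Explicitly labeled components reduce ambiguity compared with
        # a single undifferentiated narrative paragraph.
        return "\n".join([
            f"Role: {self.role}",
            f"Task: {self.task}",
            f"Constraints: {self.constraints}",
            f"Format: {self.output_format}",
            f"Context: {self.context}",
        ])


prompt = StructuredPrompt(
    role="documentary cinematographer",
    task="generate a single establishing shot of a harbor at dawn",
    constraints="no identifiable people; muted, realistic color grading",
    output_format="one image, 16:9 aspect ratio",
    context="opening scene of a short film about coastal fishing",
)
print(prompt.render())
```

Because each component is a separate field, swapping the constraints while holding the task constant is a one-line change, which is what makes systematic modification of individual components possible.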

Source: White, J., et al. (2023). A prompt pattern catalog to enhance prompt engineering with ChatGPT. arXiv preprint.

Decomposition of Creative Decisions

Decomposition of creative decisions is the analytical practice of breaking down complex creative tasks into constituent choice categories, explicitly identifying which independent decisions must be made across different dimensions of the work. This decomposition reveals that seemingly unified creative outputs result from many discrete decisions—subject matter, style, composition, technical approach, and ethical considerations—each of which may vary independently. Making this structure explicit transforms vague creative goals into structured, answerable questions: not “what should this image be?” but “what subject?”, “what action?”, “what environment?”, “what camera approach?”, and “what constraints?”. Research on chain-of-thought prompting shows that explicitly decomposing complex problems into intermediate reasoning steps improves language model performance on multi-step tasks by allocating computational resources sequentially rather than attempting to resolve all dimensions simultaneously. The same principle applies to creative specification: categorical decomposition enables systematic reasoning about each dimension.
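One way to make this decomposition operational is to record decisions per category and list the sub-questions that remain unanswered. The sketch below assumes the slide's what/how/constraints categories; the dimension names and example answers are illustrative.

```python
# A sketch of decomposing a creative task into categorical sub-questions.
# Categories mirror the slide's what/how/constraints structure; the
# specific dimension names and example answers are assumptions.
DIMENSIONS = {
    "what": ("subject", "action", "environment"),
    "how": ("camera position", "framing", "movement"),
    "constraints": ("tone", "realism", "ethical limits", "exclusions"),
}


def undecided(decisions: dict) -> list:
    """Return every dimension for which no explicit decision was recorded."""
    return [
        f"{category}: {dim}?"
        for category, dims in DIMENSIONS.items()
        for dim in dims
        if not decisions.get(category, {}).get(dim)
    ]


# A partially specified task: the 'how' category has not been decided yet.
draft = {
    "what": {
        "subject": "street musician",
        "action": "playing violin",
        "environment": "rainy city square at night",
    },
    "constraints": {
        "tone": "melancholic",
        "realism": "photorealistic",
        "ethical limits": "no identifiable bystanders",
        "exclusions": "no logos or text",
    },
}
for question in undecided(draft):
    print(question)  # surfaces the unanswered camera questions
```

Running this on the partial draft prints the three open "how" questions, turning abstract uncertainty ("what should this image be?") into a concrete to-do list of discrete decisions.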

Source: Wei, J., et al. (2022). Chain-of-thought prompting elicits reasoning in large language models. In Advances in Neural Information Processing Systems (Vol. 35, pp. 24824-24837). 

Prompt Programming

Prompt programming treats prompts as programmatic artifacts—structured, reusable, and testable instructions analogous to software code—rather than as casual natural-language requests. This paradigm emphasizes engineering discipline: modularization (separating components with distinct functions), abstraction (identifying reusable patterns), version control (tracking changes and their effects), testing (evaluating outputs against success criteria), and documentation (describing purpose, structure, and usage). Research on prompt programming demonstrates that treating prompts as programs enables systematic development: components can be modified independently, prompt libraries can be shared, and performance can be measured empirically rather than intuitively. The slide’s structured categorization exemplifies this programmatic thinking: each category (subject/action/environment, camera/framing/movement, tone/realism/ethics/exclusions) functions as a distinct module that can be varied independently while others remain constant.
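The modular variation described above can be sketched as follows. This is a minimal illustration under assumed names (build_prompt, camera_variants), not a method from the cited literature.

```python
# A sketch of prompt-as-program modularity: each category is a separate
# module, so one can be varied while the others remain constant.
# Function and variable names are illustrative, not from any library.

def build_prompt(what: str, how: str, constraints: str) -> str:
    """Assemble independent modules into a single prompt string."""
    return f"{what} {how} {constraints}"


WHAT = "A fishing boat leaving a harbor at dawn."
CONSTRAINTS = "Muted realistic colors; no identifiable people; no text overlays."

# Varying only the camera module yields a controlled family of prompts
# whose outputs can be compared empirically rather than intuitively.
camera_variants = [
    "Static wide shot from the pier.",
    "Slow aerial tracking shot from above.",
    "Handheld close-up from the deck.",
]

prompts = [build_prompt(WHAT, how, CONSTRAINTS) for how in camera_variants]
for p in prompts:
    print(p)
```

Holding the subject and constraint modules fixed while sweeping one module is the prompt analogue of changing one variable in a controlled experiment, and it is what makes performance measurable component by component.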

Source: Reynolds, L., & McDonell, K. (2021). Prompt programming for large language models: Beyond the few-shot paradigm. In Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems (pp. 1–7).

Decisional Transparency

Decisional transparency refers to making reasoning processes, choice criteria, and decision rationales explicit and inspectable rather than implicit or opaque. In creative and technical work, this involves documenting not only what was chosen but why it was chosen, what alternatives were considered, and which constraints applied. Structured prompts embody decisional transparency by organizing specifications into explicit categories that reveal which dimensions were considered, what decisions were made about each, and what was deliberately constrained. This transparency supports evaluation of decision quality, facilitates learning from decision patterns, enables collaboration by making reasoning visible, and creates accountability for choices rather than attributing outcomes to opaque processes. The slide’s structure makes creative decisions legible: it shows what was specified about subject, camera work, and constraints, and allows inference about the reasoning that shaped those specifications.

Source: Diakopoulos, N. (2016). Accountability in algorithmic decision making. Communications of the ACM, 59(2), 56–62.

Documentation vs. Generation Distinction

The distinction between documentation (recording decisions already made) and generation (creating new decisions) addresses a common misunderstanding about prompts in AI-supported creative workflows. As the slide states, “The prompt does not add creativity—it records decisions already made.” Prompts function as documentation artifacts that formalize and communicate specifications determined during prior creative thinking. They do not themselves perform the creative work of deciding what to make or how to approach it. This distinction locates creative agency in the specification process rather than in prompt writing. Students sometimes conflate prompt engineering skill with creativity, assuming that clever prompting substitutes for creative vision. Effective prompts, however, depend entirely on having clear creative intent to document; sophisticated prompting techniques cannot compensate for unclear decisions.

Source: Shneiderman, B. (2020). Human-centered artificial intelligence: Reliable, safe & trustworthy. International Journal of Human–Computer Interaction, 36(6), 495–504. 

Why This Matters for Students' Work

Understanding prompts as structured documentation of decisions reframes how students approach work with generative AI systems and clarifies the role specification plays in complex creative and technical tasks.

Students often struggle with prompt writing not because of missing technical skills, but because underlying creative decisions have not been articulated. When prompts are vague or ineffective, the issue is typically incomplete specification rather than poor structure. The slide’s three-category decomposition (what/how/constraints) makes explicit that effective prompts depend on prior decision-making: subject matter, technical approach, and limitations must be determined before they can be documented. Prompts written without these decisions remain underspecified and delegate creative choices to system defaults rather than expressing intentional choices.

Prompt structure also highlights that different categories of decisions require independent specification. Students may thoroughly specify one dimension while neglecting others, such as describing subject matter in detail but leaving technical approach undefined, or articulating stylistic constraints without clarifying content. The slide demonstrates systematic coverage: content decisions (“what is being shown”), technical and aesthetic decisions (“how it is being shown”), and boundary decisions (“what constrains the result”). Each category captures a distinct type of choice that cannot be substituted by another.

Decomposition of creative decisions supports analytical skills transferable across disciplines. Holistic creative tasks can feel overwhelming when considered as a single problem. Breaking them into categorical sub-questions transforms abstract uncertainty into manageable decisions. Rather than confronting “what should this be?”, discrete questions emerge: subject, action, environment, perspective, tone, and exclusions. This analytical approach applies beyond AI prompting to design briefs, research planning, writing outlines, and project proposals.

The documentation-versus-generation distinction clarifies how creative agency is assessed. Structured prompts provide evidence of decision-making by recording what was specified about subject, approach, and constraints. Vague prompts provide little such evidence and obscure whether intentional choices were made. From pedagogical perspectives emphasizing reasoning and intentionality, structured prompts make thinking visible in ways that unstructured prompts do not.

Prompt programming as an engineering discipline further suggests that effective AI use involves learnable, improvable practices rather than intuition or chance. Understanding prompts as artifacts that can be documented, versioned, and evaluated supports systematic improvement and professional accountability.

How This Shows Up in Practice (Non-Tool-Specific)

Filmmaking and Media Production

The slide’s three-category structure parallels professional film production documentation. Production specifications organize requirements across what appears on screen (subject matter, action, environment), how it is captured (camera work, lighting, movement), and what governs approach (creative and ethical constraints). Shot lists document decisions already made during planning, specifying subject, action, camera setup, lighting, and constraints so that production can proceed efficiently without continuous clarification.

Pre-visualization artifacts such as storyboards and animatics similarly document decisions about composition, timing, and staging reached earlier in planning. Documentary filmmaking further illustrates constraint documentation through ethical guidelines that specify exclusions and standards governing production practice.

Design

Interface design specifications use analogous categorical organization: components (what elements exist), implementation details (how they are constructed), and usage guidelines (constraints governing use). Design briefs and critique processes rely on similar structuring to ensure that feedback and implementation address distinct dimensions of success rather than undifferentiated preferences.

Writing

Professional writing assignments specify content, structure, and constraints separately. Academic paper formats similarly document decisions made during research and planning by organizing them into standardized sections. Editorial style guides and content templates function as constraint documentation that governs presentation and ethical standards across publications.

Computing and Engineering

Software and systems documentation relies on explicit categorical decomposition: functional requirements, non-functional requirements, interfaces, and constraints. Architecture decision records, API documentation, and code review frameworks all use structured categories to make reasoning transparent, evaluable, and reusable.

Common Misunderstandings

"Well-structured prompts eliminate the need for creative decision-making"

This view reverses the relationship between structure and creativity. Prompt structure does not replace creative decisions; it organizes and documents them. All substantive creative choices—subject matter, approach, and constraints—must still be made independently of structure. Structure provides a framework for recording these choices systematically, and often reveals where decisions remain incomplete.

"The categories shown (what/how/constraints) universally apply to all creative and technical tasks"

The slide’s three categories are appropriate for visual generation tasks but are not universal templates. Other domains require different categorical organizations. The transferable principle is systematic decomposition into orthogonal dimensions, not the specific categories themselves.

"More structural categories always improve prompts"

Excessive categorization can increase cognitive load and fragment coherence. Effective structure balances completeness with manageability, identifying the minimum number of categories needed to capture relevant decisions without unnecessary subdivision. The slide’s three-category structure represents a manageable balance for the task shown.

"Converting specifications to structured prompts is purely mechanical translation"

Prompt structuring involves analytical judgment about which dimensions are independent, how granular each should be, and how information is best organized. Structural choices affect clarity, modifiability, and reuse. The slide presents one defensible organization, but other structures may be equally valid depending on task and context.

Scholarly Foundations 

Wei, J., Wang, X., Schuurmans, D., Bosma, M., Ichter, B., Xia, F., Chi, E., Le, Q., & Zhou, D. (2022). Chain-of-thought prompting elicits reasoning in large language models. In Advances in Neural Information Processing Systems (Vol. 35, pp. 24824–24837).

Reynolds, L., & McDonell, K. (2021). Prompt programming for large language models: Beyond the few-shot paradigm. In Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems (pp. 1–7).

White, J., Fu, Q., Hays, S., Sandborn, M., Olea, C., Gilbert, H., Elnashar, A., Spencer-Smith, J., & Schmidt, D. C. (2023). A prompt pattern catalog to enhance prompt engineering with ChatGPT. arXiv preprint.

Diakopoulos, N. (2016). Accountability in algorithmic decision making. Communications of the ACM, 59(2), 56–62.

Shneiderman, B. (2020). Human-centered artificial intelligence: Reliable, safe & trustworthy. International Journal of Human–Computer Interaction, 36(6), 495–504.

Fiannaca, A., Kulkarni, C., Vasserman, L., & Huang, T.-H. (2023). Programming without a programming language: Challenges and opportunities for designing developer tools for prompt programmers. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems (pp. 1–16).

Sweller, J., Ayres, P., & Kalyuga, S. (2011). Cognitive load theory. Springer.

Beurer-Kellner, L., Fischer, M., & Vechev, M. (2022). Prompting is programming: A query language for large language models. arXiv preprint.

Boundaries of the Claim

The slide presents a three-category structure as one systematic method for organizing specifications into executable prompts. It does not claim universal applicability, optimality, or exclusivity. The distinction between creativity and documentation clarifies roles rather than denying that articulation can surface incomplete decisions. Research cited establishes that structure supports reasoning and performance, not that any particular structure guarantees success.

Reflection / Reasoning Check 

1. Consider a complex creative task you might undertake (designing an interface, writing an article, creating a video, planning a system). Before thinking about how to prompt an AI system, try to decompose the creative decisions involved into 3–5 categorical dimensions similar to the slide’s what/how/constraints structure. For each dimension, identify what types of decisions belong there and how they differ from other dimensions. Examine whether decisions in one category can be made independently of others or whether they are interdependent. Compare this decomposition to what would need to be specified in a prompt.

2. The slide states, “The prompt does not add creativity—it records decisions already made.” Reflect on experiences where attempting to write clear specifications revealed undecided or implicit assumptions. Consider why the act of structuring specifications can feel creatively productive even though it does not generate creativity itself. Examine the difference between vague intuitions and explicit decisions, and how structured specification can surface and resolve that difference.
