🧠

AI Prompt Engineering Platform

An interactive teaching tool that helps COLL 100 students develop prompt-engineering intuition: side-by-side A/B comparisons, structured rubric scoring, and a community library of high-performing prompts, all built through LLM-assisted development.

education · COLL 100 · vibe coding · prompt design
01
Prompt A/B Playground
Compare two prompts side-by-side and see scored responses in real time
promptlab.codelab.sh/playground
⚔️ A/B Compare
📚 Library
📊 Rubric
📈 My Progress
🎓 Lessons
A/B Prompt Comparison
Task: Explain recursion · ▶ Run Both
Prompt A
"Explain recursion to me."
AI Response:
Recursion is when a function calls itself. For example, to calculate factorial(n), you call factorial(n-1). The base case stops the recursion when n reaches 0 or 1…
Clarity: 5.5 · Depth: 4.0
Prompt B
"You are a CS professor. Explain recursion to a first-year student using a real-world analogy, then show a Python example with comments."
AI Response:
Think of recursion like Russian nesting dolls 🪆. Each doll contains a smaller version of itself until you reach the tiniest one — the base case. Here's how that looks in Python: def fact(n): ...
Clarity: 8.8 · Depth: 8.5
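The head-to-head scoring above can be sketched as a small comparison harness. This is a hypothetical illustration, not the platform's actual code: the `PromptResult` type and `compare` function are assumptions, and the per-criterion scores would come from whatever LLM-based grader the playground uses.

```python
from dataclasses import dataclass

# Rubric dimensions shown in the playground UI.
CRITERIA = ("clarity", "depth")

@dataclass
class PromptResult:
    """One side of an A/B comparison: the prompt text and its per-criterion scores."""
    prompt: str
    scores: dict  # criterion name -> 0..10 score from the grader

    @property
    def overall(self) -> float:
        # Unweighted mean across the rubric dimensions.
        return sum(self.scores[c] for c in CRITERIA) / len(CRITERIA)

def compare(a: PromptResult, b: PromptResult) -> str:
    """Return which variant won the head-to-head comparison."""
    if a.overall == b.overall:
        return "tie"
    return "A" if a.overall > b.overall else "B"

# Scores from the example above: the role-and-format prompt wins.
a = PromptResult("Explain recursion to me.", {"clarity": 5.5, "depth": 4.0})
b = PromptResult("You are a CS professor. Explain recursion ...", {"clarity": 8.8, "depth": 8.5})
```

With these numbers, variant B's overall score (8.65) clearly beats variant A's (4.75), which is the verdict the playground displays.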
02
Community Prompt Library
Browse, fork, and rate high-performing prompts from the class
promptlab.codelab.sh/library
Prompt Library · 48 entries
🔍 Search · Filter: All · + Submit Mine
Role + Format + Example · 9.2 / 10
"Act as a senior engineer. Explain [X] with: 1. Analogy 2. Code 3. Common mistake"
structure · role prompting · few-shot
Chain of Thought · 8.8 / 10
"Think step by step before answering. First list assumptions, then reason through each…"
CoT · reasoning · structured
Audience Targeting · 8.4 / 10
"Explain this concept as if I'm a 10-year-old who loves Minecraft, then as if I'm a PhD…"
audience · clarity
Constraint-Based · 8.1 / 10
"Explain X in exactly 3 bullet points. Each must start with a verb. No jargon."
constraints · concise
Showing 4 of 48 prompts · Load more
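A library like this can be modeled with a small data type plus tag-based filtering. Everything below is an assumed sketch: the `LibraryEntry` shape and `filter_by_tag` helper are illustrative, not the platform's real schema, and the entries mirror the four examples shown above.

```python
from dataclasses import dataclass, field

@dataclass
class LibraryEntry:
    """One community-submitted prompt with its rating and topic tags."""
    title: str
    prompt: str
    rating: float              # community rating out of 10
    tags: list = field(default_factory=list)

def filter_by_tag(entries, tag):
    """Return entries carrying the given tag, highest-rated first."""
    return sorted((e for e in entries if tag in e.tags),
                  key=lambda e: e.rating, reverse=True)

library = [
    LibraryEntry("Role + Format + Example",
                 "Act as a senior engineer. Explain [X] with: 1. Analogy 2. Code 3. Common mistake",
                 9.2, ["structure", "role prompting", "few-shot"]),
    LibraryEntry("Chain of Thought",
                 "Think step by step before answering. First list assumptions, then reason through each…",
                 8.8, ["CoT", "reasoning", "structured"]),
    LibraryEntry("Audience Targeting",
                 "Explain this concept as if I'm a 10-year-old who loves Minecraft, then as if I'm a PhD…",
                 8.4, ["audience", "clarity"]),
    LibraryEntry("Constraint-Based",
                 "Explain X in exactly 3 bullet points. Each must start with a verb. No jargon.",
                 8.1, ["constraints", "concise"]),
]
```

Because tags are stored as a list of exact strings, filtering on "structure" and "structured" returns different entries, which matches how tag chips usually behave in a UI like this.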
03
AI Rubric Evaluation
Structured criteria scoring with actionable improvement suggestions
promptlab.codelab.sh/rubric
Rubric Evaluation
Prompt: "Explain sorting..." · Re-evaluate
Evaluation Criteria
Specificity: Does the prompt define audience, format, and constraints?
Context: Is background or role information provided?
Output Format: Is the desired response format specified?
Clarity: Is the prompt unambiguous and direct?
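The four criteria above can be sketched as a tiny rubric evaluator. This is a toy stand-in for the platform's LLM-based grader: the keyword heuristics below are illustrative assumptions only, there to show the shape of a pass/suggest report, not how prompts are actually judged.

```python
# Each criterion pairs an improvement suggestion with a crude heuristic check.
# In the real tool an LLM judge would do this scoring; these lambdas are toys.
RUBRIC = {
    "specificity": ("define audience, format, and constraints",
                    lambda p: any(w in p.lower() for w in ("audience", "exactly", "student"))),
    "context": ("provide background or a role",
                lambda p: "you are" in p.lower() or "act as" in p.lower()),
    "output_format": ("specify the desired response format",
                      lambda p: any(w in p.lower() for w in ("bullet", "list", "example", "steps"))),
    "clarity": ("keep the prompt unambiguous and direct",
                lambda p: len(p.split()) >= 5),
}

def evaluate(prompt: str) -> dict:
    """Check a prompt against each criterion; collect passes and suggestions."""
    report = {"passed": [], "suggestions": []}
    for name, (advice, check) in RUBRIC.items():
        if check(prompt):
            report["passed"].append(name)
        else:
            report["suggestions"].append(f"{name}: {advice}")
    return report
```

A bare prompt like "Explain sorting." fails every check and gets four suggestions, while a role-plus-format prompt in the style of Variant B above passes all four, mirroring the score gap shown in the playground.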