📝 Competition · Notebook · Beginner

Engineering Notebook Pathway

Four guides, one system. Where to start on kickoff day, when to write each entry type, what Emerging vs Expert actually means, and what Design Award judges look for across 5–8 minutes of review.

1. Start Here
2. Season Timeline
3. Rubric Deep Dive
4. Student Ownership
5. Award Prep
// Section 01
The Notebook Pathway — Where to Start
Four guides, one pathway. Which guide to open on kickoff day, which to open the night before competition, and how they connect.
🎯
Start here if you have never set up a notebook before. This pathway walks you through every resource on this site in the order that makes sense — from kickoff day to pre-competition audit.

The Four Resources and What Each Does

📝
Getting Started with the Engineering Notebook
What to write, when to write it, and your first three entries. Includes a kickoff day plan, the 5-minute session entry format, a first-two-weeks schedule, and a pre-competition audit checklist.
Start Here — Kickoff Day Setup →
📄
How to Use the Notebook Template
The blank Google Slides template structure. Six EDP colors, every slide type with placeholder prompts, the iteration divider format, and the multi-team program setup guide.
Build the Template →
📝
Engineering Notebook & Design Process
The deep-dive guide: rubric levels (Emerging → Expert), testing protocols, decision matrices, STEM connections, and interview prep. The reference you come back to all season.
Master the Rubric →
🏆
Mission Control — Judge & Notebook Prep
Interactive audit tool, interview practice, PID log, and notebook entry builder. Run the audit 7 days before every competition.
Pre-Competition Audit →

Which Guide to Use First

If it’s kickoff day

Open Getting Started first. Complete the first three entries before doing anything else. The template can wait until Day 2.

If you’re a coach setting up

Open Template Guide first. Build the master, duplicate for each team, then run kickoff. Students start in Getting Started.

If competition is in 2 weeks

Open Mission Control → Notebook Audit tab. Fix the red items. Then open Engineering Notebook → Interview Prep.

If you want Design Award

Open Engineering Notebook and read the rubric table. Your goal is Expert on all six criteria. Then read Judge Interview Playbook.

// Section 02
When to Write What
A phase-by-phase schedule showing when each entry type gets written and which site resources support each phase.
📅
The notebook is written in parallel with the robot — not after it. This schedule shows when each type of entry gets written and what site resources support each phase.

Kickoff Day

K
First Three Entries
All Team Members
Write: (1) Team roster with roles and ownership statements. (2) Season goals — measurable targets for robot performance and notebook quality. (3) Game analysis — scoring breakdown, priority elements, criteria and constraints.
Kickoff Day Guide →

Weeks 1–4 (Build Phase)

1
Identify the Problem
Strategist leads
Game overview, scoring analysis, field specs, criteria and constraints, team and robot goals. These slides use the green EDP color.
Game Analysis — Push Back →
2
Brainstorm & Research
All members contribute
Concept sketches (minimum 3), research notes with sources cited, subsystem research for each major mechanism. Gold slides.
Decision Matrix Guide →
3
Select Best Solution
Engineer leads, Strategist documents
Data comparison, decision matrix with weighted criteria, preliminary design selection with written conclusion. Purple slides.
Select Best Solution →
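The weighted scoring behind a decision matrix can be sketched in a few lines of Python. The criteria, weights, and 1–5 scores below are hypothetical placeholders; your team agrees on its own weights before scoring.

```python
# Sketch of a weighted decision matrix. All criteria, weights, and
# 1-5 scores here are hypothetical examples, not recommendations.
criteria = {"torque": 0.4, "speed": 0.3, "build_time": 0.2, "weight": 0.1}

designs = {
    "four_bar": {"torque": 5, "speed": 3, "build_time": 4, "weight": 3},
    "dr4b":     {"torque": 4, "speed": 4, "build_time": 2, "weight": 2},
    "scissor":  {"torque": 3, "speed": 2, "build_time": 3, "weight": 4},
}

def weighted_score(scores: dict) -> float:
    """Multiply each 1-5 score by its criterion weight and sum."""
    return sum(criteria[c] * scores[c] for c in criteria)

# Rank designs from highest weighted score to lowest.
ranked = sorted(designs, key=lambda d: weighted_score(designs[d]), reverse=True)
for name in ranked:
    print(f"{name}: {weighted_score(designs[name]):.2f}")
```

The written conclusion in the notebook should then explain why the top-ranked design does (or does not) match the team's intuition.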

Weeks 5–12 (Build & Iterate)

4
Build & Program Entries
Engineer — one entry per significant change
Build log (date, members, what changed, why, before/after photo), CAD drawings, programming log with constants. Orange slides. One entry per change — not one entry per week.
Build Entry Format →
5
Test & Evaluate
Engineer + Strategist
Hypothesis, test procedure, data table (n≥5 trials), conclusions. Link back to the decision that triggered this test. Cyan slides.
Testing System Guide →
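The numbers in a test entry come together like this minimal sketch, which uses a hypothetical intake-timing test; the benchmark value and trial times are made up for illustration.

```python
# Sketch of a test-log summary: a benchmark set BEFORE testing,
# n >= 5 trials, and a conclusion driven by the data.
from statistics import mean, stdev

benchmark_s = 2.0                      # target: intake a ball in <= 2.0 s
trials_s = [1.8, 2.1, 1.9, 2.3, 1.7]   # n = 5 measured times, in seconds

avg = mean(trials_s)
spread = stdev(trials_s)
passed = avg <= benchmark_s

print(f"mean = {avg:.2f} s, stdev = {spread:.2f} s, "
      f"benchmark {'met' if passed else 'missed'} ({benchmark_s} s)")
```

The notebook entry records all three pieces: the raw trial table, the summary numbers, and the next action the result triggers.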

After Every Competition

6
Tournament Reflection
All team members
Match record, ranking, what worked, what failed (with specific data), one change for next event. Write this the night of the competition. Red slides.
Competition Entry Format →
7
Start EDP Cycle 2
Strategist starts, all build
Re-identify the problem based on competition data. The iteration divider slide signals the new EDP loop. This is what separates Expert notebooks from Proficient ones.
Iteration Divider Guide →

One Week Before Every Competition

A
Pre-Competition Audit
Strategist + Coach
Run the notebook audit in Mission Control or the Template Guide. Fix every red item. Every entry needs Written By, Witnessed By, and Date. TOC must be current.
Run the Audit →
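The audit's missing-field check can be sketched as a short script. The entry dicts and field names (`written_by`, `witnessed_by`, `date`) are assumptions for illustration; the real audit lives in Mission Control.

```python
# Sketch of the pre-competition red-item check: every entry needs
# Written By, Witnessed By, and Date. Field names are hypothetical.
REQUIRED = ("written_by", "witnessed_by", "date")

entries = [
    {"title": "Intake test", "written_by": "Ana",
     "witnessed_by": "Coach", "date": "2025-10-04"},
    {"title": "Lift rebuild", "written_by": "Raj",
     "date": "2025-10-06"},  # missing witness -> red item
]

def audit(entries: list) -> list:
    """Return one red-item message per entry with a missing field."""
    problems = []
    for e in entries:
        missing = [f for f in REQUIRED if not e.get(f)]
        if missing:
            problems.append(f"{e['title']}: missing {', '.join(missing)}")
    return problems

for problem in audit(entries):
    print("RED:", problem)
```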
// Section 03
Emerging, Proficient, and Expert
The six rubric criteria with all three scoring levels spelled out. Know what Expert requires before you start, not after.
📊
The rubric has six criteria. Judges score each one at three levels. Your goal is Expert on all six. Here is exactly what Expert requires for each.

The Six Rubric Criteria — Emerging to Expert

1. IDENTIFY THE PROBLEM
Emerging: Problem is mentioned but vague. No constraints listed.
Proficient: Problem clearly stated with goals and some constraints.
Expert: Thorough description. Constraints are specific and measurable. Scoring analysis cited.
2. BRAINSTORM SOLUTIONS
Emerging: One or two ideas listed without explanation or sketches.
Proficient: Three or more labeled sketches with descriptions.
Expert: Multiple detailed diagrams. Pros/cons per option. Research citations included.
3. SELECT BEST SOLUTION
Emerging: Choice made without explanation. No alternatives compared.
Proficient: Choice explained with reasoning. Alternatives mentioned.
Expert: Decision matrix with weighted criteria. Written conclusion explaining why the data supports the choice.
4. BUILD & PROGRAM
Emerging: Build notes or code exist but are not linked to design decisions.
Proficient: Steps recorded. Code changes noted alongside build.
Expert: Detailed build log with photos. Code shown alongside design intent. Every change has a reason.
5. TEST & EVALUATE
Emerging: Testing mentioned but results not recorded.
Proficient: Tests performed and results noted with some data.
Expert: Original testing. Data tables with n≥5 trials. Benchmark targets set before testing. Conclusions drive the next action.
6. ITERATE (EDP CYCLES)
Emerging: Only one design cycle shown across the season.
Proficient: Two or more cycles visible with some continuity.
Expert: Multiple full cycles. Each cycle explicitly linked to data from the previous one. V1 → V2 comparison shows measurable improvement.
🔭
“Fully Developed” = scoring Emerging or higher on the first four criteria. Meeting that threshold is what gets your notebook scored at all. Below it, judges set the notebook aside. Above it, every criterion is ranked and compared against other teams.
RECF
The Rubric Is a Sorting Tool, Not a Test
Judges use the rubric to rank notebooks quantitatively first, then apply qualitative judgment for final award decisions. A notebook that scores Expert on every criterion is not automatically the Design Award winner — but it is guaranteed to be in the final deliberation. A notebook that scores Emerging on most criteria will not make the cut, regardless of how good the robot was.
// Section 04
Student Ownership and EN4
What student-centered means in practice, what mentors can and cannot do, and the RECF EN4 rule on AI-generated content.
⚠️
RECF EN4 is explicit: using AI tools to generate, organize, enhance, or alter notebook content violates the Student-Centered Policy. This includes using AI to draft entry text, improve writing quality, suggest what to write, or fill in placeholder prompts. The template provides structure. Students provide everything else.

What “Student-Centered” Means in Practice

A student-centered notebook has one test: can every team member explain every entry they wrote, in detail, to a judge who asks follow-up questions? If yes, the notebook is student-centered. If no, it is not — regardless of how well-organized it looks.

Signs a student owns an entry:
  • They can say “I wrote that entry the night after we tested the intake and the numbers surprised us.”
  • They can point to the page, explain the test conditions, and describe what changed in response to the data.
Red flags judges watch for:
  • A student reads from the notebook but cannot answer “why did you choose that test protocol?”
  • All entries were written in one session after the season ends. Judges use version history to verify chronological writing, and batch writing is visible there.
  • The writing style does not match the student’s vocabulary, grade level, or demonstrated knowledge in the interview.

The Originality Check — 3 Questions Before Every Submission

  1. Could I explain this entry to a judge who asks three follow-up questions? If not, rewrite it in your own words until you can.
  2. Does this entry describe something that actually happened, in the order it happened? A good entry reads like a lab notebook, not like a report written after the fact.
  3. Are there at least two different writing styles visible across all entries? On a 3-person team, judges expect three voices. Identical phrasing across all entries is a flag.

What Mentors Can and Cannot Do

✅ Mentors can
  • Set up the template structure
  • Explain what the rubric criteria mean
  • Ask students questions about their work
  • Review entries and point out missing elements
  • Show examples of strong vs weak entries
  • Set up version history monitoring
❌ Mentors cannot
  • Write entries or rewrite student text
  • Tell students exactly what to write
  • Use AI to draft or improve entries
  • Fill in decision matrix scores for students
  • Edit entries after submission
  • Reconstruct entries retroactively
RECF EN4: “The use of artificial intelligence / large language model (AI/LLM) programs or tools to generate, organize, enhance, or alter Engineering Notebook content or programming code is contrary to the RECF Student-Centered Policy.” This is not a gray area. If AI wrote it, it is a violation.
// Section 05
What Judges Actually Look For
What judges see in 5–8 minutes, the 5 things they skip, and what separates Design Award notebooks from the rest.
🏆
The notebook is judged on the same criteria as the interview. A team that knows their notebook cold — can point to any entry, explain what happened, and describe what they did next — will win more judge interviews than a team with a beautiful notebook they barely remember writing.

How Judges Evaluate Notebooks

Judges typically have 5–8 minutes per notebook. They are not reading every word; they scan for evidence against each rubric criterion and skip anything that slows that scan down.

The 5 Things Judges Skip

Decoration without content. Themed slide backgrounds, icons, and custom fonts do not score rubric points. Substance scores. Judges skip visually busy slides that say nothing.
Entries without dates. An undated entry is evidence-free. Judges cannot tell if it was written the day it happened or the week before competition.
Summaries of decisions without showing the process. “We chose a four-bar lift” is not evidence. “We compared three lift designs using a decision matrix — here are the scores and here is why we weighted torque most heavily” is evidence.
Test logs with no data. “We tested the intake and it worked” is not a test log. It is a one-sentence absence of evidence.
One author across 80 slides. If every entry shows the same “Written By” name, judges assume only one person understands the robot. They will probe the others in the interview.

What Separates Design Award Winners

Across the rubric criteria, Design Award notebooks consistently show Expert-level evidence: multiple EDP cycles linked to competition data, a decision matrix behind every major design choice, and test logs with real numbers rather than summaries.

📝
The interview and the notebook tell the same story. When a judge asks “why did you choose that intake design,” the answer should match what is on page 18 of the notebook. Practice the interview with the notebook open. Point to the evidence as you speak.

Related Guides for Award Prep

🎤
Judge Interview Playbook
How to structure the 10-15 minute interview, split speaking roles, use the notebook as evidence, and answer every rubric category with specific data.
Interview Prep →
🎤
Interview Skills Lab
Communication drills: STAR framework, active listening, hand-off practice, video self-review. The practice reps before the real interview.
Practice Interview Skills →
🏆
Mission Control — Judge Prep Tool
Interactive mock interview questions, notebook audit, and PID constants log. Run the audit tab one week before every competition.
Run the Audit →
Related Guides
📝 Getting Started with the Notebook → 📄 Notebook Template Guide → 📝 Engineering Notebook & Design Process → 🏆 Mission Control — Notebook Audit → 🎤 Judge Interview Playbook → 📅 Season Timeline →