A complete reference project layering five sensors on top of the V5 Clawbot — distance, AI Vision AprilTags, GPS, potentiometer, and bump switch — all stitched together with EZ-Template. Built to be read, understood, and rewritten in your own words for your engineering notebook.
The Clawbot Training Platform guide gets the Clawbot driving with EZ-Template + IMU. The Sensor Roadmap tells you the order to add the rest. Each per-sensor guide — distance, switches/pot/GPS, AI Vision/AprilTags — covers one piece in depth.
This guide is the integration story. How do those sensors actually fit together in one project? Where do the function boundaries live? When a sensor disagrees with another sensor, who wins?
Built on a stock V5 Clawbot with one IMU added (per Clawbot Training), then the following sensors mounted:
| Port | Device | Purpose |
|---|---|---|
| 1 | Left drive motor | Reversed in chassis ctor (-1) |
| 10 | Right drive motor | |
| 8 | Arm motor | Red 100 RPM cartridge |
| 3 | Claw motor | Red 100 RPM cartridge |
| 11 | IMU | Inertial sensor — required by EZ-Template |
| 5 | Distance sensor | Front-facing, low on the chassis |
| 6 | GPS sensor | Rear, ~10.5″ off the floor |
| 2 | AI Vision sensor | Front, unobstructed forward view |
| 7 | Rotation sensor | Optional — on an odom tracking wheel |
| 'A' ADI | Potentiometer V2 | Arm angle sensor (top limit) |
| 'H' ADI | Bump switch | Arm bottom limit |
'A' here matches the V5 Brain's standard 3-wire ADI labelling. The Sensors-Discrete guide uses 'B' in its examples — the difference is just where you wire it on your robot. Pick one, document it, stick with it.

The project separates subsystems.hpp from clawbot.hpp, and builds drive_to_wall() with two independent exit conditions, a noise filter, and optional odometry pose-correction. The Drive object stays in EZ-Template's main.cpp. The split here matches the EZ-Template project structure but extracts mechanism + sensor logic into its own pair of files.

subsystems.hpp is the wiring map. Every #define for ports and every device declaration. Change a wire? Edit one line here, the whole project moves with you.
clawbot.hpp is the behaviour API: the functions the rest of the project calls — arm_control(), drive_to_wall(), sentry_acquire(). No port numbers in here.
This split lets you re-port the code to a different robot by editing only subsystems.hpp, without touching the behaviour logic. It's also the simplest example of separation of concerns a programming team can practice.
Every device declaration is one line. Every port is a named #define. The Drive object lives in main.cpp (because EZ-Template wants it there); everything else is here.
Why inline in a header? Without it, including the header from two different .cpp files would create two copies of each device, causing a linker error. The inline keyword tells the compiler "this is the same object across all translation units." Modern C++ (since C++17) supports this for variables — PROS uses C++20.

clawbot.hpp is just function prototypes — the verbs the autonomous routines and opcontrol() call. Implementations are in clawbot.cpp.
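The inline-variable rule can be demonstrated in a minimal sketch. ARM_PORT, CLAW_PORT, and arm_port() are illustrative names, not the project's actual header:

```cpp
// Minimal sketch of the C++17 inline-variable rule described above.
// One object is shared by every translation unit that includes this header.
inline int ARM_PORT = 8;   // without `inline`, a second .cpp including this
inline int CLAW_PORT = 3;  // would define a duplicate → linker error

int arm_port() { return ARM_PORT; }
```

The same mechanism lets subsystems.hpp hold real device objects rather than extern declarations plus a separate definition file.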
Notice every public function reads as a sentence: arm control, drive to wall, sentry search. No abbreviations, no internals leaking out. When a teammate reads this header, they instantly know what the project can do.
The Switches & Potentiometer guide covers each sensor individually. The pattern below combines them. Here's the failure mode of each sensor used alone:
Pot reads the angle — but if the pot wire pops loose mid-match it returns a stuck value, the code thinks the arm is fine, and the arm crushes into the bottom hard stop at full power.
Bump tells you when the arm hits the floor — but says nothing about the top. The arm keeps climbing past its safe range and snaps a chain or stalls the motor.
With both: the pot caps the top, the bump catches the bottom, and a stuck-pot reading still fails safe at the bottom because the bump switch is the final word.
This runs once per opcontrol() loop. Every motor command checks the relevant limit before applying power. Even a held trigger can't drive past either end.
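The gate can be modelled in plain C++ with no PROS calls. arm_power_allowed and its constants are illustrative (the pot values mirror the shipped placeholders), not the project's exact code:

```cpp
#include <algorithm>

// Dual-limit gate: positive power raises the arm, negative lowers it.
constexpr int ARM_POT_MAX = 2800;      // placeholder top limit (raw pot units)
constexpr int ARM_POT_SAFE_BAND = 30;  // stop short of the mechanical stop

int arm_power_allowed(int requested, int pot_raw, bool bump_pressed) {
    if (requested > 0 && pot_raw >= ARM_POT_MAX - ARM_POT_SAFE_BAND)
        return 0;                      // top limit: the pot caps upward travel
    if (requested < 0 && bump_pressed)
        return 0;                      // bottom limit: the bump switch wins
    return std::clamp(requested, -127, 127);  // V5 motor power range
}
```

Because the check runs every loop, even a held trigger produces zero power the instant either limit trips.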
ARM_POT_SAFE_BAND = 30 stops the arm 30 raw pot units before the absolute limit. Without this, the arm hits its mechanical hard stop slightly before the pot reading triggers, and the motor stalls against the stop until the loop catches up. The band gives the controller time to react.

Preset buttons call arm_move_to_pot() — a small P-controller that drives the arm to a target pot reading. It still respects the bump switch on the way down, so even a "go to MIN" command bails out if the switch fires first.
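One step of that P-controller can be sketched as follows — preset_step and the gain kP are assumptions for illustration; the real arm_move_to_pot() in clawbot.cpp applies the result to the arm motor inside a loop:

```cpp
// One P-controller step: power proportional to pot error, bump-switch safe.
int preset_step(int target_pot, int current_pot, bool bump_pressed) {
    const double kP = 0.25;                  // assumed gain — tune on the robot
    double power = kP * (target_pot - current_pot);
    if (power < 0 && bump_pressed) return 0; // "go to MIN" bails on the switch
    if (power > 127) power = 127;            // clamp to V5 power range
    if (power < -127) power = -127;
    return static_cast<int>(power);
}
```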
Defaults shipped in the project (ARM_POT_MIN = 1200, MID = 2000, MAX = 2800) are placeholders. Run the calibrate routine, manually move the arm to each useful position, write down the readings, plug them in:
- Lowest useful position → ARM_POT_MIN.
- Middle preset → ARM_POT_MID.
- Highest safe position → ARM_POT_MAX.
- Write the new constants into clawbot.cpp and re-upload.
- The limit helpers arm_at_top / arm_at_bottom use these constants. Check them before tuning.

Self-check: the pot wire pops loose and arm_pot.get_value() returns a stuck reading of 0. The driver presses L2 to lower the arm. What happens with the dual-limit pattern? The stuck get_value() == 0 looks like "above MIN", so the pot check doesn't save you — the arm keeps lowering until the bump switch fires and cuts power. The bottom still fails safe.

drive_to_wall()

Background reading: Distance Sensor Positioning, Distance Sensor Auton Correction, and Sensor-Based Autonomous. The function below combines patterns from all three.
Either sensor alone has a failure mode:
The distance sensor returns 9999 when it doesn't see anything. If the wall is at an odd angle, behind a polycarbonate panel, or just outside the sensor's 2 m range, the sensor never confirms and the loop runs forever.

Combining them: drive using EZ-Template's PID (which handles acceleration and heading-hold), poll the distance sensor each loop, exit on whichever condition fires first.
The Sensor-Based Autonomous guide warns: a single under-threshold reading can fire on a wire, an intake roller passing the beam, or a stray reflection. Require two consecutive confirmations before exiting. One bad reading resets the counter.
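The filter can be modelled as a pure function over a sequence of readings. wall_confirmed is a hypothetical name; the real loop increments a counter alongside the PID move:

```cpp
#include <vector>

// Two-consecutive-confirmations filter: exit only after two in a row under
// the threshold; any single over-threshold reading resets the counter.
bool wall_confirmed(const std::vector<int>& readings_mm, int threshold_mm) {
    int hits = 0;
    for (int mm : readings_mm) {
        hits = (mm < threshold_mm) ? hits + 1 : 0;  // one bad reading resets
        if (hits >= 2) return true;                 // two in a row → exit
    }
    return false;  // in the real function, the PID "arrived" exit still fires
}
```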
Once the loop exits and the robot is stopped, you know exactly where the wall is — so you know exactly where the robot is along that axis. Write the corrected coordinate back into EZ-Template's odometry. The next motion runs from a clean reference, with all the drift from earlier moves erased.
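The write-back arithmetic is one line. This sketch assumes a centre-origin field, the robot driving toward the +x wall at 72″, and a distance sensor mounted sensor_offset_in ahead of the robot's tracking centre — all assumptions to adapt to your robot:

```cpp
#include <cmath>

// Wall position minus (measured gap + sensor offset) = robot centre x.
double corrected_x_in(double wall_x_in, int dist_mm, double sensor_offset_in) {
    const double MM_TO_IN = 1.0 / 25.4;  // V5 distance sensor reports mm
    return wall_x_in - (dist_mm * MM_TO_IN + sensor_offset_in);
}
```

The result is what you feed back into EZ-Template's odometry for that axis; the other axis and the heading are left untouched.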
This example assumes a centre-origin field: the far wall at x = +72″, the right wall at y = +72″. If you start your auton at a corner with (0, 0) at that corner, the wall coordinates change accordingly. Whichever convention you pick, document it in your engineering notebook and stay consistent across every routine.

Two flavours of call: with and without pose reset.
A note on pid_drive_hold_exit_run() — that name doesn't exist in EZ-Template 3.2.x. The actual API is chassis.pid_targets_reset(), used above. Functional outcome is the same: the in-flight PID move aborts cleanly. (If you find the team guide using the older name, it's an aspirational example.)

Background: AI Vision & AprilTags. The sentry pattern below uses everything that guide describes — the diagonal-corner centre trick, the tag-width-as-distance fallback, and the multi-exit safety pattern.
Spin slowly in place. Each loop, ask the camera "do you see Tag #5?" If yes, stop and return success. If we've turned a full sweep without finding it, return failure.
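The sweep's exit logic reduces to a small decision function. Sentry and sentry_step are illustrative; the real sentry_acquire() turns the chassis and queries the camera each loop:

```cpp
// One decision step of the sweep, with a 360° budget for the full sweep.
enum class Sentry { Searching, Found, Failed };

Sentry sentry_step(bool tag_seen, double swept_deg) {
    if (tag_seen) return Sentry::Found;            // stop, report success
    if (swept_deg >= 360.0) return Sentry::Failed; // full sweep, give up
    return Sentry::Searching;                      // keep turning slowly
}
```

Returning a status instead of spinning forever is what lets the full-mission routine skip the grab step gracefully when the tag never appears.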
Once we've spotted the tag, drive toward it. Three things can stop us, and that redundancy is the point:
The tag-width fallback matters specifically for Override: tags sit on the goal bases, recessed inside the cup-shaped goals, where the distance sensor reads the goal frame, not the tag itself. If the distance sensor stops too early, the tag-width condition still pulls us all the way in.
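The three exits can be modelled as one predicate. The 200 px tag width is the guide's placeholder; the distance and timeout values here are assumptions to tune on your robot:

```cpp
// Multi-exit approach check: any one condition stops the drive.
bool approach_done(int dist_mm, double tag_width_px, int elapsed_ms) {
    const int STOP_DIST_MM = 120;           // assumed stop distance
    const double STOP_TAG_WIDTH_PX = 200.0; // placeholder threshold
    const int TIMEOUT_MS = 3000;            // assumed safety timeout
    return dist_mm < STOP_DIST_MM
        || tag_width_px > STOP_TAG_WIDTH_PX // fallback for recessed tags
        || elapsed_ms > TIMEOUT_MS;         // never drive forever
}
```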
The tag centre is computed as (x0+x2)/2 — just the diagonal pair. With a partly-occluded or motion-blurred tag, the four corners can return inconsistent values; the diagonal pair is the most stable centre estimate. Averaging all four sounds more robust but actually amplifies noise from the worst-detected corner.

Tuning STOP_TAG_WIDTH_PX: the 200 px threshold is a placeholder. Find the right value for your robot:
Park the robot at your desired stopping distance from a tag and print obj.object.tag.x1 - obj.object.tag.x0 in a debug loop. Use that reading as your threshold.

Remember to call aivision.enable_detection_types(pros::AivisionModeType::tags) at startup. Without this, get_all_objects() returns nothing tag-related and your sentry just spins forever. The project's aivision_instance() singleton handles this in code — see clawbot.cpp.

Background: V5 GPS Sensor Deep Dive covers the deadzone (~13.5″ from a wall), the field-code mechanics, and the Path A/B/C decision framework. The function below assumes you've already decided GPS is right for your team.
The GPS reports a per-frame RMS error in metres via gps.get_error(). A reading with 0.02 m error is reliable; a reading with 0.30 m error is the GPS struggling to localise from a partial view of the strip. Apply the GPS pose only if error is below your threshold.
GPS is a periodic correction layer, not a primary position source. Call it:
Don't call it inside a fast control loop. The V5 GPS updates at ~10 Hz; calling it every 10 ms wastes time without giving you new data.
This is where teams trip up. There are three coordinate spaces in play:
| System | Units | Origin |
|---|---|---|
| V5 GPS | Metres | Centre of field |
| EZ-Template odom | Inches | Wherever you call odom_xyt_set(0, 0, 0) |
| Field-coordinate convention | Inches | Your team's choice (centre, corner, etc.) |
The line chassis.odom_xyt_set(s.x * M_TO_IN, s.y * M_TO_IN, s.yaw) assumes your odom origin matches the GPS origin (centre of field). If you start odom at a corner, you need to add an offset.
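The conversion plus offset can be sketched as a hypothetical helper — gps_to_odom is not an EZ-Template or PROS API, just the arithmetic made explicit:

```cpp
#include <cmath>

struct Pose { double x_in, y_in, theta_deg; };

// GPS metres (centre origin) → odom inches, with optional offsets for a
// corner-origin odom convention. Yaw passes through unchanged.
Pose gps_to_odom(double gx_m, double gy_m, double yaw_deg,
                 double origin_x_in = 0.0, double origin_y_in = 0.0) {
    const double M_TO_IN = 39.3701;
    return { gx_m * M_TO_IN + origin_x_in,
             gy_m * M_TO_IN + origin_y_in,
             yaw_deg };
}
```

With a centre origin the offsets stay zero; with a corner origin at the field's corner, an offset of roughly +72″ on each axis shifts the GPS pose into your frame.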
Treat gps_localize() as a discrete reset at known-good moments. For continuous fusion, use a Kalman filter — well outside the scope of V5RC and almost never worth the complexity.

The full mission: localise. Drive to a wall and reset pose against it. Turn. Sentry-acquire a tagged object. Grab it. Lift. Drive home. Release.
Every step is here for a reason. Re-read the code with these annotations:
The auton selector registration in main.cpp looks like this:
Five autons cover the full progression: each individual sensor in isolation, then the integrated mission. Tune each one separately before running the mission — debugging a 7-step routine is much harder than debugging one drive-to-wall.
Wrap arm_control() with stall-detection so a wedged arm cuts power instead of cooking the motor.
Use PROS Tasks so the arm holds presets in the background while autonomous keeps driving the chassis.
Turn the arm into a clean finite state machine — IDLE / MOVING / HOLDING / FAULT — once the if-chain grows.
Add SD-card logging to record (time, pot, distance, x, y, theta) per loop. Graph after the run.
In auton_full_mission(), why is gps_localize() called before drive_to_wall(), even though both can correct the pose?