🔧 Hardware · Beginner → Intermediate

Add a Sensor: From Wire to Working

The same five steps work for every V5 sensor. Once you do it for one, you can do it for any — bump, pot, distance, vision, GPS, optical. This guide is the universal scaffold between "I have a working chassis" and the per-sensor deep dives.

// Section 01
The 5-Step Workflow
Every sensor on the V5 follows the same path from physical hardware to working code. Memorize the sequence once and you never have to remember the details for any specific sensor — you just follow the steps.

Every per-sensor guide on this site — switches & pot, distance, AI Vision, optical, GPS — goes deep on what one specific sensor does. None of them teach the meta-pattern: the same five steps you go through every single time you add any sensor.

This is that pattern. Learn it once, apply it forever.

1. Pick a Port. Smart port for cameras, distance, GPS, rotation. ADI port for switches, pot, line trackers. Match the sensor type to the right port type.

2. Declare the Device. One line in subsystems.hpp. Use a named #define for the port number so it's easy to change later.

3. Verify Live. Print the value to the controller or brain screen. Wiggle the sensor by hand. Confirm the number changes the way you expect.

4. Calibrate Constants. Find your real-world threshold values by reading the sensor at known positions. Save them as named constants.

5. Use in Code. Read the value, compare against your constants, drive an output. The basic if-then pattern that turns sensor data into robot behaviour.

Why This Order Matters

Each step verifies the previous one. Skip any of them and you'll spend hours debugging the wrong layer:

🔥
The classic "skip step 3" disaster. A team adds a distance sensor, writes a 200-line autonomous routine, runs it, robot crashes into a wall. Two hours of debugging the auton later, they finally print the sensor value — it was wired into the wrong port and reading 9999 the entire time. Step 3 would have caught this in 30 seconds.

The five-step rhythm protects you from this. You don't move to the next step until the previous step works. Sensor reads garbage on the brain screen? Don't write any code yet — fix the wiring or the port assignment first.

📝
RECF EN4 reminder. The code on this page is reference material. Your engineering notebook should describe these steps in your own words, with screenshots from your robot. Don't copy-paste these snippets into your notebook or competition code.

What You Should Have Before This

You need three things: a drivable chassis, a PROS project that builds and uploads, and a controller paired to the brain. If you have all three, you're ready. The worked example at the end of this guide uses a bump switch because it's the simplest sensor to verify (binary input, no calibration needed) — but the same workflow scales up to anything.

// Section 02
Step 1 — Pick a Port
The V5 Brain has two kinds of ports. Pick wrong and the sensor doesn't talk to the brain at all. Pick right and you're done with this step in 30 seconds.
◎ Step 1 of 5

Smart Ports vs ADI Ports

Smart Ports (1–21)

Round 4-pin plug. Carries digital data. For sensors that have onboard processing — cameras, distance, GPS, rotation, optical, IMU. Also where motors plug in.

ADI Ports (A–H)

Rectangular 3-wire plug. Analog or simple digital. For sensors that report a raw voltage or a binary state — bump switch, limit switch, potentiometer, line tracker, LED.

Which Port for Which Sensor

| Sensor | Port type | Notes |
|---|---|---|
| Bump / Limit switch | ADI | Digital input. Returns 0 or 1. |
| Potentiometer V2 | ADI | Analog. Returns 0–4095 across ~330° rotation. |
| Line tracker | ADI | Analog. Returns brightness 0–4095. |
| LED indicator | ADI | Digital output. Set 0 or 1. |
| Inertial Sensor (IMU) | Smart | Required by EZ-Template. |
| V5 Distance Sensor | Smart | Returns mm. |
| V5 Rotation Sensor | Smart | For tracking wheels. |
| V5 Optical Sensor | Smart | Hue/saturation/proximity. |
| AI Vision Sensor | Smart | AprilTags + AI classes. |
| V5 GPS Sensor | Smart | Reads field code strips. |

Picking a Specific Port Number

Once you know the port type, pick a specific number that's not already used. The Clawbot uses ports 1, 10, 8, 3 for motors and 11 for the IMU — ports 12–21 are usually free for new sensors.

💡
Conventions worth establishing early. Many teams reserve a port range for one type of device — e.g. ports 1–6 for motors, 11–15 for sensors, 16–21 spare. Document your convention in the engineering notebook on day one. When something fails at competition, the port log makes debugging fast.
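A convention like this can be written straight into subsystems.hpp as a commented port map, so the wiring documentation lives next to the declarations. The specific assignments below are hypothetical, not the Clawbot's real wiring; adjust them to your robot:

```cpp
// include/subsystems.hpp -- port map (hypothetical assignments)
// Convention: motors 1-6, sensors 11-15, 16-21 spare.

#define PORT_LEFT_DRIVE 1   // motor, left side
#define PORT_RIGHT_DRIVE 2  // motor, right side
#define PORT_IMU 11         // sensor: inertial
#define PORT_DISTANCE 12    // sensor: distance
// Ports 16-21 intentionally left free for mid-season additions.
```

When a port dies mid-competition, a map like this tells you instantly which numbers are safe to move to.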

How to Verify the Port Works (Before Coding)

Before you write a single line of code, plug the sensor in and check the V5 Brain's Devices screen. The brain auto-detects every connected sensor and shows live readings.

  1. Power on the V5 Brain.
  2. Touch Devices on the home screen.
  3. Find your sensor in the list — it'll show its port number and live values.
  4. If it's not in the list: bad cable, wrong port type, or the sensor is broken. Fix it now.
⚠️
If the brain doesn't see the sensor, your code never will. A flaky cable or a sensor plugged into the wrong port type silently fails. The brain's Devices screen is the ground truth — no software detection logic in PROS will rescue you from a missing physical connection.
// Section 03
Step 2 — Declare the Device
One line in your subsystems header tells PROS that a particular port has a particular kind of sensor on it. Use a named constant for the port number so future-you can change the wiring in one place.
◎ Step 2 of 5

Where the Line Goes

If you're using the EZ-Template project structure (highly recommended), there's already an include/subsystems.hpp file. Every device declaration goes there. The motors, the sensors, the controller alias — one big wiring map you can read top-to-bottom.

If you don't have a subsystems.hpp yet, see Organizing Code Across Files. Don't put device declarations in main.cpp alongside the autonomous logic — it gets unmanageable fast once you have more than two or three sensors.

The Pattern

Every device declaration follows the same pattern: a #define for the port number, then an inline object on that port.

include/subsystems.hpp — the universal pattern
// 1. Name the port
#define PORT_MY_SENSOR 5

// 2. Declare the device
inline pros::SensorClass my_sensor(PORT_MY_SENSOR);

Class Names by Sensor Type

Each sensor has a different PROS class name. The pattern is consistent — PascalCase, in the pros:: namespace, sometimes nested under pros::adi:: for ADI devices.

| Sensor | PROS class | Header |
|---|---|---|
| Bump / Limit switch | pros::adi::DigitalIn | pros/adi.hpp |
| Potentiometer V2 | pros::adi::Potentiometer | pros/adi.hpp |
| Line tracker | pros::adi::AnalogIn | pros/adi.hpp |
| Inertial Sensor | pros::Imu | pros/imu.hpp |
| Distance Sensor | pros::Distance | pros/distance.hpp |
| Rotation Sensor | pros::Rotation | pros/rotation.hpp |
| Optical Sensor | pros::Optical | pros/optical.hpp |
| AI Vision Sensor | pros::AIVision | pros/aivision.hpp (PROS 4) |
| GPS Sensor | pros::Gps | pros/gps.hpp |

These headers are all pulled in transitively by "main.h", so you usually don't need to #include them explicitly. If a class doesn't resolve, check the header path.

ADI Ports Take a Char, Smart Ports Take a Number

One small wrinkle worth noting:

include/subsystems.hpp — smart vs ADI
// Smart port: just a number 1-21
#define PORT_DISTANCE 5
inline pros::Distance distance(PORT_DISTANCE);

// ADI port: a CHAR 'A' through 'H'
#define ADI_BUMP 'A'
inline pros::adi::DigitalIn bump(ADI_BUMP);

Why inline in a Header?

Without inline, including subsystems.hpp from two different .cpp files (say main.cpp and autons.cpp) creates two copies of each device, causing a linker error like multiple definition of 'distance'. The inline keyword tells the compiler "this is the same object across all files" — modern C++ (since C++17) lets you do this for variables.

💭
The alternative. Older PROS projects use extern in the header and a separate .cpp file with the actual definition. Both patterns work; inline is one less file to maintain. Pick one and stick with it across the project.
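For illustration, here is the extern pattern sketched with a plain int standing in for the device object (a real project would use a PROS class such as pros::adi::DigitalIn, which only compiles inside a PROS project; the names here are hypothetical):

```cpp
// --- include/subsystems.hpp: declaration only, no storage allocated ---
extern int arm_port; // stand-in for e.g. "extern pros::adi::DigitalIn arm_bump;"

// --- src/subsystems.cpp: the one and only definition ---
int arm_port = 5; // stand-in for "pros::adi::DigitalIn arm_bump(ADI_ARM_BUMP);"
```

Every .cpp that includes the header sees the declaration; the linker finds exactly one definition, so there's no multiple-definition error, at the cost of keeping two files in sync.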
// Section 04
Step 3 — Verify the Sensor Live
Don't write any sensor logic until you've watched the value change in real time. Print to the controller or brain screen, push the sensor by hand, and confirm the numbers behave the way you expect.
◎ Step 3 of 5

Why This Step Is Non-Negotiable

The whole rest of the project assumes the sensor reading is meaningful. If the sensor is wired backwards, mounted at the wrong height, or has a flaky cable — the readings are garbage and every line of downstream code is built on quicksand.

Step 3 is the last point where verification is cheap. Once you've written 50 lines of autonomous logic, debugging a "why does my robot crash" problem is much harder than checking "what does the sensor read when I push it by hand."

Two Places to Print

Brain LCD (LLEMU)

8 lines of text on the V5 Brain screen. Easy to read, large font, but you have to look at the brain — not great if the robot is moving.

Controller Screen

3 small lines on the V5 Controller. Easy to read while driving the robot, but limited to ~15 chars per line.

The Brain LCD Pattern

Drop this in opcontrol() for live debugging:

src/main.cpp — brain LCD print loop
void opcontrol() {
  while (true) {
    // Line 0 reserved by EZ-Template's auton selector. Use 1+.
    pros::lcd::print(1, "bump: %d", bump.get_value());
    pros::lcd::print(2, "pot: %d", arm_pot.get_value());
    pros::lcd::print(3, "dist: %d mm", distance.get());

    chassis.opcontrol_tank();
    pros::delay(ez::util::DELAY_TIME);
  }
}

The Controller Screen Pattern

For something you want to see while driving:

src/main.cpp — controller print (rate-limited)
int tick = 0;
while (true) {
  // Don't print every loop -- the controller link is bandwidth-limited
  // to ~10Hz. Spamming it every 10ms means most prints get dropped.
  if (++tick % 10 == 0) {
    master.print(0, 0, "pot %d   ", arm_pot.get_value());
  }
  pros::delay(ez::util::DELAY_TIME);
}
⚠️
The controller print rate limit. The V5 controller has a slow radio link to the brain. Calling master.print() every loop silently drops most of the prints — the screen looks frozen, even though your code is running fine. Rate-limit to roughly 100 ms (every 10 loops at the default delay).

The Wiggle Test

Once values are printing, do the wiggle test:

  1. Push the sensor by hand — press the bump switch, rotate the pot, hold something in front of the distance sensor.
  2. Watch the printed value.
  3. Does it change the way you expect?
| Sensor | Wiggle test | Good result |
|---|---|---|
| Bump switch | Press and release | Value flips between 0 and 1. |
| Potentiometer | Rotate slowly through full range | Value sweeps smoothly across ~0–4095. |
| Distance sensor | Hold a flat object 5 cm, 30 cm, 1 m away | Reading roughly matches in mm. |
| Optical sensor | Hold a coloured card in front | Hue value changes with colour. |
| IMU heading | Rotate the robot by hand | Heading changes smoothly with rotation. |
🔥
If the wiggle test fails, do not move on. Common causes: wrong port number in the #define, sensor wired into the wrong port type, broken cable, sensor mounted backwards (some pots read reverse), sensor too far from its target. Fix the hardware/wiring before writing any sensor logic.
// Section 05
Step 4 — Calibrate Your Constants
Sensors return raw numbers. Your code needs human-meaningful thresholds — "arm at the bottom" or "wall close enough to score." Translate one into the other by reading the sensor at known positions and writing the numbers down.
◎ Step 4 of 5

The Magic-Number Problem

Here's a beginner's first sensor-driven autonomous routine:

src/autons.cpp — the wrong way
// What is 280 supposed to be? Stopping distance? Arm height? Ring colour?
// Why 280? Why not 250? Why not 300?
while (distance.get() > 280) pros::delay(10);

Six months later when the robot is rebuilt and the sensor is in a different position, no one remembers what 280 means or how to update it. The magic number is dangerous because it loses its meaning the moment you walk away from your computer.

The Calibrate-and-Name Pattern

Replace every magic number with a named constant. The constant carries the meaning forward.

src/autons.cpp — the right way
// Distance in millimeters from the wall when we should stop scoring.
// Calibrated: pushed robot to scoring position, read sensor, got 280.
const int SCORE_STOP_MM = 280;

while (distance.get() > SCORE_STOP_MM) pros::delay(10);

How to Find Each Constant

The workflow is the same for every sensor:

  1. Print the sensor value live (Step 3).
  2. Manually move the robot or mechanism to a known meaningful position.
  3. Read the value off the screen.
  4. Write it down with a one-line comment explaining what the position was.
  5. Plug it in as a named constant.

Example: Three Arm Heights

Suppose you have a potentiometer on an arm. You want preset heights for "down," "mid," and "up."

tuning workflow
// Step 1: print pot value live to controller (see Step 3)
// Step 2: manually push arm to its lowest safe position
//         -> controller shows: pot 1187
// Step 3: manually push arm to a useful mid-height for transport
//         -> controller shows: pot 1985
// Step 4: manually push arm to its highest safe position
//         -> controller shows: pot 2812
// Step 5: write them down as named constants
const int ARM_POT_MIN = 1187; // arm against bottom hard stop
const int ARM_POT_MID = 1985; // arm at carry height
const int ARM_POT_MAX = 2812; // arm against top hard stop
💡
Add a small safe-band offset. Don't tune to the exact hard-stop reading. Subtract 30–50 from your MAX (so the arm stops just shy of the hard stop) and add the same to your MIN. This gives the controller time to react, prevents the motor from grinding against the mechanical stop, and protects the gearbox.
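One way to sketch the safe band, reusing the hypothetical calibration readings from above (pick an offset that suits your mechanism):

```cpp
// Raw hard-stop readings from the calibration run (hypothetical values)
const int ARM_POT_RAW_MIN = 1187;
const int ARM_POT_RAW_MAX = 2812;

// Stop 40 counts shy of each mechanical hard stop
const int SAFE_BAND = 40;
const int ARM_POT_MIN = ARM_POT_RAW_MIN + SAFE_BAND; // 1227
const int ARM_POT_MAX = ARM_POT_RAW_MAX - SAFE_BAND; // 2772
```

Deriving the safe limits from the raw readings (instead of hand-editing the numbers) means a re-calibration only has to update the two RAW constants.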

Re-Calibrate When Anything Changes

Calibration is not a one-time job. Re-run the workflow whenever anything that affects the reading changes: the sensor is remounted or knocked out of position, the mechanism is rebuilt, a hard stop moves, or the sensor is replaced.

🔭
Notebook this. Every calibration run is engineering-notebook gold. Photo of the robot at each calibration position, the raw sensor reading, the named constant. RECF judges love seeing quantitative testing methodology — this is exactly that.
// Section 06
Step 5 — Use the Sensor in Code
Read the value, compare against a constant, drive an output. The basic if-then pattern that turns sensor data into robot behaviour. Avoid the four common mistakes that show up in every team's first sensor code.
◎ Step 5 of 5

The Three Basic Use Patterns

Pattern A — Run-Until-Threshold

Run an output until a sensor reading crosses a threshold. The most common pattern in autonomous.

src/autons.cpp — pattern A
chassis.drive_set(60, 60);
while (distance.get() > SCORE_STOP_MM) {
  pros::delay(10);
}
chassis.drive_set(0, 0);

Pattern B — Block-If-Limit-Reached

Refuse to drive an output if the sensor says it's unsafe. Used for protecting hard stops on arms and lifts.

src/clawbot.cpp — pattern B
if (master.get_digital(DIGITAL_L2) && bump.get_value() == 0) {
  arm.move_velocity(-50); // only lower if bump NOT pressed
} else {
  arm.brake();
}

Pattern C — Branch-On-Reading

Take different actions depending on what the sensor reports. Common with optical sensors and AprilTags.

src/autons.cpp — pattern C
double hue = optical.get_hue();

if (hue < 15 || hue > 345)       handle_red_object();
else if (hue > 200 && hue < 260) handle_blue_object();
else                             eject_unknown();

Four Mistakes Every Beginner Makes

Mistake 1: Hard-coded ports inside functions. Writing pros::Distance(5).get() inside an autonomous routine duplicates the port number across the project. Change one wire, edit ten places. Always declare devices in subsystems.hpp and reference them by name.
Mistake 2: Equality comparisons on noisy sensors. while (distance.get() == 300) almost never triggers — the value flies through 300 in one loop and the loop never exits. Use thresholds (>, <, <=) for analog sensors. Equality is fine for binary sensors like a bump switch.
Mistake 3: No timeout on a sensor-driven loop. What if the sensor never crosses the threshold? while (distance.get() > 280) runs forever and the auton hangs. Always add a safety cap — either a max travel distance or a timeout in milliseconds.
Mistake 4: Trusting a single reading. Sensors occasionally return spikes — a wire crossing the distance beam, an intake roller flashing past, a bad frame from the camera. For critical exits, require two or three consecutive matching readings before acting. See Sensor-Based Autonomous for the noise-filter pattern.
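The noise filter from Mistake 4 is easy to factor into a tiny helper that holds the confirmation streak. This is a sketch of the idea as a pure struct (not a PROS API), which makes it testable off-robot:

```cpp
// Require N consecutive readings that satisfy a condition before acting.
struct ConfirmFilter {
  int needed;    // consecutive confirmations required
  int count = 0; // current streak

  // Call once per loop; returns true once the streak reaches `needed`.
  bool update(bool condition_met) {
    count = condition_met ? count + 1 : 0;
    return count >= needed;
  }
};
```

A ConfirmFilter{2} fires only on the second consecutive in-range reading; any out-of-range reading resets the streak, so a single spike can't trigger an exit.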

The Safe Sensor Loop Template

Combining the fixes for all four mistakes gives the template for every robust sensor-driven loop:

src/autons.cpp — the safe template
const int SCORE_STOP_MM = 280;    // named constant, not magic number
const int CONFIRMS_NEEDED = 2;    // noise filter
const int TIMEOUT_MS = 3000;      // safety cap

int confirm = 0;
int start_ms = pros::millis();

chassis.drive_set(60, 60);
while (pros::millis() - start_ms < TIMEOUT_MS) {  // timeout exit
  int r = distance.get();
  if (r > 0 && r <= SCORE_STOP_MM) {              // threshold, not equality
    if (++confirm >= CONFIRMS_NEEDED) break;      // noise filter
  } else {
    confirm = 0;
  }
  pros::delay(ez::util::DELAY_TIME);
}
chassis.drive_set(0, 0);

Once you've used this template a few times, it becomes muscle memory. Every sensor-driven loop in your project follows the same shape.

// Section 07
Worked Example — Bump Switch on the Arm
Walk through all five steps with the simplest sensor on the V5 — a bump switch that stops the arm from going below its safe minimum. By the end, you've added one sensor end-to-end and you know the rhythm.

The Goal

Mount a bump switch on the chassis, positioned so the arm presses it when the arm reaches its lowest safe point. The switch tells the code: "the arm is at the bottom — stop trying to lower it."

This is the simplest possible sensor work and it's also genuinely useful — it's the bottom half of the dual-limit arm pattern from the Sensor-Fused Clawbot Walkthrough.

Step 1 — Pick a Port

A bump switch is a digital input, so it goes on an ADI port. Ports A through H are all interchangeable for our purposes — pick one that's empty. 'A' is conventional.

Plug the switch in. Power on the brain. Open Devices and confirm the brain sees an input on ADI port A. Press the switch by hand — the value should flip.

Step 2 — Declare the Device

Add two lines to subsystems.hpp:

include/subsystems.hpp
#define ADI_ARM_BUMP 'A'
inline pros::adi::DigitalIn arm_bump(ADI_ARM_BUMP);

Save and upload. The device is now declared. Don't write any sensor logic yet.

Step 3 — Verify Live

Add one print line to opcontrol():

src/main.cpp — in opcontrol() while loop
pros::lcd::print(2, "bump: %d", arm_bump.get_value());

Upload. Look at the brain LCD. Press the switch by hand — the value should change between 0 (released) and 1 (pressed). If it doesn't change, the wiring is wrong; go fix it before moving on.

Step 4 — Calibrate Constants

The bump switch is binary — there's nothing to calibrate. The "constant" is just the meaning of 0 and 1, which we capture in a helper function for readability:

src/clawbot.cpp
static inline bool arm_at_bottom() { return arm_bump.get_value() == 1; }

That's the calibration step done. For an analog sensor like a pot you'd be writing down numbers; for a binary sensor you're naming the meaning.
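For comparison, the analog version of the same "name the meaning" idea pairs a calibrated threshold with a helper. The threshold below is hypothetical, and taking the reading as a parameter keeps the helper testable off-robot:

```cpp
// Hypothetical carry-height threshold from a Step 4 calibration run
const int ARM_POT_MID = 1985;

// True when the arm is above carry height
static inline bool arm_above_mid(int pot_value) {
  return pot_value > ARM_POT_MID;
}
```

On the robot you'd call it as arm_above_mid(arm_pot.get_value()); the name carries the calibration's meaning into every place it's used.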

Step 5 — Use in Code

Pattern B from the previous section — block the lower-arm command if the bump switch says we're at the bottom:

src/clawbot.cpp — arm_control()
void arm_control() {
  bool want_down = master.get_digital(DIGITAL_L2);

  if (want_down && !arm_at_bottom()) {
    arm.move_velocity(-50);
  } else if (master.get_digital(DIGITAL_L1)) {
    arm.move_velocity(60);
  } else {
    arm.brake();
  }
}

Upload. Drive the arm down with L2 — it should stop the moment it presses the bump switch, even if you keep holding the trigger.

That's the entire workflow. One sensor, fully integrated, in five short steps.

🚀
What to do next. Add a potentiometer at the top of the arm using exactly the same workflow. Step 4 takes longer for the pot because you have to write down three calibration values, but Steps 1, 2, 3, and 5 are identical. Repeat with the distance sensor, then GPS, then AI Vision — each new sensor takes less time than the last because the rhythm is the same.
🔬 Check for Understanding
A team adds a distance sensor (Steps 1, 2, 5) and skips Steps 3 and 4. Their autonomous routine drives forward forever and crashes into a wall. What's the most likely cause?
A. The PROS distance sensor class is broken in this version.
B. The sensor is reading 9999 (no object detected) because of wrong port, bad cable, or wrong mounting — problems Step 3 would have surfaced in 30 seconds.
C. The chassis PID needs more tuning.
D. EZ-Template doesn't support the distance sensor.

Where to Go From Here

Sensor Roadmap

The priority order — which sensor to add next, and why.

Discrete Sensors Deep Dive

Switches, pot, GPS — sensor-by-sensor specifics.

Sensor-Based Autonomous

Replace timed waits with sensor-confirmed conditions.

Full Integration

Sensor-Fused Clawbot — what five sensors look like working together.

📝
EN4 reminder. Workflows are great notebook content because they're processes — describe in your own words how you went through these five steps for your specific sensor, with photos and your real calibration values. Judges read for understanding; rewriting these steps in your own voice is exactly the kind of evidence the rubric rewards.