Sources & confidence: Hardware specs (proximity range, color detection <100mm, RGB+HSV+gesture) and part number 276-7043 verified against the official VEX Library and VEX Robotics product page. PROS API references verified against pros.cs.purdue.edu/v5 (the pros::Optical class). The Purdue SIGBots Wiki (the maintainers of PROS) also confirms PROS-vs-VEXcode API differences. EZ-Template integration patterns drawn from EZ-Template's public docs at ez-robotics.github.io/EZ-Template. Override-specific use cases are speculation until the manual drops Monday April 27, 2026.
// Section 01

V5 Optical Sensor — Overview 🔮

A close-range multi-mode sensor that combines color detection, proximity, ambient light, and gesture recognition in one Smart Port device. Cheaper than AI Vision, narrow-purpose, but extremely useful for ring sorting, intake control, and game-element detection.
🌈 Color Detection 📍 Proximity Sensing 🏆 V5RC / VEX U / VEX AI Legal

Hardware Summary

VEX Part Number: 276-7043
Sensor Type: Combination ambient light, color, proximity, and gesture sensor — all in one housing.
Effective Color Range: Best detection when the object is < 100mm (~4 inches) from the sensor face. Beyond that, color reliability drops sharply.
Proximity Range: Returns 0–255 (PROS API). Higher values = closer object. Nominal sensing range is roughly 10cm. Values are affected by ambient light and object reflectivity.
Connection: One V5 Smart Port via Smart Cable. Counts as a sensor port.
Mounting: Two mounting tabs with slotted holes. Width fits inside C-channel, but requires a 1/4″ standoff (275-1013) or 8mm plastic spacer (276-2019) for Smart Port clearance.
White LED: Built-in. Programmable PWM 0–100. Helps maintain consistent color readings under variable competition lighting.
Toolchain Compatibility: PROS C++ via pros::Optical (full feature support). Also works in VEXcode V5 / EXP. Per the Purdue SIGBots Wiki, PROS exposes more features than VEXcode (saturation, full proximity range).

Why You Want One on Your Override Robot

The Optical Sensor solves problems the AI Vision Sensor cannot — small, fast, close-range tasks:

📋
Optical vs AI Vision — complementary, not redundant. The AI Vision Sensor sees far and identifies objects/tags. The Optical Sensor sees close (<100mm) and reads color/proximity instantly. Top teams use BOTH: AI Vision for navigation, Optical for in-mechanism sensing.
// Section 02
What the Sensor Detects 🔭
Five distinct readings, each accessed through its own PROS function. Knowing which mode does what is the difference between fast detection and noisy readings.

1. Color Hue (0–360 degrees)

The hue is the angular position on the color wheel. Returned as a double from 0 to 359.99.

Red: ~0° (or ~360°)
Yellow: ~60°
Green: ~120°
Cyan: ~180°
Blue: ~240°
Magenta: ~300°

Detection works best when the object is < 100mm from the sensor. At greater distances, hue readings become inconsistent and noisy.

PROS function: optical.get_hue()
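Because red straddles the 0°/360° wrap, a single min/max comparison is not enough. A minimal sketch of a wrap-aware range check (plain C++ with no PROS dependency; the function name `hue_in_range` is my own, not part of any API):

```cpp
#include <cassert>

// True if `hue` falls inside [lo, hi] on the color wheel.
// A range given with lo > hi (e.g. red: 350..15) is treated as
// wrapping past the 0/360 boundary.
bool hue_in_range(double hue, double lo, double hi) {
  if (lo <= hi) return hue >= lo && hue <= hi;  // normal range
  return hue >= lo || hue <= hi;                // wrapped range
}
```

In robot code you would feed `optical.get_hue()` into this predicate with the ranges you logged during calibration.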

2. Saturation (0.0–1.0)

How "pure" the color is. 0.0 = grayscale (no color), 1.0 = fully saturated. Useful for distinguishing actual colored objects from ambient white reflections or grayscale field tiles.

PROS function: optical.get_saturation()

3. Brightness (0.0–1.0)

How bright the detected object is. Useful for distinguishing white from black, light from dark.

PROS function: optical.get_brightness()

4. Proximity (0–255)

The IR-based proximity reading. Higher values = closer object. Per PROS docs: range is 0 to 255. Per VEX docs: nominal sensor range is ~10cm.

⚠️
Proximity is affected by ambient light AND object reflectivity. The same object at the same distance can return different proximity values under different lighting or if the object is more/less reflective. Calibrate per-robot, per-object, per-venue. Do not hard-code thresholds without testing in the actual competition lighting.

Per VEX docs: proximity uses reflected IR (infrared) energy from an integrated IR LED, NOT the white LED. The white LED is for color detection only.

Important PROS limitation: Proximity is NOT available when gesture detection is enabled. Pick one or the other.

PROS function: optical.get_proximity()

5. Gesture Detection

Detects four directional motions of an object passing in front of the sensor: UP, DOWN, LEFT, RIGHT. Returned as optical_direction_e_t enum (NO_GESTURE, UP, DOWN, LEFT, RIGHT, ERROR).

PROS functions: optical.enable_gesture(), optical.get_gesture(), optical.disable_gesture()

Note: Useful for detecting things like a ring passing through your intake (motion event), but you lose proximity readings when gesture mode is on. For most competition uses, leave gestures disabled and use proximity directly.

6. RGB Components

If you need raw red/green/blue values (e.g., to do your own color matching with custom thresholds), PROS provides:

optical.get_rgb() — returns a struct with red, green, blue, and brightness fields.

PROS exposes saturation and full RGB separately; per Purdue SIGBots Wiki, this is more granular than VEXcode's API.

7. Integration Time (Update Rate)

How often the sensor updates internally. Lower = faster updates but noisier; higher = slower but more stable.

PROS function: optical.set_integration_time(ms) — range 3–712 ms. The default is fine for most use cases. Lower it (~10–20 ms) if you need faster reaction for sorting fast-moving game elements.

// Section 03
Mounting the Sensor 🔧
The Optical Sensor's effectiveness depends entirely on placement. Wrong location = useless data. Right location = match-changing capability.

The Cardinal Rule: Stay Within 100mm

Color detection works best when the object is < 100mm (~4 inches) from the sensor face. This is the single most important constraint. If you mount the sensor far from where game elements travel, you will get unreliable readings or no readings at all.

Recommended Mounting Locations

Inside the Intake Throat (Best for Color Sort)
Mount the sensor pointing inward at the narrow throat of your intake, where rings/objects pass through one at a time. The object is briefly very close to the sensor face. Best location for alliance-color sorting. Use polycarbonate or 3D-printed brackets to position precisely.
Above the Intake Floor (Proximity Trigger)
Mount the sensor at the top of your intake throat, pointing down at the path objects travel. Use proximity reading to detect "an object is now in my intake" and trigger downstream mechanisms (conveyor on, scoring sequence start).
Lift Mechanism Stage (Mid-Path Detection)
If you have a multi-stage lift or conveyor, mount one Optical Sensor mid-path to detect "the object has reached this stage" transitions. Useful for choreographed scoring sequences.
Bottom-Facing Floor Detector
Mount underneath the robot pointing at the floor for tape-line detection (using brightness and the white LED). Be aware of foam tile color — field tile reflectivity is not uniform.
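For a floor-facing detector, a single brightness cutoff will chatter when the reading hovers near the threshold. One common fix is a two-threshold (Schmitt-trigger) detector; the sketch below is pure logic with no PROS dependency, and the threshold values are placeholders you must calibrate on your own field tiles:

```cpp
#include <cassert>

// Schmitt-trigger style tape detector: separate "on" and "off"
// thresholds prevent the state from flickering when brightness
// hovers near a single cutoff.
class TapeDetector {
 public:
  TapeDetector(double on_thresh, double off_thresh)
      : on_(on_thresh), off_(off_thresh), over_tape_(false) {}

  // Feed each brightness sample (0.0-1.0); returns current state.
  bool update(double brightness) {
    if (!over_tape_ && brightness > on_) over_tape_ = true;        // bright tape
    else if (over_tape_ && brightness < off_) over_tape_ = false;  // dark tile
    return over_tape_;
  }

 private:
  double on_, off_;
  bool over_tape_;
};
```

In robot code, `optical.get_brightness()` (with the white LED on) would supply the samples.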

Mechanical Mounting Details

Per VEX Library on the Optical Sensor:

  • Two mounting tabs with slotted holes accept standard VEX hardware.
  • The housing width fits inside C-channel.
  • Leave clearance for the Smart Cable: space the sensor off the structure with a 1/4″ standoff (275-1013) or an 8mm plastic spacer (276-2019).

White LED Considerations

The integrated white LED is a critical feature. It illuminates objects close to the sensor for consistent color detection regardless of ambient light.

Calibration Pre-Match

Before every competition (or at least at every new venue):

  1. Place each game element you care about (red ring, blue ring, mobile goal, etc.) at the position the sensor will see it during a match.
  2. Read and log the hue, saturation, brightness, and proximity values for each.
  3. Build a hue-range table: e.g., red rings = hue 350–15°, blue rings = hue 215–245°.
  4. Use these ranges in your code for color classification, not raw equality.

You can build a small calibration utility that prints values to the V5 Brain screen and runs in opcontrol — press a button to log the current readings.
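The hue-range table from step 3 can live directly in code as data rather than a chain of if-statements. A minimal sketch (plain C++, no PROS dependency; the `HueEntry` struct and the example range values are my own placeholders — fill them from your logged readings):

```cpp
#include <cassert>
#include <string>
#include <vector>

// One calibrated entry: a name plus its hue window.
// hi < lo means the range wraps past the 0/360 boundary (red).
struct HueEntry {
  std::string name;
  double lo, hi;
};

std::string classify_hue(double hue, const std::vector<HueEntry>& table) {
  for (const auto& e : table) {
    bool in = (e.lo <= e.hi) ? (hue >= e.lo && hue <= e.hi)
                             : (hue >= e.lo || hue <= e.hi);
    if (in) return e.name;
  }
  return "unknown";
}
```

Keeping the ranges in one table means a venue recalibration touches one place in the code.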

// Section 04
PROS Code Patterns 💻
Verified PROS V5 C++ API for the Optical Sensor. Reference: pros.cs.purdue.edu — pros::Optical class.
⚠️
EN4 reminder: Use these patterns to understand the approach. Rewrite the code in your own structure, comments, and variable names for the engineering notebook. Don't copy-paste.

Verified PROS API Reference

Class: pros::Optical
Header: #include "pros/optical.hpp" (or just use api.h)
Constructor: pros::Optical optical(uint8_t port);
Hue: double hue = optical.get_hue(); — range 0 to 359.99
Saturation: double sat = optical.get_saturation(); — range 0.0 to 1.0
Brightness: double br = optical.get_brightness(); — range 0.0 to 1.0
Proximity: int prox = optical.get_proximity(); — range 0 to 255
White LED: optical.set_led_pwm(50); — range 0 to 100
RGB struct: auto rgb = optical.get_rgb(); — struct with .red, .green, .blue, .brightness
Update rate: optical.set_integration_time(20); — ms; range 3 to 712
Gestures: optical.enable_gesture(), get_gesture(), disable_gesture() (disables proximity)

Pattern 1: Color Classification (Alliance Sort)

Determine if a detected object is a red ring, blue ring, or unknown. Use hue ranges from your pre-match calibration.

// PROS C++ — alliance color sort
// EN4: rewrite in your own words and structure for your notebook.
#include "pros/apix.h"
#include "pros/optical.hpp"

// Sensor on Smart Port 5
pros::Optical optical(5);

enum RingColor { NONE, RED, BLUE, UNKNOWN };

RingColor classify_ring() {
  // Object must be very close to read color reliably
  if (optical.get_proximity() < 200) return NONE;

  // Saturation gate: ignore washed-out / grayscale readings
  if (optical.get_saturation() < 0.3) return UNKNOWN;

  double hue = optical.get_hue();

  // Red wraps around 0/360 -- check both ranges
  if (hue >= 350.0 || hue <= 15.0) return RED;
  if (hue >= 210.0 && hue <= 245.0) return BLUE;
  return UNKNOWN;
}

void initialize() {
  // Turn on white LED at 60% for consistent color reading
  optical.set_led_pwm(60);
  // Faster updates for catching fast-moving rings
  optical.set_integration_time(15);
}

Pattern 2: Proximity-Triggered Intake Stop

Run intake until proximity hits a threshold (object detected), then stop. Avoids over-stuffing the intake.

// PROS C++ — proximity-triggered intake stop
// EN4: rewrite in your own words for your notebook.
pros::Motor intake(11);
pros::Optical optical(5);

void intake_until_loaded() {
  intake.move(127);  // full power forward

  // Poll proximity until ring is detected close to sensor
  while (optical.get_proximity() < 220) {
    pros::delay(10);
  }

  // Ring detected -- stop intake
  intake.move(0);
  intake.brake();
}

Pattern 3: Auto-Sort (Reject Wrong-Color Rings)

Combine color detection with intake control to ONLY keep alliance-colored rings.

// PROS C++ — auto-sort wrong-color rings
// EN4: rewrite in your own words for your notebook.
pros::Motor intake(11);
pros::Motor ejector(12);  // a small mechanism that ejects rejected rings
pros::Optical optical(5);

const RingColor MY_ALLIANCE = RED;  // set per match

// Background task: continuously sorts what enters the intake
void sort_task_fn(void* p) {
  bool ring_present = false;
  while (true) {
    int prox = optical.get_proximity();
    bool now_present = (prox > 200);

    // Rising edge: ring just entered
    if (!ring_present && now_present) {
      pros::delay(50);  // let it settle into the read window
      RingColor c = classify_ring();
      if (c != MY_ALLIANCE && c != UNKNOWN) {
        // Wrong color -- eject!
        intake.move(-127);  // reverse intake briefly
        ejector.move(127);  // engage ejector
        pros::delay(300);
        intake.move(127);   // resume normal intake
        ejector.move(0);
      }
    }
    ring_present = now_present;
    pros::delay(10);
  }
}

void initialize() {
  optical.set_led_pwm(60);
  optical.set_integration_time(15);
  pros::Task sort_task(sort_task_fn, nullptr, "sort");
}

Pattern 4: Calibration Helper

A simple opcontrol-mode utility that prints live sensor values to the V5 Brain screen. Use it during pit setup to record values for your hue ranges.

// PROS C++ — calibration print loop
// EN4: rewrite in your own words for your notebook.
void calibration_loop() {
  pros::lcd::initialize();
  optical.set_led_pwm(60);

  while (true) {
    double hue = optical.get_hue();
    double sat = optical.get_saturation();
    double br = optical.get_brightness();
    int prox = optical.get_proximity();

    pros::lcd::print(0, "Hue: %.1f deg", hue);
    pros::lcd::print(1, "Sat: %.2f", sat);
    pros::lcd::print(2, "Brt: %.2f", br);
    pros::lcd::print(3, "Prox: %d", prox);
    pros::delay(50);
  }
}

Hold a red ring in front of the sensor: hue should read ~0° (or ~360°). Try a blue ring: ~230°. Try a mobile goal in your alliance color and log it. Build your hue-range tables from these readings.

// Section 05
EZ-Template & LemLib Integration 🔗
How sensors fit into chassis-library workflows. Spoiler: they don't plug into EZ-Template directly — you compose them with chassis calls.

The Honest Picture

EZ-Template and LemLib are chassis movement libraries. They handle:

  • Drivetrain PID motions: straight drives, turns, swings
  • Odometry / IMU heading tracking
  • Exit conditions and tuning constants for those motions

They do NOT wrap:

  • Sensors (Optical, AI Vision, Distance, GPS)
  • Mechanism motors (intakes, lifts, conveyors)
  • Your game logic

For sensors and custom mechanisms, you use the standard PROS API directly. Integration with EZ-Template happens in your autonomous code — you compose chassis motion calls with sensor reads.

Pattern A: Sensor-Triggered Motion Exit

Start an EZ-Template chassis motion, watch a sensor in parallel, cancel the motion when sensor condition met.

// EZ-Template + Optical Sensor — drive until ring detected
// EN4: rewrite in your own words for your notebook.
#include "main.h"
#include "EZ-Template/api.hpp"
#include "pros/optical.hpp"

extern Drive chassis;
extern pros::Optical optical;
extern pros::Motor intake;

// Drive forward up to 36 inches, but exit early if intake detects a ring
void drive_until_ring() {
  intake.move(127);  // intake on
  chassis.pid_drive_set(36_in, DRIVE_SPEED, true);  // non-blocking: starts the motion

  // EZ-Template's pid_wait_* calls block, so poll the sensor ourselves
  // in a time-bounded loop (timeout ~ worst-case drive time).
  std::uint32_t start = pros::millis();
  while (pros::millis() - start < 3000) {
    if (optical.get_proximity() > 200) {
      chassis.pid_targets_reset();  // cancel current PID motion
      chassis.drive_set(0, 0);      // hard stop
      break;
    }
    pros::delay(20);
  }
  intake.move(0);
}

Pattern B: Pre-Move Sensor Check

Before issuing a chassis movement, check if the sensor reports the expected condition. Use this for preconditions in auton sequences.

// EZ-Template + Optical — conditional auton step
// EN4: rewrite in your own words for your notebook.
void auton_score_if_correct_color() {
  // Step 1: drive to scoring zone
  chassis.pid_drive_set(24_in, DRIVE_SPEED, true);
  chassis.pid_wait();

  // Step 2: verify we have the right ring before scoring
  RingColor c = classify_ring();
  if (c == MY_ALLIANCE) {
    // Score it
    chassis.pid_drive_set(6_in, DRIVE_SPEED, true);
    chassis.pid_wait();
    // ...activate scoring mechanism...
  } else if (c == UNKNOWN || c == NONE) {
    // No ring or unsure -- skip scoring, save time
    return;
  } else {
    // Wrong color -- eject before continuing
    eject_ring();
  }
}

Pattern C: Background Sensor Task + Chassis Auton

Run sensor monitoring in a background task; main auton calls EZ-Template chassis motions in sequence. Sensor task can preempt or signal main flow.

// Background task pattern
// EN4: rewrite in your own words for your notebook.
// Shared state
struct SensorState {
  pros::Mutex mutex;
  bool ring_loaded = false;
  RingColor last_color = NONE;
};
SensorState sensor_state;

void optical_task_fn(void* p) {
  while (true) {
    bool present = optical.get_proximity() > 200;
    RingColor c = present ? classify_ring() : NONE;

    sensor_state.mutex.take();
    sensor_state.ring_loaded = present;
    sensor_state.last_color = c;
    sensor_state.mutex.give();

    pros::delay(20);
  }
}

void initialize() {
  optical.set_led_pwm(60);
  pros::Task t(optical_task_fn, nullptr, "optical");
}

// Auton can read state without blocking
void autonomous() {
  chassis.pid_drive_set(24_in, DRIVE_SPEED, true);
  chassis.pid_wait();

  sensor_state.mutex.take();
  bool have_ring = sensor_state.ring_loaded;
  sensor_state.mutex.give();

  if (have_ring) { /* score */ }
  else { /* go pick one up */ }
}

LemLib Equivalent Pattern

The same patterns apply with LemLib. Method names differ; per LemLib's docs (v0.5 API, verify against your version):

  • Start a motion: chassis.moveToPoint(x, y, timeout) / chassis.turnToHeading(theta, timeout), both non-blocking by default
  • Block until done: chassis.waitUntilDone()
  • Block until partway through: chassis.waitUntil(dist)
  • Cancel for a sensor-triggered exit: chassis.cancelMotion()

Both libraries support async motion with sensor-triggered exits. Pick the library your team is most comfortable with — the integration patterns work in either.

Where to Define Your Sensor

In an EZ-Template-based PROS project, the standard convention is to declare global sensor instances in subsystems.hpp (declared extern) and define them in subsystems.cpp. EZ-Template's example project follows this convention.

// subsystems.hpp
#pragma once
#include "EZ-Template/api.hpp"
#include "api.h"

extern Drive chassis;
extern pros::Optical optical;    // your Optical Sensor
extern pros::AIVision aivision;  // your AI Vision Sensor
extern pros::Motor intake;
extern ez::Piston doinker;
// subsystems.cpp (or main.cpp)
// Definitions
pros::Optical optical(5);    // port 5
pros::AIVision aivision(8);  // port 8
pros::Motor intake(11);
ez::Piston doinker('A');
// Section 06
Override Use Cases 🎯
Speculative until the manual drops Monday April 27. Below are likely high-value applications based on the kickoff trailer's mention of roller-driven swing scoring and standard V5RC patterns.
🔔
This section is pre-manual speculation. The Override game manual drops Monday April 27, 2026. Once the field elements and scoring rules are known, specific use cases will become concrete. Re-read this section after Monday.

Likely Application 1: Alliance Ring/Object Sorting

Most V5RC games for the past three seasons (Spin Up, Over Under, High Stakes) have used alliance-colored game elements. If Override continues this pattern (likely), you'll have red/blue scoring objects on the field.

Optical Sensor application: Mount inside the intake throat. Classify each picked-up object by color. Reject opponent-colored objects automatically. This was a major competitive advantage in High Stakes for top teams.

Likely Application 2: Roller / Trigger Element Detection

The kickoff trailer mentioned roller-driven swing scoring. If Override has any rotational or trigger element, the Optical Sensor can detect:

  • Which color face of a roller is currently showing (hue read at close range)
  • A color transition as the roller spins, so you can stop spinning at the right moment
  • Proximity confirmation that the robot is actually in contact with the element

Place the Optical Sensor close to where your robot contacts the swing/roller mechanism so it can read state in real-time.

Likely Application 3: Loaded vs. Empty State Detection

Many V5RC games include "possession limit" rules (you can only hold N rings/balls/etc at once). The Optical Sensor proximity reading can confirm:

  • Loaded: an object is currently held (proximity above your calibrated threshold)
  • Empty: nothing is held, so it is safe to intake again
  • Count: rising edges of the proximity signal tally how many objects have passed

Likely Application 4: Field Tape / Line Following

If Override has tape lines or color zones on the field tile (some past games had these), a downward-facing Optical Sensor can detect them via brightness contrast with the white LED on. This was less common in recent V5RC games but is technically possible.

Pattern: Multi-Sensor Robot

A competitive Override robot likely uses BOTH sensors:

  • AI Vision Sensor: long-range navigation, AprilTags, and object identification
  • Optical Sensor: close-range color and proximity sensing inside the intake or mechanism

The V5 Brain has 21 Smart Ports, so port count is rarely a constraint. The constraint is processing the data well in your code.

Override Hardware Order Plan

This weekend / Monday after manual:

  1. Order at least 1 AI Vision Sensor (276-8659). One per robot is sufficient for most uses.
  2. Order at least 2 Optical Sensors (276-7043). One for intake-throat color sort, one as spare or for secondary use.
  3. Order Smart Cables and the right standoffs/spacers (1/4″ standoff 275-1013 or 8mm spacer 276-2019 for Optical mounting).
  4. Print AprilTag PDF for practice (kb.vex.com).
  5. Forum discussion notes that AI Vision Sensor demand exceeds supply at season starts — order early.
// Section 07
Skills Run & Driver Tips 🏆
How the Optical Sensor specifically improves auton skills and driver skills runs. Different from the AI Vision Sensor — the Optical lives in your mechanisms, not your navigation.

Auton Skills (60-second Programming Skills)

Use Case 1: Reliable Object Loading
Don't use timed intakes. Use the Optical Sensor proximity reading to detect when an object is loaded, then move on. Saves time when objects load fast; prevents disasters when they jam. The single biggest skills-score improvement most teams can make is replacing timed intake commands with sensor-triggered ones.
Use Case 2: Object Counter for Possession Limits
Track how many objects you've loaded with a counter that increments on each rising edge of the proximity reading. When you hit your possession limit, stop intaking automatically. Avoids rule violations during fast skills runs.
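The rising-edge counter described above can be isolated as pure logic (no PROS dependency; the class name `RingCounter` and the 200 threshold are my own placeholders — calibrate the threshold yourself):

```cpp
#include <cassert>

// Counts objects by rising edges of a "present" signal derived
// from the proximity reading.
class RingCounter {
 public:
  // Feed each proximity sample (0-255); returns the running count.
  int update(int proximity) {
    bool present = proximity > 200;  // placeholder threshold
    if (present && !prev_) ++count_;  // rising edge: new object arrived
    prev_ = present;
    return count_;
  }

 private:
  bool prev_ = false;
  int count_ = 0;
};
```

In robot code, call `update(optical.get_proximity())` from your background sensor task and stop the intake when the count reaches the possession limit.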
Use Case 3: Color-Based Skip Decisions
If your skills run includes picking up unknown-color objects, scan them on intake. Skip scoring (or eject) anything wrong-colored. Don't waste cycle time scoring opponent-colored objects.

Driver Skills (60-second Driver Skills)

Driver Assist 1: Auto-Stop Intake on Load
Driver presses intake button to start. Optical Sensor stops the intake when the ring is loaded. Driver doesn't have to manually stop — faster cycles. Implement as a held-button macro: while held, intake runs and auto-stops on detection.
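The held-button macro above can be written as a small state machine that maps (button state, proximity sample) to a motor command, which keeps it testable without hardware. A sketch under stated assumptions (pure C++, no PROS; the class name `IntakeMacro` and the 200 threshold are my own placeholders):

```cpp
#include <cassert>

// Held-button intake macro: returns the intake motor command
// (-127..127) for each control-loop tick. Once a ring reaches the
// sensor, the macro latches "loaded" and outputs 0 until the driver
// releases the button, which re-arms it.
class IntakeMacro {
 public:
  int update(bool button_held, int proximity) {
    if (!button_held) {    // releasing the button re-arms the macro
      loaded_ = false;
      return 0;
    }
    if (proximity > 200) loaded_ = true;  // placeholder threshold
    return loaded_ ? 0 : 127;             // run until loaded, then stop
  }

 private:
  bool loaded_ = false;
};
```

In opcontrol you would call `update(controller.get_digital(...), optical.get_proximity())` every loop iteration and pass the result to `intake.move(...)`.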
Driver Assist 2: Color-Locked Scoring
A scoring macro that ONLY fires if the loaded object is alliance-colored. Driver presses scoring button; if Optical Sensor reads correct color, the scoring sequence runs. If wrong color, the robot beeps and ejects instead. Prevents accidental opponent scoring (which gives the opponent points).
Driver Assist 3: Status Display on Brain Screen
During driver skills, display real-time sensor state on the V5 Brain screen: "LOADED [BLUE]" or "EMPTY". The driver can glance at the screen to confirm what they're holding without looking at the robot. Surprisingly useful when the action is fast.

Implementation Tips

  1. Always have a sensor failure fallback. If the Optical Sensor unplugs (loose Smart Cable, contact damage), all sensor-dependent macros stop working. Provide manual override buttons that bypass sensor checks.
  2. Run sensor logic in a dedicated PROS task. Don't poll sensors inline in opcontrol — you'll cause control delays. Background task pattern (shown in Pattern C above) is the right approach.
  3. Match-time recalibration. Some teams add a hidden controller-button combination that re-runs hue calibration mid-pit-time. If lighting changes between rounds, you can recalibrate without restarting.
  4. Test under tournament-typical lighting BEFORE the tournament. Gym lighting is often dimmer or more orange-tinted than classroom LED lighting. The white LED helps but doesn't fully compensate. Practice in conditions like the venue when possible.

Common Pitfalls

⚠️ Things That Will Bite You
  • Mounting too far. > 100mm and color readings become unreliable. The single most common Optical Sensor failure.
  • Forgetting to enable the white LED. Default LED state may be off. Set LED to 50–75% in initialize().
  • Hard-coding hue thresholds. Different venues = different lighting = different readings. Always use ranges and re-calibrate.
  • Reading hue when proximity is low. Far objects return garbage hue values. Always gate color decisions on proximity threshold.
  • Reading hue when saturation is low. Grayscale objects (white tile, black structure) return arbitrary hue values that look like real readings. Gate on saturation.
  • Enabling gestures by accident. Disables proximity. Make sure gestures are explicitly disabled if you need proximity.
  • Polling too fast. Reading at > 100Hz wastes CPU and doesn't give you faster updates than the integration time allows. Match your poll rate to your integration time.
  • Cable strain. The Smart Port is on the side of the sensor. Strain on the Smart Cable can disconnect it during a match. Use cable management.
