# Camera-Follows-Aim Calibration Guide

## 🎯 **New Targeting System Overview**

With the camera mounted to move with the aim mechanism, targeting is now much simpler and more accurate:

### **Goal:** Center the target mouth in the camera view

- **Red crosshair** = aiming point
- **Blue circle** = target deadzone (30 pixel radius)
- **When mouth is in deadzone** = ready to fire

## 🔧 **Calibration Parameters**

Edit these values in `octv2_server_v2.py` to match your hardware:

### **1. Targeting Sensitivity**

```python
# In MouthDetector.__init__():
self.target_deadzone_pixels = 30   # Larger = more tolerant centering
self.max_adjustment_degrees = 10   # Smaller = more precise movements

# In calculate_centering_adjustment():
pixels_per_degree_rotation = 15    # Adjust for your setup
pixels_per_degree_pitch = 12       # Adjust for your setup
```

### **2. Distance Estimation**

```python
# Camera parameters (measure your actual camera)
self.camera_focal_length_mm = 3.04   # Pi Camera focal length
self.sensor_width_mm = 3.68          # Pi Camera sensor width
self.average_face_width_cm = 16.0    # Average human face width
```

### **3. Aiming Offsets**

```python
# Mechanical compensation
self.rotation_offset_degrees = 0.0   # If camera/launcher not aligned
self.pitch_offset_degrees = 0.0      # For gravity/drop compensation
self.distance_pitch_factor = 0.5     # Higher pitch for closer targets
```

## 🎮 **Calibration Process**

### **Step 1: Basic Functionality Test**

```bash
# Run the server
python3 octv2_server_v2.py

# Test manual aiming first:
# 1. Use iOS app manual controls
# 2. Check that ESP32 responds to commands
# 3. Verify camera moves with motors
```

### **Step 2: Centering Calibration**

1. **Position test subject** at a known distance (e.g., 1 meter)
2. **Switch to AUTO mode** in the iOS app
3. **Open mouth wide** and observe the behavior:

```
Expected sequence:
1. Detects WIDE_OPEN mouth
2. Calculates centering adjustment
3. Moves motors to center the mouth
4. Fires when mouth is in target zone
```

### **Step 3: Pixel-to-Degree Tuning**

If the system over- or under-corrects, adjust these values:

```python
# Too much movement (overshoots):
pixels_per_degree_rotation = 20   # Increase value
pixels_per_degree_pitch = 16      # Increase value

# Too little movement (doesn't reach target):
pixels_per_degree_rotation = 10   # Decrease value
pixels_per_degree_pitch = 8       # Decrease value
```

### **Step 4: Distance Accuracy Check**

Test distance estimation accuracy:

1. **Measure the actual distance** to the test subject
2. **Check the displayed distance** on the video overlay
3. **Adjust parameters** if needed:

```python
# If distance reads too high:
self.average_face_width_cm = 15.0   # Decrease face width

# If distance reads too low:
self.average_face_width_cm = 17.0   # Increase face width
```

### **Step 5: Elevation Compensation**

For an accurate trajectory at different distances:

```python
# If shooting too low at close range:
self.distance_pitch_factor = 0.7   # Increase compensation

# If shooting too high at close range:
self.distance_pitch_factor = 0.3   # Decrease compensation
```

## 📊 **Understanding the Algorithm**

### **Distance Estimation Formula:**

```
distance = (face_width_real * focal_length * image_width) / (face_width_pixels * sensor_width)
```
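As a quick sanity check, here is a minimal Python sketch of that formula. The function and argument names are illustrative only (not the actual names in `octv2_server_v2.py`); the defaults mirror the Pi Camera values from the Calibration Parameters section.

```python
def estimate_distance_cm(face_width_pixels, image_width_pixels,
                         focal_length_mm=3.04, sensor_width_mm=3.68,
                         average_face_width_cm=16.0):
    """Pinhole-camera distance estimate (illustrative helper only)."""
    if face_width_pixels <= 0:
        return None  # no detection, avoid division by zero
    # distance = (face_width_real * focal_length * image_width)
    #            / (face_width_pixels * sensor_width)
    return (average_face_width_cm * focal_length_mm * image_width_pixels) / \
           (face_width_pixels * sensor_width_mm)

# Example: a face spanning 120 px in a 640 px wide frame reads as ~70 cm
print(round(estimate_distance_cm(120, 640)))  # -> 70
```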
### **Centering Logic:**

```python
# 1. Calculate offset from center
dx = mouth_x - center_x
dy = mouth_y - center_y

# 2. Convert to angle adjustments
rotation_adj = dx / pixels_per_degree_rotation
pitch_adj = -dy / pixels_per_degree_pitch

# 3. Apply distance scaling (closer = smaller adjustments)
distance_factor = 100.0 / estimated_distance
adjusted_pixels_per_degree *= distance_factor

# 4. Add compensation offsets
rotation_adj += rotation_offset_degrees
pitch_adj += pitch_offset_degrees + distance_compensation
```

## 🎯 **Visual Feedback Elements**

### **On Video Stream:**

- **🟢 Green thick border + "🎯 TARGET!"** = Will fire at this mouth
- **🟠 Orange border "SPEAKING"** = Ignored
- **🟡 Cyan border "SMILING"** = Ignored
- **⚪ Gray border "CLOSED"** = Ignored
- **Red crosshair** = Current aim point
- **Blue circle** = Target zone (deadzone)
- **Distance display** = "~150cm" for WIDE_OPEN mouths

### **In Logs:**

```
🎯 AUTO TARGET: Mouth detected (confidence 0.85, distance ~120cm)
🎯 CENTERING: Adjusting R:+2.3° E:-1.5° -> R:15.3° E:28.5°
🔥 AUTO FIRE: Launching Oreo at centered target! (offset: 12px)
```

## 🔧 **Common Issues & Solutions**

### **System doesn't fire:**

- Check mouth is truly **WIDE_OPEN** (not just speaking)
- Verify mouth is detected as a **green target**
- Ensure mouth gets **centered in the blue circle**

### **Overshooting targets:**

- **Increase** `pixels_per_degree_rotation/pitch` values (larger divisor = smaller movements)
- **Decrease** `max_adjustment_degrees` for smaller steps

### **Undershooting targets:**

- **Decrease** `pixels_per_degree_rotation/pitch` values
- Check that motor gear ratios match the ESP32 firmware

### **Wrong distance estimates:**

- **Measure the actual face width** of the test subject
- **Adjust** `average_face_width_cm` accordingly
- **Verify the camera focal length** specification

### **Systematic aiming errors:**

- **Use the offset parameters** to compensate:

```python
self.rotation_offset_degrees = -2.0   # Aim 2° left
self.pitch_offset_degrees = 1.5       # Aim 1.5° higher
```

## 🎪 **Testing Tips**

1. **Start with stationary targets** - easier to tune
2. **Use consistent lighting** - improves detection
3. **Test at multiple distances** - 0.5m, 1m, 2m, 3m
4. **Mark successful positions** - note what worked
5. **Make incremental adjustments** - change one parameter at a time

## 🍪 **Advanced Features**

### **Distance-Based Trajectory:**

The system automatically adjusts pitch based on distance:

- **Closer targets** (50-100cm) = higher pitch
- **Farther targets** (200cm+) = lower pitch

### **Iterative Centering:**

If the target is not centered on the first try:

- **The system makes smaller adjustments** each iteration
- **It fires when the target enters the deadzone**
- **A max 10° adjustment per iteration** prevents overshooting

A minimal sketch of this loop is included at the end of this guide.

Your OCTv2 now has precision camera-follows-aim targeting with distance compensation! 🎯📷
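For reference, here is a minimal sketch of the iterative centering loop described under **Iterative Centering**. The constants mirror the defaults from the Calibration Parameters section; everything else (the loop structure, the `get_mouth_offset`/`move_motors`/`fire` callbacks, the settle delay) is hypothetical and not the server's actual code.

```python
import time

# Constants mirror the calibration defaults above; names below are hypothetical.
TARGET_DEADZONE_PIXELS = 30
MAX_ADJUSTMENT_DEGREES = 10
PIXELS_PER_DEGREE_ROTATION = 15
PIXELS_PER_DEGREE_PITCH = 12


def clamp(value, limit):
    """Limit a value to +/- limit degrees."""
    return max(-limit, min(limit, value))


def center_and_fire(get_mouth_offset, move_motors, fire, max_iterations=10):
    """Iteratively nudge aim until the mouth sits inside the deadzone, then fire.

    get_mouth_offset() -> (dx, dy) pixel offset from frame center, or None
    move_motors(rotation_deg, pitch_deg) -> relative aim adjustment
    fire() -> launch one Oreo
    """
    for _ in range(max_iterations):
        offset = get_mouth_offset()
        if offset is None:
            return False  # lost the target
        dx, dy = offset
        if (dx * dx + dy * dy) ** 0.5 <= TARGET_DEADZONE_PIXELS:
            fire()        # centered: launch
            return True
        # Convert pixel error to angle adjustments, clamped to 10° per step
        rotation_adj = clamp(dx / PIXELS_PER_DEGREE_ROTATION, MAX_ADJUSTMENT_DEGREES)
        pitch_adj = clamp(-dy / PIXELS_PER_DEGREE_PITCH, MAX_ADJUSTMENT_DEGREES)
        move_motors(rotation_adj, pitch_adj)
        time.sleep(0.2)   # let the mechanism settle before re-measuring
    return False
```

Because the pixel error shrinks after each move, the computed adjustments naturally get smaller on every iteration, which matches the behavior described under Iterative Centering.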