This repository showcases a full-stack autonomous robotics pipeline. I trained a custom YOLOv8 model for vehicle detection and deployed it via a high-performance C++/TensorRT inference engine to power an Adaptive Cruise Control (ACC) system on NVIDIA Jetson hardware.
- End-to-End Deep Learning: From YOLOv8 training to TensorRT optimization.
- Adaptive Cruise Control (ACC): Real-world longitudinal control using a Proportional (P) controller.
- High Performance: Inference is accelerated on NVIDIA GPUs using half-precision (FP16).
- Hardware Ready: Specifically designed for the JetBot platform using ROS 2.
The system regulates the longitudinal distance to a detected lead vehicle.
The desired following distance (setpoint) is defined as 0.50 m.
A proportional controller minimizes the error between the measured distance and this setpoint.
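The control law can be sketched as follows. This is a minimal illustration only: the gain `Kp`, the speed clamp, and the function name are assumptions, not the exact values used in `Controller.py`.

```python
# Minimal P-controller sketch for longitudinal ACC.
# KP and MAX_SPEED are illustrative assumptions, not the tuned values.

SETPOINT_M = 0.50   # desired following distance (m), from the README
KP = 0.8            # proportional gain (assumed)
MAX_SPEED = 0.3     # forward speed clamp in m/s (assumed)

def acc_speed(measured_distance_m: float) -> float:
    """Map the distance error to a forward speed command."""
    error = measured_distance_m - SETPOINT_M  # positive -> too far -> speed up
    speed = KP * error
    # Clamp to the platform's safe range; never command reverse toward the lead car.
    return max(0.0, min(MAX_SPEED, speed))

print(acc_speed(0.50))  # at the setpoint -> 0.0
print(acc_speed(0.90))  # too far -> accelerates (clamped to 0.3)
```

In a real node this command would feed the motor topic each control cycle; the clamp at zero means braking below the setpoint is handled by coasting or by the emergency-stop state rather than by reversing.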
The `postproc.py` script maps 2D bounding boxes to 3D vehicle coordinates using a `PerspectiveCalibration` object.
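A common way to implement such a mapping is to project the bottom-center pixel of the bounding box (the point where the vehicle touches the road) through a ground-plane homography. The sketch below illustrates this idea; the matrix `H` and the function name are placeholders, not the actual `PerspectiveCalibration` implementation.

```python
import numpy as np

# Sketch of pixel -> ground-plane mapping. H is a placeholder homography;
# in the real pipeline it would come from the camera calibration stored
# in the PerspectiveCalibration object.
H = np.array([[0.002, 0.0,   -0.64],
              [0.0,   0.004, -0.96],
              [0.0,   0.002,  1.0]])  # illustrative values only

def bbox_to_vehicle_xy(x1: float, y1: float, x2: float, y2: float):
    """Project the bottom-center of a 2D box to metric ground coordinates."""
    u = (x1 + x2) / 2.0   # horizontal center of the box (pixels)
    v = y2                # bottom edge = contact point with the ground
    p = H @ np.array([u, v, 1.0])
    return p[0] / p[2], p[1] / p[2]   # (x_veh lateral, distance) in meters
```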
In `Controller.py`, the robot monitors the distance to the lead vehicle ($x_{veh}$) and switches between the following states:
| State | Condition | Action |
|---|---|---|
| 🔴 Emergency | Distance < 0.30 m | Full stop. Triggers the `acc_emergency` state. |
| 🟡 Following (Close) | Distance < 0.50 m | Decelerate: reduces speed to restore the 0.50 m gap. |
| 🟢 Following (Far) | Distance > 0.50 m | Accelerate: increases speed to close the gap to 0.50 m. |
| ⚪ Free Road | No car detected | Cruising: maintains `DefaultSpeedTrans`. |
If an emergency stop occurs, the robot only resumes driving once the lead car moves beyond 0.40 m (hysteresis).
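The stop/resume hysteresis can be sketched as a small state-update function. The thresholds come from this README; the state names besides `acc_emergency` and the function itself are illustrative, not the exact code in `Controller.py`.

```python
from typing import Optional

STOP_DIST = 0.30     # emergency-stop threshold (m)
RESUME_DIST = 0.40   # hysteresis: resume only beyond this distance (m)

def update_state(state: str, distance: Optional[float]) -> str:
    """Return the next ACC state; distance is None when no car is detected."""
    if state == "acc_emergency":
        # Stay stopped until the lead car clears the resume distance.
        if distance is None:
            return "free_road"
        if distance > RESUME_DIST:
            return "following"
        return "acc_emergency"
    if distance is None:
        return "free_road"
    if distance < STOP_DIST:
        return "acc_emergency"
    return "following"
```

The gap between 0.30 m (stop) and 0.40 m (resume) is what prevents the robot from oscillating between stopping and driving when the lead car sits right at the stop threshold.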
| File | Purpose |
|---|---|
| 🏗️ `training/train_yolov8.py` | Model Training: custom YOLOv8 training loop for vehicle detection. |
| 🚀 `export_yolov8ToOnnx.py` | Exports YOLOv8 weights to ONNX with End-to-End NMS enabled. |
| ⚙️ `yoloonnxTotrt.txt` | Command-line parameters for creating the FP16 engine. |
| 🧠 `inference.cpp` | C++/CUDA node that runs the model on the camera feed. |
| 📊 `inference_visu.py` | Visualization Node: overlays telemetry, boxes, and … |
| 🔍 `postproc.py` | Maps model output to vehicle coordinates (meters). |
| 🎮 `Controller.py` | Main logic node containing the ACC algorithm. |
| 🛠️ `inference_yolo.launch.yaml` | ROS 2 launch file for the inference system. |
Export the model with NMS embedded, then build the TensorRT engine on your Jetson:
```shell
python3 export_yolov8ToOnnx.py

/usr/src/tensorrt/bin/trtexec \
    --onnx=models/best.onnx \
    --saveEngine=models/best_fp16.engine \
    --fp16 \
    --optShapes=images:1x3x640x640
```
Use the provided launch file to start the system:
```shell
ros2 launch hsh_inference inference_yolo.launch.yaml
```
The Controller.py node listens for keyboard inputs to manage the robot's state in real-time.
- `a`: Toggle Automatic Mode (enables Lane Following + ACC).
- `g`: Toggle ACC (Adaptive Cruise Control) independently.
- `Space`: Emergency Stop (kills motors and switches to manual mode).
- `w`/`s`: Increase / decrease linear speed (forward/backward).
- `a`/`d` (in manual mode): Increase / decrease angular speed (left/right).
- `x`: Reset all speeds to zero.
- `c`: Capture calibration image (sends the `calib` signal).
- `3`: Save image tagged as "Blocked".
- `4`: Save image tagged as "Free".
- `5`: Save standard dataset image.
- `q`: Quit the controller node safely.
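The dispatch implied by these bindings can be sketched as a pure state-update function. The `state` dictionary layout and the `0.05` speed step are assumptions for illustration, not `Controller.py`'s actual fields.

```python
# Sketch of the key dispatch implied by the bindings above.
SPEED_STEP = 0.05  # assumed increment per keypress

def handle_key(key: str, state: dict) -> dict:
    """Return an updated copy of the controller state for one keypress."""
    state = dict(state)  # keep the update side-effect free
    if key == "a":
        state["automatic"] = not state.get("automatic", False)
    elif key == "g":
        state["acc"] = not state.get("acc", False)
    elif key == " ":  # Space: emergency stop, back to manual mode
        state.update(automatic=False, acc=False, linear=0.0, angular=0.0)
    elif key == "w":
        state["linear"] = state.get("linear", 0.0) + SPEED_STEP
    elif key == "s":
        state["linear"] = state.get("linear", 0.0) - SPEED_STEP
    elif key == "x":
        state.update(linear=0.0, angular=0.0)
    return state
```

Keeping the handler pure (it returns a new dict rather than mutating shared state) makes each binding easy to unit-test in isolation.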
Developed for the Autonomous Systems Module.
Key parameters:

- Safety constants: 0.30 m emergency-stop distance and 0.50 m target gap (`Controller.py`).
- Hysteresis: the robot resumes driving once the lead vehicle is beyond 0.40 m.
- Coordinate mapping: $x_{veh}$ is extracted from the `postproc.py` output for control.