ROS-Infused Rover: Navigating the Future at Robotfest 2.0
During my sophomore year, I participated in Robofest, an annual project competition organized by GUJCOST, where I proposed and developed a ROS-based Rover under the guidance of Dr. Avani R. Vasant from BITs EDU Campus.
May 15, 2024
Category: Mobile Robotics
Purpose: Competition
Affiliation: Gujarat Technological University
Year: 2020


Robofest Phases:
Develop a Proof of Concept (PoC)
Develop a Minimum Viable Product (MVP)
Develop a Full-Fledged System
In terms of software stack, I opted for the Robot Operating System (ROS) due to my prior experience with it. ROS proved to be an ideal choice for both rapid prototyping and the creation of industry-grade software.
To tackle this complex task, I broke the entire system down into smaller, manageable subsystems as follows:
Drive System
This system allowed manual control of the rover by a human operator and could override autonomous control in emergencies. I considered using a joystick package from ROS to send velocity commands to the rover.
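The core of such a teleop node is a mapping from raw joystick axes to velocity commands. A minimal sketch of that mapping is below; the axis layout, scale factors, and deadzone are illustrative assumptions, not values from the project:

```python
def joy_to_twist(axes, max_linear=1.0, max_angular=1.5, deadzone=0.05):
    """Map normalized joystick axes (values in [-1, 1]) to a
    (linear, angular) velocity pair.

    Assumed layout: axes[1] drives forward/back, axes[0] steers.
    A small deadzone ignores stick noise around centre.
    """
    def filtered(v):
        return 0.0 if abs(v) < deadzone else v

    linear = filtered(axes[1]) * max_linear    # m/s
    angular = filtered(axes[0]) * max_angular  # rad/s
    return linear, angular
```

In a ROS node, these two numbers would be packed into a `geometry_msgs/Twist` message and published on the velocity command topic for the drive motors.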
Visual Feedback System
This system was all about seeing what the rover sees. Here, the RViz package from ROS came to the rescue. The BL170 camera provided a real-time stream over RTSP, ensuring low-latency transmission.
GPS System
Although part of the Autonomous System, it was worth explaining separately. GPS data was used for monitoring long-distance tasks and providing visualization feedback on MapViz. Additionally, it enabled us to set goals on the map, which the rover could then execute.
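To place a GPS goal on a local map, the latitude/longitude fix has to be converted into metres relative to some origin. A common short-range approach is a flat-Earth (equirectangular) approximation; the sketch below shows the idea and is not taken from the project's code:

```python
import math

EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius in metres

def gps_to_local_xy(lat, lon, origin_lat, origin_lon):
    """Convert a GPS fix to local (x, y) metres relative to an origin fix.

    Uses a flat-Earth approximation that is accurate over the short
    distances a rover covers. x points east, y points north.
    """
    origin_lat_r = math.radians(origin_lat)
    d_lat = math.radians(lat - origin_lat)
    d_lon = math.radians(lon - origin_lon)
    x = EARTH_RADIUS_M * d_lon * math.cos(origin_lat_r)
    y = EARTH_RADIUS_M * d_lat
    return x, y
```

The resulting (x, y) point can then be sent to the navigation stack as a goal in the map frame, which is essentially what clicking a goal in MapViz does.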
Autonomous System
The heart of the project, this system guided the rover to reach and execute goals. The rover featured a depth camera to avoid obstacles in its path, LiDAR for mapping and navigation, and a tracking camera for recalibrating LiDAR in case of false localization. The IMU played a crucial role in keeping the rover's motion in check.
Odometry System
This system received feedback from the motor encoders whenever velocity commands were issued by either the Autonomous System or the Drive System. Using tick odometry, we could control the motors more efficiently.
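For a differential-drive rover, tick odometry turns encoder-count deltas from each wheel into a pose update. A sketch of that computation follows; the geometry constants (ticks per revolution, wheel radius, wheel base) are placeholders, not the rover's actual values:

```python
import math

def update_odometry(pose, left_ticks, right_ticks,
                    ticks_per_rev=1024, wheel_radius=0.1, wheel_base=0.5):
    """Advance an (x, y, theta) pose estimate from wheel-encoder tick deltas.

    Standard differential-drive dead reckoning: each wheel's ticks are
    converted to distance travelled, the average moves the rover forward
    and the difference turns it.
    """
    x, y, theta = pose
    metres_per_tick = 2 * math.pi * wheel_radius / ticks_per_rev
    d_left = left_ticks * metres_per_tick
    d_right = right_ticks * metres_per_tick
    d_centre = (d_left + d_right) / 2.0
    d_theta = (d_right - d_left) / wheel_base
    # Integrate at the midpoint heading for better accuracy on arcs.
    x += d_centre * math.cos(theta + d_theta / 2.0)
    y += d_centre * math.sin(theta + d_theta / 2.0)
    theta += d_theta
    return x, y, theta
```

In ROS this estimate is typically published as an `nav_msgs/Odometry` message and fused with the IMU for a more robust pose.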
Navigation Stack
When a specific task was assigned to the rover, it would map the environment on the fly if a map wasn't already available. The costmap included an obstacle layer for dynamic obstacles and a static layer for objects expected to remain stationary. The global planner generated the overall path, while the local planner continually adjusted it over shorter time horizons.
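The layered costmap idea can be illustrated with a toy merge step: each layer assigns a cost per cell, and the layers are combined so that an obstacle in any layer blocks the cell. The grid encoding below (0 = free, 254 = lethal) is illustrative and simplified, not the project's configuration:

```python
def combine_layers(static_layer, obstacle_layer):
    """Merge two costmap layers (lists of rows of cell costs) by taking
    the per-cell maximum, so a cell marked lethal in either the static
    map or the live obstacle layer stays lethal in the master costmap.
    """
    return [
        [max(s, o) for s, o in zip(static_row, obstacle_row)]
        for static_row, obstacle_row in zip(static_layer, obstacle_layer)
    ]
```

The global planner searches this merged grid for an end-to-end path, while the local planner replans against the freshest obstacle layer over a short rolling window.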
For hardware, I explored two distinct flow diagrams, differing primarily in actuator power, sensor configurations, and the choice of single-board computers, tailored to meet the project's requirements.

Although I did not advance to the next round, the experience provided invaluable insights into designing system and software architectures for robotics systems, particularly rovers.