Usage of the Hough Circle Transform on a robot to detect yellow balls.

3.0 hardware introduction

This project had a two-part goal: 1. understand the Hough Transform and be able to detect circles using from-scratch code, and 2. implement the Hough Transform on hardware to create a robot that can detect yellow balls. This section explains that hardware implementation in detail.

The Robot
An image of our robot with important system parts labeled.

The hardware that we used for this project was largely provided by the Rice Robotics Club, which uses tools from Vex Robotics. The robot is four-wheeled and uses Mecanum wheels, which allow it to strafe when the wheels are rotated in the proper directions (a sketch of this wheel mixing follows the figure below). To drive the wheels, the Vex Cortex onboard computer runs C code that sets each of the four motors to the proper power level. The Cortex also receives information from a gyroscope in order to determine what angle the robot is facing. Finally, a Raspberry Pi with a PiCamera runs the ball detection algorithm in Python and transmits its results to the Cortex using the UART protocol. Below we can see the block diagram for the robot's system.

Overall Hardware System
The system in its entirety. The Raspberry Pi system and Vex Cortex system are explained in detail in further sections.
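
As a concrete illustration of how the Mecanum wheels allow strafing, the sketch below shows the standard drive/strafe/rotate mixing. On the real robot this mixing runs as C code on the Vex Cortex; the Python function below, its name, and its sign conventions are a generic illustration rather than our actual Cortex code.

    def mecanum_mix(drive, strafe, rotate):
        """Mix drive, strafe, and rotate commands (each in [-1, 1]) into
        power levels for the four Mecanum wheels (FL, FR, BL, BR)."""
        front_left = drive + strafe + rotate
        front_right = drive - strafe - rotate
        back_left = drive - strafe + rotate
        back_right = drive + strafe - rotate

        # Scale down if any wheel would exceed full power.
        scale = max(abs(front_left), abs(front_right),
                    abs(back_left), abs(back_right), 1.0)
        return [p / scale for p in (front_left, front_right, back_left, back_right)]

    # Pure strafe to the right at half power: the wheels on each side spin in
    # opposite directions, so the robot slides sideways instead of turning.
    print(mecanum_mix(0.0, 0.5, 0.0))  # [0.5, -0.5, -0.5, 0.5]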

3.1 raspberry pi system

We will now dive into the inner workings of the Raspberry Pi ball detection algorithm. The overall objective of the Raspberry Pi is to use images taken by the PiCamera to determine an angle for the robot to turn to, along with a single bit indicating whether or not a ball is seen.

Pi Detection System
Image capture comes in through the PiCamera hardware. The output is always a 5-character string sent over UART to the Cortex.
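
The exact layout of this 5-character string is not spelled out here, so the sketch below simply assumes one plausible encoding (a ball-seen bit followed by a signed, zero-padded angle) and shows how the Pi side might transmit it with the pyserial library. The packet format, serial port path, and baud rate are all assumptions.

    import serial

    # Assumed port and baud rate; the real values depend on how the Pi's UART
    # pins are wired to the Vex Cortex.
    uart = serial.Serial("/dev/serial0", baudrate=115200, timeout=1)

    def send_result(ball_seen, angle_deg):
        """Pack the detection result into a 5-character string and transmit it.
        Assumed format: one ball-seen digit, then a signed three-digit angle."""
        packet = "{}{:+04d}".format(1 if ball_seen else 0, int(angle_deg))
        uart.write(packet.encode("ascii"))

    send_result(True, 45)    # transmits "1+045"
    send_result(False, -10)  # transmits "0-010"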

The first thing the Pi does is capture an image using the PiCamera. Image capture can only happen as fast as the code can run: at 360x240 pixels, the Pi can capture images at roughly 5 frames per second. Increasing the resolution decreases the frame rate, and with it how quickly the ball can be detected. However, a greater resolution also lets the Raspberry Pi algorithm detect balls that are smaller or further away. At the resolution we chose (360x240), the Pi can reliably detect a ball that is 2 or 3 feet away.
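
A minimal sketch of that capture loop is shown below, assuming the standard picamera package and its PiRGBArray helper; the warm-up delay and variable names are illustrative choices rather than our exact code.

    import time
    from picamera import PiCamera
    from picamera.array import PiRGBArray

    # Capture at the resolution discussed above; higher resolutions lower the frame rate.
    camera = PiCamera()
    camera.resolution = (360, 240)
    raw = PiRGBArray(camera, size=(360, 240))

    time.sleep(2)  # let the camera's auto-exposure settle

    for frame in camera.capture_continuous(raw, format="bgr", use_video_port=True):
        image = frame.array      # numpy array ready for OpenCV processing
        # ... yellow filtering and Hough circle detection run on `image` here ...
        raw.truncate(0)          # clear the buffer before the next capture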

Once the image is captured, we need to filter for the color yellow. This is done quite simply in the HSV (hue/saturation/value) colorspace. In good lighting, the color yellow falls in Hue values of roughly 30-60, Saturation values of roughly 80-255, and Value values of roughly 80-255. More precise thresholds can be used for more accurate results, but they depend heavily on the lighting conditions in the room. For this reason, the program needs to be re-calibrated every time it enters a new lighting condition. In the future we could add an auto-calibration sequence for the robot to determine these values, or simply preset modes for different lighting conditions.
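
A sketch of that yellow filter using OpenCV is shown below. The thresholds are the rough HSV ranges given above, and the function name is illustrative; in practice these numbers are exactly what gets re-calibrated for new lighting conditions.

    import cv2
    import numpy as np

    def yellow_mask(bgr_image):
        """Return a binary mask of pixels falling in the rough 'yellow' HSV range."""
        hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
        lower = np.array([30, 80, 80])     # approximate yellow under good lighting
        upper = np.array([60, 255, 255])
        return cv2.inRange(hsv, lower, upper)

    # Non-zero pixels in yellow_mask(image) are candidate ball pixels, which can
    # then be passed to the Hough circle detection stage.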
