Tutorials/CloudSim/notebook color blob


 * 1) Color Blob Detection

The previous tutorial demonstrated how to save an image snapshot from a compressed image ROS topic. Such an image may be useful to a human in tasks such as remote teleoperation; however, for autonomous robot navigation, a raw image by itself is not very useful to a robot. The robot typically needs to be aware of its environment through some form of perception. This tutorial describes how to create a simple vision-based object detection program and publish the results through ROS. It illustrates a Python program that processes an image with the OpenCV computer vision library to perform color object detection. The program runs a color filter to extract color blobs from the raw image, then publishes the centroid positions of the blobs, in image coordinates, to a ROS topic.


 * 1) Prerequisites

This tutorial carries on from the previous tutorial. It is assumed that you have completed Notebook: ROS Camcorder and have an image named `snapshot.jpg` saved in your iPython Notebook working directory.


 * 1) Task and World

We'll use the same world as the previous tutorial, see Notebook: ROS Camcorder. Start a new notebook for this tutorial.

We don't strictly need the WebGL interface for this tutorial, but feel free to launch it so that you can see what's going on in the simulated world.


 * 1) The Code

The color blob detection code is shown below. For simplicity, image processing is performed on an image read from disk instead of an image from a ROS image topic. The color blob detector applies a color filter to the raw image in the HSV color space with the given threshold, then finds blob contours and a centroid. Red rectangles are drawn over the contours, and a red circle marks the position of the centroid.
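The moment-based centroid computation at the heart of the detector can be illustrated without OpenCV or ROS. Here is a minimal sketch in plain Python; the `blob_centroid` helper and the toy `mask` are hypothetical, for illustration only. The zeroth moment m00 is the blob area, and the first moments m10 and m01 are the sums of the x and y coordinates of the blob pixels, so dividing gives the centroid:

```python
def blob_centroid(mask):
    """Centroid of a binary mask (rows of 0/1) via image moments:
    m00 = area, m10 = sum of x, m01 = sum of y."""
    m00 = m10 = m01 = 0
    for y, row in enumerate(mask):
        for x, val in enumerate(row):
            if val:
                m00 += 1
                m10 += x
                m01 += y
    if m00 == 0:
        return None  # no blob found
    return (m10 // m00, m01 // m00)

# A 2x2 blob whose top-left corner is at (1, 1):
mask = [
    [0, 0, 0, 0],
    [0, 1, 1, 0],
    [0, 1, 1, 0],
    [0, 0, 0, 0],
]
print(blob_centroid(mask))  # -> (1, 1) with integer division
```

The OpenCV code below computes the same quantities with `cv.Moments` on the thresholded image.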

To get started, copy and paste the code below into a cell in your IPython notebook.

import os
import rospy
import cv
from geometry_msgs.msg import Point

class blob_detector:
    def __init__(self):
        self.centroid_pub = rospy.Publisher("blob", Point)

    def detect(self, img_path, h, s, v, tol):
        # Load image using OpenCV
        frame = cv.LoadImage(img_path)
        cv.Smooth(frame, frame, cv.CV_BLUR, 3)

        # Threshold the image in HSV space
        imgHSV = cv.CreateImage(cv.GetSize(frame), 8, 3)
        cv.CvtColor(frame, imgHSV, cv.CV_BGR2HSV)
        img_threshold = cv.CreateImage(cv.GetSize(frame), 8, 1)
        cv.InRangeS(imgHSV, (h-tol, s-tol, v-tol), (h+tol, s+tol, v+tol), img_threshold)
        cv.Erode(img_threshold, img_threshold, None, 3)
        cv.Dilate(img_threshold, img_threshold, None, 10)

        # Find contours
        img_contour = cv.CloneImage(img_threshold)
        storage = cv.CreateMemStorage(0)
        contour = cv.FindContours(img_contour, storage, cv.CV_RETR_CCOMP, cv.CV_CHAIN_APPROX_SIMPLE)
        while contour:
            # Draw bounding rectangles over contours
            bound_rect = cv.BoundingRect(list(contour))
            contour = contour.h_next()
            pt1 = (bound_rect[0], bound_rect[1])
            pt2 = (bound_rect[0] + bound_rect[2], bound_rect[1] + bound_rect[3])
            cv.Rectangle(frame, pt1, pt2, cv.CV_RGB(255, 0, 0), 3)

        # Compute moments and find centroid
        mat = cv.GetMat(img_threshold)
        moments = cv.Moments(mat, 0)
        if moments.m00 > 1000:
            posX = int(moments.m10 / moments.m00)
            posY = int(moments.m01 / moments.m00)
            # Draw a circle at the centroid
            cv.Circle(frame, (posX, posY), 5, cv.CV_RGB(255, 0, 0), 3)
            print 'x: ' + str(posX) + ' y: ' + str(posY) + ' area: ' + str(moments.m00)
            # Publish centroid position through ROS
            point = Point()
            point.x = posX
            point.y = posY
            self.centroid_pub.publish(point)
        else:
            print "No blobs found"

        # Save the annotated image
        dest_path = os.path.split(img_path)[0] + "/blob.jpg"
        cv.SaveImage(dest_path, frame)
        return dest_path
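The per-pixel test that `cv.InRangeS` performs can be sketched in plain Python to make the thresholding step concrete. This is a hypothetical illustration, not part of the tutorial code: `in_range` and `threshold_image` are made-up helper names, and the "image" is just a small list of (H, S, V) tuples.

```python
def in_range(hsv_pixel, h, s, v, tol):
    """True if the pixel lies within +/- tol of the target color on
    every channel -- the per-pixel test used to build the binary mask."""
    ph, ps, pv = hsv_pixel
    return (h - tol <= ph <= h + tol and
            s - tol <= ps <= s + tol and
            v - tol <= pv <= v + tol)

def threshold_image(hsv_pixels, h, s, v, tol):
    """Binary mask: 1 where a pixel matches the target color, 0 elsewhere."""
    return [[1 if in_range(p, h, s, v, tol) else 0 for p in row]
            for row in hsv_pixels]

# Toy 1x2 "image": one green-ish pixel, one black pixel
img = [[(60, 230, 230), (0, 0, 0)]]
print(threshold_image(img, 60, 230, 230, 25))  # -> [[1, 0]]
```

The erode/dilate calls in the real code then clean speckle noise out of this mask before contours are extracted.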

The above code assumes that you have already initialized a ROS node in the previous tutorial. If you started a new task, opened a new notebook, or restarted the IPython kernel, then you'll need:

# Do this only if the ROS node is not yet initialized
rospy.init_node('blob')

The code above publishes the centroid in image coordinates to the `/blob` topic, so we'll subscribe to this topic and print its output using `rostopic echo`:

%%bash --bg
# Echo messages received on the /blob topic and redirect output to a file
rostopic echo /blob &> detector_output

As you can see, the output is redirected to an output file named `detector_output`. If you `cat` the file in a cell:

cat detector_output

You should see that nothing has been published, since we haven't called the color blob detector's `detect` function yet.

 WARNING: no messages received and simulated time is active. Is /clock being published?

So let's call the color blob detector's `detect` function. It is assumed that you've completed the previous tutorial and that a `snapshot.jpg` was saved in your IPython notebook working directory. We'll pass the path to the `snapshot.jpg` image file to the function, along with H, S, V values and a threshold. The example below looks for green objects, which in this world means the top bar of the starting gate in front of Atlas.

The detector saves the result to disk, which we'll load and display here. The image display code from the previous tutorial is copied below for convenience.

# Use matplotlib to display images in the notebook
import matplotlib
import matplotlib.pyplot
import matplotlib.image
import numpy as np

def show(img_path):
    """Displays a png or jpeg file from disk"""
    img = matplotlib.image.imread(img_path)
    img = np.flipud(img)
    matplotlib.pyplot.imshow(img, origin='lower')

detector = blob_detector()
img_path = "./snapshot.jpg"

# Detect a blob!
# The 2nd-4th arguments are the H, S, V values respectively,
# and the last argument is the threshold for all 3 HSV components.
blob_img_path = detector.detect(img_path, 60, 230, 230, 25)
show(blob_img_path)

Depending on the world and the pose of Atlas, the color of the gate may vary slightly, so the HSV values may need to be adjusted. Execute the above code to see if any blobs are detected; if not, try playing around with the HSV values. The ranges of the HSV values are:

- H: 0-180
- S: 0-255
- V: 0-255
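Note that with a threshold of 25, the lower or upper bound of the filter window can fall outside these ranges (e.g. H = 10 gives a lower bound of -15; out-of-range values simply never match). A small helper, hypothetical and purely illustrative, shows one way to compute the window that is effectively tested:

```python
def hsv_window(h, s, v, tol):
    """Return the (lower, upper) HSV bounds for a given tolerance,
    clamped to OpenCV's ranges: H in [0, 180], S and V in [0, 255]."""
    maxima = (180, 255, 255)
    lower = tuple(max(0, c - tol) for c in (h, s, v))
    upper = tuple(min(m, c + tol) for c, m in zip((h, s, v), maxima))
    return lower, upper

# The green target used in this tutorial:
print(hsv_window(60, 230, 230, 25))  # -> ((35, 205, 205), (85, 255, 255))
```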

An image with a successful color blob detection result is shown below:



Now execute the color blob detection script a couple more times (the last cell in your notebook) then `cat` the `detector_output` file again. You should see the coordinates printed.

 WARNING: no messages received and simulated time is active. Is /clock being published? x: 406.0 y: 114.0 z: 0.0 ---
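Each message appears in `detector_output` as plain text: `x:`, `y:` and `z:` lines (the fields of `geometry_msgs/Point`) followed by a `---` separator. If you'd rather read the coordinates programmatically than `cat` the file, a parser along these lines would work; the `parse_points` helper is a hypothetical sketch:

```python
def parse_points(text):
    """Extract (x, y) pairs from rostopic echo output of
    geometry_msgs/Point messages: x:, y:, z: lines separated by ---."""
    points = []
    current = {}
    for line in text.splitlines():
        line = line.strip()
        if line.startswith(('x:', 'y:', 'z:')):
            key, _, value = line.partition(':')
            current[key] = float(value)
        elif line == '---' and current:
            points.append((current['x'], current['y']))
            current = {}
    return points

sample = "x: 406.0\ny: 114.0\nz: 0.0\n---"
print(parse_points(sample))  # -> [(406.0, 114.0)]
```

Lines that are not message fields, such as the `WARNING` about `/clock`, are simply skipped.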

Try different HSV values and see if you are able to detect objects of a different color.

 * 1) Next

Notebook: tabletop manipulation