Tuesday, December 16, 2014

OpenCV Essentials Review

OpenCV Essentials provides a good overview of the sorts of applications that can be built with the 2.4.9 release of OpenCV. There are a few references to what's coming in the 3.0 release; according to the official OpenCV website, 3.0 is to be released at the end of 2014. I like the way the author has provided a fair description of, and code for, detecting different 2-D feature descriptors in OpenCV. Cascade classification is also covered, and the author goes on to describe what can be used as cascade classifiers; latent SVM is included in this description. Tracking and background subtraction using OpenCV are also covered, along with code samples.

Machine learning is also covered in this book, along with code samples. Code samples are provided for classifiers such as k-NN, random forests, and SVM.

I'm glad that steps to configure the GPU (CUDA), along with a couple of GPU-optimized OpenCV programs using NVIDIA CUDA, are also provided.

If you are looking for a decent book on OpenCV that covers the most recent stable release (2.4.9), this is a good buy.

You can purchase it here.

Sunday, December 14, 2014

Programming a simple self driving robot

I had a lot of fun in CpE 493M: Mobile Robotics this semester. We explored how to use various sensors and program them to build an autonomous robot that drives straight.

The hardware was pre-built, and all we had to do was program the robot using a MATLAB toolbox. More details on the hardware platform are provided here.

The robot has a bunch of ultrasonic sensors, an IMU, a Kinect sensor, a magnetometer, and all the sensors (wheel encoders, bump sensors) built into a programmable Roomba. Processing power is provided by an 11-inch ASUS X202E laptop with an Intel i3 processor.

We were a group of 5 students given the task of navigating the hallway of our building at WVU. We wrote a program to drive across the hallway. The hallway wasn't exactly straight, so even if we did dead reckoning, the robot would steer into a wall or drift off the path. This is due to uneven friction on the wheels and other factors (such as cracks in the floor tiles).

My primary task was to design the module that takes images from the Kinect sensor and outputs the correct heading direction (steer left or right). This data is fed to a PID controller that controls the heading of the robot. The robot has a differential-drive mechanism, so we can control the heading direction by changing the speeds of the two wheels.
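The controller loop can be sketched like this. This is only a sketch: the gains, units, and the exact differential-drive mapping are my assumptions, not the actual course code.

```python
# Sketch of a heading PID feeding a differential drive.
# Gains and the wheel-speed mapping are assumptions for illustration.

class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        """One controller step: error is the heading error (radians)."""
        self.integral += error * dt
        deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * deriv

def wheel_speeds(base_speed, correction):
    """Differential drive: steer by slowing one wheel and speeding up the other."""
    return base_speed - correction, base_speed + correction

pid = PID(kp=0.8, ki=0.0, kd=0.1)
correction = pid.update(0.1, dt=0.05)   # e.g. 0.1 rad error, 50 ms step
left, right = wheel_speeds(0.3, correction)
```

In a real loop, `update` would be called once per vision result with the heading error from the Kinect module, and the two speeds sent to the motor driver.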

This is what the Kinect camera sees:

Then, we filter the red color from the rest of the image:

Then we apply blob detection on the image, filter the blobs, and identify their locations. The following blob labels, along with their locations, are obtained:

Then we obtain a path by drawing two lines through the cups on the bottom-left and bottom-right and finding their intersection point. It's a naive technique, but it works well most of the time.
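The line-intersection step can be sketched as follows. The coordinates here are made-up pixel values for illustration, not our actual blob output.

```python
# Sketch of the path-finding step: fit a line through the left-side cup
# blobs and another through the right-side ones, intersect them, and
# steer toward the intersection. Coordinates below are made up.

def line_through(p, q):
    """Return (a, b, c) for the line a*x + b*y = c through points p and q."""
    (x1, y1), (x2, y2) = p, q
    a, b = y2 - y1, x1 - x2
    return a, b, a * x1 + b * y1

def intersection(l1, l2):
    """Intersect two lines in (a, b, c) form; None if they are parallel."""
    a1, b1, c1 = l1
    a2, b2, c2 = l2
    det = a1 * b2 - a2 * b1
    if det == 0:
        return None
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Blob centroids (pixels) from the left and right rows of cups:
left_line = line_through((40, 470), (130, 300))
right_line = line_through((600, 470), (500, 300))
target = intersection(left_line, right_line)

# Steer left or right depending on where the target sits vs. image centre:
image_centre_x = 320
steer = "left" if target[0] < image_centre_x else "right"
```

The steering decision (`left`/`right`) is what gets handed to the heading controller.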

So we do path correction every few frames (every 5 frames); the data is collected at 20-30 frames per second.

We are then able to correct the robot's path every few seconds. I'm sorry for the shaky footage. Here is a video of our robot driving:

Monday, September 8, 2014

Getting data from camera in Robot Operating System

I have used webcams in OpenCV before.

However, I came across a situation where I had to stream the webcam data to a ROS (Robot Operating System) topic.

There are a bunch of ROS packages on the ros.org website. However, when I tried installing one with apt-get, it gave me an error saying there was no such package.

All I wanted to do was grab the data from a webcam and publish it to a topic (so that other ROS nodes could read from that topic). I knew it had to be simple.

So, I came across a deb package in Ubuntu that does exactly that.

Once I did the installation, all I had to do was run the camera node.

It then started publishing data onto the /image_raw topic.

Then I could read the camera data published on the topic.

You can also get a list of the topics currently being published.

From that list, you can infer that the published image stream is of type sensor_msgs/Image.
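The actual commands in this post were embedded as images. As a hedged reconstruction of the workflow, assuming the package was uvc_camera (usb_cam behaves similarly) and using "hydro" as a stand-in for the ROS distro name:

```
# Assumptions: uvc_camera as the camera package; replace "hydro"
# with your ROS distro name.
sudo apt-get install ros-hydro-uvc-camera

# Start the camera node; it publishes frames on /image_raw
rosrun uvc_camera uvc_camera_node

# List the topics currently being published
rostopic list

# Check the message type of the stream (sensor_msgs/Image)
rostopic type /image_raw

# View the stream
rosrun image_view image_view image:=/image_raw
```

`rostopic list`, `rostopic type`, and `image_view` are standard ROS introspection tools; the package and distro names are the parts you would substitute.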

Wednesday, February 5, 2014

fun with my Raspberry Pi

I ordered a Raspberry Pi recently. I've heard so much about this device in the recent past that I decided to try it out. It reminded me of a few microprocessor (fun) and electrical engineering (not so fun) classes that I took in my undergrad. I wrote a simple program that turns a stepper motor using a motor controller connected to the Raspberry Pi.

The Raspberry Pi was only $35. If you are planning to buy one, it might cost you a little more than that if you don't already have the necessary accessories.

This is the hardware I used to run a simple Python program that turns a stepper motor with the Raspberry Pi:

  1. Raspberry Pi (duh!), Model B (Model B has 2 USB ports, HDMI, and Ethernet, so you can connect a keyboard, mouse, WiFi dongle, etc.). Model A has fewer USB ports and no Ethernet.
  2. A 4 GB or larger SD card (the Raspbian image is just over 2 GB, so it won't fit on a 2 GB card). Any brand will do; you can use the ones that go with your camera.
  3. HDMI cable to connect to a TV or monitor.
  4. Power Cable (get an external power adapter for Raspberry Pi).
  5. Screwdriver set, if you don't have one (it comes in handy if you are messing with electronics). I got this one.
  6. Jumper cables.
  7. Battery holder for powering the stepper motor.
  8. WiFi dongle. Yes, the Raspberry Pi (Model B) doesn't come with built-in WiFi, so I got this one. I didn't have any problems with drivers, etc.
  9. Stepper motor. This one comes with a tire, which will come in handy if you are into robotics.
If you add all these, you might end up spending around $70-$100.

I flashed the SD card with the Raspbian OS (it's a variant of Debian, customized to run on the ARM architecture).

I followed the instructions posted on this video on YouTube.

The code that I used below was obtained from the above video:
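The original listing was embedded as an image. Here is a minimal sketch of that kind of program, assuming a 4-wire stepper behind a ULN2003-style driver and hypothetical BCM pins 17, 18, 22, 23 (not the pins from the video):

```python
# Sketch of a half-stepping driver for a small stepper motor on a
# Raspberry Pi. Pin numbers and timing are assumptions for illustration.
import time

# Half-step sequence for a 4-wire stepper (e.g. 28BYJ-48 via ULN2003).
HALF_STEPS = [
    (1, 0, 0, 0), (1, 1, 0, 0), (0, 1, 0, 0), (0, 1, 1, 0),
    (0, 0, 1, 0), (0, 0, 1, 1), (0, 0, 0, 1), (1, 0, 0, 1),
]

def step_pattern(n_steps, reverse=False):
    """Yield coil states for n_steps half-steps, optionally in reverse."""
    seq = HALF_STEPS[::-1] if reverse else HALF_STEPS
    for i in range(n_steps):
        yield seq[i % len(seq)]

def run_motor(pins=(17, 18, 22, 23), steps=512, delay=0.002):
    """Drive the motor; only works on the Pi itself (needs RPi.GPIO)."""
    import RPi.GPIO as GPIO
    GPIO.setmode(GPIO.BCM)
    for p in pins:
        GPIO.setup(p, GPIO.OUT)
    try:
        for state in step_pattern(steps):
            for p, s in zip(pins, state):
                GPIO.output(p, s)
            time.sleep(delay)
    finally:
        GPIO.cleanup()

if __name__ == "__main__":
    try:
        run_motor()
    except ImportError:
        print("RPi.GPIO not available; run this on the Raspberry Pi.")
```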

Once you have the program, open a terminal on the Raspberry Pi and run it as root (GPIO access requires root privileges), e.g. with sudo python.
Here is a video of the working system:

Thursday, January 30, 2014

Setting up an ad-hoc network in Ubuntu 12.04 or Debian

I had a hard time finding instructions for setting up an ad-hoc network. This post explains how to set one up in one of the easiest ways.

The basic idea is to configure the ad-hoc network on both (or more) computers individually and let them talk to (or ping) each other.

Note that an ad-hoc network is different from a WiFi hotspot. Also, you need to be patient while one computer pings the other.

You will have to do the following on both (or more; replace the IP addresses accordingly) of the computers.
On the first computer (copy-paste the following text), you give the network name and other important details for the interface you are configuring ad-hoc:
Similarly, on the second computer:
Now, back in the terminal, do the following:
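The configuration itself was shown as an image in the original post. As a sketch, an /etc/network/interfaces stanza for an ad-hoc interface typically looks like the following; the interface name, ESSID, channel, and addresses here are placeholders, adapted from the Debian wiki page referenced below:

```
# /etc/network/interfaces -- ad-hoc stanza (placeholders: adjust the
# interface name, ESSID, channel, and addresses to your setup)
auto wlan0
iface wlan0 inet static
    address 10.0.0.1          # use e.g. 10.0.0.2 on the second computer
    netmask 255.255.255.0
    wireless-channel 1
    wireless-essid my-adhoc-net
    wireless-mode ad-hoc
```

After saving the file, the terminal command is presumably `sudo ifup wlan0`, which brings the interface up with this configuration.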

The ifup command is very useful; it brings up an interface using the configuration in /etc/network/interfaces.

That's all; your ad-hoc network is now set up. To be sure everything is okay, you could do the following:

Now all that is left is to ping the first computer from the second (or vice versa) to make sure everything works well.
Have some patience and wait a while until they can ping each other.
Once the network is established (at least once), further attempts to reconnect (in case of a disconnect) will be much faster.
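The verification commands were also embedded as images. Assuming the two machines were given static addresses such as 10.0.0.1 and 10.0.0.2 (placeholders), the check looks something like:

```
# On either machine: confirm the interface joined the ad-hoc cell
iwconfig wlan0            # look for Mode:Ad-Hoc and your ESSID

# From the 10.0.0.2 machine, ping the other one
ping -c 4 10.0.0.1
```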

Reference: https://wiki.debian.org/WiFi/AdHoc