My Robots

I have always been fascinated by robots, and many of my projects and work relate to this field. Below are some of the robots I built during my undergraduate studies. Please see the Projects page for my research and academic projects.

Robots at IIIT Hyderabad Robotics Lab

Fall 2008 - Fall 2010

During my stay at the Robotics Research Lab of IIIT Hyderabad, I worked on various robots and sensors. Most of the time, I used the Pioneer P3DX and the newer P3AT robots. On these platforms, I implemented various vision algorithms towards a mobile robot that can interact with and assist people in home/office environments. I also worked heavily with stereo and monocular cameras and SICK laser scanners mounted on the Pioneer robots. This is a big difference from my undergraduate lab, where I had to build every robot from scratch. The significant advantage is that I get to concentrate on the cool and more interesting problems rather than rebuilding the hardware every time. This, I believe, is also fundamental to the development of the robotics community, which is why open source in robotics is the next big thing. ROS and Player-Stage-Gazebo are the primary robot infrastructure libraries we use to make the robots work.


Spring, 2008

MERP is a mobile robot platform designed and developed as a testbed for robotics learning and research. The design objective was to create a modular, easy-to-debug robot platform, at both the hardware and software levels. On this platform, I implemented various vision algorithms to develop a mobile robot that can interact with and assist people in home/office environments.
Some of the abilities of the robot are:
   1. Speech based interaction and command execution.
   2. Face detection and face/person tracking.
   3. Vision based person following.
   4. Ability to read text messages and symbols.
This platform has also been built with cheap, off-the-shelf equipment that hobbyists and home users can afford. The main components of the platform include an onboard laptop, three AVR ATmega microcontrollers, a webcam, two DC geared motors, a pan-tilt mechanism for mounting the webcam, and DC-DC converters for power management.

Details, videos


January, 2008

Indur was my entry in the international micromouse event at Techfest 2008, held every year at IIT Bombay. This was my first micromouse entry for a competition.

Motors: 2 x stepper motors
Battery: 6 x Li-ion
DC-DC converter: PTN78000W (free sample from TI)
CPU: AVR ATmega32
Motor driver: Allegro A3982 (free samples from Allegro)
IR transmitter: OPE5594 (sourced from RS Components)
IR receiver: TSL262R (same as above)
Straight speed: 1.9 m/s
Acceleration: 2.4 m/s²
Software: flood-fill algorithm, in-place turns

'Indur' means mouse in Bengali. It was developed very quickly, in just a month. I didn't get time to optimize the PID routines for perfectly straight runs, so it could not perform to its full potential. Still, Indur made it to the finals of the Techfest 2008 micromouse event and was among the very few to reach the centre of the maze. I will be adding more details, possibly a whole separate page dedicated to micromouse, once I get some good free time. Stay tuned. :)
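The core of the flood-fill approach can be sketched in a few lines: a breadth-first search from the goal cells labels every cell with its step distance to the centre, and the mouse then always moves to a neighbour with a lower label. This is a rough sketch, not Indur's actual firmware; the wall-map format here is an assumption for illustration.

```python
from collections import deque

def flood_fill(walls, size, goals):
    """Label each cell with its step distance to the nearest goal.

    walls[(x, y)] is an (assumed) set of blocked directions
    'N'/'E'/'S'/'W' for that cell; size is the maze side length;
    goals are the centre cells.
    """
    DIRS = {'N': (0, 1), 'E': (1, 0), 'S': (0, -1), 'W': (-1, 0)}
    dist = {g: 0 for g in goals}
    queue = deque(goals)
    while queue:
        cell = queue.popleft()
        for d, (dx, dy) in DIRS.items():
            if d in walls.get(cell, set()):
                continue  # wall between cell and its neighbour
            nxt = (cell[0] + dx, cell[1] + dy)
            if 0 <= nxt[0] < size and 0 <= nxt[1] < size and nxt not in dist:
                dist[nxt] = dist[cell] + 1  # one step farther than cell
                queue.append(nxt)
    return dist

# Tiny 4x4 maze with no internal walls: labels fall off from the centre.
d = flood_fill({}, 4, [(1, 1), (2, 1), (1, 2), (2, 2)])
print(d[(0, 0)])  # → 2
```

In a real run, the mouse re-floods the maze each time it discovers a new wall, so the labels always reflect the best-known route to the centre.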


Summer, 2007

iRAT is a micromouse development kit being developed at TRI Technosolutions. I worked on this project as an intern at TRI during the summer of 2007, along with Anand Ramaswami of TRI. iRAT is a very modular micromouse development platform. It has both top-down and sidewall sensors, so users can choose whichever sensor approach they like.

iRAT is based on the AVR ATmega32 RISC controller running at 16 MIPS. It has two high-torque stepper motors, which can be overdriven with the on-board current-chopper circuitry. The board also has LEDs, a buzzer, an LCD, and a UART port for easy debugging. A simulator, along with some sample code, has also been developed.


December, 2006

Apocalypse is basically a line-following robot. It also has obstacle-detection sensors, and it has won 1st prize at three robot competitions. Apocalypse clocked the fastest time of 15 s at a line-following event at Edge, Techno India, Kolkata, while the 2nd-fastest time was 26 s. Apocalypse also won the Sliding Door event (a maze-solving contest); again it was the fastest to cross the whole grid, clocking 22 s.

It uses an Atmel AVR ATmega8 microcontroller as its brain. An L293D is used as the motor driver for the two geared DC motors, and LDRs are used as sensors. The sensor board is replaceable, so other sensors, such as an IR LED + IR phototransistor pair, can be used with ease. Apocalypse is highly configurable; we configure it according to our needs, and it has been a formidable base for many events.

Video 1: Line following (YouTube)                            Video 2: Sliding Door event (YouTube)


September, 2006

Maze Solver is an autonomous robot that solves a maze with the help of an overhead camera, using image-processing principles. I built this robot for a robotics competition at IIT Guwahati's annual techfest, Techniche '06.

The objective was to build a robot that reaches the circle while avoiding the obstacles (white: unmovable, red: movable), with the help of an overhead camera, in the shortest time possible. The images from the overhead webcam are processed on a desktop computer, which then sends movement commands to the robot.

The first task is to localize the robot in the arena using the overhead camera. A simple pattern on the robot is used to track it and find its orientation. The system then needs to differentiate between traversable paths and obstacles. After that, it finds the circular blob and calculates the shortest route to it. Finally, it drives the bot along this route to its target.
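The shortest-route step boils down to a breadth-first search over the thresholded camera image, treated as an occupancy grid. A rough sketch of that idea (not the actual competition code; the grid here is a toy example):

```python
from collections import deque

def shortest_route(grid, start, goal):
    """BFS over a binary occupancy grid (0 = free, 1 = obstacle).

    Returns the list of cells from start to goal, or None if the
    goal is unreachable.
    """
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}  # each visited cell remembers its predecessor
    queue = deque([start])
    while queue:
        r, c = queue.popleft()
        if (r, c) == goal:
            path, cell = [], goal
            while cell is not None:  # walk predecessors back to start
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (r + dr, c + dc)
            if (0 <= nxt[0] < rows and 0 <= nxt[1] < cols
                    and grid[nxt[0]][nxt[1]] == 0 and nxt not in prev):
                prev[nxt] = (r, c)
                queue.append(nxt)
    return None

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
path = shortest_route(grid, (0, 0), (2, 0))
print(len(path) - 1)  # → 6 moves around the wall
```

The resulting cell sequence is then turned into turn-and-drive commands using the robot's tracked position and orientation from the pattern.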


December, 2004

MousEye is a small robot that can move around and scan the surface underneath it. The idea of building this robot came after coming across an article on a mouse scanner. Optical mice generally use tiny CCD sensors. The mouse I hacked has an ADNS-2610 chip, which has an 18x18 CCD matrix. It is programmable and gives its output serially over an SPI interface.

Though the images produced are not very encouraging, they can serve as something like a poor man's document scanner. I interfaced the ADNS-2610 with my PC's parallel port and made a simple GUI to show the scanned picture. I also attached two small DC motors to the mouse, so you can sit back in your chair and click a few buttons to move the 'MousEye' robot and scan all the parts of the document you need.

The sensor chips in optical mice, like the ADNS-2610, also give displacement readings, and can thus be used as an alternative to conventional wheel encoders.
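As a sketch of that odometry idea, the per-frame dx/dy counts read from the sensor can simply be accumulated into a position estimate. The counts-per-millimetre constant below is a placeholder, not a datasheet value; a real setup would calibrate it against a known travelled distance.

```python
def integrate_motion(deltas, counts_per_mm=16.0):
    """Accumulate raw (dx, dy) count pairs into a position in mm.

    deltas is a sequence of per-frame motion counts as a mouse sensor
    would report them; counts_per_mm is a placeholder calibration
    constant, not taken from the ADNS-2610 datasheet.
    """
    x = y = 0
    for dx, dy in deltas:
        x += dx
        y += dy
    return x / counts_per_mm, y / counts_per_mm

# Four frames of raw counts, e.g. polled over SPI from the sensor.
pos = integrate_motion([(8, 0), (8, 0), (0, 4), (0, 4)])
print(pos)  # → (1.0, 0.5)
```

Unlike wheel encoders, this measures motion of the surface directly, so it is immune to wheel slip, though it drifts if the sensor loses tracking.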