Announcing BatBot: an experimental AI vision robot

This robot can understand spoken language and what it is looking at. It is built from commodity parts: the lower half is an Elegoo Robot Car v3.0, and the upper half is an NVIDIA Jetson Nano. A Bluetooth Android app controls the robot using spoken English. The robot has a camera, ultrasonic sensors, and an unused 40-pin GPIO header for sensor expansion. AI software on the Nano controls the robot; high-level spoken commands like "GO FIND OBJECT" instruct it to find and photograph things. The Android app is written in Kotlin and Java using the MVVM pattern, the Jetson Nano is programmed in Python, and the Arduino robot car is programmed in C. MIT License.
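To illustrate how a spoken phrase might become a robot action, here is a minimal Python sketch of a command parser like one the Nano could run. All names and the command set here are hypothetical, not the project's actual API:

```python
def parse_command(phrase: str):
    """Map a recognized spoken phrase to a (command, argument) pair.

    Hypothetical example: the real BatBot command grammar may differ.
    """
    words = phrase.strip().upper().split()
    if words[:2] == ["GO", "FIND"] and len(words) > 2:
        # "GO FIND BALL" -> search for an object by name
        return ("FIND_OBJECT", " ".join(words[2:]))
    if words == ["STOP"]:
        return ("STOP", None)
    return ("UNKNOWN", None)


print(parse_command("go find ball"))  # -> ('FIND_OBJECT', 'BALL')
print(parse_command("stop"))          # -> ('STOP', None)
```

A parser like this would sit between the Android app's speech recognizer and the Nano's mode controller, translating free-form speech into a small fixed vocabulary of robot commands.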

The robot has five operating modes:

  • default (accept commands from the Android app and the IR remote)
  • collision avoidance (try not to hit anything; optionally search for an object by name)
  • line following (try to follow a black-tape line)
  • security monitor (detect, photograph, and report any motion)
  • map the world (try to map out everything around the robot)
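The five modes above suggest a simple state machine with one handler per mode. A minimal Python sketch, assuming hypothetical class and handler names (the real controller is surely more involved):

```python
from enum import Enum, auto


class Mode(Enum):
    DEFAULT = auto()
    COLLISION_AVOIDANCE = auto()
    LINE_FOLLOWING = auto()
    SECURITY_MONITOR = auto()
    MAP_THE_WORLD = auto()


class BatBot:
    """Toy mode controller: dispatches each tick to the active mode's handler."""

    def __init__(self):
        self.mode = Mode.DEFAULT
        self._handlers = {
            Mode.DEFAULT: lambda: "awaiting command",
            Mode.COLLISION_AVOIDANCE: lambda: "steering around obstacles",
            Mode.LINE_FOLLOWING: lambda: "tracking tape line",
            Mode.SECURITY_MONITOR: lambda: "watching for motion",
            Mode.MAP_THE_WORLD: lambda: "building map",
        }

    def set_mode(self, mode: Mode):
        self.mode = mode

    def tick(self) -> str:
        # One pass of the control loop for the current mode.
        return self._handlers[self.mode]()


bot = BatBot()
print(bot.tick())                      # -> awaiting command
bot.set_mode(Mode.LINE_FOLLOWING)
print(bot.tick())                      # -> tracking tape line
```

Keeping mode logic behind a dispatch table like this makes it easy to add a sixth mode later without touching the control loop.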