A self-propelled, maze-solving sneaker. Yeah, that seems like a good idea.
When we were looking for a way to use all the Raspberry Pis that are either gathering dust or wasting their lives as media streaming boxes in people’s homes (come on, admit it, you’ve got one), and hopefully get some people engaged in using these little marvels for the purpose they were designed for, Mr Powell came up with the idea of building maze-solving robots. That are also good at sumo wrestling. And robot football. So, nothing fancy then.
It just so happens those lovely people at the Pi Hut (www.thepihut.com) supply a CamJam robot building kit that comes with all the bits you might want (wheels, motors, sensors, driver board, battery box etc…). All it needs is a Pi released from its solitary confinement/life of drudgery streaming GoT, and some sort of chassis.
You can use the box it comes in as a chassis if you like, and there are also some interesting designs out there on www.thingiverse.com that you can 3D print. Of course we have the laser cutter and just about every hand tool back at the HackOldham workshop, so something in wood or plastic would be a doddle. But come on, we are creative, we can do better than that …
So looking around my Man Cave for some form of hollowed-out container big enough to fit a USB power source, some AAs and a Pi, with a flat bottom suitable for mounting a couple of motors, a caster and a line sensor, my eyes settled on the box the new trainers I had just bought came in. Of course the shoe box would be ridiculous, but not the old trainers I had just replaced. No. That would be perfectly sensible.
A couple of trips round the washing machine later, we are talking about a used trainer after all, the Adibot was born (instead of Run DMC I’ll quote Derek B from 1988, “We wear fresh fly Adidas/Not Nike they’re wack”).
Once my CamJam kit arrived I could start attacking the trainer with a Dremel to drill mounting holes and some conduits for the wiring. This taught me a valuable lesson – trainer soles have a tendency to want to catch fire when you drill them. Partially this is because of the loose wadding and stuff the drill picks up, but it’s also because the “rubber” is quite grippy. The motors were easy enough to mount, as were the caster and the line sensor.
The Pi, power pack and batteries were simple enough too as they sit in the bit where you’d normally find a foot. The only real challenge, so far, was mounting the ultrasonic transceiver. The CamJam kit comes complete with a small breadboard for the sensor and associated resistors, plus any additional electronic stuff you might like. This board happens to have adhesive tape on the bottom to aid mounting, so I ended up sticking it to the bottom of the tongue and then weaving the trainer’s laces around it to hide as much of the electronics as I could.
Next step was to wire all this up. Here’s a helpful hint for whenever you find yourself wiring up a shoe (like, you know, happens all the time) – lace holes are ideal for tidying up jumper wires. And that’s it, all the hardware done. Now for some software.
To make life easier I installed a VNC server on my Pi (“sudo apt-get install tightvncserver” is your friend here) so I could just remote to it from my laptop, rather than having to chase it around with a TV, mouse and keyboard. I can confirm the TightVNC client works properly on Windows 10. Using Python and the excellent materials from CamJam I was able to get the motors up and running and the line sensor working pretty much instantly.
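The drive logic behind those two motors is worth sketching. On the Pi itself the CamJam worksheets drive the motor board through the GPIO pins (gpiozero even ships a CamJamKitRobot class for this kit); here the hardware call is left out so the steering logic can be seen on its own. All the names below are mine, not from the CamJam materials.

```python
# A minimal sketch of differential-drive steering: two independently driven
# wheels, so turning is just running them at different speeds. On the real
# robot these (left, right) pairs would be fed to the motor driver board.

def drive_speeds(command):
    """Map a high-level command to (left, right) motor speeds in -1..1."""
    speeds = {
        "forward": (1.0, 1.0),
        "reverse": (-1.0, -1.0),
        "left":    (-1.0, 1.0),   # spin on the spot, anticlockwise
        "right":   (1.0, -1.0),   # spin on the spot, clockwise
        "stop":    (0.0, 0.0),
    }
    return speeds[command]
```

Spinning on the spot (wheels in opposite directions) is handy for a robot this small, since it can turn without needing any clearance in front.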
The ultrasonic sensor proved a little more challenging. It works by driving the transmitter to send out a short pulse of inaudible sound and measuring the time it takes for the echo to bounce off an object and reach the receiver. The shorter the intervening period, the nearer to an object you are. Except mine didn’t seem to be reading anything.
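For reference, the timing-to-distance maths the sensor relies on is simple: sound travels at roughly 343 m/s at room temperature, and the echo covers the distance twice (out and back), so you halve it. A sketch of the conversion, assuming an HC-SR04-style sensor where you time the echo pin going high then low:

```python
# Convert an ultrasonic echo pulse duration into a one-way distance.
# The pulse time is the round trip, so the result is halved.

SPEED_OF_SOUND = 343.0  # metres per second at ~20 degrees C

def echo_to_distance(pulse_seconds):
    """One-way distance in metres for a given echo pulse duration."""
    return pulse_seconds * SPEED_OF_SOUND / 2.0
```

So a 10 ms echo pulse means an obstacle about 1.7 m away, and anything under a millisecond or so means you are about to meet a wall.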
Placing my finger over the contacts of the receiver, thereby introducing some noise to the system and causing a random signal, showed that the Pi was picking up the received signal correctly. Sticking a DVM on the trigger pin of the sensor board and seeing the occasional spike towards 5V showed that the transmitter was being triggered. So my software, wiring and Pi config were all good.
To prove I was on the right track I made up a test jig with an Adafruit Trinket and LCD display I just happened to have lying around. This too showed no life in the sensor. Fortunately we have extra CamJam kits at the Hackspace, so I borrowed another ultrasonic transceiver and it immediately worked on the test rig. It also worked on the Pi. This is yet another good reason for being a member of a Hackspace – we’ll usually have useful stuff waiting for someone to use it, or someone else there will.
Figuring that if I had a duff one other people might too, I ordered a pack of 5 transceivers from Amazon – like most electronic components, ordering a few is usually not much more expensive than ordering one. If you think you’ve got a problem with yours, why not come to one of our open nights? We can help you test it, and we’d be happy to let you have a spare, assuming we’ve still got some (of course a small donation to the space would be appreciated).
Once everything was in place I fired up the beast. Which is when I realised the sole of my Adibot was a little bit flimsy. Switching from forward to reverse tended to make the wheels tuck in at the top and rub on the sides of the trainer. A quick mod with some offcut plywood (again, useful things are just waiting for you to use them in a Hackspace) to provide more support to the motor mounts fixed that. While I was at it I repositioned the motor battery pack to the sole between the wheels too.
A bit more Python shenanigans and I had an obstacle-avoiding autonomous shoe. Just ’cause I like a challenge, and had a Pi Camera I wasn’t using anywhere else, I decided having an Adibot view of the world was necessary. I assumed that I could just use the built-in camera preview functionality in Python and my remote VNC connection to display it. Well, you know what they say about assuming …
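The obstacle avoidance itself boils down to one rule: drive forward until the ultrasonic reading drops below a threshold, then turn until the way is clear. A sketch of that decision logic; the 20 cm threshold and the names are my own choices, not from the CamJam worksheets.

```python
# Pick the next drive command from the latest ultrasonic distance reading.
# In the real control loop this runs continuously: read the sensor,
# decide, set the motors, repeat.

TOO_CLOSE_M = 0.20  # turn away when an obstacle is within 20 cm

def next_move(distance_m):
    """Return "forward" when clear, "turn_right" when something is close."""
    return "turn_right" if distance_m < TOO_CLOSE_M else "forward"
```

Always turning the same way is crude but effective for a demo: the robot wanders, bounces off obstacles and never gets stuck facing a wall.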
Getting a remote view of the camera turned out to be the biggest pain of the entire build. I won’t go into too much detail, because I don’t like swearing and someone from the Pi Foundation might read this, but the solution I ended up using was to run an Apache server on the Pi and use PHP to stream video from the camera to a browser.
Now comes the fun bit: working out a proper maze-solving algorithm.
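One classic starting point is the left-hand rule: keep a (virtual) hand on the left wall and you will eventually find the exit of any simply connected maze. Here’s a grid-world sketch of one step of that rule – everything in it is hypothetical, and on the real robot the set of open directions would come from the line and ultrasonic sensors rather than a map.

```python
# One step of a left-hand-rule wall follower on a grid maze.
# Preference order: turn left, go straight, turn right, turn back.

HEADINGS = ["N", "E", "S", "W"]          # clockwise order
STEP = {"N": (0, -1), "E": (1, 0), "S": (0, 1), "W": (-1, 0)}

def left_of(heading):
    return HEADINGS[(HEADINGS.index(heading) - 1) % 4]

def right_of(heading):
    return HEADINGS[(HEADINGS.index(heading) + 1) % 4]

def step(pos, heading, open_dirs):
    """Advance one cell. open_dirs is the set of headings with no wall
    from the current cell; returns the new (position, heading)."""
    for h in (left_of(heading), heading, right_of(heading),
              right_of(right_of(heading))):
        if h in open_dirs:
            dx, dy = STEP[h]
            return (pos[0] + dx, pos[1] + dy), h
    return pos, heading  # boxed in on all four sides: stay put
```

Running this in a loop until the position reaches the exit cell is the whole solver – the hard part on a physical shoe is turning sensor readings into that tidy set of open directions.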