Dogbot
16 posts
Matt Heard makes a robot with his Raspberry Pi.
dogbotblog · 6 years ago
Ambling towards monocular SLAM
(https://www.flickr.com/photos/104342908@N08/35337589432) (CC BY 2.0)
A couple of years ago, I moved from New Zealand to Germany with my family. I took Dogbot with me, and now I am resuming its construction. Before I moved, Dogbot learned to dance. But I knew that to take Dogbot to the next level, I would need to do some book-learning.
Dogbot uses an Android phone as its "face" and "brain". The phone has a front-facing camera, which Dogbot will use to survey its surroundings. Dogbot will convert this video input into useful navigation data, and it will do so with visual SLAM. ("SLAM" is "Simultaneous Localisation and Mapping".)
Most SLAM techniques use special sensors designed for range-finding. For example, the self-driving car developers Waymo and Uber fought over patents on Lidar. Lidar uses laser pulses to measure the distances to objects around the sensor. With enough pulses pointed at different angles, a robot can build a 3D model of its surroundings, which it needs for navigation.
Instead of a range-finding sensor, a robot can replicate this using two cameras, like a pair of eyes. The differences between the two images help to estimate depth. This allows a robot to localise objects as if it had a range-finder.
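To make that concrete, here is a toy sketch of my own (with invented numbers, assuming a simple pinhole-camera model) of how depth falls out of the disparity between the two images:

# Toy stereo example: a feature that shifts less between the two camera
# images is farther away. Z = f * B / d (all numbers invented).
def depth_from_disparity(focal_length_px, baseline_m, disparity_px):
    return focal_length_px * baseline_m / disparity_px

# A 35-pixel shift with a 700 px focal length and a 6 cm baseline:
print(depth_from_disparity(700.0, 0.06, 35.0))  # 1.2 (metres)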
There is a more constrained technique called "monocular visual SLAM". Imagine walking around a room with one hand over one eye. You can do it, but you need to be careful. You must amble forward with sensitivity, double-checking that objects are where you expect. Objects might be closer or farther than you expect, because range-finding is difficult.
This is what Dogbot will do, because it only has one front-facing camera.
To get there, I will need to develop implementations of some complex algorithms. I will work my way through them incrementally, so that I know them well.
First, I need to understand the alpha-beta filter, the Kalman filter, and the extended Kalman filter. With extended Kalman filters, I can then build the MonoSLAM algorithm.
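As a taste of the first item on that list, here is a minimal alpha-beta filter sketch of my own (illustrative only, not Dogbot code): it tracks a single noisy value using two fixed gains.

# A minimal alpha-beta filter: estimate one value (e.g. a position) and
# its rate of change from noisy measurements, using fixed gains.
def alpha_beta_filter(measurements, dt=1.0, alpha=0.85, beta=0.005):
    x, v = measurements[0], 0.0  # initial value and velocity estimates
    estimates = [x]
    for z in measurements[1:]:
        x_predicted = x + v * dt            # predict the next value
        residual = z - x_predicted          # compare with the measurement
        x = x_predicted + alpha * residual  # correct the value estimate
        v = v + (beta / dt) * residual      # correct the velocity estimate
        estimates.append(x)
    return estimates

A Kalman filter replaces those fixed gains with gains computed from estimated uncertainties, and the extended Kalman filter generalises that to non-linear models like camera motion.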
Then, I want to learn "Bundle adjustment" and "PTAM". ("PTAM" is "Parallel Tracking and Mapping".) With PTAM and other techniques, I can then build the ORB-SLAM algorithm.
Finally, I want to learn line detection, with which I can build the PL-SLAM algorithm.
MonoSLAM, ORB-SLAM, and PL-SLAM all work, but with increasing robustness and complexity. Let's see what we learn along the way. I am no computer vision expert, but by the end of this, I should be able to sustain a conversation with one over a beer. That's my goal. 🍻
dogbotblog · 9 years ago
Dance Instructions
(https://www.flickr.com/photos/edsonhong1/5242124138) (CC BY-NC-ND 2.0)
The previous project, DanceButton, sent a single “Start dancing” command over USB to start a hard-coded dance. The current project, Choreography, allows users to design dance routines in an Android app, and then send the dance routines to the Raspberry Pi torso to perform. This project is a precursor to a future project where Dogbot will use serialised sensor data to determine how to turn its treads. We need to be able to send streams of data over USB, and then use that data to determine a sequence of tread movements.
The Choreography Android app differs from previous Dogbot Android apps in a number of ways.
This is the first Dogbot Android app to use multiple Activity screens. Eventually, interaction with Dogbot will only involve a minimal amount of touching the screen, so I don’t yet know whether multiple Activity screens will be useful. It was good to learn, regardless.
(The form for creating a new dance initially has one step: “Stop”.)
(When steps are chosen, new fields for adding more steps are added dynamically.)
CreateNewDanceActivity uses a dynamic form, where fields for adding new steps are appended to the form as they are used, allowing for dance routines of any length.
This was my first use of an Intent to send data between Activity screens. In this case, I created a String array of the dance steps in CreateNewDanceActivity when the “Create” button was pressed, and returned that data to the SelectDanceActivity, which displayed the dance’s name and steps in a scrolling list.
The app data is not persisted, so previously created dance routines are erased if the Activity is restarted. (This also happens when the screen is rotated.) The concepts I was focussing on for this project didn’t warrant persisting the dances. Short demo dance routines could easily be recreated.
(The main screen shows a list of dances. Clicking on one sends the dance routine to the torso and displays a Snackbar showing what data was sent.)
The dance routine consists of a sequence of strings: “Forward”, “Reverse”, “Left”, “Right”, and “Stop”. These coincide with the commands that can be issued to the RaspiRobotBoard robot object in Python 3 on the Raspberry Pi.
My wife tested the app for me and found an issue where a user could not easily use the “Stop” dance step. The app only adds a new dynamic step field if the last step field is not set to “Stop”, and each newly added step field defaults to “Stop”. This means that all dance routines end with “Stop”, which seemed reasonable. The “Stop” command should also be usable for pausing the treads mid-routine, though, as in “Forward, Stop, Reverse, Stop”. As it stands, if you create a routine with a “Stop” step which isn’t the last step, a new step field will not be added until that “Stop” step is changed to something else.
The workaround is to just set the “Stop” step fields to something else in order to add the next step field, and then change them back to “Stop”. For example, to create a routine of “Forward, Stop, Reverse, Stop”, you could select “Forward, Left, Reverse” and then change “Left” back to “Stop”. (The dance routine will automatically have a “Stop” step at the end.)
The changes to the Python 3 script were minimal, as the USB connection setup and read loop are identical to the DanceButton project. The main difference was that instead of using a hard-coded sequence of commands sent to the treads, I created a function for each command that can be sent to the treads and put those functions into a dictionary. The main loop waits for a dance routine to be sent by the Android app, and then parses it into a list of steps.
dance.py
...  # Set up USB connection

treads = raspirobotboard.RaspiRobot()

def forward(treads, period):
    treads.forward(period)

def reverse(treads, period):
    treads.reverse(period)

def left(treads, period):
    treads.left(period)

def right(treads, period):
    treads.right(period)

def stop(treads, period):
    treads.stop()
    time.sleep(period)

move = {
    "forward": forward,
    "reverse": reverse,
    "left": left,
    "right": right,
    "stop": stop
}

def perform_dance():
    max_number_of_bytes = 1024
    for i in range(5):
        data = endpoint_in.read(max_number_of_bytes, timeout=0)
        dance = bytes(data).decode()
        period = 0.5  # seconds
        stop(treads, period * 2)
        for j in range(4):
            # steps() is assumed to split the routine string into step
            # names; its definition is in the setup elided above
            for step in steps(dance):
                move[step](treads, period)

if __name__ == "__main__":
    try:
        perform_dance()
    finally:
        GPIO.cleanup()
Each step is performed by taking the string (e.g. “forward”) and using it as a key for the dictionary of tread functions. The function returned from the dictionary can then be called and cause the treads to move the intended way.
Finally, until now, Dogbot has been shackled down by a number of cables. In order to start the Python scripts, I connect to the Raspberry Pi via SSH and start the script from there. This requires the Raspberry Pi to be connected to the local area network by Ethernet.
A few months ago, I learned that I can configure udev to run a program when a USB device is connected. With a udev rule in place, when the Motorola G2 phone is plugged into the Raspberry Pi, the phone’s vendor ID and product ID match the attributes in the rule, which causes udev to run the given command. I set up a rule that matches the phone and runs the dance.py script for Choreography.
/etc/udev/rules.d/motorola-g2.rules
SUBSYSTEMS=="usb", ATTRS{idVendor}=="22b8", ATTRS{idProduct}=="2e76", ENV{DEVTYPE}=="usb_device", RUN+="/home/pi/dev/Dogbot/raspberry_pi/dance/choreography/dance.py"
Now, I can turn on my Raspberry Pi and plug in my Android phone, and the dance.py script will run, the USB accessory connection will be established, and the Choreography Android app will automatically open on the phone. All without a remote connection.
Now I can run the app without priming it over SSH. One less cable means that Dogbot is one step closer to being unshackled.
dogbotblog · 9 years ago
The Dance Button
(https://www.flickr.com/photos/deadling/256390622) (CC BY 2.0)
As a fun proof-of-concept, I made Dogbot dance when I was testing the tread motors, but I had to connect to the Raspberry Pi via SSH and start the Python script before disconnecting everything and placing Dogbot down.
Now that I have USB communication, I have created a DanceButton Android app, which opens when the face (the Android phone) is plugged into the torso (the Raspberry Pi robot platform). The app shows a button which, when pressed, tells the torso to wiggle and dance.
The structure of the Android app is nearly identical to the MessageSender app, but it uses a floating action button (fab) to send a single byte of 0x01 to the torso, which starts a short sequence of dance moves when it receives that byte.
Instead of using an EditText component for the user input, there is a FloatingActionButton component with a play icon (“▶”).
<?xml version="1.0" encoding="utf-8"?>
<android.support.design.widget.CoordinatorLayout
    ...
    tools:context="nz.co.deuteriumlabs.dancebutton.DanceActivity">
    ...
    <android.support.design.widget.FloatingActionButton
        ...
        app:srcCompat="@drawable/ic_play_arrow" />
</android.support.design.widget.CoordinatorLayout>
DanceButtonActivity is the main Activity in the app, and its onCreate method calls initFab, which initialises the floating action button.
// DanceButtonActivity
private void initFab() {
    FloatingActionButton fab = findFab();
    fab.setOnClickListener(new View.OnClickListener() {
        @Override
        public void onClick(View view) {
            dance();
            makeSnackbar(view);
        }
    });
}
This is very much like setting up the EditText listener in the MessageSender app. Clicking the fab will call dance and makeSnackbar. dance sends a message to the Torso instance, and makeSnackbar creates a Snackbar (a little, temporary on-screen message) that says “Dance command sent”.
Like MessageSender#sendMessage does with Torso#sendMessage, DanceButtonActivity#dance simply delegates the dance command to Torso#dance.
// Torso
void dance() {
    byte[] data = { START_DANCING };
    sendData(data);
}
START_DANCING is a constant with the value 0x01.
That’s all that is different in the Android app. In the Python script, instead of printing incoming messages, we’re starting the dance when we receive a 0x01 byte.
# dance.py
treads = raspirobotboard.RaspiRobot()
try:
    for x in range(0, 3):
        received_byte = endpoint_in.read(1, timeout=0).tolist()[0]
        if received_byte == 0x01:
            print("LET'S DANCE")
            placement_delay = 3  # seconds
            time.sleep(placement_delay)  # Wait for Dogbot to be placed
            dance(treads)
finally:
    GPIO.cleanup()
endpoint_in.read returns a Python array, so we’re turning it into a list and then getting the first item. (This is not optimised for performance.)
The for x in range(0, 3) loop means that the program will do the dance three times before quitting and cleaning up the GPIO ports.
The dance itself is very simple, but it could be as elaborate as you want.
# dance.py
def dance(treads):
    treads.left(0.3)
    treads.right(0.3)
    treads.left(0.3)
    treads.right(0.3)
    time.sleep(0.6)
    treads.left(0.3)
    treads.right(0.3)
    treads.left(0.3)
    treads.right(0.3)
    time.sleep(0.6)
    treads.left(0.6)
    treads.right(0.6)
    treads.left(0.6)
    treads.right(0.6)
The final step was bundling all the cables together. The USB cables that connect the Raspberry Pi and the Android phone to the USB hub are quite long, but luckily the hub itself is small enough to embed in the top of the robot and the USB cables are quite compact when taped up. In the end, the only thing tethering Dogbot is the power cable for the USB hub.
[video]
dogbotblog · 9 years ago
Back and Forth
(https://www.flickr.com/photos/keoni101/5145003933) (CC BY 2.0)
To prove that data can be sent across the USB connection in each direction, I’m going to write a pair of Python 3 scripts and a pair of Android apps. One Python script will send text messages to a corresponding Android app which will display the messages on the Android device screen. The other Android app will accept user text and send it to a corresponding Python script which will print the messages on the screen of the computer I am using to remotely connect to the Pi.
The Python script in another post covers most of what happens on the Raspberry Pi, but I wanted to explain some of the structure behind the MessageSender Android app, which shows a text field to the app user, and then sends the contents of the field over USB to the Pi when the app user hits “Send” on their on-screen keyboard.
Here are the relevant files in the app:
.
└── app/src/main
    ├── AndroidManifest.xml
    ├── java/nz/co/deuteriumlabs/messagesender
    │   ├── Torso.java
    │   └── MessageSenderActivity.java
    └── res
        ├── layout
        │   └── activity_message_sender.xml
        └── xml
            └── accessory_filter.xml
The AndroidManifest.xml file includes information which tells the Android phone about the app. I’ve configured the accessory Intent so that the phone knows to open this app when the Raspberry Pi tells the phone to switch to accessory mode.
<?xml version="1.0" encoding="utf-8"?>
<manifest ...>
    <uses-feature android:name="android.hardware.usb.accessory" />
    <application ...>
        <activity android:name="nz.co.deuteriumlabs.messagesender.MessageSenderActivity">
            <intent-filter>
                <!-- Start this Activity when opening from app menu -->
                <action android:name="android.intent.action.MAIN" />
                <category android:name="android.intent.category.LAUNCHER" />
            </intent-filter>
            <intent-filter>
                <!-- Start this Activity when an accessory is attached -->
                <action android:name="android.hardware.usb.action.USB_ACCESSORY_ATTACHED" />
            </intent-filter>
            <!-- Find the associated accessory details in the 'accessory_filter' XML file -->
            <meta-data
                android:name="android.hardware.usb.action.USB_ACCESSORY_ATTACHED"
                android:resource="@xml/accessory_filter" />
        </activity>
    </application>
</manifest>
The accessory_filter.xml file includes information about what USB accessories can trigger this app to open. We’re only interested in the Dogbot Python script right now. These values match the identifying information that the Python script sends.
<?xml version="1.0" encoding="utf-8"?>
<resources>
    <usb-accessory
        model="DogbotTorso"
        manufacturer="Deuterium Labs" />
</resources>
The activity_message_sender.xml file describes the layout of the main Activity of the Android app. The main component is the EditText text input. We don’t need to add a “submit” button because we can configure the keyboard to show a “Send” button instead of inserting a new line.
<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout
    ...
    tools:context="nz.co.deuteriumlabs.messagesender.MessageSenderActivity">
    <EditText
        ...
        android:id="@+id/editText"
        android:imeOptions="actionSend"
        android:inputType="text" />
</RelativeLayout>
There are two Java classes: Torso and MessageSenderActivity. MessageSenderActivity is the Android Activity which handles what the user sees and interacts with. When the MessageSenderActivity starts, it sets up the user interface and starts the Torso. The Torso is responsible for handling the communication with the Raspberry Pi. When the MessageSenderActivity receives input from the text field, it forwards its contents to the Torso which, in turn, encodes and sends the message via USB.
MessageSenderActivity has the following structure:
public class MessageSenderActivity extends Activity {

    private Torso torso;
    private EditText editText;

    @Override
    protected void onCreate(Bundle savedInstanceState) { ... }

    private void initEditText() { ... }

    private void sendMessage(final String message) { ... }

    @Override
    protected void onNewIntent(Intent intent) { ... }

    @Override
    protected void onDestroy() { ... }
}
Torso has the following structure:
class Torso {

    private final Context context;
    private final UsbManager usbManager;
    private ParcelFileDescriptor parcelFileDescriptor = null;
    private FileOutputStream outputStream = null;

    void sendMessage(final String message) { ... }

    private void sendData(final byte[] data) { ... }

    Torso(Context applicationContext) { ... }

    private UsbManager initUsbManager() { ... }

    void onNewIntent() { ... }

    void onDestroy() { ... }
}
When the app starts, Android uses the AndroidManifest.xml file to figure out which Activity to start. In our case, Android creates a MessageSenderActivity instance and calls its onCreate method.
// MessageSenderActivity
@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    final Intent intent = getIntent();
    onNewIntent(intent);
    setContentView(R.layout.activity_message_sender);
    initEditText();
}
MessageSenderActivity#onCreate gets the accessory Intent and calls onNewIntent which sets up the Torso. It then loads the layout with the EditText component, and initialises it.
// MessageSenderActivity
@Override
protected void onNewIntent(Intent intent) {
    if (torso == null) {
        torso = new Torso(getApplicationContext());
    }
    torso.onNewIntent();
    super.onNewIntent(intent);
}
MessageSenderActivity#onNewIntent creates a new Torso instance if there isn’t one, and calls its onNewIntent method too, which sets up the USB connection. The app doesn’t really care about what’s in the Intent, so I don’t bother passing it to the Torso.
// Torso
void onNewIntent() {
    if (usbManager.getAccessoryList() != null) {
        UsbAccessory accessory = usbManager.getAccessoryList()[0];
        parcelFileDescriptor = usbManager.openAccessory(accessory);
        FileDescriptor fileDescriptor = parcelFileDescriptor.getFileDescriptor();
        outputStream = new FileOutputStream(fileDescriptor);
    }
}
Torso#onNewIntent gets the first (and only) UsbAccessory and saves a reference to its FileOutputStream.
// MessageSenderActivity
private void initEditText() {
    editText = (EditText) findViewById(R.id.editText);
    editText.setOnEditorActionListener(new TextView.OnEditorActionListener() {
        @Override
        public boolean onEditorAction(TextView v, int actionId, KeyEvent event) {
            boolean isHandled = false;
            if (actionId == EditorInfo.IME_ACTION_SEND) {
                final String message = editText.getText().toString();
                sendMessage(message);
                isHandled = true;
            }
            return isHandled;
        }
    });
}
Now, if you look carefully at the activity_message_sender.xml layout, you might see that the EditText component has an attribute of android:imeOptions="actionSend". This tells the on-screen keyboard to replace the “↵” character used for starting a new line with a button that says “SEND”.
MessageSenderActivity#initEditText, which is called by onCreate at the beginning, sets up an OnEditorActionListener which listens for events from the keyboard; if any of them match that “SEND” button, it gets the contents of the EditText component and calls sendMessage.
// MessageSenderActivity
private void sendMessage(final String message) {
    torso.sendMessage(message);
}
MessageSenderActivity#sendMessage simply delegates the task of actually sending the message to Torso by calling Torso#sendMessage.
// Torso
void sendMessage(final String message) {
    byte[] data = {};
    try {
        data = message.getBytes("UTF-8");
    } catch (UnsupportedEncodingException e) {
        e.printStackTrace();
    }
    sendData(data);
}
Torso#sendMessage encodes the message and calls sendData.
// Torso
private void sendData(final byte[] data) {
    if (outputStream != null) {
        try {
            outputStream.write(data);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
Torso#sendData takes the bytes and writes them to the outputStream we got earlier from the UsbAccessory. This is what sends the data across the USB cable.
This all gets received by the Python script, which includes a loop as follows:
# receive_messages.py
message = ""
while message != "quit":
    received_bytes = endpoint_in.read(1024, timeout=0)
    byte_string = bytes(received_bytes)
    message = byte_string.decode()
    print(message)
endpoint_in.read waits until something comes across the wire, and then the received bytes get decoded into a message, which is printed to the screen.
Magic!
dogbotblog · 9 years ago
New Starts
In August, my wife and I had a second child, whom we named Arthur. The months leading up to his birth were busy with preparation, so I spent little time thinking about Dogbot.
I planned to take a few weeks of leave after the birth, and my wife suggested that I use that opportunity to resume work with Dogbot. We found a $25 powered USB hub online to replace the one I fried. While I waited for it to arrive, I learned more about OpenCV, which I will be using for processing the images from the Android cameras.
My wife and I also decided to get some new phones. She had an old iPhone and my Samsung Galaxy S4 was aging poorly. They both were cracked. We ordered a couple of Motorola Moto G2 phones, which were cheaper but more recent than our previous phones.
The Moto G2 will replace the Galaxy S4 as the “face” for Dogbot, and the new USB hub will be clearly labeled so that the wires don’t get swapped. Hopefully my hardware blockers will be solved.
Things are looking bright again.
dogbotblog · 9 years ago
USB Problems
My favourite rendition of Sisyphus. (https://commons.wikimedia.org/wiki/File:Midevil_sysiphus.jpeg, public domain)
I don’t want to dwell on this for too long, but I think it’s important to document the failures just as much as the successes.
This project has moved a lot slower than I had hoped, partially because of the hardware issues I have encountered along the way. In fact, this project had completely stalled for about 6 months this year.
First, I was struggling to figure out what was going wrong when I was trying to send accessory mode control requests to my Samsung Android phone. In my previous post, I outlined how the Python script sends various requests to the USB device in order to put it into accessory mode. With my Samsung phone, the commands would switch it into accessory mode the first time, but not on subsequent attempts. This frustrated my attempts to test and automate my Python commands, because sometimes it just didn’t work and I couldn’t really predict when. Sometimes disconnecting the USB plug fixed it. Sometimes it didn’t. Sometimes restarting the phone helped. Sometimes it didn’t. It easily got to the point where I would set aside two hours to work on Dogbot, but would end up spending the whole time failing to find the cause of this issue.
Night after night and week after week of no progress wore away at me.
Second, I made a stupid mistake and fried an $80 USB hub which I needed to connect the Android phone to the Raspberry Pi.
The Raspberry Pi has several USB ports, which is great for plugging in USB flash drives and USB WiFi dongles, but plugging my Android phone directly into the Raspberry Pi was not going to work. As far as I have been able to figure out from the USB specifications, the Android phone needs to be drawing power (charging) over the USB cable while maintaining a data connection. The Android phone draws a lot of power and, similar to the tread motors, will starve the Raspberry Pi. To solve this problem, I decided to get a powered USB hub. The hub connects to the Raspberry Pi and the Android phone connects to the hub, so the data connection can pass through the hub from Pi to Android; but because the USB hub draws its power from a wall socket, it is the hub that charges the Android phone rather than the Raspberry Pi. I suspect that it might be possible to charge the Raspberry Pi via its USB connection, but I need to do more research before just trying it.
One day I was moving my home WiFi router and, when plugging it back in, I grabbed the cable for my USB hub instead of the one for the router. The plugs looked identical, but the two power supplies delivered different amounts of power. The router was being weird and would disconnect occasionally, but worked well enough for me not to investigate the power cable. On the other hand, when I plugged my USB hub in with the WiFi router’s cable, the power LED blinked as if it was in pain. I pulled out the cable and smelled the hub. A bit “burny”. (Is that the term electricians use?) I checked the cables and recognised my blunder. I swapped them back and quit for the day, not realising it was dead.
Even dumber than that: I forgot that I fried the USB hub. I came back to the project later and spent hours trying to figure out why the plugged-in phone was charging but not connecting to the Pi.
Once I figured out that I couldn’t make any progress without a new USB hub, I struggled to find any motivation for Dogbot. I had ruined an $80 USB hub and a $60 Raspberry Pi before that. Most of the cost of this project was the result of dumb errors.
I don’t know how much time I wasted trying to figure out my hardware issues, but I lost all motivation.
dogbotblog · 9 years ago
Accessorising Android
(https://www.flickr.com/photos/blackarthur/3206236437) (CC BY-NC-ND 2.0)
A feature that distinguishes Dogbot from a lot of other Raspberry Pi robots is its use of an Android phone for sensing its environment. The Android smartphone connects to the Pi via USB, so that information that the Android phone collects can be sent to the Pi as input for navigating around obstacles and towards waypoints.
Android devices have a USB Accessory mode, which allows you to connect a device (called an “accessory”) to an Android phone and send messages saying “I am a speaker dock accessory. Open your speaker dock app.” The Android app associated with that accessory will automatically open and can start communicating with the accessory device. In my case, the Pi will be the “accessory” and will automatically open a Dogbot app on the phone.
To use the Android USB Accessory mode, a few things are needed.
First, the Android app needs to be associated with the accessory. There is an accessory.xml configuration file where you include the manufacturer, model, and version number of the accessory that can trigger this app to open.
Second, the Android app needs to respond to accessory intents. When the accessory connects to the phone, it will send a message identifying itself. That message includes the manufacturer, model, and version number. The phone will recognise this message as an attempt by an accessory trying to connect. The phone then looks for an app which is configured to use that accessory.
Third, we need a separate Java thread to handle the communication with the accessory. Using an “accessory thread” will allow the app to wait for more data to arrive over the USB connection without causing the “UI thread” to become unresponsive.
On the Raspberry Pi, we will be sending the USB data using the pyusb Python 3 module. This allows us to use Python to access libusb, the C library that sends the actual messages through the Raspberry Pi’s USB adapter.
Python has an interactive shell which can be used as a sandbox for trying out things, and it turned out to be really useful when figuring out how to send messages with pyusb.
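For example, a first sandbox session might do nothing more than enumerate the connected devices to find the phone’s vendor and product IDs (a sketch; the IDs you see depend on your hardware):

# List the vendor and product IDs of every connected USB device.
import usb.core

for device in usb.core.find(find_all=True):
    print(hex(device.idVendor), hex(device.idProduct))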
Sending data from one device to another requires the connection to be established, with one side listening for data while the other sends.
On the Pi, find the connected USB interface that represents the Android phone.
On the Pi, get the outbound endpoint for the USB interface.
On the Pi, send the accessory’s identifying information from the Pi to that endpoint. When the Android phone receives these messages, it will create the USB accessory intent.
On the Android phone, allow the USB accessory intent to open the right app on the Android phone. Permission needs to be granted, but this can be remembered for future intents for this accessory.
In the Android app, use the UsbManager to find the UsbAccessory and get the input stream associated with the UsbAccessory.
In the Android app, listen for incoming data on the input stream.
On the Pi, send a few bytes to the USB device.
In the Android app, do something with the incoming data.
The steps are similar for sending messages the other way but involves using the opposite endpoints and data streams. The end goal for this is to send sensor data collected from the Android device over the USB connection to the Pi.
Here is a demo of what the Python script looks like:
#!/usr/bin/env python3
import time
import usb

# Find the USB device representing the phone
SAMSUNG_VENDOR_ID = 0x04e8  # USB device vendor
face = usb.core.find(idVendor=SAMSUNG_VENDOR_ID)

# Prepare the accessory's identifying information
MANUFACTURER = "Deuterium Labs"
MODEL_NAME = "DogbotTorso"
DESCRIPTION = "The torso of Dogbot"
VERSION = "0.1"
URL = "http://mattheard.net"
SERIAL_NUMBER = "1"

# Find out if the phone supports "accessory mode".
# We know it does, so continue regardless of the response.
ctrl_type_vendor = usb.util.CTRL_TYPE_VENDOR
ctrl_in = usb.util.CTRL_IN
ctrl_out = usb.util.CTRL_OUT
GET_PROTOCOL = 51
face.ctrl_transfer(ctrl_type_vendor | ctrl_in, GET_PROTOCOL, 0, 0, 2)

# Identify the accessory
ID_INFO = 52
face.ctrl_transfer(ctrl_type_vendor | ctrl_out, ID_INFO, 0, 0, MANUFACTURER)
face.ctrl_transfer(ctrl_type_vendor | ctrl_out, ID_INFO, 0, 1, MODEL_NAME)
face.ctrl_transfer(ctrl_type_vendor | ctrl_out, ID_INFO, 0, 2, DESCRIPTION)
face.ctrl_transfer(ctrl_type_vendor | ctrl_out, ID_INFO, 0, 3, VERSION)
face.ctrl_transfer(ctrl_type_vendor | ctrl_out, ID_INFO, 0, 4, URL)
face.ctrl_transfer(ctrl_type_vendor | ctrl_out, ID_INFO, 0, 5, SERIAL_NUMBER)

# Put the Android phone into "accessory mode".
START_ACCESSORY_MODE = 53
face.ctrl_transfer(ctrl_type_vendor | ctrl_out, START_ACCESSORY_MODE, 0, 0, None)

# Wait for device to change into accessory mode
time.sleep(1)

# Find the new accessory-enabled USB device
GOOGLE_VENDOR_ID = 0x18d1  # AOA (Android Open Accessory) vendor
face = usb.core.find(idVendor=GOOGLE_VENDOR_ID)

# Get the outbound endpoint
configuration = face.get_active_configuration()
interface_number = configuration[(0, 0)].bInterfaceNumber
interface = usb.util.find_descriptor(
    configuration,
    bInterfaceNumber=interface_number)
endpoint_out = usb.util.find_descriptor(
    interface,
    custom_match=lambda e:
        usb.util.endpoint_direction(e.bEndpointAddress) ==
        usb.util.ENDPOINT_OUT)

# Send a message
message = "hello android"
data = message.encode("UTF-8")
endpoint_out.write(data, timeout=0)
I couldn’t have figured out much of this without the guidance of this tutorial by Manuel Di Cerbo. I tried a few times to do exactly what the tutorial was attempting but was stumped by various problems. I ended up using what I learned from my attempts to allow me to write my own test apps from scratch.
dogbotblog · 9 years ago
Spinning the treads
Now that the components for the RaspiRobotBoard had been soldered together and the treads and chassis had been constructed, the Raspberry Pi could be mounted and made into a functional robot.
I got a handful of PCB spacers/supports, washers, and screws from Jaycar, and these all made mounting the Raspberry Pi onto the chassis straightforward. The RaspiRobotBoard connects to the Raspberry Pi via the GPIO pins, and I haven’t supported it in any other way, so every time I touch the RaspiRobotBoard, it moves around. It’s a horrible feeling, especially when I have to apply a little pressure to plug the battery pack plug into the socket on the RaspiRobotBoard. Maybe there’s an easy way to mount the RaspiRobotBoard to the Raspberry Pi, creating a double-decker of PCBs.
Once everything was mounted and connected, my wife and I bought a box of Lego to create a torso for Dogbot. We built the Lego torso so that there were holes in the sides for plugging cables and gave him some Lego eyes too. There's even a slot on top for cradling the Android phone that will eventually connect to the Raspberry Pi.
There was no obvious way to attach the Lego torso to the chassis. I didn't want to glue any pieces down, so we ended up using thread to tie several Lego pieces with holes to the mounting holes in the chassis. It's a little loose, but stable.
The RaspiRobotBoard wiki clearly explains that the Raspberry Pi and RaspiRobotBoard share a power source, and that you should either power them via USB on the Raspberry Pi or by battery pack on the RaspiRobotBoard, but never both. I can’t remember how long it took me to figure it out, but only the battery pack supplies enough energy to power the Raspberry Pi, RaspiRobotBoard, and the tread motors. If the power came via USB, the Raspberry Pi and RaspiRobotBoard would work fine until I told the Raspberry Pi to turn on the motors. As soon as the motors turned on, they would drain so much of the power from the USB that there wouldn’t be enough left for the Raspberry Pi to keep operating. For example, I would send a command to run one motor for one second, but instead the motor would start spinning and just keep spinning, and the Raspberry Pi’s SSH session would become unresponsive.
Running the electronics off the battery pack meant that the motors would work without causing the Raspberry Pi to die, but because I would often forget that it was running on batteries, the power output would slowly dip and the Raspberry Pi would again start to become unresponsive. It was a source of weeks of frustrating thoughts of “Why is this no longer working? It was working perfectly 5 minutes ago!” Dying batteries. That’s why. Also, changing the batteries involves detaching the top part of the tank chassis, which is tricky with the Raspberry Pi mounted on top.
The RaspiRobotBoard’s creator, Simon Monk, created a Python 2 module appropriately called raspirobotboard, which abstracts away the GPIO commands and gives you a nice robot object to which you can send “forward” and “reverse” commands, for example.
#!/usr/bin/python
import RPi.GPIO as GPIO
import raspirobotboard

if __name__ == "__main__":
    try:
        robot = raspirobotboard.RaspiRobot()
        robot.left(0.5)   # Spin left for half a second
        robot.right(0.5)  # Spin right for half a second
        robot.left(0.5)   # Spin left
        robot.right(0.5)  # Spin right
    finally:
        GPIO.cleanup()    # Reset all GPIO ports to INPUT
With some calls to time.sleep and a varying arrangement of raspirobotboard commands, Dogbot can dance.
[video]
dogbotblog · 9 years ago
Soldering 101
(https://www.flickr.com/photos/hartprod/10594669696) (CC BY-NC-ND 2.0)
After constructing the tank chassis for Dogbot, I needed to solder together the RaspiRobotBoard which I am using to control the motors. The Raspberry Pi itself has enough extra juice to power small electronics like LEDs with the GPIO pins, but if I plugged the tank motors directly into the Raspberry Pi, there wouldn’t be enough power to go around, meaning that the Pi might not boot up or could even be damaged.
(As an embarrassing side note: I realised that I have been calling the GPIO pins “RPIO pins” up until now. I went back and fixed it in my previous blog posts but I think it was a pretty simple mistake. “GPIO” means “general purpose input/output”, but I was thinking “Raspberry Pi input/output”.)
The RaspiRobotBoard is a printed circuit board (PCB) in a kit with all of the little pieces that need to be soldered on to make it work. I have a version 1 RaspiRobotBoard, which has now been retired and replaced with a version 2 which comes fully assembled. As stressful as soldering together my first PCB kit might be, it is definitely more fun than just plugging in a pre-assembled version.
First, I need some space to work. All of my previous Dogbot electronics work has been done in my warm, well-lit living room, but I don’t feel happy about soldering at my coffee table and having soldering fumes leave weird smells on everything my daughter plays with. We have a garage which we don’t park our car in, which is perfect. I just needed to quickly tidy up the corner with the workbench/desk.
[photos]
Good enough.
To build the RaspiRobotBoard, I followed the instructions in Simon Monk’s wiki on GitHub. To begin, I put together my soldering iron, which I bought from Jaycar when I got my breadboard and wires for playing with the LEDs.
I started soldering the GPIO socket onto the PCB and got faster with each joint. I watched a couple of videos on good soldering technique and resoldered the joints on the GPIO socket pins until I felt like they looked good.
[photo]
It took me a long time to get those pins soldered on. I kept melting the solder on adjacent pins, causing the solder to bridge the two pins. Not good. I didn’t have any solder wick, which is apparently a great tool for removing solder, and I didn’t have any flux, which apparently makes it a lot easier to manage the solder. By the last pin, though, I felt like my technique was sufficiently accurate and quick enough for me to move on to the smaller pieces.
I soldered on the resistors and noticed that my kit had 4 × 270Ω and 3 × 1kΩ resistors, instead of 3 × 270Ω and 4 × 1kΩ. This meant an extra trip to Jaycar in Lower Hutt to get the resistor I needed. They didn’t have any 4-band 1K0 resistors, like in my kit. Only 5-band 1K0 (1.0kΩ) resistors. I wasn’t sure that I had the right ones, so I spent 5 minutes in the store looking at resistor colour charts on my phone while Evelyn tried to make a mess. The resistors were all in small boxes in knee-level shelves, sorted by film and then by resistance, like an old card catalog, and Evelyn was having fun pulling the boxes out.
I got home with the resistors and when Evelyn went down for a nap, I went back to soldering on my resistors. I had been trying to solder the resistors onto the PCB from the top, but when I returned from Jaycar, I decided to redo the resistors I had already finished and solder them from the bottom. I was able to get much better joints from below, creating that ideal “volcano” shape. I then trimmed the leads, leaving tidy cones at the joints.
[photo]
The LEDs and small capacitor were done like the resistors, with no difficulty. The bigger pieces like the integrated circuits (ICs) and screw terminals were more difficult: not because they required careful positioning to keep straight, but because the tip on my soldering iron was becoming more and more oxidised and I didn’t have anything good to clean it with.
[photo]
By the time I got to the last few pieces, I was getting really frustrated with how long it was taking for the tip of my iron to melt the solder. The risk of burning the ICs or other components was worrying me, but my soldering iron came with a spare tip, so I switched them when the iron was cool and finished getting everything on. The tip with the black coating is the used, oxidised tip and the shiny tip is the unused tip.
[photo]
Once everything was soldered together, I took a break to admire my work.
[photos]
I showed my wife and she seemed impressed. It had been a stressful day soldering it all together, and it was the biggest soldering project I had ever done.
The next step is to get the motors moving.
dogbotblog · 10 years ago
Building a Tank
Since I last wrote about Dogbot, I have worked on a few things:
Learning OpenCV;
Building the Android↔Raspberry Pi messaging service via USB; and
Maintaining Dendrite.
OpenCV is an open source library of tools for computer vision, and I have been learning how to use it so that I can implement monocular SLAM (MonoSLAM) on Android. OpenCV looks like an interesting open source project that I would like to contribute to and it has an Android port which works perfectly for Dogbot.
The Android “face” and the Raspberry Pi “torso” will communicate with each other over USB. Using the Python library pyusb, the Raspberry Pi registers itself as an Android accessory (similar to how a dock with speakers would) and then the Dogbot Android app responds to the accessory intent and opens up a connection.
Both of those projects were interrupted by several big spikes of traffic to my previous project, Dendrite. As of writing, Dendrite has now had over 56,000 unique users and over 1.4 million pageviews, and the influx of new visitors has kept me on the defensive. There is some refactoring I need to do to keep the server costs to a negligible level, but if it keeps spiking bigger and bigger like it has, the next spike will likely knock Dendrite offline. Any effort to minimise that possibility would take time away from Dogbot, which is my core project right now.
The OpenCV and USB messaging work had a lot of up-front research and development and as the calendar reached November, Dogbot looked no more like a robot than it did when I plugged the LEDs into the breadboard.
My wife encouraged me to take a different approach and start attacking the project from the bottom up: the wheels. My goal was to get a dumb rover going as soon as possible, even if it was just doing figure eights. My design was as follows:
The Raspberry Pi connects to a RaspiRobotBoard that was given to me as a gift.
The RaspiRobotBoard connects to two motors.
The Raspberry Pi runs a basic Python script to endlessly loop over a small set of commands telling the motors to spin at different times.
I ordered a Dagu DG021-SV Multi-Chassis Tank Kit from NiceGear and, because it was “on order” rather than “in stock”, I was expecting it to take weeks to arrive. It felt like only a week had passed before I came home to find it, with my wife seeming to have struggled over whether to open it herself before I got home. After Evelyn went to sleep, I cleared some space on the coffee table and prepared to build the tank.
First, I checked that I had all the parts. There was a small bag of weird gooey stuff which wasn’t mentioned on the parts list, so I figured it was some kind of desiccant and threw it in my rubbish bag. Everything else matched the parts list perfectly.
[photo]
I installed the DC motor gearboxes and then the little supports for the lid of the chassis.
[photo]
I then started mounting the wheels onto the gearboxes and learned that the packet of goo was lubricant for where the gearbox attaches to the wheel. Facepalm. I dug through my bag of rubbish and pulled out the lubricant. I was very hesitant to attach the wheels with anything more than the minimal force needed to keep them from falling off.
[photo]
Attaching the two remaining wheels was simpler, but I noticed that they were closer to the chassis than the wheels attached to the motors. I held my breath and used a bit of force to push the motored wheels together and click, they popped into place. The washers next to the wheels now fit flush against the side of the chassis, which made more sense than with the small gap you can see in that previous photo.
[photo]
I then noticed that the supports for the lid were in the wrong places. Facepalm × 2. Un-screw. Re-screw. Fixed.
[photo]
Almost done. Time to put the lid on. The instruction sheet said “Insert battery pack”, with no instructions about how to mount it inside. I tried to fit the battery pack in sideways, but it just couldn’t fit between the two passive wheels. I tried to fit the battery pack in straight, but it could only fit at a slant and was putting pressure on the motor gearboxes, which couldn’t be the correct way. As I puzzled over how to mount it, I was looking closer at the instruction sheet and noticed that I had installed everything into the case backwards.
Facepalm × ∞
After a primal scream of frustration, I had a quick shower, threw back a glass of wine, and then removed and reinstalled everything. However, I still could not figure out how to mount the battery pack. Maybe when the instructions said “insert” it meant “just throw it in there and hope it doesn’t rattle around too much”. So that’s what I did.
I put on the lid and screwed it down and it looked badass.
[photo]
Time to call it a night. This was supposed to be the easy part. The next task is to solder together the RaspiRobotBoard. In retrospect, I realised that I didn’t need to reverse the case. The orientation of the case only made a difference to the mounting holes on top, and I didn’t have a plan yet for how to mount the Raspberry Pi.
dogbotblog · 10 years ago
Sensory Overload
(https://www.flickr.com/photos/edur8/2840744909) (CC BY 2.0)
The first day of February marked the beginning of the second stage of Dogbot. Before the Raspberry Pi is able to navigate routes and avoid obstacles, it needs to be able to sense its environment. I have a Samsung Galaxy S4 phone running Android 4.4.2 and have decided to write an Android app that acts as the interface between the Raspberry Pi and the world.
By the end of February, I am aiming to achieve the following: (1) collect a boatload of data from the sensors on the Android phone, (2) pack that data into messages, (3) send those messages over USB from the Android phone to the Raspberry Pi, (4) unpack that data from the messages, and (5) print the sensor data on the computer monitor via the HDMI from the Raspberry Pi.
The first step is to collect a boatload of data. I created an Android app in Android Studio and used the sensor framework from the android.hardware package to print out a list of all the sensors I had available. After getting excited about all the information I could collect, I slowly modified the app to listen to each sensor and show the latest readings in the list in my app.
This is what it looks like.
The source for this app can be found on my GitHub project for DogBot. In particular, commit #1efe78975d contains the majority of the code changes I made to the Android app to get this working.
The next steps are to pack those sensor updates into small messages to send over the USB. But before I can start testing the USB, I need to go get a self-powered USB hub. My Android phone can draw quite a lot of power through the USB cable and plugging that directly into my Raspberry Pi will likely not leave enough power for the Pi itself. Because that might cause permanent damage to the Pi, I am going to go get a self-powered hub which causes the Android phone to draw its power from the wall (or hopefully a battery pack in the future), instead of the Raspberry Pi. My wife asked me whether the Raspberry Pi itself could be powered by that self-powered USB hub and I thought “why not?”. Does anybody know why not? I wonder if it will be difficult to find a self-powered USB hub that runs on DC power rather than AC. A battery pack for my robot would be DC, and most USB hubs (I presume) run on AC.
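To make the message-packing idea concrete, here is one hypothetical sketch using Python's struct module on both ends (purely illustrative: the real app will pack the data on the Android side in Java, and I haven't settled on a message format yet):

# Hypothetical message format: three little-endian floats for an
# accelerometer reading (x, y, z in m/s²).
import struct

reading = (0.12, 9.81, -0.03)
packed = struct.pack('<3f', *reading)    # pack: 12 bytes to send over USB
unpacked = struct.unpack('<3f', packed)  # unpack on the other end
print(unpacked)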
dogbotblog · 10 years ago
Back in Business
My robot has a new brain. After I broke the SD card slot in my last Raspberry Pi, my wife and I forked out some extra cash to buy a shiny, new, upgraded Pi. My previous Pi had one USB slot and zero Ethernet ports and the new one has four USB slots and one Ethernet port! I had a slight delay in getting it up and running because I had to go out and get a new micro SD card, but I’m happy because (1) the SD card slot is tiny, so it doesn’t jut out like the old one, and (2) the new SD slot is metal so it won’t crack as easily.
I promised that I would show you morse.py, which I wrote to convert my wife’s excerpt of “A Boy’s Best Friend” into flashing LED Morse code. (Excuse me a second while I check that I capitalised “Morse” in my previous posts… whew, I did.)
Here we go:
morse.py
#!/usr/bin/python3
import io
import sys
from time import sleep

import RPi.GPIO as GPIO
I first imported the various modules and functions I needed. Nothing new here.
My general design for morse.py was to create a chain of small, simple functions which would process their inputs and pass along the outputs to the next one down the assembly line. Ideally, I would like these to run in parallel, like with Unix pipes, but I have not mastered asyncio yet. Maybe I’ll come back to morse.py and upgrade it once I have played around with the asynchronous stream functions.
def sanitise(in_stream=sys.stdin, out_stream=sys.stdout):
    if in_stream != sys.stdin:
        in_stream.seek(0)
    while True:
        ch = in_stream.read(1)
        if not ch:
            break
        if ch.isalnum():
            print(ch.upper(), end='', file=out_stream)
        if ch.isspace():
            print(' ', end='', file=out_stream)
The function sanitise takes a stream of text and filters it to only include numerals, uppercase letters and single space characters. For example:
"Hello, world!" :D
is converted into
HELLO WORLD D
This function is used so that the other functions don’t need to handle any characters other than numerals, uppercase letters and single space characters. While there are other characters in Morse code, I wanted to limit it to the smallest usable subset for simplicity’s sake.
def convert_to_morse(in_stream=sys.stdin, out_stream=sys.stdout):
    d = {
        'A': '.-',    'B': '-...',  'C': '-.-.',  'D': '-..',
        'E': '.',     'F': '..-.',  'G': '--.',   'H': '....',
        'I': '..',    'J': '.---',  'K': '-.-',   'L': '.-..',
        'M': '--',    'N': '-.',    'O': '---',   'P': '.--.',
        'Q': '--.-',  'R': '.-.',   'S': '...',   'T': '-',
        'U': '..-',   'V': '...-',  'W': '.--',   'X': '-..-',
        'Y': '-.--',  'Z': '--..',
        '1': '.----', '2': '..---', '3': '...--', '4': '....-',
        '5': '.....', '6': '-....', '7': '--...', '8': '---..',
        '9': '----.', '0': '-----'
    }
    if in_stream != sys.stdin:
        in_stream.seek(0)
    while True:
        ch = in_stream.read(1)
        if not ch:
            break
        if ch.isalnum():
            print(d[ch], end=' ', file=out_stream)
        if ch.isspace():
            print('/', end=' ', file=out_stream)
The function convert_to_morse has a dictionary with alphanumeric characters as keys and sequences of dots and dashes as the values. Assuming that the input stream is sanitised, convert_to_morse will put out a stream of dots and dashes with spaces in between each letter and slashes in between each word.
For example:
HELLO WORLD
is converted into
.... . .-.. .-.. --- / .-- --- .-. .-.. -.. /
def convert_to_bits(in_stream=sys.stdin, out_stream=sys.stdout):
    d = {
        '.': '10',
        '-': '1110',
        ' ': '00',
        '/': '00'
    }
    if in_stream != sys.stdin:
        in_stream.seek(0)
    while True:
        ch = in_stream.read(1)
        if not ch:
            break
        if ch in list(d.keys()):
            print(d[ch], end='', file=out_stream)
The function convert_to_bits converts dots and dashes into ones and zeros. The conversion is not as simple as . → 1 or - → 0 or anything like that. If we think of a “unit” as the length of time that a dot takes to display, then:
a dot is one unit long (obviously),
a dash is three units long,
the gap between each dot and dash is one unit long,
the gap between each letter in one word is three units long, and
the gap between each word is seven units long.
As a binary bit, a 1 represents on and a 0 represents off, as in electrical currents or in an LED light.
I could infer a few things about the dictionary which helped to simplify the conversion code, but it means that the dictionary needs a little explaining.
(1) All dots (which are represented by one unit of on) are followed by at least one unit of off. By including this off bit in the dictionary for ., we get 10. That means “one unit on, followed by one unit off”. You can see this in the dictionary pair: '.': '10'.
(2) All dashes (which are represented by three units of on) are also followed by at least one unit of off, like the dots. Therefore, - is converted into 1110. That means “three units on, followed by one unit off”. You can see this in the dictionary pair: '-': '1110'.
(3) All gaps between dots and dashes are represented by one unit of off, but as you can see in (1) and (2), these are already included in each dot and dash, so we don’t add anything extra.
(4) All gaps between characters (such as letters and numerals) are represented by three units of off. Because each character is made up of a sequence of dots and dashes, the last dot or dash already has an off unit tacked onto the end. This means that only two off units need to be added to the end of each character to make a total of three off units. You can see this in the dictionary pair: ' ': '00'.
(5) All gaps between words are represented by seven units of off. Our input stream separates words with a ' / '. It is important to note that there is a ' ' (single space character) on each side of the '/' (slash character). Because we already have one off unit tacked onto the end of the last dot or dash in the last character of the previous word, followed by two off units for the first ' ' and two more off units for the second ' ', we only need two more off units to make up a total of seven for the entire word break. You can see this in the dictionary pair: '/': '00'.
For example:
.... . .-.. .-.. --- / .-- --- .-. .-.. -.. /
is converted into
1010101000100010111010100010111010100011101110111000000010111011100011101110111000101110100010111010100011101010000000
If you think of each 1 representing the LED being on and each 0 being off, you can see how close we are to finishing.
LED_PIN = 7

def turn_led_on():
    GPIO.output(LED_PIN, True)

def turn_led_off():
    GPIO.output(LED_PIN, False)

def flash_led(in_stream=sys.stdin, period=1):
    GPIO.setmode(GPIO.BOARD)
    GPIO.setup(LED_PIN, GPIO.OUT)
    if in_stream != sys.stdin:
        in_stream.seek(0)
    while True:
        ch = in_stream.read(1)
        if not ch:
            break
        if ch == '1':
            turn_led_on()
            print(1)  # print(1, end='', flush=True)
            sleep(period)
        if ch == '0':
            turn_led_off()
            print(0)  # print(0, end='', flush=True)
            sleep(period)
    GPIO.cleanup()
These last functions, particularly flash_led, convert a sequence of ones and zeros into signals to turn the LED on and off with the Raspberry Pi’s GPIO pins. LED_PIN is used as a “constant” to represent which GPIO pin is used to control the flashing of the LED.¹
The function flash_led calls print for each bit to print a 1 or 0 to the console so that the Morse code can be read from the screen at the same time as the LED. Originally, I wrote morse.py on my OS X laptop with Python 3.4.2 but when I attempted to run it on my Raspberry Pi, it complained that it didn’t know what flush meant. It turns out that the flush keyword argument was only added to print in version 3.3, and my Raspberry Pi is running Python 3.2.2. (Or maybe it was 3.2.3… can’t remember). So I commented out those lines (in case I upgrade Python 3 soon) and replaced them with a call to print that did not change the end argument.
Setting end to '' and flush to True in Python 3.4 allows the program to print the ones and zeros on the same line and have them display on the screen as soon as print is called. Without those arguments, Python 3.2 needs a newline character to flush each bit immediately, so it prints each one or zero on its own line, which isn’t as pretty. (There may be other ways of getting Python 3.2 to flush the bits, but I’m not going to look for an answer when I can just upgrade it to Python 3.4 and get better control with the built-in print function.)
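For the record, one such way would be to bypass print and flush sys.stdout by hand, which works on Python 3.2 (a sketch, not what morse.py currently does):

import sys

def print_bit(bit):
    sys.stdout.write(str(bit))  # no newline appended
    sys.stdout.flush()          # push it to the screen immediately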
if __name__ == "__main__":
    arg_len = len(sys.argv)
    in_stream = sys.stdin
    out_stream = sys.stdout
    if arg_len >= 2:
        in_stream_name = sys.argv[1]
        in_stream = open(in_stream_name, 'r', encoding='utf-8')
    if arg_len == 3:
        out_stream_name = sys.argv[2]
        out_stream = open(out_stream_name, 'w', encoding='utf-8')
    sanitised_stream = io.StringIO()
    sanitise(in_stream, sanitised_stream)
    morse_stream = io.StringIO()
    convert_to_morse(sanitised_stream, morse_stream)
    bit_stream = io.StringIO()
    convert_to_bits(morse_stream, bit_stream)
    flash_led(bit_stream)
    print()
This section runs the program.
The first half looks at the arguments passed to morse.py and determines whether the user wants to use files or the standard console input and output. I wrote this so that my wife could dump a page of text into a text file and I could pass the name of that text file to morse.py as an argument.
The second half acts as a bucket brigade to convert the input from unsanitised text into LED flashes. There’s a call to print at the end just to make sure there’s a newline so that the user’s CLI prompt starts at the beginning of the next line.
If there’s anything interesting in there which I didn’t talk about, or if you want to hear more, write me a question here and I’ll try to concisely answer it.
Looking forward, February is upon us. My goal for the end of February is to connect my Android cellphone to the Raspberry Pi via USB and get some interesting sensor data from the phone and have the Pi print it out to my computer monitor. I need to start reading about Android apps accessing the USB.
¹ Python doesn't have constants. I am simply using the uppercase snake-case convention to signal that LED_PIN is intended to be treated as a constant.
0 notes
dogbotblog · 10 years ago
Text
"KERNEL PANIC... I/O ERROR"
Tumblr media
(https://www.flickr.com/photos/groume/8565200571) (CC BY-SA 2.0)
I wrote morse.py last night and my wife typed several paragraphs of a story into my Raspberry Pi. My Raspberry Pi has no Ethernet port and only one USB slot, so without a USB hub to connect both a keyboard and a USB storage device, everything has been copied into vi on the Raspberry Pi by hand. Despite how long that took, I was excited that it was finished, and I was looking forward to testing my Morse code skills tonight by decoding her story passage.
The Raspberry Pi has no on-board persistent storage, so it boots its Linux image off an SD card plugged into the side. When I plugged in the SD card tonight, it fell out. I looked closely at the slot and found that it had cracked on both sides, with one side missing a relatively useful chunk of plastic. The Raspberry Pi wouldn't boot because I couldn't keep the SD card flat against the pins in the SD card slot.
Distraught, I went and gave my daughter a bath, because life goes on. As I scrubbed behind her ears and shampooed her cradle cap, I thought about what I could do to get my Raspberry Pi working.
Tumblr media
In the image, you can see a crack in the plastic SD card slot on the right.
After about 30 attempts to boot the Raspberry Pi by holding the SD card against the Pi with both hands, I realised that I wouldn't be able to do any typing that way, let alone transcribe Morse code. Instead, I folded up some toilet paper and some paper from a writing pad, and taped the SD card in. This improves the stability of the SD card by pressing it against the pins, but even then, I could only get it to boot after a dozen attempts.
By “attempt”, I mean “plug in the micro USB cable”. The SD card would only allow the Raspberry Pi to boot if it wasn’t bumped or nudged, but pushing the micro USB cable into its socket (right next to the SD card) required enough force that the SD card would inevitably be bumped or nudged.
Most of the time, when the Raspberry Pi did boot, the SD card would come loose seconds later, leaving me with a long sequence of scary errors, telling me in a roundabout way that the Pi couldn’t find the SD card anymore. No SD card, no worky.
After way too long, I got it booted, with the keyboard plugged in and with my hands off the Raspberry Pi. Once I got to the login screen, the thing seemed more stable. Maybe once the OS is loaded into memory, the SD card connection doesn’t need to be as reliable.
I quickly tested morse.py to make sure it handled multiple line files properly (it did) and then passed it a quick two word file to test my decoding skills.
Tumblr media
As you might be able to see in the picture, my first attempt at decoding the two-word file didn't go as well as I hoped: "HELLO ATORHG". I slowed down the speed from 0.5 seconds per unit* to 1 second per unit and did better on my second attempt: "HELLO WORLD".
I'll do a more complete debrief of my horrible inability to decode Morse code, but here is what I got so far:
.--- .. -- --. ---- / .- - .- ... / --- ..- - / --- ..- . / ... .. ... / ... .--. .- -.-. . ... ..- .. - / -- --- .-- / .- -. -.. / .-- .- .- ... .... . -.. / ..- .--. / -.-- --- ..- .. .-.. .-- .- -.-- ... / .... .- -.. / ---- .-- .- ... .... / ..- .--. / .- ..-. - ..- . /
Let me pull out the handy Morse code binary tree.
Tumblr media
.--- .. -- --. ----  /  .- - .- ...  /  --- ..- -  /  --- ..- .  /
J I M G Y               A T A S         O U T          O U E

... .. ...  /  ... .--. .- -.-. . ... ..- .. -  /  -- --- .--  /
S I S          S P A C E S U I T                   M O W

.- -. -..  /  .-- .- .- ... .... . -..  /  ..- .--.  /
A N D         W A A S H E D                U P

-.-- --- ..- .. .-.. .-- .- -.-- ...  /  .... .- -..  /
Y O U I L W A Y S                        H A D

---- .-- .- ... ....  /  ..- .--.  /  .- ..-. - ..- .  /
T O W A S H              U P          A F T U E
Tumblr media
Here is what my wife actually wrote:
JIMMY WAS OUT OF HIS SPACESUIT NOW AND WASHED UP. YOU ALWAYS HAD TO WASH UP AFTER COMING IN FROM OUTSIDE.**
I will post the source code of morse.py when I manage to convince my Raspberry Pi to boot up again.
*One unit is the length of a single dot. A dash is three units long, the gap between a dot or dash within the same character is one unit, the gap between two characters is three units, and the gap between two words is seven units. For my LED, the dot and the dash units are ON and the gaps between dots, dashes, characters and words are OFF.
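(If you want to see those timing rules in code, here is a minimal sketch of how a Morse character and word expand into unit bits. It follows the conventions above, but it is not the actual convert_to_bits from morse.py:)

def char_to_bits(char):
    # A dot is 1 unit on, a dash is 3; 1 unit off between symbols.
    return '0'.join('1' if s == '.' else '111' for s in char)

def word_to_bits(word):
    # 3 units off between characters. Words would be joined
    # with '0000000' (7 units off), per the rule above.
    return '000'.join(char_to_bits(c) for c in word.split(' '))

print(word_to_bits('.... ..'))  # "HI" -> 1010101000101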
**The full story passage would have taken something like an hour to decode:
Jimmy was out of his spacesuit now and washed up. You always had to wash up after coming in from outside. Even Robutt had to be sprayed, but he loved it. He stood there on all fours, his little foot-long body quivering and glowing just a tiny bit, and his small head, with no mouth, with two large glassed-in eyes, and with a bump where the brain was. He squeaked until Mr. Anderson said, “Quiet, Robutt.”
Mr. Anderson was smiling. “We have something for you, Jimmy. It’s at the rocket station now, but we’ll have it tomorrow after all the tests are over. I thought I’d tell you now.”
“From Earth, Dad?” “A dog from Earth, son. A real dog. A Scotch terrier puppy. The first dog on the Moon. You won’t need Robutt any more. We can’t keep them both, you know, and some other boy or girl will have Robutt.” He seemed to be waiting for Jimmy to say something, then he said, “You know what a dog is, Jimmy. It’s the real thing. Robutt’s only a mechanical imitation, a robot-mutt. That’s how he got his name.”
Jimmy frowned. “Robutt isn’t an imitation, Dad. He’s my dog.” “Not a real one, Jimmy. Robutt’s just steel and wiring and a simple positronic brain. It’s not alive.”
“He does everything I want him to do, Dad. He understands me. Sure, he’s alive.”
“No, son. Robutt is just a machine. It’s just programmed to act the way it does. A dog is alive. You won’t want Robutt after you have the dog.”
“The dog will need a spacesuit, won’t he?” “Yes, of course. But it will be worth the money and he’ll get used to it. And he won’t need one in the City. You’ll see the difference once he gets here.”
Jimmy looked at Robutt, who was squeaking again, a very low, slow squeak, that seemed frightened. Jimmy held out his arms and Robutt was in them in one bound. Jimmy said, “What will the difference be between Robutt and the dog?”
“It’s hard to explain,” said Mr. Anderson, “but it will be easy to see. The dog will really love you. Robutt is just adjusted to act as though it loves you.”
“But, Dad, we don’t know what’s inside the dog, or what his feelings are. Maybe it’s just acting, too.”
Mr. Anderson frowned. “Jimmy, you’ll know the difference when you experience the love of a living thing.”
Jimmy held Robutt tightly. He was frowning, too, and the desperate look on his face meant that he wouldn’t change his mind. He said, “But what’s the difference how they act? How about how I feel? I love Robutt and that’s what counts.”
And the little robot-mutt, which had never been held so tightly in all its existence, squeaked high and rapid squeaks-happy squeaks.
Again, I had to leave the LED flashing while I went to the supermarket to get some more butter and milk. Because life goes on.
1 note · View note
dogbotblog · 10 years ago
Text
Blink
Tumblr media
(https://www.flickr.com/photos/syntropy/2861992111) (CC BY 2.0)
After rigging up a basic circuit with an LED and a 270Ω resistor, I moved on to programming the LED to turn on and off at will.
on.py
#!/usr/bin/python3

import RPi.GPIO as GPIO
from time import sleep

def on():
    """Turn the LED on for 10 seconds."""
    GPIO.setmode(GPIO.BOARD)
    LED_PIN = 7
    GPIO.setup(LED_PIN, GPIO.OUT)
    GPIO.output(LED_PIN, True)
    period = 10  # seconds
    sleep(period)
    GPIO.cleanup()

if __name__ == "__main__":
    on()
As the function docstring says, this program turns on the LED for 10 seconds and then turns it off when the program ends.
RPi.GPIO is the Python package for using the GPIO pins on the Raspberry Pi. In my previous blog post, I was using the +3.3V pin (pin 1) and one of the ground pins (pin 6) to power my LED. Those pins are not programmable, so I didn't need to write anything to power that circuit. There are other pins, though, that can be programmed to be inputs or outputs. Inputs can be used for things like sensors, while outputs can be used for things like LEDs. Pin 7 is a programmable pin, so I changed my power source for my LED from pin 1 to pin 7. The GPIO package provides me with functions that I can call to control the programmable pins.
The on function does the following:
GPIO.setmode(GPIO.BOARD) sets the numbering system of the GPIO pins to match the board. There are a number of different modes for referring to the GPIO pins, but BOARD mode makes sense to me because I can just count the pins and use that number. I'll look into the other modes later to see if there's something good about using them instead (there's a short example of the main alternative just after this list).
LED_PIN = 7 defines a constant which I am going to use when controlling pin 7. When LED_PIN appears in later lines of the program, just think of it being replaced by the number 7. I am using this constant to make the program slightly clearer to read. No magic numbers here!
GPIO.setup(LED_PIN, GPIO.OUT) tells pin 7 that it is an output pin. This means I can now use it to power the LED. By default, all of the programmable pins are set to be input pins, so this line changes that.
GPIO.output(LED_PIN, True) tells pin 7 to put out +3.3V of power. This turns on the LED.
period = 10 defines a variable which I am going to use to determine how long to leave the LED on for. # seconds is a comment that I put at the end to clarify that the period is 10 seconds (rather than milliseconds or something like that).
sleep(period) tells the program to pause for 10 seconds. At this point, the computer goes off to do other things and after 10 seconds (approximately¹), the program resumes.
GPIO.cleanup() resets all of the pins used in this script (just pin 7) to be input pins. This prevents a pin from being left in a state where it might short-circuit something and damage the Raspberry Pi.
When run from the command line, this script will call on() and turn on the LED for 10 seconds. After that, the clean-up will turn off the LED.
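As for that alternative numbering: the other common mode is GPIO.BCM, which refers to pins by the Broadcom chip's channel numbers rather than their physical positions on the header. Physical pin 7 corresponds to BCM channel 4, so the equivalent setup would look like this (a short sketch, assuming the standard header layout):

GPIO.setmode(GPIO.BCM)
LED_PIN = 4  # BCM channel 4 is physical pin 7 on the header
GPIO.setup(LED_PIN, GPIO.OUT)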
blink.py
#!/usr/bin/python3

import RPi.GPIO as GPIO
from time import sleep

def blink():
    """Blink the LED 10 times."""
    GPIO.setmode(GPIO.BOARD)
    LED_PIN = 7
    GPIO.setup(LED_PIN, GPIO.OUT)
    NUM_BLINKS = 10
    for i in range(0, NUM_BLINKS):
        GPIO.output(LED_PIN, True)
        sleep(1)
        GPIO.output(LED_PIN, False)
        sleep(1)
    GPIO.cleanup()

if __name__ == "__main__":
    blink()
This program has very little that's new compared to on.py. The only difference is that it iterates 10 times through a for loop, turning the LED on for a second and then off for a second. Because each loop iteration takes approximately 2 seconds, the program runs for about 20 seconds and then exits.
Blink blink.
Up next: a Morse code encoder, so I can test my Morse code decoding skills. Leah has selected a passage from a book, and I'm going to get her to put it in a text file. The characters in that text file will then be converted into Morse code via a flashing LED, and I'm going to try to decode it. Maybe I'll allow variable speeds so I can slowly speed up the Morse code until it's gibberish.
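That variable speed could be as simple as threading a period parameter through the loop. A minimal sketch reusing blink.py's setup (blink_at is a hypothetical name, not something I've written yet):

def blink_at(num_blinks=10, period=0.5):
    """Blink the LED num_blinks times, period seconds per on/off phase."""
    GPIO.setmode(GPIO.BOARD)
    LED_PIN = 7
    GPIO.setup(LED_PIN, GPIO.OUT)
    for _ in range(num_blinks):
        GPIO.output(LED_PIN, True)
        sleep(period)
        GPIO.output(LED_PIN, False)
        sleep(period)
    GPIO.cleanup()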
¹ The timing is not exact because the Raspberry Pi is not real-time like the Arduino. The Raspberry Pi is running a bunch of other things at the same time as my Python scripts, so sleep may not return exactly when I want it to.
0 notes
dogbotblog · 10 years ago
Text
Baby Steps Led By LED
Tumblr media
It may not look like much, but that is the future, my friends: LEDs!
I like to find projects that are ambitious enough to be challenging but feasible enough to not force me to rage-quit. A visual SLAM robot seemed to be such a project. I wouldn't be breaking any new ground in terms of robotics, but I would be pushing quite a few boundaries for myself: implementing machine learning algorithms for image detection, developing a non-trivial Android app, managing 3D models and navigating through them, and (most importantly) not electrocuting myself while building the circuits.
But whenever I start a new project, I know that I need to take baby steps. When you work on something big and you know that there will be lots of things that you will need to learn before you make it to the end, it helps to think about what you can achieve in the short term.
Everybody loves a win, and big projects only get finished when you break that one big win into lots of tiny wins. I plan to finish this project by the end of 2015, and especially with lots of new priorities competing for my time (read: baby), I need a lot of momentum to keep me excited about going back to the workbench and keeping up my progress.
So, when I think "autonomous rover", I realise pretty quickly that I need to find a stepping stone that can give me a quick win.
A week ago, I had a Raspberry Pi and a RaspiRobotBoard (which I didn't know what to do with), and I had written a few toy Python 3 programs to prepare myself for programming the Raspberry Pi.
I watched some videos on YouTube (particularly this playlist by Matthew Manning) and figured that lighting up an LED would be a good project to get done in one evening.
I decided to follow the instructions for the first half of this tutorial by Gordon Henderson, but before I could start, I needed to get a breadboard and some jumper wires to rig it all up. A very busy Sunday included a trip to Jaycar in Lower Hutt. Having primarily built all my projects from code at a computer, it was exciting going into an electronics store and walking out with something physical.
When I got home, I looked through everything I had and realised I had hit a bump. All my jumper wires were male-to-male and I needed at least a couple of female-to-male wires to plug into the GPIO pins on my Raspberry Pi.
My wife and I did a bit of searching online, as Jaycar didn't have any in stock, and I found some female-to-female jumper wires and a small pack of LEDs and quickly sent through an order. (A female-to-female wire can plug into a male-to-male wire to make a female-to-male wire. Amazing!)
Two days later, the package arrived (along with my wife's nursing necklace from Estonia!), and at the end of the day I quickly plugged my wires together and plugged the power into my Raspberry Pi. Magic! The red LED on the breadboard turned on! No electrocution today, Thor!
I should also mention that this resistor chart/calculator was helpful for convincing me I had the correct resistor from my RaspiRobotBoard kit.
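As a rough sanity check, assuming a typical red LED drops about 2V: (3.3V - 2V) / 270Ω ≈ 4.8mA, which is comfortably within a standard LED's limits and gentle on the Raspberry Pi's GPIO pin.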
It was pretty exciting seeing the LED turn on. It's not much, and I'll likely giggle a year from now at how basic it was. But I'll enjoy the moment.
1 note · View note
dogbotblog · 10 years ago
Text
DOGBOT
Tumblr media
(https://www.flickr.com/photos/skynoir/11927371385) (CC BY-NC 2.0)
Last year (and the year before), I was working on Dendrite. Since its release a few months ago, I have been itching to start my new project: a dog robot. From here on, its name will be "dogbot", at least until a better name pops to mind.
A while ago, I was given a Raspberry Pi as a gift and excitedly wrote a few toy Python programs on it. Some relatives later gave me a RaspiRobotBoard. I finally had everything I needed. Almost.
I went to Jaycar in Lower Hutt and got a breadboard and a small kit of wires all ready for prototyping small electronic circuits to build up my confidence.
My plan is to build a small rover with the following layers:
Android phone: collecting sensor information (e.g. GPS, accelerometer, compass), capturing raw video, processing raw video into images for image detection and visual SLAM, microphone input for voice control, and WiFi for networking;
Raspberry Pi: building and maintaining a local model of the robot's environment, using sensor info and imagery from the phone via USB, using the local environment model for navigation and object interaction;
RaspiRobotBoard: controlling motors and LEDs for rover control, getting its instructions from the Raspberry Pi through the GPIO pins.
All of this will be installed onto a chassis and wrapped in a case.
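To make that division of labour concrete, here's a minimal sketch of what the Raspberry Pi layer's main loop might look like. The transport is a stand-in (newline-delimited JSON on standard input rather than the eventual USB link), and all of the message fields are hypothetical:

import json
import sys

def handle_message(message):
    # Hypothetical routing: real handlers would update the environment
    # model or drive the motors through the RaspiRobotBoard.
    kind = message.get("kind", "unknown")
    print("received", kind, "message:", message)

if __name__ == "__main__":
    # Stand-in transport: one JSON object per line on stdin.
    for line in sys.stdin:
        line = line.strip()
        if line:
            handle_message(json.loads(line))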
A few ideas I'm toying with:
Dogbot should be wrapped in a soft cover over a hard case. I have a daughter who will be a toddler at the end of the year and I want her to be able to interact with dogbot without worrying about her or my electronics. I'm even thinking of having a buzzer-like sensor so she can just bop it on the head to get it to react.
The Android phone should be as removable as possible, so I can use my main phone and pull it out whenever I need it. The phone will need its own power source (i.e. battery pack) and will basically need to easily dock and undock.
The Android phone should be able to process as much info as possible before passing it on to the Raspberry Pi. The phone will serve as a kind of face for dogbot. Voice recognition, touch screen controls, an animated dog face which reacts to stimuli. None of these need to be touching the Raspberry Pi as that will be entirely focused on navigation and controlling the electronics of the rover.
I'll post again soon with the first steps of my project.
0 notes