28 April 2011

Android Dev Team #12
    Week April 20-26


The progress:
So far the Android app has a new gesture comparison algorithm similar to the $1 gesture recognizer. This variant is simpler: it omits rotation invariance, and the number of resampling points and the bounding box are at a much smaller scale. Recognition is based on the point-by-point difference between a gesture entered via the Android app and a new gesture entered via the fabric device. The fabric device is currently configured through a separate app designed by Matthew Moss; I will port its Bluetooth communication over to the main Android app I am working on.
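A minimal sketch of that comparison, assuming the description above (all class and method names here are hypothetical, not the app's actual code): resample each gesture to a small fixed number of points, scale into a small bounding box, and score by the average point-by-point distance. No rotation step is performed.

```java
import java.util.ArrayList;
import java.util.List;

public class GestureMatcher {
    static final int N = 16;        // resample count (assumed; smaller than $1's 64)
    static final double BOX = 10.0; // small bounding-box scale (assumed)

    static double dist(double[] a, double[] b) {
        return Math.hypot(a[0] - b[0], a[1] - b[1]);
    }

    static double pathLength(List<double[]> pts) {
        double total = 0;
        for (int i = 1; i < pts.size(); i++) total += dist(pts.get(i - 1), pts.get(i));
        return total;
    }

    // Resample the stroke to n evenly spaced points along its path.
    static double[][] resample(double[][] raw, int n) {
        List<double[]> src = new ArrayList<>();
        for (double[] p : raw) src.add(p.clone());
        double interval = pathLength(src) / (n - 1);
        List<double[]> out = new ArrayList<>();
        out.add(src.get(0).clone());
        double accum = 0;
        for (int i = 1; i < src.size() && out.size() < n; i++) {
            double d = dist(src.get(i - 1), src.get(i));
            if (d > 0 && accum + d >= interval) {
                double t = (interval - accum) / d;
                double[] q = {
                    src.get(i - 1)[0] + t * (src.get(i)[0] - src.get(i - 1)[0]),
                    src.get(i - 1)[1] + t * (src.get(i)[1] - src.get(i - 1)[1])
                };
                out.add(q);
                src.add(i, q); // resume measuring from the inserted point
                accum = 0;
            } else {
                accum += d;
            }
        }
        while (out.size() < n) out.add(src.get(src.size() - 1).clone()); // rounding guard
        return out.toArray(new double[0][]);
    }

    // Translate to the origin and scale the bounding box to size x size.
    static double[][] scaleToBox(double[][] pts, double size) {
        double minX = Double.MAX_VALUE, minY = Double.MAX_VALUE;
        double maxX = -Double.MAX_VALUE, maxY = -Double.MAX_VALUE;
        for (double[] p : pts) {
            minX = Math.min(minX, p[0]); maxX = Math.max(maxX, p[0]);
            minY = Math.min(minY, p[1]); maxY = Math.max(maxY, p[1]);
        }
        double w = Math.max(maxX - minX, 1e-9), h = Math.max(maxY - minY, 1e-9);
        double[][] out = new double[pts.length][2];
        for (int i = 0; i < pts.length; i++) {
            out[i][0] = (pts[i][0] - minX) / w * size;
            out[i][1] = (pts[i][1] - minY) / h * size;
        }
        return out;
    }

    // Average point-by-point distance; lower means a closer match.
    public static double compare(double[][] g1, double[][] g2) {
        double[][] a = scaleToBox(resample(g1, N), BOX);
        double[][] b = scaleToBox(resample(g2, N), BOX);
        double sum = 0;
        for (int i = 0; i < N; i++) sum += dist(a[i], b[i]);
        return sum / N;
    }
}
```

The template with the lowest score would be reported as the recognized gesture.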
The app also has Text-To-Speech implemented. The application now responds to input with sound, prompting the user with the gesture it is comparing the input gesture against. The goal is to completely replace the screen with earpiece audio and Bluetooth fabric input. However, I do not know whether adding a Bluetooth headset for outbound sound will be within the scope of this project.
I had to change the way I was reading button input from the device earlier this week, as it had been causing my algorithm to work incorrectly. Previously I grabbed each button top to bottom and right to left, producing an incorrect path. I have changed this so buttons are recorded in series, in the sequence in which they are pressed. I also had to modify the gesture table in the SQLite database to contain not just numbers but points, ordered by press sequence, in the template x,y; where the semicolon is the delimiter before the next point in the gesture and a null character terminates it, e.g. 5,8;5,7;6,7;\0
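Encoding and decoding that stored format could look like the following sketch (class and method names are hypothetical; the \0 terminator is assumed to be appended only when the string is sent to the device, so it is omitted here):

```java
public class GestureCodec {
    // Join points in press order as "x,y;" e.g. {{5,8},{5,7},{6,7}} -> "5,8;5,7;6,7;"
    public static String encode(int[][] points) {
        StringBuilder sb = new StringBuilder();
        for (int[] p : points) sb.append(p[0]).append(',').append(p[1]).append(';');
        return sb.toString();
    }

    // Parse the stored string back into an ordered point list.
    public static int[][] decode(String s) {
        String[] parts = s.split(";"); // trailing empty strings are dropped by split
        int[][] pts = new int[parts.length][2];
        for (int i = 0; i < parts.length; i++) {
            String[] xy = parts[i].split(",");
            pts[i][0] = Integer.parseInt(xy[0]);
            pts[i][1] = Integer.parseInt(xy[1]);
        }
        return pts;
    }
}
```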

Things to do:
I need to make my Android app work with the fabric device. I will do this by porting the code from Matt's testing app. Once this is in place, I will need to alter my input algorithm to accept input from the fabric instead of the previous method of using the phone's touch screen to enter gestures. This may prove to be challenging. My plan is for the user to make a single gesture; one second after the finger has released pressure from all the buttons on the fabric interface, the gesture containing all the buttons that had pressure applied will be submitted for testing.
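The planned submit-on-release behavior could be sketched like this (hypothetical names, not the app's actual code): presses accumulate into the current gesture, and once no button has pressure and a full second has elapsed, the gesture is handed off for recognition. Timestamps are passed in explicitly here to keep the logic testable; the real app would drive this from a timer or handler.

```java
import java.util.ArrayList;
import java.util.List;

public class GestureCollector {
    private static final long SUBMIT_DELAY_MS = 1000; // one second after full release
    private final List<int[]> points = new ArrayList<>();
    private int pressedCount = 0;
    private long lastReleaseMs = -1; // -1 means no pending submission

    public void press(int x, int y) {
        points.add(new int[]{x, y});
        pressedCount++;
        lastReleaseMs = -1; // still drawing; cancel any pending submit
    }

    public void release(long nowMs) {
        pressedCount--;
        if (pressedCount == 0) lastReleaseMs = nowMs; // start the one-second window
    }

    // Called periodically; returns the finished gesture once the delay has
    // elapsed with no buttons held, or null if nothing is ready yet.
    public List<int[]> poll(long nowMs) {
        if (pressedCount == 0 && lastReleaseMs >= 0
                && nowMs - lastReleaseMs >= SUBMIT_DELAY_MS) {
            List<int[]> done = new ArrayList<>(points);
            points.clear();
            lastReleaseMs = -1;
            return done;
        }
        return null;
    }
}
```

Pressing again before the second is up cancels the pending submission, so a multi-button gesture is not cut short mid-stroke.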
The professor also mentioned changing the gesture database so that templates are entered straight from the fabric device itself, giving them the same variations the fabric's output has; input from the phone's touch interface differs quite a bit. The fabric tends to register pressure on multiple buttons surrounding the one the user intends to press. Michael and Matt are working on making the button pressure more exact by adding some type of buffer material so that finger pressure is precise. Also, since the fabric is being worked on by the other team, we will need to meet up later to test the device with the Android app once it has been refined for fabric testing. Once the device and the application have been combined, we will be able to further test the combined system in a user study.
As we near the end of the semester we will need to submit our code and documents to the Git server to preserve the project's progress for another semester. This is of minimal concern right now; we are currently using an SVN server that has been working fine for uploading our progress and collaborating with other team members.
