Monday, February 21, 2011

Week 10 - Little Actions Can Make a Big Difference

Hello all,

We've just wrapped up Week 10 of our Winter quarter work on Access Lecture, and we're approaching some great stopping points. With one week left before our (much needed!) Spring break, we've been working hard to put the finishing touches on our prototypes to prepare for the imminent usability tests.

But first, some great news! A few months ago, the eligible members of the AL team (Lindsey and myself) applied for scholarships to attend the 2011 Richard Tapia Celebration of Diversity in Computing Conference. We found out over the weekend that we have both received scholarships! This conference provides presentation, learning, and networking opportunities for both students and professional computer scientists. Our current goal is to have a poster ready to present by the conference deadline. Oh yeah, the conference is in San Francisco, California and runs from April 3rd to April 5th; jealous?

Now that I've gotten to brag across our vast internet via Blogger, it is time for a summary of what we accomplished this week. Our main focus was polishing up the prototypes that will be used for usability testing during Spring quarter. Broken down, we worked on...
  • Automatically logging important usability data with regards to the touch screen. This includes recording where the touch screen was pressed when trying to select certain icons, and specifics about the pinching/tapping gestures performed during particular tasks.
  • Setting up a test database on a remote server; this will allow us to simulate how AL would work in a 'real' setting more effectively.
  • Data mining more research articles that relate to the development of Access Lecture in a variety of ways.
With regards to the automatic logging of multi-touch and touch screen interactions, we have made a great deal of progress; as a result, we can now objectively pinpoint a large subset of the potential usability flaws in the AL prototypes. We plan to make heavy use of this feature during our usability tests. The feature works by logging the x and y coordinates of touch screen interactions, along with other specifics about what each interaction was trying to accomplish.

For example: if we are testing a subset of users who are color blind, and we notice that they consistently hit a certain edge of an icon or miss it entirely, then we can conjecture that the icon in question blends into its surroundings for color blind vision.
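To make that example concrete, here's a minimal sketch (in Python, purely for illustration; the actual prototypes are iPad apps) of the kind of analysis the logged coordinates enable. The function name and data layout here are hypothetical, not the actual logging format we use:

```python
# Hypothetical sketch: given logged (x, y) tap coordinates and an icon's
# bounding box, report how often users hit the icon and in which direction
# their taps tended to drift from its center.

def analyze_taps(taps, icon):
    """taps: list of (x, y) points; icon: (left, top, width, height)."""
    left, top, w, h = icon
    cx, cy = left + w / 2, top + h / 2
    hits = [(x, y) for x, y in taps
            if left <= x <= left + w and top <= y <= top + h]
    # Average offset of all taps from the icon's center: a consistent
    # bias in one direction hints at a visibility or layout problem.
    dx = sum(x - cx for x, y in taps) / len(taps)
    dy = sum(y - cy for x, y in taps) / len(taps)
    return {"hit_rate": len(hits) / len(taps), "mean_offset": (dx, dy)}
```

A low hit rate combined with a consistent mean offset in one direction would suggest exactly the kind of "blending in" problem described above.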

The database is still being worked on; the framework which we are using is taking some getting used to! More details will be posted next week, once more progress has been made.

That's all for this week! Time for all of us to cram for some finals and take a well-deserved week off.

Until next week...

- Alex

Monday, February 14, 2011

Week 9!

Hey everyone,

This week we managed to polish up our tap-to-zoom app, which we had been having some bug difficulties with. It seems like a lot of the problems we've been having with our code have been due to a lack of transparency in Interface Builder: it sometimes changes or sets things in ways that aren't very apparent, which can make it difficult to understand what's actually going on with your app.

Luckily, we figured out the problem and now both apps are working correctly with the zooming. To finish the apps up, we're adding an extra button that pops up when you're in the note taking view, so that if you don't want to use the onscreen virtual keyboard to exit the view, you can hit the red X. Our CREU faculty member recommended this, and I think it will be more intuitive for people who aren't familiar with the iPad or iPhone. Here's a new screenshot of someone using the note taking view; you can see the additional button at the top right of the note box. Alex made the button using Photoshop.


We were curious about how our app would look to color blind users, and we found a site where you could upload an image and it would show you how it looked to people with different types of color blindness. 


We tried submitting an image of our app, and here is what it produced. In order, these show how the app looks to people with tritanopia, protanopia, and deuteranopia.




While we didn't really consider color blindness very much when writing up our requirements, it might be something to take into consideration at some point.  We might not want to rely too much on the use of color to indicate what an icon does.
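For anyone curious how tools like that site work: a common approach is to multiply each pixel's RGB value by a 3x3 matrix that collapses the color distinctions the missing cone type would normally provide. The sketch below (in Python, for illustration only) uses approximate matrices that circulate in accessibility tooling; we don't actually know which matrices the site we used applies:

```python
# Rough sketch of dichromacy simulation via per-pixel matrix multiplication.
# The matrices are commonly circulated approximations, not the exact
# transform used by the website mentioned above.

SIM_MATRICES = {
    "protanopia":   [[0.567, 0.433, 0.0], [0.558, 0.442, 0.0], [0.0, 0.242, 0.758]],
    "deuteranopia": [[0.625, 0.375, 0.0], [0.700, 0.300, 0.0], [0.0, 0.300, 0.700]],
    "tritanopia":   [[0.950, 0.050, 0.0], [0.0, 0.433, 0.567], [0.0, 0.475, 0.525]],
}

def simulate(rgb, kind):
    """rgb: (r, g, b) channels in 0..1; returns the simulated color."""
    m = SIM_MATRICES[kind]
    return tuple(sum(m[i][j] * rgb[j] for j in range(3)) for i in range(3))
```

Note how, under the protanopia matrix, pure red ends up with nearly equal red and green channels, which is exactly why we shouldn't rely on red/green differences alone to distinguish icons.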

Other things we looked into this week were video streaming to the iPad, because that might be one way we can show the current lecture or old lectures. Apple seems to provide a lot of documentation on how to handle video streaming, so if we want to go down that route we should have enough information. Abhishek is also trying to host the database we're going to be using for Access Lecture, since we will probably be able to start working with it more next quarter.

Overall I think this week was pretty productive; now we just need to get our new apps onto the iPad to test them out on a real device, and then they'll be ready for our usability testing.

Lindsey

Friday, February 4, 2011

Week 8 update

Hello everyone!

Last week we posted about a fix we made to the notebox. However, one problem still persisted: even after we dismissed the keyboard (please see the image in last week's post), the notebox would still be hanging there. We had to manually push the notes button (a pen) to return to the lecture view. This week we were able to fix this problem as well, so now hiding the keyboard automatically hides the notebox too.

Though we have mostly been talking about the application we are developing, we are also preparing to actually perform usability testing with the application's target users. For this purpose we put together three separate apps. Our main goal is to find the most convenient method of zooming, since zooming will be one of the application's most important features. There are many ways to invoke zooming; some of the most popular methods are tapping and pinching. We are trying to find out which zooming method the target users find easiest.
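Once the tests are running, one simple way to compare the zooming methods is to log how long each participant takes to complete the same zoom task with each method, and then compare the averages. A rough illustrative sketch (the names and numbers are hypothetical, not real test data):

```python
# Hypothetical analysis sketch for comparing zoom methods by the mean
# time participants take to complete the same zoom task.

def mean_task_time(times):
    """times: list of task completion times in seconds."""
    return sum(times) / len(times)

def rank_methods(results):
    """results: {method_name: [completion times]}; fastest method first."""
    return sorted(results, key=lambda m: mean_task_time(results[m]))
```

In practice we'd also want to weigh subjective preference, not just speed, since the fastest gesture isn't always the one users find most comfortable.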

Another area we want to investigate is the appropriateness of the icons: how easily users can relate each icon to its intended action. The placement of the icons is another focus, namely how easily users are able to reach them. We are also considering using two different sets of icons (colored and black/white) to find out which one is preferred.

Thus, we have the following applications ready for testing (although some minor work remains on one of the apps):
  1. Pinch to zoom (colored icons)
  2. Pinch to zoom (black/white icons)
  3. Tap to zoom (some work remains)
As of this week, the first two apps are completely functional and ready to be rolled out for testing. We are trying our best to finish the last app, which we hope to complete by next week!

--
Abhishek