Monday, February 21, 2011

Week 10 - Little Actions Can Make a Big Difference

Hello all,

We've just wrapped up Week 10 of our Winter quarter work on Access Lecture, and we're approaching some great stopping points. With one week left before our (much needed!) Spring break, we've been working hard to put the finishing touches on our prototypes to prepare for the imminent usability tests.

But first, some great news! A few months ago, the eligible members of the AL team (Lindsey and myself) applied for scholarships to attend the 2011 Richard Tapia Celebration of Diversity in Computing Conference. We found out over the weekend that we both received scholarships! The conference provides presentation, learning, and networking opportunities for students and professional computer scientists alike. Our current goal is to have a poster ready to present by the conference deadline. Oh yeah, the conference is in San Francisco, California, from April 3rd to April 5th; jealous?

Now that I've gotten to brag across the vast internet via Blogger, it's time for a summary of what we accomplished this week. Our main focus was polishing up the prototypes that will be used for usability testing during Spring quarter. Broken down, we worked on...
  • Automatically logging important usability data from the touch screen. This includes recording where the screen was pressed when users try to select certain icons, along with specifics about the pinching/tapping gestures performed during particular tasks.
  • Setting up a test database on a remote server; this will allow us to simulate how AL would work in a 'real' setting more effectively.
  • Data mining more research articles that relate to the development of Access Lecture in a variety of ways.
With regards to the automatic logging of multi-touch and touch screen interactions, we've made a great deal of progress; as a result, we can now objectively pinpoint many of the potential usability flaws in the AL prototypes. We plan to make heavy use of this feature during our usability tests. The feature works by logging the x and y coordinates of touch screen interactions, along with specifics about what those interactions were trying to accomplish.

For example: if we are testing a group of users who are color blind, and we notice that they consistently hit a certain edge of an icon or miss it entirely, we can conjecture that the icon in question blends into its surroundings under color blind vision.
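To give a concrete flavor of the idea, here's a rough sketch of that kind of coordinate logging (written in Swift/UIKit purely for illustration; the class name, log format, and gesture handling are placeholders, not the actual AL code):

    import UIKit

    // Illustrative sketch: a view that records where each touch and pinch
    // lands, so near-misses around an icon's edge can be spotted later.
    class TouchLoggingView: UIView {

        // One log entry per interaction: when and where it happened, and what it was.
        struct LogEntry {
            let timestamp: Date
            let kind: String   // e.g. "tap", "pinch(scale: 1.8)"
            let x: CGFloat
            let y: CGFloat
        }

        private(set) var log: [LogEntry] = []

        override init(frame: CGRect) {
            super.init(frame: frame)
            let pinch = UIPinchGestureRecognizer(target: self,
                                                 action: #selector(handlePinch(_:)))
            addGestureRecognizer(pinch)
        }

        required init?(coder: NSCoder) {
            fatalError("init(coder:) has not been implemented")
        }

        // Record the x and y coordinates of every touch that begins in this view.
        override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
            super.touchesBegan(touches, with: event)
            for touch in touches {
                let p = touch.location(in: self)
                log.append(LogEntry(timestamp: Date(), kind: "tap", x: p.x, y: p.y))
            }
        }

        // Record the centroid of a pinch gesture along with its scale.
        @objc private func handlePinch(_ recognizer: UIPinchGestureRecognizer) {
            let p = recognizer.location(in: self)
            log.append(LogEntry(timestamp: Date(),
                                kind: "pinch(scale: \(recognizer.scale))",
                                x: p.x, y: p.y))
        }
    }

With a log like this in hand, comparing recorded touch points against the known bounds of each icon is what lets us flag the consistent misses described above.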

The database is still being worked on; the framework we're using is taking some getting used to! More details will be posted next week, once we've made more progress.
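While the server setup is still a moving target, the general shape of the client side is simple: batch up the logged interactions and post them to the remote test server. A minimal sketch, again with a placeholder endpoint and payload rather than our real configuration:

    import Foundation

    // Illustrative sketch: ship a batch of logged interactions to a remote
    // test server. The URL and payload shape are placeholders.
    func uploadLog(entries: [[String: Any]]) {
        guard let url = URL(string: "https://example.com/al/test-log") else { return }

        var request = URLRequest(url: url)
        request.httpMethod = "POST"
        request.setValue("application/json", forHTTPHeaderField: "Content-Type")
        request.httpBody = try? JSONSerialization.data(withJSONObject: entries)

        // Fire-and-forget upload; a real client would retry on failure.
        URLSession.shared.dataTask(with: request) { _, response, error in
            if let error = error {
                print("upload failed: \(error)")
            } else if let http = response as? HTTPURLResponse {
                print("server responded with status \(http.statusCode)")
            }
        }.resume()
    }

    // Example usage with the kind of entries the touch logger records.
    let sample: [[String: Any]] = [
        ["kind": "tap", "x": 120.5, "y": 88.0],
        ["kind": "pinch(scale: 1.8)", "x": 240.0, "y": 300.0],
    ]
    uploadLog(entries: sample)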

That's all for this week! Time for all of us to cram for some finals and take a well-deserved week off.

Until next week...

- Alex
