Friday, August 12, 2011

End of the year

Over the past year the Access Lecture team has done a lot of amazing things that were new to all of us.

In the beginning we
During the second quarter we
During the third quarter we
And finally, over this past summer we have
We have done a lot and learned a lot over the course of this project. This was the first time I've ever worked on something that really had a purpose and wasn't just for a grade or to display technical skills; we've been making something that real students could use in their classes. I think that's pretty amazing.

The experiences and opportunities that this project gave me have been invaluable and I'm really glad I was able to be a part of the Access Lecture team and work with Alex, Abhishek, and our professor.


Thank you all for a great year!

Lindsey

A Final Reflection: 700 hours of total development yields 700 units of awesome

Today marks the last day of my work as a full-time developer for the Access Lecture project. I have been working on the project with Lindsey and Abhishek since the beginning of fall quarter, both full- and part-time.

While reflecting on all that we've done and experienced, I came upon a question that I felt needed immediate solving:
How many hours of development does it take to get to the center of a tootsie pop develop an awesome iPad application?

The answer? 700.
Now that we know the answer, let's make some Jeopardy questions from it!

The correct answer to each of these questions is: "What is 700 hours?"
  1. The time it took us to make significant advances in helping visually impaired students actively participate in math and science courses.
  2. A time period during which I learned about, conducted, designed, and analyzed usability tests.
  3. How many hours I have spent and will spend traveling to the Richard Tapia, Posters on the Hill, and ASSETS Conferences!
  4. How long I have been forced to stare at white/shiny/magical things.
  5. The amount of time in which we managed to learn Objective-C, the iPhone SDK, and the principles of iOS development, as well as apply these technologies in advanced and innovative ways.
  6. Time which I was fortunate enough to be able to spend forming valuable, long-lasting professional and personal connections.
  7. A time frame in which I learned about many different technologies which unexpectedly came up during the development process. These include Quartz-2D, OpenGL ES, remote desktop methodologies, handling online client/server authentication, and many more.
If you answered "700 hours" to any of the above, you are wrong. Your answer must be in the form of a question; thanks for playing.



In all seriousness, I am extremely thankful and appreciative that I received the opportunity to work on Access Lecture. It has truly been an invaluable experience, and has helped me grow both personally and professionally.

During the course of my research this year, I have been exposed to the following for the first time:
  • The general principles, practices, and processes involved in software engineering research.
  • The academic research community, both the technical subset and the community as a whole.
  • Attending conferences and presenting posters on my research.
  • The non-technical elements which are necessary to make a large engineering project successful, such as usability and requirement gathering.
  • Working closely and long-term within a small development team.
  • The challenges which arise from having to unexpectedly and quickly adapt a software project. Such challenges have included:
    • Completely reworking a feature on short notice
    • Quickly and fully learning and implementing various methodologies and technologies
    • Using preexisting technologies in innovative ways, and working within uncharted territory.
Needless to say, I am grateful to all parties who were involved in letting me experience all of these things. I would like to extend my utmost thanks and appreciation to:
  • Our project adviser (whom I don't know if I should mention by name) - For being extremely supportive and understanding, leading the way throughout the process, and coming up with an idea for a system which I (speaking from first-hand experiences in my secondary education) truly believe can greatly improve academic conditions for visually impaired students.
  • The AL Team (Lindsey and Abhishek) - I don't even know where to begin with this one; there is too much to say! The team was great to work with; we certainly accomplished tasks at a level that we could not have accomplished individually.
  • CREU (Collaborative Research Experience for Undergraduates) - For supporting our research and helping us receive the most from the experience.

That's all for now! There is a very good chance that I will be continuing work on Access Lecture part-time in the fall, and there is still the ASSETS conference in Scotland to look forward to!

- Alex

Wednesday, August 3, 2011

Week 9 - Working on a Brand New Poster!

Hey everyone,

Last week Alex told you guys about our new plans for the notetaking system. We got everything working, and we thought we were done with notetaking until we tried putting the application on the device.  It turns out that the iPad simulator in Xcode is not a good indicator of how well something will work on the iPad...in fact the application can run quite differently.

While notetaking was rather smooth on the simulator, when we tried it out on the device we were horrified. There was a pretty bad delay when writing, and everything came out angular, i.e. a circle would come out as a triangle. We tried optimizing it more, but we just couldn't get the notetaking to an adequate state.

We were previously using Quartz 2D to handle the notetaking/drawing system, and now we are considering using OpenGL instead.  We've learned that OpenGL is used a lot for iOS games, and we believe that it could potentially be faster and more efficient. Unfortunately, none of us have any prior experience with OpenGL, and it appears to be a bit time consuming to actually learn it.

We had planned on spending the last two weeks giving OpenGL our best shot, but then we received some excellent news. The poster application we submitted for ACM ASSETS 2011 was accepted! We had to write a short 2 page paper about the project and they also sent back a lot of feedback on what we wrote.  That feedback, in combination with the experiences we had at Posters on the Hill and the Richard Tapia Celebration of Diversity in Computing Conference will need to be taken into consideration when creating our new poster.

A couple of the things we learned from our previous poster presentations:
  • Text should be kept short! People do not want to come up to your poster and stand there for a few minutes reading. We received a couple of comments about our posters having too much text.

  • Make text larger! Not excessively large, but it's a big poster and people will be standing a couple of feet away from it. Make sure it won't be difficult for people to read the text at that distance.

  • We need to make it more clear that we are only working on the software aspect of this project. At the Richard Tapia conference people became very excited about the hardware and asked questions about that, and we had to explain that we did the iPad application portion of the project.
  • People like diagrams. It can sometimes be easier and faster to convey something with pictures than with text.
This week we will be working on the new poster. Some of the material from the previous posters may be able to be rewritten and shortened so that it can be used on the new poster, but we also have to write about all of the work that we have done since the last conference.

Lindsey

Saturday, July 23, 2011

Hard at work on the new note-taking system!

"Introducing the new Note-Taking Feature®. Innovation reinvented, all for the same incredible price."
You're welcome, Apple marketing team. I'll be expecting my job offer shortly.

We've been hard at work on incorporating the feedback from our latest set of usability tests into Access Lecture. We've already taken care of a surprising amount of our to-do list. However, there was one feature that all participants agreed needed a major overhaul: the note-taking feature.

What is the note-taking feature?
In addition to being able to follow along with lecture notes, our goal is to make Access Lecture capable of allowing students to take their own notes by writing directly onto the lecture.

The basic idea:

In using the new note-taking system, students will...
  • Press a button to enter "note-taking mode".
  • Use an iPad stylus to draw directly onto the real-time lecture stream.
  • Save their notes directly onto the iPad as an image.
The issues:
During usability testing, the note-taking system took a snapshot of the currently-visible portion of the lecture. The application then transferred to a separate screen, where pen/eraser options were available. However, the snapshot did not allow the user to scroll or zoom. Additionally, the user's notes stayed on the image, regardless of where the user scrolled to in the lecture view, until the user saved and cleared.

The solution:
After a lot of brainstorming (see the image to your right), we came up with a solution that we hope will mitigate the issues experienced by testers.

Now, upon clicking the note-taking button, the following will happen:
  • The user will not be taken out of the lecture view. This means that scrolling, zooming, and other features will still work as normal, and the lecture will continue being updated in real-time while the user is taking notes.
  • The application's two toolbars will switch to two toolbars formerly present in the note-taking mode/view.
  • To enable both writing and scrolling, the scroll gesture will require two fingers while note-taking is active.
  • The user can write on the real-time lecture, now with no snapshot involved.
  • Additional settings have been added, to adjust pen/eraser size.
We have this system nearly completed. The only issue we are facing is saving the user's notes, since the canvas is no longer a snapshot of fixed size.
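A rough Objective-C sketch of the two-finger-scroll idea (the property names here are ours, and it assumes UIScrollView's panGestureRecognizer property, which Apple exposes starting in iOS 5; on earlier versions you would have to walk scrollView.gestureRecognizers instead):

```objc
// Sketch: while note-taking is active, require two fingers to scroll the
// lecture, leaving single-finger touches free for the pen. Property names
// (lectureScrollView, noteTakingActive) are our own, for illustration.
- (void)setNoteTakingMode:(BOOL)active
{
    _noteTakingActive = active;
    UIPanGestureRecognizer *pan = self.lectureScrollView.panGestureRecognizer;
    pan.minimumNumberOfTouches = active ? 2 : 1;
    pan.maximumNumberOfTouches = 2;
}
```

With the pan gesture limited to two fingers, single-finger touches fall through to the drawing layer, so the user can write and scroll without ever switching modes.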

In conclusion:
We are extremely excited about the massive progress being made, largely with the help of our usability testers. Expect good news next week regarding the new note-taking system!

- Alex

Tuesday, July 19, 2011

Usability Testing Comes to a Close

Hello all,

We've just finished our second round of usability testing! It went better than we had hoped, and we have very high hopes
here at Access Lecture (Trademark pending. Don't steal it. It's mine).

The Testing:
This second round of testing focused primarily on receiving feedback with regards to the app's new features. New features included:
  • Zooming buttons (in addition to the pinch-gesture)
  • A revamped note-taking system
  • The ability to save notes to the iPad
  • A settings page (including toolbar preferences and scroll/zoom options)
  • New icons
Additionally, we were finally able to demonstrate a working proof-of-concept. Instead of showing pre-made/sample lecture images on the iPad, we were able to actually transmit our own whiteboard notes in real-time!

Test subjects were given a set of tasks to perform and give feedback on:
  • Following along with and taking notes on three sample, real-time lectures.
  • Adjusting the app's settings.
  • Exploring the new note-taking system.
  • Identifying the new icons.
After completing these tasks, additional, quantitative feedback was collected via a post-test interview.

Results:
While feedback in certain areas in the initial round of testing was a bit scattered, the feedback for round 2 tended to be nearly universal.

Here are some of the key conclusions which we have made about the app, based on user feedback from the latest round of testing:
  1. The new icons were solid. 100% of participants found the icons intuitive and easy to see.
  2. The new zooming buttons were generally considered to be intuitive, although many preferred using the pinch-to-zoom method exclusively.
  3. Although the idea behind the new note-taking system was praised, the implementation needs a lot of reworking. This was expected, as this feature is still very much a work in progress.
  4. 90% of participants would find Access Lecture helpful in a real course.
  5. The settings need fine-tuning, but are on the right track.
Next Steps:
Work on integrating the feedback from the usability tests is already well underway. We have analyzed and prioritized the feedback, and have many of the high-priority fixes either implemented or near completion.

What remains to be done is to continue our brainstorming of an appropriate note-taking system. Participants did not want a still image to write on, and would have preferred being able to both write and scroll on the actual lecture screen. This poses a significant usability challenge: how does one support useful and intuitive note-taking on top of a real-time image stream? We are all excited to start brainstorming solutions!

Oh yeah, we hear back from ASSETS '11 in 8 days.

- Alex

Monday, July 11, 2011

Halfway through summer!

Hello everyone!

Last week Alex mentioned that our professor wanted us to begin working towards our second round of usability testing.  If you recall, our first round of usability testing was focused on the user interface, note-taking, icons, and the multi-touch gestures we can use for navigation. The tests went very well, and gave us a direction to move in with our application as development began.  Since then, we have worked a great deal on the application and it has become very...well...legit looking! A lot of the initial functionality we wanted to see in the application has been implemented, the note-taking system was revamped, and we were finally able to stream real lectures to the iPad.


And so, another round of usability tests was suggested. I'm a fan of the cyclical nature our development has taken: gathering requirements, developing, testing, using the feedback to further polish the system, and so on.  We want to make sure that the direction we've taken since our last tests was the right one! That's why last week we developed new activities for the usability test, wrote up new pre-activity and post-activity questions, and completely rewrote our test script.  We also had to modify some of our code so that the application saved the user's notes to the iPad's photo gallery (previously we were saving images on the desktop so that we could look at them easily). A couple of other minor changes were made (adding a clear-all button to the note screen, adding more feedback for the user so they can tell when they've saved something, adding an about page that has information about the developers/special thanks to CRA, etc.) and we also created a new logo (seen above). It's supposed to be a whiteboard with a couple of markers and then the project name.  We wanted to have something nice to display on the home screen of the app.


Alex and I also came in on Saturday to make up some time we missed, and we worked on writing a "Related Work" section for a paper about this project.  We were unable to write a paper for the ASSETS conference this year, but there are other conferences during the next academic year, and next year's Access Lecture team might want to submit a paper for ASSETS.


This week we began the usability tests.  We were able to schedule two people for today, and there are four people scheduled to come in tomorrow. For our last round of tests we had 8 test subjects, and this time we'd like to get an even larger data pool. We appealed to one of the Computer Science professors who is here over the summer doing research with students, and a couple of his students seemed interested in being test subjects.


The two tests we ran today went well, which wasn't a surprise because we've now had a bit of experience doing usability testing. Alex and I conducted the first usability test, and then Abhishek and I conducted the second test. We're already starting to see some issues pop up that didn't occur to us when we used the application, but we've also gotten some positive responses about things we've changed since the last round of tests, so it's very satisfying.


We'll update you guys more after we've finished up with our tests and looked at the data. So far it seems like we're going to have a lot of interesting things to consider about the application and where we're going with it.


Lindsey

Friday, July 1, 2011

Preparation for round 2 of usability testing begins!

Hello all,

We're extremely excited right now regarding our recent leaps in progress on Access Lecture. I believe that our progress can be best summarized as such:
Supervisor: "Alright team! It's time to get ourselves together and tackle this real-time screen capture issue once and for all!"
AL Team: "Uhm, it's already done."
Supervisor: "...Really?"
AL Team: "Yep."
Supervisor: "Oh...well this is awkward."
AL Team: "Yeah..."
Supervisor: "Oh yeah? We still need to integrate it into the main project! Now get on that slackers!"
AL Team: "Did that too."
Supervisor: "Of course, I knew that!"
AL Team: "...So...what do we do now?"
Supervisor: "Do some usability tests! There's no way that you've had time to do that too!?"
AL Team: "Uh, no...no we haven't done those yet."
Supervisor: "What a bunch of slackers."

The AL project now boasts the implementation of these features:
  • Integration of the real-time screen capture features which were discussed in the last blog post!
  • Full zooming/navigation functionality has been added and modified to work with the real-time capture screen.
  • A nice cleanup and sharpening of the app. Features are now more responsive to the user, and various bugs/glitches have been tidied up.
The key victory is that first bullet. As we discussed in our last blog entry, we were having more than a little trouble integrating the real-time capture functionality with the rest of the app; this was mostly due to the UIWebView not appreciating having its ScrollView modified. We decided to step back and take a different approach, and it certainly paid off! Instead of displaying a webpage which refreshes continuously with images of the Mac's screen, a UIImageView is now fed a stream of those same images. The result is a view that looks exactly like the WebView, but actually functions!
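The UIImageView approach can be sketched roughly like this. The URL and timer interval below are made up for illustration; in the real app the image data arrives over the Telekinesis connection:

```objc
// Sketch: periodically fetch the latest screen capture off the main thread
// and hand it to a UIImageView. A hypothetical URL stands in for our
// actual Telekinesis transport.
- (void)startStreaming
{
    self.frameTimer = [NSTimer scheduledTimerWithTimeInterval:0.5
                                                       target:self
                                                     selector:@selector(fetchFrame)
                                                     userInfo:nil
                                                      repeats:YES];
}

- (void)fetchFrame
{
    NSURL *url = [NSURL URLWithString:@"http://mac.local:5010/screen.jpg"]; // hypothetical
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        NSData *data = [NSData dataWithContentsOfURL:url];
        UIImage *frame = [UIImage imageWithData:data];
        if (frame) {
            dispatch_async(dispatch_get_main_queue(), ^{
                // Looks just like the old web view, but plays nicely with
                // our scroll view and zooming code.
                self.lectureImageView.image = frame;
            });
        }
    });
}
```

Because the UIImageView lives inside our own UIScrollView, all of the existing zooming and navigation code works on it unchanged.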

Now that we have gotten the app to a stable and working proof-of-concept state, we are ready to conduct our second round of usability testing. The insight and feedback obtained from the first round was extremely valuable, and with so many new features added since then, the expectations for round 2 are equally as high. The goal is to begin the testing no more than 2 weeks from now.

Additionally, our applicant-paper for the ACM ASSETS 2011 conference has been finalized and submitted! Here's to hoping that we'll be in Scotland come fall.

- Alex

Monday, June 27, 2011

Fighting with UIWebViews

Last week we wrote a 2-page paper about Access Lecture to try and get our poster accepted for the 2011 ASSETS conference in Dundee, Scotland. We always thought writing one of those long research papers would be difficult, but we found that our initial draft was almost 5 pages long. It turns out that it's really difficult to write about everything we've done when there's a limit of only 2 pages!  ASSETS is a conference specifically dealing with accessibility and assistive technology so we're really crossing our fingers and hoping we get accepted.

After finishing the paper we decided to try making our zooming work with the UIWebView we were using to display the notes. We quickly learned that we needed to inject JavaScript into the website in order to enable increased levels of zooming.  For pinching to zoom this worked fine, but when we tried using our code to programmatically zoom via buttons, we encountered some problems.

When we were working with a UIImageView we were able to easily calculate the center point of the view and zoom in on that point. For some reason, even though we are again calculating the center point, the UIWebView only zooms in on the top left of the screen. Additionally, there are many quirky bugs that we didn't experience when working with UIImageViews.
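For reference, the center-point zoom that behaves correctly with a plain UIScrollView and UIImageView looks roughly like this (the method name is ours):

```objc
// Sketch of center-point button zooming: compute the center of the visible
// region in the zoomed view's coordinates, then ask the scroll view to zoom
// to a rect centered on that point.
- (void)zoomByFactor:(CGFloat)factor
{
    UIScrollView *sv = self.lectureScrollView;
    CGFloat newScale = sv.zoomScale * factor;

    // Size of the region that will be visible at the new scale.
    CGSize newSize = CGSizeMake(sv.bounds.size.width  / newScale,
                                sv.bounds.size.height / newScale);

    // Center of what is currently visible, converted out of the current scale.
    CGPoint center = CGPointMake(
        (sv.contentOffset.x + sv.bounds.size.width  / 2) / sv.zoomScale,
        (sv.contentOffset.y + sv.bounds.size.height / 2) / sv.zoomScale);

    CGRect zoomRect = CGRectMake(center.x - newSize.width  / 2,
                                 center.y - newSize.height / 2,
                                 newSize.width,
                                 newSize.height);
    [sv zoomToRect:zoomRect animated:YES];
}
```

This is the calculation that the UIWebView's built-in scroll view does not seem to honor; it snaps to the top-left corner instead.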

We've posted on Stack Overflow and a couple of iOS development forums, but so far we haven't received any responses.  We think that we lack an understanding of the connection between the UIWebView and its built-in UIScrollView; however, we haven't found any comprehensive explanations of that relationship. We're hoping that we can learn more about how they work together so that we can understand what's different about the two scroll views. Right now the behavior of the web view's scroll view is very different than the behavior of the scroll view we previously implemented. Apple also makes it a little challenging to get access to the UIWebView's scroll view, which might mean that developers aren't intended to manipulate it very much.  In our experience so far, Apple typically makes methods private if they don't want you to use them in apps, but the fact that they don't supply an easy way to access the scroll view could also imply that we're not supposed to be meddling with it.
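The JavaScript injection mentioned above is done with UIWebView's stringByEvaluatingJavaScriptFromString: method; a minimal sketch of raising the zoom cap looks like this (the exact scale values are just the ones we experimented with):

```objc
// Sketch: after the page loads, inject a viewport meta tag to raise the
// maximum pinch-zoom scale, which UIWebView otherwise caps at a low value.
- (void)webViewDidFinishLoad:(UIWebView *)webView
{
    NSString *js =
        @"var meta = document.createElement('meta');"
        @"meta.name = 'viewport';"
        @"meta.content = 'minimum-scale=0.25, maximum-scale=10.0, user-scalable=yes';"
        @"document.getElementsByTagName('head')[0].appendChild(meta);";
    [webView stringByEvaluatingJavaScriptFromString:js];
}
```

This is enough to make pinch zooming work well past the default limit; it is the programmatic, button-driven zoom that remains stubborn.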

~Lindsey

Thursday, June 16, 2011

Pieces coming together

Hello all,

Just as our second week of 9-5 development comes to an end, our "proof of concept" implementation does as well!

We have worked towards one primary goal this week: Create an iOS application that is capable of connecting to a Mac remotely, and displaying the Mac's screen contents in real-time.

We have finally been able to receive and setup our Mimio Capture and Mimio Teach system next to our workstation (see photo); the Mimio is being used as a tool to aid in the demonstration of a proof of concept for Access Lecture. Essentially, the Mimio system wirelessly transmits what is drawn on a whiteboard to a PC/Mac program. Our application contrasts with the Mimio in several important ways (hardware usage, portability, and accessibility focus to name a few), but the Mimio can still provide us with the ability to mimic the ideal/eventual functionality of Access Lecture.

Our task was to take the capture obtained by the Mimio and make it viewable from within our application on the iPad. There are many possible ways to approach this problem; the approach we chose involves establishing a wireless, remote connection to a Mac which is running the Mimio software. We were able to use the open source iOS framework "Telekinesis" to easily obtain core remote-connection functionality. We built on top of Telekinesis and used it to stream the Mimio capture display to the iPad with little latency (about a 1 second delay). An example screenshot of the app displaying the Mimio capture from the whiteboard image above can be seen here.

Our proof of concept has the following requirements:
  • Access to a wireless network
  • A Mac with the Telekinesis software installed, setup, and activated
Clearly, this model involves (generally) minor software/hardware constraints; in terms of these constraints, it is still extremely unclear whether the final application will even remotely resemble this setup. The current technical setup is meant only to demonstrate, and does not yet provide a satisfactory level of practicality for our user base.

Well, that is all for this week. We're very excited with the significant leaps in progress that Access Lecture has seen since full-time development began just two weeks ago, and we plan to continue this trend in the weeks to come.

- Alex

Tuesday, June 14, 2011

Update after the first week of fulltime development

Hey everyone!

So last week we posted to let you guys know about the tasks that we wrapped up and our plans for starting to develop the settings aspect of the application.  We're getting pretty close to finishing our initial settings page so I thought I'd summarize the process we went through and what we ended up making.

To begin with, we decided we wanted a settings page that looked like the Settings application that Apple uses.  After researching a bit, we read online that you could use a UISplitViewController to achieve the same look.  "Great!" we thought, and we eagerly began working on some UISplitViewController tutorials.

After messing around with the data structure for a while we started to notice something odd...it was incredibly difficult to make the SplitViewController exist on anything but the root view.  Assuming we had simply hit a bump in the road, we consulted the SplitViewController documentation and lo and behold! Apple comes right out and says that Split View Controllers shouldn't be used outside of a root view.

Well, that was bad news for us. Here we had found this magical class that could do everything we wanted, but the restrictions on it would keep us from being able to use it with our navigation-based application.  We desperately searched through the depths of Stack Overflow and the split view controller references, but alas, it simply could not be done, unless you wanted to do something hacky and cheat the system by attaching the split view controller to the root view and then, whenever the application loads, quickly pushing another view on top to hide it.  That could have worked, but the lack of elegance and the risk that it might get our application rejected from the App Store made it seem like a pretty poor option.

And so we looked for alternatives, and we found something called the InAppSettingsKit, which has a bunch of wonderful open source code that allows developers to create a settings page very easily by editing a plist.  Even more encouraging, there are plenty of apps that use it that are already in the App Store.

We looked at a sample project where the InAppSettingsKit was used and then imported the InAppSettingsKit into our project and started working on the plist.  Eventually we ended up with a lovely settings page that looks like this:
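For the curious, InAppSettingsKit reads the same schema as a regular iOS Settings bundle, so a slice of the plist might look something like the following. The group title and the toolbar_alpha key here are illustrative, not our exact identifiers:

```xml
<!-- Illustrative fragment of an InAppSettingsKit plist. The Type values
     (PSGroupSpecifier, PSSliderSpecifier) come from Apple's standard
     Settings-bundle schema, which InAppSettingsKit reuses. -->
<dict>
    <key>PreferenceSpecifiers</key>
    <array>
        <dict>
            <key>Type</key>   <string>PSGroupSpecifier</string>
            <key>Title</key>  <string>Toolbars</string>
        </dict>
        <dict>
            <key>Type</key>          <string>PSSliderSpecifier</string>
            <key>Key</key>           <string>toolbar_alpha</string>
            <key>DefaultValue</key>  <real>1.0</real>
            <key>MinimumValue</key>  <real>0.3</real>
            <key>MaximumValue</key>  <real>1.0</real>
        </dict>
    </array>
</dict>
```

Editing this one file is essentially all it takes to add or rearrange settings, which is a big part of why we adopted the kit.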

The great thing about InAppSettingsKit is it allows you to add setting options that bring the user to a custom view for changing the setting.  We ended up doing this for the color changing settings, seen below

Another feature we wanted to mention was the toolbar transparency setting we added.  During our usability tests a couple of the subjects pointed out that it'd be nice if you could turn the toolbars transparent and then show more of the lecture behind them.  This was a great idea, and I worked on implementing it; now when you change the transparency setting (using a slider that alters the alpha value of the toolbar) you end up with something like this:
As you can see, the lecture image now shows up behind the toolbars.  Turning the toolbars transparent was very simple, but we also had to make sure that the scroll view displaying the lecture extended and retracted depending on what the transparency was set at. This also meant that when you go to the notes view to write on part of the lecture, it will need to show more or less of the image depending on whether or not the toolbars are transparent. Lastly, I changed the minimum alpha value on the slider to be 0.3, because if you make a UI element too transparent, you will actually lose the ability to interact with it even if you tap the space where it was shown.
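The core of it is only a few lines; this sketch uses our own property names (topToolbar, bottomToolbar, lectureScrollView) and ties the slider and the scroll view resizing together:

```objc
// Sketch: the transparency slider drives the toolbars' alpha, and the
// lecture scroll view grows to fill the space behind them whenever they
// are not fully opaque. Property names are ours, for illustration.
- (void)transparencyChanged:(UISlider *)slider
{
    CGFloat alpha = slider.value;        // slider.minimumValue is set to 0.3
    self.topToolbar.alpha = alpha;
    self.bottomToolbar.alpha = alpha;

    CGRect frame = self.view.bounds;
    if (alpha >= 1.0) {
        // Fully opaque: the lecture sits between the toolbars.
        frame.origin.y     = self.topToolbar.frame.size.height;
        frame.size.height -= self.topToolbar.frame.size.height
                           + self.bottomToolbar.frame.size.height;
    }
    // Otherwise the lecture extends behind the translucent toolbars.
    self.lectureScrollView.frame = frame;
}
```

The 0.3 floor in the slider is what keeps the toolbars tappable, per the interaction issue described above.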

So...we managed to get a lot done last week in regards to the settings, and our application actually saves the settings and will restore them after you close and reopen the application.

The professor we're working with also ordered us the Mimio to work with, and it arrived last week. We haven't had time to use it very much because we were wrapping up the settings, but this week we're going to start looking into the actual transmission of data from a computer to the iPad application.  I don't think that any of us have worked on a program that involved that kind of logic before, so it's going to be an interesting experience for us all.

Lindsey

Tuesday, June 7, 2011

Summer development starts

Hey guys,

Alex, Abhishek, and I are continuing our work on Access Lecture over the summer.  We just started working yesterday and we've already gotten a couple of things done.

  • Abhishek solved the zooming issues we were having and is now refactoring our zooming code by moving it into more appropriate classes
  • I solved a couple of bugs with the note taking system that we didn't realize we had.  Now our re-envisioning of the note system is working pretty well. We have two versions of it right now and for the version I was working on, the user can choose to take a "snapshot" of the current portion of the notes they're viewing and then write notes directly on that snapshot. A lot of the users during our usability testing wanted some way to annotate the notes or write directly on them and we believe this new system will be more in line with what the users were expecting.
  • Alex was working on another version which was more similar to the old style of note taking we had but instead of typing notes, users could write them in a box that would appear below the lecture that's being viewed.  While this provides less area to write with, and doesn't allow users to write onto the lectures, it allows them to take notes & view the lecture as it's happening in real time.
  • Alex started working on adding a new page to the system where the user will be able to adjust settings for the system...
  • And I started working on implementing some of the settings that users will be able to change, i.e. changing the color of tool bars, changing the transparency of tool bars, etc.

There's a lot more that we'll be working on this summer...this is just what we've accomplished during our first two days. We're going to be working every day during the week so there will be a lot more to come :)

Lindsey

Friday, May 20, 2011

Spring Quarter Reflection

The major achievements this quarter include usability testing and further improvements to the application based on the feedback received.  Another highlight was Lindsey and Alex's participation in two conferences: the Richard Tapia Celebration of Diversity in Computing Conference and Posters on the Hill.

We conducted usability tests on a sample of our target audience in the early weeks of this quarter. They helped us gain insight into a few areas of our development work, like the UI elements as well as the gestures that are used for accessing the application. We found that most of the users preferred the "pinch to zoom" gesture to "tap to zoom" when zooming in and out of the lecture screen area. We also got some feedback regarding some of the icons that we had used. Most of the participants expressed confidence that the application would indeed be very helpful to them in a classroom setting, provided a few modifications were made. Most of the participants also expressed concern regarding the note-box area which is used for writing notes related to the lecture: whenever the user presses the take-notes button, a note-box pops onto the screen, covering almost 20% of it and reducing the lecture view area.

Acting on some of this feedback, we have already started working on changes like redesigning a few of the icons, as well as allowing the user to take notes (using a stylus) right on top of the lecture screens and save them. From the usability tests, we also found that the use of keyboards, both built-in and external, presented their own inconveniences, so we are planning to test with a stylus for the next round. We have also been working on features like allowing the user to select different colors in which they can scribble notes.

We will be extending this work over the summer quarter, and we hope to get a lot done then, as Lindsey and Alex will be working full-time.

--
Abhishek

Thursday, May 5, 2011

Progress before the school year ends

Hey everyone!

I mentioned in my last post that we were planning on wrapping up a couple of little things with the app before the school year ended.  While we will be continuing work over summer, we will be off from work for 2 weeks so we wanted to leave off at a good point. 

This week we reworked a lot of the icons to make them easier to distinguish (i.e. making certain icons thicker and removing gradients, since gradients made the icons more difficult to understand when inverted).  Abhishek had built a test application where you can take notes using a stylus, and this week we were able to integrate it with our project.  We added a navigation system so that the user can open different pages of the application. While we were changing the application to be navigation based, we were able to refactor it a bit and clean up a lot of the messy stuff that had snuck its way in over time. We also made sure the code was documented.  Since we will be away from the project for a while before we start up again over summer, it's important that the code is readable and easy to understand, so that was another priority before the school year ended.

Finally, we wanted to remove the tap zooming, which our usability test subjects found unintuitive and conflicting with other applications' gestures.  The app no longer supports tap zooming, and we're working on salvaging some of that logic for the zoom in and zoom out buttons.  At first we expected reusing the code to be very simple, but we've since realized that without a gesture to send a location to your zooming methods, it can actually be difficult to pinpoint the coordinates you need to work with. We've all taken a couple of shots at calculating the numbers we need, but so far we've been unsuccessful; however, since we've finished everything else we wanted to work on before summer, we will have all of next week to devote to tackling this problem.

We'll post next week to summarize the work we've done over the past quarter, and where we were able to leave off on.  And then in a couple more weeks we'll start regularly posting updates again!

-Lindsey

Monday, April 25, 2011

Rethinking the App

Hey all,

Before the conferences we wrapped up our usability tests and now that we've returned from San Francisco and Washington we've had time to really consider the test results and the feedback we received.  We created a summary of the data and the changes that we want to consider making to the project based on what we saw and what the subjects told us.

While we have a lot of ideas on things that we might want to change, remove, and add, we decided to pick a couple of the smaller more manageable ones to work on as the school year draws to an end.  Over the summer we will take on the larger changes and tasks.

Icons

Some of the small things we've been working on are remaking the icons that subjects had difficulty identifying, seeing, or selecting. This was a pretty simple fix with Photoshop, but hopefully it will make our icons much more visibly accessible than they were during the testing process.


Zooming

Another change we realized we'd need to make after the usability tests concerned our tap-to-zoom method.  The usability tests showed that the method was unintuitive and less accurate, and that the way we handled zooming out actually conflicted with accepted zooming conventions.  One of our test subjects informed us that in Safari, a double tap (when you're at max zoom) brings you back out to the non-zoomed page.  We do not want the gestures in our app to have unexpected consequences or to conflict with the already existing conventions of zooming in iOS.  As a result, we've decided to remove the tap-to-zoom functionality and instead reuse that code for the zoom in and zoom out buttons we have in the app.

While the tap method was largely disliked by the test subjects, several of them pointed out that it was nice in that it zoomed to set intervals, which the pinch method of zooming does not really provide.  As a result, we're going to use the tap-to-zoom logic with our icons and have them zoom in and out at set intervals for the user.

Annotation for Note Taking

One of the biggest concerns the subjects expressed was that the notebox, with either the virtual keyboard or a physical keyboard, would not be sufficient for taking notes in class. The virtual keyboard was too difficult (you have to type with two fingers instead of using home row, and if you try to type with multiple fingers you will often accidentally activate the iPad's built-in zooming) or too slow for subjects to use.  The physical keyboard, on the other hand, could not provide the symbols that students need when taking math or science notes.

The subjects almost all initially believed that the app could be used to tag notes onto certain parts of the lecture, and would try dragging the note box onto the lecture screen.

To try and address both of these very valid points we've decided to try implementing a way for the student to be able to just write notes directly onto the lectures. While we're still working on figuring out how we will save the notes in regards to the lecture, Abhishek has made a test application where you can use a stylus and take/erase notes onto the screen and we will likely be able to use that functionality in our app once we figure out how we want it to work and how we will store the data.

In Conclusion....

While we won't start the hardcore development until the summer, we still have some interesting things we're going to be working on over the next two weeks so stick around and we'll let you know how we're doing and where we're going to be at once summer starts.

-Lindsey

Monday, April 18, 2011

The conferences come to an end; development begins again!

Hello all,

Saying that we have been busy lately would be an understatement! We have spent the last three weeks preparing and presenting our poster at both the Richard Tapia Celebration of Diversity in Computing Conference and the Posters on the Hill Conference (sponsored by the CUR).

Posters on the Hill
We have just returned from Posters on the Hill, and it was an amazing experience. The conference took place at Capitol Hill in Washington DC, and was certainly different than your 'typical' conference. In addition to presenting our poster, we were also able to schedule meetings with legislative assistants of several of our congressional representatives, and we presented our poster to the legislative assistants we spoke with. There was a great deal of interest and positive feedback about Access Lecture from a wide-ranging audience; this included congressional representatives, faculty, other poster presenters, and representatives from both the American Chemical Society and CUR.

Our poster was one of 74 which were chosen from an applicant pool of 700+ posters; it is a great feeling to know that our work was chosen to represent the importance of domestic undergraduate research. In addition, getting to go to the National Zoo was not a bad deal either!

Tapia Conference
Just one week prior to attending Posters on the Hill, we attended the Richard Tapia Conference in San Francisco. We were chosen as both scholarship recipients and poster presenters, which meant that we got to attend the conference for all three days and present our work, all free of charge! The conference consisted of talks from distinguished speakers, panels that provided students with information on how to succeed in various aspects of computing, company poster presentations (with companies including Microsoft, Google, Yahoo!, Intel, Symantec, NetApp, and Amazon), professional networking opportunities with companies and academics, and a student poster session.

The Tapia Conference provided valuable experience and exposure to professional networking and student research environments, as well as the opportunity to speak personally with representatives from the companies listed above (and many others). The poster presentation session was excellent; those who saw the poster raised points or gave us ideas which we had not considered before; the poster session also helped us realize what we need to clarify about Access Lecture, and what common questions and misconceptions may be.

Oh yeah, we got to stay at the wonderful Fairmont hotel...free of charge. I think that I am now inspired to become rich, simply so I can live there permanently. We didn't have much downtime, but we did manage to explore San Francisco (getting lost for two hours in the process, thanks to a 'questionable' GPS) and find some great places to eat! It was a sad day when we had to return to the East Coast, where we were immediately greeted by chilly weather and rain.

Conclusion
The past three weeks have been hectic, but very worthwhile. We have tons of photos, but I do not think that they will be needed; these experiences will certainly remain in memory for life! Work on Access Lecture will now resume as normal, with the next step being to further analyze our usability test data and refine our prototype.

Until next time...
- Alex

Friday, April 8, 2011

Hello World to those from Tapia 2011!

Hello everyone,

We've just returned from the 2011 Tapia Conference. We made a lot of connections there, and a lot of attendees expressed interest in our research during our poster presentation.

I just wanted to post this message for those that Lindsey and I met at Tapia, who may be checking this blog out for the first time! We haven't updated in 3 weeks, mostly due to trying to get ready for Tapia, usability testing, and the Posters on the Hill Conference in about a week.

No need to fear though; we will begin updates as normal soon. Here are a few things to expect:
  • A summary of the Tapia Conference. It was an amazing opportunity, and summarizing the experience will be difficult.
  • A report on the upcoming Posters on the Hill conference in Washington DC.
  • An in-depth review of our usability tests, and what we plan to do next.
Thanks to everyone for all of their interest and support.

Check back soon for updates!

Until next time...
- Alex

Friday, March 18, 2011

Week 2: Gearing up for the Usability Test!

As we finish off the second week of the quarter, we are pretty much ready for next week's usability test. We were able to fix the glitch Lindsey mentioned in last week's post, the problem encountered when plugging in an external keyboard, thanks to Alex's untiring effort! Now, if an external keyboard is plugged in, the virtual keyboard does not show up when the focus is in a textbox.

Also this week, Alex and Lindsey have been very busy preparing the design of the posters for the conferences they are attending next month, and the final design is ready. Once the design is approved by Prof. Ludi, they will start filling in the content. Hopefully they'll get that done next week, so watch for a preview in next week's post!

We were also quite busy this week preparing the schedules for the usability test. We are planning to carry it out over the next two weeks. We are really excited about the testing and hope to find interesting data. We'll let you know about this in the next entry as well.

Another fix this week was again a matter of polishing the application: maintaining a log of the events generated by the user's interaction with the system. For instance, we log which icons the user pressed, the exact location (the tapped coordinates), and a timestamp. We also log whether the user zoomed in or out. We were able to capture this information earlier but were struggling to write it to a file; this week we finally got that working as well!

So it all looks quite good and smooth as we are gearing up for the test next week!

--
Abhishek

Wednesday, March 16, 2011

Week 1 again!

Hello everyone,

We started off this quarter by putting our test applications on the iPad to try them outside of the Xcode simulator.  They both worked fine except for a small problem that occurred when you plugged in an external keyboard.  We want users to have the option of using either the virtual keyboard or a physical keyboard, but we had overlooked the fact that our lecture view will display a little bit differently depending on whether or not the screen has to display the keyboard as well.

We realized early on that when the virtual keyboard popped open it'd cover up part of our lecture view, so to fix this we made it so that whenever the notes box was opened for typing, we'd push the lecture view up so that none of it would be hidden by the keyboard. Unfortunately we forgot that when the user plugs in the physical keyboard, we won't need to push the lecture view up.  We were able to fix the problem (thanks to some old posts we found on stackoverflow.com) so now the apps will display nicely regardless of what keyboard you choose to use.

Aside from that, Alex and I have been working on preparations for the two conferences we will be attending next month.  For one of the conferences we needed to revise our abstract, and for both of the conferences we need to create our posters.  Alex and I brainstormed yesterday to figure out what information we want to highlight on the posters, and then today we created a nice mock-up of one of the posters using Photoshop.  Alex used some of his Photoshop ninja skills (which I'm hoping will rub off on me...) and now our design looks really nice.  Neither of us has made a poster before, so we're waiting to get feedback from the professor we're working with.  Once we've come up with a solid design, maybe we can post the mock-up on here as a nice little preview.

We're also going to start scheduling people for the usability tests next week. We were able to recruit about 10 people who wear glasses and want to do the tests.  While it's disappointing that we were unable to find visually impaired volunteers, many other research projects we've read about encountered similar problems and likewise had to use people who wear glasses.  Many tests designed for completely blind subjects have had to use blindfolded volunteers, because sometimes it's quite difficult to find people who are actually in your target group.

We'll let you know what happens this week with the development of our poster and with the upcoming usability tests!

Lindsey

Monday, March 7, 2011

Second quarter reflection

Hello everyone,

We managed to make a lot of progress this quarter, including:
  • Reading papers to find out what other researchers have already discovered about areas related to what we're working on (i.e. usability for multi-touch, mobile device accessibility for the visually impaired, etc.)
    • We also organized all of the relevant articles we found, and made a chart so that you could see which articles were related to different keywords. (i.e. which articles are about zooming)
  • Reading papers on usability tests/testing to learn the process
  • We developed several prototype applications that we could use during initial usability tests to figure out what kind of gestures the users preferred and found intuitive, as well as how the users felt about the UI.
    • We ran into a lot of bumps along the way as we were developing these, but ultimately we reigned supreme over Objective-C and all of the weird little bugs we had made.
    • As we progressed with our research into other usability tests that have been done, we found a paper discussing different types of images that can be used to test various aspects of a note-taking sort of application.  We decided to adjust our test application so that you can cycle through several different types of images, each testing something different.
    • We also implemented part of the note-taking aspect of the application, which will allow us to test the functionality of our application by having the test subjects try to zoom in and read different notes and see if they can take their own notes using our app.
      • This will help us determine whether or not our app is actually able to help people with viewing notes.
  • We wrote out our plans for the usability test as well as making forms that we could use to fill out the test subjects responses to questions, and the form will store the information in a spreadsheet which will make the data easy to evaluate.
  • We wrote code that allows us to print out things such as which button the user presses and where, as well as the time of the action.  We can use this kind of information to see if people have difficulty hitting buttons, or figuring out which buttons to press, for example.

And two of the most exciting developments for the quarter were our admission into two different conferences! We'll be presenting a poster at both:

  • Posters on the Hill which is a conference where students can present their research to government officials, such as congressmen.  The purpose of the conference is to show government officials where research funding actually goes, and what kinds of research are produced as a result.  We're going to want to show them how important our work is so that they'll continue to support undergraduate research.
  • The 2011 Richard Tapia Celebration of Diversity in Computing conference, which is going to be in San Francisco and includes a lot of panels by people in research and industry, which we're both excited about seeing. The program is up on the website, so if anyone wants to see what kinds of things we'll be doing for those 3 days you can view that here

All in all we've had a really busy (and really exciting) quarter.   I'm guessing that this quarter will be even busier, and even more interesting, so I'm definitely looking forward to it.

Lindsey

Winter Quarter Reflection

The Access Lecture team was able to achieve quite a lot this quarter. The main highlights include a functional prototype ready for usability testing, research on related work, setting up the database required for the application, and of course acceptance into two conferences!

While in the fall quarter we mainly focused on the basics of iOS programming, in the winter quarter we were able to actually build on top of that to make a prototype of our application. We also tried to gain some insight into how the application would look to people with tritanope, protanope, and deuteranope types of color blindness. We now have applications ready that include a variety of multi-touch gestures (zoom and tap), colored and grayscale versions, and various images (math equations, geometry figures, chemistry equations, and biology figures) so that we can cover all the likely scenarios the target audience would encounter in real life. We also have the surveys (both pre and post) ready for the usability test.

We were also able to get some work done setting up the database and some basic forms that allow us to manage courses and lectures. When it came to deployment, however, we ran into a few issues that remain to be resolved.

Another area we worked on this quarter was going through a lot of related articles to learn what other advances have been made in our field. This gave us some good ideas about issues we need to consider when conducting usability testing, as well as other innovative ideas. We also got in touch with a similar research group, and we hope to exchange ideas with them further next quarter.

And last but not least, some of the team members got accepted into two prestigious conferences. These will indeed be a great opportunity to share what we have been doing over the last two quarters, as well as the progress we'll be making next quarter. Kudos and congratulations to the team members!

Looking ahead...in the spring quarter we will actually conduct the usability test, which should yield further insights about the improvements the application requires.

--
Abhishek

A Quarterly Reflection - via Alex

Hello all,

Winter quarter has ended and Spring quarter has begun (although I can't say that the weather correlates!); we thought that it would be appropriate to post individual reflections/summaries of our second quarter of work on Access Lecture. If you have been following us at all, these reflections will be very similar to the first quarter reflections posted a while back. If you haven't been following us, then I think that you'll catch on rather quickly!



A big picture overview:

So, what exactly did we get done this quarter? The answer to this is "a lot", so I'll sum-it-up in a more abstract format, for both my sanity and yours. I will divide our focus points and accomplishments into two categories: technical and scholarly.

1. Technical Focus Points
We made a lot of technical progress in regards to developing our app for the iPad, as well as developing for the iPad in-general. I recently had a look over the first resources that we bookmarked and shared to try and teach ourselves these crazy things called "Objective-C" and the "iPhone SDK". Here's what some of them were called:
  • "The Hello, World Application: An Overview"
  • "How to Handle Basic iPhone Interaction"
  • "Creating a Basic iPad Application"
  • "A Basic Fundamental Guide for iOS Application Design"
Well, you get the point. Looking back at these tutorials, it's almost laughable how little we knew at the time! I think that these past two quarters have proved that the "learn by doing" approach certainly works, as we are all now very adept in various areas of iPhone SDK development. I've actually started answering iPhone development questions on Stack Overflow for fun (and for the reputation!).

In short, we have more than solidified our knowledge of the fundamentals of iOS development, and then some; we don't need to yell for help whenever we want to add a new feature anymore! It's a great feeling to realize that you've gained so much knowledge without even realizing it.

So, specifically, what have we accomplished technically for Access Lecture?
  1. The near-finalization of our prototype applications for our usability tests.
  2. A polished UI, with emphasis on visual accessibility.
  3. The full implementation of the Note-Taking Box feature.
  4. A multi-touch logging system, that records the coordinates of each tap. This feature will be very valuable for usability testing, and will allow us to pinpoint potential usability flaws.
  5. Fixed various bugs with our prototype apps, most of which have been haunting us all year!
I feel pretty good about that list!

2. Scholarly Focus Points
I use the term "scholarly" to encapsulate all of our work/achievements in regards to our research (and the accompanying research process) itself. The key items to mention are:
  1. Our acceptance to present and attend two computing research conferences! The Richard Tapia Celebration of Diversity in Computing Conference in San Francisco, and the Posters on the Hill conference in Washington D.C. We will be presenting a poster on Access Lecture at both of these conferences.
  2. Our intense work and research into usability testing. The term "usability test" has been all over our blog this quarter, and for very good reason! We accomplished a great deal in this regard.
  3. The accumulation of a vast amount of scholarly research articles that relate to our own research at Access Lecture in various ways. We now have an excellent collection of notes and references on these articles, which will be used when we eventually write our own papers and reports.
In summary, we gained an incredible amount of experience and received a comparable amount of exposure to academic research and the nuts and bolts that go with it. Our acceptance to the conferences was exciting news to say the least, and to think that they're both coming up in a little over a month's time!

In Closing:

I have gained a great amount of exposure to both the scholarly and technical aspects of research. I am really amazed at how much I've learned about so many different things in just one quarter; I feel very accomplished and proud to be a part of this project. What's even better is that we still have one quarter of part-time work, and then an entire summer of 9-to-5 for the Access Lecture project! With so much more to go and so much already happening, I can barely even speculate on what is to come for me in the future!

Here's to another great quarter, and most of all, some more great fun!

Thank you for reading.

- Alex

Monday, February 21, 2011

Week 10 - Little Actions Can Make a Big Difference

Hello all,

We've just wrapped up Week 10 of our Winter quarter work on Access Lecture, and we're approaching some great stopping points. With one week left before our (much needed!) Spring break, we've been working hard to put the finishing touches on our prototypes to prepare for the imminent usability tests.

But first, some great news! A few months ago, the eligible members of the AL team (Lindsey and myself) applied for scholarships to attend the 2011 Richard Tapia Celebration of Diversity in Computing Conference. We found out over the weekend that we have both received scholarships! This conference provides presentation, learning, and networking opportunities for both students and professional computer scientists. The current goal is to have a poster ready to present by the conference deadline. Oh yeah, the conference is in San Francisco, California and runs from April 3rd to April 5th; jealous?

Now that I've gotten to brag across our vast internet via Blogger, it is time for a summary of what we accomplished this week. Our main focus was the polishing-up of our prototypes that are to be used for usability testing during Spring quarter. Broken up, we worked on...
  • Automatically logging important usability data with regards to the touch screen. This includes recording where the touch screen was pressed when trying to select certain icons, and specifics about the pinching/tapping gestures performed during particular tasks.
  • Setting up a test database on a remote server; this will allow us to simulate how AL would work in a 'real' setting more effectively.
  • Data mining more research articles that relate to the development of Access Lecture in a variety of ways.
With regards to the automatic logging of multi-touch and touch screen interactions, I can say that we have made a vast amount of progress with this; as a result, we can now objectively pinpoint a large subset of the potential usability flaws in the AL prototypes. We plan to make heavy use of this feature during our usability tests. The feature works by logging the x and y coordinates of touch screen interactions, as well as other specifics about what those interactions were trying to accomplish.

For example
: If we are testing a subset of users who are color blind, and we notice that they continually hit a certain edge of an icon or miss that icon consistently, then we can conjecture that the particular icon in question blends in with its surroundings with regards to color blind vision.

The database is still being worked on; the framework which we are using is taking some getting used to! More details will be posted next week, once more progress has been made.

That's all for this week! Time for all of us to cram for some finals and take a well-deserved week off.

Until next week...

- Alex

Monday, February 14, 2011

Week 9!

Hey everyone,

This week we managed to polish up our tap-to-zoom app, which we had been having some bug difficulties with.  It seems like a lot of the problems we've been having with our code have been due to a lack of transparency in Interface Builder. Sometimes Interface Builder changes or sets things in ways that aren't very apparent, which can make it difficult to understand what's actually going on in your app.

Luckily, we figured out the problem, and now both apps are zooming correctly.  To finish the apps up, we're adding an extra button that pops up when you're in the note-taking view, so that if you don't want to use the on-screen virtual keyboard to exit the view, you can hit the red X. Our CREU faculty member recommended this, and I think it will be more intuitive for people who are not familiar with the iPad or iPhone. Here's a new screenshot of someone using the note-taking view; you can see the additional button on the top right of the note box.  Alex actually made the button using Photoshop.


We were curious about how our app would look to color blind users, and we found a site where you could upload an image and it would show you how it looked to people with different types of color blindness. 


We tried submitting the image of our app, and this is what it produced. In order, these show how the app looks to people with tritanope, protanope, and deuteranope types of color blindness.




While we didn't really consider color blindness very much when writing up our requirements, it might be something to take into consideration at some point.  We might not want to rely too much on the use of color to indicate what an icon does.

Other things we looked into this week were video streaming to the iPad, because that might be one way to show the current lecture or old lectures.  Apple seems to provide a lot of documentation on how to handle video streaming, so if we want to go down that route we should have enough information.  Abhishek is also trying to host the database we're going to be using for Access Lecture, since we will probably be able to start working with it more next quarter.

Overall I think this week was pretty productive, and now we just need to get our new apps onto the iPad to test them out on a real device and then they'll be ready for our usability testing.

Lindsey

Friday, February 4, 2011

Week 8 update

Hello everyone!

Last week we posted about a fix we made regarding the notebox. However, one problem remained: even if we dismissed the keyboard (please see the image in the last post), the notebox would still be hanging there.  We had to manually press the notes button (a pen) to return to the lecture view. This week we were able to fix this problem as well, so now when the keyboard is hidden, the notebox is automatically hidden too.

Though we have been talking mostly about the application we are developing, we are also looking ahead to actually performing usability testing with the application's target users. For this purpose we put together three separate apps. Our main motivation is to find the most convenient method of zooming, since zooming will be one of the most important features of the application. There are many ways to invoke a zoom; some of the most popular methods are tapping and pinching. We are trying to find out which zooming method the target users find easier.

Another thing we are looking to find out is the appropriateness of the icons: how easily users can relate the icons to their intended actions. The placement of the icons is another area of focus, in terms of how easily users are able to reach them. We are also considering using two different sets of icons (colored and black/white) to find out which is preferred.

Thus, we have the following applications ready for testing (although some minor work remains on one of the apps):
  1. Pinch to zoom (colored icons)
  2. Pinch to zoom (black/white icons)
  3. Tap to zoom (some work remains)
So this week the first two apps are completely functional and ready to be rolled out for testing. We are trying our best to finish the last app, which we hope to complete by next week!

--
Abhishek

Friday, January 28, 2011

Week 7 updates

Hey everyone,

To begin, I'd like to show you guys a screenshot of something I mentioned during week 5.  One of the problems we had with our test applications was that when you opened a note box it would cover the lecture.  This wasn't how we envisioned the application working, so some code was added to push the lecture view up whenever you opened the note view, and likewise push it down when the note view was exited. We didn't include a screenshot of this in the week 5 entry, so I thought we should take one and put it in this week's entry to give you guys an idea of what we were talking about.


You can see how the note box does not cover up the lecture; they are separate views, so all of the lecture remains accessible while you're typing your notes.  Before, the bottom half of the lecture would get covered up and could not be viewed.

This week we refined our search for related research that people have done regarding accessibility and mobile devices.  Our CREU faculty member discussed several different conferences with us last week, and by searching for papers from those conferences we were able to find more relevant and more academic papers on the subject. We found a lot of papers that focused on visual impairments and accessibility by looking through the ACM Digital Library and narrowing our search to specific conferences.

One of the conferences that seemed to have the most relevant papers was ASSETS, which makes sense because the ASSETS conference focuses on using computing and information technology to help people that have different kinds of disabilities. 

We focused our search on conferences within the last couple of years, from 2008 on.  The reasoning behind this was that smartphones and other touchscreen mobile devices are relatively new, and we also wanted to make sure we were looking at current research so that we could see what the latest findings have been.  This doesn't mean that older research should be overlooked, but by looking at the newer papers we were able to find studies that were closely related to our project and could help us develop our own plans for our usability tests.

Interestingly, our CREU faculty member actually knew about a lot of the authors of the papers we were finding.  As a result, we might be able to get in touch with some of them so that we can learn more about the process they used for testing, as well as how they found participants.  When you're conducting research with a very specific group of people, it can be really difficult to find participants for your studies. 

A lot of the articles I've read that dealt with research on visual impairments have mentioned having participants who were sighted but wearing blindfolds.  I saw this in two different articles and was really surprised by it.  I feel like using sighted participants could give you much different results, even if they were blindfolded. I'm hoping that with our tests we will be able to find enough participants who have visual impairments so that our research will be more sound.

-Lindsey

Week 6 Update - Better Late than Never!

Hi all,

Alex here with a rather late week 6 rundown. We apologize for the delay; there was a mix-up in which week it was in an earlier post, causing us to think that week 6 had already been written! We're computer scientists and software engineers researching new and innovative technology to help visually impaired students, yet we can't keep track of the date; but enough about that!


The key talking points for week 6 can be split into two categories: conferences and surveys.

First, the conferences will be discussed!

We are preparing for the possibility of presenting a poster and/or a paper at several conferences. Our project adviser has identified several conferences (some coming up soon, and some not until the fall) for which we can likely get funding for at least one team member to present. Some of these conferences are specifically catered to the presentation of research with the purpose of utilizing technology to assist users with disabilities; others, such as the Richard Tapia Conference, are conducted with the intent of celebrating/acknowledging diversity in computing (including those with disabilities).

The possibility of presenting at these conferences is an extremely exciting prospect for the AL team, as this would be our first exposure to a professional research/conference environment. Therefore, we have been doing our homework in this regard! Our preparations so far have included:
  • Writing an abstract and introduction for our poster and/or paper.
  • Amassing a collection of research papers related to our own research in various aspects. These included papers on the accessibility of touch-screen and mobile devices, as well as research that dealt with VI- and/or blind-user-friendly versions of various technologies.
  • Data mining these research papers for references, ideas, inspiration, and leverage points.
  • Keeping a precise list of references that points to which data was mined from which paper, for the purpose of making this information usable in several of our own papers.


Second, the pre- and post-surveys for our upcoming usability tests will be covered. We accomplished a great deal with regards to these surveys, including many fundamental and functional changes. Our progress can be summarized as such:
  • Received feedback on the surveys from our project adviser (CREU faculty member).
  • Implemented a large number of fundamental formatting/wording changes, based on our project adviser's feedback.
  • Changed the way in which our surveys were to be given to the users and conducted.
  • Modified the survey documents to exist as forms that would automatically parse themselves into a spreadsheet (as opposed to the previous pencil and paper versions).
Mainly, we revised the surveys and decided to change the way in which they are to be given. Instead of having the users fill out the surveys themselves, we will ask them the questions and record their responses ourselves in an online form. Their data will then be sent into a spreadsheet for easy interpretation and management. This will help clarify any potential ambiguities in our questions, and will also help to make sure that the users answer the questions in the way that most benefits our study.


That's it for week 6! Stay tuned for an additional update on this week (week 7).

Until next time...
- Alex

Friday, January 14, 2011

Week 5- We Prepare Even More!

Hey all,

Last week Alex updated you about our test apps and our other preparations for the upcoming usability tests. This week went along pretty similar lines; we:
  • Continued fleshing out the script for our tests
  • Continued working on our pre- and post- test surveys
  • Fixed a couple of bugs with the test applications
  • Made modifications to our test applications so that the lecture screen is pushed up when you open the note view, keeping the note view from being obtrusive and covering up the lecture.
There is still more work and debugging we need to do to get the test applications ready. Right now, when the note box comes up it seems to block gesture recognition on the lecture view, which means the user would not be able to zoom in on the lecture while typing. During our usability tests we would like to see if participants can easily take notes on the lecture they're viewing, so this is something we will need to fix prior to the tests.
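The "push the lecture view up" change from the bullet list above can be sketched roughly like this (lectureView is a placeholder name, and this is a simplification of what our apps actually do):

```objc
// Rough sketch: when the keyboard appears, slide the lecture view up
// by the keyboard's height so the note view doesn't cover the lecture.
- (void)keyboardWillShow:(NSNotification *)notification {
    CGRect kbFrame = [[[notification userInfo]
        objectForKey:UIKeyboardFrameEndUserInfoKey] CGRectValue];
    [UIView animateWithDuration:0.3 animations:^{
        CGRect frame = self.lectureView.frame;
        frame.origin.y -= kbFrame.size.height;  // push the lecture up
        self.lectureView.frame = frame;
    }];
}
```

A matching keyboard-hide handler would slide the view back down when the note view is exited.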

This week we also came up with a title and an abstract for a conference that our CREU faculty member has recommended that we apply to. It's the 2011 Richard Tapia Celebration of Diversity in Computing Conference, and we've been working with our faculty member to make the abstract. Alex, Abhishek, and I came up with a draft for it and have changed it based on the feedback we've gotten. We also applied for the Posters on the Hill conference, but we won't hear back about that one until late February.

Next week we can get our abstract submitted, and then finish up some of the preparations for the usability tests. Once we're all set with our script for the testing process, we can start practicing it so that we'll be ready when we're doing it for real. If we aren't ready for the real thing, mistakes could force us to throw out the data, so it's important that we're all prepared and know what to do. One of the upperclassmen who's working with our faculty member on another accessibility project even volunteered to come in during our practice tests to give us feedback and help out. Our faculty member, who has conducted these kinds of tests for her other projects, is also going to make sure that we're doing everything correctly before we try the tests with any of our real subjects.

-Lindsey





Monday, January 10, 2011

Week 4 - Preparations Begin for Usability Testing

Hello all,

It's week 5, which means that it's that time of the quarter again! Miraculously though, somewhere between not getting any sleep, neglecting the sun, and drowning ourselves in textbooks and notes, we have managed to make some amazing progress on our imminent usability tests.

Usability testing has been our primary focus for the majority of this quarter, and it's great to see things really coming together. Here is a summary of the focus points of our recent work in regards to this preparation:
  • Creating several test apps for our human subject trials.
  • Becoming certified and knowledgeable in the area of conducting research involving human subjects.
  • Data mining information on how to conduct a productive and proper usability test.
  • Creating a survey to give the tester before and after each usability test.
  • Creating a rough script to be used during each test.
  • Defining and fleshing out the descriptions of team roles.
  • Researching into the specifics of conducting usability tests that involve participants who are visually impaired.
Our test apps deserve some more in-depth coverage; a bullet can only show so much, after all! We are very proud of the work that we've accomplished so far this quarter; I'm also extremely vainglorious about my modesty and vocabulary.

The center of attention has been our test apps. We currently have two; they differ only in the way the user can zoom in and out of the content on the screen. We are currently experimenting with the standard iPad "pinch" gesture, as well as with a less conventional 'tap-to-zoom' method. A preliminary draft of our UI is placed on each app, with the option to switch between the inverted and non-inverted icon sets. In addition, the "notes" icon pops up a textbox in which the user can take notes in real time, using either the on-screen keyboard or an external USB keyboard.
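For readers wondering how the two zoom styles differ in code, here's a minimal sketch using the iOS gesture recognizer API (the view and handler names are placeholders, not our exact code):

```objc
// Sketch: attach the two zoom styles as gesture recognizers.
- (void)viewDidLoad {
    [super viewDidLoad];

    // Standard pinch gesture for continuous zooming.
    UIPinchGestureRecognizer *pinch = [[UIPinchGestureRecognizer alloc]
        initWithTarget:self action:@selector(handlePinch:)];
    [self.lectureView addGestureRecognizer:pinch];
    [pinch release];

    // Less conventional tap-to-zoom, e.g. double-tap to zoom in a step.
    UITapGestureRecognizer *doubleTap = [[UITapGestureRecognizer alloc]
        initWithTarget:self action:@selector(handleDoubleTap:)];
    doubleTap.numberOfTapsRequired = 2;
    [self.lectureView addGestureRecognizer:doubleTap];
    [doubleTap release];
}

- (void)handlePinch:(UIPinchGestureRecognizer *)gesture {
    // Scale the content by the pinch factor, then reset the scale so
    // each callback applies an incremental change.
    self.lectureView.transform = CGAffineTransformScale(
        self.lectureView.transform, gesture.scale, gesture.scale);
    gesture.scale = 1.0;
}
```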

The nature of our test apps will allow us to foster an environment during the tests that will provide us with a substantial amount of information on the key visual and usability components of our app; it will also allow us to better feel-out how users are likely to respond to certain key features. This information will be very valuable to have so early on in the life cycle of the project.

An image of the current version of our test app can be seen to the left, with the inverted icon mode in effect. The large image in the center is a custom-made image that is intended to mimic what the user may see if notes were being captured by our device during a real lecture. To make the image as authentic as possible, it combines digital images with handwritten notes; it is also (purposefully) too large for the iPad screen, and must be 'scrolled' around the iPad screen using one finger. This will allow us to test the ease of use in regards to navigation in a real-world setting.

The toolbar on the top provides quick access to the home screen and the settings page, while the bottom toolbar contains all of the tools immediately available at the board view. These tools are (from left to right), "Save Lecture", "Take Notes", "Selection Tool", "Inversion Toggle", "Zoom-Out", and "Zoom-In". As mentioned above, the "Take Notes" icon is functional.

I hope that this 'tour' of our usability test apps has been fun for you, or at least more exciting than what you were doing before!

Once the test survey and script are more fleshed out, and once more research has been conducted into VI usability testing, we will post a nice content-filled update.

Until next week...

- Alex