Friday, October 8, 2010

Week 5 - "I get by with a little help from my caffeine"

+5 Bonus points if you caught The Beatles reference.

Alex here with another update on AL.

Week 5 here at RIT means midterms and projects, but that didn't stop us from getting our hands dirty with Access Lecture and some more Objective-C! Unfortunately, it did stop me from sleeping and remaining 100% sane, but I probably wouldn't have done those things anyway.

Week 5 concluded with some promising results and a few notable events. Our focus this week was split between the iPhone SDK and focus group interviews. We still had some interviews left to conduct, and they proved to be just as interesting and helpful as their predecessors. We are mostly done with the "college professor" focus group now, and we plan to begin reaching out to more high school teachers next week. In addition, the college student survey has been finalized and deployed.

What we lacked in number of interviews this week, we made up for in iPad programming progress. Let's just say that if difficult iPhone SDK concepts were a currency, I'd be programming on a golden iPad for the remainder of this project. To help us grasp some of the iPad-specific concepts (such as drag & drop, zooming, and scrolling), we developed a long-term plan to create an app that would incorporate many iOS features.

The app is very simple: the user can enter any string in a text field, and an animal (represented by some cute 1990s clip art) will repeat it back to them! The phrase can either be repeated as-typed or in that animal's "language". If the user swipes to the side, the screen will scroll to an animal selector panel, which lets the user drag an image of a different animal onto the main screen to replace the current one. In addition, the user will be able to double-tap on an animal icon to zoom in on it. This app, as silly as it seems, will give us substantial experience with...
  • Using UIGestureRecognizer objects and interfaces to detect dragging, swiping, pinching, and tapping.
  • Working with multiple UIViews to create an interactive, scrolling interface.
  • Handling accessibility features, such as zooming.
  • Making a complex, fully featured app.
  • Working with the delegate design pattern, which (along with MVC) is a major part of every iOS app.
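To give a flavor of the first bullet, here's a minimal Objective-C sketch of the double-tap-to-zoom idea. This is illustrative only, not our actual code: the property name animalImageView is a placeholder, and the zoom is a simple scale transform rather than whatever we end up shipping.

```objc
// In a view controller: attach a double-tap recognizer to the animal image
// view, and zoom by animating a scale transform when it fires.
- (void)viewDidLoad {
    [super viewDidLoad];

    UITapGestureRecognizer *doubleTap =
        [[[UITapGestureRecognizer alloc] initWithTarget:self
                                                 action:@selector(handleDoubleTap:)] autorelease];
    doubleTap.numberOfTapsRequired = 2;
    [self.animalImageView addGestureRecognizer:doubleTap];
}

- (void)handleDoubleTap:(UITapGestureRecognizer *)recognizer {
    // recognizer.view is whichever view the gesture was attached to.
    [UIView animateWithDuration:0.3 animations:^{
        recognizer.view.transform = CGAffineTransformMakeScale(2.0, 2.0);
    }];
}
```

The nice part is that the same target/action pattern covers swipes, pinches, and pans, so once one recognizer works, the rest follow the same recipe.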
To aid in the creation of our app, we are enlisting the help of Chris (another RIT student), who has a vast amount of experience developing for iOS. We plan to meet with him on Tuesday to ask questions regarding important abstract principles that online tutorials haven't covered well, and then possibly once a week after that to keep things rolling. We also plan to attend two tech talks that Chris is giving; one will be in lecture format and the other a hands-on workshop. This should help us move along very quickly.

After making the plan for the app, we got to work on it. So far, we have drag & drop functionality working with any UI object, and we have begun to look into accessibility features. Finding accessibility documentation that fits our purposes has been challenging, so we may have to consult the official Apple forums for more information (all of the documentation we found was for the VoiceOver technology, which is not applicable to our app).
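For the curious, the drag & drop boils down to a pan gesture recognizer. Here's a rough Objective-C sketch of the general approach (method names like makeDraggable: are illustrative placeholders, not necessarily what our code looks like):

```objc
// Make any view draggable by attaching a pan recognizer that moves the
// view by the gesture's translation.
- (void)makeDraggable:(UIView *)view {
    UIPanGestureRecognizer *pan =
        [[[UIPanGestureRecognizer alloc] initWithTarget:self
                                                 action:@selector(handlePan:)] autorelease];
    [view addGestureRecognizer:pan];
    view.userInteractionEnabled = YES;
}

- (void)handlePan:(UIPanGestureRecognizer *)pan {
    CGPoint translation = [pan translationInView:pan.view.superview];
    pan.view.center = CGPointMake(pan.view.center.x + translation.x,
                                  pan.view.center.y + translation.y);
    // Reset so the next callback reports an incremental delta.
    [pan setTranslation:CGPointZero inView:pan.view.superview];
}
```

Resetting the translation each time keeps the math simple: every callback just nudges the view by how far the finger moved since the last one.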

All in all, we got a lot done this week. Next week should have some very interesting challenges and corresponding solutions for us!

Until next week...

- Alex
