Monday 24 March 2008

Feedback From Ziggy beta 1.0



Overview of Data Collection Method
I have changed my chosen method of data collection significantly since the beginning of my research. Initially I was going to extrapolate data from trends and statistics on resource usage by visitors to my site, using analytics, Alexa and a variety of other site trackers to build a picture of how the video tutorial maps impacted browsing habits and affected the ease of use of the site. Once I began testing the user interface, however, it became clear that there would be subtle details and important user feedback that simply would not be apparent from analyzing site statistics. So I have opted for a more user-centric analysis that uses direct observation and interviews to assess the different perceptions users form of the UI while using it for the first time.

During this first phase of beta testing I sat with users and asked them to test drive a "new browser" for my website. In each case the users had already been exposed to my site and had used the traditional index/course hyperlink lists for navigation. I clearly explained to each user that this was a piece of software in development, a "beta", and that the browser they were about to use was "not finished". This was done to encourage the user to be critical of the user interface and not feel as though they would be offending the creator of the interface by recommending changes. I also did not want users to feel as though it was complete and therefore unlikely to have modifications made to it based on their input. Being completely honest with the testers gave me a privileged view as the developer, letting me work directly with beta users and document both their official and unofficial responses.

Each beta tester/user was asked a series of informal questions (each question had the same content and wording, but was delivered as an informal line of questioning). This was done to put the user in a more relaxed mood, with the intent of coaxing greater honesty and detail out of their responses. The results of using an informal interview style were extremely positive. Users exhibited an enthusiastic willingness to comment on a multitude of aspects of the user interface design, and even went as far as spontaneously providing creative and insightful ideas for further modifications or "improvements".

Although the generated video tutorial maps are in many ways an integral part of the user interface itself, for the sake of clarity I have attempted to classify the user feedback into two areas of analysis: the user interface interactions (the way the tester manipulates the mouse, the keyboard shortcuts and the interface tools when browsing the map), and the way the tester "reads" the information content in the map.

User Interface Interactions Analysis
There were three fundamental user interaction methods in the first set of tests on the beta 1.0 software: dragging and dropping the map within the window, zooming in and out of the map with the scroll wheel, and double-clicking to visit external content.
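These three controls amount to a very small navigation model: a pan offset, a zoom level, and a node-activation action. A minimal sketch of that model in TypeScript (the class and method names here are my own illustration, not the actual Ziggy code, which wires these to mouse events in the browser):

```typescript
// Hypothetical model of the three beta 1.0 interactions:
// drag-to-pan, scroll-to-zoom, and double-click-to-open.
class MapViewport {
  offsetX = 0; // current pan offset in map units
  offsetY = 0;
  zoom = 1.0;  // 1.0 = 100%

  // Dragging translates the viewport by the mouse delta, scaled by
  // the zoom level so the map tracks the cursor at any magnification.
  drag(dx: number, dy: number): void {
    this.offsetX += dx / this.zoom;
    this.offsetY += dy / this.zoom;
  }

  // Each scroll-wheel notch multiplies the zoom level, clamped to a
  // sensible range so users cannot zoom out to nothing or in forever.
  scroll(notches: number): void {
    const factor = Math.pow(1.2, notches);
    this.zoom = Math.min(8, Math.max(0.125, this.zoom * factor));
  }

  // Double-clicking a node visits its external resource; here we
  // simply return the URL the browser would navigate to.
  doubleClick(node: { title: string; url: string }): string {
    return node.url;
  }
}

const view = new MapViewport();
view.drag(10, 0);   // pan right at 100% zoom
view.scroll(1);     // zoom in one notch (x1.2)
view.drag(12, 0);   // same on-screen drag now pans less map distance
```

In the real interface these methods would hang off mousedown/mousemove, mousewheel and dblclick handlers; the point is only that the entire navigation surface reduces to three small actions, which is part of why experienced users discovered it unprompted.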

What was the first thing most users did when they entered the map?
What responses were users expecting from specific user interactions?
Were they surprised by the UI responses?
How did users navigate around the map?
What was the most common user interaction method?
Did the user discover the UI interaction automatically or did they require instructions?

What aspects of the UI design did the users comment on as positive?
What aspects of the UI design did users comment on as negative?
How would users change/customize the user interface?


The overall response to the UI was positive. However, the time it took for users to feel comfortable with the interface was reflected in their self-assessed ability/computer literacy level. In cases where users had limited exposure to computers and lower confidence levels, there was a high degree of initial apprehension about using an unfamiliar interface, and these users generally took a greater amount of time to "master" the interface controls.

The more computer/tech savvy users started exploring the UI immediately using universal controls, without prompting from the instructions page, in notable contrast to less experienced users, who required clear instructions in order to use the UI tools effectively. This identifies a common chasm between experienced and inexperienced computer users as far as UI design requirements go. Creating a UI that is universally accessible and requires no manual is one of the "holy grails" of UI design. This task generally becomes more difficult as the capability of the software increases: the more tools users have, the more difficult a UI can be for first-time users, and the longer it takes to become a "proficient" user.
A good example of this would be Adobe Photoshop. First-time users are often overwhelmed by the choice of tools and the number of options available to them, with many less confident users limiting themselves to a handful of the tools the software provides. Even when they master this limited set of tools they are often reluctant to try new ones for fear of confusing the knowledge they already possess, leaving them on the plateau of the "occasional tinkerer".

In all cases users presented a number of creative uses for the UI and for the concept of mapping online resources and content. As expected, users had their own personal preferences towards UI design, but there were a number of common points made by the first set of beta testers.

Analyzing The Mapping Language
The range, depth and eagerness of user interactions was just one aspect of the UI design analysis. The other important factor investigated was the users' innate understanding of the "language and structure" of the map itself. Some of the users were self-confessed visual learners, while others adamantly declared that they prefer traditional page layouts with text-based content. Ultimately these differences in learning style did affect the "flavour" of their responses, but in either case users were able to identify key aspects of the interface design that they recognized as assisting with their "learning", or acquisition of new concepts.

Was the map intuitive to interpret?
Was the node text clear to read?
Could users identify the theme of the map?
How did the user interpret the colors in the map?
How quickly did users identify patterns in the map structure?
How did users perform when locating unfamiliar content?
Once users identified patterns in the map structure did this aid the process of locating unfamiliar content?

What aspects of the Map design did the users comment on as positive?
What aspects of the map design did users comment on as negative?
How would users change/customize the map/interface?


With this type of new and unfamiliar user interface there are countless modifications that could be made, and many factors that must be considered, to create the "best" user interface of this type (visual concept mapping of digital resources). The various nuances of UI design and the science behind the language and structure of the map itself could be the subject of a number of detailed research papers. Even the optimization of data packing efficiency versus readability and usability could be the focus of a PhD on its own, so I don't expect to definitively answer the majority of the profound questions raised by this research project. However, the initial feedback I have already received has yielded some valuable insights into which factors need to be carefully considered and controlled, what must be flexible, which elements can potentially be controlled/customized by users, and what can be considered entirely optional when designing interfaces of this type.
