Web Redesign Project

Using WebSort

Every tool (and every study) has its good and bad points. Overall, WebSort did what we wanted. The program had some limitations, and our inexperience with online card sorts made evaluating the results cumbersome. However, the information we gleaned is valuable in furthering our understanding of what our users want to find – and where they expect to find it – on the Williams website.

WebSort Pros

  1. The price was right, especially with a 50% educational discount. We paid $150 for three studies of up to 100 participants each. We have two left.
  2. Email support was very good. Questions were answered, adjustments were made, and procedures were patiently explained, usually within 24 hours.
  3. The back-end interface for setting up the card sort was intuitive and easy to use.
  4. Completing the card sort was simple and straightforward for our participants (see #1 Cons below).
  5. For the most part, stats are broken down into usable presentations (see #3 Cons below).

WebSort Cons

  1. Instructions were too far removed from the study. It would be better if the instructions stayed onscreen while participants completed the study, or if relevant instructions appeared in different parts of the sort screen as they worked.
  2. No built-in graphics or charts. All data has to be exported before it can be manipulated.
  3. Combined categories could not be saved within WebSort. Categories had to be combined for each group and then exported. Doing this for five groups and 108 categories took an inordinate amount of time. (See #1 Notes to Self below, and the rough sketch after this list.)
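
One way to cut down on that repeated merge-and-export work would be to do the combining in a small script rather than in WebSort itself. The sketch below is purely illustrative: the exports folder, file names, and the participant/card/category columns are assumptions, not WebSort's actual export format, and the category mapping entries are made-up examples.

    # Hypothetical sketch: combining card-sort exports outside WebSort.
    # Assumes each group's results export to a CSV with 'participant',
    # 'card', and 'category' columns (an assumption, not WebSort's format).
    import csv
    from collections import Counter
    from pathlib import Path

    # Hand-built mapping from participant-created category names to a
    # smaller standardized set (example entries only).
    CATEGORY_MAP = {
        "Sports": "Athletics",
        "Athletics": "Athletics",
        "Visitor Info": "Visitors",
        "Especially for Visitors": "Visitors",
    }

    def tally_group(csv_path):
        """Count (card, standardized category) placements in one group's export."""
        counts = Counter()
        with open(csv_path, newline="") as f:
            for row in csv.DictReader(f):
                category = CATEGORY_MAP.get(row["category"], row["category"])
                counts[(row["card"], category)] += 1
        return counts

    # Merge all five group exports into one tally instead of redoing the
    # category combination by hand for each group.
    combined = Counter()
    for export in Path("exports").glob("group_*.csv"):
        combined.update(tally_group(export))

    for (card, category), n in combined.most_common(10):
        print(f"{card!r} -> {category!r}: {n} placements")

Something along these lines would still require building the category mapping by hand, but it would only have to be done once rather than once per group.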

Notes to Self

  1. Despite the advice of both WebSort and an on-campus number cruncher, we opted to allow users to create their own categories. This strategy had both good and bad results:
    • bad: we ended up with too many categories to combine meaningfully (see Categories Gone Wild)
    • good: we realized how very differently people look at and classify the same pieces of information
  2. Were the instructions too long? Judging by the comments (see Comments and Comment Themes), participants did not get the message about putting items in the FIRST category in which they would look for them, even if they “should” appear in more than one category (see instructions in The Study).
  3. Participants should be tagged by the groups that will be compared, not by study group. We spent time deciphering participants’ usernames within each study group to determine whether they were faculty, staff, students, alumni, or an outside consultant (see The Participants).
  4. Items to sort were taken from the current Williams homepage. They could have been better vetted for the purposes of this study. For instance, Sports Information and Athletics should have been one item, and perhaps Visitors Guide and Especially for Visitors as well. Since Visitors, Faculty, Staff, Parents, Donors, and Students were suggested categories, the “Especially for __” items should probably have been eliminated. Next time: put more thought into both items and categories.

Next: In Conclusion