To evaluate a website with target users in order to create a rainbow
spreadsheet and offer design recommendations.
Microsoft Word, Qualtrics, remote interview moderation software (Microsoft Teams),
and recording software (QuickTime)
For my Evaluating Interactive Systems class, I was given a brief from which I
set specific goals. I then established that the following steps would be the
best way to gather the information needed to evaluate the website:
The Script
Writing a script of tasks, loosely framed by a scenario, both ensured that users
reached the areas of the website the brief needed to address and made the
evaluation feel more realistic by giving users a purpose.
Questionnaires
To find out whether users struggled while using the site, the Single Ease Question
(SEQ) was asked after certain tasks. Qualtrics was used to write and send a
SUPR-Q questionnaire to each participant at the end of their interview, since its
elements relating to ease of use, trust, and appearance were relevant to
the goals.
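To show how per-task SEQ responses can be summarized, here is a minimal Python
sketch; the task names and 7-point scores below are hypothetical examples, not
my actual data. It simply flags tasks with a low average ease rating for follow-up:

    from statistics import mean

    # Hypothetical SEQ responses (1 = very difficult, 7 = very easy) per task.
    seq_responses = {
        "Find a product page": [6, 7, 5, 6, 7],
        "Locate contact details": [3, 4, 2, 3, 4],
    }

    for task, scores in seq_responses.items():
        avg = mean(scores)
        status = "review" if avg < 5 else "ok"  # rough threshold for follow-up
        print(f"{task}: mean SEQ = {avg:.1f} ({status})")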
Testing Sessions
In total, I held five remote, moderated sessions (in addition to a pilot session)
and used a classic Think Aloud protocol. Due to limited resources, the sessions
had to be remote; however, moderation and the Think Aloud protocol were chosen
because it was important to understand what users were feeling, as well as "the
why" behind their decisions.
Coding the Data
Transcripts from each participant were summarized and coded. Three codes were
used to note whether a comment was positive, a wish (something the user would
have liked to have or to have been able to do), or negative (e.g. something
that triggered annoyance or frustration).
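As a rough illustration of how coded comments can be recorded and tallied, here
is a small Python sketch; the comments and participant labels are invented for
the example rather than taken from my transcripts:

    from collections import Counter

    CODES = {"positive", "wish", "negative"}

    # Hypothetical coded comments: (participant, comment, code).
    coded_comments = [
        ("P1", "The checkout was quick.", "positive"),
        ("P2", "I'd like a way to save items for later.", "wish"),
        ("P3", "The menu kept collapsing, which was frustrating.", "negative"),
    ]

    # Sanity-check the codes and count how often each one appears.
    assert all(code in CODES for _, _, code in coded_comments)
    print(Counter(code for _, _, code in coded_comments))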
Qualitative and Quantitative Data
The qualitative data, alongside the quantitative data from the questionnaires,
was used to answer each requirement from the brief in the form of a short summary.
Because of the small number of participants, the quantitative data would not have
been strong enough on its own, so it was used to reinforce the conclusions drawn
from the qualitative findings.
Deliverables
This data was then translated into design recommendations. The main portion
of the report required four design recommendations ranked by severity
(the ranking system was based on Dumas and Redish’s work, as they considered
“a global versus local dimension to the problem”). Before writing these
recommendations, however, I created a rainbow spreadsheet, which helped
me discover similarities in the issues participants ran into. I then created
another spreadsheet, not only to further group and organize each issue by
goal, but also to clarify what effect each issue would have on the goals, to
rank the issues by severity, and to offer additional potential redesign recommendations.
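For readers unfamiliar with rainbow spreadsheets, the core idea is a matrix of
issues against participants, where each mark shows that a participant ran into
that issue, and counting marks per row highlights the most widespread problems.
The Python sketch below uses made-up issues and participants to show that idea
only; in the actual report, severity also weighed the global-versus-local scope
of each problem, following Dumas and Redish:

    # Hypothetical issue-by-participant data for a rainbow-spreadsheet-style matrix.
    issues = {
        "Navigation labels unclear": {"P1", "P2", "P4"},
        "Contact details hard to find": {"P2", "P3", "P4", "P5"},
        "Images slow to load": {"P3"},
    }
    participants = ["P1", "P2", "P3", "P4", "P5"]

    # Print one row per issue, most widespread first, with a participant count.
    for issue, affected in sorted(issues.items(), key=lambda kv: -len(kv[1])):
        row = ["x" if p in affected else "." for p in participants]
        print(f"{issue:30s} | {' '.join(row)} | {len(affected)}/{len(participants)}")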
If you would like to see the full report, please send me an email at
raveena.s.jain@gmail.com
and let me know whether you are a potential employer or student looking
to read it.