Three steps to harvesting valuable feedback from your users through unmoderated, remote user testing. Do you have some knowledge of user testing, and are you eager to learn how to set up unmoderated, remote user tests with Preely?
Then read on, because this guide is just what you need to get started.
The purpose of this guide is to onboard anyone who is interested in remote, unmoderated user testing with Preely. When you have finished reading, you will know the simplest way to create, share and evaluate user tests in Preely, and you will be ready to start testing and collecting valuable feedback from your users.
1. Create your prototype
Before you can build a test you need to have a prototype to build it from.
Creating a prototype does not need to be complicated at all. In fact, prototypes can take many forms. Think of it as a layout mock-up and start building from there.
A. Make and organize your prototype and test
You can use prototypes created in Adobe XD, Figma, InVision Cloud or Sketch directly in Preely. You can also upload screenshots, mockups, sketches, wireframes, artboards or finished designs. If you do not have a preferred prototyping tool, consider sketching your project on paper; this is the easiest way to get an overview of your project. You can upload pictures of the prototype to Preely and make it interactive and navigable using Links, Preely forms and/or Timed transitions.
B. Break your prototype into smaller pieces
C. Take out anything that is not necessary
We tend to build sprawling prototypes that are more complicated than they need to be. Ask yourself:
Is anything missing, could something be left out or should it be reorganized?
Do I really need all these screens to get the feedback I need?
Now you know how to make a decent prototype.
Sweet. Let’s move on to building the test!
2. Create your test
In Preely, a test always consists of at least two things: a task and an endpoint.
How they’re used, and everything in between, is entirely up to you.
Before you build the actual test, we recommend that you define a research question, including success criteria.
A. Research question
Research question: Can our users book a table within 1 min?
– Success criterion: 75% of the test participants should be able to book a table within 1 min.
Research question: How easy did our users find our new booking flow on a scale from 1-5?
– Success criterion: a rating of 4.3 or above.
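To make the success criteria concrete, here is a minimal sketch of how you might check them once results come in. The sample times and ratings below are made-up illustration data, not real Preely output:

```python
# Hypothetical raw results from ten participants:
# completion times in seconds (None = participant gave up),
# and post-task ease ratings on a 1-5 scale.
times = [42, 55, None, 48, 71, 39, 95, None, 50, 61]
ratings = [5, 4, 3, 5, 4, 4, 5, 2, 5, 4]

# Criterion 1: 75% of participants book a table within 1 min (60 s).
within_limit = [t for t in times if t is not None and t <= 60]
success_rate = len(within_limit) / len(times)
print(f"Success rate: {success_rate:.0%}")

# Criterion 2: an average ease rating of 4.3 or above.
avg_rating = sum(ratings) / len(ratings)
print(f"Average rating: {avg_rating:.1f}")

print("Criterion 1 met:", success_rate >= 0.75)
print("Criterion 2 met:", avg_rating >= 4.3)
```

With this sample data neither criterion is met, which would tell you the booking flow needs another iteration before launch.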
B. Create a task
A task describes the goal you want the testers to achieve. Be careful not to set up a test that guides the testers to solve it in a specific way; we tend to give our users too much information. Put yourself in their shoes and give them a task they can relate to.
– Tell them what to do but not how to do it
– Ask for evaluations – not solutions
– Maximum 5 tasks per test
Performance metrics to focus on:
Start by focusing on one or two performance metrics. Good starting points are success rate and time on task.
C. Add questions
You can add questions to collect feedback from testers when they finish the test. There are many answer types to choose from. We recommend starting with a combination of:
– A 5-point scale (also called a Likert scale) + a follow-up question
– An optional open-ended question to gather a bit of qualitative data
Follow-up: Why did you give this rating?
Open-ended: During the task, was there anything you found difficult or wondered about? (Please elaborate.)
3. Share your test
Depending on the type of test, a good guideline is to aim for five participants as a starting point for getting insight into your product. When you want to measure usability and UX metrics, and perhaps calculate basic statistics, consider having 15 participants or more. Read more about the number of participants here.
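The five-participant guideline rests on the classic problem-discovery model, which estimates the chance that at least one participant in a test of n people encounters a given usability problem. A quick sketch, using the often-cited average per-participant discovery rate of p = 0.31 from Nielsen and Landauer's research (an assumption about your product, not a guarantee):

```python
# Classic problem-discovery model: P = 1 - (1 - p)^n,
# where p is the chance a single participant hits a given problem
# and n is the number of participants.
def discovery_probability(p: float, n: int) -> float:
    return 1 - (1 - p) ** n

# With the commonly assumed average of p = 0.31:
for n in (5, 15):
    print(f"{n} participants -> {discovery_probability(0.31, n):.0%} "
          "chance of seeing a given problem at least once")
```

Five participants already catch most problems under this assumption, which is why five is a sensible starting point for qualitative insight, while larger panels mainly buy you more reliable metrics rather than many more discovered problems.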
Share via link
Share the unique test link on your website, newsletter, social media platforms, or via emails.
Customized recruitment pages
Alternatively, share your test directly through your own test panel. Build recruitment pages that are customized for your target audience; this positively affects the conversion rate of test requests. Read more about building recruitment pages and creating your own test panel here.
That’s it. You will have actionable results in no time.
Further reading:
– Formative vs. Summative Evaluation
– New approach: We call it ‘Test First’
– Post-task questions and the Single Ease Question (SEQ)
– Remote testing – moderated vs. unmoderated