# Primary Research

# Phase: 🔎 Problem seeking
Focus: Landscape

IN BRIEF

Time commitment: Varies with the scope of research needed, but interviewing makes this a time-intensive tool
Difficulty: Moderate
Materials needed: Goals for outcome, interview guide, users to interview, location (physical or virtual), interviewer, notetaker, notetaking tools
Who should participate: User experience designers, developers, project/product owners, users, community specialists
Best for: In-depth insights on the overall landscape of solutions (or lack thereof) from the user's perspective

# About this tool

When starting a project, much can be gleaned from competitive analysis or a look at the overall product or technical landscape — but from a user-centered perspective, the primary research in any design activity must be user research. While quantitative methods like numerical surveying and passive [analytics](analytics.md) are certainly helpful in initial user research for investigating questions along the lines of "how much" and "how many", qualitative methods — even if they're as unmoderated as an open-ended survey question — help reveal the complexity of a particular situation and highlight areas worth digging into more deeply at an early stage of a project.

Depending on the nature of your project, your budget, and your timeframe, one or more of the following qualitative user research methods may be helpful (check the "external resources" section for more details on individual methods):

  • Exploratory user interviews: Open-ended, conversational interviews (often an hour or more) with users from a variety of backgrounds and vantage points. While led by an initial question set, open to going "off-course" to surface additional insights.
  • Expert interview: Again, an open-ended, conversational interview, but with relevant subject-matter experts. It may be more difficult to find experts than ordinary users, but they may have important input on the sorts of solutions they may (or may not!) jump to in response to your questions.
  • Extreme interview: Again, an open-ended, conversational interview, but with user types you've identified as edge cases. Speaking with these individuals early on will help you gain a better idea of how much allowance needs to be made for edge cases right off the bat, as well as suggest approaches that serve all while also directly benefiting edge-case users.
  • Focus group: A similar open-ended interview, but in a small-group format. Be aware: There's always the risk that one person will out-talk the entire group, or that participants will be afraid to share their true feelings in a group context.
  • Expert panel: Essentially a focus group made up of experts — with the same risks of a regular focus group, but with the possibility of rich discussion that only a discussion between experts can provide.
  • Analogous experience interview: An exploratory interview, but with someone who experiences similar challenges/opportunities in a different personal or professional context. For example, an event planner may have a lot to learn from a hospital manager, because both individuals and their clients rely heavily on sequential, high-priority tasks with little opportunity for redundancy or contingency planning.
  • Empathy interview: An exploratory interview, but where the focus is more on the feelings the user experiences during the task at hand than actual completion of the tasks. Though questions may elicit direct, even quantitative answers, the context of the answers and the emotions with which they are delivered are key.
  • Surveys: If designed well, these can be an excellent means of combining qualitative and quantitative research. However, keep in mind that without a moderator or interviewer, you'll miss the opportunity to probe further when an answer seems particularly compelling. (Surveys don't have to be formal. Lurking and/or asking questions in forums, for example, is just another sort of survey.)

The lines between user research and user testing — or, at minimum, the types of methods involved — can blur substantially, so you may also want to look through the guides to moderated testing and unmoderated testing in this toolbox. For example, an activity like card sorting can work equally well when doing an initial evaluation of a product/project landscape as it can later in the development process.