Embracing the non-linear nature of a modern product design process

  1. Define
  2. Discover
  3. Ideate
  4. Prove
  5. Implement
  6. Evolve

Search for 'product design process' in your favourite image search engine and you'll be met with an interesting split of opinions.

Some teams favour a very traditional, linear process. Each step is defined, siloed from the others, and its deliverables are neatly tied off and handed over before the next step begins.

This waterfall-style methodology certainly gets the job done, but it doesn't lend itself well to the complexities of modern apps or the rapidly changing landscape of web technologies.

I'm a passionate proponent of the alternative — a more agile approach. One in which reflection and revision are baked into the process from the start. Progress through the steps is initially much faster, but with the expectation that we'll regularly step backwards in our process to re-work our solutions as our understanding of the problem evolves.

There's still a strong linearity to this method, but it acknowledges how vulnerable each stage is to the insights we glean along the way and the recalibration/redirection that results.

So, what's involved through each stage?


Define

This is the stage where a lot of our "Why?", "What?", and "How?" questions should be asked. If we're opening Sketch at this point, we're probably doing it wrong.

Questions to be answered

  • What is the problem or opportunity that this work is aiming to solve?
  • Why does this problem or opportunity exist?
  • Why do we care about spending time on it? Does the long-term value clearly outweigh the effort?
  • Who is the user? Do we understand their typical mental model and how they perceive this functionality?
  • If we don't already have a strong grasp of a user's mental model (e.g. from support tickets or previous research), how can we quickly and efficiently learn this?
  • What technical constraints should we be aware of?
  • How does this fit into the wider company strategy?
  • How are we going to define 'success' in this project? How will this work improve the lives of our users?

Discover

This is our chance to fill in the gaps in our understanding, whether through user research or competitor analysis. And no, it's still not the time to be reaching for a design tool!

Questions to be answered

  • Are we now confident in our definition of the problem?
  • Do we have criteria with which we can measure success?
  • Do we understand a typical user's mental model at this point in their experience journey? (If research is needed, make sure our testing scripts leave specifics until last. We primarily want users to give us insights about their wider context rather than getting bogged down in minutiae.)
  • Is there anything (good or bad) our competitors are doing that we can learn from?
  • Have we made time to discuss our findings with the engineers on our implementation team?

Ideate

By this stage we've hopefully done our due diligence and can speak confidently on behalf of our users. We should have a clear sense of the challenges they face and how our end product will improve their experience.

Taking a tool-agnostic approach, work up whatever assets are needed to communicate your thinking quickly and efficiently, be that sketches on paper, low-fidelity wireframes, Sketch files, static HTML prototypes, or a Git branch of the actual product.

The key here is speed. Your first pass at designs is not meant to produce a finished product. You just need a representation of your thinking that's good enough to begin proving it with users and stakeholders.

Questions to be answered

  • Have I communicated my thinking accurately and effectively, but without committing so much time to my designs that I won't be willing to throw them away?
  • My designs probably involve various assumptions. Am I prepared to have them challenged, either by colleagues or in user testing?
  • Have I considered how things work outside of the happy path? (Don't neglect the boring details like empty states, errors, pagination, and edge cases like excessively long content; see the sketch just below this list.)
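
A quick way to keep those unhappy paths honest is to enumerate them explicitly. Here's a minimal sketch in TypeScript using a discriminated union for a list view; every name in it is illustrative rather than taken from any real project:

    // Enumerating every state a list view can be in, so the empty, error,
    // and pagination cases can't be silently forgotten.
    type ListViewState<T> =
      | { kind: "loading" }                                       // first fetch in flight
      | { kind: "empty" }                                         // fetch succeeded, nothing to show
      | { kind: "error"; message: string }                        // fetch failed; this copy needs designing too
      | { kind: "loaded"; items: T[]; nextPage: string | null };  // happy path, plus pagination

    function describe<T>(state: ListViewState<T>): string {
      // An exhaustive switch: the compiler complains if a new state is
      // added but not handled, mirroring the design question above.
      switch (state.kind) {
        case "loading":
          return "Show a skeleton or spinner";
        case "empty":
          return "Show an empty state with a call to action";
        case "error":
          return `Show an error: ${state.message}`;
        case "loaded":
          return state.nextPage !== null
            ? `Render ${state.items.length} items with a 'load more' control`
            : `Render ${state.items.length} items`;
      }
    }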

Prove

Armed with designs/scribbles/prototypes, we can now begin to learn where our understanding ends and assumptions begin.

Given the costliness of testing with users, it pays to get these early-stage concepts in front of knowledgeable team members as a priority. Engineers, project managers, support staff, stakeholders, etc. can all give valuable insight before you commit to user interviews.

Whatever the feedback, prepare to be jumping back and forth between 'Ideate' and 'Prove' quite a bit…

Questions to be answered

  • Are my ideas feasible from an engineering or company perspective?
  • Do my concepts resonate with users and meet their needs? If the criteria aren't being met, circle back for another round of ideation.

Implement

Gone are the days of 'throwing a design over the wall' and leaving an engineering team to decipher your intentions. Even if you've carefully prototyped your interactions and designed all your error states, digital projects still have a way of throwing you curveballs during the build process.

Once you've got a solution signed off, carry on championing your designs throughout the implementation stage, and be prepared to pair with engineers on the finer details as they emerge.

Questions to be answered

  • Has my solution been successfully translated from design to product?
  • Do I need to revisit any details from my design (text wrapping? copy lengths? spacing combinations?) that haven't quite worked in practice?
  • By shipping this functionality, will I be meeting my success criteria?
  • If I need non-standard tracking, have I spoken to the data science team to get this in place? (A rough sketch of what that might look like follows this list.)
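
On that last point, it can save a lot of back-and-forth to agree the shape of any custom events with the data science team before launch. Here's a hypothetical sketch in TypeScript (the event name, properties, and track function are all invented for illustration, not a real analytics API):

    // Hypothetical shape for a custom analytics event, agreed with the
    // data science team ahead of launch.
    interface TrackingEvent {
      name: string;                                            // e.g. "onboarding_step_completed"
      properties: Record<string, string | number | boolean>;
    }

    // Stand-in for whichever analytics client the team actually uses.
    function track(event: TrackingEvent): void {
      console.log("tracked:", event.name, event.properties);
    }

    // Example: capturing the success criterion we set back in 'Define'.
    track({
      name: "onboarding_step_completed",
      properties: { step: 3, durationMs: 5400, skippedHelp: false },
    });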

Evolve

Shipped? Nice work! There's probably pressure to move straight on to the next big project, but launch day shouldn't be seen as the end of our process. Following up on a project gives us a chance to learn from both our successes and our missteps.

Questions to be answered

  • Has this functionality actually solved the problem or opportunity that we hoped it would?
  • Can we back this up with analytics data, reduced support tickets, improved NPS scores, etc.?
  • What can we learn from this project that will improve future ones?