Design audits are an often overlooked tool that gives designers and design leaders valuable insights and energy for change.

A group of quirky robots conducting a design audit on a surreal cityscape, with magnifying glasses and clipboards in hand, while a perplexed human designer looks on in confusion, digital art, generated with DiffusionBee

☕ Design audits - improving collaboration, quality, and alignment

Design audits seem to be an underappreciated tool in designers’ and especially in design leaders’ toolboxes.

Audits, however, are a great starting point when somebody joins a new team or organization, giving additional insight into what to work on, plus energy and motivation to get started. Auditing regularly is even better, as it shows how the experience evolves and what tasks lie ahead for the next period.

A design audit in the simplest terms is just a review of part of the experience for a given scope. Things like

  • analyzing the main user flow for usability issues,
  • reviewing brand usage across various channels,
  • making a list of accessibility issues,
  • creating an interface inventory for design system components, and
  • looking through written content for tone-of-voice consistency

all count as audits. The difference from “just reviewing things” lies in the clear scope for the given review and the concrete criteria for what the review covers. In a way, an audit is a more structured version of the inspection designers do anyway when they look at a designed artifact.

There are different ways to conduct a design audit, depending on the scope, criteria, and resources available. Some common methods and techniques for conducting a design audit are:

  • Heuristic evaluation: A group of designers or researchers evaluate the design based on a set of principles or guidelines, such as Nielsen’s heuristics or ISO standards. They identify usability issues, design flaws, and opportunities for improvement and rate them based on severity and frequency.
  • Expert review: Experts review the design based on their professional expertise and knowledge.
  • User testing: Recruit users of the product to test the existing design. User testing can be conducted in-person or remotely, using various techniques such as moderated or unmoderated sessions.
  • Analytics review: Analyze the quantitative data, such as traffic, conversion rates, bounce rates, or click-through rates. It helps identify patterns, trends, and anomalies that could indicate usability or design issues.
  • Content review: Analyze the quality, relevance, and consistency of the content such as text, images, videos, or audio. It helps ensure that the content aligns with the user’s needs and expectations and supports the overall user experience.
  • Design system audit: Evaluate the consistency, accessibility, and scalability of the design system used. It helps ensure that the design elements, patterns, and components follow the same standards and principles and can be easily maintained and updated.
  • Competitive analysis: Research and analyze the design of the competitors in the same industry or market. It helps identify best practices, trends, and opportunities for differentiation and innovation.
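As a small illustration of the analytics-review technique above, here is a minimal sketch in Python, with made-up event data, that computes per-landing-page bounce rates from raw pageview events. In practice an audit would pull these numbers from an analytics tool; the function and data below are purely hypothetical.

```python
from collections import defaultdict

def bounce_rates(pageviews):
    """Compute per-landing-page bounce rate from (session_id, page) events.

    A session "bounces" if it contains exactly one pageview; the bounce
    rate of a landing page is the share of sessions starting there that
    bounced.
    """
    sessions = defaultdict(list)  # session_id -> ordered pages viewed
    for session_id, page in pageviews:
        sessions[session_id].append(page)

    starts, bounces = defaultdict(int), defaultdict(int)
    for pages in sessions.values():
        landing = pages[0]
        starts[landing] += 1
        if len(pages) == 1:  # single-page session counts as a bounce
            bounces[landing] += 1

    return {page: bounces[page] / starts[page] for page in starts}

events = [
    ("s1", "/pricing"), ("s1", "/signup"),  # s1 continues: no bounce
    ("s2", "/pricing"),                     # s2 bounces on /pricing
    ("s3", "/home"), ("s3", "/pricing"),
]
print(bounce_rates(events))  # {'/pricing': 0.5, '/home': 0.0}
```

An unusually high bounce rate on an important landing page is exactly the kind of anomaly the analytics review should surface for closer design inspection.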

A combination of techniques can also be used, and is better for triangulating problems, if resources allow for it. Whichever technique is chosen, an audit starts with a clear starting point and finishes with a report.

The starting point needs to clarify three things. First, the scope: which parts of the experience should be covered. Usually this is the design team’s area of ownership, the most important user flow, or similar. Second, the evaluation criteria: what the audit focuses on. In some cases a few words are enough; in other cases (for example accessibility) the criteria can be much more elaborate. Third, the logistics for the audit: timelines, tools, and team members.

The report depends on the purpose of the audit. If a designer does an audit when entering a new team, the results can be shared along with recommendations directly from Figma, where the screenshots were collected. In other cases, a more detailed presentation, exhibition or talk might be necessary.

Sometimes the scope is focused on a specific aspect (like brand or content), which makes it very clear what to look at. But as soon as the audit targets more general flows or the overall experience, this can get a bit muddy. Since designers look at their Figma files most of the time, other pieces of the experience can get forgotten. So things like device switching, emails, customer service touchpoints, and the like should also be considered for an experience-focused audit.

Auditing when entering a new team or organization is a great way for a designer to pick up the context. For design leaders, auditing in a recurring fashion is also a great way to see changes over time, highlight the work the design team has done, and get feedback on the team’s strategy. If there are issues the team wishes to pursue, an audit provides good input for making a push. For example, laying out UI inconsistencies after a design system audit is more persuasive than just talking about these issues when the goal is buy-in for more design system work.

Auditing on the product team and org level has a few additional benefits for leaders:

  • Improving design quality: Product design audits provide an opportunity to review the design process, identify weaknesses, and make improvements to design quality. This can help the team create more effective and efficient designs, reducing the number of errors and iterations required.
  • Enhancing Collaboration: Conducting audits on the design team level can foster collaboration and communication between team members. It can also help build a shared understanding of the design process, standards, and goals, leading to more cohesive teamwork and higher morale.
  • Aligning with Business Objectives: A design audit can help ensure that the design process is aligned with business objectives, ensuring that design decisions are based on evidence and not just personal preferences. This can also help the design team better articulate the value of their work to stakeholders.
  • Identifying Skills Gaps: Audits can highlight skills gaps within the team, helping design leaders identify areas where team members may need additional training or support.

🥤 To recap

  • Design audits are an underutilized tool in the designer’s toolbox, but they provide valuable insights and motivation for designers and design leaders.
  • A design audit involves a review of a specific part of the user experience, with a clear scope and criteria for evaluation.
  • Different techniques can be used for conducting a design audit, including heuristic evaluation, expert review, user testing, analytics review, content review, design system audit, and competitive analysis.
  • The starting point of a design audit should clarify the scope, evaluation criteria, and logistics of the audit, and the report should reflect the purpose and focus of the audit.
  • Conducting design audits regularly can improve design quality, enhance collaboration, align with business objectives, and identify skills gaps within the team.

This is a post from my newsletter, 9am26.

🍪 Things to snack on

Fresh Eyes Audits: Your key to better UX design by Jen Enrique is a great example of using design audits in practice, as a useful tool when starting on a new product or product area.

Since creating a new design system or performing a major iteration can feel so daunting, I found Brad Frost’s interface inventory quite helpful. There are a few useful tips about how to create an interface inventory, why someone should have it, and a few interesting examples.
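The interface inventory Brad Frost describes can be seeded mechanically before anyone curates it. Here is an illustrative Python sketch, using only the standard library, that tallies tag-and-class combinations in markup; the markup strings are made up, and a real inventory would of course involve screenshots and human judgment on top of such counts.

```python
from collections import Counter
from html.parser import HTMLParser

class InterfaceInventory(HTMLParser):
    """Tally (tag, class) combinations as a rough seed for an interface inventory."""

    def __init__(self):
        super().__init__()
        self.counts = Counter()

    def handle_starttag(self, tag, attrs):
        classes = dict(attrs).get("class") or ""
        for cls in classes.split() or [""]:  # count class-less tags too
            self.counts[(tag, cls)] += 1

inventory = InterfaceInventory()
inventory.feed('<button class="btn btn-primary">Save</button>'
               '<button class="btn-primary">OK</button>')
print(inventory.counts.most_common())
# [(('button', 'btn-primary'), 2), (('button', 'btn'), 1)]
```

Even a crude tally like this quickly reveals near-duplicate component variants (here, two slightly different primary buttons), which is exactly what the inventory exercise is meant to expose.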

A nice use case and a lot of great tips from Maria Margarida in Running a Product Design Audit. Since a complete audit can be intimidating, the article suggests clarifying which pages to cover in a given round and making the audit process a routine, so the most important pages are checked more frequently. The audit should collect both qualitative and quantitative data, and the report and next steps should not only list conclusions and recommended actions but also include an immediate iteration on them.

A clear scope for a design audit is to check for accessibility issues. Olena Bulygina provides an overview of this in Accessibility audits: how to do a ‘quick and dirty’ audit. Compared to other types of audits, accessibility audits are better tooled up: at least parts of them can be automated. Still, checking against the WCAG guidelines, while straightforward, can take quite some time.
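To give a feel for the “quick and dirty” automation the article alludes to, here is a minimal Python sketch, using only the standard library, that flags images missing alt text, one of the simplest WCAG checks (Success Criterion 1.1.1) to automate. It is an illustration with made-up markup, not a substitute for a real accessibility checker.

```python
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    """Collect <img> tags without an alt attribute (WCAG 1.1.1, non-text content)."""

    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        attributes = dict(attrs)
        if tag == "img" and "alt" not in attributes:
            self.missing.append(attributes.get("src", "<no src>"))

def images_missing_alt(html):
    checker = MissingAltChecker()
    checker.feed(html)
    return checker.missing

page = '<img src="logo.png" alt="Acme logo"><img src="hero.jpg">'
print(images_missing_alt(page))  # ['hero.jpg']
```

Checks like this cover only a fraction of the guidelines; contrast issues, focus order, and meaningful alt text still need human review, which is why the manual pass remains the time-consuming part.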

While Anna Kaley’s Content Inventory and Auditing 101 is obviously about content rather than design elements, the approach is quite insightful. First, to do an effective audit, an inventory needs to be established that sets the scope for the audit. Second, to set the stage for the audit, people (like ownership and collaboration), processes (like what to start with), and tools (for example automation or timeboxing) need to be set.