Translates complex audit standards, requirements, and controls into clear, actionable tasks for engineers, resulting in better organized, easier to understand, faster audits.

Challenge

A two-phase project: first, a rapid MVP to engage early-adopter clients and design partners; second, a scalable, longer-term platform.

Role

Led end-to-end design through research, wireframing, visual design, prototyping, and cross-team collaboration.

I worked closely with the CPO, Product Manager, and Engineering Lead to understand user needs and deliverables. I gained valuable background knowledge from our audit expert, the Customer Success Manager.

AUDIENCE & RESEARCH

Three distinct yet related user types:

1
Audit Manager
The client user responsible for the overall success of the audit. This person may be a seasoned audit veteran or have no experience whatsoever, and our solution needs to cater for that entire spectrum. They coordinate the requirement-gathering efforts of staff and potentially vendors while working with the Auditor.
2
Evidence Gatherer
Client staff members responsible for personnel, procedures, or technical security for apps and infrastructure. More often than not, these people have little to no prior experience with audit evidence gathering, and little time or patience for the audit-related tasks assigned to them.
3
Auditor
The person or team assigned by the audit firm to obtain evidence of audit requirements from the client. They understand audit requirements from a legal and business perspective but often lack the ability to explain those requirements in a way their clients can understand.

Process: Phase 1

Rapid MVP

Freshly funded, the founding team needed an initial MVP delivered with aggressive speed-to-market as the driving factor. Prior to my joining the team, the founders had validated their concept with both auditors and potential clients.

We discussed the risks involved with this research-light project, the many assumptions we’d need to make, and the possibility that subsequent efforts could be major reworks rather than iterations on a solid initial release. My concerns were eased, however, when my own research revealed that the tools currently in use were extremely basic and inefficient.

Existing solutions were typically provided by the audit firm and consisted of crude spreadsheets tied to file shares like Google Drive or Dropbox, plus endless emails and Slack messages. They were poorly organized, offered minimal instruction or assistance, and relied on an incredibly inefficient completion mechanism of uploading and linking files, often multiple times for the same item.

Overall, existing solutions were confusing, inefficient, and laborious. Creating a SaaS tool that logically structured the effort, provided better requirements guidance, and removed duplicated effort seemed a worthy and valuable MVP goal.

Approach

Having to create a complex solution from scratch, at speed, is always a sobering exercise, as there’s little time for research, prototyping, or feedback loops. Having explained the caveats related to this kind of MVP effort, I proceeded to interview the CEO, his co-founder, the CTO, and the head of customer success, a seasoned Deloitte auditor.

The interviews yielded a very extensive list of ideas, which I organized into a Google Sheet. I then led the team through a MoSCoW exercise to group features into Must, Should, Could, and Would categories, and had the leadership team assign each item a value (1-5) and an effort (1-5) on opposite scales. We re-reviewed the Could and Would items through a lens of non-inclusion in the MVP, found a few strays, and moved those into the Must or Should categories.

The next exercise was to write each Must and Should idea onto a post-it note and have the team place each one in the appropriate value/effort quadrant, then discuss and adjust. Once complete, I created a facsimile in Miro (formerly RealtimeBoard) and shared it for final validation ahead of product and design efforts.
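
To make the scoring mechanics concrete, here is a minimal sketch of the value/effort quadrant mapping in TypeScript. It is illustrative only: the names and quadrant thresholds are my assumptions, and the actual exercise was run on a spreadsheet and post-it notes.

    // Hypothetical sketch of the value/effort quadrant mapping from the
    // prioritization exercise. Names and thresholds are illustrative only.
    type MoSCoW = "Must" | "Should" | "Could" | "Would";

    interface Idea {
      title: string;
      category: MoSCoW;
      value: number;  // 1 (low) to 5 (high)
      effort: number; // 1 (low) to 5 (high)
    }

    type Quadrant = "quick win" | "big bet" | "fill-in" | "money pit";

    // Value and effort sit on opposite scales: high value / low effort wins.
    function quadrant(idea: Idea): Quadrant {
      const highValue = idea.value >= 3;
      const lowEffort = idea.effort <= 3;
      if (highValue && lowEffort) return "quick win";
      if (highValue) return "big bet";  // high value, high effort
      if (lowEffort) return "fill-in";  // low value, low effort
      return "money pit";               // low value, high effort
    }

    // Only Must and Should items made it onto the quadrant board.
    const board = (ideas: Idea[]) =>
      ideas
        .filter((i) => i.category === "Must" || i.category === "Should")
        .map((i) => ({ title: i.title, quadrant: quadrant(i) }));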

Prior to my joining, the CEO had modelled his vision using Jira. A little unusual, for sure, but it was an effective way of showing hierarchy, grouping, and actionable items requiring evidence uploads. Several ‘friendly’ client prospects had been shown the Jira board and liked the way it organized and supported the audit, an interesting signal.

It was a practical way to ramp up my industry knowledge and helped me understand the information architecture (IA) of the audit data and requirements in a usable way. As a reference model, it accelerated my ideation process.

I spent time with our in-house audit specialist and, thankfully, a couple of client prospects who were close to the founders. I needed to understand what an audit involved procedurally and legally, and how each side (auditor and client) approached the effort. I then tried to understand and absorb the human-factors side of the audit process. What is an audit actually like, day to day, over months? What did auditors complain about most often, and likewise the clients? Where were the friction points, the confusion, the frustration, the needlessly repetitive work?

Key Learnings

1
Clients rarely understand the language of audit requirements. This leads to extensive discussions with audit staff who often lack the depth of knowledge to translate generic audit terms into customer-specific action items.
2
Many audit requirements across different sections are satisfied by a previously provided evidence item, which means a lot of duplicated effort repeatedly uploading the same evidence.
3
The typical effort required is almost always far more than the client anticipates. This often leads to under-budgeted resources and highly stressful deadlines.

Clearly, a lot of immediate value could be added to an MVP product release.

There were no publicly accessible competitors in the space, so nothing valuable could be gleaned from competitive assessments.

Given the extreme time constraints of “a couple of weeks” and the light concept validation with prospective clients and audit firms alike, it was time to start thinking through the MVP.

Design

High-level MVP scope:

  • Ability to create an audit (only SOC 2 Type 1 was supported for the MVP)
  • Dashboard showing progress both overall, and by Common Criteria
  • Organize the standard into logical groupings: Common Criteria, Point of Focus, Task, and Evidence Requirement (see the data-model sketch after this list)
  • Ease of locating information via search, filters, and sorting
  • Evidence upload
  • Guidelines and best practices for evidence requirements
  • Task level status flags
  • An administration system to create companies, create users (just one type, a super-user, for MVP), send onboarding emails, and reset passwords
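
To make that grouping concrete, below is a minimal data-model sketch of the hierarchy in TypeScript. Field names are hypothetical; the production schema was worked out with engineering.

    // Hypothetical sketch of the audit hierarchy used to organize the standard:
    // Audit -> Common Criteria -> Point of Focus -> Task -> Evidence Requirement.
    type TaskStatus = "not started" | "in progress" | "complete";

    interface EvidenceRequirement {
      id: string;
      guidance: string;          // best-practice guidance shown to the client
      evidenceFileIds: string[]; // uploaded files satisfying this requirement
    }

    interface AuditTask {
      id: string;
      title: string;
      status: TaskStatus;        // task-level status flag from the MVP scope
      requirements: EvidenceRequirement[];
    }

    interface PointOfFocus {
      id: string;
      name: string;
      tasks: AuditTask[];
    }

    interface CommonCriteria {
      id: string;                // e.g. a criteria code in SOC 2 terminology
      name: string;
      pointsOfFocus: PointOfFocus[];
    }

    interface Audit {
      id: string;
      standard: "SOC 2 Type 1";  // the only standard supported at MVP
      criteria: CommonCriteria[];
    }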

Stretch goals:

  • Auto-assign evidence uploads to other tasks with matching requirements (see the matching sketch after this list)
  • A “Task View” listing all Audit Tasks in tabular form
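
A rough sketch of how that auto-assignment could work, assuming each requirement carries a comparable fingerprint (for example, a normalized requirement key). Names are hypothetical throughout.

    // Illustrative sketch: when evidence is uploaded against one requirement,
    // propagate the file to every other requirement with a matching fingerprint.
    interface Requirement {
      id: string;
      fingerprint: string;       // assumed: normalized requirement text or key
      evidenceFileIds: string[];
    }

    function autoAssign(fileId: string, source: Requirement, all: Requirement[]): void {
      for (const req of all) {
        if (req.fingerprint === source.fingerprint && !req.evidenceFileIds.includes(fileId)) {
          req.evidenceFileIds.push(fileId); // one upload satisfies every match
        }
      }
    }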

Notes:

  • Extensive discussions with engineering around data structures and information architecture to support future standards as they were researched, then added.
  • While a Closed Audits screen was required, we prioritized it post-MVP since, by definition, we had several months from the start of a new audit before that capability would be needed

Getting Started Form

Current Audit Dashboard

The core user experience consisted of three main functional sections:

Dashboard

Overall audit progress, total tasks broken down by status, audit requirements grouped into Common Criteria, and the ability to view current and prior audits.

Wireframes and visual design

Outcome

Working closely with stakeholders, product management, engineering, and our in-house audit expert, we launched our MVP on schedule.

Feedback from our early-adopter beta testers was almost entirely positive. Compared with existing tools, our 1.0 product was better organized, easier to use, and far more efficient in terms of evidence upload and repurposing. We actually made a sale on the day of our launch!

The non-positive feedback centered on wanting the platform to do more, e.g. task assignment to other team members with status checking, clearer explanations of tasks “in plain English”, auditor access, and automated evidence gathering. All good requests and great input for future releases.

An interesting learning was that our own engineers needed to spend time with our audit expert to gain context for their work as it related to the overall project. A good lesson.

Overall, a very successful initial product launch.

Process: Phase 2

Scalable SaaS Audit Platform

Post-MVP launch, early clients, prospects, and investors were satisfied that we now had an in-market research vehicle and a fledgling source of revenue. With that startup stress handled, it was time to explore and define a platform that would better address current client needs, support audit firms and additional compliance standards, and be designed to accommodate known and predicted future features.

Approach

I spent time talking to our existing single client, as well as attending as many sales calls as I could, to understand points of resonance, desired features and capabilities, and the perceived level of pain/value associated with each. I also worked with customer success to understand bugs, unclear functionality, and feature requests from that channel. Lastly, we scoped and initiated an active campaign to add additional design partners, which proved incredibly revealing and useful.

Key Learnings

1
A major discovery was that while clients liked the dashboard as a management tool, their primary request was an actionable list of audit tasks in a ‘To Do’ list format, something they were familiar with. Auditors, on the other hand, loved the organizational structure of the standard, as that was how they processed it.
2
Client staff, especially engineers, were not at all interested in the audit. The work required of them was onerous, and they didn’t understand the audit language or the generic technology and security requests, given the wide variability in infrastructure and software application specifics.
3
Many voiced a variation of the same request: a To Do list in as clear and simple language as possible, with example evidence, or their own company’s evidence from a prior audit, for reference.
4
When shown a wireframe of an ‘evidence-based’ ‘to do’ list approach, passionate responses ranged from “Hell yes, this, build this!” to “I’d pay almost anything for this tool” to “No-one, literally no-one is approaching audit this way - this is a winner!”

Design

With more time and a mandate to create a scalable design approach, I started researching design patterns typically used in administration systems.

Given what I knew would be future roadmap features and capabilities plus a general desire to create a system that could expand to fit the unknown, I ended up with a modular solution.

FLEXIBLE CONTENT MODULES

To support a variety of personas, functionality, and use cases, I wanted to create flexible content zones to accommodate requirements.

Home tab, split modules
Home tab, three columns
Home tab, three rows

Highlights:

  • Bold brand colour palette for the background and framework components, while keeping the core functionality brand-agnostic using grey, black, and white. Designed to accommodate customized branding (e.g. for a substantial partnership) all the way up to full white-label capability.
  • A responsive canvas that expands to fill the browser window.
  • Expandable tabs for accessing major system components. The first three tabs are actually audit statuses, so they should really be incorporated into one tab; after discussion, we decided to take this approach initially and review it in a future release when more tabs would exist.
  • System or utility tabs differentiated by position and colour.
  • Clear, simple notification and profile functionality.
  • Modular content panels that support full-canvas height, half height, or fixed height, with flexible widths defined as percentages or fixed values. Some panels would expand to fill the available canvas while others stayed locked (see the configuration sketch after this list).
  • I also wanted to support custom layouts based on role or preference, as well as a build-your-own drag-and-drop system, in a future release.
  • Panels could potentially also be expanded or collapsed as needed.
  • Each panel would function as a parent container for a variety of content.
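
As a sketch of how a panel’s sizing contract could be expressed, the configuration below captures the behaviours listed above. Property names are my own, not the shipped spec.

    // Hypothetical configuration for a modular content panel.
    type PanelHeight =
      | { kind: "full-canvas" }
      | { kind: "half-canvas" }
      | { kind: "fixed"; px: number };

    type PanelWidth =
      | { kind: "percent"; value: number } // flexible, as a % of the canvas
      | { kind: "fixed"; px: number };     // locked width

    interface PanelConfig {
      id: string;
      height: PanelHeight;
      width: PanelWidth;
      expandsToFill: boolean; // expands into remaining canvas space, or locked
      collapsible: boolean;   // panels could be expanded/collapsed as needed
      contentType: string;    // each panel is a parent container for content
    }

    // Example: a half-height panel that flexes to 60% of the canvas width.
    const taskListPanel: PanelConfig = {
      id: "task-list",
      height: { kind: "half-canvas" },
      width: { kind: "percent", value: 60 },
      expandsToFill: true,
      collapsible: true,
      contentType: "task-list",
    };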

CORE EXPERIENCE PROTOTYPE

A selection of screens for user research with clients, prospects and design partners.

Example Layouts

Client Audit Manager Role: Showing ‘no content’ layout

Client Audit Manager Role: Empty State (No Audits)

Evidence Gatherer Role: Task List but no access to other tabs

After I walked through this approach with the group that had previously been so vocal about the wireframe version, feedback was even more positive.

Engineering

I set up a series of meetings with engineering leads and specialists to deep-dive into proposed functionality, ask and answer questions, and, over time, gather ballpark estimates of effort and very approximate production-ready dates.

This is almost always the most sobering phase of a project. Design hand-off often feels like a huge win, the end of a major effort, which it is, but it’s really just the end of one phase before the biggest piece, engineering, begins.

From discussions with leadership, engineering, customer support, product, and design, it was decided that this approach was absolutely the right solution for the scope of the product design requirements.

It was decided, however, that due to recent competitive pressure, we needed to innovate at a faster pace than this solution would allow.

What to do? After some consideration, I devised a hybrid approach, blending current and new designs that could be implemented in phases: an iterative solution rather than a complete overhaul.

Hybrid Design Approach

Instead of a complete overhaul, I blended the new designs with the current MVP inverted-L navigational layout.

By keeping the existing framework but implementing the new modular content panels, we drastically reduced engineering time and production-ready timelines.

The overall responsive framework would then be scheduled as a separate, substantial effort at the appropriate time on the product roadmap.

Email Templates


Reporting System

The reporting system created as part of the Automated Evidence Collection feature presented some interesting challenges, as report content is created in separate processes, queued, and then rendered. The data that needed to be represented was also often very long.
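
A simplified sketch of that pipeline in TypeScript: sections are produced by separate processes, queued per report, and assembled in order once all producers finish. Names are illustrative, not the production implementation.

    // Illustrative sketch of the reporting pipeline described above.
    interface ReportSection {
      reportId: string;
      order: number; // position after the custom cover page
      html: string;  // pre-rendered section markup from a producer process
    }

    class ReportQueue {
      private sections = new Map<string, ReportSection[]>();

      // Called by each producer process as its section completes.
      enqueue(section: ReportSection): void {
        const list = this.sections.get(section.reportId) ?? [];
        list.push(section);
        this.sections.set(section.reportId, list);
      }

      // Once all producers finish, assemble sections in order; very long
      // content is left to the renderer to paginate across multiple pages.
      render(reportId: string, coverPageHtml: string): string {
        const ordered = (this.sections.get(reportId) ?? [])
          .slice()
          .sort((a, b) => a.order - b.order);
        return [coverPageHtml, ...ordered.map((s) => s.html)].join("\n");
      }
    }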

Each report consisted of a custom cover page, like this:

Followed by one or more pages of unique sectional layouts:

Style Guide

I created a Material Design-based style guide for the new portal system.

Outcome

The Material Design-based modular design system was very positively received by our internal team, design partners, and audit firms.

While elements of the new design have been implemented in the production system, the complete implementation is scheduled as a future release effort.