PORTFOLIO

Demonstrating Value
(Capital One)

Using data to demonstrate the value of design thinking

Demonstrating the Value of Design 

We achieve better outcomes when we lead with why, understand the goals we’re each striving for and lean on each other’s areas of expertise to bring better experiences to life.

As a design leader, I want to be able to demonstrate the value our team’s work provides to our customers and business partners. When I first joined the team, we worked on making as many updates as possible in an effort to improve a subjective product score.

Problem: 

  • Design work requests were being thrown over the fence with no clear rationale. No one could explain why we were working on these answers; we only knew we'd been asked to work on them because one person said they could be better.

Action:

  • I began reviewing our post-implementation performance metrics at a per-answer level to find out what impact our design changes had after they were released. After a few conversations with our product and data science partners, I learned that most of the answer improvements our team prioritized were low volume, and only a subset had metrics with any significance.

  • I kept elevating my questions and concerns, asking whether we were really solving the right problems when we couldn't demonstrate the impact our work was having. I shared the side-of-desk analysis I'd done with our product and data science partners to get their perspective on it.


Behavior Changes:

One Excel sheet and some persistent questions drove cross-functional team behavior changes.

  • Since we didn't have enough data to reach significance, my product managers agreed to choose one release to prioritize for pre- and post-release metric evaluation.

  • From our release, we selected four answers to be evaluated by an analytics team, which graded 1,600 conversations spanning a four-month time frame. As I predicted, the subjective nature of our performance grading and the low volume of each answer meant our changes demonstrated no measurable impact.

 

We had clearly been prioritizing the wrong work: the changes we spent months trying to release made no difference in our product performance. Our analytics partners recommended we spend more time evaluating work before prioritizing it for improvement, to ensure we take on projects with enough volume to have a measurable impact.

Impacts:

  1. Our product team now prioritizes top-volume intent work and ensures we have customer and business data behind our experiences

    • Throughout 2020, our team prioritized design improvements for our most frequently triggered intents and spent most of the year handling quick-turnaround requests to help customers as pandemic-driven uncertainty escalated.

    • In 2021, our product team obtained buy-in from our executive leadership to focus on improving our top call-driving intent, “Why was my transaction declined?” We had come a long way: from delivering as much as possible to delivering an exceptional experience for one high-impact answer.

    • I am proud that my product management team not only advocated for our ability to focus on one high-impact area, but also insisted we have a dedicated performance dashboard to inform this work.

  2. Our team has increased stakeholder engagement

    • Before, we were lucky if an external partner sent us an email; now these partners regularly attend our core hours, are invested in our deliverables, and are eager to hear about the research our team is working on.

    • Our User Testing research is driving innovative MVP deliverables while also informing our product and stakeholder team roadmaps.

  3. We have regular customer focused ideation sessions

    • At the beginning of our projects, our team hosts ideation sessions where we explore the declined-transaction space using real customer problems and the phrases customers spoke to our chatbot.

    • After that, we explore various ways we could solve these very real customer needs while simultaneously delivering on our business outcomes.

 

Results:

  1. Measurable impact of our design process

    • Our MVP was only in production for the last week of the monthly report we obtained, but it’s already demonstrating a 5% lift in our call containment metrics. We’re eagerly awaiting October’s performance data and have high expectations that the containment rate will continue to rise.

  2. Iterative design approach

    • The team feels strongly that delivering excellent customer experiences in a high-impact area will significantly improve our product and business outcomes. Delivering our initial design isn’t the end, as it usually would have been: the team is invested in experience improvement, and we’re already iterating on second and third versions for customers who need help with declined transactions.

  3. Design is now inspiring our product, technical and AI/ML partners and their roadmaps

    • The user testing findings have inspired our product team to push our current technical constraints. While our developers were initially hesitant to make big changes to our initial launch experience, they quickly got on board when they saw our visuals and how positively our customers reacted to an intelligent launch experience.

 
