Scoreboard

A product for Credit Unions to access an AI-driven credit decision model without integration into a Loan Origination System

Background

Zest.ai is a financial services company that makes AI-driven models for credit underwriting. Until recently, Zest's clients were large financial institutions that required custom models and integration into their Loan Origination System (LOS).

The Goal

The majority of credit unions in the United States are unable to afford AI models; the ROI simply doesn't make sense for them. The goal of the Scoreboard product is to give smaller credit unions access to an AI model without the cost of integration.

Proof of Concept

The Head of Product asked me to make a quick and dirty UI so that engineering could build a proof of concept.  

The POC included an option to submit “application data”: information like employment status, income, and loan account details, not just personally identifiable information (PII). Application data can power features like Credit Policy recommendations, but the form fields were removed because of the extra burden they placed on the user.

Voice of the Customer Interviews

Once the POC was complete, it was time to get reactions from customers. I facilitated 15 interviews with small credit unions, grouped the insights from those interviews around common themes, and shared them with the product team to generate feature ideas.

Survey

As part of research and validation, I created a SurveyMonkey survey to learn about several topics:

  • Solutions without Loan Origination System integration
  • Trust of AI
  • Credit Union Leagues
  • Credit Policies

Summary

Small credit unions need to understand that AI-driven lending is not a robot that removes manual work from the loan process; rather, its true value is as a powerful input into decision making. Demonstrating this point is key to user adoption.

Product Market Fit Interviews

The POC proved that the product was feasible. The Voice of the Customer interviews surfaced user needs. Next, we needed to know if the solution was viable.
To do this, I facilitated several Product/Market Fit interviews. The goal was to understand which features were valuable enough that credit unions would pay for the product.

My role in the interviews was to build rapport and ask questions that got participants to open up:
  • Describe a day in your life
  • What does success look like for smaller credit unions at end of day/end of month/etc.?
  • What is your role in the loan lifecycle?
  • What are some of smaller credit unions' frustrations/pain points?
  • How familiar do you think smaller credit unions are with AI/ML?
  • How many applications do you process on a daily basis?

Results

Credit unions were stuck on the need for LOS integration, so we had to keep it as an option, but they did perceive value in a UI-based product.

Journey Mapping

Now the product needed definition. I created a journey map based on discussions with the product team. Initially, the journey contained a concept for a self-serve customer portal. The portal would contain tools like an ROI calculator to help credit unions determine if Zest was a good fit. A self-serve portal would take the burden off the sales and engagement teams, who wouldn’t be able to sustain the high volume of small credit unions. The self-serve part of the process has been pushed out farther in the product roadmap.

Concept Testing

Once the journey was defined, I created mockups of the main user flow and showed them to users to get their reactions. They pushed back on what they saw as redundant data entry, since their stated goal was automation and efficiency. But digging deeper into their unmet needs showed that Scoreboard could still meet that goal: because Scoreboard is more accurate than bureau models, it reduces the number of applications that require manual review. That results in a net gain in efficiency, despite the redundant data entry.

Iterations

Scoreboard Admin

Early in the design process, the product team asked me to design an internal admin dashboard, but the concept was abandoned due to resource constraints.

Explainability

During research, I learned that users didn't intuitively understand the implications of the Zest score the way they understand a credit bureau score (FICO). To help, I created concepts for ways to explain the "ingredients" that make up the Zest score.

The first concept is a waterfall chart. It shows how different key factors added to or subtracted from the credit score.

The second concept used left/right direction to show how the Zest score was moving in positive and negative directions.

The third concept used a bell curve distribution to show how the borrower’s score compared to the training population for the model.

The final concept used a bar to show the score composition.  This concept was chosen because of feedback from the Data Science team, who told us that the directionality of the score would be difficult to achieve.

Estimated Risk

Every credit score carries an estimated risk of default, and this is a major metric underwriters use to make decisions. The Zest score also implies a level of risk, but underwriters needed a way to understand it. Underwriters intuitively understand the risk behind a FICO score because it has been the standard for decades; the purpose of this feature was to impart the same understanding for the Zest score.

These are two concepts for explaining risk.  The horizontal number line was abandoned in favor of a tooltip to save space and make the UI look less complicated.

I explored other concepts, including a guided walkthrough, a product map, and an onboarding tutorial. During design reviews, the product team’s feedback was that the popup was intuitive and the easiest to implement for an MVP.

Business Review

Clients need to know the ROI of their investment in Zest's software, so Zest normally gives every client a Business Review document showing the performance of the AI model vs. a credit bureau model. For larger clients these are custom reports, but that approach is not viable for the large volume of small credit unions. I created concepts for what business reviews could look like for Scoreboard, but the feature is on hold.

ROI Calculator

Smaller credit unions operate on thin margins, so they need to know if an investment in an AI model is worth the cost. A member of the client management team asked me to put together this ROI calculator concept, but it was deprioritized.
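
To give a sense of the back-of-envelope math a self-serve ROI calculator could surface, here is a minimal sketch. The inputs, names, and formula below are illustrative assumptions only, not the calculator's actual design.

```typescript
// Hypothetical sketch of a simple annual ROI estimate for an AI underwriting model.
// Every input and the formula itself are assumptions for illustration only.

interface RoiInputs {
  annualSoftwareCost: number;    // yearly cost of the Scoreboard subscription
  extraApprovalsPerYear: number; // additional good loans approved thanks to the AI model
  avgProfitPerLoan: number;      // average profit per approved loan
  avoidedLossesPerYear: number;  // charge-offs avoided through better risk ranking
}

function estimateAnnualRoiPercent(inputs: RoiInputs): number {
  const annualBenefit =
    inputs.extraApprovalsPerYear * inputs.avgProfitPerLoan + inputs.avoidedLossesPerYear;
  // ROI expressed as a percentage of the software cost
  return ((annualBenefit - inputs.annualSoftwareCost) / inputs.annualSoftwareCost) * 100;
}

// Example: 40 extra approvals at $300 profit each, $8,000 in avoided losses,
// against a $15,000/year subscription -> roughly 33% ROI.
console.log(estimateAnnualRoiPercent({
  annualSoftwareCost: 15_000,
  extraApprovalsPerYear: 40,
  avgProfitPerLoan: 300,
  avoidedLossesPerYear: 8_000,
}).toFixed(0) + "% ROI");
```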

Usability Testing

I tested the product for usability and to validate features for each sprint during the product build. The purpose of the tests was to discover any usability issues with scoring an application, multi-borrower scoring, documents, and score history. I showed testers a prototype built in Figma, gave them a task based on the use case for the feature, and observed their behavior.

Design Reviews

As I designed new features, I held design critiques with members of the product and engineering teams. They provided feedback on how well the designs met user needs and how to improve them.

For one of the critiques, I used a FigJam board to capture insights.  For other reviews, I showed prototypes and captured comments.

Responsive Design

I provided specifications and mockups to show how pages and components would change at different breakpoints.

Pilot Launch

Scoreboard went live in January 2023 with some friendly clients who partnered with us as beta testers. While our partners used the beta and provided feedback, I worked with the product and engineering teams to build the rest of the MVP features.

MVP Launch

Scoreboard is now available for wide release.  It includes a Score History page where users can review their previous transactions.

Multi-Borrower Applications

Many types of loans can have up to two co-borrowers to help the primary applicant, and Scoreboard needed to handle this use case. This set of screens shows how the user selects one, two, or three borrowers on an application and views results. I designed different versions of this UI and showed them during design reviews and sprint grooming. After gathering feedback from the team, I decided on this tab system with checkboxes to include borrowers.

I explored many layout iterations for the best way to display the results for multiple scores.  Ultimately, a tab system for results got the best feedback.  In the case of multiple borrowers, an underwriter could use the summary in the overview tab to make a decision.  They could also see individual scores in the tabs.

Future Releases

One actionable piece of feedback was to simplify entering an address. I had early concepts for predictive address search that I could now bring into the current build. Upcoming releases will include predictive address search, more multi-borrower support, and better score history details.

Lessons Learned

Scoreboard was a success in iterative product design. What made this project work so well for me was the relationship with my product manager. We collaborated closely and used each other as thought-partners, and because of that, we built a lot of trust in one another. I trusted that the design and features could improve over time, and they did. This allowed me to keep focusing on successful software releases that met release goals without worrying that I had to fight to get every design item into a sprint. Without that trust between product and design, I don't think we could have made such successful, iterative gains.