A recent article in American Banker asked the question “Is it OK for lending algorithms to favor Ivy League schools?” It begins by noting that much of the energy behind the fintech movement’s push to become chartered banks comes from its promise of financial inclusion. Unfortunately, inclusion is not really the goal of the fintech (a.k.a. alternative lender) movement; making money is. And what is the fintech plan for lending? It’s pretty simple, actually. Here is the recipe: take the inclusive element (credit risk data) out of the mix. Instead, go find potential borrowers who can borrow big money (and pay it back), and tell your algorithms that where borrowers went to school (ah yes, especially the Ivy League schools) is key to propensity to pay and makes those borrowers good credit risks. Is this discrimination? You bet.
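To make the recipe concrete, here is a minimal sketch of what such a model could look like. This is not any real lender's scoring function; the school tiers, weights, and thresholds are all invented for illustration, and the point is only that once credit history is removed, alma mater can dominate the outcome.

```python
# Hypothetical scoring rule (illustration only, not a real lender's model):
# traditional credit history is dropped entirely, and the applicant's
# school drives most of the score. Tiers and weights are invented.

IVY_PLUS = {"Harvard", "Yale", "Princeton", "Stanford", "MIT"}

def alma_mater_score(school: str, income: float) -> float:
    """Score an applicant on school prestige plus income, ignoring
    credit history entirely."""
    school_weight = 1.0 if school in IVY_PLUS else 0.2
    # Income contribution is capped so the school term dominates.
    return school_weight * 100 + min(income / 1000, 50)

# Two applicants with identical incomes get very different scores
# purely because of where they studied.
ivy_score = alma_mater_score("Harvard", 60_000)        # 150.0
other_score = alma_mater_score("State College", 60_000)  # 70.0
```

Under this toy rule, an identical applicant is scored more than twice as high for having the "right" diploma, which is exactly the kind of proxy-driven exclusion the article is questioning.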
It is discrimination against the financially excluded, and it is big-time discrimination. Traditional lenders need these wealthier consumers to widen their risk pools; when alternative lenders siphon off those well-performing loans, it becomes far less likely that marginal or unscorable borrowers will be able to access available credit. Not that regulated lenders have done a great job at financial inclusion (they haven’t), but it isn’t all their fault.
At least a good part of the blame has to go to the regulators who, post Dodd-Frank, have very often overridden bank inclusion initiatives with safety and soundness concerns (translation: too big to fail and the like). So many traditional lenders have instead been using alternative data and algorithms to approve loans that traditional credit scoring would decline; in other words, they search for ways to include unscorables. Aite Group research among traditional lenders shows that using trended and alternative payments data to shore up traditional credit data has indeed improved inclusion rates and portfolio performance. This is a win-win for consumers who have been unable to secure access to credit at reasonable rates despite pristine recurring bill-payment records (e.g., telco bills).
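The inclusion approach described above can be sketched as a simple fallback rule: use the traditional score when one exists, and otherwise look at recurring bill-payment history. The thresholds below are invented for illustration and do not come from the Aite Group research cited in the article.

```python
# Illustrative sketch (hypothetical thresholds): fall back to alternative
# payment data, such as on-time telco bill history, when an applicant has
# no traditional credit score.

def decide(credit_score, on_time_bill_ratio):
    """Approve on the traditional score when available; otherwise fall
    back to the applicant's recurring-bill payment record."""
    if credit_score is not None:
        return credit_score >= 660
    # Unscorable applicant: a pristine bill-payment record can qualify.
    return on_time_bill_ratio >= 0.95

decide(700, 0.50)   # scored, prime -> approved
decide(None, 0.98)  # unscorable but pristine telco history -> approved
decide(None, 0.60)  # unscorable with spotty history -> declined
```

The design point is that alternative data here widens the approved pool rather than replacing credit risk data for borrowers who already have it.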
It is long past time for a regulatory agency to look into these fintech credit algorithms; in fact, the OCC probably should have done so before it decided to award fintech companies bank charters. The other key justification for fintech credit practices is the claim that traditional lenders do not have “machine learning,” so their credit decisions are made by individual loan officers and are therefore more likely to be discriminatory. This, too, is a myth: rule- and case-based reasoning decision engines automated consumer lenders’ decision-making processes years ago. Many overly cautious traditional lenders have long reverted to human rather than IT resources to manage exceptions; for those that now anticipate a lending turnaround with an uptick in loan applicant volume, this fintech regulatory review could be just the call to action needed to justify updating policies and procedures, and to finally automate more than 35% of credit approvals.
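A rule-based decision engine of the kind mentioned above is straightforward to sketch: ordered rules, first match wins, with unmatched applications referred to a human. The rules and thresholds here are hypothetical, chosen only to show that automated, auditable consumer-lending decisions do not require machine learning.

```python
# Minimal rule-based decision engine (hypothetical rules and thresholds),
# of the kind that automated consumer lending decisions years before
# "machine learning" entered the conversation. First matching rule wins.

RULES = [
    ("dti_too_high", lambda a: a["debt_to_income"] > 0.45, "decline"),
    ("thin_file",    lambda a: a["tradelines"] < 2,        "refer"),
    ("prime",        lambda a: a["score"] >= 680,          "approve"),
]

def run_engine(applicant):
    """Apply rules in order; return (rule_name, outcome) for the first
    match, or refer to a human underwriter if nothing matches."""
    for name, predicate, outcome in RULES:
        if predicate(applicant):
            return name, outcome
    return "no_rule", "refer"

run_engine({"debt_to_income": 0.3, "tradelines": 5, "score": 700})
# -> ("prime", "approve")
```

Because every decision cites the rule that produced it, engines like this are also easier to audit for discriminatory effects than an opaque learned model, which is part of the article's point.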