How AI Will Revolutionize Fair Lending

Cheyenne Noelle December 4, 2019 Finance


Lending has been a vehicle for discrimination for hundreds of years, even under laws designed to regulate it fairly. Time and again, certain groups of people are left without the opportunity to qualify for a loan: people of color, those without higher-education degrees, immigrants, and younger generations with thin or nonexistent credit histories. Because these underserved communities are viewed as the riskiest, they are usually last in line in the lending process.

Often, these people are labeled “underbanked,” meaning they don’t have a traditional credit score or have very little information in their credit files. They find themselves stuck in a vicious cycle: unable to get credit because they have no credit history to show. A flawed traditional scoring model is what holds these potentially qualified applicants back from accessing loans.

Banks, mortgage lenders, and FinTech startups are starting to leverage the vast amount of data available to make more accurate, informed lending decisions. This is where artificial intelligence (AI) and machine learning (ML) have the potential to revolutionize the fair lending process as we know it today.

What is a loan?

Investopedia defines a loan as “money, property, or other material goods given to another party in exchange for future repayment of the loan value or principal amount, along with interest or finance charges.”

There are many things to take into account with loans. For instance, loans with high interest rates have higher monthly payments, or take longer to pay off, than low-rate loans. Loans can also be secured by collateral, like a mortgage, or unsecured, like most credit cards. Revolving loans or lines of credit can be spent, repaid, and spent again, while term loans are fixed-rate, fixed-payment loans.

These categories combine in practice. Mortgages and car loans are secured loans, while credit card debt is typically unsecured, meaning it is not backed by collateral. A credit card is an unsecured, revolving loan, whereas a home-equity line of credit (HELOC) is a secured, revolving loan.
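The relationship between interest rate and monthly payment described above can be made concrete with the standard amortization formula for a fixed-rate term loan (the loan amounts and rates below are illustrative, not from any real lender):

```python
def monthly_payment(principal, annual_rate, years):
    """Standard amortization formula: M = P*r / (1 - (1+r)**-n),
    where r is the monthly rate and n the number of payments."""
    r = annual_rate / 12          # monthly interest rate
    n = years * 12                # total number of payments
    if r == 0:
        return principal / n      # no interest: just divide evenly
    return principal * r / (1 - (1 + r) ** -n)

# The same $20,000 five-year loan at 6% APR vs. 24% APR:
low = monthly_payment(20_000, 0.06, 5)    # ≈ $386.66 per month
high = monthly_payment(20_000, 0.24, 5)   # ≈ $575.36 per month
```

The gap between those two payments is why pricing matters so much: an applicant pushed into a high-rate product pays thousands of dollars more over the life of the same loan.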


The traditional credit scoring model

The Fair Housing Act (FHA) and the Equal Credit Opportunity Act (ECOA) protect consumers by prohibiting unfair and discriminatory practices. While these laws have been in place for decades, loan officers and landlords have notoriously found loopholes in the system to discriminate against certain groups of people.

Before AI underwriting, loans were processed on paper by a human at a bank. A loan officer would use their own judgment (which is prone to bias) to evaluate whether an applicant could be trusted to manage their finances and pay debts in full and on time. Faced with applicants who were “credit invisible” or underbanked, these loan officers either shied away from granting loans or steered people toward “high-risk, high-priced products.” The result? Applicants most like the loan officer were granted more loans, and those most unlike the loan officer suffered.

Without a way to truly understand how each lender makes its decisions, regulators used statistical outcomes to determine who was lending fairly and who wasn’t. But these statistics are often skewed and don’t capture the bigger picture required to determine whether someone is “risky.” Relying only on a credit score and credit reports leaves out valuable information that may make an applicant qualified for a loan, even if their score says otherwise.

Every lender sets its own criteria for loan approval and pricing. This lack of standardization makes it difficult to predict the APR someone with bad credit will be offered; applicants with poor credit usually receive offers at the high end of the range, with interest rates upwards of 36%. Because research shows that certain groups of people are more likely to have lower credit scores and thinner credit files, these groups are often the target of unfair lending practices and discrimination when it comes to getting a loan.

How AI can level the playing field


Machine learning has the power to reduce discrimination in credit. ML models can use up to 100 times more data points and more sophisticated math to generate a better risk prediction in seconds. ML credit models can fold in more indicators of creditworthiness and surface subtle connections among pieces of information, painting a clearer picture of whether someone is a good credit risk.
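The idea of folding many indicators into one prediction can be sketched with a simple logistic model. To be clear, every feature name and weight below is hypothetical, chosen only to illustrate how alternative data points could combine into a risk estimate; real lender models are far larger and must themselves be audited for bias:

```python
import math

def risk_score(features, weights, bias=0.0):
    """Logistic model: combine many applicant data points into a
    single estimated probability of default between 0 and 1."""
    z = bias + sum(weights[name] * value for name, value in features.items())
    return 1 / (1 + math.exp(-z))     # sigmoid squashes z into (0, 1)

# Hypothetical alternative-data features (illustrative only):
weights = {
    "months_at_job": -0.02,           # longer employment lowers estimated risk
    "on_time_rent_payments": -0.05,   # payment history outside credit bureaus
    "utility_defaults": 0.8,          # missed utility bills raise estimated risk
}
applicant = {"months_at_job": 36, "on_time_rent_payments": 24, "utility_defaults": 0}
p_default = risk_score(applicant, weights, bias=0.5)   # ≈ 0.19
```

An applicant with no traditional credit file still produces a usable score here, because the model draws on signals (rent and utility payments, employment tenure) that the traditional scoring model ignores.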

A growing number of FinTech startups are capitalizing on AI to revolutionize fair lending. These include Underwrite.ai, which creates machine-learning algorithms that lenders can customize to help them make credit decisions. Others, like Upstart, have applied big data and machine learning to the lending process: Upstart provides personal loans to consumers based on an assessment of creditworthiness that draws in part on education and employment history. According to Consumer Finance Monitor, Upstart’s model using alternative data and machine learning “approved 27% more applications than a traditional lending model and yielded 16% lower average APRs.”

AI can clearly increase access to credit for minority and low-income borrowers who have been left out of mainstream lending. It is important to note, however, that unfair lending is still a risk for machine learning models. “The proliferation of machine learning will expose human bias, acting as a ‘moral mirror’,” according to Shannon Vallor, Professor in the Department of Philosophy at Santa Clara University and AI Ethicist at Google. “Models are not just pattern identifiers, but pattern amplifiers,” she shared with Forrester.

To add to Vallor’s point: technology itself is neutral, not inherently harmful. What is harmful are the attitudes that can influence how technology is built and used. With this in mind, technologists can harness AI’s capabilities to make lending fairer, as long as ethics is at the forefront of product design and coding.


Summary


Given the mix of possible societal ramifications, policymakers must consider what practices are and are not permissible and what legal and regulatory structures are necessary to protect consumers against unfair or discriminatory lending practices. The country’s lending laws will have to be updated to keep pace with these technological developments, as they are adopted more widely by banks and other financial companies.

AI is a powerful tool for revolutionizing fair lending practices. When used mindfully, this technology can give more qualified people access to loans to improve their financial situations. That helps remove discrimination and reduce poverty, uplifting society overall and moving toward fairer, equal opportunity for all.

Interested in learning more about how automated insights and decision-making can help the lending process? Check out our lending solution brief for automated, fair lending solutions. We help enterprises like yours ensure your loan processes are compliant and fair, every single time.

About ProcessMaker

ProcessMaker is a low-code business process management and workflow software. ProcessMaker makes it easy for business analysts to collaborate with IT to automate complex business processes, connecting people and existing company systems. Headquartered in Durham, North Carolina in the United States, ProcessMaker has a partner network spread across 35 countries on five continents. Hundreds of commercial customers, including many Fortune 100 companies, rely on ProcessMaker to digitally transform their core business processes, enabling faster decision making, improved compliance, and better performance.
