Fintech is becoming increasingly integrated into consumers’ daily lives.
This technology supports the growth of niche markets in Europe, including alternative finance, crowdfunding, peer-to-peer lending, automated loans, and investment management.
None of these services would work so efficiently, or at all, without the artificial intelligence (AI) systems and machine learning (ML) models that underpin and perform the billions of computations they require every day.
What is fintech?
Fintech, or financial technology, refers to the innovative solutions developed for various financial services such as online banking, mobile payments, and cryptocurrency.
These automated systems are built around self-learning mechanisms: they are developed and trained to adapt and evolve continually.
This, in turn, makes it difficult to understand how and why financial decisions are being made, for instance, why some people are approved for loans while others are rejected. The machines decide, and we trust them.
Just one question: Is this automated decision-making, with no human oversight, legal?
Alternative credit scoring and regulatory compliance
The global fintech industry is growing due to a surge in startups. In 2022, 22% of European unicorns were fintech companies, raising $22.2 billion. However, by late 2022, the sector saw restructuring and layoffs. In the first half of 2023, European fintech funding dropped to €4.6 billion, down from €15.3 billion, due to tighter financial markets and a shift away from high-risk investments.
In tightening markets, banks and fintech companies look to reduce costs. An increasingly popular and cost-effective business model employed by fintech companies is known as the alternative credit scoring model. This model uses easily available data, such as digital footprints, to determine creditworthiness.
Using alternative credit scoring models, fintech companies and financial institutions evaluate people based on a multitude of non-traditional parameters, such as their mobile spending history, utility bill payments, social media environments, and mobile in-app purchases.
Lenders use the phone number or email provided on loan applications to look up customers’ social media profiles on Meta, X (formerly Twitter), Telegram, Snapchat, and their accounts on other platforms like Airbnb, LinkedIn, Pinterest, Microsoft 365, and Discord.
Manually reviewing massive amounts of alternative credit data in all its various formats is where AI algorithms and machine learning models become indispensable. An AI system in finance can, very cheaply and efficiently, scan vast arrays of publicly available data and identify patterns, even in unstructured data.
Combining traditional and alternative data sources for lending enriches conventional banking data. Behavioral analyses of customers' social accounts and their connections with other people are collected, along with their interactions with websites and even text and audio data from credit applications and previously recorded customer service conversations. From all of this, inferences are made about an individual's creditworthiness: in effect, any personal data that can be found about a person on the Internet is collected and analyzed.
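To make this concrete, here is a minimal, purely illustrative sketch of how such a scoring model might combine traditional and alternative signals. The feature names, figures, and cut-off are hypothetical and are not taken from any real CRA or lender.

```python
# Minimal sketch of an alternative credit-scoring model (illustrative only).
# Feature names and data are hypothetical, not any real CRA's inputs.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Each row mixes traditional banking data with alternative signals.
# Columns: [income, missed_payments, utility_bills_on_time_ratio,
#           mobile_inapp_spend_per_month, social_connections_count]
X_train = np.array([
    [42_000, 0, 0.98, 35.0, 310],
    [18_000, 3, 0.60,  5.0,  45],
    [55_000, 1, 0.90, 60.0, 520],
    [23_000, 4, 0.55, 12.0,  80],
])
y_train = np.array([1, 0, 1, 0])  # 1 = repaid previous loan, 0 = defaulted

# Fit a simple repayment-probability model on the combined features.
model = make_pipeline(StandardScaler(), LogisticRegression())
model.fit(X_train, y_train)

# Score a new applicant: the output is a repayment probability,
# which a lender could then turn into an approve/reject decision.
applicant = np.array([[30_000, 1, 0.85, 20.0, 150]])
print("repayment probability:", model.predict_proba(applicant)[0, 1])
```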
Many fintech companies already use alternative credit scoring models in their financial products, either by developing their own scoring applications or by buying credit reports or credit scores from credit reference agencies (CRAs). These credit scores automate creditworthiness calculations and reduce costs.
Are AI and ML models fair?
As noted above, however, there is limited scope for human oversight in these AI-fueled models, because the models learn and adapt by themselves. CRAs use AI to generate credit scores for individuals and sell those scores to fintech companies or financial institutions, which in turn use them to decide whether the individual is creditworthy, placing complete faith in the machines' calculations.
The whole aim of these models is to build algorithms that can process vast amounts of personal data and make predictions from that data, replacing the human element in the process. The algorithm operates dynamically, adapting itself to changes in the data and relying not only on statistics but also on mathematical optimization. The commercial appeal is obvious: these models reduce cost and time by taking human intervention out of the equation.
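A short sketch of what "operating dynamically" can mean in practice: an online model that is refitted incrementally as new repayment outcomes arrive, with no human reviewing how its decision boundary shifts. The features and figures below are hypothetical and assumed to be pre-scaled for simplicity.

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

# Logistic loss fitted by stochastic gradient descent: statistics plus
# mathematical optimization. ("log_loss" is the name in scikit-learn >= 1.1.)
model = SGDClassifier(loss="log_loss", random_state=0)

def update(model, X_batch, y_batch, first_call=False):
    """Incrementally refit the score on the latest observed repayment outcomes."""
    if first_call:
        return model.partial_fit(X_batch, y_batch, classes=np.array([0, 1]))
    return model.partial_fit(X_batch, y_batch)

# First batch of (hypothetical, already-scaled) applicant features and outcomes.
X0 = np.array([[0.9, 0.0, 0.98], [0.2, 0.6, 0.40]])
y0 = np.array([1, 0])
model = update(model, X0, y0, first_call=True)

# A later batch arrives and the model adapts automatically,
# with no developer or analyst reviewing the change.
X1 = np.array([[0.8, 0.1, 0.90], [0.3, 0.7, 0.35]])
y1 = np.array([1, 0])
model = update(model, X1, y1)

print(model.predict(np.array([[0.5, 0.3, 0.70]])))  # 0 = reject, 1 = approve
```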
These AI and machine learning models, however, are not perfect. When creating these models, developers must use enormous quantities of data to train them. If this data contains biases, the models can learn and replicate these biases, leading to skewed or prejudiced outcomes. Similarly, errors can be introduced if the initial programming or the training data is of low quality and contains mistakes or inaccuracies.
The performance and fairness of AI and ML models, therefore, depend heavily on the quality and representativeness of the data on which they are trained and on the soundness of the algorithms the system uses. Hence, human vigilance is needed to check whether these systems make sound and fair choices before they are used to take life-changing decisions about individuals.
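One simple form such a check can take is comparing approval rates across applicant groups on a validation set before a model goes live. The scores, group labels, cut-off, and warning threshold below are all hypothetical; the point is only to illustrate a pre-deployment fairness review.

```python
# Minimal sketch of a pre-deployment fairness check (illustrative only).
import numpy as np

# Model-predicted repayment probabilities and a sensitive attribute
# (0/1 for two hypothetical demographic groups) on a validation set.
scores = np.array([0.82, 0.40, 0.75, 0.30, 0.91, 0.55, 0.48, 0.22])
group  = np.array([0,    0,    0,    0,    1,    1,    1,    1   ])
approved = scores >= 0.6  # the lender's (hypothetical) approval cut-off

rate_g0 = approved[group == 0].mean()
rate_g1 = approved[group == 1].mean()
parity_gap = abs(rate_g0 - rate_g1)

print(f"approval rate, group 0: {rate_g0:.2f}")
print(f"approval rate, group 1: {rate_g1:.2f}")
print(f"demographic parity gap: {parity_gap:.2f}")

# A large gap is a signal for human reviewers to investigate the training
# data and features before the model is used for real lending decisions.
if parity_gap > 0.2:
    print("Warning: possible bias - flag model for manual review.")
```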
The European digital regulation landscape
Considering the above, compliance with the General Data Protection Regulation (GDPR) becomes problematic for fintech companies and financial institutions that use alternative credit scoring models. Article 22 of the GDPR requires human oversight where decisions are automated by technology:
"The data controller shall implement suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests, at least the right to obtain human intervention on the part of the controller, to express his or her point of view and to contest the decision."
Until recently, it was never clear whether fintech companies or traditional financial institutions employing alternative credit scoring models comply with these data protection obligations in their AI-fueled business models.
That is until December 2023, when the European Court of Justice (ECJ) clarified the situation. The case, known as Case C-634/21, was the first time the ECJ was asked to interpret Article 22 of the GDPR.
Article 22 grants data subjects the right "not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her." This case primarily dealt with whether the automated establishment of a credit score amounts to an automated decision within the meaning of Article 22 (1) of the GDPR.
The ECJ ruled that a credit reference agency engages in automated individual decision-making when it creates credit repayment probability scores through automated processing and lenders use that probability value to make financial decisions about the individual.
The implications of this case are significant not only for credit-scoring companies but also for fintech and financial institutions. Fintech companies will now need to provide more transparency about their scoring methods. They will also need to implement safeguards for individuals, such as the right to obtain human intervention, to express their point of view, and to contest automated decisions.
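In code, those safeguards could surface as something like the following sketch: the model produces only a provisional decision, and the applicant can always trigger human intervention, state their point of view, and contest the outcome. This is a purely illustrative design with invented names such as `LoanDecision` and `request_human_review`; it is not a compliance recipe or legal advice.

```python
from dataclasses import dataclass

@dataclass
class LoanDecision:
    applicant_id: str
    score: float                    # repayment probability from the model
    provisional: str = ""           # "approve" / "reject", machine-generated
    status: str = "automated"       # becomes "pending_human_review" on request
    applicant_statement: str = ""   # the applicant's own point of view

def automated_decision(applicant_id: str, score: float, cutoff: float = 0.6) -> LoanDecision:
    """Produce a provisional, machine-only decision from the credit score."""
    decision = LoanDecision(applicant_id=applicant_id, score=score)
    decision.provisional = "approve" if score >= cutoff else "reject"
    return decision

def request_human_review(decision: LoanDecision, statement: str) -> LoanDecision:
    """Applicant exercises the right to human intervention and to contest."""
    decision.status = "pending_human_review"
    decision.applicant_statement = statement
    return decision

d = automated_decision("A-1027", score=0.41)
d = request_human_review(d, "My utility arrears were caused by a billing error.")
print(d.provisional, d.status)
```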
December 2023 was a bad month for fintech's AI-driven business models for yet another reason. In addition to Case C-634/21, a provisional agreement on the forthcoming AI Act was reached on December 9. The EU AI Act also sets out six general principles for the use of AI systems in the EU, the first of which stipulates human agency and oversight for all AI systems.
What’s next for fintech?
Moving into 2024, it will be fascinating to observe how fintech companies and traditional financial institutions navigate compliance with both Article 22 of the GDPR and the forthcoming EU AI Act. That journey may last not months but years as Europe seeks to keep pace with AI innovation while also ensuring regulatory compliance.
Undoubtedly, the spotlight will be on research and cross-disciplinary collaboration between legal experts and technologists in the fintech landscape in the near future.