Ireland: AI in Fintech - investing in AI, with care
At the beginning of this year, the World Economic Forum Annual Meeting at Davos applauded the growth in artificial intelligence (AI), particularly generative AI. Against the backdrop of the world's biggest challenges, reports from Davos suggested that world leaders and business executives were cautiously optimistic for 2023. Commentary has persisted throughout the year, with some lauding investment in AI while others point to a potential "dot.ai" bubble brewing. Time will tell. In the meantime, decisions regarding fintech and digital transformation point to an overarching mindset - investing in AI, with care.
In this Insight article, Rory O'Keeffe, Partner at Matheson LLP, will take you on a short journey through AI and Fintech, discussing their advantages and risks, focusing in particular on what the future holds for Ireland in this area.
What's the AI legal landscape in Ireland?
On July 8, 2021, Ireland published a National AI Strategy titled 'AI - Here for Good.' In addition, on a European level, there is a range of laws and regulations that apply to AI, including the General Data Protection Regulation (GDPR), the Platform-to-Business Regulation (applicable to online search engine providers and online intermediation service providers), and the European Union (Copyright and Related Rights in the Digital Single Market) Regulations 2021 (affording limited protection to ownership of content created by an AI system).
What's on the horizon for AI in Ireland?
- Artificial Intelligence Regulation (AI Act) - the AI Act is the EU's first attempt to regulate AI across the board. It aims to harmonize rules for the use and governance of AI, addressing the risks associated with its development and deployment. The objective is to make the EU a trustworthy global hub for ethical AI, using a risk-based approach. The AI Act classifies AI systems into four risk categories: unacceptable, high, limited, and minimal - each category imposes different obligations on providers. As of June 2023, the Act was in the negotiation stage involving the European Commission, Parliament, and Council of Ministers. It is expected to become law in late 2023 or early 2024, with a 24-month transition period. Regulation of the digital economy continues to grow, with further EU and domestic legislation on the way.
- Product Liability Directive and the new AI Liability Directive - a revised directive updating liability rules for products, and a new directive adapting civil liability rules to harm caused by AI systems.
- Digital Operational Resilience Act (DORA) - a comprehensive framework for ensuring the operational resilience of digital service providers, including those using AI in financial services.
- Network and Information Security Directive 2 (NIS2) - enhancing the cybersecurity of critical infrastructures and digital service providers.
- Digital Services Act (DSA) and Digital Markets Act (DMA) - regulations for online platforms designed to foster competition, impacting AI applications that operate within digital platforms.
- Markets in Crypto-assets Regulation (MiCA) - aims to create a harmonized framework for cryptocurrency-related services, addressing AI-powered crypto-trading platforms.
What are common use cases for Fintech and how can AI enhance them?
- Smart Payments: this refers to providing innovative payment platforms that facilitate faster, more intelligent, and cost-effective payment methods (e.g., mobile wallets, contactless cards, biometric payments, cryptocurrencies).
- Embedded Finance: this refers to the seamless integration of financial tools or services into non-financial platforms or applications, making financial products and services readily available to users within their everyday interactions (e.g., virtual wallets, online marketplaces, personal finance management tools, subscription services, in-app purchases, payroll services, point-of-sale financing).
- Alternative Financing: this includes financing options such as revenue-based financing, crowdfunding, and peer-to-peer lending.
- Regtech: this refers to the use of technology to facilitate regulatory compliance (e.g., compliance management systems, regulatory reporting automation, transaction monitoring solutions).
- Credit Risk Assessment and Credit Underwriting: Fintech companies employ advanced algorithms and data analysis techniques to evaluate credit risk and underwrite loans.
- Know Your Customer (KYC) Use Cases: Fintechs are using technology to streamline the process of verifying the identity of customers.
- Front Office Automation: this includes the use of technology to automate front office tasks such as customer onboarding, account opening, and customer service.
- Personalized Financial Management: Fintechs are using technology to provide personalized financial advice and management services to customers.
The benefits of applying AI to these tasks are evident, and AI technologies have shown immense potential to revolutionize the financial sector in multiple ways, such as:
- identifying threats and fraud detection: AI-driven algorithms can analyze vast amounts of transaction data, identifying suspicious patterns and preventing fraudulent activities, thus bolstering security in financial services;
- improving quality, speed, and accuracy: AI algorithms can process complex financial data in real time, enabling faster decision-making and reducing human error. This has proved especially advantageous in high-frequency trading and risk management;
- enhanced customer experience: AI-powered chatbots and virtual assistants have streamlined customer interactions, providing constant support and resolving queries promptly, improving overall customer experience;
- personalized financial advice: AI-powered robo-advisors can assess individual financial goals, risk tolerance, and market trends to offer tailored investment advice, making wealth management services more accessible; and
- other benefits: automating repetitive tasks, using facial recognition for frictionless payments, and applying predictive modeling to maximize revenue, among others.
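As a purely illustrative example of the pattern-analysis approach behind fraud detection, the sketch below flags a transaction whose amount deviates strongly from a customer's history. This is a deliberately simplified stand-in for the far more sophisticated models used in practice; the function name, data, and threshold are all hypothetical.

```python
from statistics import mean, stdev

def flag_suspicious(amounts, new_amount, z_threshold=3.0):
    """Flag a transaction amount that deviates strongly from history.

    A toy z-score check: real fraud-detection systems combine many
    behavioural features (merchant, location, timing), not just amount.
    """
    if len(amounts) < 2:
        return False  # not enough history to judge
    mu = mean(amounts)
    sigma = stdev(amounts)
    if sigma == 0:
        return new_amount != mu
    return abs(new_amount - mu) / sigma > z_threshold

history = [42.0, 55.0, 38.0, 61.0, 47.0, 52.0]
print(flag_suspicious(history, 50.0))    # typical amount -> False
print(flag_suspicious(history, 5000.0))  # extreme outlier -> True
```

Even this toy example hints at the regulatory questions above: the threshold is a design choice, and a wrongly flagged customer may demand an explanation of it.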
Though new laws and regulations are coming, financial services companies have time to prepare for the upcoming legislation by reviewing existing practices and putting appropriate controls in place. This proactive approach aims to anticipate changes rather than react to them, reflecting the desire to build public confidence in the use of the technology.
What are the key risks of using AI in Fintech?
There are also inherent risks and challenges:
- data privacy concerns: the increasing reliance on AI entails the collection and processing of vast amounts of personal and financial data, raising concerns about data privacy and security breaches;
- algorithmic bias: AI algorithms may inadvertently perpetuate bias and discrimination if trained on biased datasets, leading to unequal access to financial services and biased credit assessments;
- cybersecurity vulnerabilities: the incorporation of AI into financial systems introduces new avenues for cyberattacks, making them potential targets for cybercriminals, necessitating robust cybersecurity measures;
- lack of transparency: complex AI models often lack transparency, making it difficult to understand their decision-making processes and complicating both compliance and regulatory scrutiny; and
- liability: risks may emerge in areas such as intellectual property rights, data protection claims, discrimination and bias issues, and financial loss to customers. The question of who is at fault can become challenging when human decision-making is removed.
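To make the algorithmic-bias risk concrete, one common first check is to compare approval rates across demographic groups (a demographic-parity test). The sketch below is a minimal, hypothetical illustration of that single metric, not a compliant fairness audit; the data and function names are invented for the example.

```python
from collections import defaultdict

def approval_rates(decisions):
    """Approval rate per group from (group, approved) pairs."""
    approved = defaultdict(int)
    total = defaultdict(int)
    for group, ok in decisions:
        total[group] += 1
        approved[group] += int(ok)
    return {g: approved[g] / total[g] for g in total}

def parity_gap(decisions):
    """Largest difference in approval rates between any two groups."""
    rates = approval_rates(decisions)
    return max(rates.values()) - min(rates.values())

# Hypothetical credit decisions: (group label, approved?)
decisions = [("A", True), ("A", True), ("A", False),
             ("B", True), ("B", False), ("B", False)]
print(approval_rates(decisions))
print(parity_gap(decisions))
```

A large gap does not prove unlawful discrimination, but it is the kind of measurable signal regulators increasingly expect firms to monitor and be able to explain.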
A year of remarkable progress in generative AI
Generative AI, a subset of AI, has demonstrated its capabilities in producing content such as text, images, and videos. In Fintech, generative AI has been utilized in various applications such as natural language processing, fraud detection and prevention, and financial market prediction. However, there are several challenges associated with generative AI, including accuracy and reliability, data security, software development, intellectual property protection, and employment implications. The technology is widely regarded as a game-changer, and it is notable that the current draft of the EU AI Act includes an obligation on generative AI tools to comply with additional transparency requirements, such as disclosing that content was generated by AI. Appropriate safeguards and risk management are required here, as they would be for any emerging technology.
Unmasking cyber threats
It is evident that cybercrime now poses as significant a threat to society as traditional crime. In recent times, dependency on technology has increased exponentially. More people in Ireland have adopted a hybrid working model, and as a result there is a greater risk of cyberattacks. Through AI algorithms and data analysis, it is now possible to prevent cyberattacks more readily and successfully than ever before, and more businesses are believed to be depending on AI to strengthen their cybersecurity defenses.
As part of those defenses, the need for operational resilience has been raised by many experts, especially in light of new risk management and incident reporting obligations flowing from the European AI Strategy (2018), including NIS2, DORA, and the proposed EU Cyber Resilience Act. The implications of each of these developments will need to be read alongside the AI Act and the conformity assessment requirements set out there.
AI in Fintech offers immense opportunities for financial institutions and consumers alike. However, the adoption of AI in this sector must be accompanied by a robust legal and regulatory framework to address challenges related to data privacy, bias, and cybersecurity. The legislative initiatives of the EU, including DORA, NIS2, GDPR, DSA, DMA, and MiCA, demonstrate the importance of striking a balance between technological advancements and safeguarding the interests of stakeholders within the financial ecosystem. As the Fintech landscape evolves, the collaboration between legal experts, regulators, and industry players will be crucial to realizing the full potential of AI while ensuring ethical and responsible deployment.
Rory O'Keeffe Partner
Matheson LLP, London