Addressing privacy concerns in FinTech in Hong Kong

Written on 17 Sep 2019

Hong Kong has built a strong environment for fostering innovation and financial technology (FinTech). With its large financial sector and its strategic role as a gateway between Mainland China and the rest of Asia and the world, Hong Kong has the potential to become a leader in FinTech. In March 2019, for example, Hong Kong issued its first virtual banking licences, a move likely to increase adoption of FinTech in the financial services sector.

As FinTech development gathers pace and emerging technologies become a greater part of our lives, increasing amounts of personal data relating to customers will be collected and used in order to deliver more tailored and efficient financial services. At the same time, the threat of fraudulent activity, such as fake identities operating fake accounts, is also rising; this poses a threat to financial stability. In this article, we explore the current risks and regulation around privacy and data security in FinTech in Hong Kong.

FinTech Emerging Technologies

Emerging technologies used in FinTech services and operations come in different forms, and include:

  • electronic payments and remittances, digital wallets and stored value mechanisms (such as e-wallets);
  • data analytics that support the operations of financial institutions (for example, credit scoring, loan processing);
  • artificial intelligence (AI) virtual assistants and chatbots;
  • peer-to-peer (P2P) financing (such as P2P lending and crowdfunding platforms);
  • distributed ledger technology, such as cryptocurrencies and bitcoin transactions, smart contract applications, and blockchain services that help reduce fraud by keeping provenance data on the blockchain; and
  • financial investments, such as stock trading apps, robo-advisors and algorithmic trading and budgeting apps.

FinTech adoption will continue to disrupt the delivery of financial services.

The Hong Kong Monetary Authority (HKMA) has recognised these rapid technological advancements and their growing impact on the banking industry. It has announced a number of initiatives to prepare Hong Kong for a new era of “Smart Banking”, aimed at helping the banking sector rise to a higher level and embrace the enormous opportunities brought about by the convergence of banking and technology, thereby improving the quality of banking products and services for customers.

Privacy Risks in FinTech

While FinTech may provide many benefits to financial institutions, service providers and customers, there are privacy risks associated with its use, as the Office of the Privacy Commissioner for Personal Data, Hong Kong (PCPD) has highlighted. We briefly explore some key risks and areas in which the Personal Data (Privacy) Ordinance (Chapter 486 of the Laws of Hong Kong) (the Ordinance) applies to FinTech.

Collection and use of personal data

Increasing amounts of personal data may be collected or generated with the use of FinTech, and then used or disclosed, with or without a user’s notice or consent, or beyond the user’s reasonable expectations. Data Protection Principle (DPP) 1(3) in Schedule 1 to the Ordinance provides that all practicable steps must be taken by a data user to ensure that data subjects are informed, on or before collection of their personal data, of the purpose for which the personal data is to be used and the potential transferees of the personal data, amongst other things. DPP 3 in Schedule 1 to the Ordinance provides that a data user must obtain the express and voluntary consent of data subjects before using their personal data for new purposes.
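To make the interaction between DPP 1(3) (notice at collection) and DPP 3 (express consent for new purposes) concrete, the sketch below models it in code. This is purely illustrative; all class and field names are hypothetical and do not come from the Ordinance or any real system.

```python
from dataclasses import dataclass, field

# Hypothetical sketch: record the notice given at collection (DPP 1(3))
# and check for express consent before any new use of the data (DPP 3).
@dataclass
class CollectionNotice:
    stated_purposes: set        # purposes stated to the data subject at collection
    potential_transferees: set  # classes of transferees disclosed at collection

@dataclass
class DataSubjectRecord:
    notice: CollectionNotice
    consented_new_purposes: set = field(default_factory=set)

    def may_use_for(self, purpose: str) -> bool:
        """A use is permitted if it was a stated purpose at collection,
        or the subject later gave express consent for it."""
        return (purpose in self.notice.stated_purposes
                or purpose in self.consented_new_purposes)

notice = CollectionNotice({"payment processing"}, {"payment networks"})
record = DataSubjectRecord(notice)
print(record.may_use_for("payment processing"))   # True: stated at collection
print(record.may_use_for("targeted advertising")) # False: a new purpose
record.consented_new_purposes.add("targeted advertising")
print(record.may_use_for("targeted advertising")) # True: express consent given
```

The point of the sketch is that a new purpose is blocked by default and only unblocked by a positive act of consent, mirroring the structure of DPP 3.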

For mobile payments, a vast amount of data (much of it personal data) is often collected, such as mobile phone numbers and proof of identity, as well as contact lists, purchase histories and location data, sometimes without the customer’s knowledge. While such data may enable financial operators and FinTech providers to better customise services and predict preferences and habits, there is a danger that profiling of customers may reveal sensitive data or lead to unexpected new uses beyond a user’s reasonable expectations.

Likewise, credit scoring involving the use of big data analytics and scoring algorithms to assess individuals’ financial standing may draw on a user’s purchasing patterns, social media posts and timeliness of bill payments. The data used, aggregated or generated may go beyond an individual’s reasonable expectations. Often there is also a lack of transparency and insufficient notice to individuals when such data analytics and credit scoring activities are undertaken.

Use of personal data in unfair, inaccurate or discriminatory ways

Personal data may be generated by interactions between the individual and the lender (such as transaction records) or may be inferred through data analytics. Credit scoring algorithms assess individuals’ creditworthiness by analysing various data sets from multiple sources, both publicly and privately available. Inferences drawn from social media and other sources may not be accurate predictors of creditworthiness, which makes verifying the accuracy and relevance of the data a key risk.

DPP 2(1) in Schedule 1 to the Ordinance provides that all practicable steps must be taken by a data user to ensure that the personal data is accurate having regard to the purpose for which the personal data is or is to be used. Further, the data user must stop using or erase the personal data if there are reasonable grounds for believing that the personal data is inaccurate. We are living in an era where data sources collected for one purpose are sometimes aggregated and re-used and sold via third parties for use and purposes never intended, which raises issues around use in unfair, inaccurate or discriminatory ways.
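DPP 2(1)’s requirement to stop using data believed to be inaccurate can be pictured as a simple guard in a scoring pipeline. The sketch below is a hypothetical illustration, not any real scoring system; the record structure and field names are invented for the example.

```python
# Hypothetical sketch of DPP 2(1): once there are reasonable grounds to
# believe personal data is inaccurate, it must not feed downstream use
# (such as credit scoring) until corrected or erased.
records = {
    "subject-1": {"bill_payment_history": "on time", "accuracy_disputed": False},
    "subject-2": {"bill_payment_history": "late", "accuracy_disputed": True},
}

def usable_for_scoring(subject_id: str) -> bool:
    # Data under a credible accuracy dispute is excluded from the model.
    return not records[subject_id]["accuracy_disputed"]

print([s for s in records if usable_for_scoring(s)])  # ['subject-1']
```

In practice the “dispute” flag would be set by a correction request or an internal accuracy review, and erasure or rectification would follow; the guard simply ensures disputed data cannot silently keep driving decisions.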

Protection against data security risks

DPP 4 in Schedule 1 to the Ordinance provides that all practicable steps must be taken by a data user to ensure that the personal data in its possession is protected against unauthorised or accidental access, processing, erasure, loss or use. Financial institutions’ and FinTech providers’ increased use of electronic payments and open APIs, which involve transmitting personal data electronically among different organisations and end-users, will inevitably increase the risk of data leakage (whether through design or human error) or interception during transmission.
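One basic technical safeguard for data passing between organisations over an open API is message authentication, so the receiver can detect tampering in transit. The sketch below is illustrative only: the payload, key and function names are hypothetical, and a real deployment would layer this on TLS with proper key management rather than a hard-coded key.

```python
import hashlib
import hmac
import json

# Illustrative only: sign an API payload with an HMAC so the receiving
# organisation can detect tampering in transit. The pre-shared key here
# is a placeholder; real systems exchange and rotate keys securely.
SHARED_KEY = b"demo-key-exchanged-out-of-band"

def sign(payload: dict) -> str:
    body = json.dumps(payload, sort_keys=True).encode()  # canonical form
    return hmac.new(SHARED_KEY, body, hashlib.sha256).hexdigest()

def verify(payload: dict, signature: str) -> bool:
    # compare_digest avoids timing side channels when comparing signatures
    return hmac.compare_digest(sign(payload), signature)

msg = {"account": "12345678", "amount": "100.00"}
sig = sign(msg)
print(verify(msg, sig))                           # True: payload intact
print(verify({**msg, "amount": "999.00"}, sig))   # False: tampering detected
```

Integrity checks of this kind address interception and modification in transit; they do not by themselves address leakage by design or human error, which calls for access controls and data minimisation as well.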

The storage of vast amounts of personal data in the databases of financial institutions and their FinTech providers also poses risks, including cybersecurity threats such as hacking and misappropriation. Technical security measures and IT security processes vary across financial institutions and FinTech providers, and a lack of processes, systems and priority for de-identifying and removing personal data after reasonable use exposes them to greater data security risks.
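De-identification before analytics use is one way to reduce the exposure from stored data. A minimal sketch, using a salted hash to replace a direct identifier with a stable pseudonym, might look like this (field names and the record are hypothetical; real pseudonymisation schemes also address re-identification via the remaining attributes):

```python
import hashlib
import secrets

# Illustrative sketch: replace a direct identifier with a salted-hash
# pseudonym before the record enters an analytics data set. The salt
# stands in for a secret kept separately from the analytics copy.
salt = secrets.token_bytes(16)

def pseudonymise(identifier: str) -> str:
    return hashlib.sha256(salt + identifier.encode()).hexdigest()

record = {"phone": "+852 9123 4567", "purchase_total": 420.50}
deidentified = {
    "subject_token": pseudonymise(record["phone"]),  # stable, non-reversible token
    "purchase_total": record["purchase_total"],
}
print("phone" in deidentified)  # False: analytics copy drops the raw number
```

Because the same identifier always maps to the same token under one salt, records can still be linked for analysis without the analytics environment ever holding the raw phone number.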

Correction and erasure or rectification of obsolete or inaccurate personal data

As discussed above, financial institutions and FinTech service providers may be inclined to collect and retain as much personal data as possible, given the increasing corporate value placed on data. However, such stored data may be inaccurate, irrelevant or obsolete, and risks arise if there is no effective mechanism to erase or rectify such data in a timely manner, as required by privacy law.

Section 26 and DPP 2(2) of the Ordinance provide that all practicable steps must be taken by a data user to ensure that the personal data is not kept longer than is necessary for the fulfilment of the purpose of collection (including any directly related purpose). Section 22 and DPP 6(e) of the Ordinance provide individuals with the right to correct their personal data that is inaccurate.
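An “effective mechanism” for retention limits usually means a periodic sweep that erases records once the purpose of collection no longer requires them. The sketch below illustrates the idea; the seven-year figure is purely an example and is not a period prescribed by the Ordinance, which instead ties retention to the fulfilment of the purpose of collection.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical retention sweep. The retention period is illustrative only:
# under the Ordinance the test is whether the purpose of collection (or a
# directly related purpose) still requires the data.
RETENTION = timedelta(days=7 * 365)

now = datetime(2019, 9, 17, tzinfo=timezone.utc)
records = {
    "txn-1": {"collected": datetime(2010, 1, 1, tzinfo=timezone.utc)},
    "txn-2": {"collected": datetime(2018, 6, 1, tzinfo=timezone.utc)},
}

expired = [rid for rid, r in records.items() if now - r["collected"] > RETENTION]
for rid in expired:
    del records[rid]  # in practice: secure erasure, with an audit trail

print(sorted(records))  # ['txn-2']
```

A scheduled job of this shape, paired with an audit trail of what was erased and why, gives the “practicable steps” under DPP 2(2) an operational form.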

Ethical framework for use of personal data in FinTech

In light of the privacy risks posed by the greater use and development of FinTech, the HKMA issued a circular to licensed financial service providers and banks to adopt and implement the Ethical Accountability Framework for the collection and use of personal data issued by the PCPD. The Ethical Accountability Framework was released by the PCPD in October 2018 and aims at achieving ethical and fair processing of personal data, particularly in advanced processing activities such as AI and machine learning, by fostering a culture of ethical data governance and addressing the personal data privacy risks brought about by technology.

The core recommendation of the Ethical Accountability Framework is that organisations which conduct advanced data processing activities should implement ethical data stewardship by adhering to three core ethical values: being respectful (understanding the context for use of data and defining what is reasonable and respectful of the impact on individuals), beneficial (considering the benefits to the individual and society as a whole, as well as the risks), and fair (taking measures to avoid unfairness, discrimination, unequal treatment, and distress to individuals).

The objective of ethical data stewardship is to ensure that the impact on the interests, rights and freedoms of all stakeholders is duly considered and addressed in data processing activities. The PCPD outlined principles for handling and processing data ethically, including “ethics by design”, implementing processes for ethical review, and being able to evidence such processes.

Osborne Clarke comment

While Hong Kong is seen as a hub for FinTech development, privacy issues and risks need to be addressed by financial institutions and FinTech providers to ensure a secure, fair and transparent framework for the collection, use and management of data as part of FinTech development. This is integral both to legal and regulatory compliance and to building greater trust in, and adoption of, FinTech in Hong Kong.

The HKMA’s and PCPD’s recent emphasis on ethical considerations in privacy and in the data used for FinTech signals that these matters will come under increasing scrutiny in the FinTech space as technologies develop and data becomes more widely used, transferred and managed.

With thanks to the research assistance of Jasmine Yung, Privacy and Data Specialist and Trainee Solicitor, Osborne Clarke, Hong Kong. Immediately before joining Osborne Clarke in our Hong Kong Office, Ms Yung spent 6 years as a Personal Data Officer (Acting) at the Office of the Privacy Commissioner for Personal Data, Hong Kong (PCPD).