China: The Internet Information Service Algorithm Recommendation Management Regulations

The Cyberspace Administration of China ('CAC') issued, on 4 January 2022, the Internet Information Service Algorithm Recommendation Management Regulations ('the Regulations') following approval by the Ministry of Industry and Information Technology, the Ministry of Public Security, and the State Administration for Market Regulation. The Regulations entered into force on 1 March 2022. OneTrust DataGuidance breaks down key provisions of the Regulations, featuring insights from Julian Sun, Associate at Taylor Wessing.

Scope

In particular, the Regulations provide that providers of algorithm recommendation services shall, among other things, abide by laws and regulations, and follow the principles of fairness, openness, and transparency (Article 4 of the Regulations). Equally, the Regulations set out that applicable industry organisations should establish and improve industry standards and guidelines, and urge algorithm recommendation service providers to formulate and refine service standards (Article 5 of the Regulations).

Sun commented, "for the first time within the People's Republic of China ('PRC') legal regime, the Regulations provide a clear definition of algorithm recommendation technologies, which refers to the use of algorithmic technologies such as generation and synthesis, personalised push, sorting and selection, retrieval and filtering, scheduling decision-making, [among other things] to provide information to users. Upon our quick observation, a clear definition of algorithm is not very common in other jurisdictions. The Regulations also respond to many detailed types of algorithm abuse in the business sector which may not be touched upon in other jurisdictions. This could mean that the Regulations may be used by international companies as a benchmark to manage and improve their global compliance level as regards algorithms in a more detailed and practical way."

Algorithm service provider obligations

Chapter 2 of the Regulations establishes algorithm recommendation service providers' obligations in relation to personal information protection and data security. More specifically, the Regulations stipulate that algorithm recommendation service providers must implement the main responsibilities for algorithm security, including, among others, the establishment and improvement of management systems for algorithm mechanism review, ethics review, data security, and personal information protection (Article 7 of the Regulations).

In addition, the Regulations outline that algorithm recommendation service providers must regularly review, evaluate, and verify algorithm mechanisms, models, data, and application results, among other things, and must not set up algorithm models that induce users into addiction or excessive consumption (Article 8 of the Regulations).

Sun stated, "algorithms have long been viewed as a black box fully controlled by its operator, who has [large discretion] to decide how to use it to better serve [their] business interests. However, abuse of such technology in the past has led to problems including the infringement of consumers' rights and unfair competition, which are no longer tolerated by the public and authorities. Against such general background, we understand the Regulations focus on transparency and fairness, [with the] aim of restricting the excessive and unreasonable use of algorithms and encouraging proper and transparent use. In such context, the Regulations provide detailed guidance and restrictions that businesses should follow when designing, operating, and monitoring their algorithm services. [Nonetheless,] we do see some more general challenges that could be widely encountered by businesses. For example, transparency principles require the operator of an algorithm to disclose the main operating mechanisms of the algorithm. How to balance this disclosure requirement with confidentiality concerns could be a major issue to be assessed internally."

Protection of user rights

Importantly, the Regulations under Chapter 3 provide for the protection of user rights and interests, aimed at protecting the personal data of data subjects. The Regulations highlight that service providers must provide users with options that are not tailored to their personal characteristics, or provide users with a convenient option to turn off the algorithmic recommendation service. Users must also be provided with the function of selecting or deleting user tags of their personal characteristics used for the algorithmic recommendation service (Article 17 of the Regulations).

Furthermore, the Regulations detail specific protections for minors, namely that, where providing services to minors, algorithm recommendation service providers must perform their obligations regarding the online protection of minors in accordance with the law, and facilitate use by minors by developing models and providing services suitable for the characteristics of minors (Article 18 of the Regulations). In addition, the Regulations establish that algorithmic recommendation service providers must not push information to minors that may cause minors to imitate unsafe behaviours, violate social morality, or develop bad habits, among other things, or that may otherwise affect the physical and mental health of minors (Article 18 of the Regulations).

To this end, Sun outlined, "businesses could face different challenges since various requirements under the Regulations come from many different angles (e.g. transparency, data security, protection of minors, anti-unfair competition) […] Besides, there are still some ambiguities under the Regulations that businesses may find [difficult] to [manoeuvre]. [For example,] the Regulations prohibit any algorithm model in violation of ethics and morals. However, the exact meaning and scope of 'ethics and morals' [is] still not clear [and] the authorities have discretionary [power] to interpret [this] on a case-by-case basis. How to manage these uncertainties could be another challenge generally faced by businesses."

Compatibility with other legislation

Concerning other data protection laws in China, such as the Personal Information Protection Law ('PIPL') and the Data Security Law ('DSL'), Sun concluded, "In a nutshell, the Regulations can be generally understood as more detailed implementing rules based on Article 24 of the PIPL as regards automated decision making. Besides [this], the Regulations also inherit other security principles and mechanisms from the DSL and the Cyber Security Law, and create their counterparts in the field of algorithms (e.g. the classified security management system for algorithms)."

Further to the above, Article 24 of the PIPL provides that personal information processors using personal information for automated decision making must ensure the transparency of the decision making and the fairness and impartiality of the results, and may not apply unreasonable differential treatment to individuals in terms of transaction prices and other transaction conditions.

Penalties

The Regulations also note that, where no laws or administrative regulations provide otherwise, relevant departments may issue warnings, circulate criticism, or order corrections within a time limit for violations of certain provisions. Furthermore, the Regulations outline that, in case of refusal to make corrections, algorithm recommendation service providers may be ordered to suspend information updates and be fined between RMB 10,000 (approx. €1,430) and RMB 100,000 (approx. €14,260) (Article 31 of the Regulations).

Harry Chambers Privacy Analyst
[email protected]

Comments provided by:

Julian Sun Associate
[email protected]
Taylor Wessing, Shanghai