Industry practice and improvement of children’s online protection in mobile Internet applications

The “Law of the People’s Republic of China on the Protection of Minors (2020 Revision)” took effect on June 1, 2021. One of its highlights is a new dedicated chapter on “Network Protection,” responding to the urgent need to protect minors in the digital age. Children, as an especially vulnerable group among minors, deserve particular attention regarding the effectiveness of their online protection. Taking children’s online protection as the main thread and mobile Internet applications as the entry point, and surveying the child-protection practices of mainstream mobile Internet applications at home and abroad in recent years, one finds that current measures mainly implement anti-addiction and content-management functions, while privacy protection and age recognition, as the development direction and practical difficulty of children’s online protection, still need improvement at both the institutional and the compliance level.

0 Preface

The “Blue Book for Youth: Report on Internet Use by Minors in China (2020),” released in September 2020, shows that the Internet penetration rate among minors in China has reached 99.3%, significantly higher than the country’s overall Internet penetration rate (64.5%). While this large cohort of “digital natives” enjoys efficient and convenient digital technology, how to protect their personal rights and digital rights is an important topic worthy of attention and discussion.

For a series of emerging digital social problems such as algorithmic bias, information cocoons, and personal information protection, there is still no clear and complete solution even in discussions concerning adults, and the protection of minors (i.e., natural persons under the age of 18) faces new challenges as well. Among these, the issues concerning children (i.e., natural persons under the age of 14) are even more complicated: how to effectively implement the parental guardianship mechanism in the online environment, how to balance guardianship with children’s autonomy, and how general-purpose applications (i.e., apps aimed at the general public) can accommodate children’s unique needs in a guardianship mode. Yet it is precisely these complexities that highlight the contemporary value of exploring methods and paths for child protection in the digital age. This article starts from the issue of children’s online protection, briefly reviews the child-protection practices of major mobile Internet applications (Apps) at home and abroad in recent years, and summarizes the experience for reference.

1 Main Practices of Child Protection in Mobile Internet Applications (Apps)

From the perspective of industry practice, there are two main approaches to child protection in mobile Internet applications (Apps). The first is to embed a “youth mode” in a general-purpose app; at present this function is common in China’s audio/video, social, and live-streaming apps. The second is to offer two independent apps, a general version and a children’s version. Foreign products such as Facebook, Amazon, YouTube, Spotify, and PBS are accompanied by corresponding independent children’s (Kids) versions. China has similar distinctions among content apps, with dedicated children’s apps such as “Little Penguin Paradise” (Tencent Video Children’s Edition), “Little Youku” (Youku Video Children’s Edition), and “iQiyi Qibabu” (iQiyi Children’s Edition).

Attempts at a “youth mode” in Chinese apps began in 2019. On March 28, 2019, the Cyberspace Administration of China launched the first batch of “youth mode” pilots, organizing short-video platforms such as Douyin, Kuaishou, and Volcano Video to launch a youth anti-addiction system. On May 28, 2019, building on the pilot experience, the Cyberspace Administration of China continued to promote the “youth mode” in video applications: 14 short-video platforms, including Bilibili, Weishi, and Weibo, and 4 online video platforms, including Tencent Video, iQiyi, Youku, and PP Video, also launched a “youth mode.” In October of the same year, another 24 online live-streaming platforms and 9 online video platforms launched this function, bringing the total number of platforms offering a youth mode to 53. Since then, the number of applications with a youth mode has gradually increased, and its features and product design have been continuously optimized in practice.

Similar to the youth mode commonly used in Chinese apps, some apps in foreign markets also adopt a built-in age-based design, giving special consideration to product functions and privacy policies when apps are used by minors. In addition, for “children” as a group specially protected by law, operators in Europe and the United States tend in practice to develop dedicated children’s versions of their apps. From the perspective of user interaction, such children’s apps can often improve children’s experience through tailored product design, and they also make it easier for parents to manage apps on electronic devices dedicated to children.

2 The main functions, development and difficulties of children’s network protection

2.1 The main functional goals of the current “Youth Mode”

Judging from the specific features of the youth mode in Chinese apps, content management and anti-addiction remain its main design goals, which mainly comprise the following functions:

Switch-back restriction. At present, whether to enter the youth mode is mainly decided by the user; however, switching back from youth mode to normal mode requires a password, allowing parents and other guardians to set a threshold that restricts minors from returning to normal mode.

Pop-up reminder, that is, when the app is launched for the first time each day, a pop-up window asks whether to switch to youth mode.

After entering the youth mode, the following child protection functions are generally activated:

Time limits, that is, the app cannot be used within a fixed time window (currently 22:00 to 6:00 the next day), and cannot be used after a certain amount of daily use (currently a fixed number of minutes) unless the user re-authenticates by entering a password;

Function limits, that is, functions inappropriate for minors are disabled, such as interactive features like watching live streams, posting videos, sending messages and comments, and following specific bloggers, as well as monetary features such as tipping and in-app stores;

Content limits, that is, content unsuitable for minors is neither retrievable nor pushed to users in youth mode, and only content suitable for young people can be accessed, such as popular science, calligraphy and painting, and course learning.
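The time-limit rules above can be sketched as a simple session gate. The curfew window follows the 22:00–6:00 rule described in the text; the 40-minute daily cap is a hypothetical value chosen for illustration, since the exact figure varies by app:

```python
from datetime import datetime, time, timedelta

CURFEW_START = time(22, 0)          # youth mode blocks use from 22:00...
CURFEW_END = time(6, 0)             # ...until 06:00 the next day
DAILY_CAP = timedelta(minutes=40)   # hypothetical cap; the actual figure varies by app

def youth_mode_allows(now: datetime, used_today: timedelta) -> bool:
    """Return True if a youth-mode session may continue right now."""
    t = now.time()
    # The curfew window spans midnight, so it is the union of two half-windows.
    in_curfew = t >= CURFEW_START or t < CURFEW_END
    return (not in_curfew) and used_today < DAILY_CAP
```

In a real app, a denial from this check would trigger the password prompt described above rather than a hard block, so that a guardian can re-authorize use.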

At present, the youth mode in Chinese apps is a concrete manifestation of companies and regulators actively exploring online protection of minors under the relevant legal framework, and its product design centered on content management and anti-addiction is consistent with the requirements of current legal norms. Since the State Council issued the Measures for the Administration of Internet Information Services in 2000, managing illegal and harmful content has been one of the important responsibilities of Internet service providers. Judging from the statements of the Cyberspace Administration of China at the start of the “youth mode” pilot, the feature was then also called the “youth anti-addiction system.” The corresponding legal provisions point the same way: for example, Article 74 of the “Law of the People’s Republic of China on the Protection of Minors” requires that “network service providers of online games, online live streaming, online audio and video, online social networking, and the like shall set up corresponding time-management, permission-management, consumption-management, and other functions for minors using their services.”

2.2 The development direction of children’s online protection

It is worth affirming that the youth mode has a positive effect on protecting young people in the online world. Beyond the anti-addiction and age-appropriate-content goals achieved through time and content restrictions, ensuring the safety of minors’ personal information is also a direction in which the youth mode can continue to develop. At present, the youth modes of some apps already achieve a certain degree of personal-information protection. For example, WeChat’s youth mode disables options such as “Live and Nearby” and “Shake,” and the “Same City” function is unavailable in Douyin’s youth mode, thereby restricting the processing of such sensitive information about young users.

In addition, some applications already spell out in their privacy policies the specifics of personal-information processing in youth mode. For example, besides explaining the time limits of its youth mode, “Sohu Video” states that it only collects the user’s viewing-history information, does not specifically identify the user’s age, and does not use the data for personalized recommendations to children or for other commercial purposes; the children’s privacy policy of “Douyin” states that minors’ personal information will not be used for any commercial purpose, including but not limited to commercial promotion and marketing; and some apps note that the purposes of collection and use in youth mode differ from those in normal mode.

Similar to domestic practice, foreign apps often use age-specific design to meet compliance requirements for protecting children’s personal information: one approach is to develop an independent version specifically for children; another is to set up a special area for minors within a general version that does not target minors (including children) as its only audience. A further development is that, taking into account the characteristics and needs of minors of different ages, some apps adopt “refined age classification” in compliance practice, providing different user rules and corresponding product features for minors in different age groups. In addition, many foreign apps provide dedicated features for guardians (parents) to manage the privacy and security settings of minors’ accounts and to report risky content that minors encounter in the app.

2.3 Practical difficulties of children’s online protection

The first difficulty facing child protection in the digital world is that, absent the cues of physical space, it is hard to identify children among general users. However, indiscriminately requiring all online service providers to implement age-identification measures, and requiring the corresponding parents to fulfill guardianship obligations, would make it difficult to avoid excessive data collection and would significantly increase data-security risks.

In terms of age identification at the app-access stage, China’s “youth mode” in effect leaves to the user the question of whether children’s online protection applies, while the current mainstream practice of European and American apps is for the operator to perform the identification, without attempting to collect the user’s precise age. Taking European and American children’s apps as an example, one approach is for the “parent” to actively declare the child user’s age to the app, a declaration step that is neither mandatory nor required to be completely accurate; another is to offer several age bands, from which the “parent” simply selects the one the child belongs to. In response to the so-called problem of “minors bypassing the age wall by lying about their age,” foreign apps in practice rely more on parental control beforehand and declaration afterwards than on collecting minors’ sensitive personal information, such as biometric or identity data, to determine a user’s age. Even the verification of “parent” status is not as strict as one might imagine: sometimes passing a simple knowledge test is enough.

For example, when YouTube Kids is first launched, child users cannot log in by themselves (the app prompts “Get a parent to unlock the App”); a parent must log in in a prescribed way (the first login uses self-declaration of the parent’s age; on subsequent visits to the settings page, parental status is verified through a multiplication-table arithmetic question, and a four-digit passcode can be set). After logging in, the “parent” user selects an appropriate age band for the child (0–4, 5–7, or 8–12) and need not provide an exact age.
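The gate just described can be sketched as follows. The function names and the choice of random single-digit products are illustrative assumptions, not YouTube’s actual implementation, but the design point carries over: the check tests knowledge an adult is presumed to have rather than collecting any identity data, and the age band replaces an exact birthday:

```python
import random

AGE_BANDS = ("0-4", "5-7", "8-12")   # the bands YouTube Kids offers at setup

def make_challenge():
    """Generate a multiplication-table question as a lightweight adult check."""
    a, b = random.randint(2, 9), random.randint(2, 9)
    return f"{a} x {b} = ?", a * b

def pass_gate(response: int, expected: int) -> bool:
    # The gate verifies arithmetic, not identity: no personal data is collected.
    return response == expected

def choose_band(index: int) -> str:
    """The parent picks an age band for the child; no exact birthday is needed."""
    return AGE_BANDS[index]
```

A gate like this is trivially defeatable by an older child, which is precisely the trade-off the text describes: it favors minimal data collection and parental responsibility over strong verification.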

When Spotify Kids is first launched, only a family manager who has purchased Premium Family (Spotify’s paid family plan) can create a Spotify Kids account. Creating a Spotify Kids account requires a parent-controlled four-digit PIN; after the PIN is set, the flow proceeds to child-account creation. The creation page offers two fields, “Child’s Name” and “Birthday (optional),” along with a description of why this information is collected and a link to the privacy policy. Even if “Birthday” is left blank, the flow proceeds to an age-band selection page (with the two options “0–6” and “5–12”), which aims to provide children with appropriately classified songs suitable for them to listen to.

When Messenger Kids is first launched, only a Facebook account (which users under the age of 13 cannot obtain) grants access to the Messenger Kids account-creation page, where a name can be set for the child (a real name is not required) along with a birthday (which may be skipped); the flow then proceeds to a privacy-policy prompt page. After the user enters the app, the children’s versions also embody a series of compliance designs. For example, the video app YouTube Kids has no commenting, user-uploaded short video, or live-streaming functions; “YouTube Kids Notes for Children” states: “YouTube Kids does not allow you to share personal information with others or release personal information to the public.” The music app Spotify Kids likewise does not enable comments, user uploads, or similar functions, providing children with a relatively closed app environment from which information is unlikely to flow out of the user’s side.
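This “closed environment” pattern amounts to a per-profile capability allow-list. A minimal sketch follows; the profile and feature names are hypothetical, but they mirror the restrictions described above, in which a child profile simply has no outbound-sharing capabilities:

```python
# Hypothetical capability table for a Kids-style app: the child profile is
# granted no action that could move personal information off the device.
FEATURES = {
    "standard": {"watch", "search", "comment", "upload", "live", "share"},
    "child": {"watch", "search"},   # comments, uploads, live streaming absent
}

def allowed(profile: str, action: str) -> bool:
    """Deny by default: an action is permitted only if it is on the allow-list."""
    return action in FEATURES.get(profile, set())
```

The deny-by-default design matters here: a feature missing from the table is unavailable, so newly added sharing features do not leak into child profiles unless explicitly granted.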

On the issue of identifying child users, the identification mechanisms of European and American children’s apps (which should be distinguished from the more heavily regulated game sector) remain relatively loose, balancing privacy protection (avoiding excessive collection of children’s personal information), business costs, and operational feasibility. On the issue of guardian management, European and American children’s apps strengthen parental control, on the view that parents should fulfill their supervisory responsibilities for child users, and develop product features accordingly. However, whether reliance on parental supervision is effective, and how to ensure that children (or minors more broadly) can participate moderately in the online world and use technological devices while remaining compliant, are still matters of intense debate.

3 Improvement of children’s network protection in mobile Internet applications (Apps)

3.1 Institutional coordination of privacy protection in children’s online protection functions

From the perspective of institutional origins, the development of “children’s apps” in Europe and the United States is also a response to the compliance requirements that relevant laws and regulations impose on operators collecting and processing children’s personal information. For example, the US Congress passed the Children’s Online Privacy Protection Act (COPPA) in 1998 to regulate the collection, use, and disclosure of children’s personal information by commercial websites and online service providers. In the following two decades, the Federal Trade Commission (FTC), as its designated enforcement agency, successively issued and continuously revised a series of regulations and guidance documents, including the COPPA Rule, to guide the protection of children’s information in practice. The UK Information Commissioner’s Office likewise released the Age Appropriate Design: A Code of Practice for Online Services in 2020 to guide online service providers in protecting children from privacy intrusions in the Internet age. Under these laws and regulations, relevant Internet service providers in each country must take into account the compliance requirements of both content management and privacy protection.

China’s personal-information-protection legal system is still being built: with Chapter VI “Privacy Rights and Personal Information Protection” of the Civil Code of the People’s Republic of China as a starting point, supporting legislation on personal information protection in various departments and localities is also in progress. On the handling of minors’ personal information, provisions such as Article 31 of the Civil Code of the People’s Republic of China, the Personal Information Protection Law of the People’s Republic of China, Article 72 of the Law of the People’s Republic of China on the Protection of Minors, and the Provisions on the Cyber Protection of Children’s Personal Information have generally clarified the rules, but how to integrate and coordinate them within the “youth mode” to achieve proper privacy protection still needs more detailed exploration.

For example, some critics argue that children will bypass the “age wall” and the “youth mode,” and that parents may even help children “lie,” and they therefore advocate collecting more children’s personal information to achieve age identification. However, as noted above, in the face of behaviors such as false reporting of age, a better approach may be to improve reporting mechanisms and after-the-fact investigation measures rather than to collect children’s personal information arbitrarily. On a cost–benefit view, apps outside strong regulatory requirements need not probe age information excessively beforehand; otherwise the collection, use, and storage of this sensitive information will itself create new difficulties. Verifying after a report appears may be the more beneficial way to protect minors’ personal information.

3.2 Segmentation and Types of Different Apps in Compliance Requirements such as Age Recognition

Third-party organizations have already evaluated the privacy clauses and personal-information-protection policies of various apps. The privacy-policy evaluation project of the non-profit organization Common Sense Media has tracked and analyzed the information-protection compliance of many apps aimed at minors or exclusively at children, providing an evaluation index for reference. In China, Southern Metropolis Daily and the Nandu Minor Network Protection Research Center also released the “Short Video and Live Streaming App Youth Protection Evaluation Report” in April 2021.

The relevant report data show, first, that mainstream apps have a higher level of privacy protection for minors. In overall performance, compared with apps developed by small and micro companies for minors, mainstream apps developed by large companies show a higher level of compliance. This result does not rule out the effect of supervision from all sides, or the fact that these mainstream large companies also have greater capacity to comply with children’s personal-information protection. Second, compliance requirements and compliance difficulty differ across app types. Because of their high interactivity and the greater exposure of personal information (such as communications and facial information), social and live-streaming apps require more complex and detailed compliance design, whereas apps that provide simple video-viewing or music-listening functions can achieve privacy-protection compliance relatively easily.

Therefore, drawing on the practical experience of the European Union’s General Data Protection Regulation, which has been criticized for imposing the same data-protection obligations on enterprises of different sizes, a more appropriate compliance system for children’s online protection can be established according to company size, business type, and similar factors. For example, with reference to the practices of YouTube, Spotify, and Facebook described above, to the US laws and regulations on children’s online privacy, and to the UK’s Age Appropriate Design code, more suitable compliance standards can be formulated for audio/video, social, and other apps with different functions and audiences, which face different risks in age identification, access, and subsequent use, so as to maximize the benefit of children’s online protection.

4 Conclusion

The youth mode now common in Chinese Internet app practice can be regarded as a beneficial exploration of children’s online protection. However, given that this mode starts from function restriction and content management on an anti-addiction basis, how it will next connect with functions such as the protection of minors’ personal information under relevant laws such as the Personal Information Protection Law of the People’s Republic of China remains to be seen. From an international perspective, both China, which is building its children’s online protection mechanism, and the United States, whose children’s online privacy protection mechanism has operated for more than 20 years, face further questions such as user age identification, operator knowledge standards, and the effectiveness of guardian management. It should be noted that children should neither be expected to act as adults with independent control over their online behavior nor be framed as purely passive subjects of restriction. Children’s online protection needs to start from different application scenarios and functional categories, propose more diverse and sustainable solutions, and establish a healthy and workable allocation of rights and obligations among the state, enterprises, families, and minors, protecting children’s ability to access interactive content on the Internet while protecting their personal information.
