As a former Special Agent with the Federal Bureau of Investigation who investigated cybercrimes involving children, I know from experience that the topic of increasing online protections for minors provokes intense debate among law enforcement, social services, parents, and civil rights communities.
Often the discussions have focused on how to preserve the positive impact of the internet while addressing its negative aspects, such as the facilitation of cyberbullying, narcotics trafficking, and various forms of exploitation. While others continue the debate, Texas has stepped beyond it and enacted a new regulatory regime intended to shield minors from viewing certain materials and to limit the collection and use of their data.
On July 13, Texas enacted the Securing Children Online through Parental Empowerment (SCOPE) Act, which takes aim at “digital service providers,” defined as owners or operators of a website, application, program, or software that has the purpose of collecting and processing “personal identifying information.” The definition applies to providers that can control the means of collecting and processing the data used by their digital services.
“Personal identifying information” includes the traditional categories of data that can be “linked or [are] reasonably linkable to an identified or identifiable individual.” Texas expands this definition by including pseudonymous information, such as a fake username, when the controller or processor can reasonably link the data with internal records to compile a minor’s true identity.
The SCOPE Act appears to focus on social media platforms such as 4Chan, Discord, and other popular platforms, applications, and websites that contain content appealing to minors. However, the new law also appears broad enough to apply to larger social media sites such as X (Twitter), Snapchat, YouTube, and other online forums that allow users to interact socially, maintain public or semi-public profiles, and post or create content for sharing. The Act excludes many financial, educational, small business, and government entities, as well as digital services that provide access to “news, sports, commerce, or content primarily generated or selected by the digital service provider.” There is also an exclusion for search engines and companies that transmit data through browser services.
The end of the simple “Are you over 18?” splash screen
For those “digital service providers” that fall under the SCOPE Act, Subchapter B articulates numerous duties, including a requirement to register all users and their ages. If the user is a minor and does not register as a minor, the Act gives “verified parents” the opportunity to challenge the registration. A “verified parent” is defined as an adult “whose identity and relationship to the minor have been verified by a digital service provider.”
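For illustration only, here is a minimal sketch of what an age-registration record and a verified-parent challenge might look like in practice. The type names, fields, and logic are assumptions made for this example; the Act does not prescribe any particular data model.

```typescript
// Hypothetical data model for age registration under the SCOPE Act.
// Nothing here is statutory language; it only illustrates the concepts of
// registering an age and letting a verified parent challenge the registration.

type Registration = {
  userId: string;
  declaredAge: number;        // age the user declares at sign-up
  registeredAsMinor: boolean; // true when declaredAge is under 18
};

type VerifiedParent = {
  parentId: string;
  minorUserId: string; // the minor this parent has been verified for
  verifiedAt: Date;    // when identity and relationship were verified
};

// A verified parent challenges a registration that was not made as a minor.
function challengeRegistration(reg: Registration, parent: VerifiedParent): Registration {
  if (parent.minorUserId !== reg.userId) {
    throw new Error("Parent is not verified for this user");
  }
  // After a successful challenge, treat the account as a known minor.
  return { ...reg, registeredAsMinor: true };
}
```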
Restrictions on use of a minor’s data
Subchapter B places restrictions on the collection, sale, disclosure, and usage of data from minors. Purchases and financial transactions involving minors are prohibited by the Act. Notably, the SCOPE Act does not further define “purchases” or “financial transactions,” so even microtransactions – which are commonly offered as “in-app purchases” in online and mobile games – might be unlawful under the Act. However, microtransactions might be excluded if their “chat, comment, or other interactive functionality” is “incidental to the digital service.” The word “incidental” is not further defined by the Act, so it is open to interpretation.
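As a purely illustrative sketch, a conservative reading of the purchase restriction might be implemented as a simple gate on transactions for accounts registered as minors. The treatment of microtransactions below is an assumption, since the Act does not define “purchases” or “financial transactions.”

```typescript
// Hypothetical purchase gate for known minors; the categories and logic are
// illustrative assumptions, not requirements spelled out in the Act.

type PurchaseRequest = {
  userId: string;
  amountCents: number;
  kind: "subscription" | "one-time" | "in-app"; // "in-app" covers microtransactions
};

function canProcessPurchase(isKnownMinor: boolean, req: PurchaseRequest): boolean {
  // Conservative reading: block every financial transaction for a known minor,
  // including low-value in-app purchases.
  if (isKnownMinor) {
    return false;
  }
  return true;
}
```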
Duty to define “harmful materials”
Another duty articulated by the SCOPE Act is to prevent harm to minors. “Digital service providers” are required to “develop and implement a strategy to prevent the known minor’s exposure to harmful materials.” The definition of “harmful materials” incorporates Section 43.24 of the Texas Penal Code (“material whose dominant theme taken as a whole: (A) appeals to the prurient interest of a minor, in sex, nudity, or excretion; (B) is patently offensive to prevailing standards in the adult community as a whole with respect to what is suitable for minors; and (C) is utterly without redeeming social value for minors.”).
However, the Act also identifies specific categories of materials which “promote[], glorif[y], or facilitate[]” subject matter such as suicide, self-harm, eating disorders, substance abuse, stalking, bullying, harassment, grooming, trafficking, child pornography, and other types of sexual exploitation and abuse.
Duty to filter or censor “harmful materials” and adult-oriented advertising
The SCOPE Act requires providers to filter out “harmful materials” through human monitoring, databases of keywords, and algorithm code, which must be made available to “independent security researchers.” Pursuant to the Act, “digital service providers” should also filter out advertising for products, services, or activities that would be unlawful for a minor to consume, use, or engage in (alcohol and tobacco are examples).
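For illustration, the keyword-based component of such a filtering strategy could look something like the sketch below. The keyword list, advertising categories, and matching logic are placeholders; an actual strategy under the Act would also rely on human monitoring and more sophisticated algorithms.

```typescript
// Hypothetical keyword filter for content and advertising shown to known minors.
// The lists below are placeholders, not a complete or compliant taxonomy.

const HARMFUL_KEYWORDS: string[] = ["self-harm", "eating disorder"]; // placeholder terms
const AGE_RESTRICTED_AD_CATEGORIES = new Set(["alcohol", "tobacco", "gambling"]);

// Flag content for a known minor if it matches any keyword in the database.
function blockContentForMinor(content: string): boolean {
  const text = content.toLowerCase();
  return HARMFUL_KEYWORDS.some((keyword) => text.includes(keyword));
}

// Flag advertising for products a minor cannot lawfully consume or use.
function blockAdForMinor(adCategory: string): boolean {
  return AGE_RESTRICTED_AD_CATEGORIES.has(adCategory.toLowerCase());
}
```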
Duty to allow access and controls for parents
When verifying a minor’s parent or guardian, a “digital service provider” can use a “commercially reasonable method.” The law does not provide examples of such methods or explain how providers should determine whether a parent has actual custodial authority. Once verified, however, the SCOPE Act requires these providers to offer parental tools to monitor a minor’s usage of any owned or operated “digital service.”
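As a final illustrative sketch, the parental tools contemplated by the Act might be exposed as a simple set of account-level controls. The field names and defaults are assumptions for this example, not statutory terms.

```typescript
// Hypothetical parental controls surfaced to a verified parent.
// Fields and defaults are illustrative assumptions only.

type ParentalControls = {
  minorUserId: string;
  dailyTimeLimitMinutes?: number;  // optional usage cap set by the parent
  purchasesDisabled: boolean;      // financial transactions off by default
  directMessagingRestricted: boolean;
  dataCollectionLimited: boolean;  // limit collection and use of the minor's data
};

// Defaults applied when a parent is first verified for a minor's account.
function defaultControls(minorUserId: string): ParentalControls {
  return {
    minorUserId,
    purchasesDisabled: true,
    directMessagingRestricted: true,
    dataCollectionLimited: true,
  };
}
```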
Next steps for “Digital Service Providers”
Although it is foreseeable that the SCOPE Act and similar regulatory regimes will face legal challenges over their application and enforcement, business entities that might qualify as “digital service providers” would be well advised to work proactively with their data privacy experts to assess their specific circumstances and operations. While only the Attorney General of Texas can seek civil penalties for violations, the Act also allows private individuals to seek a declaratory judgment or an injunction against “digital service providers.” The SCOPE Act will take effect September 1, 2024.