SENATE DOCKET, NO. 1313 FILED ON: 1/16/2025
SENATE . . . . . . . . . . . . . . No.
The Commonwealth of Massachusetts
_________________
PRESENTED BY:
John C. Velis
_________________
To the Honorable Senate and House of Representatives of the Commonwealth of Massachusetts in General
Court assembled:
The undersigned legislators and/or citizens respectfully petition for the adoption of the accompanying bill:
An Act relative to social media, algorithm accountability, and transparency.
_______________
PETITION OF:
Name: | District/Address:
John C. Velis | Hampden and Hampshire
The Commonwealth of Massachusetts
_______________
In the One Hundred and Ninety-Fourth General Court
(2025-2026)
_______________
An Act relative to social media, algorithm accountability, and transparency.
Be it enacted by the Senate and House of Representatives in General Court assembled, and by the authority of the same, as follows:
SECTION 1. Chapter 12 of the General Laws, as so appearing, is hereby amended by inserting after section 35 the following section:-
Section 36. (a) As used in this section the following words shall, unless the context clearly requires otherwise, have the following meanings:-
“Algorithm”, a computational process that uses machine learning, natural language processing, artificial intelligence techniques, or other computational processing techniques of similar or greater complexity and that makes a decision or facilitates human decision-making with respect to users’ personal information, including to determine the provision of products or services or to rank, order, promote, recommend, amplify or similarly determine the delivery or display of information to an individual. For purposes of this section, an algorithm shall refer to recommendation algorithms, also known as engagement-based algorithms, which passively populate a user’s feed or experience with content without any direct action or request by the user.
“Children”, consumers under 18 years of age.
“Covered platform”, an internet website, online service, online application, or mobile application, including, but not limited to, a social media platform, that conducts business in this state or that produces products or services that are accessed by residents and that during the preceding calendar year: (1) controlled or processed the personal information of not less than one hundred thousand consumers, excluding personal information controlled or processed solely for the purpose of completing a payment transaction; or (2) controlled or processed the personal information of not less than twenty-five thousand consumers and derived more than twenty-five per cent of its gross revenue from the sale of personal data.
“Consumer”, a natural person who is a Massachusetts resident, however identified, including by any unique identifier.
“Independent third-party auditor”, an auditing organization that has no affiliation with a covered platform as defined by this section.
“Likely to be accessed”, a reasonable expectation, based on the following factors, that a covered platform would be accessed by children: (1) the covered platform is directed to children as defined by the Children’s Online Privacy Protection Act (15 U.S.C. Sec. 6501 et seq.); (2) children are determined, based on audience composition, to comprise at least 10 per cent of the covered platform’s audience; (3) the covered platform is paid for advertisements on its platform that are marketed to children; (4) the covered platform is substantially similar to or the same as a covered platform that satisfies clause (2); and (5) a significant amount of the audience of the covered platform, 10 per cent or more, is determined, based on internal company research, to be children.
"Process" or "processing", any operation or set of operations performed, whether by manual or automated means, on personal information or on sets of personal information, such as the collection, use, storage, disclosure, analysis, deletion or modification of personal information.
“Personal information”, information linked or reasonably linkable to an identified or identifiable individual.
“Social media platform”, a public or semipublic internet-based service or application that has users in Massachusetts and that meets both of the following criteria: (1) a substantial function of the service or application is to connect users and allow users to interact socially with each other within the service or application; provided, that an internet-based service or application that provides email or direct messaging services shall not be considered to meet this criterion on the basis of that function alone; provided further, that a service or application that is an internet search engine, or a website whose primary focus is e-commerce, which would include the buying, selling, or exchange of goods or services over the internet, including business-to-business, business-to-consumer, and consumer-to-consumer transactions, shall not be considered to meet this criterion on the basis of that function alone; and (2) the service or application allows users to: (i) construct a public or semipublic profile for purposes of signing into and using the service or application; (ii) populate a list of other users with whom an individual shares a social connection within the system; and (iii) create or post content viewable by other users, including, but not limited to, on message boards, in chat rooms, or through a landing page or main feed that presents the user with content generated by other users.
“Experts in the mental health and public policy fields”, (1) academic experts, health professionals, and members of civil society with expertise in mental health, substance use disorders, and the prevention of harms to minors; (2) representatives in academia and civil society with specific expertise in privacy and civil liberties; (3) parents and youth representation; (4) representatives of the national telecommunications and information administration, the national institute of standards and technology, the federal trade commission, the office of the attorney general of Massachusetts, and the Massachusetts executive office of health and human services; (5) state attorneys general or their designees acting in state or local government; and (6) representatives of communities of socially disadvantaged individuals as defined in section 8 of the Small Business Act, 15 U.S.C. 637.
(b) There shall be an office of social media transparency and accountability, which shall be supervised and controlled by the office of the attorney general. The office shall receive, review and maintain reports from covered platforms, enforce this section, and adopt regulations to clarify the requirements of this section.
(c) Annually before January 1, covered platforms shall register with the office by providing: (i) a registration fee, determined by the office of the attorney general; (ii) the platform’s name; (iii) physical address; (iv) email address; and (v) internet address.
(d) The office shall compile a list of approved, independent third-party auditors and shall assign independent third-party auditors to conduct algorithm risk audits of covered platforms. Risk audits shall be conducted monthly by third-party auditors, unless specified otherwise by the office. Audits and associated costs shall be paid for by covered platforms. The algorithm risk audits shall focus on harms to children, including but not limited to: (i) mental health disorders including anxiety, depression, eating disorders, substance abuse disorders, and suicidal behaviors; (ii) patterns of use that indicate or encourage addiction-like behaviors; (iii) physical violence, online bullying, and harassment of minors; (iv) sexual exploitation and abuse; (v) promotion and marketing of narcotic drugs as defined in section 102 of the Controlled Substances Act, 21 U.S.C. 802, tobacco products, gambling, or alcohol; and (vi) predatory, unfair or deceptive marketing practices, or other financial harms.
(e) Annually before January 1, the office shall empanel an Advisory Council of experts in the mental health and public policy fields, as defined in this section, to review the harms identified in subsection (d) and identify additional ways covered platforms cause harms to children.
(f) Annually before July 1, the office shall promulgate regulations, based on the compilation of potential harms identified by the Advisory Council, that update the specific harms that must be examined by the algorithm risk audits required under this section.
(g) Beginning on January 1, 2026, covered platforms shall annually submit transparency reports to the office containing, but not limited to: (i) assessment of whether the covered platform is likely to be accessed by children; (ii) description of the covered platform’s commercial interests in use of the platform by children; (iii) number of individuals using the covered platform reasonably believed to be children in the United States, disaggregated by the age ranges of 0-5, 6-9, 10-12, 13-15 and 16-17 years; (iv) median and mean amounts of time spent on the covered platform by children in the United States who have accessed the platform during the reporting year on a daily, weekly and monthly basis, disaggregated by the age ranges of 0-5, 6-9, 10-12, 13-15 and 16-17 years; (v) description of whether and how the covered platform uses system design features to increase, sustain, or extend use of a product or service by users, including automatic playing of media, rewards for time spent and notifications; (vi) description of whether, how and for what purpose the covered platform collects or processes personal information that may cause reasonably foreseeable risk of harm to children; (vii) total number of complaints received regarding, and the prevalence of issues related to, the harms described in subsection (d), disaggregated by category of harm; and (viii) description of the mechanism by which the public may submit complaints, the internal processes for handling complaints, and any automated detection mechanisms for harms to children, including the rate, timeliness, and effectiveness of responses.
(h) By January 1, 2027, covered platforms shall submit preliminary reports to the office. The preliminary report must measure the incidence of each of the specific harms identified in subsection (d) that occur on the covered platform. The office must consult with independent third-party auditors and covered platforms to determine what data shall be used to produce the preliminary reports.
After a covered platform has submitted a preliminary report, the covered platform may agree that the office will consult with independent third-party auditors and the covered platform to set benchmarks the covered platform must meet to reduce the harms identified in subsection (d) on its platform, as indicated in the preliminary reports required under this section. Upon agreement, each covered platform shall thereafter produce biannual reports containing, but not limited to: (i) steps taken to mitigate harm on its platform, including implementation of any systems used to meet benchmarks; and (ii) measurements indicating the reduction in harm as a result of these systems.
In the event the covered platform has failed to meet the benchmarks, upon agreement, its annual report must also contain: (1) a mitigation plan detailing changes the platform intends to make to ensure future compliance with benchmarks; and (2) a written explanation regarding the reasons the benchmarks were not met.
If a covered platform chooses not to consult with independent third-party auditors to set benchmarks it must meet to reduce the harms identified in subsection (d) on its platform, as indicated in the preliminary reports required under this subsection, the attorney general is not precluded from pursuing any other legal remedy available at law to mitigate harms.
(i) The records generated by this section shall be subject to chapter 66 of the General Laws and shall be made accessible to the public on the attorney general’s website. However, to the extent any information contained within a report required by this section is trade secret, proprietary or privileged, covered platforms may request that such information be redacted from the copy of the report that is obtainable under the public records law and on the attorney general’s website. The office shall conduct a confidential, in camera review of requested redactions to determine whether the information is trade secret, proprietary or privileged information that should not be made accessible for public review. All information from the copy of the report submitted to the office, including redactions, shall be maintained by the covered platform in its internal records.
(j) A covered platform shall be considered in violation of this section if it: (i) fails to register with the office; (ii) materially omits or misrepresents required information in a submitted report; or (iii) fails to timely submit a report to the office.
(1) A covered platform in violation of this section is subject to an injunction and liable for a civil penalty not to exceed $500,000 per violation, which shall be assessed and recovered in a civil action brought by the attorney general. In assessing the amount of a civil penalty pursuant to this section, the court shall consider whether the covered platform made a reasonable, good faith attempt to comply with the provisions of this section. Any penalties, fees, and expenses recovered in an action brought under this section shall be collected by the office of the attorney general with the intent that they be used to fully offset costs in connection with the enforcement of this section and to promote the positive mental health outcomes of the children of Massachusetts.
(k) If any provision of this section, or any application of such provision to any person or circumstance, is held to be unconstitutional, the remainder of this section and the application of this section to any other person or circumstance shall not be affected.