HOUSE DOCKET, NO. 3448 FILED ON: 2/19/2021
HOUSE . . . . . . . . . . . . . . . No. 4029
The Commonwealth of Massachusetts
_________________
PRESENTED BY:
David Henry Argosky LeBoeuf
_________________
To the Honorable Senate and House of Representatives of the Commonwealth of Massachusetts in General
Court assembled:
The undersigned legislators and/or citizens respectfully petition for the adoption of the accompanying bill:
An Act relative to algorithmic accountability and bias prevention.
_______________
PETITION OF:
Name | District/Address | Date Added
David Henry Argosky LeBoeuf | 17th Worcester | 2/19/2021
HOUSE DOCKET, NO. 3448 FILED ON: 2/19/2021
HOUSE . . . . . . . . . . . . . . . No. 4029
By Mr. LeBoeuf of Worcester, a petition (accompanied by bill, House, No. 4029) of David Henry Argosky LeBoeuf relative to algorithmic accountability and bias prevention in the protection of consumers. Consumer Protection and Professional Licensure.
The Commonwealth of Massachusetts
_______________
In the One Hundred and Ninety-Second General Court
(2021-2022)
_______________
An Act relative to algorithmic accountability and bias prevention.
Be it enacted by the Senate and House of Representatives in General Court assembled, and by the authority of the same, as follows:
Chapter 93 of the General Laws is hereby amended by adding the following section:-
Section 115. (a) As used in this section the following terms shall, unless the context clearly requires otherwise, have the following meanings:
“Automated decision system”, a computational process, including one derived from machine learning, statistics, or other data processing or artificial intelligence techniques, that makes a decision or facilitates human decision making, that impacts consumers.
“Automated decision system impact assessment”, a study evaluating an automated decision system and the automated decision system’s development process, including the design and training data of the automated decision system, for impacts on accuracy, fairness, bias, discrimination, privacy, and security that includes, at a minimum: (A) a detailed description of the automated decision system, its design, its training data, and its purpose; (B) an assessment of the relative benefits and costs of the automated decision system in light of its purpose, taking into account relevant factors, including (i) data minimization practices; (ii) the duration for which personal information and the results of the automated decision system are stored; (iii) what information about the automated decision system is available to consumers; (iv) the extent to which consumers have access to the results of the automated decision system and may correct or object to its results; and (v) the recipients of the results of the automated decision system; (C) an assessment of the risks posed by the automated decision system to the privacy or security of personal information of consumers and the risks that the automated decision system may result in or contribute to inaccurate, unfair, biased or discriminatory decisions impacting consumers; and (D) the measures the covered entity will employ to minimize the risks described in clause (C), including technological and physical safeguards.
“Office”, the office of consumer affairs and business regulation.
“Consumer”, an individual.
“Covered entity”, any person, partnership, or corporation that: (A) had greater than $50,000,000 in average annual gross receipts for the 3-taxable-year period preceding the most recent fiscal year, as determined in accordance with paragraphs (2) and (3) of section 448(c) of the Internal Revenue Code of 1986; (B) possesses or controls personal information on more than: (i) 1,000,000 consumers; or (ii) 1,000,000 consumer devices; (C) is substantially owned, operated, or controlled by a person, partnership, or corporation that meets the requirements under subparagraph (A) or (B); or (D) is a data broker or other commercial entity that, as a substantial part of its business, collects, assembles, or maintains personal information concerning an individual who is not a customer or an employee of that entity in order to sell or trade the information or provide third-party access to the information.
“Data protection impact assessment”, a study evaluating the extent to which an information system protects the privacy and security of personal information the system processes.
“High-risk automated decision system”, an automated decision system that: (A) taking into account the novelty of the technology used and the nature, scope, context, and purpose of the automated decision system, poses a significant risk: (i) to the privacy or security of personal information of consumers; or (ii) of resulting in or contributing to inaccurate, unfair, biased, or discriminatory decisions impacting consumers; (B) makes decisions, or facilitates human decision making, based on systematic and extensive evaluations of consumers, including attempts to analyze or predict sensitive aspects of their lives, such as their work performance, economic situation, health, personal preferences, interests, behavior, location, or movements, that: (i) alter legal rights of consumers; or (ii) otherwise significantly impact consumers; (C) involves the personal information of a significant number of consumers regarding race, color, national origin, political opinions, religion, trade union membership, genetic data, biometric data, health, gender, gender identity, sexuality, sexual orientation, criminal convictions, or arrests; (D) systematically monitors a large, publicly accessible physical place; or (E) meets any other criteria established by the Office in regulations issued pursuant to this section.
“High-risk information system”, an information system that: (A) taking into account the novelty of the technology used and the nature, scope, context, and purpose of the information system, poses a significant risk to the privacy or security of personal information of consumers; (B) involves the personal information of a significant number of consumers regarding race, color, national origin, political opinions, religion, trade union membership, genetic data, biometric data, health, gender, gender identity, sexuality, sexual orientation, criminal convictions, or arrests; (C) systematically monitors a large, publicly accessible physical place; or (D) meets any other criteria established by the Office in regulations issued pursuant to this section.
“Information system”, (A) means a process, automated or not, that involves personal information, such as the collection, recording, organization, structuring, storage, alteration, retrieval, consultation, use, sharing, disclosure, dissemination, combination, restriction, erasure, or destruction of personal information; and (B) does not include automated decision systems.
“Personal information”, any information, regardless of how the information is collected, inferred, or obtained that is reasonably linkable to a specific consumer or consumer device.
“Store”, (A) means the actions of a person, partnership, or corporation to retain information; and (B) includes actions to store, collect, assemble, possess, control, or maintain information.
“Use”, the actions of a person, partnership, or corporation in using information, including actions to use, process, or access information.
(b) A covered entity shall not: (1) violate a regulation promulgated under subsection (c); or (2) knowingly provide substantial assistance to any person, partnership, or corporation whose actions violate subsection (c).
(c)(1) Not later than 2 years after the date of enactment of this section, the Office shall promulgate regulations that: (A) require each covered entity to conduct automated decision system impact assessments of (i) existing high-risk automated decision systems, as frequently as the Office determines is necessary; and (ii) new high-risk automated decision systems, prior to implementation; provided that a covered entity may evaluate similar high-risk automated decision systems that present similar risks in a single assessment; (B) require each covered entity to conduct data protection impact assessments of (i) existing high-risk information systems, as frequently as the Office determines is necessary; and (ii) new high-risk information systems, prior to implementation; provided that a covered entity may evaluate similar high-risk information systems that present similar risks in a single assessment; (C) require each covered entity to conduct the impact assessments under clauses (A) and (B), if reasonably possible, in consultation with external third parties, including independent auditors and independent technology experts; and (D) require each covered entity to reasonably address, in a timely manner, the results of the impact assessments under clauses (A) and (B).
(2) The impact assessments under clauses (A) and (B) of paragraph (1) may be made public by the covered entity at its sole discretion.
(d) It shall be unlawful for any covered entity to commit the acts prohibited in subsection (b), regardless of specific agreements between entities or consumers.
(e) (1) A violation of subsection (b) shall be an unfair or deceptive act or practice under chapter 93A.
(2)(A) The Office shall enforce this section in the same manner, by the same means, and with the same jurisdiction, powers, and duties provided to it pursuant to chapter 24A or any other general or special law. The Office may impose civil penalties or fines for a violation of subsection (b). The Office may refer any violation of this section to the attorney general.
(i) Except as provided in clause (iii), the attorney general, before initiating a civil action under paragraph (1), shall provide written notification to the Office that the attorney general intends to bring such civil action.
(ii) The notification required under clause (i) shall include a copy of the complaint to be filed to initiate the civil action.
(iii) If it is not feasible for the attorney general to provide the notification required under clause (i) before initiating a civil action under paragraph (1), the attorney general shall notify the Office immediately upon instituting the civil action.
(f) Any person who is aggrieved as a result of a violation of this section, or the attorney general, may bring an action for recovery of actual damages or $100,000 per violation, whichever is greater, and other relief, including injunctive relief, civil penalties, and attorney's fees as provided by chapter 93A.