Amendment ID: S2806-21

Amendment 21

Automated Decision Making Control Board

Mr. Moore moves that the proposed new text be amended by inserting after section __ the following section:-

SECTION __. Chapter 7D of the General Laws is hereby amended by inserting at the end thereof the following new section:-

"Section 17. Automated Decision Making Control Board.

(a) As used in this section, the following words shall have the following meanings unless the context clearly requires otherwise:

“Algorithm”, a specific procedure, set of rules, or order of operations designed to solve a problem or make a calculation, classification, or recommendation.

“Artificial intelligence”, a machine-based system that can, for a given set of human-defined objectives, make predictions, recommendations, or decisions influencing real or virtual environments. Artificial intelligence systems use machine- and human-based inputs to: (1) perceive real and virtual environments; (2) abstract such perceptions into models through analysis in an automated manner; and (3) use model inference to formulate options for information or action.

“Automated decision system”, any computer program, method, statistical model, or process that aims to aid or replace human decision-making using algorithms or artificial intelligence. Such systems include, but are not limited to, systems that analyze complex datasets about human populations and government services or other activities to generate scores, predictions, warnings, classifications, or recommendations.

“Commonwealth of Massachusetts” or “governmental unit”, any state, county, or municipal agency as defined by section 1 of chapter 268A.

“Covered entity”, (1) any governmental unit; or (2) any entity within the commonwealth that utilizes an automated decision system.

“Identified group characteristic”, age, race, creed, color, religion, national origin, sex, gender identity, disability, sexual orientation, genetic information, marital status, pregnancy or a condition related to said pregnancy, ancestry, veteran status, receipt of public assistance, economic status, location of residence, or citizenship status.

“Source code”, the foundational programming of a computer application, model, or system that can be read and understood by people.

“Training data”, the data used to inform the development of an automated decision system and the decisions or recommendations it generates.

(b) There shall be a board within the executive office of technology services and security for the purpose of studying and making recommendations relative to the use of automated decision systems by covered entities within the commonwealth that may affect human welfare, including, but not limited to, the legal rights and privileges of individuals. The board shall evaluate the use of automated decision systems in the commonwealth, including government use, and shall promulgate appropriate regulations, limits, standards and safeguards. The board shall:

(i) undertake a complete and specific survey of all uses of automated decision systems by covered entities and the purposes for which such systems are used, including but not limited to:

(1) the principles, policies, and guidelines adopted by covered entities to inform the procurement, evaluation, and use of automated decision systems, and the procedures by which such principles, policies, and guidelines are adopted;

(2) the training specific covered entities provide to individuals using automated decision systems, and the procedures for auditing and enforcing the principles, policies, and guidelines regarding their use;

(3) the manner by which covered entities validate and test the automated decision systems they use, and the manner by which they evaluate those systems on an ongoing basis, specifying the training data, input data, systems analysis, studies, vendor or community engagement, third-parties, or other methods used in such validation, testing, and evaluation;

(4) matters related to the transparency, explicability, auditability, and accountability of automated decision systems in use in covered entities, including information about their structure; the processes guiding their procurement, implementation and review; whether they can be audited externally and independently; and the people who operate such systems and the training they receive;

(5) the manner and extent to which covered entities make the automated decision systems they use available to external review, and any existing policies, laws, procedures, or guidelines that may limit external access to data or technical information that is necessary for audits, evaluation, or validation of such systems;

(6) procedures and policies in place to protect the due process rights of individuals directly affected by covered entities’ use of automated decision systems, including but not limited to public disclosure and transparency procedures; and

(7) the manner in which automated decision systems are assessed by covered entities, vendors or third parties for biases, including but not limited to, discrimination on the basis of identified group characteristics;

(ii) consult with experts in the fields of artificial intelligence, machine learning, algorithmic or artificial intelligence bias, algorithmic or artificial intelligence auditing, and civil and human rights;

(iii) examine research related to the use of automated decision systems that directly or indirectly result in disparate outcomes for individuals or communities based on an identified group characteristic;

(iv) conduct a survey of technical, legal, or policy controls to improve the just and equitable use of automated decision systems and mitigate any disparate impacts deriving from their use, including best practices, policy tools, laws, and regulations developed through research and academia or proposed or implemented in other states and jurisdictions;

(v) examine matters related to data sources, data sharing agreements, data security provisions, compliance with data protection laws and regulations, and all other issues related to how data is protected, used, and shared by agencies using automated decision systems, in Massachusetts and in other jurisdictions;

(vi) examine matters related to automated decision systems and intellectual property, such as the existence of non-disclosure agreements, trade secrets claims, and other proprietary interests, and the impacts of intellectual property considerations on transparency, explicability, auditability, accountability, and due process; and

(vii) examine any other opportunities and risks associated with the use of automated decision systems by covered entities.

(c) The board shall consist of the secretary of technology services and security or the secretary’s designee, who shall serve as chair; 1 member of the senate, designated by the senate president; 1 member of the house of representatives, designated by the speaker of the house of representatives; the chief justice of the supreme judicial court or a designee; the secretaries of the executive office of public safety and security and the executive office of health and human services, or their designees; the executive director of the American Civil Liberties Union of Massachusetts or a designee; 3 representatives from academic institutions in the commonwealth, to be appointed by the governor, who shall be experts in: (i) artificial intelligence and machine learning; (ii) data science and information policy; (iii) social implications of artificial intelligence and technology; or (iv) technology and the law; the executive director of the Massachusetts Law Reform Institute or a designee; 1 representative from the National Association of Social Workers; 1 representative from the NAACP; 1 representative from the Massachusetts Technology Collaborative; 1 representative from the Massachusetts High Technology Council; and 6 representatives of the business community, to be appointed by the governor, who shall have relevant experience in at least two of the following fields: (i) artificial intelligence and machine learning; (ii) data science and information policy; (iii) social implications of artificial intelligence and technology; or (iv) technology and the law.

(d) Members of the board shall be appointed within 45 days of the effective date of this act and within 45 days of any vacancy. Any vacancy shall be filled in the same manner as the original appointment. The board shall meet at the call of the chair based on the board’s workload but not fewer than 10 times per calendar year. The board shall hold at least one public hearing per year to solicit feedback from Massachusetts residents and other interested parties. The board’s meetings shall be broadcast over the internet.

(e) The board shall submit an annual report by December 31 to the governor, the clerks of the house of representatives and the senate, and the joint committee on advanced information technology, the internet and cybersecurity. The report shall be a public record and it shall include, but not be limited to:

(i) a description of the board’s activities and any community engagement undertaken by the board; and

(ii) the board’s findings, including but not limited to the publication of a list of all automated decision systems in use by governmental units, the policies, procedures, and training guidelines in place to govern their use, and any contracts with third parties pertaining to the acquisition or deployment of such systems.

(f) The board shall promulgate, amend, or rescind rules and regulations to establish standards and safeguards to:

(i) Promote racial and economic justice, equity, fairness, accountability, and transparency in the use of automated decision systems by covered entities;

(ii) Establish areas where governmental units shall not use automated decision systems or any qualifications, conditions, limits or prohibitions that shall be set on governmental use of an automated decision system;

(iii) Establish requirements for the adoption of policies and procedures by governmental units for the following purposes:

(1) to allow a person affected by a rule, policy, or action made by, or with the assistance of, an automated decision system, to request and receive an explanation of such rule, policy, or action and the basis therefor;

(2) to determine whether an automated decision system disproportionately or unfairly impacts a person or group based on an identified group characteristic;

(3) to determine prior to or during the procurement or acquisition process whether a proposed governmental unit automated decision system is likely to disproportionately or unfairly impact a person or group based on an identified group characteristic;

(4) to address instances in which a person or group is harmed by a governmental unit automated decision system if any such system is found to disproportionately impact a person or group on the basis of an identified group characteristic; and

(5) to make information publicly available that, for each automated decision system, will allow the public to meaningfully assess how such system functions and is used by a governmental unit, including making technical information about such system publicly available; and

(iv) Regulate the training data related to an automated decision system, including but not limited to:

(1) security measures to protect the data of individuals used as part of the training data;

(2) informed consent, as defined by the board, from individuals before collecting, using, sharing or disclosing their data; and

(3) the deletion or de-identification of any data collected from individuals if it is no longer needed for the intended purpose of the training data or automated decision system.

(g) Whoever violates any provision of this section, or any regulation promulgated by the board, shall be punished by a fine of not more than one thousand dollars for each such violation. Each day during which a violation exists shall constitute a separate offense.

(h) The board or the attorney general may issue and recover penalties and enforce the provisions of this section. The attorney general may enforce this section pursuant to section 4 of chapter 93A."