Executive Summary
Federal agencies increasingly automate the provision of legal guidance to the public through technologies such as chatbots and virtual assistants. There may be benefits associated with the use of these technologies, including efficient allocation of limited staff resources; improved user experience and service delivery; and enhanced consistency. At the same time, the seemingly personalized nature of the guidance provided by such tools may lead users to unduly rely on it. This Recommendation identifies best practices for agencies to consider when they develop, use, and manage automated legal guidance tools. Among other things, it recommends:
See also: Recommendation 2021-7, Public Availability of Inoperative Agency Guidance Documents; Recommendation 2019-3, Public Availability of Agency Guidance Documents; Recommendation 2019-1, Agency Guidance Through Interpretive Rules; Recommendation 2017-5, Agency Guidance Through Policy Statements; Recommendation 2014-3, Guidance in the Rulemaking Process.
This summary is prepared by the Office of the Chair to help readers understand the Recommendation adopted by the Assembly, which appears in full below.
Recommendation of the ACUS Assembly
Federal agencies increasingly automate the provision of legal guidance to the public through online tools and other technologies.[1] The Internal Revenue Service, for example, encourages taxpayers to seek answers to questions regarding various tax credits and deductions through its online “Interactive Tax Assistant,” and the United States Citizenship and Immigration Services suggests that potential green card holders and citizens with questions about their immigration rights communicate with its interactive chatbot, “Emma.” Almost a dozen federal agencies have either implemented or piloted such automated legal guidance tools in just the past three years.[2]
Automated legal guidance tools can take several forms. The most common are chatbots and virtual assistants. The simplest chatbots provide standardized responses based on keywords included in a user’s question. Although the terms can overlap, virtual assistants tend to be more versatile than chatbots and can often perform additional tasks, such as making an appointment or filling out a form, in response to a conversation.[3] More robust tools rely on natural language processing or other artificial intelligence techniques to interpret a user’s free-form question and generate an individualized response.[4]
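To illustrate the simplest form described above, the following is a minimal, hypothetical sketch of a keyword-based chatbot. It is not drawn from any agency’s actual implementation; the rules, keywords, canned responses, and the respond function are invented solely for illustration.

```python
# Hypothetical sketch of a keyword-based chatbot: each rule pairs a set of
# keywords with a canned, pre-approved response. Nothing here is drawn from
# any agency tool; all keywords and responses are invented for illustration.

RULES = [
    ({"deduction", "deduct"},
     "General information about deductions is available on our website. "
     "This response is informational only."),
    ({"appointment", "schedule"},
     "To schedule an appointment, please use the online scheduling page "
     "or contact a customer service representative."),
]

FALLBACK = ("I'm sorry, I don't have an answer to that question. "
            "Would you like to speak with a customer service representative?")


def respond(question: str) -> str:
    """Return the first canned response whose keywords appear in the question."""
    words = set(question.lower().split())
    for keywords, response in RULES:
        if keywords & words:  # any keyword match triggers the canned answer
            return response
    return FALLBACK


if __name__ == "__main__":
    print(respond("Can I deduct my home office expenses?"))
```

In a tool of this kind, every possible answer is written and reviewed in advance, which is why Paragraph 14 below recommends publishing the programmed questions and responses. The more robust tools described above instead interpret free-form questions and may change their answers over time, a scenario addressed in Paragraph 15.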
Agencies use automated legal guidance tools for a number of reasons, including efficiently allocating limited staff resources; improving user experience and service delivery; and enhancing the quality, consistency, predictability, and timeliness of the guidance provided to the public. Because these tools are available at all times from any location and can efficiently and effectively answer common questions, they have the potential to revolutionize the provision of agency guidance to the public.
Agencies generally take the position that users cannot rely on automated legal guidance. As this Recommendation recognizes, agencies must clearly disclose this position to users. That is true, of course, of all forms of guidance documents.[5] Automated legal guidance may, however, create a heightened risk that users will rely on the guidance in a way the issuing agency does not intend. Because users often enter specific facts about their circumstances, they may assume that the tool is giving a customized response that accounts for all of the facts they entered, which may or may not be the case.
The Administrative Conference has adopted several recommendations on the development, use, and public availability of agency guidance documents.[6] This Recommendation builds on those recommendations by identifying best practices for agencies to consider when they develop, use, and manage automated legal guidance tools. In identifying these best practices, the Conference recognizes that automated legal guidance tools may not be suitable for all agencies and administrative programs and that even when agencies use them, agencies will need to provide additional guidance by other means, including live person-to-person support.
RECOMMENDATION
Design and Management
1. Agencies should explore the possible benefits of offering automated legal guidance tools, including enhancing administrative efficiency and helping the public understand complex laws using plain language. This is especially true for those agencies that have a high volume of individual interactions with members of the public who may not be familiar with legal requirements.
2. Agencies should also weigh the potential downsides of offering automated legal guidance tools, including potentially oversimplifying the law and creating confusion as to whether and when the agency intends users to rely on the guidance issued. To avoid such confusion, agencies should follow the recommendations set forth in Paragraphs 18–20.
3. Agencies using automated legal guidance tools should design and manage them in ways that promote fairness, accuracy, clarity, efficiency, accessibility, and transparency.
4. Agencies should ensure that automated legal guidance tools do not displace other agency mechanisms for increasing access to the underlying law.
5. Agencies should adopt clear procedures for designing, maintaining, and reviewing the content embedded in automated legal guidance tools and should publish these procedures on their websites. These procedures should incorporate periodic user testing and other forms of evaluation by internal and external researchers to ensure accessibility and effectiveness.
6. The General Services Administration should regularly evaluate the relative costs and benefits of using outside vendors for the production of automated legal guidance tools and share its evaluations with agencies.
Accessibility
7. Agencies should utilize human-centered design methodologies, empirical customer research, and user testing, as described and defined in Executive Order 14,058, Transforming Federal Customer Experience and Service Delivery to Rebuild Trust in Government (86 Fed. Reg. 71,357 (Dec. 13, 2021)), in designing and maintaining their automated legal guidance tools.
8. Agencies should, consistent with applicable laws and policies, design and periodically review and, when necessary, reconfigure automated legal guidance tools to ensure that they meet the needs of the particular populations intended to use them.
9. Agencies should ensure that information provided by automated legal guidance tools is stated in plain language understandable by the particular populations that are intended to use these tools, consistent with the Plain Writing Act of 2010 (5 U.S.C. § 301 note); Recommendation 2017-3, Plain Language in Regulatory Drafting (82 Fed. Reg. 61,728 (Dec. 14, 2017)); and other applicable laws, policies, and Conference recommendations.
10. Agencies should design automated legal guidance tools to connect users with a human customer service representative when the tool does not answer a user’s question or when the user has difficulty using the tool.
Transparency
11. When the underlying law is unclear or unsettled, or when the application of the law is especially fact-dependent, agencies should be transparent about the limitations of the advice the user is receiving. To the extent practicable, agencies should also provide access through automated legal guidance tools to the legal materials underlying the tools, including relevant statutes, rules, and judicial or adjudicative decisions.
12. Agencies should disclose how they store and use the data obtained through automated legal guidance tools.
13. Agencies should update the content of automated legal guidance tools to reflect legal developments or correct errors in a timely manner. Agencies should also maintain an electronic, publicly accessible, searchable archive that identifies and explains the updates. Agencies should provide the date on which the tool was last updated.
14. When automated legal guidance tools provide programmed responses to users’ questions, agencies should publish the questions and responses so as to provide an immediate and comprehensive source of information regarding the tools. Agencies should post this information in an appropriate location on their websites and make it accessible through the automated legal guidance tool to which it pertains.
15. When automated legal guidance tools learn to provide different answers to users’ questions over time, agencies should publish information related to how the machine learning process was developed and how it is maintained and updated. Agencies should post this information in an appropriate location on their websites and make it accessible through the automated legal guidance tool to which it pertains.
16. Agencies that use automated legal guidance tools should provide users the ability to offer feedback or report errors.
17. When applicable, agencies should provide disclaimers informing users that the automated legal guidance tool is not a human.
Reliance
18. Agencies should allow users to obtain a written record of their communication with automated legal guidance tools and should include date and time stamps on the written record.
19. Agencies should consider whether, or under what circumstances, a person’s good faith reliance on guidance provided by an automated legal guidance tool should serve as a defense against a penalty or other consequences for noncompliance with an applicable legal requirement, and they should prominently announce that position to users.
20. If an agency takes the position that it can depart from an interpretation or explanation provided by an automated legal guidance tool, including in the application of penalties for noncompliance, it should prominently announce its position to users, including in the written record of the communication with the automated legal guidance tool.
[1] This Recommendation defines “guidance” broadly to include interpretive rules, general statements of policy, and other materials that agencies consider to be guidance documents. See Admin. Conf. of the U.S., Recommendation 2019-3, Public Availability of Agency Guidance Documents, 84 Fed. Reg. 38,931 (Aug. 8, 2019).
[2] They include the Department of the Army, the Department of Education, the Environmental Protection Agency, the General Services Administration, the Food and Drug Administration, the Internal Revenue Service, the National Institutes of Health, the Patent and Trademark Office, the Social Security Administration, United States Citizenship and Immigration Services, and the Veterans Benefits Administration.
[3] See Joshua D. Blank & Leigh Osofsky, Automated Legal Guidance at Federal Agencies 1, 10 (May 26, 2022) (report to the Admin. Conf. of the U.S.).
[4] See Admin. Conf. of the U.S., Statement #20, Agency Use of Artificial Intelligence, 86 Fed. Reg. 6616 (Jan. 22, 2021); Blank & Osofsky, supra note 3.
[5] See Admin. Conf. of the U.S., Recommendation 2019-3, Public Availability of Agency Guidance Documents, ¶¶ 11–12, 84 Fed. Reg. 38,931, 38,933 (Aug. 8, 2019); Admin. Conf. of the U.S., Recommendation 2019-1, Agency Guidance Through Interpretive Rules, ¶¶ 6, 11, 84 Fed. Reg. 38,927, 38,929 (Aug. 8, 2019); Admin. Conf. of the U.S., Recommendation 2017-5, Agency Guidance Through Policy Statements, ¶¶ 4–6, 82 Fed. Reg. 61,734, 61,736 (Dec. 29, 2017).
[6] See Admin. Conf. of the U.S., Recommendation 2021-7, Public Availability of Inoperative Agency Guidance Documents, 87 Fed. Reg. 1718 (Jan. 12, 2022); Admin. Conf. of the U.S., Recommendation 2019-3, Public Availability of Agency Guidance Documents, 84 Fed. Reg. 38,931 (Aug. 8, 2019); Admin. Conf. of the U.S., Recommendation 2019-1, Agency Guidance Through Interpretive Rules, 84 Fed. Reg. 38,927 (Aug. 8, 2019); Admin. Conf. of the U.S., Recommendation 2017-5, Agency Guidance Through Policy Statements, 82 Fed. Reg. 61,734 (Dec. 29, 2017); Admin. Conf. of the U.S., Recommendation 2014-3, Guidance in the Rulemaking Process, 79 Fed. Reg. 35,992 (June 25, 2014).
Recommended Citation: Admin. Conf. of the U.S., Recommendation 2022-3, Automated Legal Guidance at Federal Agencies, 87 Fed. Reg. 39,798 (July 5, 2022).