In January 2025, the Department of Homeland Security (DHS) released its “Playbook for Public Sector Generative Artificial Intelligence Deployment” (the “Playbook”). The Playbook provides valuable insights and actionable steps that private sector organizations looking to leverage generative artificial intelligence (“GenAI”) technologies can adapt.1 The Playbook was drafted under the Biden administration and may be revised to align with the policy views of the Trump administration. Nevertheless, its recommendations remain relevant and helpful. This blog post summarizes key aspects of the Playbook and offers takeaways for the private sector.
A. Introduction to GenAI Pilots
In March 2024, DHS released an AI Roadmap detailing its plans for the responsible use of AI. As part of the AI Roadmap, DHS launched three GenAI pilot programs focused on assisting employees and enhancing processes.
The three pilots included:
- Strengthening Investigative Leads: Homeland Security Investigations (HSI) used a large language model (LLM)-based system to enhance summary efficiency and accuracy, aiding in detecting networks and identifying crime-related patterns.
- Helping Local Governments Create Hazard Mitigation Plans: The Federal Emergency Management Agency (FEMA) tested LLM capabilities to help local governments develop hazard mitigation plans, making communities more resilient and eligible for grant applications.
- Creating Novel Training Opportunities for Immigration Officers: The United States Citizenship and Immigration Services (USCIS) used GenAI to improve training for refugee and asylum immigration officers, enhancing their interview skills and limiting the need for retraining over time.
The work from the AI Roadmap was core to the assessments made in the Playbook.
B. Key Steps and Insights for the Private Sector
The Playbook is designed to assist organizations at any stage of their AI journey in understanding and incorporating AI technology into their operations. It outlines key steps to consider when implementing a GenAI pilot program. A review of those steps, and how private sector entities integrating AI can apply them, follows.
1. Mission-Enhancing GenAI Use Cases
The Playbook emphasizes the importance of developing GenAI pilots that address specific mission needs and support mission-enhancing processes. It advises organizations to carefully scope potential deployments and consider long-term applicability across different departments or processes.
Actionable Steps for Private Sector Organizations:
- Align GenAI deployment’s mission and value with the organization’s priorities.
- Scope a GenAI pilot that improves a specific mission-enhancing process and is potentially scalable to similar processes.
- Enlist an executive sponsor to support and endorse pilot efforts.
- Assess resources, including staff, funding, data, and technology.
- Define the pilot’s minimum viable product and metrics for evaluating success.
2. Coalition Building and Effective Governance
Securing support from senior leadership and building cross-functional coalitions are key steps for successful GenAI deployment. The Playbook highlights the importance of including risk management, compliance, and oversight leads early in the planning process.
Actionable Steps for Private Sector Organizations:
- Prioritize early support from senior leadership and gain input from key organizational stakeholders.
- Evaluate current governance structures and policies to identify potential gaps.
- Designate or create a governance body with cross-functional representation for managing and overseeing GenAI projects.
3. Tools and Infrastructure
Organizations are encouraged to evaluate their current tools and infrastructure, consider different AI model types, and ensure they have the necessary technical capacity and security measures in place. This foundational assessment helps streamline the deployment process and ensures the effective integration of GenAI applications.
Actionable Steps for Private Sector Organizations:
- Assess existing tools, infrastructure, and technical capabilities against the needs and goals of the GenAI pilot.
- Consider using commercial, open-source, or open-weight models based on the organization’s needs.
- Determine if additional tooling or configurations are needed, considering the team’s technical capacity.
4. Responsible Use and Trustworthiness
One of the key pillars of successful GenAI deployment is responsible and trustworthy use. This means having clear policies to manage risks such as inaccuracies, discrimination, and data privacy.2 It also means not relying solely on GenAI outputs for critical decisions, but rather making decisions with human oversight. By prioritizing responsible use, organizations can instill confidence in their AI initiatives and reassure stakeholders of their commitment to ethical AI deployment.
Actionable Steps for Private Sector Organizations:
- Develop clear organizational guidance, principles, and best practices for responsible and trustworthy GenAI use.
- Identify risk areas, including inaccuracies, privacy violations, and data bias.
- Scope the application of GenAI tools, accounting for their limitations and risks.
5. Measurement and Monitoring
Continuous measurement and feedback are integral to the success of GenAI pilots. Organizations should identify or develop qualitative and quantitative metrics that reflect the goals of the GenAI pilots. They should establish infrastructure, such as dashboards, to monitor these metrics and share progress with stakeholders regularly, at least monthly, to enable iterative improvements. By engaging stakeholders in this ongoing process, organizations can ensure that their AI initiatives continue to improve and meet the needs of the business.
Actionable Steps for Private Sector Organizations:
- Develop and monitor key performance indicators to assess the effectiveness of AI deployments.
- Develop a process to make iterative improvements to the pilot product based on performance.
6. Training and Talent Acquisition
Investing in training and talent acquisition is crucial for successful GenAI deployment. Organizations should offer GenAI literacy training across business lines and hire technically skilled employees to support AI development efforts.
Actionable Steps for Private Sector Organizations:
- Offer GenAI literacy training across technical and non-technical business lines.
- Identify the technical skills needed for the GenAI pilot and assess if these skills exist within the organization.
- Provide upskilling and cross-training opportunities for current employees.
- Hire technically skilled employees to support AI development efforts.
7. Usability Testing and Feedback Mechanisms
Incorporating iterative feedback from users and other stakeholders is essential for developing and improving GenAI applications. This two-way communication helps pilot teams improve their products while also bolstering stakeholders’ confidence in the projects.
Actionable Steps for Private Sector Organizations:
- Identify relevant internal and external users and iteratively test the product with them throughout the development lifecycle.
- Communicate regularly with users and stakeholders to share progress, challenges, and lessons learned.
- Provide opportunities for stakeholders to share feedback and incorporate this feedback as appropriate.
C. Conclusion
Private sector organizations can enhance their AI initiatives by adapting the DHS Playbook’s insights and actionable steps. This includes aligning AI projects with strategic goals, starting with pilot programs, leveraging existing infrastructure, prioritizing responsible use, continuously measuring progress, and investing in training and talent. By fostering cross-functional collaboration, the private sector can unlock the full potential of GenAI.
Footnotes
1. The Playbook defines the term “generative AI” (or, GenAI) as the class of AI models that emulate the structure and characteristics of input data to generate derived synthetic content. This can include images, videos, audio, text, and other digital content.
2. The AI legal risk of discrimination may not be supported or pursued by the Trump administration, and consideration of such an issue may actually be viewed negatively by the new administration, particularly among entities that receive grants or contracts from the federal government. See Executive Order 14173 (“Ending Illegal Discrimination and Restoring Merit-Based Opportunity”) and Executive Order 14179 (“Removing Barriers to American Leadership in Artificial Intelligence”).
The material contained in this communication is informational, general in nature and does not constitute legal advice. The material contained in this communication should not be relied upon or used without consulting a lawyer to consider your specific circumstances. This communication was published on the date specified and may not include any changes in the topics, laws, rules or regulations covered. Receipt of this communication does not establish an attorney-client relationship. In some jurisdictions, this communication may be considered attorney advertising.