ITL AI Engagement
To foster collaboration, develop a shared understanding of what constitutes trustworthy AI, and strengthen the scientific underpinnings of how to assess and assure the trustworthiness of AI systems, the NIST Information Technology Laboratory (ITL) AI Program organizes workshops that bring together government, industry, academia, and other stakeholders from the United States and around the world. These workshops focus on advancing the development of AI standards, guidelines, and related tools.
Upcoming Workshops & Events
- ITL AI Webinar Series: Building Traceability into Agentic AI Ecosystems Through Measurement Probes. Learn More and Register.
Recent Workshops & Events
- ITL AI Webinar Series: The International AI Standards Landscape and ITL’s Role, Priorities, and Progress (March 6, 2026) Watch Recording.
Past Workshops & Events
- ARIA Workshop was held on November 12, 2024
- Unleashing AI Innovation, Enabling Trust was held on September 24-25, 2024
- Secure Software Development Framework for Generative AI and for Dual Use Foundation Models Virtual Workshop was held on January 17, 2024
- Workshop on Collaboration to Enable Safe and Trustworthy AI was held on November 17, 2023
- Launching Publication of the AI Risk Management Framework (AI RMF) 1.0 was held on January 26, 2023
- Building the NIST AI Risk Management Framework: Workshop #3 was held on October 18-19, 2022
- Artificial Intelligence and the Economy Conference was held on April 27, 2022
- A two-part workshop on the AI Risk Management Framework and on Bias in AI was held on March 29-31, 2022
- Kicking off the NIST AI Risk Management Framework: Workshop #1 was held on October 19-21, 2021
- A workshop on AI Measurement and Evaluation was held on June 15-17, 2021
- A National Academies of Sciences, Engineering, and Medicine (NASEM) workshop, Assessing and Improving AI Trustworthiness: Current Contexts, Potential Paths, was held on March 3-4, 2021
- A workshop on Explainable AI was held on January 26-28, 2021. A workshop summary is available
- A workshop on Bias in AI was held on August 18, 2020. A draft report summarizing the workshop discussions has been published, and a recording is available on the event page. The final report, Towards a Standard for Identifying and Managing Bias in Artificial Intelligence (SP 1270), was published in March 2022
- A kickoff AI workshop, Exploring AI Trustworthiness, took place on August 6, 2020. A recording of the workshop can be found on the event page
Ways to Engage
The ITL AI Program relies on and encourages robust interactions with industry, universities, nonprofits, and other government agencies in driving and carrying out its AI agenda. There are multiple ways to engage with NIST, including:
- NIST AI Consortium: ITL has established the NIST AI Consortium to support the collaborative development of a new measurement science, enabling the identification of proven, scalable, and interoperable techniques and metrics that promote the development and use of AI.
- Requests for Information (RFIs): The ITL AI Program sometimes uses formal RFIs to inform the public about its AI activities and gain insights into specific AI issues. For example, an RFI was issued to help develop the AI Risk Management Framework.
- Share your input on draft reports: The ITL AI Program counts on stakeholders to review draft reports on a variety of AI issues. Drafts are typically prepared based on input from private- and public-sector individuals and organizations, then posted for broader public review on NIST's AI website and announced via email alerts. Public comments help improve these documents.
- Student Programs: NIST offers a range of opportunities for students to engage with NIST on AI-related work. That includes the Professional Research Experience Program (PREP), which provides valuable laboratory experience and financial assistance to undergraduate, graduate, and post-graduate students.
Sign up for AI email alerts here. If you have questions about how to engage with us on AI topics, or ideas about NIST's AI activities, email us at ai-inquiries [at] nist.gov.
