In October of 2023, the Council of the Great City Schools and the Consortium for School Networking (CoSN) joined forces to release the much-anticipated K-12 Generative AI Readiness Checklist, with support from Amazon Web Services (AWS), the National Association of Secondary School Principals (NASSP), the State Educational Technology Directors Association (SETDA), and the National School Public Relations Association (NSPRA).
The Readiness Checklist is the most comprehensive set of guidelines and recommendations for district leaders trying to stay ahead of (or at least keep up with) the rapidly evolving generative AI landscape. At the time of publication, there is no better tool to help administrators lay the groundwork for a sustainable model for evaluating “the safety, privacy, security, and ethical implications of using Gen AI.”
We’ve broken down the checklist to identify some of the most important takeaways and action items, while adding some additional best practices based on what we’re seeing and hearing in the field.
Table of Contents
- Roles, responsibilities, and staffing
- Compliance and training
- Vendor accountability
1. Roles, Responsibilities, and Staffing
The K-12 Generative AI Readiness Checklist includes a number of questions about staff assignments and accountability for adoption, procurement, management, implementation, and communication of generative AI tools. Recommendations include:
- Designating a point person or team to provide oversight at the leadership level
- Setting up a cross-functional team, including student and parent representatives, to provide strategic oversight and guidance on all things Gen AI, including future-readiness and alignment with federal agencies and organizations
- Designating a point person or team to coordinate professional development for Gen AI
- Enabling processes, “both administratively and through collective bargaining agreements, to modify job descriptions and requirements or create new roles to support Gen AI”
- Identifying and/or hiring “staff that have the ability to prepare data to share with Gen AI tools”
- Identifying and/or hiring “staff with the right skillset to evaluate, procure, and operate Gen AI”
Key takeaways and observations
While some large districts will have the luxury of being able to create new job positions and hire specifically for AI experience and skill sets, we expect the vast majority to simply absorb these responsibilities into existing org charts and workflows. The “designated point person” for generative AI will likely be your Technology Director/CTO or one of their trusted deputies. Most internal procurement teams and data managers will need documented guidance, training, and policies to address the additional needs of Gen AI tools.
The effort required to modify job descriptions and requirements to support Gen AI will vary depending on each district’s relationships with the relevant unions and the strength of its CBAs. Ideally, you’ll want to include references to AI knowledge and training when backfilling technology roles as soon as possible.
The recommended “cross-functional team” has been a long-standing best practice for all major issues affecting the district community, whether related to technology, curriculum, safety, or more local concerns. Gen AI is no different. The best way to avoid issues down the road is to ensure every stakeholder has a voice at the table early and often. Here’s what that might look like in a typical school district:
Best practice: setting up a cross-functional AI advisory committee
- Include representation from senior leadership, technology, curriculum & instruction, special education, data management, legal, the business office, communications, instructional coaches, teachers, parents, and students.
- Encourage diversity of thought by ensuring equitable representation across groups; e.g. you may only need one representative from legal, but consider soliciting a variety of community voices from different grade levels and underrepresented groups.
- Get a kickoff meeting on the calendar as early as possible. Designate one member of the team to act as meeting facilitator and agenda owner, with the responsibility of keeping the conversation moving and on track. Designate another team member to take notes and create meeting minutes (AI can help with that, too).
- Use the kickoff to norm on a regular cadence for meetings and touchpoints. This doesn’t need to be a huge time commitment—consider starting with once a month and adjusting based on the ongoing size of your agenda.
- Set up and communicate asynchronous contribution opportunities to all stakeholders. Give team members a path to submit questions, discussion topics, and updates to the group. Establish workflows for reviewing and following up on all submissions between meetings.
- One of the primary goals of this team will be to eliminate communication barriers and keep community members informed. Consider standing up a web page or site where anyone can access agendas and minutes, submit questions or requests, and see the results of the cross-functional team’s efforts. Clearly communicate through multiple channels how those in your learning community can access these resources and/or contribute to the team’s efforts.
- Consider organizing at least one big, open forum event for all stakeholders to come together, see what the cross-functional team has been working on, and provide feedback or ask questions. This can be a great way to showcase ways you have used generative AI to increase productivity, streamline processes, improve personalized learning, free up instructional time, or provide additional supports to students.
2. Compliance and Training
The Checklist is heavy on CYA (cover-your-bases) recommendations, as one would expect for a technology as groundbreaking and wide-ranging as generative AI. From legal review to professional development, districts will need to take steps to monitor and respond to new legislation while mitigating the threat of untrained, unprepared staff. Some examples include:
- Staying informed on state laws and district rules to ensure Gen AI is not prohibited and is explicitly allowed in various scenarios
- Creating and publishing “formal policies, processes, and procedures on the responsible use of Gen AI” in alignment with the White House Blueprint for an AI Bill of Rights and the US DOE’s AI & the Future of Learning report
- See the Office of EdTech’s guidance on keeping humans in the loop for more on this topic
- Adding Gen AI to your Acceptable Use Policy, documenting procedures to track compliance, publishing consequences for noncompliance, and updating your Code of Conduct accordingly
- Updating internal audits and insurance plans to include Gen AI considerations
- Standing up role-based onboarding and continuous training plans for all stakeholders, with formal tracking for completion and appropriate coverage of:
- AI bias and equity issues
- Use Policy requirements and consequences
- Plagiarism/citation requirements
- Data privacy and security
- Data access controls
- Data loss notification
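The role-based training piece above lends itself to a simple data model. Here’s a minimal Python sketch of how a district might map roles to required modules and surface outstanding training for any staff member; the role names, module identifiers, and function are illustrative assumptions, not Checklist terminology:

```python
# A minimal sketch of role-based training tracking.
# Role names and module identifiers are hypothetical examples.
REQUIRED_MODULES = {
    "teacher": {"bias_equity", "use_policy", "plagiarism_citation", "data_privacy"},
    "data_manager": {
        "bias_equity", "use_policy", "data_privacy",
        "data_access_controls", "data_loss_notification",
    },
}

def outstanding_modules(role: str, completed: list[str]) -> set[str]:
    """Return the training modules a staff member still needs for their role."""
    return REQUIRED_MODULES.get(role, set()) - set(completed)
```

Even a spreadsheet export of a mapping like this can serve as the “formal tracking for completion” the Checklist calls for.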
Key takeaways and observations
Compliance: State legislatures have been relatively quick to act on the AI boom, with many updating their consumer privacy laws, standing up protections for automated decision-making tools, and organizing new government agencies to provide regulatory oversight and guidance. Requirements will vary based on where you are, but several common themes have emerged.
Right to opt out of profiling
Many states have enacted provisions for consumers to opt out of any AI-based profiling that might be used in automated decision-making. In a school district setting, this could mean anything from a student IEP referral to filling a vacant teacher position. Hiring, healthcare, lending, and insurance are the impetus for this legislation, but one can imagine a wide range of hypothetical future scenarios where this might apply. District leaders should consider whether any efficiencies that might be unlocked by automated decisions are worth the added compliance requirements.
Impact and data protection assessments
Dozens of states have passed bills with language requiring either “impact assessments” or “data protection assessments” for certain types of AI tools (most often those that pose a “heightened risk” to consumers, like the sensitive applications of AI mentioned in the previous section). Few have specified what those assessments might look like, but at a high level, someone within the district will need to be responsible for documenting the entire evaluation process, including risk assessments, harm mitigation, goals, roles and responsibilities, and readiness. Microsoft and the government of the Netherlands have both published comprehensive guides/templates for AI impact assessments to help you get started.
Note: AI Impact Assessments are a heavy lift when done correctly. Consider delegating this work to the cross-functional AI advisory team discussed earlier in this article.
Training: Training on AI has been top of mind for most district leaders dating back to the public release of ChatGPT in November 2022. Several of the superintendents we spoke to over the summer were looking for opportunities to integrate AI into professional development schedules early and often in the weeks leading up to the new school year.
Best practice: laying the groundwork for an AI-knowledgeable staff
- Your Acceptable/Responsible Use Policy and Code of Conduct will form the basis of your onboarding, coaching, and accountability efforts. Consider working with your AI advisory committee and legal to get these updated as soon as possible. Keep in mind that AI is evolving at an unprecedented pace—these policies can’t just be “set it and forget it,” they will need to be reviewed on a regular basis.
- Clearly defined consequences for misuse are important, but they shouldn’t be the only thing your staff gets out of these policies. Spell out the workflows for Gen AI vetting and approvals, including who teachers should look to for guidance and how they can get the go-ahead for new tools. Provide links or supporting resources (e.g. applications for use, how to report problematic outputs, FAQs, approved vendor lists, etc.) directly in the documentation for maximum transparency.
- Work with your professional development, school leadership, and/or HR teams to embed Gen AI into onboarding and offboarding workflows. Set up guardrails to ensure no staff are being asked to use any Gen AI products without first understanding the purpose, goals, and pitfalls of those products.
- Secure at least one formal PD session for all staff to receive a high-level overview of Gen AI—what it is, how to identify when it’s being used, what needs it can meet both instructionally and operationally, and what your compliance expectations are.
Coaching and follow-up
- As with any effective technology implementation, the initial training will only go so far. The onus will be on district leaders to keep their fingers on the pulse of their staff by staying aware of what’s being used, what problems Gen AI is solving, and what problems Gen AI is creating.
- Socialize both best practices and cautionary tales early and often. Have some of your more innovative teachers figured out how to automate tasks and reduce prep time requirements? Turn them into a case study! Have you unlocked new insights from student performance data? Show your staff where to find them and how to use them! Even the worst-case scenario can be a teachable moment—if someone makes a mistake with Gen AI, don’t sweep it under the rug; use the story to reduce the risk of it happening again.
3. Vendor Accountability
One of the most immediately actionable topics covered in the K-12 Generative AI Readiness Checklist is vendor contracts and guardrails. Most districts already have formal compliance workflows in place to secure data privacy agreements, required vendor forms, and evidence of effectiveness; Gen AI requirements should fit neatly into these existing processes. Specific recommendations include:
- Revising “vendor contracts to include clauses on the responsible use of generative AI, along with specified consequences for contractors who violate these guidelines”
- Requiring algorithmic discrimination protections from any vendors using Gen AI
- Requiring sufficient transparency for the district to meet notice and explanation standards where necessary
- Requiring vendors to “proactively notify your district when Gen AI capabilities are added to current assets”
- Ensuring vendors are not combining external data with district data without formal written agreement
- Updating data privacy vetting processes (and DPAs) to include assessment of the “collection, use, and disclosure of personal information for Gen AI”
- Setting up a process to identify and block non-compliant Gen AI tools
- Requiring moderation guardrails to “filter toxic and inappropriate content and detect hallucinations”
Key takeaways and observations
The general consensus among experienced district technology leaders seems to be that AI should not be treated as some unique entity with separate requirements and safeguards. Every new technology we’ve seen in the past few decades has come with similar potential and challenges–Gen AI is just another in a long line, no matter how transformative it might prove to be. There will be some up-front work to integrate Gen AI into existing processes, but it can be managed alongside the work that’s already being done on the safety, privacy, and security fronts.
Best practice: vetting and monitoring Gen AI products
What to do now
- District technology leaders are encouraged to survey their existing vendor landscape to determine which products have already incorporated Gen AI technologies and which are planning to do so in the future.
- Consider building and distributing a mandatory questionnaire to all of your current vendors with the goal of identifying where Gen AI is being used and for what purpose, where you might be exposed to risk, and how your existing contracts might need to change to accommodate Gen AI integrations.
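To make that questionnaire concrete, here’s a minimal Python sketch that generates one as a CSV vendors can fill in and return. The question wording, field names, and file path are illustrative assumptions, not Checklist language:

```python
import csv

# Hypothetical question set -- adapt the wording to your district's policies.
QUESTIONS = [
    ("gen_ai_in_use", "Does your product currently use generative AI? Where, and for what purpose?"),
    ("data_inputs", "What district or student data, if any, is sent to AI models?"),
    ("third_party_models", "Which third-party AI models or providers does your product rely on?"),
    ("data_retention", "Is district data retained or used to train models? Under what terms?"),
    ("roadmap", "What Gen AI features are planned, and how will you notify us before release?"),
    ("contract_impact", "Which existing contract or DPA clauses do these features affect?"),
]

def write_questionnaire(path="vendor_gen_ai_questionnaire.csv"):
    """Write the question set to a CSV with a blank response column for vendors."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["id", "question", "vendor_response"])
        for qid, text in QUESTIONS:
            writer.writerow([qid, text, ""])
    return path
```

A shared form or survey tool works just as well; the point is a consistent, mandatory set of questions you can compare across vendors.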
What to do going forward
- As part of your standard vendor vetting process, you’ll want to collect a few more pieces of information specific to Gen AI, including:
- What data is being input into AI models
- What technical safeguards are in place to prevent data loss or unacceptable sharing
- What technical and human safeguards are in place to prevent bias, inappropriate content, or hallucinations
- Whether the program is creating anything that might be construed as a “profile” of its users or making any automated decisions based on AI analysis
- Set up a clear, mandatory feedback loop by which vendors must inform the district of any upcoming Gen AI features, integrations, and functionality prior to release. Include specific timelines and leave room for your team to evaluate the proposed changes in accordance with your DPAs, acceptable use guidelines, risk/change management policies, and mission alignment.
- Consider setting up an internal portal for the purpose of cataloging the AI vetting process and outcomes for all stakeholders. This can be a read-only spreadsheet, a website directory, or some other easily accessible format. Include descriptions of what each program is used for, what role Gen AI plays, whether it is an approved resource (and who it is approved for, including grade/age ranges if student-facing), and when contracts or agreements expire.
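As a starting point for that catalog, here’s a minimal Python sketch of one vetting record as a flat, exportable row; the column names and example values are assumptions, not Checklist requirements:

```python
import csv
from dataclasses import dataclass, asdict

@dataclass
class VettedTool:
    # Columns mirror the catalog fields suggested above; names are illustrative.
    name: str
    purpose: str            # what the program is used for
    gen_ai_role: str        # what role Gen AI plays in the product
    approved: bool
    approved_for: str       # e.g. "Staff only" or "Students, grades 6-8"
    agreement_expires: str  # ISO date, e.g. "2025-06-30"

def export_catalog(tools, path="ai_tool_catalog.csv"):
    """Dump the vetting catalog to a CSV that can back a read-only sheet or site."""
    fieldnames = list(VettedTool.__dataclass_fields__)
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames)
        writer.writeheader()
        for tool in tools:
            writer.writerow(asdict(tool))
```

However you publish it, the key is that any stakeholder can answer “is this tool approved, for whom, and until when?” without filing a request.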
Valuable Resources for District Technology Leaders
The K-12 Generative AI Readiness Checklist is another example of the immense value provided by organizations like CoSN, the Council of the Great City Schools, and the others who contributed to these efforts. It may be a lot for any one person to handle, but we’re all that much better when we work together.
Subscribe to EdTech Evolved today to stay connected to everything that’s happening in the world of Gen AI, EdTech, and the K-12 landscape. We’ll continue to monitor for updated policy guidance, legislation and best practices from these and other trusted sources in the months to come, helping you turn information into action at every step along the way.