Smart Schools Ethical AI
- Karen Walstra

- Mar 26

A smart school's AI use, and the ethics it builds around that use, result in a successful AI strategy.
If the AI strategy focuses on technological competency before technology usage, the impact and application are more effective.
Achieving this desired outcome requires a coordinated effort between the school’s informed leadership, a dedicated school AI User Group, staff empowerment and buy-in, learner upliftment and skill development, and a curriculum that prioritises human agency over machine output (AI responses).
As Artificial Intelligence (AI) moves towards being a fundamental classroom tool, South African educational institutions face the dual challenge of harnessing innovation while maintaining rigorous ethical and legal standards.
A smart school's ethical AI adoption explores the responsible integration of AI, grounded in the Protection of Personal Information Act (POPIA) and the Children’s Act 38 of 2005.
By adopting a "Competency Before Technology" approach, schools can mitigate risks, such as data breaches and academic dishonesty, while empowering educators and learners to use AI ethically to solve local problems.
Highlighting and promoting a "Human-in-the-Loop" philosophy within schools, where AI serves as a "collaborative partner" rather than a substitute for human intellect, is extremely important in achieving true ethical and responsible AI use in schools.
Informed School Leadership: Prioritising Human Agency

Informed school leadership operating under the "Competency Before Technology" philosophy doesn't just buy the latest gadgets; it focuses on the human skills required to navigate a world where those gadgets exist. In this model, AI is treated as a support tool to enhance human thinking, not a replacement for it. Such leaders:
ensure the school's culture prioritises pedagogical intent;
drive a human-centric policy, where human agency is the priority;
create and work with a collaborative team to manage the transition;
ensure that AI requirements do not become a new barrier for disadvantaged learners and that there is equitable AI access for the entire school community.
The Information Officer: Navigating POPIA and AI Privacy and Security

The role of the School’s Information Officer (IO) is to navigate the complexities of processing data and information by automated means, ensuring learner and staff data privacy, and preventing unauthorised cross-border data transfers. By law, the Principal (public school) or Head of School (private school) is the default IO and is legally liable for data misuse, although some schools appoint a dedicated Information Officer.
Automated Means Scrutiny: AI is considered "processing by automated means" under POPIA, requiring high levels of scrutiny for privacy compliance.
POPIA Compliance: The IO must ensure student data is not used to train public AI models and that any cross-border data transfer meets strict security standards.
Risk Mitigation: Conduct "Personal Information Impact Assessments" (PIIAs) for every AI tool before it is added to the school’s approved list.
Automated Decision Protection: Section 71 of POPIA protects individuals from "high-stakes" decisions (like final grades) made solely by an AI without human oversight.
The Legal Framework (POPIA & Children's Act): staff must understand that data protection in South Africa is a statutory requirement, not a choice.
The Responsible Party: Under POPIA, the school is the "Responsible Party," and the Principal (or Head of School) is the default Information Officer (IO) who is legally liable for data misuse.
Deputy Information Officer: Principals are encouraged to delegate these responsibilities to other staff members by appointing a Deputy Information Officer, but they generally retain overall responsibility (Boda & Powell, 2024).
Heightened Protection for Children: POPIA provides extra safeguards for children’s data. Any transfer of student work to AI servers outside South Africa must meet strict security standards equivalent to local law.
Mandatory Reporting: According to the Children’s Act 38 of 2005 (Section 110), educators are legally required to report any suspected abuse or violations, which now includes online violations like cyberbullying or sexual grooming discovered via digital tools.
The AI User Group: The Strategic Innovation Hub

The School’s AI User Group is the strategic engine of the school’s AI integration. Its role is to bridge the gap between rapid technological change and the school’s legal and pedagogical responsibilities. It should not be a "policing" body, but rather an innovation hub ensuring that any AI use is safe, ethical, and effective. Its responsibilities include:
policy creation, regular audits, incident management, and vetting of acceptable AI tools to add to the school’s AI tool list;
ensuring professional development for staff is ongoing and relevant
ensuring the safeguarding and ethics of AI use across the school
monitoring and alerting the school community to risks, such as AI-generated deepfakes, cyberbullying, or "predatory bots" that might target learners;
ensuring reporting strategies are in place when incidents occur, in terms of POPIA, the Children's Act, the Bill of Rights (Section 28, Children), and the Protection from Harassment Act;
ensuring the "Human-in-the-Loop" standard and ethical modelling;
ensuring that the school's own use of AI for marketing, report writing, or administration also sets a standard for transparency and honesty.
Educators: Transparency & Human-in-the-Loop Verification

For Educators, this involves a commitment to:
transparency, letting the school, other educators, and learners know how AI was used
only using the school’s vetted AI tools
verifying every output for inherent cultural bias, accuracy and relevance
assessing differently: teachers must add South African contextual information to prompts, as only 2% of global AI training data currently comes from Africa
creating lesson activities that truly develop critical and creative thinking, in every subject
avoiding inputting learners’ personal information into AI tools, being mindful of POPIA, the Children's Act, the Bill of Rights (Section 28, Children), and the Protection from Harassment Act
reporting incidents of misuse or breaches of human rights (using Form 22)
strictly avoiding AI in high-stakes, automated decision-making, such as final grading.
Learners: Proactive Digital Citizenship

For Learners, the focus shifts toward proactive digital citizenship: moving beyond plagiarism prevention to fostering competencies in critical thinking and creativity, using AI for learning, verifying and personalising each AI response, and practising responsible digital use and cyber kindness.
A school’s AI usage and ethics, grounded in POPIA and the Children’s Act 38 of 2005, promote a "competency before technology" approach to mitigate risks like data breaches and academic dishonesty. They champion a "Human-in-the-Loop" philosophy, ensuring AI acts as a collaborative partner that augments human intellect. Achieving truly ethical and responsible AI use requires a coordinated effort from school leadership, the AI User Group, and a curriculum prioritising human agency over machine output.
Resources:
Boda, R., & Powell, A. (2024, February 13). Back to school: POPIA do’s and don’ts. ENSight. https://www.ensafrica.com/news/detail/8111/back-to-school-popia-dos-and-donts-
Department of Justice, South Africa. Legislation: Children's Act 2005 and amendments https://www.justice.gov.za/legislation/acts/2005-038%20childrensact.pdf
South African Government. Children's Act 38 of 2005 https://www.gov.za/documents/childrens-act
Constitutional Court of South Africa. South African Constitution. The Bill of Rights, Section 28 - Children https://www.concourt.org.za/index.php/71-children-s-rights/section-28-children/133-section-28-children
Department of Justice and Constitutional Development. Protection from Harassment Act, 2011 (Act 17 of 2011) https://www.justice.gov.za/forms/form_pha.html#:~:text=The%20Protection%20from%20Harassment%20Act%20(Act%2017,communication%20that%20causes%20harm%20*%20Sexual%20harassment
Department of Health. Knowledge Hub: Form 22 - REPORTING OF ABUSE OR DELIBERATE NEGLECT OF CHILD (Regulation 33) [SECTION 110 OF THE CHILDREN’S ACT 38 OF 2005] https://knowledgehub.health.gov.za/system/files/2023-11/Form%2022-Reporting%20of%20Abuse%20or%20deliberate%20neglect.pdf
Google for Education. Teaching responsible use of AI. Lessons and activities for students. https://services.google.com/fh/files/misc/google_teaching_responsible_ai.pdf
Google Workspace for Education data protection implementation guide. https://services.google.com/fh/files/misc/google_workspace_edu_data_protection_implementation_guide.pdf
Google. Safer digital learning with Google for Education. https://edu.google.com/our-values/privacy-security/
Google. Use Gemini Apps with a work or school Google Account. https://support.google.com/gemini/answer/14620100?co=DASHER._Family%3DEducation&oco=0



