Towards Privacy-Preserving Data-Driven Education: The Potential of Federated Learning

📅 2025-03-16
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Educational AI systems face heightened privacy risks due to centralized collection of sensitive student data, and existing privacy-preserving approaches struggle to balance utility and security. Method: This work presents the first systematic empirical validation of federated learning (FL) for learning analytics prediction in educational settings. We integrate differential privacy mechanisms with rigorous adversarial evaluation—including model inversion and membership inference attacks—to establish a privacy-preserving paradigm that jointly optimizes predictive accuracy and robustness. Contribution/Results: Experiments demonstrate that our approach achieves prediction accuracy comparable to centralized training while exhibiting significantly smaller accuracy degradation under diverse adversarial attacks—outperforming baseline methods in privacy–utility trade-off. The framework is reproducible, scalable, and provides a principled technical pathway toward trustworthy, privacy-aware educational AI.

Technology Category

Application Category

📝 Abstract
The increasing adoption of data-driven applications in education such as in learning analytics and AI in education has raised significant privacy and data protection concerns. While these challenges have been widely discussed in previous works, there are still limited practical solutions. Federated learning has recently been discoursed as a promising privacy-preserving technique, yet its application in education remains scarce. This paper presents an experimental evaluation of federated learning for educational data prediction, comparing its performance to traditional non-federated approaches. Our findings indicate that federated learning achieves comparable predictive accuracy. Furthermore, under adversarial attacks, federated learning demonstrates greater resilience compared to non-federated settings. We summarise that our results reinforce the value of federated learning as a potential approach for balancing predictive performance and privacy in educational contexts.
Problem

Research questions and friction points this paper is trying to address.

Addresses privacy concerns in data-driven education applications.
Evaluates federated learning for educational data prediction accuracy.
Compares resilience of federated vs non-federated learning under attacks.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Federated learning for educational data prediction
Comparable accuracy to non-federated methods
Enhanced resilience against adversarial attacks
M
Mohammad Khalil
Centre for the Science of Learning & Technology (SLATE), University of Bergen, Bergen, Norway
Ronas Shakya
Ronas Shakya
Univeristy of Bergen
Qinyi Liu
Qinyi Liu
PhD candidate, University of Bergen
learning analyticsprivacy and securitymachine learning