Please use this identifier to cite or link to this item: http://theses.ncl.ac.uk/jspui/handle/10443/6688
Title: Scalable and continuous federated learning on edge-cloud computing
Authors: Sun, Rui
Issue Date: 2025
Publisher: Newcastle University
Abstract: A critical challenge in developing intelligent systems that operate efficiently in distributed and dynamic environments is ensuring their ability to continually learn from new data while maintaining data privacy. Federated Learning (FL) offers a decentralized solution that enables intelligent systems to collaboratively adapt to new information across diverse environments without the need to centralize data. However, existing FL systems face significant challenges in scalability, efficiency, and the ability to retain and integrate new knowledge, particularly when dealing with varied hardware, network conditions, and evolving data streams. Relying on traditional, centralized methods for data aggregation and model training can lead to bottlenecks, reduced operational efficiency, and increased vulnerability to single points of failure. This thesis addresses these challenges by proposing enhancements to FL systems that not only preserve privacy but also improve scalability, adaptability, continual learning capability, and robustness, making them better suited to real-world deployment.

To address hardware heterogeneity in FL, this thesis introduces FedMSA, a hardware-adaptive model selection method that balances performance and efficiency using hardware-aware Neural Architecture Search (HW-NAS). This approach simplifies model selection and adapts architectures dynamically, catering to the diverse hardware in large-scale FL deployments. Building on this, FedDAS is introduced to enhance the robustness of FL through runtime system monitoring and diagnostics, detecting issues such as device failures or network delays and providing real-time adaptation suggestions. This system ensures FL applications remain stable and efficient in dynamic environments by combining automated adaptation with human-in-the-loop diagnostics.

In FL systems, continual learning is crucial because clients generate new data that must be incorporated without forgetting previous knowledge. To address this, the thesis introduces the Dual Variational Knowledge Attention (DVKA) mechanism, which balances the retention of old knowledge with the integration of new information, mitigating catastrophic forgetting and ensuring robust learning in FL contexts that require continual adaptation. Further advancing Federated Continual Learning (FCL), the thesis presents ECoral, a novel approach that improves the effectiveness of replay-based methods in class-incremental learning by condensing exemplars and enhancing knowledge retention in non-IID environments. Finally, the thesis extends continual learning principles into the federated domain with RefFil, a rehearsal-free Federated Domain-Incremental Learning (FDIL) framework. RefFil uses global prompt-sharing and domain-specific contrastive learning to tackle domain shifts in FL, ensuring robust generalization across diverse and evolving data distributions, which is crucial for performance in resource-constrained and privacy-sensitive environments.

These contributions collectively advance FL systems by combining adaptive model selection, robust continual learning strategies, and efficient data management, offering a comprehensive solution for deploying scalable, privacy-preserving machine learning in complex, distributed environments.
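For orientation, the sketch below shows the FedAvg-style weighted aggregation step that standard FL systems, including the ones summarised above, build on: each client trains locally and the server averages the returned parameters weighted by local dataset size. This is a minimal illustration under that assumption only; the function and variable names are placeholders and are not taken from the thesis or its code.

# Minimal sketch of FedAvg-style weighted aggregation (illustrative only;
# names are hypothetical and not from the thesis).
from typing import Dict, List

import numpy as np


def aggregate_fedavg(
    client_weights: List[Dict[str, np.ndarray]],
    client_sizes: List[int],
) -> Dict[str, np.ndarray]:
    """Average client model parameters, weighted by local dataset size."""
    total = float(sum(client_sizes))
    global_weights: Dict[str, np.ndarray] = {}
    for name in client_weights[0]:
        # Weighted sum of each parameter tensor across clients.
        global_weights[name] = sum(
            w[name] * (n / total) for w, n in zip(client_weights, client_sizes)
        )
    return global_weights


# Example: two clients, each holding a one-layer "model" with two parameters.
clients = [
    {"layer.weight": np.array([1.0, 2.0])},
    {"layer.weight": np.array([3.0, 4.0])},
]
print(aggregate_fedavg(clients, client_sizes=[100, 300]))
# -> {'layer.weight': array([2.5, 3.5])}

The contributions in the thesis extend this basic setting with hardware-aware model selection, runtime diagnostics, and continual-learning mechanisms, which are not reproduced here.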
Description: PhD Thesis
URI: http://hdl.handle.net/10443/6688
Appears in Collections:School of Computing

Files in This Item:
File               Description  Size      Format
Sun R 2025.pdf     Thesis       12.47 MB  Adobe PDF
dspacelicence.pdf  Licence      43.82 kB  Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.