Project Name: Project 1
Implementation of a project. This is a dummy file.
Author: Satish Kumar Keshri, Nazreen Shah, Ranjitha Prasad, IIIT - Delhi
Under review, 2024
The holy grail of machine learning is to enable AI systems to learn continuously and adapt to changing environments. Continual Federated Learning (CFL) enhances the efficiency, privacy, and scalability of federated learning systems while simultaneously learning new tasks and preventing catastrophic forgetting of previous tasks. The primary challenge of CFL is global catastrophic forgetting, where the accuracy of the global model trained on new tasks declines on the old tasks. In this work, we propose a novel aggregation strategy for memory-based CFL and provide a convergence analysis focused on the factors that degrade the performance of CFL over $T$ communication rounds, such as client drift, bias, and forgetting. We show that the proposed CFL framework converges at a rate of $\mathcal{O}(1/\sqrt{T})$ on the current task while circumventing the effects of bias and global catastrophic forgetting. We provide empirical evidence that the proposed technique outperforms several baselines with respect to metrics such as accuracy and forgetting.
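To make the memory-based CFL setting above concrete, here is a minimal sketch of a round of federated training with client-side rehearsal. This is an illustration only, assuming a plain FedAvg server and a naive replay buffer: the paper's actual aggregation rule and memory policy are not reproduced, and all names (`fedavg`, `client_step`) and the toy linear-regression model are hypothetical.

```python
import random

def fedavg(client_weights, client_sizes):
    """Server aggregation: sample-count-weighted average of client models.

    This is standard FedAvg, used here as a stand-in; the paper proposes
    a different, novel aggregation strategy.
    """
    total = sum(client_sizes)
    dim = len(client_weights[0])
    return [sum(w[j] * n for w, n in zip(client_weights, client_sizes)) / total
            for j in range(dim)]

def client_step(w, task_data, memory, lr=0.05):
    """One local SGD pass on current-task data mixed with replayed
    old-task samples (rehearsal), on a toy 1-D linear model y = w0*x + w1.

    Mixing in stored examples from earlier tasks is one simple way a
    memory-based method can counter catastrophic forgetting.
    """
    batch = list(task_data)
    if memory:
        batch += random.sample(memory, min(len(memory), len(task_data)))
    for x, y in batch:
        err = (w[0] * x + w[1]) - y          # prediction error
        w = [w[0] - lr * err * x, w[1] - lr * err]
    return w

# Hypothetical usage: two clients train locally, then the server aggregates.
if __name__ == "__main__":
    global_w = [0.0, 0.0]
    memory = [(1.0, 2.0)]                    # stored sample from an old task
    local_a = client_step(global_w, [(2.0, 4.0)], memory)
    local_b = client_step(global_w, [(3.0, 6.0)], memory)
    global_w = fedavg([local_a, local_b], [2, 2])
```

Over $T$ such communication rounds, the analysis in the paper tracks how client drift, bias, and forgetting affect convergence of the aggregated model.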