Paper Title
FedOS: using open-set learning to stabilize training in federated learning
Paper Authors
Paper Abstract
Federated Learning is a recent approach to training statistical models on distributed datasets without violating privacy constraints. The data locality principle is preserved by sharing the model, rather than the data, between clients and the server. This brings many advantages but also poses new challenges. In this report, we explore this new research area and perform several experiments to deepen our understanding of these challenges and of how different problem settings affect the performance of the final model. Finally, we present a novel approach to one of these challenges and compare it with other methods found in the literature.
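To make the model-sharing principle concrete, below is a minimal FedAvg-style sketch, not taken from the paper: each client runs a few gradient steps on its private data and returns only its updated weights, which the server averages. All function names and the synthetic linear-regression task are illustrative assumptions.

```python
# Minimal federated-averaging sketch: clients share model parameters,
# never raw data. Names and the toy task are hypothetical, for
# illustration only.
import numpy as np

rng = np.random.default_rng(0)

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One client's local training: a few gradient-descent steps of
    least-squares linear regression on its private (X, y)."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)  # MSE gradient
        w -= lr * grad
    return w

def fedavg_round(global_w, client_data):
    """Server round: broadcast the global weights, collect each
    client's locally updated weights, and average them weighted by
    local dataset size."""
    updates, sizes = [], []
    for X, y in client_data:
        updates.append(local_update(global_w, X, y))
        sizes.append(len(y))
    return np.average(updates, axis=0, weights=np.array(sizes, float))

# Synthetic example: three clients whose private data all follow the
# same linear model y = X @ [2, -1] plus small noise.
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(3):
    X = rng.normal(size=(50, 2))
    clients.append((X, X @ true_w + 0.01 * rng.normal(size=50)))

w = np.zeros(2)
for _ in range(20):
    w = fedavg_round(w, clients)
print("recovered weights:", w)  # converges toward [2, -1]
```

In this sketch only the weight vectors cross the client-server boundary, which is the data-locality property the abstract refers to; the challenges the paper studies (e.g., heterogeneous client data) arise when the clients' local datasets are not identically distributed as they are here.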