Institute for Communication Technologies and Embedded Systems

Federated Learning for Automatic Modulation Classification under Class Imbalance and Varying Noise Condition

Authors:
Wang, Y., Gui, G., Gacanin, H., Adebisi, B., Sari, H., Adachi, F.
Journal:
IEEE Transactions on Cognitive Communications and Networking
Publisher:
IEEE
Page(s):
1-1
Date:
Jun. 2021
DOI:
10.1109/TCCN.2021.3089738
hsb:
RWTH-2021-06964
Language:
English
Abstract:
Automatic modulation classification (AMC) is a promising technology for identifying modulation types, and deep learning (DL)-based AMC is one of its main research directions. Conventional DL-based AMC methods are centralized solutions (i.e., CentAMC): they are trained on abundant data collected from local clients and stored on the server, and they generally achieve strong performance, but their major drawback is the risk of data leakage. Conversely, if a DL-based AMC model is trained only on the data of its own client, it may perform poorly. Thus, a federated learning (FL)-based AMC (FedeAMC) is proposed under conditions of class imbalance and varying noise. Its advantage is a low risk of data leakage without severe performance loss, because data and training remain at each local client, and only knowledge (i.e., gradients or model weights), rather than data, is shared with the server. In addition, each local client generally suffers from a class imbalance problem, so balanced cross entropy is introduced as the loss function to address it. Simulation results demonstrate that the average accuracy gap between FedeAMC and CentAMC is less than 2%.
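The two mechanisms the abstract names can be sketched briefly: a class-balanced cross-entropy loss (per-class weights inversely proportional to class frequency) and FedAvg-style aggregation, where the server combines client model weights proportionally to local dataset size. This is a minimal NumPy sketch under those common conventions; the function names and the exact weighting scheme are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def balanced_ce_weights(class_counts):
    # Assumed convention: inverse-frequency weights, normalized so the
    # per-class weights average to 1 (the paper may normalize differently).
    counts = np.asarray(class_counts, dtype=float)
    inv = 1.0 / counts
    return inv * len(counts) / inv.sum()

def balanced_cross_entropy(probs, labels, class_weights):
    # Weighted (balanced) cross-entropy over a batch.
    # probs: (N, C) predicted class probabilities; labels: (N,) int labels.
    eps = 1e-12  # guard against log(0)
    picked = probs[np.arange(len(labels)), labels]
    w = class_weights[labels]
    return float(np.mean(-w * np.log(picked + eps)))

def fedavg(client_weights, client_sizes):
    # FedAvg aggregation: average client weight vectors, weighted by
    # local dataset size, so only model parameters reach the server.
    sizes = np.asarray(client_sizes, dtype=float)
    coeffs = sizes / sizes.sum()
    return sum(c * w for c, w in zip(coeffs, client_weights))
```

For instance, with class counts [10, 30] the minority class receives weight 1.5 and the majority class 0.5, so misclassifying rare classes costs more during local training.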