Transfer Learning for Semi-Supervised Automatic Modulation Classification in ZF-MIMO Systems
- Wang, Y., Gui, G., Gacanin, H., Ohtsuki, T., Sari, H., Adachi, F.
- IEEE Journal on Emerging and Selected Topics in Circuits and Systems
- Jun. 2020
Abstract: Automatic modulation classification (AMC) is an essential technology for non-cooperative communication systems and is widely applied in various communication scenarios. In recent years, deep learning (DL) has been introduced into AMC owing to its outstanding classification performance. However, previously proposed DL-based AMC algorithms are almost impossible to implement without a large number of labeled samples, whereas realistic communication scenarios generally offer few labeled samples and many unlabeled samples. In this paper, we propose a transfer learning (TL)-based semi-supervised AMC (TL-AMC) for a zero-forcing-aided multiple-input multiple-output (ZF-MIMO) system. TL-AMC has a novel deep reconstruction and classification network (DRCN) structure that consists of a convolutional auto-encoder (CAE) and a convolutional neural network (CNN). Unlabeled samples flow through the CAE for modulation signal reconstruction, while labeled samples are fed into the CNN for AMC. Knowledge is transferred from the encoder layers of the CAE to the feature layers of the CNN by sharing their weights, in order to avoid ineffective feature extraction by the CNN under limited labeled samples. Simulation results demonstrate the effectiveness of TL-AMC. In particular, TL-AMC outperforms CNN-based AMC under limited labeled samples. Moreover, compared with CNN-based AMC trained on massive labeled samples, TL-AMC achieves similar classification accuracy in the relatively high SNR regime.
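The core transfer mechanism described above — a reconstruction branch and a classification branch sharing the same encoder weights — can be illustrated with a minimal numpy sketch. This is not the authors' DRCN implementation (which uses convolutional layers); all names, dimensions, and the dense encoder here are illustrative assumptions, kept only to show how an update driven by unlabeled reconstruction data also changes the classifier's feature layer.

```python
import numpy as np

# Hedged sketch of the shared-encoder idea behind DRCN (illustrative only):
# one encoder weight matrix W_enc feeds two heads:
#   - a decoder head W_dec that reconstructs unlabeled signals (CAE branch)
#   - a softmax head W_cls that classifies labeled signals (CNN branch)
rng = np.random.default_rng(0)
d_in, d_feat, n_classes = 8, 4, 3          # toy dimensions (assumptions)

W_enc = 0.1 * rng.standard_normal((d_in, d_feat))       # shared encoder
W_dec = 0.1 * rng.standard_normal((d_feat, d_in))       # reconstruction head
W_cls = 0.1 * rng.standard_normal((d_feat, n_classes))  # classification head

def encode(x):
    # Shared feature layer used by BOTH branches.
    return np.tanh(x @ W_enc)

def reconstruct(x):
    # CAE branch: unlabeled samples -> reconstructed signal.
    return encode(x) @ W_dec

def classify(x):
    # CNN branch: labeled samples -> class probabilities.
    z = encode(x) @ W_cls
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

x_unlabeled = rng.standard_normal((5, d_in))
x_labeled = rng.standard_normal((2, d_in))

# Because both branches reference the SAME W_enc array, any update to it
# from the unlabeled reconstruction loss also moves the classifier's
# features -- this is the weight-sharing transfer the abstract describes.
feats_before = encode(x_labeled).copy()
W_enc += 0.01 * rng.standard_normal(W_enc.shape)  # stand-in for a CAE gradient step
feats_after = encode(x_labeled)                   # classifier features have shifted
```

In a real framework the same effect is obtained by pointing both branches at one encoder module, so gradients from the unsupervised reconstruction loss update the weights the classifier reads.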
- Copyright © by IEEE
- © 2020 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.