Sparse signal reconstruction via recurrent neural networks with hyperbolic tangent function.
Academic Article
Abstract
In this paper, several recurrent neural networks (RNNs) for solving the L1-minimization problem are proposed. First, a one-layer RNN based on the hyperbolic tangent function and a projection matrix is designed, and its stability and global convergence are proved by the Lyapunov method. The sliding mode control technique is then incorporated into this RNN to design a finite-time RNN (FTRNN). Under the condition that the projection matrix satisfies the Restricted Isometry Property (RIP), a suitable Lyapunov function is constructed to prove that the FTRNN is stable in the Lyapunov sense and converges in finite time. Finally, the proposed RNN and FTRNN are compared with existing RNNs through experiments on sparse signal reconstruction and image reconstruction; the results demonstrate their effectiveness and superior performance.
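The abstract does not give the paper's exact network dynamics, so the following is only a minimal sketch of the general idea: a projected flow for basis pursuit (minimize ||x||_1 subject to Ax = b) in which tanh(beta*x) smoothly approximates the L1 subgradient sign(x), integrated by forward Euler in place of the continuous-time RNN. The function name l1_rnn_reconstruct and the parameters beta, dt, and steps are illustrative assumptions, not the authors' design.

```python
import numpy as np

def l1_rnn_reconstruct(A, b, beta=100.0, dt=0.01, steps=5000):
    """Sketch of a tanh-based RNN flow for min ||x||_1 s.t. Ax = b (assumed form)."""
    # P projects onto the null space of A, so the state stays feasible (Ax = b).
    P = np.eye(A.shape[1]) - A.T @ np.linalg.solve(A @ A.T, A)
    # Feasible starting point: the minimum-l2-norm solution of Ax = b.
    x = A.T @ np.linalg.solve(A @ A.T, b)
    for _ in range(steps):
        # Forward-Euler step of the dynamics dx/dt = -P tanh(beta * x), where
        # tanh(beta * x) is a smooth surrogate for sign(x), the L1 subgradient.
        x = x - dt * (P @ np.tanh(beta * x))
    return x

# Hypothetical usage: recover a 5-sparse signal from 50 random measurements.
rng = np.random.default_rng(0)
m, n, k = 50, 200, 5
A = rng.standard_normal((m, n)) / np.sqrt(m)  # Gaussian matrices satisfy RIP w.h.p.
x_true = np.zeros(n)
x_true[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
b = A @ x_true
x_hat = l1_rnn_reconstruct(A, b)
print("reconstruction error:", np.linalg.norm(x_hat - x_true))
```

Because P maps every update into the null space of A, the measurement constraint Ax = b holds at every step, and the flow descends the smoothed L1 objective; larger beta tightens the approximation to sign(x) but requires a smaller step size dt for stable integration.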