GhostRNN: Reducing State Redundancy in RNNs with Cheap Operations
Editor: Jizhi Shutong

Recurrent Neural Networks (RNNs), which can model long-range dependencies, are widely used in various speech tasks such as Keyword Spotting (KWS) and Speech Enhancement (SE). However, due to the power and memory limitations of low-resource devices, efficient RNN models are needed for practical applications. In this paper, the authors propose an efficient …
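The excerpt is cut off before the method itself is described. Purely as a loose illustration of what a "ghost"-style recurrent layer might look like, the sketch below assumes a GhostNet-like scheme in which a small GRU computes an "intrinsic" hidden state and cheap linear operations generate the remaining "ghost" states. The class name `GhostGRU`, the `ratio` parameter, and every other detail here are assumptions for illustration, not the paper's actual design.

```python
# Hypothetical sketch of a "ghost"-style recurrent layer (not the paper's method):
# a small GRU produces an intrinsic state, and a cheap linear map generates
# additional ghost states that are concatenated to reach the full hidden size.
import torch
import torch.nn as nn


class GhostGRU(nn.Module):
    def __init__(self, input_size: int, hidden_size: int, ratio: int = 2):
        super().__init__()
        assert hidden_size % ratio == 0
        self.intrinsic_size = hidden_size // ratio            # states computed by the real GRU
        self.ghost_size = hidden_size - self.intrinsic_size   # states generated cheaply
        self.cell = nn.GRUCell(input_size, self.intrinsic_size)
        # "Cheap operation": a single bias-free linear map from intrinsic to ghost states.
        self.cheap = nn.Linear(self.intrinsic_size, self.ghost_size, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, input_size)
        batch, time, _ = x.shape
        h = x.new_zeros(batch, self.intrinsic_size)
        outputs = []
        for t in range(time):
            h = self.cell(x[:, t], h)                  # intrinsic recurrent update
            ghost = torch.tanh(self.cheap(h))          # ghost states from the cheap op
            outputs.append(torch.cat([h, ghost], dim=-1))
        return torch.stack(outputs, dim=1)             # (batch, time, hidden_size)


if __name__ == "__main__":
    layer = GhostGRU(input_size=40, hidden_size=128, ratio=2)
    feats = torch.randn(8, 100, 40)                    # e.g. 100 frames of 40-dim features
    print(layer(feats).shape)                          # torch.Size([8, 100, 128])
```

Under this assumed scheme, only the intrinsic part of the state goes through the full recurrent computation, so parameter count and multiply-accumulates shrink roughly by the `ratio` factor relative to a plain GRU of the same hidden size.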