Partially Adaptive Multichannel Joint Reduction of Ego-Noise and Environmental Noise

IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pages 1-5, doi: 10.1109/ICASSP49357.2023.10096344, Jun 2023
Human-robot interaction relies on a noise-robust audio processing module capable of estimating target speech from audio recordings impacted by environmental noise as well as self-induced noise, the so-called ego-noise. While external ambient noise sources vary from environment to environment, ego-noise is mainly caused by the internal motors and joints of a robot. Ego-noise and environmental noise reduction are often decoupled, i.e., ego-noise reduction is performed without considering environmental noise. Recently, a variational autoencoder (VAE)-based speech model has been combined with a fully adaptive non-negative matrix factorization (NMF) noise model to recover clean speech under different environmental noise disturbances. However, its enhancement performance is limited in adverse acoustic scenarios involving, e.g., ego-noise. In this paper, we propose a multichannel partially adaptive scheme to jointly model ego-noise and environmental noise utilizing the VAE-NMF framework, where we take advantage of the spatially and spectrally structured characteristics of ego-noise by pre-training the ego-noise model, while retaining the ability to adapt to unknown environmental noise. Experimental results show that our proposed approach outperforms methods based on a completely fixed scheme and a fully adaptive scheme when ego-noise and environmental noise are present simultaneously.
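The core idea of the partially adaptive noise model can be illustrated with a minimal single-channel NMF sketch: the ego-noise dictionary is pre-trained and held fixed, while an environmental-noise dictionary and all activations are updated on the noisy recording. This is not the authors' implementation; the VAE speech model, the multichannel spatial model, and the dictionary sizes (`n_env`, iteration count) are omitted or assumed here for brevity.

```python
import numpy as np

def fit_partially_adaptive_nmf(V, W_ego, n_env=8, n_iter=100, eps=1e-10, seed=0):
    """Itakura-Saito-style multiplicative updates with a partly fixed dictionary.

    V     : (F, T) noisy power spectrogram (frequency bins x time frames)
    W_ego : (F, K_ego) pre-trained ego-noise basis, held fixed
    n_env : number of adaptive environmental-noise basis vectors (assumed value)
    """
    rng = np.random.default_rng(seed)
    F, T = V.shape
    K_ego = W_ego.shape[1]
    W_env = rng.uniform(0.1, 1.0, size=(F, n_env))      # adaptive environmental basis
    H = rng.uniform(0.1, 1.0, size=(K_ego + n_env, T))  # all activations are adapted

    for _ in range(n_iter):
        W = np.concatenate([W_ego, W_env], axis=1)      # full dictionary: fixed + adaptive
        V_hat = W @ H + eps                              # current noise PSD estimate
        # Multiplicative update of the activations (Itakura-Saito divergence)
        H *= (W.T @ (V / V_hat**2)) / (W.T @ (1.0 / V_hat) + eps)
        V_hat = W @ H + eps
        # Update only the environmental basis; the ego-noise basis stays fixed
        H_env = H[K_ego:]
        W_env *= ((V / V_hat**2) @ H_env.T) / ((1.0 / V_hat) @ H_env.T + eps)

    return W_env, H
```

Keeping `W_ego` fixed exploits the structured, repeatable nature of ego-noise learned offline, while the adaptive `W_env` and activations absorb whatever environmental noise is present at test time.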


@InProceedings{FWTWG23,
  author    = {Fang, Huajian and Wittmer, Niklas and Twiefel, Johannes and Wermter, Stefan and Gerkmann, Timo},
  title     = {Partially Adaptive Multichannel Joint Reduction of Ego-Noise and Environmental Noise},
  booktitle = {IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)},
  pages     = {1-5},
  year      = {2023},
  month     = {Jun},
  doi       = {10.1109/ICASSP49357.2023.10096344},
}