%0 Journal Article
%J IEEE Transactions on Audio, Speech, and Language Processing
%D 2010
%T Plane-Wave Decomposition of Acoustical Scenes Via Spherical and Cylindrical Microphone Arrays
%A Zotkin, Dmitry N.
%A Duraiswami, Ramani
%A Gumerov, Nail A.
%K Acoustic fields
%K acoustic position measurement
%K acoustic signal processing
%K acoustic waves
%K acoustical scene analysis
%K array signal processing
%K circular arrays
%K cylindrical microphone arrays
%K direction-independent acoustic behavior
%K microphone arrays
%K orthogonal basis functions
%K plane-wave decomposition
%K Position measurement
%K signal reconstruction
%K sound field reconstruction
%K sound field representation
%K source localization
%K spatial audio playback
%K spherical harmonics based beamforming algorithm
%K spherical microphone arrays
%X Spherical and cylindrical microphone arrays offer a number of attractive properties, such as direction-independent acoustic behavior and the ability to reconstruct the sound field in the vicinity of the array. Beamforming and scene analysis for such arrays are typically done using a sound field representation in terms of orthogonal basis functions (spherical/cylindrical harmonics). In this paper, an alternative sound field representation in terms of plane waves is described, and a method for estimating it directly from measurements at the microphones is proposed. It is shown that representing a field as a collection of plane waves arriving from various directions simplifies source localization, beamforming, and spatial audio playback. A comparison of the new method with the well-known spherical harmonics based beamforming algorithm is performed, and it is shown that both algorithms can be expressed in the same framework but with weights computed differently. It is also shown that the proposed method can be extended to cylindrical arrays. A number of features important for the design and operation of spherical microphone arrays in real applications are revealed. Results indicate that it is possible to reconstruct the sound scene up to order p with a spherical array of p^2 microphones.
%B IEEE Transactions on Audio, Speech, and Language Processing
%V 18
%P 2 - 16
%8 2010/01//
%@ 1558-7916
%G eng
%N 1
%R 10.1109/TASL.2009.2022000
%0 Conference Paper
%B IEEE Workshop on Applications of Signal Processing to Audio and Acoustics, 2009. WASPAA '09
%D 2009
%T Regularized HRTF fitting using spherical harmonics
%A Zotkin, Dmitry N.
%A Duraiswami, Ramani
%A Gumerov, Nail A.
%K Acoustic applications
%K acoustic field
%K Acoustic fields
%K acoustic intensity measurement
%K Acoustic measurements
%K acoustic signal processing
%K Acoustic testing
%K acoustic waves
%K array signal processing
%K audio acoustics
%K circular arrays
%K computational analysis
%K Ear
%K ear location
%K head-related transfer function
%K Helmholtz reciprocity principle
%K HRTF
%K HRTF fitting
%K Loudspeakers
%K Microphones
%K Position measurement
%K signal reconstruction
%K spatial audio
%K spectral reconstruction
%K spherical harmonics
%K Transfer functions
%X By the Helmholtz reciprocity principle, the head-related transfer function (HRTF) is equivalent to an acoustic field created by a transmitter placed at the ear location. Therefore, it can be represented as a spherical harmonics spectrum - a weighted sum of spherical harmonics. Such representations are useful in theoretical and computational analysis. Many different (often severely undersampled) grids are used for HRTF measurement, making spectral reconstruction difficult. In this paper, two methods of obtaining the spectrum are presented and analyzed on both synthetic (with ground-truth data available) and real HRTF measurements.
%B IEEE Workshop on Applications of Signal Processing to Audio and Acoustics, 2009. WASPAA '09
%I IEEE
%P 257 - 260
%8 2009/10//
%@ 978-1-4244-3678-1
%G eng
%R 10.1109/ASPAA.2009.5346521