
Acoustic Vector Network Analyser

This research aims to increase the accuracy and precision with which acoustic materials are characterised by adapting well-known techniques from the radio-frequency domain to the acoustic domain. Two novel methods with differing approaches, one computational and one physical, are developed for determining the reflectivity of materials. The first method is an advancement of the current industry standard, the impedance tube, using additional microphones and error-correction methods. A vector network analyser (VNA) serves as the data acquisition component, with the aim of developing a module that allows the device to measure in the acoustic domain. The multi-microphone method employs a mathematical model of the acoustics within the impedance tube to solve computationally for the unknown acoustic properties. This method is found to contain multiple sources of error at each stage of the process, making reliable calibration and measurement difficult; because the error is non-systematic, the advantage of using a VNA is lost. The second method uses wave superposition to form directional couplers that determine the reflectivity. This method allows a more robust calibration routine, as the error is systematic and can be corrected using intelligent procedures and measurement standards. The measurement accuracy was found to be ±1.5 dB over the frequency range 800 to 2200 Hz. The device is used to characterise the acoustic properties of pasture in order to aid the development of pasture meters. Pasture was found to reflect a relatively small fraction of the incident acoustic energy and to absorb the majority.
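The multi-microphone impedance-tube approach described above builds on the standard two-microphone transfer-function method (cf. ISO 10534-2), in which the complex reflection coefficient of a sample is recovered from the transfer function between two microphones mounted in the tube wall. The sketch below illustrates that underlying computation only; the function name, argument names, and default speed of sound are illustrative assumptions, not the thesis's actual implementation.

```python
import numpy as np

def reflection_coefficient(H12, f, s, l, c=343.0):
    """Estimate the complex reflection coefficient of a sample in an
    impedance tube from the transfer function between two microphones
    (two-microphone transfer-function method; illustrative sketch).

    H12 : complex transfer function p2/p1 at frequency f (Hz),
          where microphone 1 is farther from the sample
    s   : microphone spacing (m)
    l   : distance from microphone 1 to the sample surface (m)
    c   : speed of sound (m/s)
    """
    k = 2.0 * np.pi * f / c        # wavenumber
    H_i = np.exp(-1j * k * s)      # transfer function of the incident wave alone
    H_r = np.exp(1j * k * s)       # transfer function of the reflected wave alone
    # Separate incident and reflected components and refer the result
    # back to the sample surface.
    return (H12 - H_i) / (H_r - H12) * np.exp(2j * k * l)
```

As a sanity check, a rigid termination (perfect reflector) gives a standing-wave pressure field p(x) = e^{jkx} + e^{-jkx}; feeding the corresponding transfer function into the routine recovers a reflection coefficient of 1.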
Pennington, K. (2017). Acoustic Vector Network Analyser (Thesis, Doctor of Philosophy (PhD)). The University of Waikato, Hamilton, New Zealand. Retrieved from https://hdl.handle.net/10289/11530
All items in Research Commons are provided for private study and research purposes and are protected by copyright with all rights reserved unless otherwise indicated.