Aiming at the problem that the background of ground penetrating radar (GPR) images contains heavy noise, making effective signals difficult to extract and recognize, a GPR denoising method based on an improved Markov random field (MRF) combined with the Otsu algorithm is proposed. First, the Otsu algorithm binarizes the GPR image data, separating the foreground from the background. The improved MRF, combined with the iterated conditional modes (ICM) algorithm, is then used to denoise the image, yielding a binary map that contains only the effective signal. This binary map is compared against the noisy image, and median filtering is applied to reduce the remaining noise, producing the optimized GPR image. Experimental results show that the method performs well in both GPR data denoising and effective-signal extraction. Compared with traditional denoising methods and the original MRF, it performs best on standard GPR evaluation measures: it achieves the closest fit in single-trace waveform comparison, a peak signal-to-noise ratio of 52.5281 dB, and a structural similarity of 0.9981. Moreover, combining the binary map with the filtered image makes waveform recognition clearer. The method therefore has practical value in GPR-based detection and flaw-detection engineering projects.
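The pipeline described in the abstract, Otsu binarization, MRF label smoothing via ICM, then median filtering guided by the binary map, can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the Ising-style energy weights (`h`, `beta`), the vectorized synchronous ICM update, and the way the binary map gates the median filter (keeping signal pixels, smoothing the background) are all choices made here for clarity.

```python
import numpy as np
from scipy.ndimage import median_filter


def otsu_threshold(img):
    """Return the Otsu threshold of a uint8 image (maximizes between-class variance)."""
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    probs = hist / img.size
    omega = np.cumsum(probs)                      # class-0 probability up to each level
    mu = np.cumsum(probs * np.arange(256))        # cumulative mean up to each level
    mu_t = mu[-1]                                 # global mean
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1.0 - omega))
    return int(np.argmax(np.nan_to_num(sigma_b)))


def icm_denoise(binary, h=1.0, beta=2.0, n_iter=10):
    """Smooth a binary label field with an Ising-style MRF prior via ICM.

    Labels live in {-1, +1}; each pixel picks the label minimizing a local
    energy -h*x_i*y_i - beta*sum_{j in N(i)} x_i*x_j, where y is the observed
    (Otsu) labeling. This is a synchronous, vectorized variant of ICM.
    """
    y = np.where(binary, 1, -1)
    x = y.copy()
    for _ in range(n_iter):
        nb = np.zeros(x.shape, dtype=float)       # 4-neighbour sum, zero-padded
        nb[1:, :] += x[:-1, :]
        nb[:-1, :] += x[1:, :]
        nb[:, 1:] += x[:, :-1]
        nb[:, :-1] += x[:, 1:]
        x_new = np.where(h * y + beta * nb >= 0, 1, -1)
        if np.array_equal(x_new, x):              # converged: no label changed
            break
        x = x_new
    return x == 1


def denoise_gpr(img):
    """One plausible reading of the pipeline: keep pixels the MRF marks as
    effective signal, median-filter everything labeled as background."""
    mask = icm_denoise(img > otsu_threshold(img))
    smoothed = median_filter(img, size=3)
    return np.where(mask, img, smoothed)
```

On a synthetic image, a bright target region survives the MRF smoothing (its interior and edges have enough like-labeled neighbors), while isolated bright noise pixels are flipped to background by the `beta`-weighted neighborhood term and then replaced by the median-filtered value.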
Wu Xueli, Song Kai, Shi Siyuan, et al. Effective signal extraction method of GPR based on improved MRF[J]. Science Technology and Engineering, 2023, 23(30): 13031-13039.