Belief Propagation Algorithms on Noisy Hardware

The wide recognition that emerging nano-devices will be inherently unreliable motivates both the evaluation of information processing algorithms running on noisy hardware and the design of robust schemes that perform reliably under hardware errors of varied characteristics. In this paper, we investigate the performance of a popular statistical inference algorithm, belief propagation (BP) on probabilistic graphical models, implemented on noisy hardware, and we propose two robust implementations of the BP algorithm targeting different computation noise distributions. We assume that the BP messages are subject to zero-mean transient additive computation noise, and we focus on graphical models satisfying the contraction mapping condition that guarantees the convergence of noise-free BP. We first upper bound the distance between the noisy BP messages and the fixed point of (noise-free) BP as a function of the iteration number. Next, we propose two implementations of BP, namely, censoring BP and averaging BP, that are robust to computation noise. Censoring BP rejects incorrect computations to keep the algorithm on track to convergence, while averaging BP averages the messages from all iterations to date to mitigate the effects of computation noise. Censoring BP is effective when the computation noise is exactly zero with high probability, whereas averaging BP, though it incurs slightly more overhead, is effective for general zero-mean computation noise distributions. We derive sufficient conditions for the convergence of censoring BP and averaging BP. Simulations on the Ising model demonstrate that both proposed implementations converge to the fixed point reached by noise-free BP. Additionally, we apply averaging BP to a BP-based image denoising algorithm and to a BP decoder for LDPC codes.
In the image denoising application, averaging BP successfully denoises an image even when nominal BP fails to do so in the presence of computation noise. In the BP LDPC decoder application, the power of averaging BP is manifested in reduced residual error rates compared with the nominal BP decoder.
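The two robust schemes can be illustrated on a toy model. The sketch below runs sum-product BP on a small binary Ising chain and injects transient additive noise into message updates. Several details here are illustrative assumptions, not the paper's exact constructions: the noise is modeled as a Bernoulli-gated Gaussian perturbation (zero with high probability, matching the regime where censoring BP applies), "censoring" is realized as a recompute-and-compare rule that rejects an update when two independent computations disagree, and "averaging" keeps a running mean of the messages over all iterations to date, as the abstract describes.

```python
import numpy as np

def bp_ising_chain(h, J, T, sigma=0.0, p_noise=0.0, mode="nominal", seed=0):
    """Sum-product BP on a binary Ising chain with fields h and coupling J.

    Illustrative noise model (an assumption, not the paper's exact setup):
    with probability p_noise an update suffers additive Gaussian noise of
    std sigma; otherwise it is computed exactly.

    mode = "nominal"   : plain (noisy) BP
         = "averaging" : running average of messages over iterations
         = "censoring" : recompute each update, reject on disagreement
    """
    rng = np.random.default_rng(seed)
    n = len(h)
    s = np.array([-1.0, 1.0])                       # spin states
    psi = np.exp(J * np.outer(s, s))                # pairwise potential
    edges = [(i, i + 1) for i in range(n - 1)] + [(i + 1, i) for i in range(n - 1)]
    msgs = {e: np.full(2, 0.5) for e in edges}
    avg = {e: np.full(2, 0.5) for e in edges}

    def update(i, j, base):
        # m_{i->j}(x_j) = sum_{x_i} psi(x_i, x_j) phi_i(x_i) prod_k m_{k->i}(x_i)
        prod = np.exp(h[i] * s)
        for (k, l) in edges:
            if l == i and k != j:
                prod = prod * base[(k, l)]
        m = psi.T @ prod
        if rng.random() < p_noise:                  # transient computation noise
            m = m + sigma * rng.standard_normal(2)
        m = np.clip(m, 1e-12, None)
        return m / m.sum()

    for t in range(1, T + 1):
        base = avg if mode == "averaging" else msgs
        new = {}
        for e in edges:
            m = update(*e, base)
            if mode == "censoring":
                m2 = update(*e, base)               # independent recomputation
                if np.abs(m - m2).max() > 1e-6:     # disagreement => noise hit
                    m = msgs[e]                     # reject, keep old message
            new[e] = m
        msgs = new
        for e in edges:
            avg[e] = avg[e] + (msgs[e] - avg[e]) / t  # running mean of messages

    base = avg if mode == "averaging" else msgs
    beliefs = np.empty((n, 2))
    for i in range(n):
        b = np.exp(h[i] * s)
        for (k, l) in edges:
            if l == i:
                b = b * base[(k, l)]
        beliefs[i] = b / b.sum()
    return beliefs
```

Because the chain is a tree and the update is a contraction for moderate |J|, noise-free BP converges to its fixed point; the censoring variant rejects noisy updates and reaches the same fixed point, while the averaging variant approaches it in the Cesàro sense at a roughly 1/t rate.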