# Reconstruction algorithm

*From "A minimally invasive lens-free computational microendoscope"*

## Procedure

To reconstruct the image of the object from the object's system response, we used a reconstruction framework that focuses on local image structures. A popular model for quantifying local image information is sparsity in an appropriate domain. Given a patch (block of pixels) $z$ extracted at a random location from the image of the object, its coefficient vector $\alpha$ under some sparsifying transform $\tilde{\Psi}(\cdot)$, defined by $\alpha = \tilde{\Psi}(z)$, should be sparse or compressible.
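To make the sparsity model concrete, here is a minimal sketch assuming a one-level orthonormal 2D Haar transform as the sparsifying transform $\tilde{\Psi}$ (the article does not specify which transform is used): the coefficients of a smooth patch concentrate in a few entries.

```python
import numpy as np

def haar2d(z):
    """One level of the orthonormal 2D Haar transform (a simple sparsifying transform)."""
    lo = (z[0::2, :] + z[1::2, :]) / np.sqrt(2)   # row-wise lowpass
    hi = (z[0::2, :] - z[1::2, :]) / np.sqrt(2)   # row-wise highpass
    rows = np.vstack([lo, hi])
    lo2 = (rows[:, 0::2] + rows[:, 1::2]) / np.sqrt(2)  # column-wise lowpass
    hi2 = (rows[:, 0::2] - rows[:, 1::2]) / np.sqrt(2)  # column-wise highpass
    return np.hstack([lo2, hi2])

# a smooth (here constant) 8x8 patch: its Haar coefficients are highly sparse
z = np.ones((8, 8))
alpha = haar2d(z)   # only the 4x4 lowpass block is nonzero: 16 of 64 coefficients
```

Because the transform is orthonormal, the coefficients preserve the patch's energy while concentrating it in far fewer entries.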

The reconstruction process estimates the sparse coefficient sets of a patch set covering the entire image of interest, such that the result is consistent with the object's system response. In particular, let $\{z_k\}$ be a patch set extracted from the original image $x$; the image of the object can then be represented by its patches as $x = P(\{z_k\})$, where $P(\cdot)$ is an operator that combines the patch set to recover the original image. Denoting by $\{\alpha_k\}$ the coefficients of the patches $\{z_k\}$ and by $\Psi(\cdot)$ the inverse sparsifying transform of $\tilde{\Psi}(\cdot)$, satisfying $z_k = \Psi(\alpha_k)$ for all $k$, the sensing process can be written as $y = A(P(\{\Psi(\alpha_k)\}))$.
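Patch extraction and the combination operator $P(\cdot)$ can be sketched as follows. Averaging overlapping patches is one common choice for $P$; it is used here as an assumption, since the article does not define the operator explicitly.

```python
import numpy as np

def extract_patches(x, p, stride):
    """{z_k}: all p x p patches of x taken on a regular grid."""
    return [x[i:i + p, j:j + p]
            for i in range(0, x.shape[0] - p + 1, stride)
            for j in range(0, x.shape[1] - p + 1, stride)]

def combine_patches(patches, shape, p, stride):
    """P(.): recombine the patch set into a full image by averaging overlaps."""
    acc = np.zeros(shape)
    cnt = np.zeros(shape)
    k = 0
    for i in range(0, shape[0] - p + 1, stride):
        for j in range(0, shape[1] - p + 1, stride):
            acc[i:i + p, j:j + p] += patches[k]
            cnt[i:i + p, j:j + p] += 1
            k += 1
    return acc / cnt   # each pixel is the average of the patches covering it
```

With this choice, combining the exact patches of an image reproduces the image, i.e. $x = P(\{z_k\})$ holds whenever the grid covers every pixel.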

We propose to obtain the sparse coefficients from the following optimization problem, which balances fidelity to the measurements against sparsity of the patch coefficients:

$$\{\hat{\alpha}_k\} = \arg\min_{\{\alpha_k\}} \left\| y - A\big(P(\{\Psi(\alpha_k)\})\big) \right\|_2^2 + \lambda \sum_k \|\alpha_k\|_0,$$

where $\lambda > 0$ controls the sparsity level.

This optimization problem can be solved efficiently by an iterative alternating minimization procedure. At iteration $t$ of the algorithm, a noisy estimate $x^{(t)}$ of the original image, consistent with the object's system response, is reconstructed on the basis of the information from the previous iteration. The estimates of the sparse coefficients $\{\alpha_k^{(t)}\}$ at this iteration are then found by thresholding the coefficients of the noisy patches $\{z_k^{(t)}\}$ extracted from $x^{(t)}$. The error between the true measurements and the sparsified reconstruction with the known coded aperture is used to generate the next image estimate $x^{(t+1)}$. The algorithm stops when a maximum number of iterations is reached or the inconsistency between the estimate and the measurements is sufficiently small.
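The estimate-threshold-feedback cycle above can be sketched with iterative hard thresholding. This is a simplified stand-in, not the authors' exact implementation: a random Gaussian matrix plays the role of the coded-aperture system $A$, and for brevity the sparsifying transform is the identity applied to the whole flattened image rather than per patch.

```python
import numpy as np

rng = np.random.default_rng(0)

# hypothetical stand-in for the coded-aperture system: a random sensing matrix A
m, n = 48, 64                      # number of measurements, image pixels (flattened)
A = rng.standard_normal((m, n)) / np.sqrt(m)

# ground-truth sparse image (identity sparsifying transform for illustration)
x_true = np.zeros(n)
x_true[[3, 17, 40]] = [1.0, -2.0, 1.5]
y = A @ x_true                     # the object's system response

def reconstruct(y, A, n_nonzero=3, iters=500):
    """Iterative hard thresholding: alternate a data-consistency step with sparsification."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # conservative gradient step size
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        # noisy estimate consistent with the measurements (gradient step on the data term)
        x = x + step * A.T @ (y - A @ x)
        # thresholding: keep only the largest-magnitude coefficients
        keep = np.argsort(np.abs(x))[-n_nonzero:]
        sparse = np.zeros_like(x)
        sparse[keep] = x[keep]
        x = sparse
    return x

x_hat = reconstruct(y, A)
```

In a full patch-based version, the thresholding step would act on the coefficients $\tilde{\Psi}(z_k^{(t)})$ of each extracted patch, and the sparsified patches would be recombined through $P(\cdot)$ before the next data-consistency step.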
