Image thresholding segmentation based on weighted Parzen-window and linear programming techniques

Image segmentation by thresholding is an important and fundamental task in image processing and computer vision. In this paper, a new bi-level thresholding approach based on weighted Parzen-window and linear programming techniques is proposed for image thresholding segmentation. First, by proposing a weighted Parzen-window to describe the gray level distribution status, we obtain the boundaries of the foreground and background of the image. The image thresholding problem can then be transformed into the problem of solving a linear program for the coefficient values of the weighted Parzen-window. Results on synthetic, NDT and benchmark images indicate that the proposed method achieves higher segmentation accuracy and robustness than some classical thresholding methods, such as the between-class variance method (OTSU) and Kapur's entropy-based method (KSW), and some state-of-the-art methods that consider spatial information, such as CHPSO, the GLLV histogram method and the GABOR histogram method.

www.nature.com/scientificreports/
Xiao proposed two new entropic bi-level thresholding methods. The first employs the gray level spatial correlation (GLSC) histogram 17. In contrast to the 2D histogram, the GLSC histogram is obtained from the gray level of each pixel and the number of its neighbors with similar gray level. The second employs the gray level and gradient magnitude (GLGM) histogram 18. The GLGM histogram captures the occurrence probability and spatial distribution of gray levels at the same time, and thus considers spatial information. Utilizing the orientation histogram of a gradient image to calculate the local edge property, a new bi-level thresholding method employing the 2D-D histogram was proposed by Yimit 19. A new thresholding method based on the GLLV histogram was proposed by Zheng 20, using the gray level information of pixels and their local variance in a neighborhood. A new thresholding method based on a GABOR histogram was proposed by Yi 21. Recently, Xiong et al. proposed a new image thresholding method combining Kapur's entropy with Parzen-window estimation 22.
In general, the improved 2D histogram methods outperform 1D histogram methods. However, the 2D entropic thresholding methods still have some limitations, such as not being generic across image types and lacking robustness or stability.
In this paper, we propose a new bi-level thresholding method based on the boundaries of the foreground and background, using a weighted Parzen-window to describe the gray level distribution status rather than gray level probability density distributions (1D or 2D histograms) of the foreground and background of an image. Image thresholding is thereby transformed into a linear programming problem, which we solve with the simplex method. In the experimental section, the proposed method is compared with classic and state-of-the-art methods to demonstrate its accuracy and robustness. The novel contribution of this study is the construction of a new data distribution description method based on the weighted Parzen-window technique, which can be cast as a linear programming problem. This process is illustrated in Fig. 1.
The rest of this paper is organized as follows. In section "The proposed method", we briefly introduce the Parzen-window technique and present a new bi-level thresholding method based on the weighted Parzen-window and linear programming. In section "Experimental results", the results of the experiments and a discussion are presented. Finally, section "Conclusions" concludes the paper.

The proposed method
Parzen-window technique and its use in image estimation. For a gray image $F = \{f(x, y) \mid x \in \{1, 2, \ldots, m\},\ y \in \{1, 2, \ldots, n\}\}$ of size $m \times n$ with $L$ gray levels, the gray level set is $G = \{0, 1, 2, \ldots, L-1\}$, and $f(x, y) \in G$ is the gray value of the pixel at location $(x, y)$. Let $\omega_l = \{(x, y) \mid f(x, y) = l,\ x \in \{1, \ldots, m\},\ y \in \{1, \ldots, n\}\}$ for $l \in G$, let $C_l$ ($l \in G$) denote the number of pixels in $\omega_l$, and set $\omega = \{\omega_l, l \in G\}$ and $N = \sum_{l=0}^{L-1} C_l$. Obviously, each $\omega_l$ is a set of points in 2D space. Suppose that $t$ is the threshold value; bi-level thresholding by $t$ yields the binary function

$$f_t(x, y) = \begin{cases} 0, & f(x, y) \le t \\ 1, & f(x, y) > t, \end{cases} \tag{1}$$

so thresholding is a clustering problem that separates all pixels into two classes $O$ and $B$, where $O$ represents the foreground and $B$ the background, or vice versa.
Traditional thresholding methods first compute the probability of each gray level. The optimal threshold value is then computed by optimizing an appropriate objective function designed from the gray level distribution or other image properties.
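As a minimal illustration of the bi-level decision $f_t$ and the per-gray-level counts $C_l$, consider the following NumPy sketch (the paper's experiments ran in Matlab; Python here, and the 0/1 labeling convention, are illustrative choices only):

```python
import numpy as np

def bilevel_threshold(image, t):
    """Binary function f_t: gray values <= t map to 0, values > t map to 1
    (which class is foreground vs. background may be swapped per image)."""
    return (np.asarray(image) > t).astype(np.uint8)

def gray_level_counts(image, L=256):
    """Pixel counts C_l for every gray level l in G = {0, ..., L-1};
    their sum equals the number of pixels N."""
    return np.bincount(np.asarray(image, dtype=np.int64).ravel(), minlength=L)
```

Thresholding a small array with `t = 100` sends every darker pixel to class 0 and every brighter pixel to class 1, and the counts vector sums to the pixel total $N$.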
As is well known, Parzen-window estimation is an effective non-parametric estimation technique with a solid theoretical foundation, which can describe data distributions well [23][24][25]. The basic idea is to estimate the pdf as the mean of the kernel densities contributed by the samples within a certain range: to estimate the pdf at a point $X$, we place a window of size $h$ at $X$ and count how many observations $X_i$ fall into this window; the pdf value is the average contribution of those observations. The Parzen-window estimate $P_n(X)$ over $n$ samples can be expressed as

$$P_n(X) = \frac{1}{n} \sum_{i=1}^{n} \frac{1}{V_n}\, K\!\left(\frac{X - X_i}{h_n}\right),$$

where $V_n = h_n^d$ is the volume of the $d$-D hypercube with edge length $h_n$, and $h_n = c/\sqrt{n}$ is called the window width; $c$ is a constant parameter that is usually set to 1. $K(\cdot)$ is the $d$-D kernel (window) function, satisfying $K(u) \ge 0$ and $\int K(u)\,du = 1$. The most commonly used kernel is the Gaussian kernel (normal distribution):

$$K(u) = (2\pi)^{-d/2} \exp\!\left(-\tfrac{1}{2}\|u\|^2\right).$$

Following Parzen-window estimation, for the 2-D image $F$ and a sample $(x, y)$ in the two-dimensional point set $\omega_l$, the density $p(x, y, \omega_l)$ can be estimated by Eqs. (5) and (6).
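The estimate $P_n(X)$ with the Gaussian kernel and $h_n = c/\sqrt{n}$ can be sketched directly (NumPy stands in for whatever numerical environment is used; the function name and defaults are illustrative):

```python
import numpy as np

def parzen_estimate(X, samples, c=1.0):
    """Parzen-window estimate P_n(X): mean over samples of K((X - X_i)/h_n) / V_n,
    with Gaussian kernel K, window width h_n = c / sqrt(n) and volume V_n = h_n ** d."""
    samples = np.atleast_2d(np.asarray(samples, dtype=float))
    n, d = samples.shape
    h = c / np.sqrt(n)
    V = h ** d
    u = (np.asarray(X, dtype=float) - samples) / h   # scaled offsets to each sample
    K = (2 * np.pi) ** (-d / 2) * np.exp(-0.5 * np.sum(u * u, axis=1))
    return float(np.mean(K / V))
```

As expected of a density estimate, the value is largest near the bulk of the samples and decays toward zero far from them.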
Here $C_l$ is the number of pixels in $\omega_l$, and $p(\omega_l)$ can be approximated by the histogram:

$$p(\omega_l) = \frac{C_l}{N}, \qquad N = \sum_{l=0}^{L-1} C_l.$$

Then $p(x, y, \omega_l)$ is obtained by

$$p(x, y, \omega_l) = p(\omega_l)\,\frac{1}{C_l} \sum_{j=1}^{C_l} \frac{1}{V_{C_l}}\, \varphi\!\left(\frac{(x, y) - (x_j, y_j)}{\sigma_l}\right),$$

where $(x_j, y_j)$ denotes the coordinates of the $j$th sample (pixel) in $\omega_l$, and $V_{C_l}$ is the volume of the cube with edge length $\sigma_l$ ($\sigma_l$ is also called the window width); for a two-dimensional image, $V_{C_l} = \sigma_l^2$. $\varphi(\cdot)$ is a window function (also called a kernel function); we chose the Gaussian kernel

$$\varphi(u) = \frac{1}{2\pi} \exp\!\left(-\tfrac{1}{2}\|u\|^2\right).$$

Thus $p(x, y)$ can be estimated as $p(x, y) = \sum_{l=0}^{L-1} p(x, y, \omega_l)$. However, probability density estimation is itself an ill-posed problem. Moreover, estimating the probability density function involves a large amount of computation, and it is easily affected by noise and by the number of samples. To avoid these negative effects, we give up estimating the probability density. Instead, we obtain the boundaries of the foreground and background of an image by using a weighted Parzen-window, which gives a good description of the gray level distribution status; the thresholding problem can then be converted into a linear programming problem for determining the coefficient values of the weighted Parzen-window.
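Putting the pieces together for an image, the class term $p(x, y, \omega_l)$ weights a 2-D Parzen estimate over the pixel coordinates of gray level $l$ by the histogram prior $C_l/N$. A short sketch (using one fixed $\sigma_l$ is a simplifying assumption of this illustration):

```python
import numpy as np

def class_density(xy, coords_l, N, sigma_l=1.0):
    """p(x, y, omega_l) = (C_l/N) * (1/C_l) * sum_j phi(((x,y)-(x_j,y_j))/sigma_l) / V,
    with V = sigma_l ** 2 and the 2-D Gaussian kernel phi."""
    coords_l = np.asarray(coords_l, dtype=float)
    C_l = len(coords_l)                     # number of pixels with gray level l
    u = (np.asarray(xy, dtype=float) - coords_l) / sigma_l
    phi = np.exp(-0.5 * np.sum(u * u, axis=1)) / (2 * np.pi)
    V = sigma_l ** 2
    return (C_l / N) * float(np.mean(phi / V))
```

Summing `class_density` over all gray levels $l$ would give the full estimate $p(x, y)$.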
Weighted Parzen-window combined with linear programming for image thresholding. Here, we propose the weighted Parzen-window method, an improvement of the Parzen-window method. By combining the proposed weighted Parzen-window with a linear programming technique, we obtain a new image thresholding method. As is well known, thresholding segmentation assumes that the pixels are divided into two classes $O$ and $B$. If we can choose a suitable boundary level $\rho$ to divide $\{\omega_l, l \in G\}$, $G = \{0, 1, \ldots, L-1\}$, into two classes, then according to this boundary it is easy to divide the gray levels into two classes. However, this approach is not always effective, because the Parzen-window technique does not provide a method for choosing an appropriate $\rho$. Thus, the Parzen-window technique must be modified so that it can better describe the boundary of the data distribution and yield an appropriate $\rho$. We now provide a solution strategy.
Suppose that a $d$-D pattern space with $N$ samples is given as $X = \{X_i \mid i \in I\}$, where $I$ denotes the coordinate (index) set. Now consider the linear programming problem (LP) of Eq. (15), whose variables are the window coefficients $a_i$ and the boundary level $\rho$.
Here $\varphi(\cdot)$ denotes the kernel function. Because the kernel function $\varphi(\cdot)$ and the coefficients $a_i$ are non-negative, the implicit constraint $\rho \ge 0$ holds. The solution of the above LP and the kernel function together constitute a new description of the data distribution:

$$p(X) = \sum_{i \in I'} a_i\, \varphi(X, X_i),$$

where $I' = \{i \mid i \in I \text{ and } a_i > 0\}$. The existence of a solution of Eq. (15) is guaranteed by the following theorem.

Theorem 1 A solution of Eq. (15) always exists.
Proof According to the constraints in Eq. (15), a feasible point can be constructed directly, so the feasible region of Eq. (15) is nonempty. It then follows from optimization theory that a solution of Eq. (15) exists. This completes the proof.
If $\varphi(X_j, X_i)$ is regarded as a measure of the similarity between samples $j$ and $i$, Eq. (15) provides a strategy for selecting $\rho$: its constraints make the similarity within the described class as large as possible, so the boundaries of the data distribution can be better delineated. At the same time, $p(X)$ is not a probability density estimate; instead, it focuses on describing the boundaries of the data distribution. The simplex method 26 is the most commonly used method for solving LP problems, so we chose it for this study. A gray image is regarded as a two-dimensional sample space, which maps easily to this linear program: the 2-D space $X$ is replaced by $\{f(x, y) \mid x \in \{1, \ldots, m\},\ y \in \{1, \ldots, n\}\}$, the index set $I$ is replaced by the pixel coordinate set $\omega$, and the kernel function $\varphi(\cdot)$ is the same as in Eq. (9). Thus, we can classify all gray levels into two classes using the proposed weighted Parzen-window and linear programming based image thresholding (WPWLPT) method.
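To show how an LP of this kind is handed to a solver, the sketch below poses a representative member of the family — maximize $\rho$ subject to $\sum_i a_i \varphi(X_j, X_i) \ge \rho$ for every sample $j$, $\sum_i a_i = 1$ and $a_i \ge 0$ — using SciPy's `linprog`. This formulation and the solver backend are assumptions of the illustration, not a transcription of Eq. (15):

```python
import numpy as np
from scipy.optimize import linprog

def weighted_window_lp(samples, h=1.0):
    """Representative weighted-Parzen-window LP (an assumed formulation):
    choose a_i >= 0 with sum(a_i) = 1 to maximize rho such that
    sum_i a_i * phi(X_j, X_i) >= rho for every sample j."""
    X = np.atleast_2d(np.asarray(samples, dtype=float))
    n = len(X)
    # Gaussian similarity matrix phi(X_j, X_i)
    D2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=2)
    K = np.exp(-0.5 * D2 / h ** 2)
    # Variables z = [a_1, ..., a_n, rho]; maximize rho -> minimize -rho
    c = np.zeros(n + 1)
    c[-1] = -1.0
    A_ub = np.hstack([-K, np.ones((n, 1))])   # rho - K a <= 0, row per sample
    b_ub = np.zeros(n)
    A_eq = np.hstack([np.ones((1, n)), np.zeros((1, 1))])
    b_eq = np.array([1.0])                     # coefficients sum to 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq)
    # default bounds keep all variables >= 0, matching a_i >= 0 and rho >= 0
    return res.x[:-1], res.x[-1]
```

The samples whose coefficients $a_i$ stay positive play the role of the index set $I'$ in the data description $p(X)$, and $\rho$ serves as the boundary level.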

Experimental results
In this section, we present the experimental results obtained using some classic methods (OTSU 3 and KSW 5), some state-of-the-art methods (CHPSO 14, GLLV 19 and GABOR 20, with all parameters set to their default values during the experiments) and our proposed method (hereafter WPWLPT). Li et al. 14 proposed the CHPSO method, which can be used for both bi-level and multi-level thresholding (they provide equations for both cases). It uses OTSU and KAPUR as objective functions, which we denote CHPSO_otsu and CHPSO_ksw, respectively. To assess the effectiveness of the proposed method, we evaluated it qualitatively and quantitatively on a large number of images. For brevity, we report only 22 representative thresholding results, comprising two synthetic images, eight nondestructive testing (NDT) images and a set of benchmark images. These images have different sizes and histogram types. We designed the two synthetic images ourselves. The NDT images were obtained from 2. The benchmark images belong to the Image Processing Standard Database (http://www.imageprocessingplace.com/root_files_V3/image_databases.htm) and the USC-SIPI Image Database (http://sipi.usc.edu/database/), which are well known and widely used in the image thresholding literature.
Currently, several measures 2,21,[27][28][29]32 exist to quantitatively evaluate the quality of an image thresholding method. We used the misclassification error (ME) 2, region nonuniformity (NU) 2, feature similarity (FSIM) 29 and mean intersection over union (mIoU) 32 to quantitatively assess the different thresholding methods. ME reflects the incorrect classification of foreground pixels to the background, or vice versa 2. For the bi-level image thresholding problem, ME can be taken as

$$ME = 1 - \frac{|B_O \cap B_T| + |F_O \cap F_T|}{|B_O| + |F_O|},$$

where $B_O$ and $F_O$ denote the background and foreground of the optimally thresholded image, $B_T$ and $F_T$ denote the background and foreground region pixels of the original image, and $|\cdot|$ denotes the cardinality of a set. ME equals 1 in the worst case and 0 in the best case. ME is the simplest and most effective discrepancy measure 30.
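In code, ME reduces to one agreement count over background and foreground pixels; a NumPy sketch with boolean foreground masks (True = foreground is a convention of this illustration):

```python
import numpy as np

def misclassification_error(gt_fg, seg_fg):
    """ME = 1 - (|B_O ∩ B_T| + |F_O ∩ F_T|) / (|B_O| + |F_O|): the fraction of
    pixels whose background/foreground label disagrees with the reference mask."""
    gt = np.asarray(gt_fg, dtype=bool)
    seg = np.asarray(seg_fg, dtype=bool)
    agree = np.sum(gt & seg) + np.sum(~gt & ~seg)   # matching FG plus matching BG
    return 1.0 - agree / gt.size
```

A perfect segmentation scores 0, a fully inverted one scores 1, and one wrong pixel out of four scores 0.25.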
NU measures the intrinsic quality of the segmented regions and is defined as

$$NU = \frac{|F_T|}{|B_T| + |F_T|} \cdot \frac{\sigma_f^2}{\sigma^2},$$

where $\sigma^2$ denotes the variance of the image and $\sigma_f^2$ the variance of the foreground; $B_T$ and $F_T$ denote the background and foreground region pixels of the original image. NU is close to 0 for a well-segmented image and equals 1 for the worst segmentation.
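NU can be computed directly from the gray image and its foreground mask (a sketch with the same mask convention as above):

```python
import numpy as np

def region_nonuniformity(image, seg_fg):
    """NU = |F_T| / (|B_T| + |F_T|) * sigma_f^2 / sigma^2: the foreground
    fraction times the ratio of foreground variance to whole-image variance."""
    img = np.asarray(image, dtype=float)
    seg = np.asarray(seg_fg, dtype=bool)
    sigma2 = img.var()
    fg = img[seg]
    if sigma2 == 0 or fg.size == 0:
        return 0.0          # degenerate cases: flat image or empty foreground
    return (fg.size / img.size) * (fg.var() / sigma2)
```

A perfectly uniform foreground gives NU = 0, the ideal value.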
FSIM calculates the similarity of two images and is defined as

$$FSIM = \frac{\sum_{x \in \Omega} S_L(x)\, PC_m(x)}{\sum_{x \in \Omega} PC_m(x)},$$

where $\Omega$ is the whole image domain, $PC_m(x) = \max(PC_1(x), PC_2(x))$, and $S_L(x) = S_{PC}(x)\, S_G(x)$ with

$$S_{PC}(x) = \frac{2\,PC_1(x)\,PC_2(x) + T_1}{PC_1^2(x) + PC_2^2(x) + T_1}, \qquad S_G(x) = \frac{2\,G_1(x)\,G_2(x) + T_2}{G_1^2(x) + G_2^2(x) + T_2},$$

where $T_1$ and $T_2$ are constants; here $T_1 = 0.85$ and $T_2 = 160$. $G$ is the gradient magnitude of the image, $G = \sqrt{G_x^2 + G_y^2}$. $PC$ represents the phase congruency, defined as

$$PC(x) = \frac{E(x)}{\varepsilon + \sum_n A_n(x)},$$

where $A_n(x)$ denotes the local amplitude on scale $n$, $E(x)$ is the magnitude of the response vector at position $x$, and $\varepsilon$ is a small positive constant. FSIM is close to 1 for a well-segmented result and equals 0 for the worst-segmented result.
Experiments on synthetic images. Synthetic images are ideal for testing image thresholding algorithms because their optimal threshold values can be obtained manually 31. Figure 2 shows two original synthetic images of 256 × 256 pixels, named "Circles" and "Squares" [Fig. 2a,e], respectively. In Fig. 2a, we place some circles (gray level 150) on a darker background (gray level 50). Figure 2b shows a noisy version of Fig. 2a, and Fig. 2h shows the ground-truth image of Fig. 2f. For the synthetic "Circles" image, the optimal threshold value, calculated manually from the ground-truth image, is 108. The threshold values, MEs, NUs and FSIMs obtained using the seven thresholding methods are listed in Table 1. The best values in terms of ME, NU and FSIM are highlighted in bold. Among the seven thresholding methods, the threshold value obtained by WPWLPT is the closest to the optimal threshold value: it equals 110, and the ME, NU and FSIM are equal to 0.0049, 0.0837 and 0.8103, respectively. The threshold value obtained using the OTSU method is 102; its ME, NU and FSIM are equal to 0.0120, 0.1025 and 0.7998, respectively. The threshold value obtained using the KSW method is 82; its ME, NU and FSIM are equal to 0.2650, 0.2208 and 0.6425, respectively. The threshold value obtained using the CHPSO_otsu method is 102; its ME, NU and FSIM are equal to 0.0120, 0.1025 and 0.7998, respectively. The threshold value obtained using the CHPSO_ksw method is 84; its ME, NU and FSIM are equal to 0.2595, 0.2172 and 0.6652, respectively. The threshold value obtained using the GLLV method is 101; its ME, NU and FSIM are equal to 0.0156, 0.1368 and 0.7921, respectively. Finally, the threshold value obtained using the GABOR method is 105; its ME, NU and FSIM are equal to 0.0098, 0.1027 and 0.8024, respectively.
For the synthetic "Squares" image, the optimal threshold value, calculated manually from the ground-truth image, is 153. The threshold values, MEs, NUs and FSIMs obtained using the seven thresholding methods are listed in Table 2. The best values in terms of ME, NU and FSIM are highlighted in bold. Among the seven thresholding methods, the threshold value obtained by WPWLPT was again the closest to the optimal threshold value: it equals 152, and the ME, NU and FSIM are equal to 0.0018, 0.0386 and 0.8198, respectively. The threshold value obtained using the OTSU method is 147; its ME, NU and FSIM are equal to 0.0020, 0.0405 and 0.8002, respectively. The threshold value obtained using the KSW method is 108; its ME, NU and FSIM are equal to 0.0606, 0.1982 and 0.6512, respectively. The threshold value obtained using the CHPSO_otsu method is 148; its ME, NU and FSIM are equal to 0.0019, 0.0401 and 0.8076, respectively. The threshold value obtained using the CHPSO_ksw method is 110; its ME, NU and FSIM are equal to 0.0618, 0.1884 and 0.6725, respectively. The threshold value obtained using the GLLV method is 150; its ME, NU and FSIM are equal to 0.0018, 0.0398 and 0.8178, respectively. Finally, the threshold value obtained using the GABOR method is 146; its ME, NU and FSIM are equal to 0.0021, 0.0403 and 0.8104, respectively.
From the threshold values, MEs, NUs and FSIMs, it is clear that, for the synthetic "Circles" image, the threshold values of KSW and CHPSO_ksw (82 and 84, respectively) are almost worthless, because they are far from the optimal threshold (108). In contrast, the threshold values of OTSU, CHPSO_otsu, GLLV, GABOR and WPWLPT are 102, 102, 101, 105 and 110, respectively, which are reasonable because they are near the optimal value. In particular, the threshold value of our WPWLPT is only 2 larger than the optimal threshold. The results in terms of MEs, NUs and FSIMs also reveal that our WPWLPT yields the best results. The MEs and NUs of the KSW and CHPSO_ksw methods were much higher, and their FSIMs much lower, than those of the other methods, while OTSU, CHPSO_otsu, GLLV, GABOR and WPWLPT all obtain reasonable results; in particular, our WPWLPT method obtains the minimum ME and NU and the maximum FSIM values.
For the synthetic "Squares" image, the threshold values of KSW and CHPSO_ksw are 108 and 110, respectively. These are also almost worthless threshold values, because they are far from the optimal threshold (153). In contrast, the threshold values of OTSU, CHPSO_otsu, GLLV, GABOR and WPWLPT are 147, 148, 150, 146 and 152, respectively, which are reasonable because they are close to the optimal value. In particular, the threshold value of our WPWLPT is only 1 less than the optimal threshold. The results in terms of MEs, NUs and FSIMs also reveal that our WPWLPT yields the minimum ME and NU and the maximum FSIM values, the best results among all seven thresholding methods. Figure 3 provides a visual comparison of the thresholding results obtained by the OTSU, KSW, CHPSO_otsu, CHPSO_ksw, GLLV, GABOR and proposed WPWLPT methods. As can be seen from Fig. 3, the KSW and CHPSO_ksw methods produced nearly worthless results, as their segmented images contain obvious noise (see Fig. 3, the second and fourth images of each row). In contrast, the OTSU, CHPSO_otsu, GLLV, GABOR and WPWLPT methods all segmented a cleaner image because the threshold values they obtained were close to the optimal threshold value. Furthermore, by zooming in on Fig. 3, we can easily observe that the WPWLPT method gives the clearest segmentation results compared with the OTSU, CHPSO_otsu, GLLV and GABOR methods, because it leaves the least residual noise.
Experiments on NDT images. NDT images are also ideal for testing image thresholding algorithms because their ground-truth images can be obtained directly. In this part, eight NDT images were used to assess the performance of WPWLPT: "PCB", "defective tile", "material structure", "fuselage material", "eddy current", "ultrasonic", "GFRP" and "bonemarr". All eight images, their histograms and their ground-truth images are shown in Fig. 4. The thresholding segmentation results obtained using the reference thresholding methods and WPWLPT are shown in Fig. 5. Tables 3, 4 and 5 show the MEs, NUs and FSIMs of the different thresholding methods, respectively; $\overline{ME}$, $\overline{NU}$ and $\overline{FSIM}$ represent the averages of the MEs, NUs and FSIMs, respectively. The best results are highlighted in bold.
Because of length limits, we analyze only four NDT images: "PCB", "material structure", "eddy current" and "GFRP". For the "PCB" image, the ME and FSIM values obtained by the WPWLPT method are optimal, while its NU value is inferior only to that of the GLLV method. This is reasonable because the measures focus on different aspects: ME represents the percentage of background pixels incorrectly classified as foreground (or vice versa), FSIM focuses on texture, shape and other features, while NU judges the intrinsic quality of the segmented regions. The worst results are obtained by the KSW method; its ME, NU and FSIM values equal 0.2170, 0.6725 and 0.5864, respectively. Compared with the other methods, its ME and NU values are far too high and its FSIM value far too low, making the results worthless. The visual comparison in Fig. 5 shows that the OTSU, CHPSO_otsu, GLLV and WPWLPT methods produce better segmentations. By comparison, the KSW, CHPSO_ksw and GABOR methods produce worthless results because they misclassify much of the foreground as background (see Fig. 5, first row, second, fourth and sixth images: they cannot distinguish the background from the printed circuit board, especially the second image). For the "material structure" image, GLLV, GABOR and OTSU yield the best ME, NU and FSIM values, respectively. However, WPWLPT yields ME, NU and FSIM values closest to these best values. The worst results are obtained by the KSW method; its ME, NU and FSIM equal 0.6176, 0.7036 and 0.5028, respectively. Its ME and NU values are far too high and its FSIM value far too low, so its results are worthless. From the visual comparison in Fig. 5, we can again see that the OTSU, CHPSO_otsu, GLLV and WPWLPT methods produce better segmentations.
By comparison, the KSW and CHPSO_ksw methods produce nearly worthless segmentations because they misclassify much of the foreground as background (see Fig. 5, third row, second and fourth images: the whole image looks black).
For the "eddy current" image, the ME and NU values obtained by the WPWLPT method are optimal, while its FSIM value is inferior only to that of the GABOR method. The worst results are obtained by the KSW method; its ME, NU and FSIM values equal 0.0407, 0.1239 and 0.7002, respectively. Similar to the KSW method, the CHPSO_ksw method also yields poor ME, NU and FSIM values. From the visual comparison in Fig. 5, we can see that the GLLV, GABOR and WPWLPT methods produce better segmentations. By comparison, the KSW and CHPSO_ksw methods produce low-value segmentations because they misclassify some background as foreground (see Fig. 5, fifth row, second and fourth images: black shadows appear in the segmented image). For the "GFRP" image, all of the ME, NU and FSIM values obtained by the WPWLPT method are optimal. The worst results are again obtained by the KSW method; its ME, NU and FSIM values equal 0.4898, 0.7879 and 0.6036, respectively. Notably, its NU value (0.7879) is close to 1, which corresponds to the worst case. The results obtained by the CHPSO_ksw method are also worthless, owing to its high ME and NU values and low FSIM value. The visual comparison in Fig. 5 shows that the OTSU, CHPSO_otsu, GLLV, GABOR and WPWLPT methods produce better segmentations. By comparison, the KSW and CHPSO_ksw methods produce worthless results because they misclassify much of the background as foreground (see Fig. 5, seventh row, second and fourth images: it is impossible to distinguish foreground from background).
Experiments on a set of benchmark images. A set of benchmark images belonging to the Image Processing Standard Database and the USC-SIPI Image Database contains 12 gray images. For brevity, we use these 12 images: "cameraman", "house", "jetplane", "lake", "milkdrop", "livingroom", "mandril", "peppers", "pirate", "walkbridge", "tank" and "boat", all in uncompressed tif or tiff format and all of size 512 × 512. The thresholding segmentation results of the corresponding twelve images obtained by the reference thresholding methods and WPWLPT are shown row by row from top to bottom in Fig. 6. Tables 6 and 7 show the NUs and FSIMs of the different thresholding methods, respectively; $\overline{NU}$ and $\overline{FSIM}$ represent the averages of the NUs and FSIMs, respectively. The best results are highlighted in bold. Note that we did not employ ME to measure the quality of the thresholding methods on these images, because ideal thresholded or ground-truth images cannot be acquired for them.
From Fig. 6 and Tables 6 and 7, it can be observed that, for most of the tested images, the NU values obtained by WPWLPT are the lowest and its FSIM values the highest. Specifically, for the "cameraman", "house", "milkdrop", "peppers", "walkbridge", "tank" and "boat" images, the proposed WPWLPT method obtains the lowest NU values. For the "pirate" image, OTSU obtains the lowest NU value. For the "lake" and "pirate" images, CHPSO_otsu obtains the lowest NU values. For the "livingroom" and "mandril" images, GLLV obtains the lowest NU values. For the "jetplane" image, GABOR obtains the lowest NU value. The FSIM results for the proposed method are similar: it obtains the highest FSIM values on the "cameraman", "milkdrop", "peppers", "pirate", "walkbridge", "tank" and "boat" images. Although the WPWLPT method does not obtain the lowest NU and highest FSIM values for all 12 test images, its averages $\overline{NU}$ and $\overline{FSIM}$ are the best among the compared methods. From the above analysis, we can assert that the WPWLPT method achieves better $\overline{NU}$ and $\overline{FSIM}$ values than the other reference methods, which demonstrates the stability and accuracy of the proposed method.
mIoU values. mIoU is a widely used objective metric for image segmentation. It is defined as

$$mIoU = \frac{1}{k} \sum_{i=1}^{k} \frac{TP_i}{TP_i + FP_i + FN_i},$$

where $k$ is the number of classes and $TP$, $FN$ and $FP$ denote true positives, false negatives and false positives, respectively. We employ mIoU to objectively evaluate the synthetic and NDT images because they have ground-truth images. The mIoUs obtained by the different thresholding methods are listed in Table 8; $\overline{mIoU}$ represents the average of the mIoUs. The best results are highlighted in bold.
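The mIoU definition above can be sketched for integer label maps (binary thresholding gives $k = 2$; treating an absent class as a perfect score is a convention of this illustration):

```python
import numpy as np

def mean_iou(gt, seg, k=2):
    """mIoU = (1/k) * sum over classes of TP / (TP + FP + FN)."""
    gt = np.asarray(gt).ravel()
    seg = np.asarray(seg).ravel()
    ious = []
    for c in range(k):
        tp = np.sum((gt == c) & (seg == c))   # pixels correctly labeled c
        fp = np.sum((gt != c) & (seg == c))   # pixels wrongly labeled c
        fn = np.sum((gt == c) & (seg != c))   # pixels of class c missed
        denom = tp + fp + fn
        ious.append(tp / denom if denom else 1.0)  # empty class counts as perfect
    return float(np.mean(ious))
```

A perfect segmentation scores 1; with ground truth `[0, 0, 1, 1]` and prediction `[0, 1, 1, 1]`, the per-class IoUs are 1/2 and 2/3, giving an mIoU of 7/12.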
Discussion. Based on the above analysis of the experimental results on the synthetic, NDT and benchmark images, we find that:
1. The proposed WPWLPT method obtained the best segmentation performance for most images. In addition, our method yields the lowest $\overline{ME}$ and $\overline{NU}$ and the highest $\overline{FSIM}$ and $\overline{mIoU}$ over all the synthetic, NDT and benchmark images; in particular, for the two synthetic images, the $\overline{ME}$, $\overline{NU}$ and $\overline{FSIM}$ of WPWLPT are the best among all seven methods.
2. From the visual comparisons (Figs. 3, 5 and 6), although our method does not achieve the best segmentation effectiveness for some images, its results are acceptable or close to the best, which also shows the stability of our method.
3. OTSU is a traditional method that exhibits high stability and accuracy. It outperformed most of the other methods except ours.
4. Although our method works well for most images, it does not yield the best performance on "material structure", "lake", "milkdrop", etc. (for these three images, not all of the ME, NU, FSIM or mIoU values are optimal). A possible reason is that the boundary information, which plays a crucial role in our proposed method, is not obvious in these images.
5. The KSW and CHPSO_ksw methods are the two worst-performing methods. The Kapur-based criterion is a 1D entropy that does not consider other information. The GLLV and GABOR methods are clearly superior to the Kapur-based method because they introduce additional information such as gradient magnitude, texture and contour. This suggests introducing other information, contour information in particular, into our method, which we regard as future work.

Running time.
In addition to the qualitative and quantitative assessment, the running time of a thresholding method is another important consideration. Table 9 reports the average running time of the different thresholding methods over the 22 test images above. All experiments were run on a DELL notebook with an Intel(R) Core(TM) i5-4300U CPU @ 1.90 GHz/2.50 GHz and 16 GB of memory, in Matlab (R2015b). As shown in Table 9, the proposed WPWLPT method takes approximately the same amount of time as the GLLV method. It is faster than the GABOR method but slower than the 1D methods OTSU, KSW, CHPSO_otsu and CHPSO_ksw. Although our method is slower than OTSU, KSW, etc., its running time is entirely acceptable for many applications, and WPWLPT remains highly competitive because of its superior effectiveness and robustness.
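Table 9-style timings can be gathered with a small wall-clock helper (the paper's timings were taken in Matlab; this Python sketch, the repeat count and the use of `time.perf_counter` are illustrative choices):

```python
import time

def average_runtime(method, image, repeats=5):
    """Average wall-clock running time of one thresholding method on one image."""
    start = time.perf_counter()
    for _ in range(repeats):
        method(image)
    return (time.perf_counter() - start) / repeats
```

Averaging this quantity over the 22 test images for each method reproduces a comparison of the Table 9 kind.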

Conclusions
In this study, a new image bi-level thresholding method is proposed. The method first obtains the boundaries of the foreground and background of the image using a weighted Parzen-window to describe the gray level distribution status. The image thresholding problem is then transformed into a linear programming problem for computing the coefficient values of the weighted Parzen-window, and solving this linear program determines the threshold. In the experiments, we used two synthetic images, eight NDT images and twelve benchmark test images with different histogram types to evaluate the quality of the proposed image thresholding method. The visual and quantitative results demonstrate that, compared with the OTSU, KSW, CHPSO_otsu, CHPSO_ksw, GLLV and GABOR methods, our proposed method achieves better effectiveness and robustness. In the future, as an extension of this work, we will embed other information, such as texture and contour, into WPWLPT to enhance its performance, and extend the method to the problem of multilevel thresholding.