Process working condition recognition based on the fusion of morphological and pixel set features of froth for froth flotation Academic Article

abstract

  • Process condition recognition is an effective way to improve froth flotation process performance. In previous machine-vision-based condition recognition algorithms for the flotation process, the features used, including gray value, bubble size distribution, and load, are essentially statistical summaries of gray-level images, and local bubble structure information is lost during their extraction. Meanwhile, the large volume of image data is not adequately utilized. In this paper, a deep neural network is therefore used to extract pixel set features, and a two-step condition recognition method based on bubble image morphology and pixel set features is proposed that makes full use of the large volume of image data. First, froth images are segmented into single-bubble images. Next, the morphological feature vector of each single-bubble image is extracted, and classification labels are assigned to the single-bubble images via K-means clustering. A large quantity of historical images is analyzed and labeled in this way to train a convolutional neural network (CNN), which extracts the pixel set features of each bubble image. The morphological feature vector and pixel set features of the bubble images are then fused for bubble image clustering using a weighted mean-shift algorithm. The frequencies of the various bubble types in a froth image are calculated to form a bubble frequency set for that image. A two-step working condition recognition strategy based on the image sequence over a time period is then proposed: the bubble frequency sets of all froth images, and those of the bubble images segmented from them over the corresponding period, are matched against those of images of typical flotation conditions in two main steps to determine the current working condition. Test results on industrial data demonstrate the high accuracy and computational speed of the proposed method.
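The pipeline in the abstract, from morphological feature labeling via K-means through bubble-frequency-set matching, can be sketched roughly as follows. This is a minimal illustrative sketch, not the paper's implementation: the two-dimensional toy features, the cluster count, the condition templates, and the Euclidean matching distance are all assumptions made here for demonstration, and the CNN pixel-set-feature and weighted mean-shift fusion stages are omitted.

```python
# Illustrative sketch of two steps from the abstract's pipeline:
# (1) label single-bubble morphological feature vectors via K-means,
# (2) form a bubble frequency set per froth image and match it to
#     typical-condition templates. All parameters are toy assumptions.
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Minimal K-means: returns one cluster label per row of X."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        # Distance of every sample to every center, then nearest-center labels.
        d = np.linalg.norm(X[:, None] - centers[None], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(axis=0)
    return labels

def frequency_set(labels, k):
    """Normalized histogram of bubble-type labels for one froth image."""
    counts = np.bincount(labels, minlength=k)
    return counts / counts.sum()

def recognize(freq, condition_templates):
    """Match a frequency set to the nearest typical-condition template."""
    return min(condition_templates,
               key=lambda c: np.linalg.norm(freq - condition_templates[c]))

# Toy morphological feature vectors (e.g. bubble area, eccentricity):
# two well-separated bubble populations of 30 samples each.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.1, (30, 2)), rng.normal(1, 0.1, (30, 2))])
labels = kmeans(X, k=2)
freq = frequency_set(labels, k=2)

# Hypothetical frequency templates for two typical working conditions.
templates = {"normal": np.array([0.5, 0.5]),
             "abnormal": np.array([0.9, 0.1])}
print(recognize(freq, templates))  # balanced toy clusters -> "normal"
```

In the paper, the matching is performed over image sequences covering a time period rather than a single frame, and the features being clustered are the fused morphological and CNN pixel set features rather than raw morphological vectors.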

author list (cited authors)

  • Wang, X., Song, C., Yang, C., & Xie, Y.

citation count

  • 8

publication date

  • November 2018