We present a new approach for classifying live cells that integrates the spatial and temporal fluctuation maps of the cell with its quantitative optical thickness map, as acquired by common-path quantitative-phase dynamic imaging and processed with a deep-learning framework. We demonstrate this approach by distinguishing between two cancer cell lines of different metastatic potential originating from the same patient. The approach exploits the fact that both the morphology of a cancer cell and its mechanical properties, as indicated by the cell's temporal and spatial fluctuations, change as the disease progresses. We tested different methods of fusing the morphological optical thickness maps and the coinciding spatio-temporal fluctuation maps of the cells as inputs to the classifying network. We show that the proposed integrated triple-path deep-learning architecture outperforms deep-learning classification based only on morphological evaluation of the cell via its quantitative optical thickness map. This demonstrates the benefit of acquiring the cells over time and extracting their spatio-temporal fluctuation maps for use as inputs to the classifying deep neural network.
- cancer cells
- deep learning
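The triple-path, feature-level fusion described above can be sketched schematically as follows. This is a minimal, hypothetical illustration in plain numpy, not the paper's actual architecture: each branch here is a stand-in (a random linear projection with ReLU and global average pooling) for a learned convolutional feature extractor, and the three branch features are concatenated before a linear classification head. All layer sizes and weights are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def branch_features(x, w):
    # Stand-in for one branch of the triple-path network: a linear
    # projection with ReLU, followed by global average pooling.
    # (Hypothetical; the actual per-branch design is not given here.)
    h = np.maximum(x @ w, 0.0)
    return h.mean(axis=0)               # -> per-branch feature vector

def triple_path_logits(thickness, spatial_fluct, temporal_fluct, weights, w_out):
    # Feature-level fusion: extract features from each input map
    # separately, concatenate, then apply a shared classification head.
    feats = np.concatenate([
        branch_features(thickness,      weights[0]),
        branch_features(spatial_fluct,  weights[1]),
        branch_features(temporal_fluct, weights[2]),
    ])
    return feats @ w_out                # one logit per class

H, W, F, C = 64, 64, 8, 2               # map size, features/branch, classes
weights = [rng.normal(size=(W, F)) for _ in range(3)]
w_out = rng.normal(size=(3 * F, C))

# Three coinciding input maps for one cell (synthetic data stands in for
# the optical thickness map and the spatial/temporal fluctuation maps).
thickness      = rng.normal(size=(H, W))
spatial_fluct  = rng.normal(size=(H, W))
temporal_fluct = rng.normal(size=(H, W))

logits = triple_path_logits(thickness, spatial_fluct, temporal_fluct,
                            weights, w_out)
print(logits.shape)  # (2,) -- one score per metastatic-potential class
```

In a trained network the branch weights and head would be learned jointly, so that the morphological and fluctuation inputs contribute complementary features to the final decision.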