Chest X-rays are among the most common images in the medical domain, and reading them is often regarded as an entry-level task for radiologist trainees. Traditionally, radiomics, a field of medical study that aims to extract a large number of quantitative features from medical images, was popular among medical researchers before the deep learning era. With the rise of deep learning, this task has drawn increasing attention from AI researchers. Yet compared to radiomics, the interpretability of learning-based approaches to understanding chest X-rays remains poor and non-transparent. Motivated by this challenge, we focus on combining radiomics with deep learning to improve the interpretability of the black box. Specifically, we propose a novel training strategy in which deep neural networks learn from radiomics on chest X-rays, extracting robust features without loss of interpretability.