Reduced sharing of affect and differences in the use of facial expressions for communication are core symptoms of ASD and are assessed as part of standard diagnostic evaluations. Children with ASD display ambiguous expressions more often than children with other developmental delays and typically developing (TD) children. Standardized observational assessments of ASD symptoms require highly trained and experienced clinicians and usually involve manually coding observations of facial expressions from recorded videos using time-intensive facial action coding systems. Computer vision (CV) can overcome this challenge through automatic extraction of facial landmarks, which can be used to quantify the atypicality of facial expressions and emotional competence in ASD. Understanding the range of facial landmark movement and its dynamics over time can serve as a distinctive behavioral biomarker for early screening of ASD. In our work, we studied the facial landmark dynamics of toddlers with ASD versus TD toddlers, quantified in terms of a complexity estimate derived from multiscale entropy (MSE) analysis. We hypothesized that toddlers with ASD would exhibit higher complexity (i.e., less predictability) in the landmark dynamics associated with the eyebrow and mouth regions.
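As a rough illustration of the kind of complexity estimate MSE yields, the sketch below computes sample entropy over successively coarse-grained versions of a 1-D landmark coordinate time series. This is a generic MSE implementation, not the paper's exact pipeline; the parameters `m` (template length), `r` (tolerance fraction), and `max_scale` are illustrative defaults, and the tolerance here is re-derived from each coarse-grained series, a common variant.

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy of a 1-D series x: -ln(A/B), where B counts pairs of
    m-length templates within tolerance r * std(x) (Chebyshev distance),
    and A counts the corresponding (m+1)-length matches."""
    x = np.asarray(x, dtype=float)
    tol = r * np.std(x)

    def count_matches(length):
        # All overlapping templates of the given length.
        templates = np.array([x[i:i + length] for i in range(len(x) - length + 1)])
        count = 0
        for i in range(len(templates) - 1):
            # Chebyshev distance from template i to all later templates.
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += int(np.sum(dist <= tol))
        return count

    b = count_matches(m)
    a = count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def multiscale_entropy(x, max_scale=5, m=2, r=0.2):
    """Coarse-grain x at scales 1..max_scale (non-overlapping window means),
    then compute sample entropy at each scale; higher values indicate a
    less predictable (more complex) signal at that scale."""
    x = np.asarray(x, dtype=float)
    mse = []
    for tau in range(1, max_scale + 1):
        n = len(x) // tau
        coarse = x[:n * tau].reshape(n, tau).mean(axis=1)
        mse.append(sample_entropy(coarse, m=m, r=r))
    return mse
```

In practice, such a curve would be computed per landmark coordinate (e.g., eyebrow or mouth points tracked across video frames), and a summary of the curve serves as the complexity estimate compared between groups.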