We investigate a pattern-spotting approach, combining one-class classification and sliding-window search, to identify eating moments in continuous electromyography (EMG) chewing data recorded with smart eyeglasses. Our approach circumvents the problems of binary classification, i.e., modelling non-eating behaviour and dealing with the high skew between eating and non-eating times in free-living dietary behaviour data. In addition to information retrieval performance, we investigate the timing performance in determining the start and end of eating events. We compare fixed epochs without overlap against a sliding-window search with overlapping windows to determine which yields the best event timing performance. To evaluate our approach, we use free-living study data from ten participants who wore the smart eyeglasses during daily life. Our results show that the overlapping window search in the eyeglasses' EMG data yields the best F1 score (95%), with an approximate start timing error of 21.8 s ± 29.9 s and end timing error of 14.7 s ± 7.1 s.
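The combination of one-class classification and an overlapping sliding-window search can be sketched as follows. This is an illustrative toy implementation, not the authors' pipeline: the feature choice (RMS and mean absolute value), the z-score-based one-class model, the synthetic signal, and all window parameters are assumptions for demonstration only. The key property it shows is that the model is fit on eating (chewing) windows alone, so non-eating behaviour never needs to be modelled.

```python
import numpy as np

def extract_features(window):
    # Two simple EMG amplitude features (illustrative choice):
    # root-mean-square and mean absolute value of the window.
    return np.array([np.sqrt(np.mean(window ** 2)), np.mean(np.abs(window))])

class OneClassDetector:
    """Minimal one-class model: per-feature z-score threshold.
    Fit on eating windows only; anything far from them is rejected."""
    def __init__(self, threshold=3.0):
        self.threshold = threshold

    def fit(self, feats):
        self.mu = feats.mean(axis=0)
        self.sigma = feats.std(axis=0) + 1e-9  # avoid division by zero
        return self

    def predict(self, feat):
        # Accept only if every feature lies within `threshold` std. devs.
        return bool(np.all(np.abs((feat - self.mu) / self.sigma) < self.threshold))

def sliding_window_detect(signal, model, win=200, step=50):
    """Overlapping sliding-window search: classify each window, then merge
    overlapping accepted windows into candidate eating events, returned as
    (start, end) sample indices."""
    accepted = []
    for s in range(0, len(signal) - win + 1, step):
        if model.predict(extract_features(signal[s:s + win])):
            accepted.append((s, s + win))
    events = []
    for s, e in accepted:
        if events and s <= events[-1][1]:
            events[-1][1] = e  # extend the current event
        else:
            events.append([s, e])
    return [tuple(ev) for ev in events]
```

With a smaller `step`, accepted windows track the true event boundaries more closely, which is the intuition behind comparing overlapping-window search against fixed non-overlapping epochs for start/end timing accuracy.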