In this paper, we present iEatSet (iCareNet Multimodal Eating Behaviour DataSet), a multimodal eating behaviour dataset. iEatSet serves as an algorithm benchmark and aims to facilitate research in automatic dietary monitoring, eating recognition, and activity recognition in general. iEatSet provides multimodal synchronised data streams, including multi-camera vision, inertial motion sensor data, and associated ground truth labelling, recorded from 15 participants over 5 meals in a natural restaurant environment. Recordings covered food selection and consumption without a scripted protocol, to capture naturalistic behaviour. Validated methods and tools to recognize people’s eating behaviour could advance research and coaching in applications related to nutrition and dieting. Currently, dietary analysis is performed manually and is very tedious; hence, an automated analysis tool is desired. Current state-of-the-art activity recognition tools must first be trained before they can recognize activities. iEatSet can therefore be particularly useful for benchmarking supervised recognition algorithms and as a reference for evaluating unsupervised algorithms.