Hierarchical Approach to Classify Food Scenes in Egocentric Photo-Streams

Estefanía Talavera Martínez, Maria Leyva Vallina, Md Mostafa Kamal Sarker, Domenec Puig, Nicolai Petkov, Petia Radeva

Research output: Contribution to journal › Article › Academic › peer-review

4 Citations (Scopus)
113 Downloads (Pure)


Recent studies have shown that the environment where people eat can affect their nutritional behavior [1]. In this paper, we provide automatic tools for personalized analysis of a person's health habits through the examination of daily recorded egocentric photo-streams. Specifically, we propose a new automatic approach for the classification of food-related environments that is able to classify up to 15 such scenes. In this way, people can monitor the context around their food intake and gain an objective insight into their daily eating routine. We propose a model that classifies food-related scenes organized in a semantic hierarchy. Additionally, we present and make available a new egocentric dataset composed of more than 33,000 images recorded by a wearable camera, on which our proposed model has been tested. Our approach obtains an accuracy and F-score of 56% and 65%, respectively, clearly outperforming the baseline methods.
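The hierarchical idea described in the abstract — deciding among semantically grouped scene classes rather than over a flat label set — can be sketched as follows. This is a minimal illustration, not the paper's actual model: the class names, the two-level hierarchy, and the aggregation rule (summing fine-class probabilities into meta-class scores, then picking the best fine class within the winning meta-class) are all assumptions made for the example.

```python
# Hedged sketch of hierarchical scene classification.
# The taxonomy below is illustrative, NOT the paper's 15-class hierarchy.
HIERARCHY = {
    "eating": ["restaurant", "picnic_area", "dining_room"],
    "preparation": ["kitchen"],
    "acquisition": ["supermarket", "bakery"],
}

def aggregate_to_meta(probs):
    """Sum fine-class probabilities into a score per meta-class."""
    return {
        meta: sum(probs.get(c, 0.0) for c in fine)
        for meta, fine in HIERARCHY.items()
    }

def hierarchical_predict(probs):
    """Choose the meta-class first, then the best fine class within it."""
    meta_scores = aggregate_to_meta(probs)
    meta = max(meta_scores, key=meta_scores.get)
    fine = max(HIERARCHY[meta], key=lambda c: probs.get(c, 0.0))
    return meta, fine

# Example: fine-class probabilities as a flat classifier might emit them.
probs = {"restaurant": 0.30, "dining_room": 0.25, "kitchen": 0.35,
         "supermarket": 0.05, "bakery": 0.03, "picnic_area": 0.02}
print(hierarchical_predict(probs))  # ('eating', 'restaurant')
```

Note that the flat argmax here would pick "kitchen" (0.35), while the hierarchical decision groups the eating-related evidence (0.30 + 0.25 + 0.02 = 0.57) and settles on "restaurant" — the kind of semantic consistency a hierarchy can provide.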

Original language: English
Article number: 8735865
Pages (from-to): 866-877
Number of pages: 12
Journal: IEEE Journal of Biomedical and Health Informatics
Issue number: 3
Publication status: Published - Mar 2020


  • Egocentric vision
  • food scenes
  • lifestyle
  • scene classification
