We present a method to analyze images taken from a passive egocentric
wearable camera, along with contextual information such as time and day of
week, to learn and predict the everyday activities of an individual. We collected a
dataset of 40,103 egocentric images over a 6-month period with 19 activity
classes and demonstrate the benefit of state-of-the-art