Well-described corpora rich in human multimodal behavior are needed in a number of disciplines, such as health monitoring or behavioral psychology. However, populating captured user data with adequate descriptions can be an extremely exhausting and time-consuming task. In my talk, I will present an approach that facilitates the acquisition of annotated data sets by involving end users directly in the machine learning process. I will demonstrate how the combination of active learning and cooperative learning helps speed up the annotation of human behavioral signals in large multimodal databases collected in various European projects. I will also discuss ideas on how to adapt the approach so that it enables end users to collect and label behavioral data in the wild, for example to keep track of factors that influence their fitness and wellbeing.
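
To give a flavor of the core idea, the following is a minimal, self-contained sketch of pool-based active learning with uncertainty sampling, the kind of query strategy such annotation tools typically build on. Everything here is hypothetical and illustrative (synthetic 1-D data, a toy distance-based classifier); the actual system described in the talk operates on multimodal behavioral signals with a human annotator in the loop.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "behavioral feature" pool: two classes centered at -1 and +1.
pool = np.concatenate([rng.normal(-1, 0.5, 100), rng.normal(1, 0.5, 100)])
labels = np.array([0] * 100 + [1] * 100)  # oracle labels (the human annotator)

labeled = [0, 100]  # start with one labeled example per class
unlabeled = [i for i in range(200) if i not in labeled]

def p_class1(x, X, y):
    """Toy distance-based probability that x belongs to class 1."""
    d0 = np.abs(x - X[y == 0]).min()
    d1 = np.abs(x - X[y == 1]).min()
    return d0 / (d0 + d1 + 1e-12)

queried = []
for _ in range(10):
    X, y = pool[labeled], labels[labeled]
    probs = np.array([p_class1(pool[i], X, y) for i in unlabeled])
    # Uncertainty sampling: score is 1 where p = 0.5 (model most unsure)
    # and 0 where p is 0 or 1 (model confident).
    uncertainty = 1.0 - 2.0 * np.abs(probs - 0.5)
    pick = unlabeled[int(np.argmax(uncertainty))]
    labeled.append(pick)    # the human provides the label for this item
    unlabeled.remove(pick)
    queried.append(pick)
```

In each round the model asks the human to label only the single most ambiguous item, rather than the whole pool, which is why such loops can substantially reduce annotation effort; cooperative learning extends this by also letting the user correct the model's own predictions.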