At the programme/project level, each entity listed in the Register (Albo) is required to provide information on its monitoring system, whose adequacy is assessed by the DGSCU. Such information should cover the monitoring functions and roles, key tools and data collection processes, planned analysis and dissemination activities, and their timeframe. In particular, monitoring systems are expected to enable the tracking of: progress in project activities, in line with the indicators and timeline outlined in the project plan; the delivery of training activities, tutoring and the certification of competences; and the satisfaction of volunteers and the overall context in which the UCS experience takes place. Entities designate a person responsible for monitoring actions (on volunteers’ demographic characteristics and on training provision), preparing the annual report on the UCS programmes/projects, and administering satisfaction questionnaires to volunteers.
Since the entities are responsible for the implementation and monitoring of their programmes/projects, the UCS currently does not impose a single, comprehensive monitoring system with pre‑defined tools for data collection. This set-up reflects the DGSCU’s decision to grant entities autonomy in view of their specific characteristics and diverse missions, and is in line with other international practices (e.g. the Canada Service Corps), where organisations implement and monitor their activities autonomously. Nonetheless, the use of heterogeneous data collection tools risks dispersing relevant information. A balance needs to be struck between the central collection of information and the degree of autonomy left to the entities, taking into account the often scarce resources at their disposal to carry out in-depth M&E.
Every year by the end of March, entities must submit to the DGSCU an annual report on the results achieved by their projects in the previous year. The report should clearly describe (among other aspects) the objectives, indicators and targets related to the outputs produced; the activities conducted; the resources committed; the training courses provided; the entity’s self-assessment of the achievement of the objectives; and the level of satisfaction of volunteers (DGSCU, 2021[38]). A summary report also needs to be published on the entity’s website. Monitoring by entities should consist of a continuous flow of observation of the system, highlighting its strengths and the gaps that need to be addressed. Consultations with entities, however, reveal that monitoring is often seen as a burdensome and formal exercise carried out to fulfil legal requirements. As a result, in various cases monitoring reports consist only of a description of the demographic characteristics of the volunteers (e.g. distribution by age, region and sector) or a summary of the projects.
Together with data on programme/project implementation, questionnaires to volunteers are a key source of information for these annual reports. Pre‑selection,1 follow-up2 and satisfaction3 questionnaires are key to understanding: a) the characteristics, motivation, expectations and level of satisfaction of candidates/volunteers; b) the outcomes and impacts of the interventions on young volunteers and the territories; and c) success factors and areas for improvement (Table 2 shows that questionnaires to young people feed various indicators on UCS performance). Questionnaires (in paper or online form) are sent to volunteers every three or four months: at the beginning, in the middle and at the end of the service. Entities also send questionnaires to their staff in charge of the UCS (including the local project operators) to collect data on the implementation of the project and verify its progress against the initial plan. Because they are administered at different stages of the interventions, the questionnaires can collect information on the same question at different points in time, making it possible to detect emerging issues over the course of the projects. Administering such questionnaires some time after the end of the placement as well would provide valuable information from an evaluative perspective, for instance by tracking young people’s education or labour market status, or their motivation for active citizenship, at different points in time.
Yet some issues can be identified in the current use of such questionnaires. First, a variable amount of information is generated at the programme/project level, and it is currently not summarised in a UCS-level dashboard or database. Second, the use of such tools varies across entities, from cases where questionnaires are treated as a mere formality, to virtuous cases where they are used for evaluation and learning purposes (see examples in the sub-section below). And third, follow-up of volunteers stops at the end of the placement, without capturing effects over a longer period.