
User survey among participants in the introduction programme

  • English summary of Fafo-rapport 2024:20
  • Rebekka Ravn Lysvik, Ida Kjeøy and Guri Tyldum
  • 16 September 2024

This report presents the main findings from a study conducted by Fafo on behalf of the Directorate of Integration and Diversity (IMDi). It asks what is important for user satisfaction among participants in the introduction program in Norway, and how IMDi’s national user survey can be designed to better capture user satisfaction and key challenges in the introduction program.

The study draws on a combination of methods. To map user satisfaction, we used a qualitative design, interviewing leaders of adult education and refugee services, teachers, program advisors, and participants in eight municipalities. The municipalities were selected to reflect variation in centrality and population size: four larger, urban municipalities and four smaller, more rural ones. We also evaluated elements of IMDi’s quantitative user survey. We mapped the intentions behind and expectations for the current survey through interviews with the actors involved at IMDi, and gained insight into how participants experienced responding to it through observation and interviews with participants. The following is a brief summary of our main conclusions.

Quality in teaching, user participation, and access to program advisors are important for high user satisfaction

The participants we spoke with are often very aware of how important knowing Norwegian is for managing everyday life in Norway. They explain that knowing the language is crucial for their independence, and many see the language training they receive in the municipality as a key element that will shape their lives in Norway. The quality of the Norwegian language training was therefore an element many emphasized as important for their satisfaction with the program. The quality of the teachers, training adapted to their level and progression, and not having to spend time on activities they find meaningless were particularly important for high user satisfaction.

When asked what they like best about the introduction program, participants often highlight the people and human relationships. Although Norwegian language training is only one of several components in the introduction program, the Norwegian teacher often holds a special position. What is perceived as poor teaching naturally varies, but several participants point to the experience of not benefiting from the teaching as decisive for dissatisfaction. Poorly adapted progression in the teaching is another frequent source of frustration, both among those who feel the teaching moves too slowly and among those who find it goes too fast.

Participants who experienced the possibility of co-determination and user participation were more likely to perceive the time in the program as meaningful. In interviews with participants in the introduction program, this dimension of user satisfaction was often emphasized, for instance when participants were distressed because they had to attend courses they were not interested in or did not understand why they had been assigned to.

Accessible feedback channels are also important for user participation. None of our respondents knew how to complain about the content of the program if they were not satisfied with the offer they received.

Newly arrived refugees often have a great need for advice and guidance. Respondents’ satisfaction with the introduction program was closely linked to how easy they perceived it to be to contact a program advisor. There is great variation among the municipalities we visited, both in how easy it is to get hold of a program advisor and in the role and tasks this advisor has.

We also show that for some refugees, it is difficult to benefit from the training they receive because so many other elements in their lives need their attention. They talk about worries for family members still in war-torn areas and about health problems and traumas linked to war and flight. Here too, the relationship with the program advisor and others in the municipality plays an important role in ensuring that participants receive the help they need and that the services they receive are adapted to their needs.

Recommendations for the implementation and design of the user survey

Municipalities are strongly encouraged to run IMDi’s user survey, but are not required to. In 2022, 142 municipalities participated in the survey, which constitutes 40 percent of the 354 municipalities where refugees were settled that year. In 2023, 158 municipalities (45 percent) participated. The user survey can be distributed to both participants in Norwegian and social studies education and participants in the introduction program. In total, approximately 5,000 participants took part in the survey in 2022. This constitutes less than 30 percent of participants in the introduction program or Norwegian education for adults in Norway. 

IMDi encourages municipalities to facilitate the participation of all participants in the survey, regardless of educational background. The survey is available in 25 languages, with text and audio for those who do not read well. Municipalities are also encouraged to practice with the demo version of the survey in advance and review the questions to ensure that participants can respond without assistance. In the municipalities we visited, this was largely done. Nevertheless, we saw that parts of the survey are designed in a way that makes it difficult for refugees with little or no schooling and limited digital skills to complete it without help. This concerns the way the questions are formulated, the number of questions and the type of response categories, how to log in and start the survey, and the technical design of the survey. Several of the municipalities we visited chose to give the survey only to parts of the participant group, often prioritizing groups with some schooling, as they find the survey too demanding for those with little educational background. Several teachers who assist during the implementation report observing participants with little schooling ticking off answers that are clearly incorrect. Analysis of the data indicates that this probably happens, as we find some logical errors, such as 20 percent of participants with 0–3 years of schooling claiming to attend preparatory courses for university. We take this as an indication that participants click through the survey without understanding the questions they are answering. Several municipalities also report in their feedback to IMDi that participants tick off that they receive offers (such as primary and secondary education) that no participants in their municipality actually have access to.

In this study, we also look at some specific question formulations from the current user survey and discuss them in light of the findings from the qualitative study. The elements described above as important for user satisfaction are only captured to a limited extent in the current user survey. Questions about whether participants like, are satisfied with, or consider the introduction program useful can all give an indication of the quality of the education, but they are at too high a level of abstraction to assess what could be improved. The current survey does not reveal whether it is the teacher, the level, the progression, or the relevance that participants are or are not satisfied with. Several of the questions could therefore be made more specific, targeting concrete challenges in the local introduction program that matter for participants’ user satisfaction.

The language in the questions is sometimes complicated and difficult to understand, which is linked to the high level of abstraction mentioned above. Using simple and clear language is especially important because the survey is to be translated, as many nuances can change in translation. The clearer the meaning is in Norwegian, the more certain one can be that it will be translated correctly into all languages. Some of the questions in the survey are designed in a way that does not distinguish between experiences and participants’ assessments of those experiences. In other questions, the participant must combine answers to several different questions into a single response.

In the qualitative interviews, we found that participants expressed frustration over certain parts of the training they received in the introduction program, but when asked how they had responded in the user survey, they explained that they had answered positively because they wanted to show gratitude for having received an offer of training at all. In light of this, it may be wise to formulate questions that do not have unequivocally positive or negative answers. This will make it easier for all participants to express things that do not work optimally.

One of the biggest challenges for conducting the survey, based both on our own observations and on what employees in the municipalities have told us, is logging in and choosing a language. In classrooms where we observed the implementation of the survey, the majority of participants needed help logging in, even though they had reviewed the survey and practiced with the demo version the day before. This also applied in classes where all participants had secondary education or higher; in these classes, a small group found their way on their own, but the majority asked classmates or the teacher for help with one or more steps at the start of the survey. After participants had started the survey, fewer asked for help. When even participants with good English skills and higher education have trouble logging in, conducting the survey becomes challenging for the schools, and it becomes difficult for participants to complete the survey anywhere other than in the classroom. We believe it is possible to facilitate a simpler login process for the user survey. Surveys sent to the general population are normally distributed with a link that the respondent clicks on, and the survey can then be completed with minimal scrolling. Such a simple login is used in most surveys because it is central to ensuring high participation, good representativeness, and that respondents with weak digital skills can also complete the survey.

Varying use of survey results

IMDi produces an annual report on the findings from the survey and also sends each municipality a report summarizing its results. For all the user surveys (2020–2024), an analysis webinar (organized by Rambøll) has also been held for the municipalities on how they can interpret the results and follow up on the survey.

Several municipalities we contacted were not aware that the results from the user survey had been published when we interviewed them about this several months after publication in 2023. The municipality’s contact person(s) for the user survey receives an email with information about when the municipal report is available in the user survey portal, but there is much to suggest that this information is not always communicated further within the organization, and that the results are not actively used to evaluate the program.

In other municipalities, the results had been presented to employees in both the refugee service and adult education, but this had not led to changes. Municipal employees report that they do not feel the survey captures the challenges they know they have in the introduction program, and that the questions do not capture the issues participants are dissatisfied with. Employees in several municipalities believe the survey paints an overly positive picture of the situation in the municipality. They argue that rather than contributing to change, the survey can lead to the management patting themselves on the back and thinking they can continue as before.

IMDi’s reporting on the survey to the municipalities does not distinguish between different types of participants, and the results are reported for the municipality as a whole. This means that, based on the published analyses, municipalities do not know what characterizes participants who are very satisfied with the introduction program and those who are not. In other Norwegian user surveys conducted in schools and other educational settings, findings can be broken down into individual classes and subgroups of participants or students, which contributes to the survey being perceived as useful both by the school owner and by the teachers themselves.

From IMDi’s perspective, user participation is an important reason for conducting the user surveys. Our qualitative interviews indicate that the national user survey gives participants in the introduction program a feeling of being heard. The participants we interviewed after the survey were consistently positive. They told us that they greatly appreciate that a user survey is conducted and that someone cares about what they think about the offer they receive. Many also hope that responding to the survey can contribute to changes in how the introduction program is run in their municipality.