I. Compile & Assess
Desk Research & Stakeholder Studies
Gaining an in-depth insight into the scientific state of the art and defining a theoretically sound basis are critical for a research project. Accordingly, (systematic) literature reviews and desk-based research are carried out to gain a comprehensive understanding of the topic under study before an empirical study is initiated. Within the process of knowledge gathering, this exploration further extends to the stakeholder ecosystem of the research project, and various methods may be applied for identifying, mapping, categorising, and engaging stakeholders.
- Systematic literature review:
A systematic literature review aims to identify and evaluate topic-related (scientific) publications and other relevant literature in the field under study. As with any literature review, the research questions and search strategy are defined first, including the databases and search terms to be used as well as inclusion and exclusion criteria. The systematic approach is structurally aligned with the research objectives of the study and follows a robust methodological framework. After the identified literature has been reviewed, the results are synthesised and conclusions are drawn from the elaborated state of the art (a minimal sketch of merging and de-duplicating database search results follows after this list).
- Scoping review:
Following a methodological approach similar to that of systematic reviews, scoping reviews are applied to identify knowledge gaps, scope a body of literature, clarify concepts or investigate research conduct. The guiding research objectives are broader, as are the inclusion criteria, and the review outcome serves as a research-informing overview of the examined topic.
- Stakeholder mapping & engagement:
Stakeholder identification, mapping and engagement provide highly relevant perspectives and insights for empirical research and are critical for the uptake of project outcomes. For the coordinated identification and categorisation of stakeholders, a framework is created. In the form of a structured roadmap, this framework outlines the stages of engagement for the mapped stakeholders as well as the methods for their involvement. These may vary between the early phases of the research process (problem identification, needs assessment, research design) and later stages (evaluation, validation, dissemination of research results).
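Where the literature search spans several databases, the exported records are usually merged and de-duplicated before screening against the inclusion and exclusion criteria. The following is a minimal sketch of that step in Python, assuming hypothetical CSV exports (scopus_export.csv, wos_export.csv) with doi and title columns; file and column names are illustrative and not tied to any particular review tool.

```python
import pandas as pd

# Hypothetical CSV exports from two bibliographic databases; the column
# names (doi, title) are assumptions made for illustration only.
records = pd.concat(
    [pd.read_csv("scopus_export.csv"), pd.read_csv("wos_export.csv")],
    ignore_index=True,
)

# Normalise identifiers so the same publication matches across databases.
records["doi"] = records["doi"].str.lower().str.strip()
records["title_key"] = (
    records["title"].str.lower().str.replace(r"[^a-z0-9 ]", "", regex=True)
)

# Remove duplicates: first by DOI (where present), then by normalised title.
with_doi = records[records["doi"].notna()].drop_duplicates(subset="doi")
without_doi = records[records["doi"].isna()]
deduplicated = pd.concat([with_doi, without_doi]).drop_duplicates(subset="title_key")

# The remaining records are screened against the inclusion/exclusion criteria.
deduplicated.to_csv("records_for_screening.csv", index=False)
print(f"{len(records)} records merged, {len(deduplicated)} unique records kept")
```

In practice, screening decisions and reasons for exclusion are then documented for each remaining record so that the review remains transparent and reproducible.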
Study Design & Case Studies
At the beginning of empirical research, the study design lays the foundation for the methodological work carried out. Depending on the research questions of the project at hand, a suitable qualitative, quantitative or mixed-methods approach is chosen and an appropriate study design and framework is elaborated. This includes the selection of research methods, adequate sampling, approaches for analysis and the definition of the scientific outputs the research aims for.
- Definition of research questions & methodology:
Based on an in-depth understanding of the scientific state of the art and the problem faced, research questions are defined to guide the research and analysis. The methodological approach is based on the theoretical framework. In alignment with the research questions, a quantitative, qualitative, or mixed-methods approach (e.g., exploratory vs. explanatory sequential design) is designed and suitable methods from the vast range of social science methods are selected (see Research & Analyse).
- Sampling strategy & recruiting methods:
Depending on the target population and research sites, the sampling strategy, recruiting methods, as well as inclusion and exclusion criteria for involved subjects are defined. In consideration of the research interest, different strategies can be employed, such as systematic sampling or stratified sampling (a minimal stratified-sampling sketch follows after this list). To ensure the systematic reflection of gender equality, inclusivity and diversity in research-related tasks, sampling and recruitment also comprise the consideration of diversity factors such as ethnicity, nationality, citizenship, disability, age, gender identity, sexual orientation, religion, and social or family status.
- Risks and mitigation measures:
Before the fieldwork is initiated, potential risks need to be identified and suitable mitigation measures set in place. This includes questions of ethics, data protection and privacy, such as the definition of informed consent guidelines. Informed consent is the process by which participants are informed about the research and its objectives and made aware of their right to decide whether to participate, as well as of subsequent rights. It is at the centre of any empirical research involving human participants.
- Analysis:
In line with the chosen methodological framework, the means of analysis are defined, following a quantitative, qualitative or mixed-methods logic (see Research & Analyse).
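As an illustration of the sampling strategies mentioned above, the sketch below draws a proportionate stratified random sample from a hypothetical sampling frame. The file name (sampling_frame.csv), the stratification variables (gender, age_group) and the sample size are assumptions for illustration only; a real project would derive strata and quotas from its target population and research design.

```python
import pandas as pd

# Hypothetical sampling frame; the file name and the stratification
# variables (gender, age_group) are assumptions for illustration.
frame = pd.read_csv("sampling_frame.csv")

SAMPLE_SIZE = 400            # total number of participants to recruit
strata = ["gender", "age_group"]

# Proportionate stratified sampling: each stratum is represented in the
# sample in the same proportion as in the sampling frame.
def draw_stratum(group: pd.DataFrame) -> pd.DataFrame:
    share = len(group) / len(frame)
    n = max(1, round(SAMPLE_SIZE * share))
    return group.sample(n=min(n, len(group)), random_state=42)

sample = (
    frame.groupby(strata, group_keys=False)
    .apply(draw_stratum)
    .reset_index(drop=True)
)

print(sample.groupby(strata).size())   # check the composition of the sample
```

Systematic sampling would instead select every k-th unit from an ordered frame; the general pattern of defining the frame, the selection rule and a fixed random seed stays the same.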
Classifications & Taxonomies
Classifications and categorisations are used to systematically characterise a specific topic at hand. Through the approach of demarcation, types of entities can be broken down into categories which help to identify and map a specific field of interest. There are different methods and approaches to do so, from the initial set-up through systematic analysis to validation through workshops.
- Preparation & analysis:
The first step of any classification or categorisation is to analyse the state of the art, identify definitions, initiatives, standards and/or approaches on which to build. This step closely relates to the context in which the classification is pursued. For example, in a business context, understanding industry-specific terms, market segments and benchmarks provides the required foundation of the analysis.
- Iterative development:
Through different ways of collecting and validating information – for example, through empirical research, stakeholder consultations, or literature review – a classification scheme is built up in multiple iterations, each refining the previous work. This may result in a clustering framework or a conceptual decision tree (a minimal clustering sketch follows after this list).
- Evaluation & validation:
For each step of the iterative development process, evaluation and validation activities are needed, considering the perspectives of various stakeholders and target groups. These can be engaged through workshops and other appropriate settings of discussion.
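Where the iterative development results in a clustering framework, an exploratory cluster analysis can support the grouping of entities between validation rounds. The following non-authoritative sketch assumes a hypothetical table entities.csv in which each row is an entity described by numeric attributes, and uses scikit-learn's agglomerative (hierarchical) clustering; the number of clusters is only a starting point to be revisited with stakeholders.

```python
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import AgglomerativeClustering

# Hypothetical input: one row per entity, numeric attribute columns.
entities = pd.read_csv("entities.csv", index_col="entity")

# Standardise attributes so no single attribute dominates the distances.
scaled = StandardScaler().fit_transform(entities)

# Hierarchical clustering into a provisional number of categories;
# the number of clusters is revisited in each iteration with stakeholders.
model = AgglomerativeClustering(n_clusters=5, linkage="ward")
entities["cluster"] = model.fit_predict(scaled)

# Inspect the provisional grouping as input for the next refinement round.
print(entities.groupby("cluster").mean())
```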
II. Research & Analyse
Quantitative Surveys
Surveys make it possible to collect data from a large number of respondents. Specifying a survey requires a detailed understanding of the state of the art as well as of the target groups. Building on this and on the theoretical framework, research questions and hypotheses are formulated to be tested (verified/falsified) by means of the quantitative survey. The research design is then set up, research sites (offline, online, mixed) are defined, and respondents are selected following a sampling strategy. After the data is collected and processed, statistical analysis is conducted and the findings are interpreted, visualised and presented.
- Operationalisation & questionnaire development:
Based on the hypotheses to be tested, key terms and concepts are specified to define what is to be measured in order to ensure the validity of the research results. In this process, abstract theoretical concepts are translated into measurable variables and the survey questionnaire is developed.
- Sampling strategy:
The selection of survey participants is vital to ensure that the survey results are representative of a certain population and that there are no inherent biases within the sample. After defining a target population, a sampling strategy is developed to select a specific number of participants based on a defined set of characteristics (age, gender, etc.) which mirror the composition of the targeted population. Depending on the research interest, different strategies can be employed (such as systematic sampling or stratified sampling).
- Piloting:
Before surveys are distributed and data is collected, questionnaires are tested with a smaller sample to ensure that the questions are well understood by the survey participants. The piloting phase serves to make final adjustments to the questionnaire (wording, length).
- Data collection:
Surveys can be conducted online (accessible via a link) or offline (using pen and paper). Both methods have their advantages and disadvantages. While online surveys can be distributed more easily, they may fail to reach parts of the population that have limited access to or knowledge of web applications. Paper surveys are more cost-intensive and time-consuming, but can be necessary to ensure proper sampling and the targeted inclusion of respondents. Alternatively, surveys can also be conducted via phone or face-to-face, with the researcher asking the questions and marking the responses on the questionnaire.
- Analysis:
After data collection, the responses are analysed using multivariate data analysis methods. Researchers use the data to test hypotheses, look for correlations between variables or conduct trend analyses. Based on the results of a survey, the researcher draws conclusions about the larger population for which the survey is representative. In this context it is important to place particular emphasis on the cautious interpretation of results and to be aware of the weaknesses of this research method. While surveys have the advantage of gathering data from a large sample, they cannot explain the motivations, perceptions or opinions of each participant in depth, which can be accomplished through qualitative research.
- Descriptive & inferential statistics:
Methods and tools are employed to describe, summarise and visualise raw data. These include measures of central tendency or dispersion, graphs such as scatter plots or histograms, as well as tables. Inferential statistics are used to make predictions and draw conclusions about broader populations based on sample data. These include the computation of confidence intervals, the application of regression analysis, or predictive modelling (a minimal sketch follows below).
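To make the analysis steps more tangible, here is a minimal sketch of descriptive and inferential statistics on a hypothetical cleaned survey dataset, using pandas, matplotlib and statsmodels. The file survey.csv and the variables age, usage_hours and trust_score are assumptions for illustration, not part of any specific study.

```python
import pandas as pd
import statsmodels.api as sm
import matplotlib.pyplot as plt

# Hypothetical cleaned survey data; column names are assumptions.
df = pd.read_csv("survey.csv")

# Descriptive statistics: central tendency, dispersion, and a simple plot.
print(df[["age", "usage_hours", "trust_score"]].describe())
df["trust_score"].hist(bins=20)
plt.xlabel("Trust score")
plt.ylabel("Number of respondents")
plt.savefig("trust_score_histogram.png")

# Inferential statistics: an OLS regression of trust on age and usage,
# including 95% confidence intervals for the estimated coefficients.
X = sm.add_constant(df[["age", "usage_hours"]])
model = sm.OLS(df["trust_score"], X, missing="drop").fit()
print(model.summary())
print(model.conf_int(alpha=0.05))
```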
Focus Groups & Interviews
Social scientific research provides a variety of interviewing techniques with different levels of structuring, from standardised interviewing techniques to narrative interviews. Depending on the research question(s) and the target population(s), a suitable technique is selected, and an interview and fieldwork guide is elaborated, including sampling and recruiting strategies. After the interviews have been conducted, they are transcribed or summarised and then analysed using heuristics or content analysis.
Focus groups and workshops are a way to collect data from several persons at the same time, allowing the researcher to understand reactions and opinions under the consideration of group dynamics. Following the same steps as interviews, focus groups are recommended to involve between 5 and 20 participants. (Expert) interviews, focus groups, and workshops can be conducted face-to-face or online.
- Operationalisation & interview guide:
Based on the research question(s), key terms and concepts are specified to define what is to be measured in order to ensure the validity of the research results. In this process the interviewing style is defined and the interview guide developed. Depending on the research question, interviews can be more or less structured, ranging from open questions to semi-structured and fully structured interviews. The interview style affects the level of control that the researcher and the participants have over the topics that are discussed during the interview.
- Sampling & recruitment strategies:
The selection of interview or focus group participants is vital to ensure that the research approach is transparent and comprehensible. Depending on the study’s objectives, as well as the characteristics of the target population(s) (experts, end-users, practitioners, the public, etc.), a sampling strategy is defined, setting out which and how many participants to select. Outlining a recruitment strategy ensures that all relevant stakeholders to be integrated into the research are considered and reached.
- Transcription:
Transcription methods vary depending on the research interest and can range from very detailed transcripts that include non-verbal expressions, pauses, fillers, etc. to edited transcriptions that summarise the spoken content deemed most relevant by the researcher. Transcribing the interview secures the data, upholds accuracy in its analysis and makes the material accessible for further research purposes (a minimal automated-transcription sketch follows after this list). Within this process, compliance with anonymisation and data protection guidelines is ensured.
- Analysis:
In the next step, the transcribed interviews or focus groups are analysed using content analysis or heuristic methods, taking into consideration the theoretical underpinnings of the research endeavour. The insights from focus groups and interviews serve as an empirical foundation for answering the research questions.
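Transcription can be supported by automatic speech-to-text tools before manual correction and anonymisation. As a minimal sketch, assuming the open-source openai-whisper package is installed and that audio may be processed locally in line with the applicable data protection guidelines (the file name is hypothetical):

```python
import whisper  # open-source openai-whisper package (assumed installed)

# Load a local speech-to-text model; running locally keeps audio data
# on the researcher's machine, in line with data protection guidelines.
model = whisper.load_model("base")

# Hypothetical interview recording; the raw transcript still needs
# manual correction, anonymisation and, where relevant, annotation
# of pauses and non-verbal expressions.
result = model.transcribe("interview_01.wav")

with open("interview_01_draft_transcript.txt", "w", encoding="utf-8") as f:
    f.write(result["text"])
```

The draft transcript is then corrected, anonymised and, where the research interest requires it, enriched with pauses and non-verbal expressions before analysis.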
Content Analysis, Discourse Analysis & Social Media
Content analysis is a method to analyse texts, images, or videos, and can be conducted quantitatively or qualitatively. It follows a structured approach and allows the categorisation of content according to the research question(s). Discourse analysis, on the other hand, goes beyond the manifest content, considering the discourse around a topic within its power relations.
Social media offers insights into the discourse within a specific channel or community, the relationships between actors, and the networks through which information is distributed and shared. To conduct an analysis of social media data, researchers need to define the research questions and outline a data collection strategy, for example by selecting specific channels or keywords. The analysis can be conducted quantitatively, qualitatively, or with mixed methods, and may include various levels of analysis.
- Quantitative content analysis:
In a quantitative content analysis, various types of written, visual or oral content are systematically coded into predefined categories and analysed. This can include social media and media analysis, sentiment analysis, and big data approaches. The category-based procedure ensures consistency and objectivity, while the application of statistical analysis allows trends, patterns or correlations between variables to be identified in terms of their frequency (a minimal coding sketch follows after this list).
- Qualitative content analysis:
This research method facilitates the subjective interpretation of the content of data. A qualitative content analysis is most often used to analyse interviews or a limited amount of written content. Categories are used to analyse meaning, similarities and differences across a set of interviews. A comprehensive methodological framework based on state-of-the-art research ensures transparency of the categorisation procedure.
- Discourse analysis:
This method is used for the analysis of written or spoken language and its relationship to the social world. Here, language is understood as a form of social practice that influences the social world. Simultaneously, the social world influences language. Through a discourse analysis the changes of a particular discourse over time and its meanings can be understood.
- Trend analysis:
For the analysis of the online search behaviour of individuals, a keyword search or trend analysis investigates and evaluates the volume of keyword search queries conducted in a specific geographic location. The analysis thereby identifies the relevance of related keywords based on search volume and competition and produces exportable data for further analysis.
- Data collection & social media analysis:
Social media data can be collected manually or automatically. In the latter case, to ensure privacy, data should be collected via the APIs provided by social media platforms. Following the principle of data minimisation, only data that are relevant to the research questions should be collected. The analysis of social media data depends on the research questions and may be conducted quantitatively or qualitatively. This may include an analysis of sentiment, an analysis of metadata such as geolocation, or a network analysis.
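As an illustration of category-based coding, the sketch below codes a hypothetical set of collected posts against a simple keyword dictionary and reports category frequencies. The file posts.csv, its text column and the keyword lists are assumptions for illustration; in an actual study the coding scheme is derived from the theoretical framework and checked for intercoder reliability.

```python
import pandas as pd

# Hypothetical collected posts (e.g., exported via a platform API);
# the file and column names are assumptions for illustration.
posts = pd.read_csv("posts.csv")

# Simple keyword dictionary mapping predefined categories to indicator terms.
# In practice the coding scheme is developed from the theoretical framework.
codebook = {
    "privacy": ["privacy", "data protection", "surveillance"],
    "trust": ["trust", "reliable", "credible"],
    "usability": ["easy to use", "intuitive", "confusing"],
}

text = posts["text"].str.lower().fillna("")
for category, keywords in codebook.items():
    pattern = "|".join(keywords)
    posts[category] = text.str.contains(pattern, regex=True)

# Frequency of each category across the corpus, as input for
# trend and correlation analyses.
print(posts[list(codebook)].mean().sort_values(ascending=False))
```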
Ethnography & Participatory Research
Through ethnography, researchers explore cultural phenomena, lived experiences and everyday life while being embedded in the lives and social structures of the interlocutors to gain an even deeper understanding of the ways in which people make sense of their social worlds. Participatory research requires a different kind of involvement from the study’s interlocutors. Research participants are not only the source of the data, but are also actively involved in the design of the research and its questions as well as its analysis.
Both types of research involve a range of methods, including interviews, participant observation, and creative methods that provide an opportunity to explore complex and sensitive issues which may be hard to verbalise. These may include visual or arts-based methods such as drawing, photography, poetry, and journaling.
- Participant observations:
This method aims at the collection of data on people, the ways in which they navigate their everyday lives and relationships as well as the cultural processes and explanations behind all of this. Key is that the researcher spends an extended time with and in the environment of the study’s participants to observe particular aspects of their lives. This could for example take place in a work or school setting where a researcher would like to understand the interaction of students/employees with new technologies.
- Arts-based methods:
Creative research uses artistic forms of expression to explore, understand and represent human experience. The focus lies on artistic processes, including written, visual, spoken and performance-based forms, which are employed to understand subjective experiences and complex or sensitive topics. These are particularly useful when participants’ language proficiency is not sufficient (for example, migrants, children, or people living with a disability). Research outcomes can not only be used as data, but also to disseminate findings.
- Journaling:
Journaling is used to record the experiences and reflections of participants over a certain period of time and in a natural setting. This may be carried out as a written exercise as well as in the form of a video journal. The latter, a method of visual ethnography, provides the opportunity to explore narratives and experiences that draw on more than language. Journaling is also often used as a reflective tool for researchers.
III. Elaborate & Develop
Policy Analysis & Recommendations
Policy analysis is a systematic process of identifying and comparing policies with the aim of determining gaps (prescriptive, normative) and/or evaluating current policy (analytical, descriptive) in relation to its intended purpose. Depending on the approach, the methodology and the categories of analysis are defined. Policy recommendations are elaborated to inform new policy, based on scientific evidence and the identified gaps.
- Descriptive policy analysis:
Descriptive policy analysis focuses on the process of policy development, the context in which it was developed and by whom. The analysis includes the historical dimension of the policy making process and explains why certain policies were formulated and implemented by governments. According to the model of the policy cycle, this often includes the analysis of the following phases: problem definition, agenda setting, policy formulation, policy implementation and policy evaluation.
- Prescriptive policy analysis:
Prescriptive policy analysis evaluates the future implications of a specific policy. The results of the analysis can be used to issue recommendations for policy makers.
- Policy briefs & recommendations:
Based on the policy analysis and the identified gaps and weaknesses in a specific policy area, the researcher develops policy briefs and issues recommendations for (national or EU) policy makers with the aim of improving the quality, fitness for purpose and effectiveness of public policies. Drafting policy recommendations is a way of making social science research outputs directly relevant to policy makers. Scientific results are translated into hands-on advice founded on empirical evidence. As concise summaries of information conceptualised to help readers understand and make decisions about government policies, policy briefs may also offer evidence-based advice to a general, non-specialised audience.
Requirement Analysis & User Tests
For the development of a technical solution, requirement analyses, use-case design, end-user testing and piloting are crucial steps. In all of these activities, it is critical to put the user at the centre, aiming to understand different needs and requirements in order to design user interfaces that provide the optimal user experience. Through this, scenarios or use cases can be defined to inform the development of technical solutions in an agile way.
- User requirements & user studies:
User studies are conducted to understand the needs and requirements of the intended end-user(s) of a technical solution. User-centred design requires an understanding of the perspective, needs and requirements of the intended audience. This is achieved by applying a variety of methods such as workshops, interviews or cognitive walkthroughs. Insights are systematised through the definition of use cases, user stories, scenarios, and/or personas.
- Usability & user experience testing:
Usability, as a multidimensional construct, can be examined from several perspectives. This may include the analysis of categories such as effectiveness, usefulness, usability, and ease of use, as well as attributes such as learnability, efficiency, memorability, error recovery, and satisfaction. User experience (UX) testing assesses the overall experience of someone using a solution. There are a variety of methods to test usability and UX, including card sorting, paper prototyping, workshops, interviewing, A/B testing, expert review, and personas (a minimal A/B-test sketch follows after this list).
- User behaviour & piloting:
Pilot testing allows a system or its components to be verified under real-life conditions, evaluating the feasibility, costs, risks, and performance of a solution. Long-term pilots further allow researchers to understand how users behave and interact with the solution. To conduct a pilot study, a pilot plan needs to be set up, defining goals and activities. In the next step, the pilot is deployed, before the results are evaluated through appropriate methods.
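For the A/B testing mentioned under usability and UX testing, the sketch below compares task-completion rates between two interface variants with a two-proportion z-test from statsmodels. The counts are purely illustrative.

```python
from statsmodels.stats.proportion import proportions_ztest

# Illustrative counts from a hypothetical A/B test: number of users who
# completed the task successfully in each interface variant.
successes = [172, 199]      # variant A, variant B
observations = [250, 250]   # participants exposed to each variant

# Two-sided z-test for the difference between the two completion rates.
z_stat, p_value = proportions_ztest(count=successes, nobs=observations)

rate_a, rate_b = (s / n for s, n in zip(successes, observations))
print(f"Completion rate A: {rate_a:.2%}, B: {rate_b:.2%}")
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("The difference between the variants is statistically significant.")
else:
    print("No statistically significant difference was detected.")
```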
Methods Development, Frameworks & Training Guidelines
The translation of research insights and results into practically applicable outputs is a crucial step for connecting research and innovation tasks within a research project. Here, the application of a social science-based methodological approach shows its relevance on the input as well as on the output side. Co-creative settings with stakeholder engagement help to design materials that can be effectively implemented with diverse groups to reach the targeted outcomes.
- Co-creation & co-design:
Co-creation and co-design describe the process of actively involving the user in the development of a (technical) solution. Rather than being study subjects, the intended users become active in the development and design process. Through workshops and similar methods, concepts, modules, functionalities, and/or interfaces are designed with the users.
- Implementation framework and manuals:
Step-by-step guidance, provided in manuals, and comprehensive elaboration on the adaptation and usage of developed project materials ensure that research outcomes are used in a reflective manner, while also allowing for their updating and customisation to individual needs or contexts. The formulation of an implementation framework for materials or methods is therefore not only a key element within a project, but also fundamental for its exploitation success.
- Training materials and manuals:
Training materials are a vital outcome of social science research projects, as they translate theoretical insights into practical tools that can be widely disseminated and applied. These materials, which may include manuals, workshops, and online resources, such as advisors or massive open online courses (MOOCs), help bridge the gap between research findings and real-world applications. By providing clear, accessible guidance, training materials empower diverse groups to implement research-based strategies effectively. This not only enhances the impact of the research even after a project has ended, but also fosters broader understanding and engagement with the project’s goals, ensuring that the benefits of the research are realised in various communities and professional settings.
- Train-the-trainer workshops:
Train-the-trainer workshops are invaluable for research projects as they ensure the effective dissemination and implementation of research findings. These workshops equip selected individuals with the knowledge and skills needed to train others, creating a multiplier effect that extends the reach and impact of the research. By fostering a deeper understanding of the project’s methodologies and outcomes, train-the-trainer workshops help maintain consistency and quality in the application of research-based practices. Additionally, they empower trainers to adapt the materials to diverse contexts and audiences, enhancing the overall relevance and sustainability of the project’s goals.
IV. Demonstrate & Validate
Validation & Test Groups
An extensive validation and testing phase in a project ensures that the generated outcomes align with the project’s objectives and fulfil the demands and identified requirements of the relevant user groups, and it effectively tests whether the applied methods successfully bridge scientific and practical knowledge. This enhances the credibility of the research carried out and helps to build trust among stakeholders and end-users, as it demonstrates a commitment to rigorous scientific standards and the practical applicability of the research outcomes.
- Validation plan:
The formulation of a comprehensive validation plan and the carrying out of its actions ensure that all implemented resources fulfil the requirements raised by the targeted groups in the conceptualisation phase of a research project. This structured roadmap includes a detailed schedule of activities and responsibilities as well as a manual of the validation methodologies and steps.
- Validation & test groups:
The practice-oriented validation of provided materials, toolkits, and solutions ensures that generated outputs are appropriate for the purposes of the respective research project. Depending on the developed research outputs, validation and/or test groups are implemented to assess the functionality and performance of the solution under real-world conditions and its effectiveness in meeting the identified requirements. Test groups may also be involved in the evaluation of generated materials.
Focus Groups & Workshops
For the validation and demonstration of developed concepts or solutions, the involvement of end-users, practitioners, experts and other stakeholders ensures the alignment of project results with identified needs and requirements. Several methods applied in the early stages of research (see Research & Analyse) can be adapted for the demonstration and assessment phase. This may involve returning to stakeholders involved in the requirements analysis, conducting focus groups and interviews, or implementing surveys.
- Focus groups:
As described in “Research & Analyse”, focus groups offer a valuable opportunity to collect data in group settings and help researchers to understand reactions and opinions under the consideration of group dynamics. Materials or preliminary analyses are presented to the focus group, which can involve different stakeholders or more homogeneous groups of participants, and are subsequently discussed. This may be carried out in a structured or less structured framework, usually following a more natural conversational pattern than that of a one-on-one interview. Collected data is used to validate and refine the developed research outputs. (All relevant steps – operationalisation & interview guide, sampling & recruitment strategies, transcription, analysis – are described under Research & Analyse.)
- Demo workshops:
Demonstration workshops provide a hands-on, interactive platform to showcase a project’s methodologies, tools, and findings. These workshops facilitate knowledge transfer and skill development among participants, fostering a collaborative environment where ideas can be exchanged and refined. They also offer an opportunity for researchers to receive immediate feedback, which can be invaluable for refining approaches and ensuring the practical applicability of project outputs.
Evaluation
The evaluation process functions as a measure of quality control and supports the impact assessment. In line with the validation plan and its actions, the evaluation focuses on the assessment of methodologies, analyses and generated materials. Moreover, it upholds accountability principles regarding the research process, engages stakeholders and informs decision-making. The analysis of the evaluation results lays the foundation for the continuous improvement of the quality and relevance of project results and of all information disseminated to the public.
- Evaluation questionnaires:
The development and application of evaluation questionnaires in research projects are pivotal for gathering reliable and comprehensive data on project outcomes. The process begins with designing a questionnaire that aligns with the research objectives, ensuring it covers all relevant aspects of the study. This involves formulating clear, unbiased questions and selecting appropriate response formats. Once developed, the questionnaire undergoes rigorous testing for validity and reliability to ensure that it accurately measures what it intends to measure (a minimal reliability-check sketch follows after this list). In application, these questionnaires are distributed to a targeted sample, allowing researchers to collect both quantitative and qualitative data efficiently.
- Feedback gathering:
In addition to evaluation questionnaires, the gathering of feedback may also involve less structured methods. Open discussion rounds in the aftermath of workshops or focus groups are a valuable tool to receive immediate feedback from the targeted audience or involved actors. Anonymous feedback opportunities are important channels for receiving input, especially on sensitive research topics.
- Impact assessment:
Considering various dimensions (social or societal, economic, environmental and scientific), impact assessments aim to understand or estimate the consequences of research actions, decisions, or policies. Providing valuable insights into the effectiveness and outcomes of research projects, the impact assessment is carried out through various forms of data collection and analysis. In the context of social impact assessments, measurable data such as community-engagement figures, inclusivity and diversity factors, or indicators of increased social cohesion following project activities may be taken into consideration.
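A common reliability check for multi-item evaluation questionnaires is Cronbach's alpha: alpha = k / (k - 1) * (1 - sum of item variances / variance of the total score). The sketch below implements this formula directly; the file responses.csv and the satisfaction_* item columns are assumptions for illustration.

```python
import pandas as pd

def cronbachs_alpha(items: pd.DataFrame) -> float:
    """alpha = k / (k - 1) * (1 - sum of item variances / variance of total score)."""
    items = items.dropna()
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical questionnaire responses: one row per respondent,
# one column per item of the scale (file and column layout are assumptions).
responses = pd.read_csv("responses.csv")
scale_items = [c for c in responses.columns if c.startswith("satisfaction_")]

alpha = cronbachs_alpha(responses[scale_items])
print(f"Cronbach's alpha for the satisfaction scale: {alpha:.2f}")
# Values around 0.7 or higher are commonly read as acceptable internal consistency.
```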


