Leonieke Boendermaker (School of Social Work and Law, Amsterdam University of Applied Sciences)
Renée de Wildt-Liesveld (VU University, Faculty of Science, Science and Society)
Imagine you are busy implementing a new approach in your organization, one that requires a different way of thinking and working. How do you manage that? Where do you start, and with whom? In this workshop we share the experiences and examples of Child Protection Amsterdam (Boendermaker & Regeer, 2018). In 2011 this organization started to work with Intensive System Focused Case Management (ISC).
In this workshop we will use the 'eye-opener workshop' methodology, developed by one of the workshop submitters (see Van Mierlo et al. 2010). The aim of an eye-opener workshop is to engage participants in the learning process of a pioneer and to formulate joint lessons. The idea is that lessons learned should be formulated in co-creation with those who will apply them in their own practice, rather than by pioneers/innovators or researchers alone. This requires sharing the pioneer's experiences in detail, including highs and lows.
At Child Protection Amsterdam we found strategies in four areas: competences of employees, organizational systems, leadership and culture (Bertram, Blase, & Fixsen, 2015). Depending on the learning needs of the participants, we will choose two areas in which we explain the experiences at Child Protection Amsterdam in detail.
Dr Wendy Hardeman (University of East Anglia)
Dr Elaine Toomey (National University of Ireland Galway)
This workshop aims to introduce and develop knowledge and skills in process evaluation with a specific focus on assessing fidelity within behavioural trials and what this means for subsequent implementation into practice. The workshop will use a combination of interactive strategies to facilitate engagement and learning including group-work, question and answer sessions and hands-on practical activities. By the end of this workshop, participants should:
1. Be able to explain the importance of process evaluation and provide an overview of its three components.
2. Understand the concept of fidelity and recognize that it is broader than the assessment of delivery, i.e. it incorporates enhancing, assessing and reporting fidelity across intervention stages and at multiple stakeholder levels.
3. Understand fidelity assessment strategies, their pros and cons, and considerations in their selection.
Bianca Albers (European Implementation Collaborative)
Katie Burke (Centre for Effective Services)
Allison Metz (National Implementation Research Network)
In 2018, NIRN and CES partnered with the European Implementation Collaborative (EIC) to validate the skills and competencies and to build more rigorous evidence for their use in implementation science. Through this effort, 17 international intermediaries have been invited to provide feedback on the competencies through a survey. A systematic scoping review is also underway. Following results from these activities, the competencies will be refined, additional data will be collected through interviews, and usability testing will be conducted with a subset of international intermediaries. Presenters will share initial findings from the survey and scoping review and seek feedback from conference participants. Presenters hope to have a lively conversation with workshop participants about what they see as essential skills and competencies based on their experience.
Barbora Krausova (King’s Improvement Science, Institute of Psychiatry, Psychology & Neuroscience at King's College London)
In this session we will teach delegates how to use a practical feasibility tool that we, the King's Improvement Science (KIS) team, have developed to help people assess the likelihood of their quality improvement projects being successful. The tool is our quality improvement project decision matrix, which you can find on pages 12-15 of the 'Step 2 guidance' (please see link).
The matrix comprises a list of 12 questions that everyone should ask before starting an improvement project. For example: ‘Is there evidence showing that an improvement needs to be made?’ or ‘Do you foresee any major barriers to introducing your proposed change(s)?’
The questions are based on published evidence regarding good practice in quality improvement and on the experience of KIS in conducting and supporting quality improvement projects. The matrix has also had significant input from improvement and implementation science experts at the Centre for Implementation Science, where our team is based.
The idea is that, ultimately, the matrix will help people to implement evidence-based interventions / guidelines into routine practice.
In this workshop, delegates will work together in small groups to evaluate a quality improvement project scenario using a sub-set of the decision matrix.
1) How to implement community-based suicide prevention programs across borders
Juliane Hug (European Alliance Against Depression, project coordinator of the European Alliance Against Depression)
2) Using a theory-based model and practical experiences to reflect on your implementation barriers and facilitators and work out tailor-made solutions
Willeke Vos (Tilburg University)
3) Research funders' roles in dissemination and implementation: results and examples from an international survey
Barbara van der Linden (ZonMw)
Rebecca Abma-Schouten (Dutch Heart Foundation)
4) Involving patients and service users in implementation and improvement - lessons from Northwest London
Stuart Green (DrPH student, Faculty of Public Health and Policy, London School of Hygiene and Tropical Medicine and Public Health Research Fellow, NIHR CLAHRC for Northwest London, School of Public Health, Imperial College London)