Development of MB 1-on-1
The goal of developing MB 1-on-1 was to adapt intervention content into 15–20 min segments that could be effectively incorporated into regular home visits, complementing the educational information and supportive communication home visitors provide during visits. In doing so, care was taken to retain the core content of the MB Group curriculum while modifying the intervention to ensure its appropriateness for 1-on-1 delivery. As a starting point, all activities requiring small or large group conversations were identified; those that required multiple group members were either revised for provider-client interaction or removed. Weekly group relaxation activities were also removed because of their time requirement, although a session teaching relaxation techniques was retained in MB 1-on-1. We then removed material that was duplicative across sessions (e.g., multiple discussions of harmful vs. helpful thoughts in the Thoughts module became the focus of a single MB 1-on-1 session). This process resulted in a 15-session version of MB 1-on-1.
Along with modifying the content to be delivered in 15 brief sessions, considerable effort was devoted to revising the Instructor’s Manual to make it user-friendly for home visitors accessing the Manual in a client’s home. Each session consists of two to three topics and a personal project (i.e., homework) to be completed between sessions. All topics contain: a) key points for home visitors to communicate to clients as takeaway messages, b) a script to guide home visitors in delivering content, and c) interactive learning activities that help clients practice and understand key concepts.
Setting
In partnership with the Illinois Governor’s Office for Early Childhood Development (OECD), an open invitation was extended to HV programs across Illinois to attend one of three regional, day-and-a-half-long trainings on MB 1-on-1 led by the first author between February and August 2014. Eighty-one home visitors from 19 HV programs in Illinois attended one of the trainings. Of these programs, 14 (74%) used the Healthy Families America model and 5 (26%) used the Parents as Teachers model. The number of home visitors employed per agency ranged from 1 to 7. Healthy Families America enrollment targets parents facing challenges such as single parenthood, low income, a childhood history of abuse and other adverse childhood experiences, and current or previous issues related to substance abuse, mental health, or domestic violence. Individual Healthy Families sites are given latitude to select specific characteristics of the target population they plan to serve, while enrolling families prenatally or within three months of birth [34]. Parents as Teachers provider organizations select the specific characteristics and eligibility criteria of the target population they plan to serve, such as children with special needs, families at risk for child abuse, income-based criteria, teen parents, first-time parents, immigrant families, families with low literacy, or parents with mental health or substance use issues, and enroll families during pregnancy and the postpartum period [35].
During training, home visitors received an orientation to MB’s cognitive-behavioral therapy underpinnings, followed by an orientation to the implementation guidelines and the structure of the Instructor and Participant Manuals. The remainder of the training guided home visitors through each of the 15 sessions. Didactic portions of the training introduced home visitors to the key points within each session and potential challenges in communicating material to clients. The training provided ample time for attendees to engage in small and large group activities, including opportunities to practice introducing core MB content, facilitating provider-client interactive activities, and presenting personal projects through structured role plays.
OECD was interested in training its HV programs to better address maternal depression among clients and encouraged all trained HV programs to pilot the intervention immediately. Specifically, home visitors attending a training were asked to implement MB with 1 to 2 clients who were pregnant or had a child younger than 6 months. In cases where prenatal and recently postpartum clients were not available, home visitors were encouraged to implement MB with clients on their caseload who could benefit from the intervention, the goal being to practice implementation soon after training and while supervision was offered.
Intervention implementation
MB 1-on-1 recommends session delivery every 1–2 weeks, with flexibility to conduct more than one session per week if a home visitor believes the material is particularly salient for a client. Home visitors were trained to deliver an MB session at the beginning or end of a scheduled home visit, with flexibility to conduct a session by phone if a client was unable to complete the scheduled home visit. Other individuals were permitted to be present during MB delivery (e.g., the father of the baby); however, home visitors were encouraged to use professional judgment regarding whether the presence of another individual would compromise MB delivery or client openness. During sessions, clients referred to material in the Participant Manual, which also described the personal projects to be completed between sessions. Home visitors were trained to implement MB 1-on-1 chronologically from session 1 through 15.
Each HV program received biweekly phone supervision from the first author upon beginning MB implementation. Supervision was provided for 4–6 months, which corresponded to the length of time it took, on average, for home visitors at an HV program to implement the 15-session curriculum. Supervision calls allowed home visitors to share and receive feedback on both content and implementation process. For example, home visitors posed content-related questions on how to help clients better understand a core MB concept, or process-related questions on how to most effectively transition from delivering core HV content to MB material.
Data sources and data collection
The Northwestern University Institutional Review Board reviewed and approved the study protocol and all data collection activities described herein. The protocol was deemed minimal risk and pursuant to quality assurance activities, and it was therefore granted a waiver of consent.
Client acceptability
At the end of each of the 15 MB sessions, clients were asked to complete a brief, de-identified 5-question survey given to them by their home visitor. Each survey asked the same questions: a) How much did you enjoy today’s session? (1 = not enjoyable at all, 2 = somewhat enjoyable, 3 = very enjoyable); b) How well did you understand the information in today’s session? (1 = didn’t understand, 2 = somewhat understood, 3 = totally understood); and c) How useful was the information in today’s session? (1 = not useful at all, 2 = somewhat useful, 3 = very useful). Two open-ended questions were also asked: Do you have any suggestions on how to make today’s session better? and, Was there anything you really enjoyed or thought was really useful in today’s session?
Home visitor feasibility & acceptability
Home visitors also completed a survey at the end of each session. Each survey asked the home visitor to indicate, for each session topic: a) how much of the material was covered (not covered, somewhat covered, totally covered), b) how well they thought the client understood the topic (did not understand at all, somewhat understood, totally understood), and c) how engaged the client was (not engaged at all, somewhat engaged, very engaged). Home visitors were also given the opportunity to provide open-ended responses indicating whether there were challenges in discussing session topics.
Data analysis
Home visitors entered their data directly into a web-based Research Electronic Data Capture (REDCap) survey or emailed or faxed their data to the research team, who entered it into REDCap. Client data were collected by home visitors and emailed or faxed to the research team for REDCap data entry. All data were subsequently exported to SPSS 22 for analysis. To analyze client acceptability data, we assessed enjoyment by calculating the percentage of women who reported finding each session very enjoyable, somewhat enjoyable, or not enjoyable at all. We assessed content comprehension by calculating the percentage of women who reported that they didn’t understand, somewhat understood, or totally understood each session. We assessed perceived usefulness of MB skills by calculating the percentage of women who reported finding skills not useful at all, somewhat useful, or very useful.
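To illustrate these descriptive calculations (which the study conducted in SPSS), the sketch below shows one way the per-session response percentages could be computed. The pandas library, the file name, and the column names ('session', 'enjoyment', 'understanding', 'usefulness') are assumptions for illustration and do not reflect the study's actual analysis files.

```python
# Minimal sketch, not the study's SPSS analysis. File and column names are
# hypothetical; items are assumed to be coded 1-3 as on the client survey.
import pandas as pd

clients = pd.read_csv("client_surveys.csv")  # hypothetical REDCap export

# For each survey item, percentage of respondents choosing each 1-3 rating,
# broken down by session (1-15)
for item in ["enjoyment", "understanding", "usefulness"]:
    pct = (
        clients.groupby("session")[item]
        .value_counts(normalize=True)  # proportion of each rating within a session
        .mul(100)                      # convert to percentages
        .rename("percent")
        .reset_index()
    )
    print(f"\n{item} by session:\n", pct.to_string(index=False))
```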
To assess home visitor feasibility and acceptability data, we examined fidelity of MB implementation by calculating the percentage of home visitors who reported that they totally covered, somewhat covered, or did not cover each topic within a session. Each session contains between 2 and 4 topics; the percentages for each topic were combined into an average score indicating the extent to which a session was covered. We assessed client comprehension by calculating the percentage of home visitors who indicated that a client totally understood, somewhat understood, or did not understand the session material at all for each session. We also assessed client engagement by calculating the percentage of home visitors who indicated that a client was very engaged, somewhat engaged, or not engaged at all for each session.
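The fidelity calculation amounts to computing the share of "totally covered" reports for each topic and then averaging those topic-level percentages within a session. A minimal sketch follows; again, the file name and the columns ('session', 'topic', 'coverage') are hypothetical.

```python
# Minimal sketch with hypothetical file and column names: percentage of home
# visitor reports marking each topic "totally covered", averaged across a
# session's topics to yield a session coverage score.
import pandas as pd

hv = pd.read_csv("home_visitor_surveys.csv")  # hypothetical REDCap export

topic_pct = (
    hv.assign(totally=hv["coverage"].eq("totally covered"))
      .groupby(["session", "topic"])["totally"]
      .mean()    # proportion of reports per topic marked "totally covered"
      .mul(100)  # convert to percentage
)

# Average the topic-level percentages within each session
session_coverage = topic_pct.groupby(level="session").mean()
print(session_coverage)
```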
For each client and home visitor acceptability and feasibility construct, we calculated percentages: a) across all 15 sessions (i.e., the entire intervention) and b) for each MB module [Sessions 1–3 (Overview), Sessions 4–6 (Pleasant Activities), Sessions 7–11 (Thoughts), and Sessions 12–15 (Contacts with Others)]. All open-ended client and home visitor responses were exported into Microsoft Excel and coded inductively by a member of the research team; open-ended responses were provided by 95% of clients and 91% of home visitors. The same research team member created two matrices, one for client data and one for home visitor data. Each matrix had a row for each code and a column for each respondent, and all open-ended data were placed in one of these matrices. These matrices were then used to guide an analysis that identified the most common themes and patterns [36] related to the constructs addressed by the open-ended questions, for example, client enjoyment and home visitors' perceptions of intervention fidelity.
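The module-level summaries described above amount to grouping per-session percentages by module. A minimal sketch, assuming a per-session percentage series with arbitrary placeholder values (illustration only, not study results) and the session-to-module mapping given in the text:

```python
# Minimal sketch of module-level aggregation. The per-session values below
# are arbitrary placeholders for illustration only, not study results.
import pandas as pd

session_pct = pd.Series(
    [90, 85, 88, 92, 80, 84, 79, 83, 86, 81, 77, 88, 90, 85, 82],  # placeholders
    index=range(1, 16),
    name="percent_very_enjoyable",
)

# Session-to-module mapping as described in the text
modules = {**{s: "Overview" for s in range(1, 4)},
           **{s: "Pleasant Activities" for s in range(4, 7)},
           **{s: "Thoughts" for s in range(7, 12)},
           **{s: "Contacts with Others" for s in range(12, 16)}}

module_pct = session_pct.groupby(session_pct.index.map(modules)).mean()
print(module_pct)
```

Note that this sketch weights each session equally within a module; pooling individual responses across a module's sessions would be a reasonable alternative and could yield slightly different percentages.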