Use Cases

• The new infrastructure built through Empowering the Participant Voice (EPV) has enabled Rockefeller to send participants personalized email invitations that link to a mobile-friendly electronic survey. The survey takes 2-5 minutes to complete.

• We have used the RPPS survey to collect participant experience feedback at Rockefeller since 2013, originally with paper surveys sent by postal mail. In the past we have used RPPS results to improve how well participants say the informed consent process prepares them for research, and to better communicate to participants that they are valued as essential partners in the research process.

• To adapt to the new EPV platform, we revised how we extract and transform information from our in-house data systems. We upgraded our REDCap version to more easily support multi-lingual fielding: we offer the survey in English and Spanish.

• Locally, we developed an awareness campaign using posters and brochures to raise the survey's visibility among participants; we also gave presentations to hospital and research staff. In advance of the current project, we gathered feedback from stakeholders, including research participants, members of the IRB, hospital and research staff, and hospital and research leadership.

• As of February 2022, we field surveys every two months, emailing a personalized survey link to every research participant soon after the visit at which they complete an informed consent form, and again at the end of their study participation. We also send annual surveys to participants in long-term studies. We send 400-500 surveys/year. Our response rate is about 23%; the demographics of our respondents are shown on our results page.

• We track the characteristics of the studies that the respondents were enrolled in and the demographics of the participants (responders and non-responders) so we can understand how experiences differ among different groups and in different studies.

• We review the survey results and the free-text comments from participants after every round of surveys and identify what is actionable (positive or negative).

• We analyze results for our institution overall, over time, within and across studies, and compared to the Consortium results for multiple centers. We use many of the dashboard and custom reporting features regularly.

• We share the results of our RPPS surveys and anonymous comments directly with our:
o Investigators and research team members
o Community Advisory Board (CAB)
o Hospital Stakeholder Results Analysis and Action team
o Continuous Quality Improvement Committee
o Institutional Review Board
o Research participants and the public, via our RPPS Results Page

• In addition to our community advisors, we would like to develop a research participant stakeholder group to help us learn how to increase our response rates and provide input on analysis and action.

The University of Rochester employs an enterprise-wide implementation of the RPPS for all OnCore CTMS studies, which currently represent a defined subset of all studies (those deemed to carry billing risk). This is expected to expand to additional studies in the coming year. A semi-automated process was established to refine the list of research subjects receiving the survey to include only subjects from interventional studies in which interactions with the study team had occurred.
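The semi-automated refinement step could look roughly like the following filter over an exported subject list. This is a minimal sketch: the record fields (`study_type`, `visit_count`, `email`) are hypothetical illustrations, not actual OnCore export columns.

```python
# Hypothetical sketch: keep only subjects from interventional studies
# who have had at least one study-team interaction and are reachable
# by email. Field names are illustrative, not real OnCore columns.

def refine_survey_list(subjects):
    """Filter subject records (dicts) down to survey recipients."""
    return [
        s for s in subjects
        if s.get("study_type") == "Interventional"
        and s.get("visit_count", 0) > 0
        and s.get("email")
    ]

subjects = [
    {"id": 1, "study_type": "Interventional", "visit_count": 3, "email": "a@example.org"},
    {"id": 2, "study_type": "Observational", "visit_count": 2, "email": "b@example.org"},
    {"id": 3, "study_type": "Interventional", "visit_count": 0, "email": "c@example.org"},
]
recipients = refine_survey_list(subjects)  # only subject 1 qualifies
```

In practice a step like this would run against a periodic CTMS export, with the filtered list feeding the survey-invitation mailing.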

• The site meets at least annually with existing internal stakeholder groups rather than establishing a new group, leveraging established structures and institutional initiatives and supporting sustainability. These groups include the Community Advisory Council and the Health Research Advisory Committee. In addition to community members and investigators, stakeholders include leaders from the Office of Clinical Research, the Office of Human Research Protection, the Senior Associate Dean for Clinical Research, and the Privacy Office.

• The University of Rochester conducted sub-analyses of studies associated with its Cancer Center and compared findings with non-Cancer Center studies. Those findings were discussed with Cancer Center leadership.

• To promote survey completion, the team also created a flyer about the survey for study teams to distribute to their enrolled participants. Uptake on this was limited.

• The survey is distributed via an individualized email link to approximately 1,000 individuals per year. Eligible individuals are those who have either recently enrolled in or recently completed a study. One email reminder is sent about a week later. After testing several options, the team determined that Saturday was the optimal day for emailing, yielding the highest response rate. The response rate varies but is generally about 21% overall, achieved after establishing a raffle-based incentive program ($50; 1:25 chance of winning).

• Since not all research participants were reachable by email, the team established a paper-survey process to determine whether these participants had different experiences than those who responded by email. Differences were found on several key survey items, so despite being a manual process, this supplementary survey method will be continued.

• Based on input from community members, the team also sent paper surveys to Black and Hispanic/Latino participants to determine whether their response rate would be higher than with emailed surveys. The response rate for these sub-groups did not improve.

• Similar to other sites, they track the characteristics of the studies that the respondents were enrolled in and the demographics of the participants (responders and non-responders) to understand who is and is not responding to the survey.
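Tracking response by subgroup reduces to a grouped rate calculation. A minimal sketch, assuming each participant record carries a demographic `group` field and a boolean `responded` flag (both names illustrative, not fields from any site's actual data system):

```python
from collections import defaultdict

def response_rates(participants):
    """Compute the response rate per demographic group from records
    with 'group' and 'responded' keys (illustrative field names)."""
    sent = defaultdict(int)
    replied = defaultdict(int)
    for p in participants:
        sent[p["group"]] += 1
        replied[p["group"]] += p["responded"]
    return {g: replied[g] / sent[g] for g in sent}

sample = [
    {"group": "A", "responded": True},
    {"group": "A", "responded": False},
    {"group": "B", "responded": True},
    {"group": "B", "responded": True},
]
rates = response_rates(sample)  # {"A": 0.5, "B": 1.0}
```

Comparing these per-group rates against the overall rate is what lets a site see who is and is not responding to the survey.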

• Survey results and the free-text comments from participants are analyzed quarterly to identify what is actionable (positive or negative).

• They analyze results for the institution overall, over time, within and across studies, and compare them to the Consortium results (using the various dashboards and custom report options). In addition to the dashboard provided through Vanderbilt, the team used Tableau and REDCap to create additional visualizations to share with stakeholders.

• Importantly, a substantial number of the survey items are rated highly, demonstrating the areas where the research teams and the research experience are meeting or exceeding participant expectations. This can help allay institutional or study team concerns about what conducting a survey might reveal.

• The team also publishes a public-facing report of the survey results that is updated annually (https://www.urmc.rochester.edu/research/health-research/empowering-the-participant-voice-public-report.aspx).

 

• We have used the RPPS survey to collect participant experience feedback at JHU since 2016. Every 6 months we use our clinical trial management system to select a random sample of research participants consented and enrolled in a study within the last 6 months. We offer the survey in English and Spanish.
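The periodic sampling step described above could be sketched as follows; the `enroll_date` field name and the exact window length are illustrative assumptions, not details of the JHU system.

```python
import random
from datetime import date, timedelta

def select_sample(enrollments, today, k, window_days=183):
    """Randomly sample up to k participants enrolled within roughly the
    last 6 months. Records need an 'enroll_date' (datetime.date);
    the field name is illustrative, not an actual CTMS export column."""
    cutoff = today - timedelta(days=window_days)
    recent = [e for e in enrollments if e["enroll_date"] >= cutoff]
    return random.sample(recent, min(k, len(recent)))

# One enrollment roughly every 30 days through 2024.
enrollments = [
    {"id": i, "enroll_date": date(2024, 1, 1) + timedelta(days=30 * i)}
    for i in range(8)
]
sampled = select_sample(enrollments, today=date(2024, 9, 1), k=3)
```

`random.sample` draws without replacement, so no participant is invited twice within one fielding round.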

• Locally, we engage our stakeholders through meetings, emails, and the posting of survey results on a website provided to participants at the time of the survey. Our stakeholders include the Institutional Review Board, research teams, Research Participant Partners, and our local Community Research Advisory Council. The survey results are reassuring: research participants at Johns Hopkins report a good experience. With these results, our stakeholders are more confident promoting research to interested individuals.

• Our response rate is about 23%.

• We review the demographics of the participants (responders and non-responders) so we can understand how experiences differ among different groups and in different studies and found that the satisfaction rates are consistent across participants of different races.

• We review all of the survey results and the free-text comments from participants after every round of surveys and identify what is actionable (positive or negative).

• We share the results of our RPPS surveys and anonymous comments directly with our:

o Investigators and research team members in training sessions at the University
o Community Research Advisory Council (C-RAC)
o Institutional Review Board
o Research Participants and the public, on a public-facing webpage

• During the COVID-19 pandemic we fielded the survey to participants enrolled in COVID-19 research to assess their experiences participating in research during the pandemic. We were able to reassure our Human Research Protection Program that participants reported a positive experience.

• The RPPS survey had previously been used only to measure the experience of adults age 18 and older. With minimal revisions we created a survey for parents whose children were in a research study. We fielded our first survey of parents in fall 2023 and plan to continue the survey process in 2024.

 

Wake Forest incorporated the built infrastructure of the EPV grant into existing institutional initiatives, making the EPV project itself an important institutional initiative.

The importance of EPV at Wake Forest allowed the team to advance innovative ideas, such as utilizing the Patient Portal to distribute the survey. The team fields surveys to all research participants with a portal account. The Portal is a secure, convenient, low-cost method of communicating with research participants, but a systematic process for sending individualized links to recipients through the portal was needed. Once developed, opportunities for the use of the Patient Portal in other research-related capacities emerged. An institutional approval process for even broader use of the portal in research expanded Wake Forest’s outreach capabilities in research overall.

The site team has made extensive efforts to raise awareness of the RPPS initiative among research departments and staff through stakeholder presentations and flyers. Flyers are provided to clinic sites, and posters in exam rooms (and occasionally in elevators throughout the organization) provide information about the survey and the Patient Portal.

The RPPS data provides insight into how research participants at Wake Forest perceive different aspects of their research experience. The team also reviews the types of studies in which respondents participated and the demographics reported by respondents to gain better understanding of the research community’s experiences and any differences in experience. Both survey ratings and text comments are reviewed in analyzing results.

At Wake Forest, the EPV project has become an integral part of the research program and the connection to the community.

Duke University approaches individual study teams and implements the Research Participant Perception Survey (RPPS) on a study-by-study basis. The survey has undergone pilot testing across various study types, specifically targeting diverse populations. Our approach involves customizing the survey distribution for each study team, ensuring that survey results are promptly accessible for the purpose of designing and testing process improvements.

Currently, there is an ongoing initiative to conduct a pilot enterprise-wide deployment of the survey. This involves sending surveys to prospective cohorts in a variety of clinical trials within the Duke Cancer Institute.

Processes have been established to send survey messages to participants newly enrolled in the participating studies at approximately two-month intervals.

Engagements with individual study teams include an initial discussion of the purpose of the study and logistics of the survey distribution process; and a subsequent discussion with the presentation of survey results, encompassing both quantitative data and free-text comments.

Additionally, regular meetings are held with research leadership and teams to discuss the progress of the Empowering the Participant Voice (EPV) project. These meetings involve a comprehensive review of survey results with stakeholders and aim to raise awareness and garner support for EPV, ultimately seeking to boost participation across the entire research enterprise.

Vanderbilt is the technical partner for the EPV project. The Vanderbilt team built and maintains the technical software to support the survey fielding and Dashboards and serves as the Data Coordinating Center for data aggregation and the Consortium dashboard.


Contact Us

Rhonda G. Kost, M.D., Principal Investigator
The Rockefeller University Hospital
1230 York Ave.
New York, NY 10065