Effective Clinical Practice, March/April 1999
Terry S. Stein and Julie Kwan
For author affiliations, current addresses, and contributions, see end of text.
Background. Despite growing concern about the potential impact of managed care on the physician-patient relationship, efforts to enhance the quality of communication between practicing clinicians and their patients have been limited.
Objective. To determine the effectiveness of a 1-day educational workshop on physician-patient communication for practicing clinicians.
Design. Clinician self-assessment of interviewing skills measured immediately before and 3 months after the workshop.
Setting. The Kaiser Permanente Medical Care Program.
Participants. Practicing clinicians (n = 1384) in 22 workshops during a 5-year period. Nine hundred eleven participants (66% response rate) completed self-assessment questionnaires 3 months after the workshop.
Results. Self-assessed interviewing skills improved on all items 3 months after the workshop (P<0.05). Clinicians also reported a decline in the proportion of visits that they characterized as frustrating.
Conclusion. A 1-day educational intervention for large groups of practicing clinicians can improve confidence in medical interviewing skills and the ability to handle difficult encounters.
The shift in health care delivery to managed care is affecting the physician-patient relationship. (1, 2) Clinicians are frustrated that the scrutiny and regulation of their practice decisions interfere with patient care efforts. (3-5) Consumers, bewildered by the array of insurance alternatives, question how they can preserve a long-term relationship with a physician. (6, 7) Insurers and medical plan administrators, recognizing the link between the physician-patient relationship and patient satisfaction, promote personalized care to attract new members. (8)
This shift is increasing interest in teaching physicians how to communicate more effectively with patients, especially in difficult encounters. (9-12) Research on the effectiveness of communication training for practicing physicians usually does not address the practical questions that face health care leaders, such as whether skeptical clinicians will accept training programs about interpersonal skills, which elements of marketing and design enhance enrollment in such programs, and how such training affects clinicians' frustration with patients.
In this article, we describe the marketing, implementation, and assessment of a 1-day interactive workshop, "Thriving in a Busy Practice: Physician-Patient Communication." The workshop is designed to enhance the communication skills of practicing physicians in a large health maintenance organization. We describe the workshop's content and format, review clinician acceptance, and present results of two evaluation tools: a questionnaire measuring participants' responses and a self-assessment instrument measuring confidence in medical interviewing skills completed by participants before and 3 months after the workshop.
The program addressed development of skills for enhancing effectiveness and efficiency of the medical interview and for handling difficult patient visits. Table 1 outlines the workshop's design, which focused on practical approaches and a learner-centered format. Presenters were expert teachers of medical interviewing skills (i.e., leaders in the American Academy on Physician and Patient, faculty from the Bayer Institute for Health Care Communication, or physicians with specialized training in interpersonal skills).
During the introduction, presenters described the types of patient behaviors that clinicians often find frustrating: being angry, bringing lists, being demanding, having unrealistic expectations, seeking drugs, and challenging the physician's authority.
The session on effective interviewing featured a videotape of a patient describing an array of symptoms and concerns. The presenter facilitated the participants' responses to several questions by focusing on the patient's emotional state and the participants' reactions to the patient and by asking the participants what they would say next. Descriptions of key interviewing strategies (allowing patients to complete their opening statement without interruption, negotiating an agenda, setting limits for brief visits, responding to lists, eliciting the patient's beliefs, asking about psychosocial issues, and negotiating a treatment plan) were integrated into the large group discussions.
The session on difficult encounters offered a framework and a demonstration for understanding the dynamics of physician-patient conflict and reviewed two strategies: using empathy to defuse strong emotion, and establishing an alliance with the patient to overcome obstacles to communication. Next, case scenarios solicited through questionnaires from workshop participants before the program began were used to practice these skills. The clinician who submitted a specific case was offered the chance to play the role of the patient while a second volunteer from the group played the clinician. Each pair was coached by a facilitator and by the large group. After three or four cases were presented and discussed in the group setting, participants continued the exercise by using their own cases in groups of three. At the end of the day, presenters and participants returned to the list of frustrating interactions and shared tips and ideas on topics not otherwise addressed.
The initial pilot workshop took place in 1990 and was attended by physicians who had expressed interest in the topic. Piloting the workshop to an enthusiastic audience gave the program designers useful feedback and generated an informal marketing system for the program. For the first 2 years that the programs were offered, self-motivated participants were either invited or referred by the chief of education at each facility. By the third year, marketing among colleagues was well established and a mailing that offered the program to the entire clinician group resulted in full enrollment and a waiting list of 400. One aspect that was essential to the popularity of the program was its focus on meaningful, practical skills that relieved stress for participants; such skills included how to address unrealistic expectations, manage late patients, and avoid the "fix-it" mentality. Using the group's own cases enhanced the credibility of the material.
The workshop was offered several times a year from 1990 to 1995 for groups of 50 to 75 clinicians from different specialties. The 22 programs offered were cumulatively attended by 1384 clinicians, 1055 of whom were physicians. Table 2 shows the demographic and practice characteristics of the physicians participating in the workshops.
Workshop participants were asked to evaluate the workshop at its conclusion (i.e., to rate their level of agreement with the statements, "Overall, this workshop was excellent" and "I will recommend this workshop to my colleagues"). The two items were rated on a 7-point agreement scale in which 1 = "strongly disagree," 4 = "neutral," and 7 = "strongly agree."
Self-Assessment of Interview Skills
Clinicians completed a preworkshop assessment, which included a self-assessment questionnaire that asked them to evaluate themselves on 18 aspects of interactions with patients and to estimate the percentage of patient visits that they found frustrating. A postworkshop assessment, which included the same material, was done 3 months later. The items addressed on the self-assessment questionnaire were based on the objectives of the training workshop. These included the degree to which participants believed that they were able to elicit the real reasons for the office visit early in the interview, to guide patient-physician dialogue, and to respond effectively to various types of difficult patient interactions.
The 18 items on various aspects of clinician-patient interaction were rated on a 7-point scale. Of these, 6 items were rated in terms of frequency (7 = "extremely often," 1 = "never," and 4 = "sometimes," the midpoint of the scale) and 12 were rated in terms of agreement (7 = "agree strongly," 1 = "disagree strongly," and 4 = "neutral," the midpoint of the scale). Items on the self-assessment questionnaire were grouped into two scales that represented the general topics addressed in the workshop: 1) interview content and structure and 2) difficult interactions. Each scale contained nine items. A score on each scale was computed for each participant by summing the ratings for the items in that scale and dividing by the number of items the participant answered.
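The scoring rule described above amounts to taking the mean of the items a participant actually answered. A minimal sketch in Python; the function name and the item ratings shown are hypothetical illustrations, not study data:

```python
def scale_score(ratings):
    """Mean of the answered (non-None) 1-7 item ratings; None if nothing answered."""
    answered = [r for r in ratings if r is not None]
    return sum(answered) / len(answered) if answered else None

# A participant who skipped one of the nine items on a scale:
ratings = [5, 6, 4, None, 7, 5, 6, 4, 5]
print(scale_score(ratings))  # mean over the 8 answered items
```

Dividing by the number of answered items, rather than by all nine, keeps a skipped item from pulling a participant's scale score toward zero.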
The final item on the self-assessment questionnaire asked participants to indicate the percentage of patient visits that they found frustrating by using a 6-point scale in which 1 indicated less than 11% and 6 indicated more than 90%.
For the pre- and postworkshop self-assessment measures, mean ratings were calculated for each frequency-agreement scale item and for both scales. Matched t-tests were done to determine whether differences between pre- and postworkshop ratings were statistically significant. All analyses of the self-assessment data were done by using the Statistical Package for the Social Sciences (SPSS, Chicago, Illinois).
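A matched (paired) t-test compares each participant's pre- and postworkshop ratings pairwise, testing whether the mean within-participant difference differs from zero. The article's analyses were run in SPSS; the sketch below, with hypothetical ratings, shows only the t-statistic computation:

```python
import math
import statistics

def paired_t(pre, post):
    """Return the paired t statistic and degrees of freedom for matched ratings."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean_d = statistics.mean(diffs)
    sd_d = statistics.stdev(diffs)        # sample SD of the paired differences
    t = mean_d / (sd_d / math.sqrt(n))    # t with n - 1 degrees of freedom
    return t, n - 1

# Hypothetical 7-point ratings from 8 participants, before and after:
pre = [4, 5, 4, 3, 5, 4, 4, 5]
post = [5, 6, 5, 4, 6, 5, 5, 5]
t, df = paired_t(pre, post)
print(f"t = {t:.2f}, df = {df}")
```

The resulting t statistic would then be compared with the t distribution on n - 1 degrees of freedom to obtain the P value, which is what the statistical package reports.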
Data were available for 21 of the 22 workshops; the survey was not administered in the remaining workshop. (Instead, participants were asked to give qualitative responses.) Evaluations were completed by 1063 of 1384 participants (response rate, 77%). Respondents strongly agreed with the statements that the workshop was excellent (mean, 6.2) and that they would recommend it to a colleague (mean, 6.3).
Nine hundred eleven participants (response rate, 66%) responded to the self-assessment questions 3 months after the workshop. Mean ratings on all items showed improvement after the workshop, and the largest improvement was found for the items on the difficult interactions scale.
Figure 1 shows the self-assessment on the medical interview content and structure scale. Before the workshop, the mean rating for this scale was 5.16. Three months after the workshop, it was 5.51. Figure 2 shows the self-assessment on the difficult interactions scale. Before the workshop, the mean rating for this scale was 4.05. Three months after the workshop, it was 4.97.
Figure 3 indicates the percentage of visits that the clinicians characterized as frustrating. Before the workshop, half of the respondents indicated that more than 11% of patient visits were frustrating. After the workshop, only one third characterized more than 11% of their visits as frustrating.
All differences between the pre- and postworkshop self-assessment ratings on the frequency-agreement items were statistically significant (P<0.05). Statistically significant improvements were also noted for the two composite scales. In addition, a statistically significant decrease was found between the pre- and postworkshop self-assessment ratings in the percentage of visits the respondents found frustrating.
Our evaluation suggests that participants in the 1-day continuing medical education workshop, "Thriving in a Busy Practice: Physician-Patient Communication," liked the program and indicated that they would recommend it to colleagues. Three months after the workshop, participants had more confidence in their communication skills, especially those skills used in responding to difficult interactions. The number of visits the participants found frustrating was significantly reduced.
We believe that several elements in the design and delivery of the program contributed to its high ratings and popularity. By inviting motivated participants to the initial programs, we were able to gain confidence about the content and format while "preaching to the choir." These early participants spoke highly of the workshop to colleagues and offered important feedback to workshop faculty. In later programs, we included an "alumni list" with each workshop binder so that new attendees could see the names of colleagues who had completed the program. Because the ultimate success of broad, voluntary training to improve the physician-patient relationship depends in part on word-of-mouth recommendations from colleagues, its developers must gauge the initial response to the program, refine the content and format accordingly, and use the positive experiences and comments of early participants to promote further interest.
In particular, comments contained in the workshop evaluations indicated that participants appreciated the chance to share problems and strategies with colleagues from various specialties and to learn new ways to handle difficult encounters. Because the workshop was interactive, participants often heard responses to their questions from colleagues instead of solely from faculty. The wisdom and experience of the group enhanced the practical value and credibility of the material. On the basis of scores from workshop evaluations, anecdotes from participants months and years later, and the current high enrollment in the companion workshop, "Thriving 2," we believe that busy clinicians can recognize and address the need to develop communication skills through continued training.
A few participants responded negatively to the workshop. Their comments centered on three themes: that the material did not pertain to the participant's specialty (especially for pediatrics and emergency medicine), that systems problems--and not communication problems--undermine the physician-patient relationship, and that additional topics (time management, stress management) would have been useful. As a result, specialty-based communication programs have been designed, systems issues have been better differentiated from course content, and programs on managing time and stress have been offered separately.
Other Efforts To Improve Communication
Many medical schools (13) and some residency programs have begun training in communication skills during the past decade, but the quality and scope vary. Only a few organizations, primarily the Bayer Institute for Health Care Communication (14) and the American Academy on Physician and Patient (15), have provided courses on interviewing to practicing physicians in multiple health care settings.
Table 3 outlines some of the major studies that have evaluated communication skills programs. These evaluations provide some preliminary information about outcomes, (16) but the specific material taught and its research basis are not always well defined. Levinson and Roter (17) analyzed audiotapes of medical visits before and after participants attended either a 4.5-hour workshop or a 2.5-day course on medical interviewing. Only the longer course resulted in a statistically significant behavior change. Gask and coworkers (18-20) reported that experienced general practitioners attained statistically significant, sustained improvement in psychiatric interviewing skills and in managing somatization after attending small group courses using videotape feedback. Roter and coworkers (21) evaluated two 8-hour communication skills courses that emphasized addressing patients' emotional distress. Audiotape analysis showed a statistically significant improvement in the targeted skills relative to untrained physicians. Joos and colleagues (22) showed that a 4.5-hour intervention changed physician behavior but had no effect on patient outcomes.
Our evaluation has several limitations. Because the assessment tool was subjective and was not correlated with observed behavior change, conclusions about the program's effectiveness remain speculative. Our study did not provide information about patient outcomes (e.g., satisfaction with their clinicians' communication). Further, the self-assessment data represented a single point in time 3 months after the workshop and did not include the complete sample. The incomplete response and the voluntary nature of participation in the follow-up evaluation could bias the results in favor of the workshop. No information is available on the characteristics of the one third of workshop participants who did not complete the postworkshop self-assessment. Evaluating how our findings relate to actual and sustained use of enhanced communication skills requires future assessment.
Although our evaluation did not measure communication behavior, the large sample size and the consistency of improvement in self-ratings found across the 22 presentations of the program suggest that this 1-day intervention enhanced the participants' confidence and decreased their frustration. Because more than one third of physicians in one large study (10) reported frustration with the interpersonal dynamics during as many as 25% of patient visits, efforts to decrease this potential source of dissatisfaction could reduce overall physician stress.
The combination of the workshop evaluation and the self-assessment ratings indicates that a 1-day educational intervention on communication skills for large groups of practicing clinicians can be received enthusiastically, can improve self-rated confidence in the skills covered, and can reduce the percentage of visits perceived as frustrating. On the basis of these results, we have implemented a multifaceted array of programs and services to integrate skill-building in clinician-patient communication into ongoing medical education.
1. Emanuel EJ, Dubler NN. Preserving the physician-patient relationship in the era of managed care. JAMA. 1995;
2. Emanuel EJ, Brett AS. Managed competition and the patient-physician relationship. N Engl J Med. 1993;329:879-82.
3. Orentlicher D. Health care reform and the patient-physician relationship. Health Matrix. 1995;5:141-80.
4. Ethical issues in managed care. Council of Ethical and Judicial Affairs, American Medical Association. JAMA. 1995;273:330-5.
5. Wennberg JE. Health care reform and professionalism. Inquiry. 1994;31:296-302.
6. Swee DE. Health care system reform and the changing physician-patient relationship. N J Med. 1995;92:313-7.
7. Ward RA. Age and patterns of HMO satisfaction. J Aging Health. 1990;2:242-60.
8. Spierer M, Sims HW, Micklitsch CN, Lewis BE. Assessment of patient satisfaction as part of a physician performance evaluation: the Fallon Clinic experience. J Ambulatory Care Manage. 1994 Jul;17:1-7.
9. Goldman HB, Reinhard JD, Kroll TZ. When patients complain: a special challenge--point and counterpoint. HMO Pract. 1991 Mar-Apr;5:51-4.
10. Levinson W, Stiles WB, Inui TS, Engle R. Physician frustration in communication with patients. Med Care. 1993;31:285-95.
11. Bartlett EE. Manage the difficult patient to reduce malpractice risk. HMO Pract. 1995 Jun;9:84-7.
12. Finocchio LJ, Bailiff PJ, Grant RW, O'Neil EH. Professional competencies in the changing health care system: physicians' views on the importance and adequacy of formal training in medical school. Acad Med. 1995;70:1023-8.
13. Novack DH, Volk G, Drossman DA, Lipkin M Jr. Medical interviewing and interpersonal skills teaching in US medical schools: progress, problems, and promise. JAMA. 1993;269:2101-5.
14. Segal ES. Maintaining communication in a time of uncertainty. Arch Fam Med. 1995;4:1066-7.
15. Novack DH, Suchman AL, Clark W, Epstein RM, Najberg E, Kaplan C. Calibrating the physician. Personal awareness and effective patient care. JAMA. 1997;278:502-9.
16. Stewart MA. Effective physician-patient communication and health outcomes: a review. CMAJ. 1995;152:1423-33.
17. Levinson W, Roter D. The effects of two continuing medical education programs on communication skills of practicing primary care physicians. J Gen Intern Med. 1993;8:318-24.
18. Gask L, McGrath G, Goldberg D, Millar T. Improving the psychiatric skills of established general practitioners: evaluation of group teaching. Med Educ. 1987;21:362-8.
19. Bowman FM, Goldberg DP, Millar T, Gask L, McGrath G. Improving the skills of established general practitioners: the long-term benefits of group teaching. Med Educ. 1992;26:63-8.
20. Kaaya S, Goldberg D, Gask L. Management of somatic presentations of psychiatric illness in general medical settings: evaluation of a new training course for general practitioners. Med Educ. 1992;26:138-44.
21. Roter DL, Hall JA, Kern DE, Barker LR, Cole KA, Roca RP. Improving physicians' interviewing skills and reducing patients' emotional distress: a randomized clinical trial. Arch Intern Med. 1995;155:1877-84.
22. Joos SK, Hickam DH, Gordon GH, Baker LH. Effects of a physician communication intervention on patient care outcomes. J Gen Intern Med. 1996;11:147-55.
The authors thank Wendy Levinson, MD, Director, Clinical Scholars Program, University of Chicago Pritzker School of Medicine, for assisting with design of the workshop program and instruction and reviewing drafts of the manuscript; Marilyn Libresco, MS, and Pamela Larson, MPH, for assisting with workshop design and management; Freddie Dempster for compiling data and providing project assistance; Janet Angell for providing data entry and report production; and Susan Bachman, PhD, for providing data analysis. We also thank the Medical Editing Department of Kaiser Foundation Research Institute for editorial assistance.
The pilot project was initially funded by an Innovation Program grant from The Permanente Medical Group and Kaiser Foundation Health Plan.
Preliminary data were presented at the 15th annual meeting of the Society of General Internal Medicine, Washington, D.C., April 29-May 2, 1992.
Terry S. Stein, MD, TPMG Physician Education & Development, The Permanente Medical Group, 1800 Harrison Street, 21st Floor, Oakland, CA 94612.