What do patients want to improve in their own general practice? Adapting discrete choice experiments for local use.
Patient experience data are increasingly used as a proxy for public voice in health care design and delivery. However, experience data may not reflect patient values, and their use raises questions about democratic validity: who decides how the data are interpreted and used? An alternative is to ask the public about their priorities directly, using a discrete choice experiment (DCE). DCEs are commonly used to inform regional or national decision making. We tested the feasibility and acceptability of co-designing and delivering a locally contextualised DCE to strengthen patient voice when planning service improvement in individual general practices.
A template DCE was co-designed as part of a participatory action research study. Two general practices and their patient participation groups (PPGs) agreed to adapt the DCE template (Practice 1=P1 and Practice 2=P2). PPG members (P1=12, P2=5) and staff (P1=23, P2=14) met separately to shortlist, and then together to vote on, 24 pre-designed attributes (covering access, personalisation, continuity, co-ordination, equity, and quality of care). Observational notes were taken at each meeting. Two DCEs were produced, each in three formats: online, paper, and ballot box. All formats included an introduction, a number of choice tasks (five online, three paper, and one ballot box), demographic questions, a free text box, and an advertisement for the PPG. Each choice task presented five attributes, each with three or four levels. The online DCE was distributed to practice patients via text message; the paper and ballot box DCEs were distributed in the waiting room by the PPG. Data were analysed using mixed logit regression.
Initially, PPG members and staff prioritised different attributes. After discussion and voting there was clear consensus in both practices on the final five attributes to use. P1 attributes were listening skills; shared decision making; comprehensive services; receptionists; and complaints. In P1, 333 people completed the survey: 160 online (8.4% response rate (RR)); 115 paper (83.3% RR); 58 ballot box (RR unknown). The most valued attribute was being listened to. P2 attributes were clinician choice; information continuity; self-management support; social prescribing; and appointment length. In P2, 342 people completed the survey: 153 online (3.8% RR); 116 paper (85.2% RR); 74 ballot box (RR unknown). The most valued attributes were information continuity and appointment length. Free text responses included both preference and experience data. In both practices the most frequent comments concerned attributes not included in the DCE.
It is feasible for patients and staff to co-design and deliver a DCE contextualised to local service improvement priorities that is acceptable to most patients. Evidence from the free text comments suggests the choice tasks still lacked democratic validity. Further evidence is needed on whether preference data are more accurate or influential than experience data in producing patient-centred service improvement.