Australian Family Physician


Volume 43, Issue 10, October 2014

Online continuing medical education (CME) for GPs: does it work? A systematic review

Isaraporn Thepwongsa, Catherine N Kirby, Leon Piterman, Peter Schattner

Numerous studies have assessed the effectiveness of online continuing medical education (CME) designed to improve healthcare professionals’ care of patients. The effects of online educational interventions targeted at general practitioners (GPs), however, have not been systematically reviewed.
Online CME could improve GP satisfaction, knowledge and practice, but there are very few well-designed studies that focus on this method of delivering GP education.
Eleven studies met the eligibility criteria. Most studies (8/11, 72.7%) found a significant improvement in at least one of the following outcomes: satisfaction, knowledge or practice change. There was little evidence for the impact of online CME on patient outcomes. Variability in study design, online intervention characteristics and outcome measures limited the conclusions that could be drawn on the effects of online CME.

Online CME seems to be a growing area, attracting increasing resources, time and attention. Therefore, there is a professional and ethical obligation to ensure all CME interventions are evaluated for their quality, effectiveness and cost-effectiveness. Despite the evaluation of a wide range of CME interventions targeted at improving professional practice and patient outcomes,6 evidence of the benefit of online CME is limited. Online CME can be effective in imparting knowledge,4,5,7,8 but few studies have examined the effects of online CME on practice behaviour5,7,9 and patient outcomes.7 Furthermore, the effects of online CME targeted at GPs have not been systematically reviewed. The purpose of this review, therefore, is to assess the evidence in the literature for the effectiveness of online CME specifically targeting GPs.


Methods

Search strategies

The literature search was conducted using multiple electronic databases and supplemented by a manual search of references. The search terms included ‘general practitioners’, ‘continuing medical education’, and ‘web-based’ or ‘internet’. The following databases were searched from the earliest date of each database to 2013:

  • The Cochrane Library
  • ERIC
  • Scopus
  • Ovid MEDLINE
  • Informit Health Collection
  • Google Scholar.

This search was completed in September 2013.

Study selection

The first author (IT) screened the titles and abstracts of all retrieved articles. Two reviewers (IT, LP) screened the full texts of selected papers using the inclusion and exclusion criteria (Table 1). Disagreements were resolved by discussion.

Table 1. Inclusion and exclusion criteria

Inclusion criteria

Type of studies
  • Randomised controlled trials
  • Non-randomised controlled trials
  • Interrupted time series studies
  • Before–after studies assessing changes in healthcare professionals’ learning, satisfaction, behaviour and/or patient outcomes
Type of participants
  • GPs
  • Mixed participants where GPs were the majority
Type of CME interventions
  • Any online educational intervention that:
    - targeted practising GPs
    - aimed to produce measurable changes in GPs’ satisfaction, learning, process of care and/or patient outcomes
    - was defined explicitly
    - was conducted as a single delivery method (ie online only)
Our definition of ‘online educational intervention’ was based on a definition by Cook et al.7
Type of outcome measures
  • GP satisfaction, knowledge, behavioural changes, process of care, and clinical outcomes

Exclusion criteria

Articles were excluded if they:
  • were a review, pilot study, incomplete study, protocol study, conference abstract, editorial, commentary or letter
  • were a descriptive, case-report or qualitative study
  • were a non-English language publication
  • were published before 1990
  • did not include online education for GPs
  • did not evaluate an online educational activity
  • did not involve, or did not state clearly that they involved, GPs or family doctors
  • did not state the educational intervention clearly

Data extraction

Standardised forms were used for data extraction to minimise the risk of bias. Categories of information extracted are shown in Appendix 1 (available online only). Reviewers completed a study quality form for each article. Quality assessment was based on the criteria of Jadad et al.10 A score from 0 (low quality) to 5 (high quality) was assigned to each study.

Appendix 1. Studies comparing online interventions to no intervention or non-online interventions



Columns for each study: study (author, year, reference); setting/country; sampling (no.): a) participants, b) patients, c) practices; study design/educational topic/follow-up period/intervention exposure duration; online characteristics; interventions: i1) intervention 1, i2) intervention 2, c) control group; outcomes (+ = positive, 0 = no change)


Houwink et al 2013 11

General practice/
The Netherlands

a) 80 GPs
b) Not specified
c) Not specified

RCT, pre-post test, 2 groups/genetic cancer in primary care/6 months/4 weeks

Web module with didactic components, cases with feedback

i1) A web module
i2) NA
c) No intervention

Knowledge (+)
Satisfaction (+)
Self-reported applicability (+/0)


Pelayo et al 2011 12

Primary care/

a) 179 primary care practitioners
b) Not specified
c) Not specified

RCT, pre-post test, 2 groups/palliative care/no follow-up/96 hours

Web module with cases, exercise activities and facilitated discussion

i1) A web module for palliative care self-training
i2) No access to the web module, but participants could voluntarily take up the usual palliative care training offered in their working area (traditional training)
c) NA

Knowledge (+)(i1)
Attitude (+)(i1)
Satisfaction (+)


Fleet et al 2011 13

Not specified/

a) 457 (GPs, residents, nurse practitioners, registered nurses, medical students)
b) Not specified
c) Not specified

Pre-post test, 1 group/asthma/no follow-up/1 year

Web module with cases and discussion but was not facilitated

i1) An internet based course
i2) NA
c) NA

Knowledge (+)
Satisfaction (+)


Curran et al 2010 14

Not specified/

a) 153 licensed physicians
b) Not specified
c) Not specified

Pre-post test, 2 groups/emergency medicine (trauma cases)/no follow-up/3 weeks

Web module with facilitated discussion compared to unfacilitated discussion

i1) A scheduled group learning format involving asynchronous discussion with peers and a facilitator over a scheduled period of 3 weeks
i2) An eCME on Demand format; no schedule; unfacilitated discussion (discussion board was provided)
c) NA

Knowledge (+) (i1)
Confidence (+) (i1)
Satisfaction (+) in both groups, but (i1) was rated significantly higher on items related to learning needs and clarity of content



Results

A summary of the search results is presented in Figure 1. A total of 686 citations were found, of which only 11 studies11–21 met the inclusion criteria for this review (Appendix 1).

Figure 1. Overview of study selection

Study characteristics and evaluation methods

Only four studies focused solely on GPs or family practitioners.11,18,19,21 Seven studies included a mixed sample with a majority of GPs plus other healthcare professionals.12–17,20 The studies included six randomised controlled trials,11,12,17,18,20,21 one non-randomised controlled trial14 and four trials without control groups.13,15,16,19 A pre-post questionnaire was the most common method of measurement,11–19,21 followed by GP survey,11–17,19 patient medical record review,17,18 interview,15,21 a review of a third-party database20 and observational assessment of physician behaviour.18

Online CME characteristics

The characteristics of online CME, based on Sargeant et al’s grouping,22 included: content presentation only (eg text only, audio lectures with slides, text with multimedia materials),20 interaction with content (eg cases with questions, quizzes)11,15,19 and interpersonal interaction (eg online courseware, electronic mail, desktop videoconference).12–14,16–18,21 National clinical practice guidelines from local authoritative bodies were used in four studies,11,15,17,18 either as the sole basis for the intervention or as a component of an online intervention.

Study quality

Each of the studies had identifiable methodological limitations. Only half of the trials were randomised.11,12,17,18,20,21 The majority of these trials described their randomisation techniques adequately11,12,17,18,20 but only two had adequate concealment of allocation.11,12 Participants in a study of an educational intervention cannot be blinded to the interventions; the trials were therefore evaluated according to whether the researchers evaluating the outcomes were blinded to the intervention. One-quarter of the trials described a blinded evaluation process.11,17,18 Only one-third of the trials described the number of, and reasons for, participant withdrawals.11,12,17,21 Similar baseline measurements between intervention and control groups were reported in only four11,17,20,21 of seven studies.

On the basis of the quality scoring system described in the methods section, three studies achieved a score of 3;11,12,17 two studies achieved a score of 2;18,20 one study achieved a score of 1;21 and five studies achieved a score of 0.13–16,19

Outcome evaluation

Table 2 shows the effects of the interventions on measured outcomes, which are divided into four classifications: satisfaction, knowledge, practice and patient outcomes.

Table 2. Effects of interventions based on measured outcomes

  • Satisfaction with educational program, online delivery method and quality of the online technique: positive outcomes12–14,16,17
  • Improved attitudes toward management: positive outcomes12,14,16,19
  • Improved learning outcomes: positive outcomes11–14,16,19; no change17,21; mixed results*15,18
  • Self-reported practice/behavioural change: positive outcomes16,19; mixed results*11,15
  • Observed changes in practice and/or behaviour
  • Improving clinical or patient outcomes

No negative outcomes were reported for any of the desired outcomes.
*Mixed results (+/0) mean some dependent variables were positive and others showed no change
Numbers are references (see Reference list for details).


Satisfaction

GP satisfaction was measured in seven studies11–14,16,17,19 but one did not report the results.19 Participants in the remaining studies reported satisfaction with online learning techniques.12–14,16,17


Knowledge

Ten studies examined knowledge improvement following an online CME intervention.11–19,21 Although online CME typically improved GP knowledge, there was little evidence of greater learning via online methods than via other methods. Only one of four randomised controlled trials (RCTs) reported positive learning outcomes favouring online over traditional CME.12 A second RCT reported significant knowledge improvement in only one of two topics when compared with the control group.18 A third reported an increase in knowledge that was not significantly different from that of a workshop group,17 and the fourth reported no change in GP knowledge.21

Facilitated online interactions seem to influence GP learning. A non-randomised controlled study reported significantly greater knowledge gain in an online facilitated, asynchronous discussion group than in a non-facilitated discussion group.14 Finally, four studies without control groups also showed predominantly positive support for the learning outcomes of online CME: three reported significant knowledge gains13,16,19 and one reported significant knowledge gain in only one of three CME topics.15

Clinical practice

Three studies examining the impact of online CME on participant practice yielded mixed findings.17,18,20 One study reported improvements in guideline compliance regarding preventive health practices for perimenopausal patients but not for diabetes in older male patients.18 This study also assessed changes in physician behaviour using standardised patients and a 16-item diabetes checklist; however, there were no significant differences between the intervention and comparison groups.18 Another study reported no change in the percentage of patients who had appropriate guideline-driven lipid panel screening.17 A further study reported that the rate of chlamydia screening was significantly different in a multicomponent online group compared with a flat-text online group.20

Four studies also examined clinical practice improvements through participant self-reporting. Online CME was reported to improve participant confidence in clinical management;16,19 however, fewer than half of participants felt their practice had changed following the CME interventions.15 In another study, participants reported limited relevance of the CME to their daily practice.11

Patient outcomes

Only one RCT examined the impact of online CME on patient outcomes.17 This study reported a significant increase in the percentage of patients treated for dyslipidaemia by participants who undertook online CME (with optional live web conference), compared with those who completed a face-to-face CME.


Discussion

This review examined evidence for the effectiveness of online CME in improving GP satisfaction, knowledge, clinical practice and patient outcomes. Our review focused specifically on GP populations; however, two-thirds of the studies reviewed also included other healthcare professionals. Despite an increase in the utilisation of online CME,1–3 few studies have rigorously evaluated its impact on GP and patient outcomes.

Evidence also suggests that physicians still prefer traditional CME delivery methods.1,23 A recent survey of senior Australian doctors, of which more than half were GPs, showed that the traditional form of CME was more popular than online learning.23 Furthermore, CME preferences may also vary across individuals and topics.24 Thus, to promote adoption of online CME, education providers require a detailed understanding of GP learning needs and preferences in specific contexts.

This review focused on online techniques, but the interventions varied greatly in terms of instructional design and educational topics. It is difficult to draw sound conclusions, on the basis of the limited number of eligible studies included in this review, as to which instructional design of online CME is superior to other forms of GP education. Although the majority of studies included in this review used an interactive instructional design (discussion format), the effects on GP knowledge and clinical practice were inconsistent. Studies that trialled other online formats (interaction with content) also reported inconsistent changes in participant knowledge and practice.

An earlier review suggests superiority of multicomponent online CME over a flat-text format.25 Another systematic review with meta-analysis suggests that internet-based learning formats that include interactivity, practice exercises, repetition and feedback are associated with improved learning outcomes, whereas the evidence for other online instructional formats is inconclusive.26 These reviews, however, did not focus solely on GPs.

The majority of studies reviewed tested the immediate impact of online CME on knowledge change. Only one of those studies examined whether knowledge was translated into practice; however, the results were based on participants’ self-reporting. Half of the studies11,17,18,20 measuring changes in practice or patient outcomes provided 5-month to 1-year follow-up, which is arguably sufficient to measure intermediate change. The effects of online CME identified in this systematic review fit the four levels originally described by Kirkpatrick27 (reaction, learning, behaviour and results), or the forms modified for the medical education literature,5,28 namely satisfaction, learning, performance and patient/health outcomes.5 However, the findings from this systematic review showed that there was limited research evaluating effects at Kirkpatrick’s highest level, which refers to quality of healthcare or patient outcomes.29

In this review, the observed effects of online CME varied depending on the presence or absence of control groups. Findings from this review suggest that with a non-intervention control group or without a control group, the online intervention produced positive outcomes in satisfaction, knowledge or practices.11–13,15,16,18,19 No effect was reported when the online intervention was compared with a non-online-intervention comparison group.17 There was little evidence for the impact of online CME on patient outcomes. Similarly, a review conducted by Cook and colleagues7 indicated that the effectiveness of internet-based CME, on average, is equivalent to traditional formats in terms of changes in knowledge, skills and behaviour.

Study quality issues

Various evaluation methods were used to measure GP and patient outcomes, including rating, self-assessment questionnaire, direct observation by standardised patients, and performance audit. There are widely acknowledged limitations to each of these methods;30 thus, results must be interpreted with caution. In addition, there was limited use of validated tools in the reviewed studies. The lack of evidence for the validity and reliability of study evaluation methods limited the strength of the evidence for the effectiveness of online CME.31

There are several factors that limit the generalisability of findings from this review:

  1. differences in instructional methods of the online programs and in the complexity of desired outcomes
  2. the lack of established validity and reliability of many of the evaluation tools
  3. the lack of clear details about exposure duration
  4. study designs: although the majority of the studies were based on an RCT design, a quasi-experimental design and a non-randomised controlled trial were also included, which may have resulted in overestimation of observed effects
  5. self-selection of participants to the online programs, which may have produced bias
  6. study size: one study had a small sample size21 and another reported that <10 participants had participated in five out of 10 courses offered16
  7. high attrition, reported in three studies11–13
  8. restriction of this review to English-language articles, which may have excluded relevant research published in other languages.

Implications for general practice research

  1. The number of studies examining GP online education is limited; further research is warranted.
  2. Further research is needed into the specific characteristics of online CME that produce positive GP and patient outcomes.
  3. To draw clear conclusions about the effectiveness of any given educational intervention, reproducible, high-quality RCTs with adequate control groups are required.
  4. Exploratory qualitative research conducted alongside RCTs may also be valuable in understanding GPs’ learning needs, the barriers to completing online CME and how to make online CME work.
  5. To measure accurately the effects of online educational interventions on desired outcomes, educators and researchers are encouraged to use valid and reliable evaluation methods.

Competing interests: None.

Provenance and peer review: Not commissioned, externally peer reviewed.


This systematic review forms part of a PhD project, which is part of a research project entitled The effectiveness of continuing medical education and feedback in altering diabetes outcomes at a population level (546096), funded by the National Health and Medical Research Council.

  1. Harris JM, Sklar BM, Amend RW, Novalis-Marine C. The growth, characteristics, and future of online CME. J Contin Educ Health Prof 2010;30:3–10.
  2. Casebeer L, Brown J, Roepke N, et al. Evidence-based choices of physicians: a comparative analysis of physicians participating in Internet CME and non-participants. BMC Med Educ 2010;10:42.
  3. Hammoud M, Gruppen L, Erickson SS, et al. To the point: reviews in medical education online computer assisted instruction materials. Am J Obstet Gynecol 2006;194:1064–69.
  4. Wutoh R, Boren SA, Balas EA. eLearning: a review of Internet-based continuing medical education. J Contin Educ Health Prof 2004;24:20–30.
  5. Curran VR, Fleet L. A review of evaluation outcomes of web-based continuing medical education. Med Educ 2005;39:561–67.
  6. Marinopoulos S, Dorman T, Ratanawongsa N, et al. Effectiveness of continuing medical education. Evid Rep Technol Assess (Full Rep) 2007;149:1–69.
  7. Cook DA, Levinson AJ, Garside S, Dupras DM, Erwin PJ, Montori VM. Internet-based learning in the health professions: a meta-analysis. JAMA 2008;300:1181–96.
  8. Cobb SC. Internet continuing education for health care professionals: an integrative review. J Contin Educ Health Prof 2004;24:171–80.
  9. Short LM, Surprenant ZJ, Harris JM Jr. A community-based trial of an online intimate partner violence CME program. Am J Prev Med 2006;30:181–85.
  10. Jadad AR, Moore RA, Carroll D, et al. Assessing the quality of reports of randomized clinical trials: is blinding necessary? Control Clin Trials 1996;17:1–12.
  11. Houwink EJ, van Teeffelen SR, Muijtjens AM, et al. Sustained effects of online genetics education: a randomized controlled trial on oncogenetics. Eur J Hum Genet 2014;22:310–16.
  12. Pelayo M, Cebrian D, Areosa A, Agra Y, Izquierdo JV, Buendia F. Effects of online palliative care training on knowledge, attitude and satisfaction of primary care physicians. BMC Fam Pract 2011;12:37.
  13. Fleet LJ, Fox G, Kirby F, Whitton C, McIvor A. Evaluation outcomes resulting from an internet-based continuing professional development (CPD) asthma program: its impact on participants' knowledge and satisfaction. J Asthma 2011;48:400–04.
  14. Curran VR, Fleet LJ, Kirby F. A comparative evaluation of the effect of internet-based CME delivery format on satisfaction, knowledge and confidence. BMC Med Educ 2010;10:10.
  15. Robson J. Web-based learning strategies in combination with published guidelines to change practice of primary care professionals. Br J Gen Pract 2009;59:104–09.
  16. Curran V, Lockyer J, Sargeant J, Fleet L. Evaluation of learning outcomes in web-based continuing medical education. Acad Med 2006;81:S30–34.
  17. Fordis M, King JE, Ballantyne CM, et al. Comparison of the instructional efficacy of internet-based CME with live interactive CME workshops: a randomized controlled trial. JAMA 2005;294:1043–51.
  18. Stewart M, Marshall JN, Østbye T, et al. Effectiveness of case-based on-line learning of evidence-based practice guidelines. Fam Med 2005;37:131–38.
  19. Robinson L, Cruickshank N. Improving primary care nutrition skills. Asia Pac J Clin Nutr 2005;14:S92–96.
  20. Allison JJ, Kiefe CI, Wall T, et al. Multicomponent Internet continuing medical education to promote chlamydia screening. Am J Prev Med 2005;28:285–90.
  21. Chan DH, Leclair K, Kaczorowski J. Problem-based small-group learning via the internet among community family physicians: a randomized controlled trial. MD Computing 1999;16:54–58.
  22. Sargeant J, Curran V, Jarvis-Selinger S, et al. Interactive on-line continuing medical education: physicians' perceptions and experiences. J Contin Educ Health Prof 2004;24:227–36.
  23. Stewart GD, Khadra MH. The continuing medical education activities and attitudes of Australian doctors working in different clinical specialties and practice locations. Aust Health Rev 2009;33:47–56.
  24. Wong G, Greenhalgh T, Pawson R. Internet-based medical education: a realist review of what works, for whom and in what circumstances. BMC Med Educ 2010;10:12.
  25. Lam-Antoniades M, Ratnapalan S, Tait G. Electronic continuing education in the health professions: an update on evidence from RCTs. J Contin Educ Health Prof 2009;29:44–51.
  26. Cook DA, Levinson AJ, Garside S, Dupras DM, Erwin PJ, Montori VM. Instructional design variations in internet-based learning for health professions education: a systematic review and meta-analysis. Acad Med 2010;85:909–22.
  27. Kirkpatrick DL. Evaluating training programs: the four levels. San Francisco: Berrett-Koehler, 1994.
  28. Hutchinson L. Evaluating and researching the effectiveness of educational interventions. BMJ 1999;318:1267–69.
  29. Turner NM. Continuing medical education in pediatric anesthesia – a theoretical overview. Paediatr Anaesth 2008;18:697–701.
  30. Reed D, Price EG, Windish DM, Wright SM, et al. Challenges in systematic reviews of educational intervention studies. Ann Intern Med 2005;142:1080–89.
  31. Ratanawongsa N, Thomas PA, Marinopoulos SS, et al. The reported validity and reliability of methods for evaluating continuing medical education: a systematic review. Acad Med 2008;83:274–83.


© The Royal Australian College of General Practitioners www.racgp.org.au