
Needs-based novel digital curriculum for the neuromodulation training deficit: Pain Rounds
  1. Shravani Durbhakula1,
  2. Serkan Toy2,
  3. Carlos A Acosta3,4,
  4. Ross A Barman5,
  5. Andrew F Kelner6,
  6. Mohammad A Issa7,
  7. Mustafa Y Broachwala8,
  8. Bryan J Marascalchi1,
  9. Yeshvant A Navalgund9,
  10. Daniel J Pak10,
  11. Erika A Petersen11,
  12. Neel D Mehta10,
  13. Susan M Moeschler5 and
  14. Lynn R Kohan12
  1. Department of Anesthesiology and Critical Care Medicine, Johns Hopkins School of Medicine, Baltimore, Maryland, USA
  2. Departments of Basic Science Education and Health Systems & Implementation Science, Virginia Tech Carilion School of Medicine, Roanoke, Virginia, USA
  3. Bloomberg School of Public Health, Johns Hopkins University, Baltimore, Maryland, USA
  4. Carey Business School, Johns Hopkins University, Baltimore, Maryland, USA
  5. Department of Anesthesiology, Mayo Clinic, Rochester, Minnesota, USA
  6. Nevada Comprehensive Pain Center, Las Vegas, Nevada, USA
  7. Riverside Medical Center, Bourbonnais, Illinois, USA
  8. Department of Physical Medicine and Rehabilitation, Johns Hopkins School of Medicine, Baltimore, Maryland, USA
  9. National Spine and Pain Centers, Frederick, Maryland, USA
  10. Department of Anesthesiology, Weill Cornell Medicine, New York, New York, USA
  11. Department of Neurosurgery, University of Arkansas for Medical Sciences, Little Rock, Arkansas, USA
  12. Department of Anesthesiology, University of Virginia, Charlottesville, Virginia, USA
  Correspondence to Dr Shravani Durbhakula, Department of Anesthesiology & Critical Care Medicine, Johns Hopkins School of Medicine, Baltimore, MD 21287, USA; sdurbha3@jhmi.edu

Abstract

This study reports the needs-based development, effectiveness and feasibility of a novel, comprehensive spinal cord stimulation (SCS) digital curriculum designed for pain medicine trainees. The curriculum aims to address the documented systematic variability in SCS education and empower physicians with SCS expertise, which has been linked to utilization patterns and patient outcomes. Following a needs assessment, the authors developed a three-part SCS e-learning video curriculum with baseline and postcourse knowledge tests. Best practices were used for educational video production and test-question development. The study period was from 1 February 2020 to 31 December 2020. A total of 202 US-based pain fellows across two cohorts (early-fellowship and late-fellowship) completed the baseline knowledge assessment, while 122, 96 and 88 participants completed all available post-tests for Part I (Fundamentals), Part II (Cadaver Lab) and Part III (Decision Making, The Literature and Critical Applications), respectively. Both cohorts significantly increased knowledge scores from baseline to immediate post-test in all curriculum parts (p<0.001). The early-fellowship cohort experienced a higher rate of knowledge gain for Parts I and II (p=0.045 and p=0.027, respectively). On average, participants viewed 6.4 out of 9.6 hours (67%) of video content. Self-reported prior SCS experience had low to moderate positive correlations with Part I and Part III pretest scores (r=0.25, p=0.006; r=0.37, p<0.001, respectively). Initial evidence suggests that Pain Rounds provides an innovative and effective solution to the SCS curriculum deficit. A future controlled study should examine this digital curriculum’s long-term impact on SCS practice and treatment outcomes.

  • pain management
  • chronic pain
  • education
  • spinal cord stimulation

Data availability statement

All data relevant to the study are included in the article or uploaded as supplementary information.


This is an open access article distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited, an indication of whether changes were made, and the use is non-commercial. See: http://creativecommons.org/licenses/by-nc/4.0/.


Introduction

Chronic pain cuts across almost every medical specialty, affecting approximately 100 million adults in the USA1 2 and imposing an annual economic burden of between $560 and $635 billion,1 2 including the costs of medical expenditures and lost productivity. Targeted pain treatments are critical for improving care quality, alleviating suffering, and reducing this annual sum. Existing evidence indicates that spinal cord stimulation (SCS), the most widespread neuromodulation treatment, is therapeutically beneficial and cost-effective for chronic pain.3

SCS is a surgical procedure requiring appropriate patient selection, technical expertise, and counseling regarding postoperative care. The incidence of complications is 30%–40%, and events range in severity from lead migration to epidural abscess.4 Physician expertise is a prognostic factor associated with optimal SCS outcomes.5

Pain fellowships provide varying experiences and exposure to neuromodulation over a 1-year program.6 Practitioners decline to use neuromodulation devices, even when appropriate, because they lack exposure and training.7 8 Recent graduating pain fellows in the USA reported deficits in SCS education, identifying poor SCS case volume (38.5%), lack of SCS curriculum (30.8%), and lack of faculty with SCS expertise (23.1%) as barriers to their projected future use of SCS.6

Currently, pain fellows turn to industry for SCS didactic courses and cadaver labs. In fact, 77.5% of fellows reported participation in industry-sponsored workshops, and half reported attendance at three or more.6 These didactics usefully supplement academic training and highlight the nuances of competing devices, but they are inadequate on their own because of inherent bias. In addition, they require fellows to sacrifice off-duty time and to travel for education.

A robust, non-promotional, and standardized SCS curriculum for widespread implementation can reduce systematic variability in pain education and improve patient care. The authors created Pain Rounds, a novel, module-based SCS e-learning video curriculum, and investigated its effectiveness and feasibility through baseline and immediate post-test knowledge quizzes and learner engagement metrics. Pain Rounds has been deliberately designed to make learning enjoyable and ‘sticky’ through conversational dialogue between experts paired with step-by-step cadaver lab demonstrations, problem-based learning and interactive assessments. Its online nature caters to the current generation of e-learners and makes it scalable for national and international use.

This paper reports on the needs assessment that informed content development, the curriculum’s effectiveness for knowledge gain, the optimal timing for curriculum administration during the fellowship, and its feasibility through learner content utilization and engagement with the curriculum.

Methods

The Johns Hopkins School of Medicine’s institutional review board (IRB file number: IRB00203844) approved this study.

Needs assessment

Qualtrics software (http://www.qualtrics.com; Qualtrics, Provo, Utah) was used to create a web-based survey that asked participants to rate 24 potential neuromodulation topics for inclusion in the curriculum on a Likert scale of 1–5 (1=not at all important to include; 5=extremely important to include). This survey was administered to program directors (PDs) of Accreditation Council for Graduate Medical Education (ACGME)-accredited fellowships in person at the Association of Pain Program Directors (APPD) Spring 2019 meeting and electronically through email requests. In total, 72 of 106 ACGME-accredited PDs responded. Additionally, an email with a link to the needs assessment was sent to 216 graduating fellows requesting their participation. After two reminders, this generated 74 responses representing 39 institutions. These responses were analyzed using descriptive statistics (frequencies and percentages) to help prioritize curriculum content.
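For illustration, this kind of descriptive prioritization can be reproduced with a few lines of analysis code. The sketch below is ours, not the authors': it assumes a hypothetical long-format table with 'topic' and 'rating' columns; the actual analysis may have been performed in any statistics package.

    import pandas as pd

    # Hypothetical long-format needs-assessment responses: one row per
    # respondent-topic pair, 'rating' on the 1-5 Likert scale described above.
    responses = pd.DataFrame({
        "topic":  ["Patient selection", "Patient selection", "Lead placement", "Lead placement"],
        "rating": [5, 4, 5, 3],
    })

    # Frequency and percentage of each rating level per topic.
    counts = responses.groupby("topic")["rating"].value_counts().unstack(fill_value=0)
    percentages = counts.div(counts.sum(axis=1), axis=0) * 100

    # Share of respondents rating a topic 4 ('very important') or 5 ('extremely
    # important'), the kind of threshold used to select topics for the curriculum.
    top_two_box = (responses["rating"] >= 4).groupby(responses["topic"]).mean() * 100
    print(percentages.round(1))
    print(top_two_box.round(1))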

We subsequently conducted focus groups of graduating fellows (five groups with five fellows per group) in the fall of 2019 at a national meeting. Standardized, scripted, peer-reviewed questions were used to gain a deeper understanding of the fellows’ perspectives on current training deficits and preferences for information delivery. The focus group discussions were audiorecorded, deidentified, and transcribed. Analysis of the written transcripts used a hybrid thematic analysis, combining principles of deductive and inductive thematic analysis.9 The structured focus group questions served as the initial coding scheme, and framework extension accounted for more specific or unexpected responses.

Course outline

More than 75% of both PDs and fellows rated 17 of the 24 proposed topics as ‘extremely important’ or ‘very important’ (see online supplemental figures 1 and 2). All 17 of these topics were included in the final curriculum. Topics rated ‘extremely important’ by >90% of both PDs and fellows (see table 1) were given more emphasis in the curriculum: they received more content (more videos devoted to them, or repeated coverage across the curriculum) and more knowledge questions. The focus groups revealed considerable variability in confidence in surgical skills, concern about differentiating SCS products and companies, and a preference for short (20–25 min) digital materials. These insights further informed content development and delivery.


Table 1

Topics that over 90% of fellows and PDs thought were ‘extremely important’

Recruitment of curriculum participants

We presented our concept to PDs and invited feedback at three APPD meetings (November 2018, March 2019 and November 2020). We also used social media to raise awareness. At the November 2020 meeting, PDs were allowed to enrol fellows into the free pilot curriculum. In addition, pain fellows were allowed to enrol on the Pain Rounds website through a questionnaire requesting information such as National Provider Identifier (NPI) number, number of SCS cases they had been exposed to, and demographic information. Questionnaires were manually screened to ensure current enrolment in an accredited program. The pilot study was conducted on two groups: (1) ACGME-accredited fellows in the last half of training, from February 2020 to June 2020 (late cohort), and (2) ACGME-accredited fellows in the first half of training, from July 2020 to December 2020 (early cohort).

Video content and knowledge test development

The Pain Rounds curriculum includes 28 video episodes averaging 21 min in length. Videos consist of interviews, graphics and animations, step-by-step cadaver lab tutorials, and games that test decision-making. Each video was reviewed by three experts in SCS and two graduating pain fellows for content accuracy and relevance. Feedback was integrated iteratively, using a modified Delphi method, until 90% consensus was reached on each video’s suitability for the curriculum.

Because an unrestricted industry educational grant funded this project, the same individuals (three content experts and two graduating pain fellows) also reviewed the content to ensure it was free of bias and contained no labels, marketing, or promotional material. Unrestricted grants carry no stipulations on how the funds are used other than that they support an educational endeavor. The videos, animations, and all content were created by the Principal Investigator (PI) and an independent media team and are copyrighted and owned by the PI and Johns Hopkins University. The American Academy of Pain Medicine has agreed to license this content from Johns Hopkins to provide to its members.

The videos are organized into three main parts: (1) Fundamentals, (2) Cadaver Lab, and (3) Decision-Making, The Literature, and Critical Application. These parts are further segmented into modules with topics derived from the needs assessment.

An expert panel of five established neuromodulators and two graduating fellows reviewed an initial set of 81 multiple-choice questions (MCQs) compiled for the pretest and post-test. For each proposed question–answer pair, experts commented on its accuracy, its importance for an SCS curriculum, and overall question quality. Pain Rounds authors revised the questions based on this feedback. The questions were then sent to the Johns Hopkins Office of Assessment and Evaluation for input on question structure and wording (eg, leading language) and revised accordingly.

Each main part of the curriculum has its own required pretest. Part I (‘Fundamentals’) has 27 associated MCQs; Part II (‘Cadaver Lab’) has 32 MCQs; and Part III (‘Decision Making, The Literature, and Critical Application’) has 18 MCQs. Once users complete this baseline knowledge assessment (77 MCQs across the three parts), they must progress linearly through the modules. While learners answer all pretest questions before beginning the curriculum, at the end of each module they complete a post-test comprising only the 3–5 pretest questions corresponding to that module. Using the same questions for the pretest and post-test keeps the psychometric properties constant for comparison purposes.

Statistical considerations

Each of the three main curriculum parts had a unique number of associated knowledge test items. Knowledge scores per part were calculated as percentages by dividing participants’ total number of correct answers by the total number of items per test. These percentage scores were used in the analyses and reporting to allow for comparisons.
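As a minimal illustration of this scoring (the function and variable names below are ours, not the study's):

    # Minimal sketch of the per-part percentage scoring described above.
    ITEMS_PER_PART = {"Part I": 27, "Part II": 32, "Part III": 18}

    def percentage_score(n_correct: int, part: str) -> float:
        """Convert a raw count of correct answers into a percentage score for one part."""
        return 100 * n_correct / ITEMS_PER_PART[part]

    # Example: 24 of 27 items correct on the Part I test gives 88.9%.
    print(round(percentage_score(24, "Part I"), 1))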

Pearson’s χ2 test or Fisher’s exact test (where appropriate) was used to compare categorical variables. A bivariate Pearson correlation was used to examine the relationship between the reported number of cases involved (prior experience) and knowledge test scores (pretest/post-test for Parts I, II, and III). As explained in the ‘Recruitment of curriculum participants’ section above, the first cohort enrolled in the curriculum in February (second half of fellowship), while the second cohort began the curriculum in July (beginning of fellowship). Therefore, we used a mixed-design analysis of variance (ANOVA) with the repeated measures (pretest and post-test scores) as the within-subject factor and fellowship cohort as the between-subject factor to determine the effect of using the Pain Rounds curriculum early or late in the fellowship program.
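The analyses were run in SPSS; as an illustrative equivalent only, the same mixed-design ANOVA could be specified in Python with the pingouin package. The data file and column names below are assumptions for the sake of the sketch.

    import pandas as pd
    import pingouin as pg

    # Hypothetical long-format data: one row per fellow per time point, with
    # columns id, cohort (early/late), time (pretest/post-test) and score (%).
    df = pd.read_csv("part1_scores_long.csv")  # assumed file name and layout

    # Mixed-design ANOVA: 'time' as the within-subject (repeated) factor and
    # 'cohort' as the between-subject factor, mirroring the SPSS model above.
    aov = pg.mixed_anova(data=df, dv="score", within="time",
                         between="cohort", subject="id")
    print(aov[["Source", "DF1", "DF2", "F", "p-unc", "np2"]])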

A power analysis indicated that a total of 78 participants (who completed both pretest and post-test) were needed to provide at least 85% power to detect an effect size of 0.30 for the between-factors main effect of the repeated measures, mixed-design ANOVA, with an α level of 0.05.
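For readers who want to reproduce this type of calculation, the sketch below approximates a G*Power-style power analysis for the between-factors effect in a repeated-measures design. The software, number of repeated measurements, and correlation among repeated measures used by the authors are not reported, so the values assumed here (two measurements, correlation 0.5) are ours, and the resulting sample size will only approximate the published figure of 78.

    from scipy import stats
    from scipy.optimize import brentq

    # Assumed inputs: effect size f = 0.30, alpha = 0.05, power = 0.85, 2 cohorts,
    # 2 repeated measurements, correlation 0.5 among repeated measures (assumption).
    f, alpha, target_power, k, m, rho = 0.30, 0.05, 0.85, 2, 2, 0.5

    def power_between(n_total: float) -> float:
        """Power of the between-groups main effect in a repeated-measures ANOVA."""
        lam = n_total * f**2 * m / (1 + (m - 1) * rho)  # noncentrality parameter
        df1, df2 = k - 1, n_total - k
        f_crit = stats.f.ppf(1 - alpha, df1, df2)
        return 1 - stats.ncf.cdf(f_crit, df1, df2, lam)

    # Solve for the smallest total sample size reaching the target power.
    n_total = brentq(lambda n: power_between(n) - target_power, 10, 500)
    print(round(n_total))  # roughly mid-70s to 80, in the vicinity of the reported 78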

Bonferroni correction was used in all correlations and pairwise comparisons. All statistical analyses were conducted with Statistical Package for the Social Sciences (IBM SPSS Statistics for Mac, V.25.0; IBM Corp, Armonk, New York), with significance level set at p<0.05.

Handling missing data

Some trainees took the pretest but not the post-test, which resulted in missing data. Examination of the dataset indicated that completion (pretest and post-test) was highest for Part I and declined slightly for subsequent curriculum parts. To make the best use of available data and allow repeated measures tests, we employed pairwise deletion, including only participants with complete data for each part of the curriculum. Sample sizes therefore differ across the parts of the curriculum.
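A minimal sketch of this pairwise-deletion step, assuming a hypothetical wide-format table with one row per fellow and pretest/post-test score columns for each part:

    import pandas as pd

    # Hypothetical wide-format table: one row per fellow, NaN where a test
    # was not completed (file name and column names are assumptions).
    scores = pd.read_csv("knowledge_scores_wide.csv")

    # Pairwise deletion: for each curriculum part, keep only fellows with both
    # the pretest and the post-test for that part, so sample sizes differ by part
    # (the study reports 122, 96 and 88 for Parts I, II and III).
    complete_by_part = {
        part: scores.dropna(subset=[f"{part}_pretest", f"{part}_posttest"])
        for part in ("part1", "part2", "part3")
    }
    print({part: len(df) for part, df in complete_by_part.items()})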

Results

A total of 202 trainees across two cohorts (130 late-fellowship and 72 early-fellowship) registered for this free course and took the baseline test, but not all completed the course (defined as completing all associated post-tests). Thirty-eight fellows who took the baseline test did not complete any post-test, leaving 164 fellows (103 late-fellowship and 61 early-fellowship) with at least one completed post-test. Parts I, II, and III were completed by 122, 96, and 88 fellows, respectively. Completion rates in the early and late cohorts were similar for Parts I and II; however, the late cohort had a significantly higher completion rate for the last part of the curriculum, Part III (p=0.012; see table 2). Late-cohort trainees reported involvement with a significantly greater number of cases (p<0.001). However, the two cohorts did not differ significantly on pretest or post-test scores. See table 2 for curriculum completion rates, self-reported number of cases, and means and SDs of pretest and post-test scores by cohort.

Table 2

Part-wise curriculum completion rate (‘completed’ defined as completion of all post-tests within a part), self-reported number of cases involved, and mean and SD of pretest and post-test scores by cohort

There was a positive, low correlation between prior experience, measured by the number of cases a fellow had performed before using Pain Rounds, and Part I pretest scores (r=0.25, p=0.006), and a positive, moderate correlation between prior experience and Part III pretest scores (r=0.37, p<0.001). No other significant correlations were noted between prior experience and knowledge test scores (see table 3 for correlations).

Table 3

Correlations between the number of cases involved and knowledge test scores

Knowledge change

For all mixed ANOVA tests, the assumptions assessed by Levene’s test and Box’s M test were met.

There was a significant interaction between time (repeated measures) and cohort for Part I: F(1,120) = 4.11, p=0.045, partial η2=0.03; and for Part II: F(1,94) = 5.06, p=0.027, partial η2=0.05. This interaction indicated that the magnitude of the pretest-to-post-test increase in knowledge scores differed between the two cohorts, with the early cohort showing a higher rate of knowledge gain. Simple main effects analysis with Bonferroni correction for both Part I and Part II showed that fellows in the two cohorts did not differ significantly at either pretest or post-test, but both cohorts increased their test scores significantly from baseline to postcourse (p<0.001). Figure 1 shows the profile plots comparing the average knowledge test scores for (A) Part I, (B) Part II, and (C) Part III.

Figure 1

The profile plots comparing the average knowledge test scores for (A) Part I, (B) Part II, and (C) Part III. Error bars indicate 95% CIs.

There was no significant interaction between time (repeated measures) and cohort for Part III: F(1,86) = 3.61, p=0.061, partial η2=0.04, indicating that the magnitude of the knowledge score increase from baseline to post-test did not differ significantly between the cohorts for Part III. There was a significant main effect of time (F(1,86) = 190.31, p<0.001, partial η2=0.69): overall, fellows scored significantly higher on the post-test than on the pretest. There was no significant main effect of cohort (F(1,86) = 0.48, p=0.489, partial η2=0.006).
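For reference, the reported partial η2 values are consistent with the standard identity relating partial η2 to the F statistic and its degrees of freedom:

    \[
    \eta_p^2 = \frac{F \cdot df_1}{F \cdot df_1 + df_2},
    \qquad \text{e.g. } \frac{4.11 \times 1}{4.11 \times 1 + 120} \approx 0.03 \ \text{(Part I interaction)},
    \qquad \frac{190.31 \times 1}{190.31 \times 1 + 86} \approx 0.69 \ \text{(Part III time effect)}.
    \]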

Engagement and course completion

Engagement was measured through Vimeo statistics and the overall post-test completion rate.

Vimeo statistics

The average time per view was 16 min and 1 s. This measures the amount of time a video was watched uninterrupted and is used as a proxy for engagement in studies of online courses. On average, participants watched 6.43 hours out of 9.6 hours (67%) of the total course content.

Overall post-test completion rate

The curriculum comprises 23 post-tests in total, so the total possible number of completed post-tests was 164 fellows × 23 post-tests = 3772. A total of 2411 post-tests were completed, giving an overall post-test completion rate of 64% (2411/3772). When the curriculum is separated by parts as in table 2, the post-test completion rates for Parts I, II, and III were 74%, 58.5%, and 54%, respectively.

Discussion

The US Department of Health and Human Services (HHS) recently highlighted the inadequacy of training and education of pain practitioners in the use of interventional procedures as a potential contributor to complications and inappropriate utilization.10 The HHS report placed SCS in the most complex category of interventional pain therapies.10 Despite the case complexity and severity of complications, pain fellowships show substantial variability in equipping future practitioners with proficiency in these techniques.6

Our needs assessment engaged the key stakeholders, pain PDs and fellows, to define the education gap more granularly. With this information, we created Pain Rounds, which seeks to diminish the variability in pain fellowship SCS training by providing a standardized self-learning digital tool to supplement in-person, hands-on training.

Knowledge

ACGME-accredited pain fellows had significant, large increases in knowledge scores from pretest to post-test for every module in Parts I, II, and III, across both cohorts. This suggests that the Pain Rounds curriculum is an effective tool for SCS education throughout the fellowship year.

Systematic reviews of Continuing Medical Education (CME) programs suggest that education is most beneficial for knowledge application and psychomotor skills when the following are combined: multimedia (eg, videos, podcasts, animations), multiple instructional techniques (eg, case simulations, patient interactions, lectures, games), and multiple exposures.11 The Pain Rounds curriculum strategically integrated these learning modalities and instructional tools. It also facilitated blended learning (synchronous combined with asynchronous learning), as the ACGME fellows enrolled in Pain Rounds were simultaneously receiving live, hands-on clinical education through their training programs. Blended learning is associated with better effects on knowledge levels than traditional learning in health education.12

SCS is a surgical procedure. A systematic review has indicated that video-enhanced surgical education, compared with non-video training, is associated with improved knowledge and operative performance and greater learner satisfaction.13 Despite the availability of a high volume of surgical training videos on YouTube and their popularity among surgical trainees, a systematic review has concluded that these videos vary widely in their accuracy, quality, comprehensiveness, and utility for trainees, and they lack screening mechanisms such as peer review.14 In contrast, Pain Rounds content was rigorously reviewed by experts and fellows, revised until consensus was achieved through a modified Delphi method, and systematically organized to create a comprehensive, evidence-based, and needs-based curriculum.

Participants, on average, watched Pain Rounds for 16 min and 1 s uninterrupted before pausing or closing a video, which is greater than observed on other Massive Open Online Course (MOOC) platforms such as EdX, where this time is 6 min.15 A notable problem with other MOOCs has been the completion rate. An MIT study of the EdX platform that examined 12.6 million course registrations in over 200 MOOCs found completion rates ranging from 2% to 10%.9 In contrast, Pain Rounds’ overall post-test completion rate (across all three parts) was 64%, and participants watched, on average, 67% of the total course content.

We suspect that adherence to our curriculum is higher because we adopted published best practices for educational video production.15 Pain Rounds integrated key factors shown to improve viewer engagement, including speaker enthusiasm, a personal feel, graphics interspersed with instructor speaking footage, a format other than classroom lectures, interactive tutorials, and shorter video length.15 In addition, Pain Rounds tailored its content to its target population’s self-reported needs. Early stakeholder engagement likely contributed to the high engagement observed.

Timing

The early cohort started with lower pretest scores than the late cohort but experienced a greater learning rate and effectively offset its baseline knowledge deficit. Using the Pain Rounds curriculum early in fellowship is therefore suggested, as the information learned can be applied in cases throughout the remainder of the fellowship and referenced as needed. Of note, however, compared with the late cohort, the early cohort had a lower rate of post-test completion for the last part of the curriculum, Part III (Decision Making, The Literature, and Critical Applications). This may indicate the importance of aligning curricular topics with trainees’ clinical experience; engagement may increase when trainees find an educational resource timely and relevant to their practice. While topics pertaining to foundational knowledge and technical skills are more relevant earlier in the fellowship, more complex and nuanced information may be better appreciated towards the end of training.

A significant positive correlation existed between fellows’ prior SCS experience and Parts I and III pretest scores. Curriculum Parts I and III cover foundational information, decision-making considerations, and the SCS literature. This is consistent with our expectation that medical knowledge of these topics should increase with greater clinical SCS exposure. However, the correlation was not significant for Part II (Cadaver Lab). This may reflect variability in the operative technique of SCS instructors at different programs. Regardless of SCS exposure level, the curriculum allowed fellows to enhance their SCS knowledge.

Advantages of a video format

Videos stimulate curiosity and speak to the current generation of digital learners, who frequently engage with online resources.16 Videos seem to capture attention better than textbooks17 and are as effective as live lectures in medical education.18 They also provide learning advantages that are valuable for understanding complex information: (1) they allow learners to go at their own speed, stopping, rewinding, speeding up and replaying sections as necessary;19 (2) they give learners access to experts outside of their own institution, which helps address inconsistencies in faculty expertise across programs; (3) they scale easily and can be adopted without geographic or physical restraints; and (4) they allow procedures to be presented in a predictable way and control for individual differences between instructors.19

Limitations

The late cohort engaged in Pain Rounds during the COVID-19 peak. These fellows may not represent a typical, non-pandemic cohort. They likely had less SCS exposure due to elective surgery cancellations, and more interest in the curriculum due to reduced case volumes. Furthermore, these fellows’ time and mental capacity to engage with the curriculum were likely impacted by redeployments to intensive care units and pandemic-induced stressors. This limitation may be a reason for the difference in size between the late and early cohorts.

Another limitation is that this curriculum does not reach the highest levels of Miller’s Pyramid for assessing clinical competency.20 While it effectively tests knowledge (tier 1) and the clinical problem-solving games help assess knowledge application (tier 2), there are no integrated standardized patient assessments, practical exams, simulations (tier 3), or live patients (tier 4).

A third limitation was the curriculum’s structure, which required sequential advancement through the parts. Part I contained the densest scientific content; this may have contributed to participant attrition and the reduced completion rates in Parts II and III. The structure has since been revised and now allows non-linear navigation.

The final and most significant limitation is that the current pre/post study design measures the knowledge gain immediately after the Pain Rounds intervention. A future study should examine this digital curriculum’s long-term impact on knowledge gain and clinical outcomes.

Future directions

The development of an immersive virtual reality supplement to Pain Rounds, where users can simulate SCS, will allow the curriculum to reach the third tier of Miller’s pyramid (clinical competency demonstration)20 and create opportunities for use in credentialing and standardized testing. An immersive virtual reality supplement also opens up the opportunity to compare various combinations of the videos, the virtual reality supplement, and traditional fellowship education in a longer-term controlled study. The evaluation of long-term knowledge retention, changes in clinical practice, and impact on clinical outcomes will be important for validating the role of the curriculum in closing the educational gap.

Additionally, updating Pain Rounds at regular intervals will allow its content to remain relevant. Funds for updating can come from platform monetization by Johns Hopkins University. Partnerships with professional medical societies can also provide funds while enabling broader dissemination. Finally, the methods used in the development and implementation of Pain Rounds can be applied to other areas of medicine to address significant educational gaps.


Ethics statements

Patient consent for publication

Acknowledgments

We have obtained permission to acknowledge the following physicians for their contributions to the Pain Rounds project. Steven P Cohen, MD, Johns Hopkins School of Medicine, Baltimore, MD. Timothy R Deer, MD, The Spine and Nerve Centers of the Virginias, Charleston, WV. Maged N Guirguis, MD, Ochsner Clinical Foundation, New Orleans, LA. Robert M Levy, MD, PhD, Anesthesia Pain Care Consultants, Tamarac, FL. Sean Li, MD, National Spine and Pain Centers, Frederick, MD. Srinivasa N Raja, MD, Johns Hopkins School of Medicine, Baltimore, MD. Peter S Staats, MD, MBA, National Spine and Pain Centers, Frederick, MD. Kayode A Williams, MD, MBA, Johns Hopkins School of Medicine, Baltimore, MD.

References

Supplementary materials

  • Supplementary Data

    This web only file has been produced by the BMJ Publishing Group from an electronic file supplied by the author(s) and has not been edited for content.

Footnotes

  • SD and ST are joint first authors.

  • Twitter @ShravaniD_MD, @rbarman145, @danieljpak, @SMoeschlerMD, @kohanlynn

  • Funding An unrestricted educational grant to Johns Hopkins University (Principal Investigator: Shravani Durbhakula) from Nevro Corp provided funding for Pain Rounds video production. Shravani Durbhakula’s effort for manuscript preparation and writing was funded by National Institutes of Health 5T32GM075774-17.

  • Competing interests SD received an unrestricted educational grant for the Pain Rounds project from Nevro Corp, and has no other conflicts of interest that are relevant to this manuscript. ST, CAA, RAB, AFK, MAI, MYB, BM, and LK do not have any conflicts of interest to disclose that are relevant to this manuscript. YAN is a consultant for Abbott, Nevro Corp, and Medtronic. DJP is a consultant for Nevro Corp. and Vertos and receives research support from Boston Scientific. EP has received research support from Mainstay, Medtronic, Neuros Medical, Nevro Corp, ReNeuron, SPR, and Saluda, as well as personal fees from Abbott Neuromodulation, Biotronik, Medtronic Neuromodulation, Nalu, Neuros Medical, Nevro, Presidio Medical, Saluda, and Vertos. She holds stock options from SynerFuse and neuro42. NM receives research funding from Nevro and Boston Scientific and is a consultant for Nevro and Boston Scientific. SM receives research support from Abbott.

  • Provenance and peer review Not commissioned; externally peer reviewed.

  • Supplemental material This content has been supplied by the author(s). It has not been vetted by BMJ Publishing Group Limited (BMJ) and may not have been peer-reviewed. Any opinions or recommendations discussed are solely those of the author(s) and are not endorsed by BMJ. BMJ disclaims all liability and responsibility arising from any reliance placed on the content. Where the content includes any translated material, BMJ does not warrant the accuracy and reliability of the translations (including but not limited to local regulations, clinical guidelines, terminology, drug names and drug dosages), and is not responsible for any error and/or omissions arising from translation and adaptation or otherwise.