Impact of Working in a Pharmacy during Pharmacy School on Licensure Exam Scores
By: Michael Peabody and Maureen Garrity
Students spend approximately 300 hours in Introductory Pharmacy Practice Experiences (IPPEs) and at least 1,440 hours in Advanced Pharmacy Practice Experiences (APPEs). Most states allow these Pharmacy Practice Experiences to fulfill the internship hours required for licensure, so students no longer need outside pharmacy employment to meet that requirement. Working while in pharmacy school can be an important co-curricular benefit that supports the educational process; removing the need for outside internship hours may therefore have had unintended consequences. This study sought to examine the impact of student employment in a pharmacy during the academic year on licensure examination scores.
We surveyed all recent pharmacy school graduates who sat for the National Association of Boards of Pharmacy's (NABP) North American Pharmacist Licensure Examination (NAPLEX) and Multistate Pharmacy Jurisprudence Examination (MPJE) in 2019, 2020, and 2021 and asked the following question:
How many hours per week (on average) were you employed in a pharmacy setting during the past year?
- I did not work
- I worked, but not in a pharmacy setting
- Less than 5 hours
- 5-9 hours
- 10-20 hours
- More than 20 hours
Survey results were then merged with each school's demographic classifications from the Accreditation Council for Pharmacy Education (ACPE) website.
An ordinary least squares (OLS) regression was conducted to estimate the effect of the number of hours spent working in a pharmacy during school on licensure exam scores while controlling for graduation year, school demographics, and the mean ability of each school's graduates. Controlling for mean ability approximates a multilevel model by accounting for the nested, rather than randomly distributed, structure of students within schools.
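The analysis described above can be sketched in Python with statsmodels. This is an illustrative reconstruction, not the authors' code: the column names, category labels, and synthetic data are all hypothetical, and the school-level mean score is used as a simple stand-in for the "mean ability of the school's graduates" covariate.

```python
# Hypothetical sketch of the OLS model described in the text.
# Work-hours category and graduation year are dummy-coded with
# "did not work" and 2019 as the reference levels; a school-level
# mean score approximates the mean-ability control for nesting.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "hours": rng.choice(
        ["none", "non_pharmacy", "lt5", "5to9", "10to20", "gt20"], size=n),
    "grad_year": rng.choice([2019, 2020, 2021], size=n),
    "school": rng.choice([f"S{i}" for i in range(20)], size=n),
    "naplex": rng.normal(100.0, 10.0, size=n),
})

# Mean score of each school's graduates, attached to every student
# in that school (a rough proxy for the school's mean ability).
df["school_mean"] = df.groupby("school")["naplex"].transform("mean")

model = smf.ols(
    "naplex ~ C(hours, Treatment(reference='none'))"
    " + C(grad_year, Treatment(reference=2019)) + school_mean",
    data=df,
).fit()
print(model.summary())
```

With the real data, the coefficients on the `hours` dummies would correspond to the β values reported below, each interpreted as the scaled-score difference relative to graduates who did not work.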
The regression results indicated that working, but not in a pharmacy, was statistically significantly associated with a lower NAPLEX score compared to not working (p<0.001, β=-2.7, CI[-3.8,-1.6]). The following responses were statistically significantly associated with higher NAPLEX scores compared to not working: less than 5 hours per week (p<0.001, β=2.6, CI[2.0,3.3]); 5-9 hours per week (p<0.001, β=3.9, CI[3.4,4.4]); and 10-20 hours per week (p<0.001, β=2.9, CI[2.5,3.4]). Working more than 20 hours per week was not associated with a statistically significantly different NAPLEX score compared to not working (p=0.67, β=-0.12, CI[-0.69,0.44]). Furthermore, graduates from 2021 had lower scores compared to 2019 graduates (p<0.001, β=-3.6, CI[-4.0,-3.2]). The adjusted R-squared value for this model was 0.095, suggesting that these variables explain approximately 9.5% of the variance in NAPLEX scores.
For the MPJE, working, but not in a pharmacy, was not statistically significantly different from not working (p=0.011, β=-0.37, CI[-0.65,-0.09]). The following responses were statistically significantly associated with higher MPJE scores compared to not working: less than 5 hours per week (p<0.001, β=1.1, CI[0.91,1.2]); 5-9 hours per week (p<0.001, β=1.1, CI[1.0,1.2]); 10-20 hours per week (p<0.001, β=1.1, CI[0.94,1.2]); and more than 20 hours per week (p<0.001, β=0.49, CI[0.35,0.64]). Furthermore, compared to 2019 graduates, graduates from 2020 had higher scores (p<0.001, β=0.25, CI[0.15,0.35]) and graduates from 2021 had lower scores (p<0.001, β=-0.43, CI[-0.53,-0.33]). The adjusted R-squared value for this model was 0.114, suggesting that these variables explain approximately 11.4% of the variance in MPJE scores.
These results suggest that, for both the NAPLEX and MPJE, working in a pharmacy during school is more beneficial than not working or working in a non-pharmacy setting, but working more than 20 hours per week appears to diminish that benefit.
It is also interesting that, after controlling for the mean ability of students nested within individual schools, none of the ACPE school demographics were significant predictors of exam performance. This result warrants further investigation to determine whether these school demographic variables are important contributors to exam performance.
The effect sizes associated with this analysis are relatively small, but the impact on scores could be up to 5 scaled score points, which may be the difference between passing and failing the exam, particularly given the range and distribution of exam scores.