S. Ayhan Çalışkan1,2, Gülşen Taşdelen Teker3, Hatice Mavioğlu4, Özgül Ekmekci5, Figen Gökçay5, Figen Eşmeli6, Semiha Kurt7, Füsun Ferda Erdoğan8, Meltem Demirkiran9, Bijen Nazlıel10, Esen Saka11

1Department of Medical Education, United Arab Emirates University, College of Medicine and Health Sciences, Al Ain, United Arab Emirates
2Department of Medical Education, İzmir University of Economics, Faculty of Medicine, İzmir, Türkiye
3Department of Medical Education and Informatics, Hacettepe University Faculty of Medicine, Ankara, Türkiye
4Department of Neurology, Celal Bayar University Faculty of Medicine, Manisa, Türkiye
5Department of Neurology, Ege University Faculty of Medicine, İzmir, Türkiye
6Department of Neurology, Balıkesir University Faculty of Medicine, Balıkesir, Türkiye
7Department of Neurology, Tokat Gaziosmanpaşa University Faculty of Medicine, Tokat, Türkiye
8Department of Neurology, Erciyes University Faculty of Medicine, Kayseri, Türkiye
9Department of Neurology, Çukurova University Faculty of Medicine, Adana, Türkiye
10Department of Neurology, Gazi University Faculty of Medicine, Ankara, Türkiye
11Department of Neurology, Hacettepe University Faculty of Medicine, Ankara, Türkiye

Keywords: Board exam, computer-based exam, key feature problems, multiple-choice questions.

Abstract

Objectives: This study aimed to describe the implementation of the computer-based Turkish National Neurology Board Examination (TNNBE), which was digitalized by the Turkish Board of Neurology (TBN) using an open-source learning management system to improve accessibility, and to report candidates’ feedback on the process.

Materials and methods: Neurology academics submitted items to the exam pool. From this pool, TBN members selected 70 multiple-choice questions (MCQs) and 10 key feature problems (KFPs), each containing two to four items (totaling 30 questions), through 20 h of online meetings and discussions to ensure content validity. Standard setting was performed using the Nedelsky method for MCQs and the Angoff method for KFPs. Instructions and a sample exam were created and sent to candidates for piloting. Sixty neurologists (35 females, 25 males; mean age: 30.6±1.48 years; range, 28 to 36 years) who fulfilled the eligibility criteria took the exam, which was conducted in December 2023 at a single venue under supervision, with online security measures such as individual login passwords, question shuffling, and browser lockdown. The total exam time was 130 min, divided into an MCQ section followed by a KFP section. Each question was worth 1 point, with a maximum of 100 points obtainable. The MCQ scores were calculated automatically by the learning management system, whereas KFP responses were downloaded and scored independently by two board members, with a final consensus mark determined through discussion among all board members. The final score was the sum of the MCQ and KFP scores. Candidates’ feedback was obtained via an online survey using a 9-point scale (1=strongly disagree/very bad; 9=strongly agree/very good).
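The scoring pipeline described above can be illustrated with a minimal sketch (Python); the candidate identifiers, example marks, and function name are hypothetical and are not part of the published protocol:

    # MCQ scores are exported automatically from the learning management system
    # (1 point per correct answer); KFP responses are marked by two board members
    # and a consensus mark is agreed on in a board discussion.
    mcq_auto_scores = {"cand_01": 48, "cand_02": 52}
    kfp_marks = {
        "cand_01": {"rater_a": 15.5, "rater_b": 17.0, "consensus": 16.0},
        "cand_02": {"rater_a": 20.0, "rater_b": 19.0, "consensus": 19.5},
    }

    def final_score(candidate: str) -> float:
        """Final score = MCQ auto score + consensus KFP mark (maximum 100)."""
        return mcq_auto_scores[candidate] + kfp_marks[candidate]["consensus"]

    print(final_score("cand_01"))  # 64.0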

Results: Seven MCQs were omitted from the exam set for various reasons. The mean scores were 47.73±6.61 for MCQs, 16.09±3.82 for KFPs, and 63.83±8.79 overall. Thirty (50.0%) candidates scored above the minimum acceptable level of performance (65/100) and passed the exam. The mean score percentage for the MCQ section (68.2%) was significantly higher than that for the KFP section (53.1%; p<0.001). Cronbach's alpha coefficients for the MCQ (63 items) and KFP (30 items) sections were 0.75 and 0.67, respectively. Fifty-six candidates provided feedback, which was generally positive, indicating that the exam venue was comfortable (X̄=6.02±2.42), the digital format was easy to use (X̄=5.44±2.62), and the exam user interface was convenient (X̄=5.96±2.45). The highest satisfaction was with the inclusion of clinical case questions (X̄=6.63±2.21). Candidates also found the KFP section (X̄=6.59±1.80) more challenging than the MCQ section (X̄=6.13±1.61).
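For reference, the reported Cronbach's alpha coefficients follow the standard internal-consistency formula, alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). A minimal sketch (Python with NumPy), assuming an item-response matrix of candidates by items, could look as follows; this is illustrative only and not the authors' analysis code:

    import numpy as np

    def cronbach_alpha(item_scores):
        """Cronbach's alpha for an (n_candidates x n_items) score matrix."""
        x = np.asarray(item_scores, dtype=float)
        k = x.shape[1]                              # number of items
        item_vars = x.var(axis=0, ddof=1)           # per-item variances
        total_var = x.sum(axis=1).var(ddof=1)       # variance of total scores
        return (k / (k - 1)) * (1 - item_vars.sum() / total_var)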

Conclusion: The computer-based TNNBE effectively streamlined the exam process and received positive feedback from candidates, particularly for its user interface and the inclusion of KFPs. However, KFP scoring remained challenging because of the range of acceptable answers and the small differences in how they could be formulated, which made automated, standardized scoring difficult. In addition, implementing computer-based exams, particularly in large-scale settings, requires advanced technology and logistical planning, which can be costly and challenging to manage.

Cite this article as: Çalışkan SA, Taşdelen Teker G, Mavioğlu H, Ekmekci Ö, Gökçay F, Eşmeli F, et al. Digital transformation of the Turkish national neurology board examination: Implementation and candidates’ feedback. Turk J Neurol 2025;31(3):270-277. doi: 10.55697/tnd.2025.383.