
Reliability of YouTube© videos in terms of cardiopulmonary resuscitation education

Cardiopulmonary resuscitation education by YouTube© videos

Original Research DOI: 10.4328/ACAM.22776

Authors

Affiliations

1Department of Emergency Medicine, School of Medicine, Alaaddin Keykubat University, Alanya Education and Research Hospital, Antalya, Türkiye

2Department of Pathology, School of Medicine, Alaaddin Keykubat University, Alanya Education and Research Hospital, Antalya, Türkiye

3Department of Emergency Medicine, School of Medicine, Ondokuzmayıs University, Medical Faculty Hospital, Samsun, Türkiye

Corresponding Author

Abstract

Aim People often turn to video-sharing platforms for medical information. In this study, we aimed to evaluate the scientific correctness of videos uploaded to one of the most popular video-sharing websites, YouTube©.
Methods This research was performed by entering the keyword “Cardiopulmonary Resuscitation” into the YouTube© platform on March 10th, 2024. For each video, the download date, number of views, video type (lecture, scenario, narrative, animation or manikin), number of likes and duration were recorded. The modified DISCERN Score (mDS) and the Global Quality Scale (GQS) were then applied to the videos by the authors.
Results Eighty relevant videos were included in the study. By type, there were 20 lecture, 19 scenario, 15 manikin, 14 narrative and 12 animation videos. Manikin videos were the most viewed (186,905 views in total; mean: 12,460) and the most liked (1,510 likes in total; mean: 100.6). Animation videos had the highest scores for both the mDS and the GQS (4 and 4.2, respectively). No statistically significant difference was found between video types in mDS or GQS scores.
Conclusion Using manikins may be a useful method for basic life support and cardiopulmonary resuscitation education of the public.

Keywords

YouTube©; basic life support; cardiopulmonary resuscitation; modified DISCERN; global quality scale

Introduction

Survival of patients suffering cardiac arrest depends mainly on early recognition and treatment. Proper education is essential for effective cardiopulmonary resuscitation (CPR), which, along with initial patient assessment and activation of emergency medical services, is a component of Basic Life Support (BLS) [1].
In the current digital world, various digital platforms offer easier and faster access to information, and people often turn to video-sharing platforms for medical information. The most popular video-sharing website, YouTube©, generates 2 billion views daily; a new video is uploaded to YouTube© every minute, and each user spends an average of 15 minutes on the site daily. However, the scientific correctness of these videos is controversial [2].
Since any individual, regardless of background and reliability, can upload content, there is no guarantee of the quality, accuracy and integrity of the information shared on these platforms. This can negatively affect patients and mislead them regarding the prevention and treatment of diseases [3]. Hence, it is essential to evaluate the information provided to patients with scientific scales [4].
For this purpose, the modified DISCERN Score (mDS) and the Global Quality Scale (GQS) were applied to the videos.
Thorough and continuous screening of such videos is essential to protect the public from misinformation. In this article, we aimed to evaluate YouTube© videos on CPR in terms of quality and accuracy.

Materials and Methods

This research was performed by entering the keyword “Cardiopulmonary Resuscitation” into the YouTube© platform on March 10th, 2024.
For each video, the download date, number of views, video type (lecture, scenario, narrative, animation or manikin), number of likes and duration were recorded.
Exclusion criteria of this study were as follows:
• Reels,
• advertisements,
• repeated videos,
• videos uploaded in any language other than English,
• videos made for fun,
• videos without sound and explanation.
The mDS and the GQS were then applied to the videos by the authors.
The mDS has five questions; a “yes” answer is scored as “1” and a “no” answer as “0”. The “yes” answers are summed to yield a reliability score of 0–5. The five questions are:
1) Are the aims clear and achieved?
2) Are reliable sources of information used?
3) Is the information presented balanced and unbiased?
4) Are additional sources of information listed for patient reference?
5) Does it refer to areas of uncertainty?
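The mDS tally described above can be sketched in a few lines of code (a minimal illustration for clarity only, not part of the original study; the function and variable names are our own):

```python
# Minimal sketch of modified DISCERN (mDS) scoring: five yes/no
# criteria, each "yes" contributing 1 point, total range 0-5.
MDS_CRITERIA = (
    "Are the aims clear and achieved?",
    "Are reliable sources of information used?",
    "Is the information presented balanced and unbiased?",
    "Are additional sources of information listed for patient reference?",
    "Does it refer to areas of uncertainty?",
)

def mds_score(answers):
    """Sum of yes (True) answers to the five mDS questions."""
    if len(answers) != len(MDS_CRITERIA):
        raise ValueError("expected one answer per mDS criterion")
    return sum(bool(a) for a in answers)

# Example: a video meeting four of the five criteria scores 4.
print(mds_score([True, True, True, False, True]))  # -> 4
```
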
Video quality was also evaluated using the GQS, a 5-point scale for rating the overall quality of a video:
1) Poor quality, poor flow, most information missing, not at all useful for patients;
2) Generally poor quality and poor flow, some information listed but many important topics missing, of very limited use to patients;
3) Moderate quality, suboptimal flow, some important information adequately discussed but other points poorly discussed, somewhat useful for patients;
4) Good quality and generally good flow, most of the relevant information listed but some topics not covered, useful for patients;
5) Excellent quality and excellent flow, very useful for patients [5-8].
Statistical analyses were performed using the Statistical Package for the Social Sciences (version 22, SPSS Inc, Chicago, IL). The Kolmogorov–Smirnov test was first used to assess the normality of the data. The Mann–Whitney U test was used to compare groups, and a p-value ≤ 0.05 was considered statistically significant.
In addition, the top ten videos were analyzed in detail and presented in a table.
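To illustrate the nonparametric comparison used here, the Mann–Whitney U statistic can be computed directly (a didactic pure-Python sketch with synthetic numbers, not the study data; the analysis itself was run in SPSS, and in Python one would typically call SciPy's `scipy.stats.mannwhitneyu`, which also returns the p-value):

```python
def mann_whitney_u(a, b):
    """Smaller of the two Mann-Whitney U statistics for independent
    samples a and b, assigning average ranks to tied values."""
    combined = sorted([(v, 0) for v in a] + [(v, 1) for v in b])
    ranks = [0.0] * len(combined)
    i = 0
    while i < len(combined):
        # find the run of tied values starting at i
        j = i
        while j < len(combined) and combined[j][0] == combined[i][0]:
            j += 1
        avg_rank = (i + j + 1) / 2.0  # average of ranks i+1 .. j
        for k in range(i, j):
            ranks[k] = avg_rank
        i = j
    # rank sum of the first sample, then the two U statistics
    r1 = sum(r for r, (_, g) in zip(ranks, combined) if g == 0)
    n1, n2 = len(a), len(b)
    u1 = r1 - n1 * (n1 + 1) / 2.0
    return min(u1, n1 * n2 - u1)

# Two hypothetical groups of quality scores (synthetic values):
print(mann_whitney_u([1, 3, 5], [2, 4, 6]))  # -> 3.0
```
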
Ethical Approval
This study did not require ethical approval according to the relevant guidelines.

Results

A total of 303 videos related to CPR uploaded to YouTube© over a one-year period were screened. Of these, 223 were excluded as “Reels”, advertisements, videos in a language other than English, or repeated videos. The remaining 80 relevant videos were included in the study. These 80 videos had a total of 198,429 views and 11,620 “likes”. Their total duration was 413.21 minutes, with a mean of 5.1 minutes. The mean mDS score of the videos was 3.67, and the mean GQS was 3.46.
Of these videos, 20 were lecture videos, 19 scenario videos, 15 manikin videos, 14 narrative videos and 12 animation videos. The 20 lecture videos were viewed 6,108 times (mean: 305.4) and liked 184 times (mean: 9.2); their mean mDS and GQS scores were each 3.4. The 19 scenario videos were viewed 1,776 times (mean: 93.5) and liked 79 times (mean: 4.2); their mean mDS and GQS scores were 3.5 and 3.1, respectively. The 15 manikin videos were viewed 186,905 times (mean: 12,460) and liked 1,510 times (mean: 100.6); their mean mDS and GQS scores were 3.8 and 3, respectively. The 14 narrative videos were viewed 3,045 times (mean: 217.5) and liked 130 times (mean: 9.2); their mean mDS and GQS scores were 3.7 and 3.3, respectively. The 12 animation videos were viewed 702 times (mean: 58.5) and liked 44 times (mean: 3.6); their mean mDS and GQS scores were 4 and 4.2, respectively. No statistically significant difference was found between video types in mDS or GQS scores. Comparisons of the videos by category are summarized in Table 1.
When the top 10 videos were investigated, the oldest had been uploaded 13 years earlier; the mean upload age of the videos was 7.1 years. The most viewed video was an animation with 19 million views and 45,000 likes. The longest was a scenario video at 13 minutes and 44 seconds. The detailed data are provided in Supplementary Table S1.

Discussion

In a 2014 study by Yaylacı et al. investigating the accuracy of YouTube© videos on basic life support and CPR, the median duration of the videos was 165 seconds [1]; in our study, the mean duration was 306 seconds. This increase in video duration over time may be linked to significant developments in BLS. In the above-mentioned study, it was also determined that only 11.5% of the detected videos were compatible with the 2010 CPR guidelines with regard to the sequence of interventions. Even though the researchers concluded that YouTube© videos could be useful in public education on BLS and CPR, they also emphasized that it is not easy to find videotaped material on YouTube© demonstrating properly performed CPR [1]. The scarcity of properly performed CPR videos implies an abundance of improper ones. Accordingly, when the mDS and GQS of the videos in our study are considered, the videos are not sufficient for education. In a study investigating the JAMA, modified DISCERN, GQS and DISCERN scores of videos on hemorrhoids, the scores were low in each evaluation method; compatible with our results, YouTube© videos on hemorrhoids lacked accurate and reliable information [8]. In a cross-sectional observational study of Instagram posts related to chest pain, posts by physicians were highly reliable and had a better GQS. While Instagram posts produced by medical experts and the healthcare sector were more reliable and accurate, the DISCERN scores and GQS of posts from other sources (hospitals, dieticians, patients and healthcare organizations) were low [9].
Recently, Kang et al. evaluated the evidence hierarchy supporting medical claims in online videos created by healthcare professionals, and investigated correlations between evidence quality, engagement metrics (views and likes) and traditional quality scores (DISCERN, the JAMA benchmark criteria and the GQS). They found not only that many videos contained information not based on evidence, but also that low-quality videos received more views and likes than high-quality ones [10].
In our study, the most viewed and most liked videos were those involving manikins for BLS education. One study found that videos with high download rates had higher scores for compatibility with contemporary guidelines [1]. However, although manikin videos were the most viewed and most liked, animation videos had the highest mDS and GQS in our study.
In a study evaluating the quality and content of YouTube© videos on protection from coronavirus with the GQS, only 30.2% were found to be of good quality [11]. In another study of coronavirus videos, entertainment videos achieved higher DISCERN scores than interviews and news items [10]. Poor-quality information in YouTube© videos may misinform patients and lead them to make wrong decisions [8].
Analysis of the top 10 videos also revealed that their mean upload age was 7.1 years. Considering that information on BLS is constantly updated, it is clear that the uploaded videos should also be updated continually; videos that lag behind scientific developments are likely to provide outdated information to viewers. As one of the first and most widely used social media platforms, YouTube© has a significant impact on education [12]. While videos have many benefits, it is essential that their accuracy is verified and that they are updated in light of changing medical knowledge. Numerous studies have shown that medical videos found on YouTube© can lead to misinformation and misguidance [13-15]. Social media platforms such as YouTube©, which play a significant role in education, have a responsibility to develop mechanisms to prevent misinformation and stop the spread of disinformation.

Limitations

Our study has some limitations. Because the analysis reflects a single point in time, the numbers of views and likes may change over time. In addition, the videos were not classified according to the organizations that produced them; videos from more reliable organizations may be more trustworthy.

Conclusion

Owing to advantages such as its visual nature, YouTube© attracts considerable attention from internet users and, if used well, can benefit people. However, using YouTube© videos for educational purposes is risky in terms of accuracy and reliability. Filtering videos under the supervision of a scientific committee could help people access more accurate information and minimize these risks; such a committee should also require video producers to keep the information in their videos up to date. The results of our study revealed that YouTube© videos are not of sufficient quality for education, and interventions based on these videos may lead to errors in the approach to patients with cardiac arrest.

Declarations

Animal and Human Rights Statement

All procedures performed in this study were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki Declaration and its later amendments or comparable ethical standards.

Data Availability

The datasets used and/or analyzed during the current study are not publicly available due to patient privacy reasons but are available from the corresponding author on reasonable request.

Conflict of Interest

The authors declare that there is no conflict of interest.

Funding

None.

Scientific Responsibility Statement

The authors declare that they are responsible for the article’s scientific content, including study design, data collection, analysis and interpretation, writing, some or all of the preparation and scientific review of the contents, and approval of the final version of the article.

References

  1. Yaylaci S, Serinken M, Eken C, et al. Are YouTube videos accurate and reliable on basic life support and cardiopulmonary resuscitation?. Emerg Med Australas. 2014;26(5):474-7. doi:10.1111/1742-6723.12274.
  2. Maganur PC, Hakami Z, Raghunath RG, et al. Reliability of educational content videos in YouTube™ about stainless steel crowns. Children (Basel). 2022;9(4):571. doi:10.3390/children9040571.
  3. Yilmaz H, Aydin MN. YouTube™ video content analysis on space maintainers. J Indian Soc Pedod Prev Dent. 2020;38(1):34-40. doi:10.4103/JISPPD.JISPPD_215_19.
  4. Di Spirito F, Giordano F, Di Palo MP, et al. Reliability and accuracy of YouTube peri-implantitis videos as an educational source for patients in population-based prevention strategies. Healthcare (Basel). 2023;11(14):2094. doi:10.3390/healthcare11142094.
  5. Barman Kakil Ş, Arslan N. Retinal vein occlusion on YouTube: how reliable is public health information in the digital era?. Int Ophthalmol. 2025;45(1):475. doi:10.1007/s10792-025-03854-2.
  6. Park JH, Christman MP, Linos E, Rieder EA. Dermatology on Instagram: an analysis of hashtags. J Drugs Dermatol. 2018;17(4):482-4.
  7. Rodriguez-Rodriguez AM, Blanco-Diaz M, de la Fuente-Costa M, et al. Review of the quality of YouTube videos recommending exercises for the COVID-19 lockdown. Int J Environ Res Public Health. 2022;19(13):8016. doi:10.3390/ijerph19138016.
  8. Mendes SS, Oliveira R, Gonçalves R, Caetano AC. Evaluation of online content of proctological disorders in the Portuguese language. Rev Esp Enferm Dig. 2022;114(7):400-4. doi:10.17235/reed.2021.8333/2021.
  9. Gudapati JD, Franco AJ, Tamang S, et al. A study of global quality scale and reliability scores for chest pain: an Instagram-post analysis. Cureus. 2023;15(9):e45629. doi:10.7759/cureus.45629.
  10. Kang E, Lee H, Choi J, Ju H. The quality of evidence of and engagement with video medical claims. JAMA Netw Open. 2026;9(1):e2552106. doi:10.1001/jamanetworkopen.2025.52106.
  11. Babayiğit MA. Evaluation of the quality and the content of YouTube videos in Turkish on protection from coronavirus. J Health Sci Med. 2022;5(1):301-5. doi:10.32322/jhsm.1021618.
  12. Middelberg LK, Mason AE, Miller S, Helwig S, McKenzie LB. Risky social media challenges: a scoping review, 2000-2024. Inj Epidemiol. 2025;13(1):1. doi:10.1186/s40621-025-00647-0.
  13. Özeller E, Toçoğlu M, Aktı-Çakır E, Hüsna SÇ. Analyzing Turkish-language HPV vaccination videos on YouTube: Assessing content quality and educational value. Infect Dis Clin Microbiol. 2025;7(3):310-9. doi:10.36519/idcm.2025.699.
  14. Mezzapesa F, Bilancia EP, Afonina M, et al. Is YouTube™ a reliable source of information for the current use of HIPEC in the treatment of ovarian cancer?. Cancers (Basel). 2025;17(19):3222. doi:10.3390/cancers17193222.
  15. Khare S, Erridge S, Chidambaram S, Sodergren MH. Misinformation about medical cannabis in YouTube videos: systematic review. JMIR Form Res. 2025;9:e76723. doi:10.2196/76723.

Additional Information

Publisher’s Note
Bayrakol MP remains neutral with regard to jurisdictional and institutional claims.

Rights and Permissions

Creative Commons License

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License (CC BY-NC 4.0). To view a copy of the license, visit https://creativecommons.org/licenses/by-nc/4.0/

About This Article

How to Cite This Article

Ali Kemal Erenler, Behice Hande Erenler, Ahmet Baydın. Reliability of YouTube© videos in terms of cardiopulmonary resuscitation education. Ann Clin Anal Med 2026; DOI: 10.4328/ACAM.22776

Received:
June 16, 2025
Accepted:
January 19, 2026
Published Online:
March 11, 2026