Abstract
Background/Objectives
The use of mobile ophthalmology applications (MOA) is increasing, but many of these tools have not been validated. This study assessed the accuracy of a popular MOA, Eye Handbook, in measuring five commonly tested vision assessment parameters (distance visual acuity (DVA), near visual acuity (NVA), colour vision testing (CVT), contrast sensitivity (CS), and pupillary distance (PD)) by comparing its measurements with traditional vision assessment methods (TVAM; e.g., Snellen chart, Rosenbaum near card, Ishihara plates, and Pelli-Robson test) performed in the eye clinic setting.
Subjects/Methods
Prospective crossover clinical trial of 129 patients meeting inclusion criteria.
Results
Participants averaged significantly better DVA (p = 0.0008), NVA (p < 0.0001), and CVT (p = 0.0105) with the MOA than with the TVAM, but all three MOA assessments were predictive of the TVAM values. CS was significantly better with the MOA (p < 0.0001). Linear regression and Spearman correlation were used to assess the effect of CS on NVA; neither showed a clear relationship between the difference in NVA and the difference in CS (see the analysis sketch following the abstract). PD measured by the two methods was in agreement, with no significant difference (p = 0.2889).
Conclusion
The studied MOA offers an effective means of measuring four common vision parameters: DVA, NVA, CVT, and PD. It can potentially be used by eye care providers, other health care providers, and patients, both as a screening tool with a correction factor and as a way to monitor ocular pathologies. Atypical MOA measurements should prompt formal testing in the clinic with TVAMs.
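A minimal sketch of how the paired MOA-versus-TVAM comparison and the difference-in-NVA versus difference-in-CS analysis could be implemented is shown below, using Python with NumPy and SciPy. The synthetic placeholder data, the variable names, and the choice of the Wilcoxon signed-rank test as the paired comparison are assumptions made for illustration only; they are not taken from the study.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 129  # number of participants reported in Subjects/Methods

# Placeholder values standing in for near visual acuity (NVA, e.g. logMAR)
# and contrast sensitivity (CS, e.g. log units) measured with the app (MOA)
# and with the traditional clinic method (TVAM).
nva_moa = rng.normal(0.10, 0.20, n)
nva_tvam = rng.normal(0.15, 0.20, n)
cs_moa = rng.normal(1.60, 0.15, n)
cs_tvam = rng.normal(1.80, 0.15, n)

# Paired comparison of MOA vs TVAM for one parameter; the Wilcoxon
# signed-rank test is one reasonable choice for paired vision data.
w_stat, p_paired = stats.wilcoxon(nva_moa, nva_tvam)
print(f"NVA, MOA vs TVAM: p = {p_paired:.4f}")

# Relationship between the difference in NVA and the difference in CS,
# as in the abstract: Spearman correlation plus simple linear regression.
d_nva = nva_moa - nva_tvam
d_cs = cs_moa - cs_tvam
rho, p_rho = stats.spearmanr(d_nva, d_cs)
fit = stats.linregress(d_cs, d_nva)
print(f"Spearman rho = {rho:.3f} (p = {p_rho:.4f})")
print(f"Linear fit: slope = {fit.slope:.3f}, R^2 = {fit.rvalue**2:.3f}")

The same pattern extends to DVA, CVT, and PD by substituting the corresponding paired measurements for each parameter.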
Data availability
The data collected and analysed in this study are not publicly available owing to institutional policy and the need to protect protected health information (PHI). However, de-identified datasets can be shared upon reasonable written request to the corresponding author. Requests will be reviewed and are subject to institutional approval for release of the datasets.
Acknowledgements
Jibran Sharieff, MD (PGY-1 ophthalmology resident at Dean McGee Eye Institute) assisted with instruction on testing methods and manuscript editing.
Author information
Authors and Affiliations
Contributions
AR and SC were responsible for supervising and conducting the research; extracting and analysing data; interpreting results; and writing the first draft of the manuscript. JH, TP, AL, and CL were responsible for conducting the data gathering; extracting and analysing the data; and creating tables and figures. JDD was responsible for the primary statistical analysis for the study. KMR was responsible for supervising all study participants; reviewing the collected data; and editing and finalizing the submitted final draft of the manuscript.
Corresponding author
Ethics declarations
Competing interests
The authors declare no competing interests.
Additional information
Publisher’s note: Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Raney, A., Cottom, S., Huff, J. et al. Accuracy of a commonly used mobile ophthalmology application’s vision assessment tools in measuring five vision assessment parameters. Eye (2024). https://doi.org/10.1038/s41433-024-03315-7