Introduction

The number of people who own smartphones is growing rapidly. In 2016, there were 3.7 billion smartphone users worldwide, while the estimated number for 2022 was over 6.5 billion1. At the same time, the number of medical applications is growing, which has been especially noticeable over the last few years2,3,4. With the increasing number of medical applications, the value of the medical applications market is growing as well: in 2021 it was valued at USD 5.37 billion, and by 2029 it is estimated to be worth USD 38.47 billion5. In a study conducted among smartphone users, almost half of the respondents (47.7%) used at least one health app related to wellness, fitness, prevention, medication, treatment or follow-up care6.

The constantly growing mobile application market has also met the needs of radiologists, and there are many applications related to diagnostic imaging. These include radiological image viewers (Digital Imaging and Communications in Medicine viewers, so-called DICOM viewers), platforms for exchanging images between doctors or medical centers, instructions for correct patient positioning, guidelines of radiological scientific societies, applications for learning radiological anatomy, and quizzes and question banks on radiological topics. Mobile radiological applications are also used for remote image interpretation, which is especially useful in places where no radiologists are available7. Unfortunately, applications intended for clinical use are most often used for educational purposes due to the lack of appropriate certification. However, it has been shown that the interpretation of CT examinations on tablets or smartphones is possible despite the small screen resolution, and that the assessment made with a mobile device is similar to that made with a picture archiving and communication system (PACS) workstation8,9. Using mobile devices to evaluate diagnostic images is feasible because they have enough computing power to quickly process full examinations7. Radiology applications are also used to educate students and doctors10. Moreover, radiological journals, out of concern for their readers, create mobile applications that facilitate access to their content11.

Other authors comparing mobile solutions in radiology have taken different approaches to selecting data sources for analysis. An article on the mobile medical apps market published on globenewswire.com presents a qualitative study and analysis of mobile medical apps in general; the report is economic in nature and primarily addresses business and IT issues5. Chen et al. present a pair of tools for radiology education and describe their specific characteristics; their work outlines the capabilities of the selected platforms and teaching methods and was inspired by the need to implement remote learning during the COVID-19 pandemic10. Rahmani et al. looked at mobile apps used by radiology journals; only the Apple® App Store™ was searched for this purpose11. Kufel et al. created a systematic review of the scientific literature comparing mobile DICOM image viewers described in reports available in public medical databases (e.g., PubMed)12.

We conducted our own study aimed at evaluating radiology-related mobile applications available in the Google Play store and the App Store, and then selected those that are the best and most useful for users.

Methodology

Search strategy and selection criteria

The applications were selected by entering the term 'radiology' into the Polish versions of the Google Play store and the App Store. The necessary data for all the applications found were then entered into a spreadsheet. The data concerned the origin of the application (Google Play, App Store), the number of downloads, the date of the last update, the presence of the developer's contact details, and the application name. In addition, applications were classified in terms of use (educational/clinical). The presentation of the application in a given store was also assessed (visual presentation by means of videos and photos, the presence of a website related to the application, and a description of the application). The App Store does not present information about the number of downloads, the application's store rating, or the developer's contact details.

Data on payment for the application (app price and in-app payments) were also obtained; however, this parameter was not included in the application evaluation.

Data extraction and quality assessment

In September 2022, Google Play and the App Store were searched. Due to large discrepancies in data availability between these two stores, it was not possible to evaluate the applications in the same way. Therefore, they were assessed independently, using separate methodologies:

Google Play

Assessment of the visual presentation of the application (based on data from the store)

Scoring system used during the visual evaluation of the application:

  • Presence of photo(s)—1 point—Photos allowed for an initial assessment of the application's functionality and familiarization with its capabilities.

  • Presence of video(s)—1 point—A video presents the software's capabilities and its usefulness to a potential user more effectively.

  • Presence of a description—1 point—An application with a description is more credible, in particular a description of the updates and improvements made, which gives the impression that the application has not been abandoned.

  • Presence of a dedicated website—1 point—A website usually contains the author's contact details (enabling technical support), an FAQ, and a detailed description of the application's capabilities and functionality.

In total, the application could be assessed visually from 0 to 4 points.
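To make the scoring concrete, a minimal Python sketch of this visual assessment is shown below, assuming each criterion is recorded as a yes/no flag in the spreadsheet (the function and parameter names are ours, introduced only for illustration):

```python
def visual_presentation_score(has_photos: bool, has_videos: bool,
                              has_description: bool, has_website: bool) -> int:
    """One point per element present in the store listing, 0-4 points in total."""
    return sum([has_photos, has_videos, has_description, has_website])

# Example: a listing with photos, a description and a dedicated website, but no video.
print(visual_presentation_score(True, False, True, True))  # 3 points
```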

Last update date

The last update date may indicate the author's ongoing involvement in the application: updating its content, improving its operation and functionality, and correcting any errors. It is also the most reliable indicator of how current the contained data are. To assess the last update date, date ranges were used, to which point values from 0 to 5 were assigned (Table 1).

Table 1 Points assignment system for the date of the last update.

Number of installations

The number of application installations was divided into numerical ranges, to which point values from 0 to 5 were assigned. This was made possible by the fact that download numbers in the Google Play store are rounded to whole values (Table 2).

Table 2 Point assignment system for the number of installations.
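Both Table 1 and Table 2 follow the same mechanism: a raw value (the year of the last update or the rounded number of installations) is compared against ordered cut-offs and converted into 0 to 5 points. The sketch below illustrates that mechanism only; the cut-off values are placeholders, since the actual ranges are those defined in Tables 1 and 2.

```python
from bisect import bisect_right

def score_from_ranges(value: float, cutoffs: list[float]) -> int:
    """Return the number of cut-offs the value meets or exceeds (0 to len(cutoffs) points)."""
    return bisect_right(sorted(cutoffs), value)

# Placeholder cut-offs for illustration only; the real ranges are in Tables 1 and 2.
INSTALL_CUTOFFS = [1_000, 5_000, 10_000, 50_000, 100_000]      # hypothetical
UPDATE_YEAR_CUTOFFS = [2018, 2019, 2020, 2021, 2022]           # hypothetical

installs_points = score_from_ranges(25_000, INSTALL_CUTOFFS)   # -> 3 points
update_points = score_from_ranges(2021, UPDATE_YEAR_CUTOFFS)   # -> 4 points
```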

Finally, a formula was created that, based on all the components, awarded points to individual applications from the database and assigned them the appropriate category from A to F (Table 3).

$$\Sigma\,\text{Ultimate points} = \frac{X_{1} + X_{2} + X_{3}}{14} \times 100\%$$
(1)
Table 3 Category assignment system.

where X1, X2, and X3 are the points awarded for the visual presentation, the date of the last update, and the number of installations, respectively, and 14 is the maximum number of points that could be scored across the assessed categories.
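As a rough illustration of how Eq. (1) and Table 3 fit together, the sketch below combines the three component scores into the final percentage and maps it to a letter category. The category boundaries used here are placeholders; the actual boundaries (and the full A to F scale) are those given in Table 3.

```python
def ultimate_score(visual: int, last_update: int, installs: int) -> float:
    """Eq. (1): the percentage of the maximum 14 points (4 + 5 + 5)."""
    return (visual + last_update + installs) / 14 * 100

def category(percent: float) -> str:
    """Map a percentage to a letter category using placeholder boundaries (see Table 3)."""
    for boundary, letter in [(80, "A"), (60, "B"), (40, "C"), (20, "D")]:
        if percent >= boundary:
            return letter
    return "E"

score = ultimate_score(visual=3, last_update=4, installs=3)
print(round(score, 1), category(score))  # 71.4 B (with the placeholder boundaries)
```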

App Store

Visual evaluation of the application (based on data from the store)

Unfortunately, due to the insufficient amount of available data, we were unable to perform a visual evaluation of these applications at a level comparable to that of the applications from the Google Play store.

Last update date

It may suggest the degree of the developers' involvement in the application's functioning, their care for the optimal, error-free operation of the program, and the currency of the knowledge or clinical data contained in the application.

Number of installations

Unfortunately, this statistic is not available in the App Store, so it was omitted from the analysis.

Detailed inclusion and exclusion criteria

For an application to qualify for this study, it had to meet all of the following criteria: it appeared in the results of a search for "radiology" in a mobile application store's search engine, it had a clinical or educational use, and it dealt with radiology.

An application was excluded from this study if it met at least one of the following criteria: it did not deal with radiology, it was related to scientific publishers or conferences, or it was of an entertainment nature (e.g., body-scanner games).

Ethical approval

Not applicable. The study we conducted was not a medical experiment. It was conducted without the involvement of humans or animals, and no biological materials derived from humans or animals were used. Therefore, no medical or personal information of any kind was collected from study participants.

Google Play and App Store review results

Google Play

The initial search returned 306 records. 78 applications were excluded from the study, with the reasons recorded. The remaining applications were added to the spreadsheet for further review of the information covered in detail in the Methodology section. Ultimately, 228 applications were included in the further analysis. The whole process is presented in the flow diagram (Fig. 1).

Figure 1. Google Play—flow diagram.

The largest group in terms of the number of downloads (130 applications) is the group with 0 to 1,000 downloads. The least numerous group, with > 100,001 downloads, includes only 2 applications (Fig. 2).

Figure 2. Number of Google Play app downloads in numerical ranges.

The number of applications updated in 2022 was 63, which shows that the most up-to-date applications account for 27.63% of all qualified applications from the Google Play store (Fig. 3).

Figure 3. Number of applications that were last updated in a given year.

Most of the applications qualified for the review were completely free (207 of 228). Five required payment before downloading, while the remaining 16 were free to download with the possibility of additional paid purchases within the application. Prices ranged from PLN 9.19 (EUR 1.99, USD 2.11) for the “Infection & Arthritis X-ray QZ Skeletal Radiology®” application to PLN 97.99 (EUR 21.26, USD 22.54) for “Navi Radiography Pro®”.

The largest group of all applications qualified for the study were educational applications (154). We also distinguished applications that may be clinically useful (59) and those that may be used both educationally and clinically (15). It should be emphasized that none of the above-mentioned applications has the appropriate authorization for use in clinical practice, and therefore they can only be used as advisory tools.

The letter “A” in the classification denotes the applications with the best scores. They were classified in this category because they represent the highest level of credibility, usefulness and functionality, which is also reflected in the correlation between a high number of votes and a very high user rating. The consistency of the photos, descriptions and videos (if any) with the actual application is also evidenced by positive user comments, along with feedback from the application developers. The letters "B", "C", "D", and "E" represent successively lower levels in visual, quantitative (number of installations) and qualitative terms. The letter "E" denotes the applications at the lowest level. The number of the best applications (“A” category) was equal to the number of the worst (“E” category) and amounted to 6. 35 applications were assigned to the “B” category, 149 to “C” and 32 to “D”.

Of the Google Play store apps included in this study, 6 received the “A” category (Table 4).

Table 4 Mobile applications with “A” category.

App Store

In September 2022, an analysis of the store's offering was carried out and returned 190 results. 33 applications were excluded from the study, with the reasons recorded. The remaining applications were added to the spreadsheet for further review, which is covered in detail in the Methodology section. Ultimately, 157 applications were included in this study. The whole process is presented in the flow diagram (Fig. 4).

Figure 4. App Store—flow diagram.

Most of the apps qualified for the review were free (119). The number of paid applications (19) was equal to the number of free applications with additional in-app purchases (19). Prices for paid applications ranged from PLN 4.99 for 'Thorax—thoracic radiology®' to PLN 529.99 for 'M3 TRR®'.

The largest group of qualified applications comprised applications for educational purposes (104), 4 of which could also be used in medical practice. 49 applications were intended solely for diagnostics in daily clinical practice. It should be noted again that the presented applications do not have the permissions necessary to be officially and legally approved for use in clinical practice, so they can only serve as a helpful tool for both patients and clinicians.

The number of applications updated in 2022 was 36, which shows that the most up-to-date offerings account for 22.92% of all applications analyzed in this review (Fig. 5). It is worth mentioning that, due to missing data on the date of the last update, we were unable to determine this parameter for 22 apps from the App Store.

Figure 5. Number of applications that were last updated in a given year.

Discussion

Strictly medical mobile applications are not very popular among users and deserve more attention and recognition of their potential, both in education and as a helpful tool in everyday clinical practice. Their implementation in clinical settings improves both the efficiency and quality of health care systems and has a positive impact on the time to diagnosis12. The use of mobile devices together with appropriate applications enables the interpretation of examination results, as demonstrated by Schlechtweg in his study8.

When analyzing the above-mentioned data, it should be noted that radiology applications available on mobile devices serve a niche, specialized purpose and do not reach a wide audience. One can come to this conclusion by analyzing the number of installations of individual applications, which for most of them is less than a thousand; in the Google Play store this applies to 130 out of 228 applications qualified for this study. For comparison, in the Google Play store in September 2022 alone, the TikTok® application was downloaded 23.96 million times and the Instagram application 21.17 million times13. In our work we have shown that the fewest applications are related to clinical practice, whereas most are used for educational purposes. This may be the result of their non-admission for clinical use due to the lack of appropriate permissions and certificates. Before a given application is approved for clinical use, it must first be approved by the relevant authorities of a given country or by another superior organization; in the case of the European Union, this is the European Medicines Agency (EMA)14. In the United States, the authority issuing these permits is the Food and Drug Administration (FDA)15. The first mobile application for viewing images and making diagnoses on MRI, X-ray, ultrasound and other radiographic scans for clinical use was approved by the FDA only in 201116.

However, we are satisfied with the large number of educational applications in both stores: 154 in the Google Play store and 104 in the App Store. Educational applications can be a handy source of knowledge, available anywhere and at any time, both for students and practicing doctors17. De Oliveira et al. present a study on learning endodontics with the use of radiographs of various endodontic conditions and emphasize that this approach positively affects both the learning process (students achieved better learning outcomes) and diagnosis18. Another study showed that retention of radiology lectures is greater with the use of a remote response device: Millor et al. showed that, during learning with an interactive device, the number of correct answers was significantly higher than during passive education (63.6 vs. 53.2%, p < 0.05). The same applies to the percentage of incorrect answers, which was lower for students using the remote response device (23.4 vs. 27.4%, p < 0.005 and 13 vs. 19.5%, p < 0.005, respectively)19. However, an application is not always able to replace traditional teaching methods. In a study by Nilsson et al., the educational achievements of two groups learning ultrasound were compared: one using an application and one using a manual. The use of the application while learning ultrasound was shown not to affect the acquisition of knowledge: there were no statistical differences between the two groups on an objective scale assessing the ability to use ultrasound, and there was no significant difference in diagnostic accuracy. The group which used the app for learning achieved a mean OSAUS (Objective Structured Assessment of Ultrasound Skills) score of 42.3% [95% CI 38.5–46.0%], and the group using only books achieved a mean OSAUS score of 45.3% [95% CI 39.3–51.3%] (d.f. [1.33] = 0.45, p = 0.41)20.

The use of mobile applications for consultation and remote diagnosis, especially in emergency situations, is helpful and can shorten the time to diagnosis21,22. Hidlay et al. showed that, in a sample of 76 head and neck CT examinations, the smartphone provided almost 100% diagnostic accuracy (p = 0.9999) compared with the PACS workstation, using both the workstation and the iPhone®23,24. Handelman et al. showed little difference between identifying emergency radiographs on a smartphone via WhatsApp® (accuracy 79.6% ± 7.9%, 11.5% false positive results, 8.9% false negative results) and at a diagnostic station (accuracy 83.4% ± 6.5%, 8.9% false positive results, 7.7% false negative results), a difference of 6%. However, the authors conclude that such a difference does not disqualify the use of WhatsApp as a helpful application for consulting radiographs25. WhatsApp also proved useful in the remote diagnosis of tibial fractures. Demaerschalk et al. showed that the assessments of vascular neurologists using ResolutionMD® on smartphones did not differ from the interpretations of radiologists diagnosing in the traditional way or of independent arbiters viewing the images remotely26. Thanks to a mobile application implemented in South Korea, radiologists gained greater access to Korean imaging guidelines27. Park et al. and Zimmerman et al. showed little difference between coronary artery CT diagnoses made remotely using smartphones or an iPad (smartphone: κ = 0.59 and κ = 0.72 for severe stenosis; iPad®: κ = 0.62, 95% CI 0.45–0.79) and those made at a workstation (κ = 0.64, 95% CI 0.47–0.81)28,29.

Most applications are free: 75.80% (119 of 157) of the App Store offerings and 90.79% (207 of 228) of the Google Play store offerings, which makes them easily accessible regardless of users' financial capabilities. According to a study by Biviji et al., applications with a higher share of downloads were those that were cheaper, were free with the option of purchasing individual features within the application, had a high user rating, or involved experts in their creation and development30. A large proportion of the applications have been updated recently (in 2021 or 2022): 73 App Store offerings and 94 Google Play store offerings, so it can be assumed that the knowledge contained in them is up-to-date. Users emphasized the importance of customer service responsiveness, speed in solving problems and meeting their requests for additional features31.

In addition, applications with a reliable description, a dedicated website, and photos or a video showing the application's functions will probably be chosen more willingly by users. Lau et al. found that users value the visual aspect and attractive design of applications more than evidence-based components32. Jiang et al. showed that users attach the greatest importance to the description of an application when choosing which one to download; a high-quality description briefly lists and describes the app's features and includes developer contact information as well as information about permissions and requirements33.

Conclusion

Clinical applications could help radiologists in their daily work. First, however, they should be analyzed in order to obtain the appropriate certification, so that they can be legally used in the treatment process. Organizations responsible for issuing the required certificates should pay particular attention to the possibility of using medical applications in clinical practice. Moreover, the applications that received the "A" category may, in our opinion, be a good source of knowledge for medical students and radiology trainees. However, we do not rule out applications placed in other categories; as long as the knowledge they contain is up-to-date and presented in an accessible way, they also fulfill their educational role.

Limitations

We encountered several limitations that prevented us from comparing apps from both stores in a uniform way. Limited data availability, mainly the lack of download counts in the App Store, made it impossible to collect enough information to apply the same evaluation scheme that was used for applications from the Google Play store. Therefore, the rating according to the classification from "A" to "F" was applied only to applications from the Google Play store.

Moreover, our article is a pilot study, and the mobile applications were assessed subjectively, because the missing data made it impossible to create a standardized assessment tool. The limited data availability also made statistical analysis difficult.

Perhaps in the future, with further analysis and greater data availability, a more objective tool can be created and the assessment can be re-evaluated.