Abstract
Remote digital pathology allows healthcare systems to maintain pathology operations during public health emergencies. Existing Clinical Laboratory Improvement Amendments regulations require pathologists to electronically verify patient reports from a certified facility. During the pandemic of coronavirus disease 2019 (COVID-19), caused by the SARS-CoV-2 virus, this requirement potentially exposes pathologists, their colleagues, and household members to the risk of becoming infected. Relaxation of government enforcement of this regulation allows pathologists to review and report pathology specimens from a remote, non-CLIA certified facility. The availability of digital pathology systems can facilitate remote microscopic diagnosis, although formal comprehensive (case-based) validation of remote digital diagnosis has not been reported. All glass slides representing routine clinical signout workload in surgical pathology subspecialties at Memorial Sloan Kettering Cancer Center were scanned on an Aperio GT450 at ×40 equivalent resolution (0.26 µm/pixel). Twelve pathologists from nine surgical pathology subspecialties remotely reviewed and reported complete pathology cases using a digital pathology system from a non-CLIA certified facility through a secure connection. Whole slide images were integrated with the laboratory information system and launched from it into a custom vendor-agnostic whole slide image viewer. Remote signouts utilized consumer-grade computers and monitors (monitor size, 13.3–42 in.; resolution, 1280 × 800 to 3840 × 2160 pixels) connecting to an institutional clinical workstation via secure virtual private network. Pathologists subsequently reviewed all corresponding glass slides using a light microscope within the CLIA-certified department. Intraobserver concordance metrics included reporting elements of top-line diagnosis, margin status, lymphovascular and/or perineural invasion, pathology stage, and ancillary testing. The median whole slide image file size was 1.3 GB; scan time per slide averaged 90 s; and scanned tissue area averaged 612 mm². Signout sessions included a total of 108 cases, comprising 254 individual parts and 1196 slides. Major diagnostic equivalency was 100% between digital and glass slide diagnoses, and overall concordance was 98.8% (251/254). This study reports validation of primary diagnostic review and reporting of complete pathology cases from a remote site during a public health emergency. Our experience shows high (100%) intraobserver digital to glass slide major diagnostic concordance when reporting from a remote site. This randomized, prospective study successfully validated remote use of a digital pathology system, including operational feasibility supporting remote review and reporting of pathology specimens, and evaluation of remote access performance and usability for remote signout.
Introduction
The digital transformation of pathology has allowed for digitization of glass slides to generate whole slide images (i.e., WSIs or digital slides). The primary components of a digital pathology system include a whole slide scanner, software whole slide image viewer, and display monitor. Digital pathology allows pathologists to access, evaluate, and share pathology slides using a digital workflow, and provides facile access for remote sign-out, when appropriate. It offers pathologists a novel method to review and render diagnoses, broadens clinical expertise, and allows for digital management of pathology slides for reporting of patient specimens. An abundance of literature has been published demonstrating the concordance of diagnoses rendered using WSIs compared with glass slides [1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25]. These studies have shown that reporting of pathology specimens is readily achieved using WSIs and can be incorporated into routine anatomic pathology laboratory workflow. Furthermore, digital pathology can be leveraged to create and enhance a digital workflow for pathologists. At cancer centers such as Memorial Sloan Kettering Cancer Center (MSK), patients benefit from experienced sub-specialized expert pathologists. Digital pathology has also enhanced the pathologist’s experience and has provided efficiency and operational savings [3].
Clinical Laboratory Improvement Amendments (CLIA) regulations require pathologists to electronically verify patient reports from the CLIA-certified facility [26]. However, during the COVID-19 pandemic, this requirement potentially exposed pathologists and trainees to the SARS-CoV-2 virus and thus put the pathologist, colleagues, and household members at risk of infection from a potentially lethal virus. Moreover, in any public health emergency, if pathologists are unable to be present at the CLIA-licensed facility due to illness, quarantine, or other travel restrictions, hospital systems risk losing continuity of patient care and delaying turnaround times of pathology reporting. Many pathologists have secure access (e.g., virtual private networks) through hospital firewalls and can connect to hospital information systems (i.e., laboratory information systems, electronic medical records) from remote locations (e.g., home). During this time, with the advent of digital pathology, novel digital workflows can be implemented and validated for remote use.
This study presents a validation of the digital pathology system at a large academic center in New York City, the worldwide epicenter of the COVID-19 pandemic. The validation encompasses digitization of glass slides generated from formalin-fixed, paraffin-embedded and frozen tissue; and includes hematoxylin & eosin stains, immunohistochemical stains, and special stains. Glass slides included those generated in the MSK histology laboratory as well as slides received as consultation cases from referring institutions. The whole slide imaging process includes pre-analytic quality assurance of the glass slides, the analytic process of placing the glass slides on the whole slide scanner for scanning, and post-analytic quality assurance of the generated whole slide images. The whole slide scanning process includes the scanner capturing images from a glass slide under high resolution and stitching those images together to form a digital file that can be navigated similarly to a glass slide on a microscope, while using a digital workflow. WSI software viewers are developed to view the WSIs for pathologic diagnosis. At MSK, whole slide scanners from multiple vendors and custom viewer software have been implemented and validated to support clinical workflow. The aims of this study include: (1) to assess operational feasibility supporting remote review and primary reporting of pathology specimens using digital pathology systems; (2) to evaluate remote access performance and usability for pathologist remote signout; (3) to validate a digital pathology system for remote primary diagnostic use.
Material and methods
Assessment of remote readiness
Prior to engaging in a large-scale validation study, we sought to assess departmental readiness in terms of the technical access needed for remote signout. A survey was distributed to all clinical members in the department, including all faculty and trainees.
Questions in the remote readiness survey included:
1. Do you have a computer/workstation that you can use to work from home?
2. What is the largest monitor on the computer that you can work on?
3. Do you have internet at home, and if so, what is the network bandwidth?
4. Do you have an institutional virtual private network and 2-factor authentication configured?
5. Do you know how to connect your device to a hospital clinical workstation through remote desktop connection?
Case selection
Prospective selection of patient specimens included cases from our large academic tertiary comprehensive cancer center. Pathologists’ clinical service days were randomly selected for remote digital signout. Each signout day was composed of all specimen classes including biopsies, surgical resections, and departmental consultation material (e.g., pathology material from other institutions) accessioned for a given signout per routine departmental protocol. Each pathologist received their respective case worklist on their randomly assigned clinical service day, comprising that day’s entire clinical volume within their subspecialty. Pathologists reviewed an entire day’s workload including all slides and levels taken from biopsies, all immunohistochemical stains, and special stains submitted with consultation cases. Cases where frozen sections had been performed were reviewed together with the corresponding frozen section control slides as per the pathology department routine workflow. A referee pathologist not participating in case signout verified all cases included in the study by ensuring all cases accessioned for each pathologist’s signout session were scanned and assigned to pathologists’ worklists.
Whole slide scanning protocol
Specimens received from within our hospital (hereafter referred to as “in-house”) underwent conventional procedures for specimen accessioning, gross pathology evaluation, and dissection of tissue into cassettes/blocks, tissue processing, embedding, microtomy, staining, and coverslipping. This study included all cases accessioned and ready for clinical review on each pathologist’s randomly assigned clinical service day. The overall operational workflow from accessioning of the patient case through slide distribution to pathologists is shown in Fig. 1. Glass slide hematoxylin & eosin staining, coverslipping, and drying of mounting media were performed using Leica Spectra instruments (Leica Biosystems, Buffalo Grove, Illinois, USA). The Spectra instruments utilize Leica Universal Slide racks, which can be directly inserted into an Aperio GT450 whole slide scanner (Leica Biosystems, Buffalo Grove, Illinois, USA). After staining and coverslipping, the glass slides were loaded into the Aperio GT450 by laboratory personnel. Glass slides were scanned at ×40 equivalent magnification (0.26 µm/pixel) using a native ×40 objective lens (Leica Biosystems).
For patient specimens received as consultations from referring institutions, the glass slides were generated at the referring institution, shipped to MSK, and then accessioned into the laboratory information system. A departmental consult barcode was affixed to the consult glass slide, which was subsequently placed in Leica Universal Slide racks at the point of accessioning. Glass slide pick-up and drop-off stations were placed in the pathology department accessioning area and histology laboratory. Digital scanning team members routinely circulated to the accessioning area to courier racks of glass slide consults to undergo quality control (see “Image quality control”) and be placed on the Aperio GT450 whole slide scanner. All pathology assets (e.g., specimen container, blocks, slides) had 2D data-matrix barcodes. The barcode on the glass slide label enabled the Leica Aperio eSlide Manager database to interface with the laboratory information system (Cerner CoPathPlus). Whole slide images were accessed through the PICSPlus module in CoPathPlus and launched into a custom whole slide image viewer. Glass slides from accessioning and the histology laboratory were scanned, including, but not limited to, the subspecialties designated for the pathologists’ signouts. The additional glass slides not used as part of the validation study were scanned and reviewed for quality assurance and to perform a technical assessment of the Aperio GT450 whole slide scanner.
Image quality control
All whole slide images were reviewed by technicians or the referee pathologist as part of the quality assurance process prior to initiation of primary digital signout.
There was no change to the laboratory’s current standard operating and quality control procedures for initial handling (accessioning), gross examination, or histologic preparation (processing, embedding, microtomy, and staining) of surgical pathology specimens. The quality control process also involves macro-evaluation of pre-analytic slide artifacts to be resolved before scanning. The following guidelines were provided for quality control: glass slides should be appropriately stained and dried, not broken, and clean of ink markings; coverslips were to be properly placed without hanging over the edge of the glass slide, and air bubbles were to be absent; the glass slide label should be flat and not extending past the slide edge or covering tissue on the slides. For consult slides, this process included a barcoding and tracking station to scan each glass slide; this ensured that the additional MSK laboratory barcode was readable by handheld barcode readers and marked the glass slide as scanned (received for scanning in the laboratory information system). Consult slides without barcodes, or with barcodes that failed reading by the handheld reader, had new patient labels with barcodes printed and placed on the glass slide label area. Additional real-time quality control was performed by the whole slide scanner, which provided user interface messages including barcode detection failures, no tissue detection, macro focus image failure, or image quality errors. Following glass slide scanning and prior to unloading the slides, another quality control evaluation was performed by the digital imaging technologists to ensure the WSI was of adequate image quality. Image thumbnails were reviewed to ensure all tissue present on the glass slide was scanned and wholly represented on the WSI, and to confirm that the slide barcode was decoded and present within the LIS. The whole slide scanner touchscreen display interface provided immediate feedback to the scanning operator, who addressed any scanning errors immediately (e.g., rescan of a slide with an error). An additional post-scan quality control measure included the pathologist review of the WSI. The whole slide image viewer had a custom in-built tool to report a problem with the digital slide, which sends a notification to the digital scanning team to rescan that glass slide (Fig. 2). The pathologist had discretion to ask for a rescan of a glass slide if desired, or if any WSI revealed scanning artifacts.
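For illustration only, the post-scan error categories described above could be tracked programmatically to queue slides for rescan; the following minimal Python sketch uses hypothetical class and field names and is not part of any vendor software.

```python
# Illustrative sketch only: hypothetical names, not vendor software.
from dataclasses import dataclass, field
from typing import List

# Post-scan error categories reported by the scanner interface, per the text above.
QC_ERRORS = {
    "barcode_detection_failure",
    "no_tissue_detected",
    "tissue_detection_failure",
    "macro_focus_failure",
    "image_quality_error",
}

@dataclass
class ScanResult:
    slide_id: str                                   # hypothetical slide identifier
    errors: List[str] = field(default_factory=list)

def slides_needing_rescan(results: List[ScanResult]) -> List[str]:
    """Return IDs of slides whose reported errors match a known QC category."""
    return [r.slide_id for r in results if any(e in QC_ERRORS for e in r.errors)]

batch = [
    ScanResult("slide_001"),
    ScanResult("slide_002", ["barcode_detection_failure"]),
]
print(slides_needing_rescan(batch))                 # ['slide_002']
```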
WSI viewer
The WSI viewer software is an institutional standard, vendor-agnostic, whole slide image viewer. As per the institution scanning protocol, prospective clinical cases have been digitally scanned and interfaced with the laboratory information system. For over 3 years, all pathologists at our institution have used the WSI viewer on a daily basis and reviewed retrospectively archived whole slide images from pathology specimens. The WSI viewer launches from the PICSPlus module in Cerner CoPathPlus and visualizes any scanned whole slide images for that case (e.g., all, multiple, or individual); review of the WSI is controlled through the computer mouse and keyboard. The web-based viewer is initiated within the pathologist’s default web browser. For pathologist navigation, the WSI viewer software includes tools such as thumbnail viewing, slide label viewing, zooming, panning, and co-registration of multiple slides. Annotation tools include a digital ruler for measurements, tracking of viewed slides, heatmap coverage of WSI regions reviewed, screenshots, and the ability to add annotations.
Digital and glass signout
After each glass slide scan was successfully completed on the whole slide scanner, WSIs were immediately available in the laboratory information system, automatically collated by case and in sequential slide order as they exist in the laboratory information system. At our institution, pathology trainees (i.e., fellows) preview and compose a preliminary report for each patient case prior to reviewing with a faculty pathologist for final reporting. For this study, trainees were included in the overall reporting workflow with involvement in handling and previewing of the cases prior to the pathologist’s review. Pathology trainees received the glass slides with respective paper surgical pathology requisitions for each case, as distributed from the central laboratory. In addition, trainees were also provided with access to the digital slides at the time of previewing and had the option to use either glass or digital slides when previewing cases to facilitate social distancing efforts within the department. After trainees entered preliminary reports in the laboratory information system, cases were subsequently assigned to a pathologist for review. Pathologists independently reviewed all cases as digital slides first, from a remote site (e.g., home) via secure access through a virtual private network and 2-factor authentication per institutional policy. Pathologists accessed clinical workstations located within the institutional enterprise firewall and had access to hospital information systems such as the laboratory information system, electronic medical record, and radiology picture archiving and communication system from their remote site. Each clinical workstation was an institutional standard configuration (3.2 GHz, 8 GB random-access memory (RAM), 64-bit processor) with Google Chrome or Microsoft Internet Explorer 11 as the default web browser. From the remote site, pathologists used the laboratory information system to view all surgical pathology requisitions digitally and launch WSIs into the WSI viewer. After all digital slides were reviewed for each case, a complete final report was entered into the laboratory information system, but not electronically released to the medical record from the remote site. Subsequent case re-review was performed in pathologists’ offices (at the CLIA-licensed facility) using glass slides on an Olympus BX series light microscope (i.e., Olympus BX43). Pathologists re-reviewed all glass slides for all cases, mirroring the remote digital WSI review. The interval between initial remote digital review and secondary glass slide review was short (mean, 2 days) to conform with turnaround time reporting requirements. Pathologists reported all cases according to their routine workflow (i.e., free-text, worksheet templates, synoptic worksheets) including all required diagnostic reporting elements. Prior pathology reports and slides from the patient’s previous specimens were made available to the pathologists for both signout sessions via previously archived WSIs. Concordance metrics were captured after each digital and glass slide signout. Other data collected included the remote hardware utilized, such as network bandwidth while connected to the virtual private network, web browser (for the web-based WSI viewer), monitor size, display resolution, central processing unit, and RAM; a minimal illustrative sketch of such a session record follows the survey questions below. Technical difficulties were also recorded, if any. After both signout sessions, an experience survey was distributed to the pathologists who participated in remote digital and on-site glass slide review.
Questions in the reader experience survey included:
1. How many years have you been practicing pathology?
2. How many years of experience do you have using digital pathology? (in any capacity)
3. Rate the digital pathology slide viewer.
4. Rate your satisfaction with the launching of slides from within the laboratory information system (CoPath).
5. Rate the quality of the digital slides.
6. Rate your satisfaction with the performance in navigating the digital slides.
7. How comfortable would you feel providing primary diagnosis using digital pathology, with retrieval of glass slides available upon request?
8. How comfortable would you feel providing primary diagnosis using digital pathology, without availability of glass slides?
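As referenced above, the hardware and network variables recorded for each remote reader session could be captured in a simple structured record. The following Python sketch is illustrative only; every field name and the example values are hypothetical, not the study’s actual data capture form.

```python
# A minimal sketch of a per-session record; field names are hypothetical.
from dataclasses import dataclass
from typing import Tuple

@dataclass
class RemoteSessionRecord:
    reader_id: str
    vpn_bandwidth_mbps: float        # measured while connected to the VPN
    browser: str                     # e.g., "Chrome" or "Internet Explorer 11"
    monitor_size_in: float           # diagonal, in inches
    resolution: Tuple[int, int]      # (width, height) in pixels
    cpu_ghz: float
    ram_gb: int
    technical_issues: str = ""       # free text, e.g., transient VPN disconnects

example = RemoteSessionRecord(
    reader_id="reader_01",
    vpn_bandwidth_mbps=94.0,
    browser="Chrome",
    monitor_size_in=27.0,
    resolution=(3840, 2160),
    cpu_ghz=3.2,
    ram_gb=8,
)
print(example)
```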
Concordance
A read is defined in this study as a diagnosis for a specimen part (i.e., for each case, there may be one or multiple parts, each comprising one or multiple slides). Data captured after each signout session included subspecialty, specimen type (i.e., biopsy, resection, in-house, consultation), part type description (i.e., organ site), number of slides per case, and ordered ancillary tests (e.g., hematoxylin & eosin stained recuts, immunohistochemical and special stains). Recorded diagnostic information included top-line diagnosis, margin status, lymphovascular and perineural invasion, pT and pN stage, and any changes in interpretation between remote digital and on-site glass slide signout sessions. Concordance metrics were evaluated based on paired reads, one read by WSI and one read by brightfield microscopy for the same case by the same pathologist. Frequencies of concordant and discordant diagnoses were calculated. The glass slide diagnosis that was ultimately reported was considered the reference (i.e., gold standard) diagnosis. The laboratory’s defined concordance thresholds were based on criteria used by other well-established studies examining WSI primary diagnosis using Food and Drug Administration cleared digital pathology systems [2, 3, 27, 28]:
- Diagnoses rendered by WSI and glass slides have a major discrepancy rate of 4% or less.
- The major discrepancy rate of diagnoses rendered on WSI, relative to the reference diagnosis, is 7% or less.
Major diagnostic discrepancies were defined as any finding identified by one modality and not identified on the other that would be clinically significant. Minor diagnostic discrepancies were defined as any finding identified by one modality and not identified in the other that would not impact clinical care. A pathologist who did not participate as a reader in this study adjudicated concordance of diagnoses.
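The concordance evaluation described above can be summarized in a short sketch. This is illustrative logic only, not the study’s actual analysis code; the demonstration uses the part-level read totals reported in the Results.

```python
# Illustrative sketch of the paired-read concordance evaluation
# (not the study's actual analysis code).
from collections import Counter

def evaluate_concordance(read_labels):
    """read_labels: one adjudicated label per paired read:
    'concordant', 'minor', or 'major'."""
    counts = Counter(read_labels)
    n = len(read_labels)
    major_rate = counts["major"] / n
    return {
        "n_reads": n,
        "major_discrepancy_rate": major_rate,
        "overall_concordance": counts["concordant"] / n,
        "meets_4pct_major_threshold": major_rate <= 0.04,
    }

# Demonstration with the totals reported in the Results:
# 254 paired reads, 0 major and 3 minor discrepancies.
labels = ["concordant"] * 251 + ["minor"] * 3
print(evaluate_concordance(labels))   # major rate 0.0; overall concordance ~0.988 (98.8%)
```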
Results
Department readiness for remote signout
The readiness survey was distributed to all clinical faculty and trainees (n = 142); there were 112 respondents.
Of the respondents, 96% (108/112) had a computer to perform remote work. The distribution of monitor sizes available on respondents’ computers was <14 in. (41.7%, 43/103); 15–20 in. (31.1%, 32/103); 21–25 in. (17.5%, 18/103); and >25 in. (9.7%, 10/103). Internet access at the remote site was available to 92.5% (99/107) of respondents. Median network bandwidth (e.g., download speed) was 94 megabits per second (range 3–835). Network download speeds were aggregated as follows: <20 (13%, 10/76); 20–49 (11%, 8/76); 50–99 (32%, 24/76); and ≥100 (45%, 34/76) megabits per second. Most (88%, 84/96) respondents had virtual private network connections configured with 2-factor authentication, and 77% (63/82) knew how to appropriately connect their remote device to their hospital clinical workstation through a remote desktop connection. All faculty pathologists included in this validation study had access to a remote workstation with virtual private network, 2-factor authentication, and remote desktop connection.
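For reference, the download-speed categories reported above follow a simple binning; the sketch below reproduces the bin edges, with hypothetical speeds for illustration (the actual survey responses are not reproduced here).

```python
# Binning of self-reported download speeds (Mbps) into the categories above.
def bin_download_speed(mbps: float) -> str:
    if mbps < 20:
        return "<20"
    elif mbps < 50:
        return "20-49"
    elif mbps < 100:
        return "50-99"
    return ">=100"

speeds = [3, 18, 35, 94, 120, 835]    # hypothetical examples
for s in speeds:
    print(s, "->", bin_download_speed(s))
```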
Pre-analytic data
For the purposes of whole slide scanner technical evaluation, a total of 2119 glass slides were included and scanned. Of the 2119 WSIs, the median scan area was 612 mm². The median scan time per slide at ×40 equivalent resolution (0.26 µm/pixel) was 90 s. The median file size was 1.3 gigabytes (GB).
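As a rough, back-of-the-envelope check of these figures (an assumption-laden sketch, not an analysis reported in the study): a 612 mm² scan at 0.26 µm/pixel corresponds to roughly 9 gigapixels which, assuming 24-bit color, a single image level, and no format overhead, implies an effective compression ratio on the order of 20:1 for a 1.3 GB file.

```python
# Back-of-the-envelope estimate only; assumes 24-bit color and ignores
# pyramid levels and file-format overhead.
scan_area_mm2 = 612
pixel_pitch_um = 0.26
file_size_gb = 1.3

pixels = (scan_area_mm2 * 1e6) / (pixel_pitch_um ** 2)   # ~9.1e9 pixels
raw_gb = pixels * 3 / 1e9                                 # ~27 GB at 3 bytes/pixel
print(f"~{pixels / 1e9:.1f} gigapixels, ~{raw_gb:.0f} GB raw, "
      f"~{raw_gb / file_size_gb:.0f}:1 effective compression")
```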
Quality assurance review of WSIs by technicians or the referee pathologist prior to initiation of primary digital signout revealed that, of the 2119 glass slides, 39 required rescanning due to barcode detection failures (n = 21), no tissue detected (n = 1), tissue detection failure (n = 9), no macro image focus (n = 1), and image quality errors (n = 7). The overall rescan rate was 1.84%. These rescanned slides were further analyzed by differentiating glass slides prepared in the MSK histology laboratory (n = 13/1573; 0.83%) from glass slides received in consultation (n = 26/546; 4.76%) (Table 1). Standard procedure for consult labeling includes affixing a secondary MSK Department consult label directly below the referring institution label. Initially, the majority of barcode detection failures occurred on consult glass slides scanned for the first two pathologists’ remote signout sessions. These errors were addressed by the digital pathology system vendor, who successfully deployed a feature within the system to decode only MSK laboratory-specific barcodes. After implementation of the software update, consult glass slide barcode detection failures decreased by 46% for all subsequent scanning of consult slides, to 2.59% (n = 4/154). After initial image quality control of scanned slides, no WSIs were requested to be rescanned by the pathologists during the reader sessions.
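The rescan rates quoted above follow directly from the stated counts; the short sketch below is a worked check, with the error-category names paraphrased for illustration.

```python
# Worked check of the rescan rates quoted above, from the stated counts.
rescan_counts = {
    "barcode_detection_failure": 21,
    "no_tissue_detected": 1,
    "tissue_detection_failure": 9,
    "no_macro_image_focus": 1,
    "image_quality": 7,
}
total_scanned = 2119
total_rescanned = sum(rescan_counts.values())                         # 39

print(f"overall rescan rate: {total_rescanned / total_scanned:.2%}")  # 1.84%
print(f"in-house slides:     {13 / 1573:.2%}")                        # 0.83%
print(f"consult slides:      {26 / 546:.2%}")                         # 4.76%
```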
Twelve subspecialized surgical pathologists (“readers”) reported on cases from nine different surgical pathology subspecialties, with one reader per subspecialty and two readers for breast and genitourinary cases. A total of 16 reader sessions were conducted; one reader participated in three signout sessions, and two readers participated in two signout sessions (i.e., 16 remote and 16 on-site glass slide reader sessions). Surgical pathology subspecialties included: breast, bone & soft tissue, dermatopathology, gastrointestinal, genitourinary, gynecologic, head & neck, neuropathology, and thoracic pathology. Table 2 shows the case, part, and slide distribution by reader. A total of 108 unique patient cases, 254 individual parts, and 1196 slides were included in the study. The readers completed 508 reads (i.e., diagnoses by part) in total: 254 reads by remote digital microscopy, and 254 reads using glass slides on premises. Patient case distribution by specimen type included 204 in-house parts (i.e., biopsies and resections performed at our institution) and 50 parts sent for consultation from other institutions. All slides from each case were included for digital and glass slide review. There was a mean of 11 slides per case (range, 1–49). Specimen types in this patient cohort included a standard distribution of cases as typically expected in a large tertiary care center with a primary focus on cancer (Table 3).
Technical and hardware specifications
Pathologists’ use of remote workstations ranged from consumer-grade laptop computers to higher-specification desktop computers with high-definition dual monitors. Two pathologists reported virtual private network errors (i.e., loss of connectivity) but were able to immediately reconnect without additional errors. Network bandwidth for readers in this study while connected to the virtual private network ranged from 20 to 849 megabits per second. Eleven of the 16 pathologist remote sessions used a single monitor ranging from 13.3 to 25 in. and from 1280 × 800 to 3000 × 2000 pixel resolution. The other five pathologist remote reader sessions had two monitors, where the monitor size and display resolutions ranged from 13.5 to 42 in. and from 1900 × 1080 to 3840 × 2160 pixel resolution. All monitors had 24-bit color depth. Central processing unit speeds ranged from 1.3 to 3.2 GHz. No difference in WSI latency was identified across remote workstations, although navigation was reportedly slower with lower network connectivity. Of the 16 reader sessions, 69% (11/16) of readers used the Google Chrome web browser, and 31% (5/16) used Internet Explorer 11.
Concordance
A total of 508 reads were performed using both whole slide images and glass slides, 254 reads each. Across all matched glass slide and whole slide image diagnoses, there was 100% major diagnostic concordance (254/254 reads). Three minor diagnostic discrepancies occurred, corresponding to an overall glass slide (at the CLIA-certified facility) to remote WSI diagnostic concordance rate of 98.8% (251/254 reads). Even readers with network speeds of less than 50 megabits per second had 100% major concordance between digital and glass diagnostic reads. Diagnostic discrepancies were defined as any finding identified by one modality and not identified by the other, classified as major if clinically significant. The three discrepant diagnoses are detailed in Table 4. One each occurred in the breast, thoracic, and head & neck subspecialties. Size measurements performed by digital annotation were 3% larger than the corresponding glass slide measurements. Concordance for the presence or absence of lymphovascular and/or perineural invasion was 96.3% (26/27) across the appropriate part-level diagnoses. The surgical margin status and pathology tumor and node stage (e.g., pT, pN) were 100% concordant. Upon review of the discrepant cases, the diagnoses based on glass slide reads were deemed to be correct. For the three minor discrepant reads, retrospective re-review of the whole slide images after conclusion of the study identified the pathologic findings not reported during remote digital review. Table 5 summarizes the concordance evaluation.
Reader experience survey
The reader experience survey distributed after completion of both digital and glass slide signout sessions produced ten responses (Fig. 3).
Of the ten survey respondents, the readers had practiced pathology for a median of 19 years (range 3–37). All readers had some experience using digital pathology, with a median of 5 years (range 3–10). On a five-point Likert scale (1—very poor, 5—very good), all users rated the WSI slide viewer 3 or higher, with a median rating of 4 (i.e., good). Reader satisfaction with launching whole slide images from the laboratory information system was rated ≥4 by all readers. Regarding the whole slide images, readers reported a median image quality rating of 5 (i.e., very good). Satisfaction with the performance in navigating the whole slide images varied among the readers: five responded 3 (i.e., neutral), three responded 4 (i.e., good), and two responded 5 (i.e., very good). Comfort level with signout using whole slide images for primary diagnosis with the availability of glass slides had a median rating of 5; 90% of readers responded ≥4 (i.e., comfortable), and only one reader responded 3 (i.e., neutral). When asked their level of comfort regarding primary diagnosis using digital pathology without the availability of glass slides, the median reader rating was 4 (i.e., comfortable): one reader rated 2 (i.e., uncomfortable), three readers 3 (i.e., neutral), five readers 4 (i.e., comfortable), and one reader 5 (i.e., very comfortable). Comments from pathologists included the need for a better input device (e.g., computer mouse) for slide navigation. Readers also perceived that, when using the digital ruler, they were being more precise with their measurements.
Discussion
This study presents a validation of a digital pathology system and operational workflow for remote primary diagnosis, including digitization of glass slides generated from formalin-fixed, paraffin-embedded and frozen tissue, and including hematoxylin & eosin stains, immunohistochemical stains, and special stains. All specimens were prospective clinical patient cases, first reviewed by the attending pathologists by digital means from a remote site; the glass slides were subsequently re-reviewed with a conventional brightfield microscope and verified electronically within the CLIA-licensed facility. The validation successfully demonstrated operational feasibility of supporting remote review and reporting of pathology specimens and verification of remote access performance and usability for remote primary diagnostic signout. Our findings are similar to the results of earlier studies reporting that WSI is non-inferior to glass slide review for rendering pathology diagnoses [1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25], and the current study represents the first in the United States to show adequate concordance for primary diagnosis using a digital pathology system in a non-CLIA certified remote setting.
The COVID-19 pandemic has fueled interest in digital pathology: the ability of pathologists to review and report pathology is crucial for continuity of patient care, and protecting healthcare personnel safety is key to ensuring uninterrupted, expert pathology practice. CLIA ’88 requires reporting of patient pathology to be conducted in a CLIA-certified clinical laboratory [42 U.S.C. 263a]. Hitherto, only two digital pathology systems have been cleared by the Food and Drug Administration for primary diagnosis in surgical pathology, although not for remote use. Food and Drug Administration clearance, or lack thereof, pertains to manufacturer intended use and marketing; however, it does not govern healthcare practice. Nevertheless, CLIA ’88 requirements prevent pathologists from routinely using these systems remotely from a non-CLIA certified site. As a result, digital pathology systems used for clinical purposes have been limited to use within CLIA-certified pathology laboratories. Due to the public health emergency, the Centers for Medicare & Medicaid Services issued a memorandum on March 26, 2020, applying enforcement discretion to allow pathologists to review and report WSIs remotely, temporarily decoupling the display device from the traditionally cleared end-to-end digital pathology system. The CMS memorandum states that laboratories choosing to utilize temporary testing sites (e.g., for remote review and reporting of glass slides/WSIs) may do so if certain criteria outlined in the memorandum are met [29]. On April 24, 2020, the Food and Drug Administration provided further guidance pertaining to the public health emergency, issuing nonbinding recommendations for remote digital pathology systems. These recommendations stated that the Food and Drug Administration would not object to marketing of non-510(k) cleared systems and to their use in remote settings. The document states the Food and Drug Administration would not intend to enforce compliance for premarket notification submissions for a whole slide imaging system where the digital pathology system does not create undue risk for patients or pathologists. The document further recommends validation of pathologist remote reviewing and reporting of patient pathology materials. The College of American Pathologists published validation guidelines in 2013 for primary diagnosis using digital pathology systems, and there is significant literature attesting to the equivalency of WSI to glass slides [30]. The College of American Pathologists also issued remote signout guidance during this public health emergency stating that pathology laboratories can review and report pathology specimens from remote non-CLIA certified facilities during the temporary CLIA waiver, with appropriate validation and conformance to the CMS waiver regulations [31, 32].
This large preclinical validation study substantiates a distinct paradigm emphasizing diagnostic concordance when remotely reviewing and reporting patient specimens using a digital pathology system, including non-510(k) cleared display devices, whole slide image viewer, and whole slide scanner. Equivalence, as defined in a previous study [3], includes all aspects of the clinical care setting pertaining to pathology. The College of American Pathologists validation guidelines for using whole slide imaging in the diagnostic setting state that each pathology laboratory planning on implementing a digital pathology system should perform its own validation studies for the respective intended clinical use. In keeping with these guidelines, our validation study encompassed surgical pathology slides at our institution, including frozen section tissue slides, special stains, and immunohistochemical studies. We also validated our complete workflow from the point of accessioning of cases in the laboratory information system to final reporting of patient pathology. This also included the whole slide image scanning operational workflow, trainee involvement, remote access by pathologists and launching of whole slide images from within the laboratory information system, ordering of ancillary studies through the laboratory information system, and diagnostic reporting of all clinically relevant pathology parameters. A Q-Probes study performed by the College of American Pathologists found a median self-reported discrepancy frequency of 5.1% for anatomic pathology, based on prospective secondary review of glass slides [33]. In addition, as previously published and accepted by the Food and Drug Administration in two other studies, the major discordance rate compared to a reference diagnosis was 4.9% for whole slide images and 4.6% using glass slides in one study [2]; and 3.6% for whole slide images and 3.2% for glass slides in another study [1]. In our study, the reference standard was designated as conventional light microscopy. The results from these two studies are congruent with the pre-set acceptance criteria of this validation study performed at our institution. In this study there were no major diagnostic discrepancies and three minor discrepancies, with a major concordance rate of 100% (254/254) and an overall concordance rate (including minor discrepancies) of 98.8% (251/254), based on reads evaluated using conventional microscopy as the reference diagnosis.
While this study demonstrates diagnostic equivalence for digital and glass slides, there remain limitations for certain situations that will require glass slide review with a microscope. For example, pathology cases requiring polarization of tissue (e.g., amyloid, oxalate crystals) remain a technical limitation for diagnosis using digital pathology. Rhoads et al. reported limitations in the interpretation of microorganisms with lower resolution scanning, for whole slide images reviewed at ×40 (0.25 µm/pixel), ×83 oil-immersion (0.17 µm/pixel), and ×100 oil-immersion (0.14 µm/pixel) [34]. In our validation study, two cases required special stains for microorganisms: one lung biopsy with non-necrotizing granulomas where GMS and mucicarmine stains showed cryptococcus, and another stomach biopsy, where the Helicobacter pylori immunohistochemical stain was negative.
Outside the United States, laboratories have undergone digital transformation of their anatomic pathology services and published successful experiences. Fraggetta et al. reported successful validation and adoption of digital pathology, and also described the importance of minimizing human intervention to streamline operations [35]. Vodovnik et al. have reported complete digital pathology validation of routine surgical pathology cases, autopsies, cytology, and frozen sections, including remote signout [36,37,38]. Reporting was primarily performed by a single senior staff pathologist. Digital images were similarly accessed through the laboratory information system and displayed on either 14 in. laptops or desktop computers with dual 23 in. monitors for remote and on-site digital reporting. Diagnostic concordance analysis was not performed for the cases reported remotely; however, a speed of 20 megabits per second was concluded to be an adequate network speed for remote signout. Additional literature supporting telepathology for diagnostic concordance has been limited to frozen sections or secondary consultations using whole slide or static images, robotic microscopy, or live streaming of digital images [39,40,41,42,43,44,45,46]. However, none of these studies from the United States include reporting from a non-CLIA certified location. The crisis-driven relaxation of regulatory requirements presents an unprecedented opportunity to collect data on remote feasibility and utility, and to show safe and efficacious reporting of pathology using validated digital pathology systems, including from remote sites. It also opens doors towards continuous and routine use of digital pathology for inter-institutional consultations and collaborations.
The College of American Pathologists provided further guidance such that, for remote signout, laboratories can use their own discretion for validation protocols to demonstrate whether digital pathology systems will perform equivalently to conventional microscopy for a given intended use. In addition, they recommend that laboratories can deem washout periods of any duration reasonable to abate recall bias [32]. The washout period in this study was shorter than the recommended two weeks because the study design required reasonable turnaround times in delivering final reports. Remote digital diagnoses were not electronically verified remotely; only a full draft diagnostic report was entered into the laboratory information system. On-site glass slide review was required prior to final release of reports, which limited the opportunity to introduce a lengthy washout period. Additionally, during the public health emergency, surgical pathology volumes were markedly decreased, suggesting there may be challenges in accruing typical clinical volumes of prospective cases for validation.
Importantly, pathologists participating as readers rated their experience of remotely reviewing and reporting pathology specimens positively. These included pathologists with a range of years in practice and experience using digital pathology. In addition, pathologists used hardware available at their respective non-CLIA certified location (e.g., home), subject to real-world variables such as internet service provider bandwidth. Remote reporting guidelines included ensuring all data were reviewed by securely connecting within the institutional firewall and that any data viewed outside of the primary CLIA-licensed facility for signout were protected from accidental or intentional unauthorized disclosure. The pathologist experience survey showed high satisfaction with remote review and reporting using a digital pathology system. Comments noted a need for enhanced whole slide image navigation (e.g., input devices), and ergonomic improvements such as dual monitor workstations were suggested. Of the 2119 WSIs, the median scan area (612 mm²) was almost three times the industry-standard area assumed by digital pathology hardware vendors (e.g., 15 × 15 mm; 225 mm²). Pathologists will require effective navigation to review each slide with efficiency similar to glass slides.
No differences in diagnostic concordance or remote digital signout experience were identified by pathologists’ demographics, digital pathology years of experience, network bandwidth, computer monitor, or display resolution of the hardware used, including Windows and Macintosh personal computers. In our study, we found no standard remote workstation configuration. Based on the pathologists’ remote hardware, minimum specifications were recommended for remote viewing and reporting of patient pathology in our standard operating procedure (Supplementary Table 2). However, similar to findings reported in a previous study [3], most pathologists subjectively mentioned that reviewing digital images took longer than reviewing glass slides by conventional microscopy. This is likely due to suboptimal input devices and navigation tools as well as overall experience and familiarity with digital signout, which merits additional studies and future improvements to the technology. Overall, this study, as part of our laboratory validation, proves the feasibility of remote signout for primary diagnosis and provides a milestone for our department and the digital pathology community.
With declaration of a public health emergency in the United States and subsequent enforcement discretions, pathologists may remotely review and report pathology specimens using a digital pathology system with appropriate validation. We report the first validation of a digital pathology system from a remote non-CLIA certified facility and find a major concordance rate of 100% comparing remote review and reporting of whole slide images to conventional microscopy. The validation included modifications to laboratory operational workflow to successfully support remote review and reporting of pathology specimens using digital pathology systems. As a result of this validation and provisional approval from our state Department of Health, pathologists at our institution have initiated digital/remote review and reporting of patient pathology and will continue to pursue this practice, as permissible by law, even after the public health emergency has concluded. Remote access was effectively utilized with sufficient performance for remote signout. Additional validation studies during these unprecedented times will enable aggregation of data and offer insights that may influence the existing regulatory environment.
References
Borowsky AD, Glassy EF, Wallace WD, Kallichanda NS, Behling CA, et al. Digital whole slide imaging compared with light microscopy for primary diagnosis in surgical pathology: a multicenter, double-blinded, randomized study of 2045 cases. Arch Pathol Lab Med. 2020. https://doi.org/10.5858/arpa.2019-0569-OA. Online ahead of print.
Mukhopadhyay S, Feldman MD, Abels E, Raheela Ashfaq, Beltaifa S, Cacciabeve NG, et al. Whole slide imaging versus microscopy for primary diagnosis in surgical pathology: a multicenter blinded randomized noninferiority study of 1992 cases (pivotal study). Am J Surg Pathol. 2018;42:39–52.
Hanna MG, Reuter VE, Hameed MR, Tan LK, Chiang S, Sigel C, et al. Whole slide imaging equivalency and efficiency study: experience at a large academic center. Mod Pathol. 2019;32:916–28.
Al-Janabi S, Huisman A, Nap M, Clarijs R, van Diest PJ. Whole slide images as a platform for initial diagnostics in histopathology in a medium sized routine laboratory. J Clin Pathol. 2012;65:1107–11.
Bauer TW, Schoenfield L, Slaw RJ, Yerian L, Sun Z, Henricks WH. Validation of whole slide imaging for primary diagnosis in surgical pathology. Arch Pathol Lab Med. 2013;137:518–24.
Brunelli M, Beccari S, Colombari R, Gobbo S, Giobelli L, Pellegrini A, et al. iPathology cockpit diagnostic station: validation according to College of American Pathologists Pathology and Laboratory Quality Center recommendation at the Hospital Trust and University of Verona. Diagn Pathol. 2014;9:S12.
Buck TP, Dilorio R, Havrilla L, O’Neill DG. Validation of a whole slide imaging system for primary diagnosis in surgical pathology: a community hospital experience. J Pathol Inform. 2014;1:43.
Campbell WS, Lele SM, West WW, Lazenby AJ, Smith LM, Hinrichs SH. Concordance between whole slide imaging and light microscopy for routine surgical pathology. Hum Pathol. 2012;43:1739–44.
Cheng CL, Azhar R, Sng SH, Chua YQ, Hwang JSG, Chin JPF, et al. Enabling digital pathology in the diagnostic setting: navigating through the implementation journey in an academic medical centre. J Clin Pathol. 2016;69:784–92.
Fonyad L, Krenac T, Nagy P, Zalatnai A, Csomor J, Sápi Z, et al. Validation of diagnostic accuracy using digital slides in routine histopathology. Diagn Pathol. 2012;7:35.
Gilbertson JR, Ho J, Anthony L, Jukic DM, Yagi Y, Parwani AV. Primary histologic diagnosis using automated whole slide imaging: a validation study. BMC Clin Pathol. 2006;6:4–19.
Goacher E, Randell R, Williams B, Treanor D. The diagnostic concordance of whole slide imaging and light microscopy. Arch Pathol Lab Med. 2017;141:151–61.
Houghton JP, Ervine AJ, Kenny SL, Kelly PJ, Napier SS, McCluggage WG, et al. Concordance between digital pathology and light microscopy in general surgical pathology: a pilot study of 100 cases. J Clin Pathol. 2014;67:1052–5.
Jukic DM, Drogowski LM, Martina J, Parwani AV. Clinical examination and validation of primary diagnosis in anatomic pathology using whole slide digital images. Arch Pathol Lab Med. 2011;135:372–8.
Snead DR, Tsang YW, Meskiri A, Kimani PK, Crossman R, Rajpoot NM, et al. Validation of digital pathology imaging for primary histopathological diagnosis. Histopathology. 2016;68:1063–72.
Tabata K, Mori I, Sasaki T, Itoh T, Shiraishi T, Yoshimi N, et al. Whole slide imaging at primary pathological diagnosis: validation of whole slide imaging-based primary pathological diagnosis at twelve Japanese academic institutes. Pathol Int. 2017;67:547–54.
Al-Janabi S, Huisman A, Jonges GN, ten Kate FJW, Goldschmeding R, van Diest PJ. Whole slide images for primary diagnostics of urinary system pathology: a feasibility study. J Ren Inj Prev. 2014;3:91–6.
Al-Janabi S, Huisman A, Nikkels PG, ten Kate FJ, van Diest PJ. Whole slide images for primary diagnostics of paediatric pathology specimens: a feasibility study. J Clin Pathol. 2013;66:218–23.
Al-Janabi S, Huisman A, Vink A, Leguit RJ, A Offerhaus GJ, Ten Kate FJ, et al. Whole slide images for primary diagnostics in dermatopathology: a feasibility study. J Clin Pathol. 2012;65:152–8.
Al-Janabi S, Huisman A, Vink A, Leguit RJ, A Offerhaus GJ, Ten Kate FJ, et al. Whole slide images for primary diagnostics of gastrointestinal tract pathology: a feasibility study. Hum Pathol. 2012;43:702–7.
Al-Janabi S, Huisman A, Willems SM, Van Diest PJ. Digital slide images for primary diagnostics in breast pathology: a feasibility study. Hum Pathol. 2012;43:2318–25.
Ordi J, Castillo P, Saco A, Del Pino M, Ordi O, Rodríguez-Carunchio L, et al. Validation of whole slide imaging in the primary diagnosis of gynaecological pathology in a university hospital. J Clin Pathol. 2015;68:33–9.
Reyes C, Ikpatt OF, Nadji M, Cote RJ. Intra-observer reproducibility of whole slide imaging for the primary diagnosis of breast needle biopsies. J Pathol Inform. 2014;5:5.
Stathonikos N, Nguyen TQ, Spoto CP, Verdaasdonk MAM, van Diest PJ. Being fully digital: perspective of a Dutch academic pathology laboratory. Histopathology. 2019;75:621–35.
Evans AJ, Salama ME, Henricks WH, Pantanowitz L. Implementation of whole slide imaging for clinical purposes: issues to consider from the perspective of early adopters. Arch Pathol Lab Med. 2017;141:944–59.
Clinical Laboratory Improvement Amendments of 1988 (CLIA) Title 42: The Public Health and Welfare. Subpart 2: Clinical Laboratories (42 U.S.C. 263a). https://www.govinfo.gov/content/pkg/USCODE-2011-title42/pdf/USCODE-2011-title42-chap6A-subchapII-partF-subpart2-sec263a.pdf. Accessed 22 Apr. 2020.
U.S. Food and Drug Administration. FDA news release: FDA allows marketing of first whole slide imaging system for digital pathology. https://www.fda.gov/drugs/informationondrugs/approveddrugs/ucm553358.htm. Accessed 28 Mar 2020.
U.S. Food and Drug Administration. Leica Biosystems Receives FDA Clearance for Aperio AT2 DX Digital Pathology System https://www.accessdata.fda.gov/cdrh_docs/pdf19/K190332.pdf. Accessed 28 Mar 2020.
Centers for Medicare & Medicaid Services Memorandum, Clinical Laboratory Improvement Amendments (CLIA) Laboratory Guidance During COVID-19 Public Health Emergency, March 26, 2020, https://www.cms.gov/medicareprovider-enrollment-and-certificationsurveycertificationgeninfopolicy-and-memos-states-and/clinical-laboratory-improvement-amendments-clia-laboratory-guidance-during-covid-19-public-health. Accessed 28 Mar 2020.
Pantanowitz L, Sinard JH, Henricks WH, Fatheree LA, Carter AB, Contis L, et al. Validating whole slide imaging for diagnostic purposes in pathology: guideline from the College of American Pathologists Pathology and Laboratory Quality Center. Arch Pathol Lab Med. 2013;137:1710–22.
College of American Pathologists. COVID-19—Remote Sign-Out Guidance. 2020. https://documents.cap.org/documents/COVID19-Remote-Sign-Out-Guidance-vFNL.pdf. Accessed 30 Mar 2020.
College of American Pathologists. Remote Sign-Out FAQs. 2020. https://documents.cap.org/documents/Remote-Sign-Out-FAQs-FINAL.pdf. Accessed 3 Apr 2020.
Raab SS, Nakhleh RE, Ruby SG. Patient safety in anatomic pathology: measuring discrepancy frequencies and causes. Arch Pathol Lab Med. 2005;129:459–66.
Rhoads DD, Habib-Bein NF, Hariri RS, et al. Comparison of the diagnostic utility of digital pathology systems for telemicrobiology. J Pathol Inform. 2016;7:10.
Fraggetta F, Garozzo S, Zannoni GF, Pantanowitz L, Rossi ED. Routine digital pathology workflow: the Catania experience. J Pathol Inform. 2017;8:51.
Vodovnik A. Distance reporting in digital pathology: a study on 950 cases. J Pathol Inform. 2015;6:18.
Vodovnik A, Aghdam MR, Espedal DG. Remote autopsy services: a feasibility study on nine cases. J Telemed Telecare. 2018;24:460–4.
Vodovnik A, Aghdam MRF. Complete routine remote digital pathology services. J Pathol Inform. 2018;9:36.
Pantanowitz L, Wiley CA, Demetris A, Lesniak A, Ahmed I, Cable W, et al. Experience with multimodality telepathology at the University of Pittsburgh Medical Center. J Pathol Inform. 2012;3:45.
Thrall MJ, Rivera AL, Takei H, Powell SZ. Validation of a novel robotic telepathology platform for neuropathology intraoperative touch preparations. J Pathol Inform. 2014;5:21.
Wilbur DC, Madi K, Colvin RB, Duncan LM, Faquin WC, Ferry JA, et al. Whole-slide imaging digital pathology as a platform for teleconsultation: a pilot study using paired subspecialist correlations. Arch Pathol Lab Med. 2009;133:1949–53.
Sirintrapun SJ, Rudomina D, Mazzella A, Feratovic R, Alago W, Siegelbaum R, et al. Robotic telecytology for remote cytologic evaluation without an on-site cytotechnologist or cytopathologist: a tale of implementation and review of constraints. J Pathol Inform. 2017;8:32.
Chong T, Palma-Diaz MF, Fisher C, Gui D, Ostrzega NL, Sempa G, et al. The California Telepathology Service: UCLA’s experience in deploying a regional digital pathology subspecialty consultation network. J Pathol Inform. 2019;10:31.
Chen J, Jiao Y, Lu C, Zhou J, Zhang Z, Zhou C. A nationwide telepathology consultation and quality control program in China: implementation and result analysis. Diagn Pathol. 2014;9(Suppl 1):S2.
Têtu B, Perron É, Louahlia S, Paré G, Trudel MC, Meyer J. The Eastern Québec Telepathology Network: a three-year experience of clinical diagnostic services. Diagn Pathol. 2014;9(Suppl 1):S1.
Mpunga T, Hedt-Gauthier BL, Tapela N, Nshimiyimana I, Muvugabigwi G, Pritchett N, et al. Implementation and validation of telepathology triage at cancer referral center in Rural Rwanda. J Glob Oncol. 2016;2:76–82.
Funding
This research was funded in part through the NIH/NCI Cancer Center Support Grant P30 CA008748.
Author information
Contributions
MGH, VER, OA, DK, DSK, MH: concept and design, data collection, data analysis/interpretation, manuscript drafting/revision. PS, JP, ES, MAF, PN, ML, CE: concept and design, data analysis/interpretation, manuscript revision. SJS, KB, JLS, EB, LKT, BX, TB, NA, LHH, LHE: data collection, manuscript revision.
Ethics declarations
Conflict of interest
MGH is a consultant for Paige.AI and an advisor to PathPresenter. PJS is a co-founder and equity holder of Paige.AI. DSK is a founder, consultant, and equity holder for Paige.AI. All other authors declare no conflicts of interest.
Additional information
Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Cite this article
Hanna, M.G., Reuter, V.E., Ardon, O. et al. Validation of a digital pathology system including remote review during the COVID-19 pandemic. Mod Pathol 33, 2115–2127 (2020). https://doi.org/10.1038/s41379-020-0601-5