The process of inferring causal effects is at the core of both health research and the practice of medicine. In this era of evidence-based practice and policy, clinical care recommendations and policies would ideally be supported by rigorous research, which might also be used to inform individual people’s choices.1,2 While the randomized controlled trial may be considered the gold standard to determine causation,3 many important clinical and policy questions are infeasible and/or unethical to evaluate in a clinical trial. Examples of such causal questions in pediatrics include: does current routine well-child care cause good pediatric health in the long run?4 Does increased screen time cause pediatric obesity?5 Does e-cigarette use cause children, who would not otherwise have done so, to become tobacco smokers?6,7 The translation of research into practice and policy recommendations is complicated by the fact that even findings from well-conducted research can be explained by factors other than causation (e.g., chance, or bias owing to errors in data collection or statistical analysis).
Therefore, many stakeholders in medicine and health care would stand to benefit from improved research design and conduct. This means both improving the estimation of causal effects, and also acknowledging the existing study limitations that prevent causal inference (which is an ideal rather than a standard that can be guaranteed by a given method or study design8,9,10). The growing prominence of causal inference in modern health research has also been accompanied by (and accelerated by) the abundance and increased availability of data.11,12 Large data resources and advanced methods together hold promise to advance our understanding of causation in health—and therefore to improve the evidence base for clinical practice. However, uptake of these advanced methods has been relatively slow in applied research to date, in part due to daunting terminology and the technical skills often required to choose among and implement these techniques. It is this barrier that Williams and colleagues address with their contribution in this edition of Pediatric Research.13 The authors present a commendably clear introduction to causal diagrams that should become an important resource for researchers and practitioners in pediatrics, and in other fields of medicine as well.
Sound research must be based on a deep understanding of the content being analyzed; even the most sophisticated analytical plan will produce meaningless results if developed and conducted in the absence of subject matter knowledge. Williams and colleagues demonstrate how causal diagrams may be used by pediatric researchers and consumers of pediatric research to map out the subject matter of a given study, and to assess the appropriateness of analytical choices and the presence of biases. The authors ground these methods in real-world pediatric subject matter, including screen time and obesity, obstetric and neonatal care, and breastfeeding’s effects on pediatric cognitive development. The clearest lesson from this paper is exemplified by the hypothetical study on the effects of antenatal steroids on bronchopulmonary dysplasia (BPD). The authors use causal diagrams to show that some common control variables (e.g., disease severity and mechanical ventilation) are not confounders and actually introduce over-adjustment bias if adjusted for.13,14 Decisions about which variables should and should not be adjusted for must be made on a case-by-case basis, but the overall message is clear: researchers must be cautious in designating variables as confounders, and causal diagrams are very useful in this process.15,16,17,18 The authors demonstrate this clearly by showing that although mechanical ventilation should not be controlled for in an analysis estimating the association between antenatal steroids and BPD, receipt of antenatal steroids should be controlled for in an analysis estimating the association between mechanical ventilation and BPD.
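The over-adjustment bias described above can be illustrated with a small simulation. The sketch below is not the authors’ analysis: it assumes a hypothetical linear data-generating process with invented effect sizes, in which antenatal steroids (A) reduce BPD severity (Y) both directly and by reducing the need for mechanical ventilation (M). Adjusting for the mediator M then blocks the indirect pathway and understates the total effect of A.

```python
import numpy as np

# Hypothetical linear structural model mirroring the DAG discussed above:
# A (antenatal steroids) -> M (mechanical ventilation) -> Y (BPD severity),
# plus a direct path A -> Y. All effect sizes are invented for illustration.
rng = np.random.default_rng(0)
n = 200_000
A = rng.binomial(1, 0.5, n).astype(float)      # exposure
M = -0.8 * A + rng.normal(0, 1, n)             # mediator: ventilation need
Y = -0.3 * A + 1.0 * M + rng.normal(0, 1, n)   # outcome: BPD severity
# True total effect of A on Y: direct (-0.3) + indirect (-0.8 * 1.0) = -1.1

def ols(y, *covs):
    """OLS coefficients for y ~ intercept + covariates."""
    X = np.column_stack([np.ones(len(y)), *covs])
    return np.linalg.lstsq(X, y, rcond=None)[0]

total_effect = ols(Y, A)[1]        # unadjusted: recovers the total effect
adjusted_effect = ols(Y, A, M)[1]  # adjusting for the mediator M leaves only
                                   # the direct path: over-adjustment bias
print(round(total_effect, 2), round(adjusted_effect, 2))
```

Under this data-generating process the unadjusted model recovers the total effect (about -1.1), while the mediator-adjusted model recovers only the direct effect (about -0.3) — exactly the distinction the causal diagram makes visible before any data are analyzed.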
Distilling real-world clinical content and decision-making processes into simple diagrams is a challenging task. For example, the association between screen time, physical activity, and obesity is a complex one that unfolds over long periods of time and involves causal feedback loops;19,20 the causation of preterm birth involves many factors, including some that remain unknown.21,22,23 Causal diagrams cannot overcome such limitations in our content knowledge or data granularity.23,24,25,26 In light of this, the authors are appropriately cautious to note that the causal diagrams presented are simplified versions of more complex real-world associations. Nonetheless, this type of explicit communication about the interrelationships between the variables in a given study (or equivalently, its “causal structure”) is an essential first step in conducting valid research and preventing erroneous analytical choices.
In the end, the question should drive the research approach, and no technique is certain to eliminate bias or ensure causal inference. All research is vulnerable to bias caused by data and analysis concerns. Williams et al. have shown how one specific tool—causal diagrams—can be used to formulate sound causal questions, design a corresponding analysis plan, and estimate associations that are more causal in nature. Because causal diagrams have great potential to improve research and accelerate translation of valid research into practice, we encourage pediatric researchers and clinicians to take the opportunity presented by this tutorial to become familiar with causal diagrams. We recommend not only reading the article, but marking up the margins by creating causal diagrams that correspond to the content from this article, and the content from your own research and clinical practice.
References
Evidence-Based Medicine Working Group. Evidence-based medicine. A new approach to teaching the practice of medicine. JAMA 268, 2420–2425 (1992).
Wharam, J. F. & Daniels, N. Toward evidence-based policy making and standardized assessment of health policy reform. JAMA 298, 676–679 (2007).
Bothwell, L. E., Greene, J. A., Podolsky, S. H. & Jones, D. S. Assessing the gold standard-lessons from the history of RCTs. New Engl. J. Med. 374, 2175–2181 (2016).
Coker, T. R., Windon, A., Moreno, C., Schuster, M. A. & Chung, P. J. Well-child care clinical practice redesign for young children: a systematic review of strategies and tools. Pediatrics 131(Suppl 1), S5–S25 (2013).
Robinson, T. N. et al. Screen media exposure and obesity in children and adolescents. Pediatrics 140(Suppl 2), S97–S101 (2017).
Bold, K. W. et al. Trajectories of e-cigarette and conventional cigarette use among youth. Pediatrics 141, e20171832 (2018).
Soneji, S. et al. Association between initial use of e-cigarettes and subsequent cigarette smoking among adolescents and young adults: a systematic review and meta-analysis. JAMA Pediatr. 171, 788–797 (2017).
Vandenbroucke, J. P., Broadbent, A. & Pearce, N. Causality and causal inference in epidemiology: the need for a pluralistic approach. Int. J. Epidemiol. 45, 1776–1786 (2016).
Broadbent, A., Vandenbroucke, J. P. & Pearce, N. Response: formalism or pluralism? A reply to commentaries on ‘Causality and causal inference in epidemiology’. Int. J. Epidemiol. 45, 1841–1851 (2016).
Snowden, J. M., Tilden, E. L. & Odden, M. C. Formulating and answering high-impact causal questions in physiologic childbirth science: concepts and assumptions. J. Midwifery Womens Health in press (2018).
Bareinboim, E. & Pearl, J. Causal inference and the data-fusion problem. Proc. Natl Acad. Sci. USA 113, 7345–7352 (2016).
Shiffrin, R. M. Drawing causal inference from Big Data. Proc. Natl Acad. Sci. USA 113, 7308–7309 (2016).
Williams, T. C., Bach, C. C., Matthiesen, N. B., Henriksen, T. B. & Gagliardi, L. Directed acyclic graphs: a tool for causal studies in pediatrics. Pediatr. Res. (2018).
Schisterman, E. F., Cole, S. R. & Platt, R. W. Overadjustment bias and unnecessary adjustment in epidemiologic studies. Epidemiology 20, 488–495 (2009).
Hernandez-Diaz, S., Schisterman, E. F. & Hernan, M. A. The birth weight “paradox” uncovered? Am. J. Epidemiol. 164, 1115–1120 (2006).
Wilcox, A. J., Weinberg, C. R. & Basso, O. On the pitfalls of adjusting for gestational age at birth. Am. J. Epidemiol. 174, 1062–1068 (2011).
Reid, C. E., Snowden, J. M., Kontgis, C. & Tager, I. B. The role of ambient ozone in epidemiologic studies of heat-related mortality. Environ. Health Perspectives 120, 1627–1630 (2012).
Buckley, J. P., Samet, J. M. & Richardson, D. B. Commentary: does air pollution confound studies of temperature? Epidemiology 25, 242–245 (2014).
Mitchell, J. A., Rodriguez, D., Schmitz, K. H. & Audrain-McGovern, J. Greater screen time is associated with adolescent obesity: a longitudinal study of the BMI distribution from Ages 14 to 18. Obesity 21, 572–575 (2013).
Reid Chassiakos, Y. L. et al. Children and adolescents and digital media. Pediatrics 138, e20162593 (2016).
Goldenberg, R. L., Culhane, J. F., Iams, J. D. & Romero, R. Epidemiology and causes of preterm birth. Lancet 371, 75–84 (2008).
Goldenberg, R. L. et al. The preterm birth syndrome: issues to consider in creating a classification system. Am. J. Obstet. Gynecol. 206, 113–118 (2012).
Snowden, J. M. & Basso, O. Causal inference in studies of preterm babies: a simulation study. BJOG 125, 686–692 (2018).
Krieger, N. & Davey Smith, G. Response: FACEing reality: productive tensions between our epidemiological questions, methods and mission. Int. J. Epidemiol. 45, 1852–1865 (2016).
Blakely, T., Lynch, J. & Bentley, R. Commentary: DAGs and the restricted potential outcomes approach are tools, not theories of causation. Int. J. Epidemiol. 45, 1835–1837 (2016).
Krieger, N. & Davey Smith, G. The tale wagged by the DAG: broadening the scope of causal inference and explanation for epidemiology. Int. J. Epidemiol. 45, 1787–1808 (2016).
Competing interests: The authors declare no competing interests.
Commentary on “Directed Acyclic Graphs: a Tool for Causal Studies in Pediatrics,” by Williams et al., forthcoming in Pediatric Research
Snowden, J.M., Klebanoff, M.A. Applying causal diagrams in pediatrics to improve research, communication, and practice. Pediatr Res 84, 485–486 (2018). https://doi.org/10.1038/s41390-018-0109-6