
Interpretable socioeconomic status inference from aerial imagery through urban patterns

A preprint version of the article is available at arXiv.

Abstract

Urbanization is a great challenge for modern societies, promising better access to economic opportunities, but widening socioeconomic inequalities. Accurately tracking this process as it unfolds has been challenging for traditional data collection methods, but remote sensing information offers an alternative way to gather a more complete view of these societal changes. By feeding neural networks with satellite images, the socioeconomic information associated with the imaged area can be recovered. However, these models lack the ability to explain how the visual features contained in a sample trigger a given prediction. Here, we close this gap by predicting socioeconomic status across France from aerial images and interpreting class activation mappings in terms of urban topology. We show that trained models disregard the spatial correlations existing between urban class and socioeconomic status to derive their predictions. These results pave the way to building more interpretable models, which may help to better track and understand urbanization and its consequences.
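As an illustration of the interpretation step described above, the following is a minimal Grad-CAM sketch in PyTorch with a stand-in ResNet-18 backbone and a random tensor in place of an aerial tile. The published pipeline uses guided Grad-CAM on an EfficientNet model (refs. 17 and 23), so the backbone, layer choice and names below are illustrative assumptions rather than the authors' implementation.

```python
import torch
import torch.nn.functional as F
from torchvision import models

# Stand-in backbone (the paper uses EfficientNet); no pretrained weights are
# needed here because the input below is a random placeholder tile.
model = models.resnet18(weights=None).eval()

activations, gradients = {}, {}

def fwd_hook(module, inputs, output):
    # Cache the feature maps of the last convolutional block and register a
    # tensor hook to catch their gradient during the backward pass.
    activations["feat"] = output
    output.register_hook(lambda g: gradients.update(feat=g))

model.layer4.register_forward_hook(fwd_hook)

x = torch.randn(1, 3, 224, 224)          # placeholder for one aerial tile
scores = model(x)                        # class scores (e.g. low/high SES)
cls = scores.argmax(dim=1).item()
scores[0, cls].backward()                # gradient of the predicted class score

# Grad-CAM: weight each channel by its average gradient, combine and rectify.
w = gradients["feat"].mean(dim=(2, 3), keepdim=True)
cam = F.relu((w * activations["feat"]).sum(dim=1, keepdim=True))
cam = F.interpolate(cam, size=x.shape[2:], mode="bilinear", align_corners=False)
cam = (cam - cam.min()) / (cam.max() - cam.min() + 1e-8)   # normalize to [0, 1]
```

The resulting heatmap can then be intersected with land-use polygons to ask which urban classes the model attends to when assigning a tile to a socioeconomic class.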


Fig. 1: Sample of overlaid datasets (Paris).
Fig. 2: Observed performance of models trained to predict wealth in French cities: confusion matrices between predicted and observed SES classes.
Fig. 3: Maps of observed and predicted average income for Paris.
Fig. 4: Model interpretability studies using guided Grad-CAM.
Fig. 5: Correlations between urban topology and SES in the city of Paris.
Fig. 6: Heatmap representation of low/high SES coactivation and co-appearance gain values for the city of Paris.

Data availability

In this paper we built on the combination of three publicly available datasets. The first, issued by the National Geographical Information Institute (IGN), contains aerial images of French municipalities20. The second, provided in 2019 by the French National Institute of Statistics and Economic Studies (INSEE)21, is a high-resolution socioeconomic map. The third, shared by the European Environment Agency through the 2012 European Union Urban Atlas project, covers EU28 and EFTA countries22. All datasets are public and openly accessible online at https://geoservices.ign.fr/documentation/diffusion/telechargement-donnees-libres.html#ortho-hr-sous-licence-ouverte, https://www.insee.fr/fr/statistiques/4176290?sommaire=4176305 and https://land.copernicus.eu/local/urban-atlas/urban-atlas-2012. The individual files downloaded from the aerial imagery dataset are listed on GitHub. Figures depicting raw data are shown in Figs. 1, 3 and 4 and Supplementary Fig. 1. All figures and tables are provided as source data at https://doi.org/10.6084/m9.figshare.12595067.v2.
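The released code documents the exact pipeline; the sketch below only illustrates how such datasets can be combined with GeoPandas. The file names and the class_2012 column are hypothetical placeholders, not the actual files from the repositories above.

```python
import geopandas as gpd

# INSEE gridded income data (ref. 21) and Urban Atlas 2012 land-use polygons
# (ref. 22), both reprojected to Lambert-93 (EPSG:2154) before joining.
income = gpd.read_file("insee_filosofi_200m.gpkg").to_crs(epsg=2154)
landuse = gpd.read_file("urban_atlas_2012_paris.gpkg").to_crs(epsg=2154)

# Attach the intersecting land-use class to each income cell, so that an aerial
# tile cropped around a cell carries both a SES label and an urban class.
cells = gpd.sjoin(income, landuse[["class_2012", "geometry"]],
                  how="left", predicate="intersects")
```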

Code availability

Code developed for the research has been made freely available for non-commercial use under an MIT Licence and shared through an open repository32 at https://doi.org/10.5281/zenodo.3906063. For any further questions please contact the corresponding authors.

References

  1. World Urbanization Prospects: The 2018 Revision (United Nations, 2019).

  2. Glaeser, E. L. & Joshi-Ghani, A. The Urban Imperative: Towards Competitive Cities (Oxford Univ. Press, 2015).

  3. Gourevitch, M. N., Athens, J. K., Levine, S. E., Kleiman, N. & Thorpe, L. E. City-level measures of health, health determinants, and equity to foster population health improvement: the city health dashboard. Am. J. Public Health 109, 585–592 (2019).


  4. Revenus, Pauvreté et Niveau de Vie en 2014 (INSEE, 2014).

  5. Seto, K. C. & Kaufmann, R. K. Modeling the drivers of urban land use change in the Pearl River Delta, China: integrating remote sensing with socioeconomic data. Land Econ. 79, 106–121 (2003).


  6. Stead, D. Relationships between land use, socioeconomic factors and travel patterns in Britain. Environ. Planning B Planning Des. 28, 499–528 (2001).


  7. Mirmoghtadaee, M. The relationship between land use, socio-economic characteristics of inhabitants and travel demand in new towns—a case study of Hashtgerd New Town (Iran). Int. J. Urban Sustain. Dev. 4, 39–62 (2012).


  8. Kinzig, A. P., Warren, P., Martin, C., Hope, D. & Katti, M. The effects of human socioeconomic status and cultural characteristics on urban patterns of biodiversity. Ecol. Soc. 10, 23 (2005).


  9. Blumenstock, J. E. Fighting poverty with data. Science 353, 753–754 (2016).


  10. Llorente, A., Garcia-Herranz, M., Cebrian, M. & Moro, E. Social media fingerprints of unemployment. PLoS ONE 10, 1–13 (2015).


  11. Dong, L., Ratti, C. & Zheng, S. Predicting neighborhoods’ socioeconomic attributes using restaurant data. Proc. Natl Acad. Sci. USA (2019); https://doi.org/10.1073/pnas.1903064116

  12. Suel, E., Polak, J. W., Bennett, J. E. & Ezzati, M. Measuring social, environmental and health inequalities using deep learning and street imagery. Sci. Rep. 9, 6229 (2019).


  13. Jean, N. et al. Combining satellite imagery and machine learning to predict poverty. Science 353, 790–794 (2016).


  14. Gebru, T. et al. Using deep learning and Google Street View to estimate the demographic makeup of neighborhoods across the United States. Proc. Natl Acad. Sci. USA 114, 13108–13113 (2017).


  15. Ayush, K., Uzkent, B., Burke, M., Lobell, D. & Ermon, S. Generating interpretable poverty maps using object detection in satellite images. In Proc. Twenty-Ninth Int. Joint Conf. Artificial Intelligence 4410–4416 (International Joint Conferences on Artificial Intelligence Organization, 2020).

  16. Sheehan, E. et al. Predicting economic development using geolocated Wikipedia articles. In Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining (ACM, 2019); https://doi.org/10.1145/3292500.3330784

  17. Selvaraju, R. R. et al. Grad-CAM: visual explanations from deep networks via gradient-based localization. In Proceedings of the IEEE International Conference on Computer Vision (ICCV) 618–626 (IEEE, 2017); https://doi.org/10.1109/ICCV.2017.74

  18. Iizuka, T., Fukasawa, M. & Kameyama, M. Deep-learning-based imaging-classification identified cingulate island sign in dementia with Lewy bodies. Sci. Rep. 9, 8944 (2019).


  19. Tang, Z. et al. Interpretable classification of Alzheimer's disease pathologies with a convolutional neural network pipeline. Nat. Commun. 10, 2173 (2019).


  20. Données ORTHO HR (IGN, accessed 1 March 2020); https://geoservices.ign.fr/documentation/diffusion/telechargement-donnees-libres.html#ortho-hr-sous-licence-ouverte

  21. Données carroyées (INSEE, accessed 1 March 2020); https://www.insee.fr/fr/statistiques/4176290?sommaire=4176305

  22. Urban Atlas 2012 (Copernicus Land Monitoring Service, accessed 1 March 2020); https://land.copernicus.eu/local/urban-atlas/urban-atlas-2012

  23. Tan, M. & Le, Q. EfficientNet: rethinking model scaling for convolutional neural networks. In Proceedings of the 36th International Conference on Machine Learning Vol. 97, 6105–6114 (PMLR, 2019).

  24. Hosny, A. et al. Deep learning for lung cancer prognostication: a retrospective multi-cohort radiomics study. PLoS Med. 15, 1–25 (2018).


  25. Miao, Z. et al. Insights and approaches using deep learning to classify wildlife. Sci. Rep. 9, 8137 (2019).


  26. Toda, Y. & Okura, F. How convolutional neural networks diagnose plant disease. Plant Phenomics (2019); https://doi.org/10.1155/2019/9237136

  27. Shanahan, D. F., Lin, B., Gaston, K., Bush, R. & Fuller, R. Socio-economic inequalities in access to nature on public and private lands: a case study from Brisbane, Australia. Landscape Urban Planning 130, 14–23 (2014).


  28. Wilkerson, M. L. et al. The role of socio-economic factors in planning and managing urban ecosystem services. Ecosyst. Services 31, 102–110 (2018).


  29. You, H. Characterizing the inequalities in urban public green space provision in Shenzhen, China. Habitat Int. 56, 176–180 (2016).


  30. Head, A., Manguin, M., Tran, N. & Blumenstock, J. E. Can human development be measured with satellite imagery? In Proc. Ninth International Conference on Information and Communication Technologies and Development, ICTD ’17 (Association for Computing Machinery, 2017); https://doi.org/10.1145/3136560.3136576

  31. Audebert, N., Le Saux, B. & Lefevre, S. Joint learning from Earth observation and OpenStreetMap data to get faster better semantic maps. In IEEE Conference on Computer Vision and Pattern Recognition (CVPR) Workshops 1552–1560 (IEEE, 2017); https://doi.org/10.1109/CVPRW.2017.199

  32. Abitbol, J. L. Coding repository; https://github.com/jaklevab/SESEfficientCAM/.


Acknowledgements

This work was partially funded by the SoSweet ANR project (ANR-15-CE38-0011), the MOTIf Stic-AmSud project (18-STIC-07) and the ACADEMICS project financed by IDEX LYON. M.K. was supported by the ANR project DataRedux (ANR-19-CE46-0008) and the H2020 project SoBigData++ (871042).

Author information


Contributions

J.L.A. and M.K. designed the research. J.L.A. built the combined dataset and implemented analysis of the results. J.L.A. and M.K. wrote the final manuscript.

Corresponding authors

Correspondence to Jacob Levy Abitbol or Márton Karsai.

Ethics declarations

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Extended data

Extended Data Fig. 1 Correlations between urban topology and socioeconomic status in the city of Marseille.

a, Mean model activation rate per urban class, with error bars denoting the 95% confidence interval, for samples predicted as low SES (blue) or high SES (red) by the model. The yellow line on the left plot indicates the random diffusion value (H0). b, Estimated probability of an urban polygon belonging to the bottom or top quintile of the income distribution, with error bars denoting the 95% confidence interval. Sample sizes are provided in Supplementary Tables 2 and 3.
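For reference, the per-class statistic reported in panel a can be sketched as below. This is an assumed reconstruction with illustrative column names and values and a normal-approximation confidence interval, not the released analysis code.

```python
import pandas as pd

# One row per (tile, urban polygon) pair: the urban class of the polygon and the
# share of activation mass falling inside it (values here are illustrative).
df = pd.DataFrame({
    "urban_class": ["dense_urban", "dense_urban", "green_area", "green_area"],
    "activation":  [0.72, 0.64, 0.18, 0.25],
})

# Mean activation rate per urban class with a 95% confidence interval.
stats = df.groupby("urban_class")["activation"].agg(["mean", "sem", "count"])
stats["ci95"] = 1.96 * stats["sem"]   # normal-approximation 95% CI half-width
print(stats[["mean", "ci95", "count"]])
```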

Supplementary information

Supplementary Information

Supplementary Figs. 1–13 and Tables 1–3.

Reporting Summary


About this article


Cite this article

Abitbol, J.L., Karsai, M. Interpretable socioeconomic status inference from aerial imagery through urban patterns. Nat Mach Intell 2, 684–692 (2020). https://doi.org/10.1038/s42256-020-00243-5

