I know in this photo I look like I’m moonlighting as an Instagram model, but I’m actually doing some important, practical science that would have been impossible a couple of years ago.

In 2020, Apple released the iPhone 12 Pro, which has a lidar sensor on the back. Lidar stands for 'light detection and ranging': the sensor fires out 576 laser pulses and times how long each one takes to bounce back off the scene, which lets the iPhone camera measure depth as well as colour. For Apple, the sensor is a way into augmented reality and a boost for the camera's autofocus. For me and my team, it's a great way to measure the erosion of cliffs.
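To give a sense of how an app taps that sensor, here is a minimal sketch using Apple's ARKit framework to read the lidar-derived depth map for each camera frame. The class and the logging are my own illustration, not the mapping app we used in the field.

```swift
import ARKit

// Minimal sketch: ask ARKit for the lidar-derived depth map of each frame.
final class DepthReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let config = ARWorldTrackingConfiguration()
        // Scene depth is only offered on lidar-equipped devices.
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
            config.frameSemantics.insert(.sceneDepth)
        }
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // depthMap is a buffer of 32-bit floats: metres from camera to surface.
        guard let depthMap = frame.sceneDepth?.depthMap else { return }
        let width = CVPixelBufferGetWidth(depthMap)
        let height = CVPixelBufferGetHeight(depthMap)
        print("Depth frame: \(width) x \(height) pixels")
    }
}
```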

I’m using the phone, along with a selfie stick and a 3D-mapping app, to measure 1,500 square metres of coastline in Roneklint, about an hour’s drive south of Copenhagen, as part of my geography PhD project at the University of Copenhagen. The ground here is a sandy glacial till, which erodes very quickly: a single storm can change the cliff surface, and erosion rates in this part of the world can reach one metre per year.
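To show how repeat scans turn into an erosion figure: surveying the same cliff face twice gives two grids of surface positions, and differencing them tells you how far the surface retreated. A minimal sketch with hypothetical numbers; the real processing chain is more involved.

```swift
/// Mean surface retreat (in metres) between two scans of the same cliff face.
/// `before` and `after` are hypothetical grids: each cell holds the distance
/// (m) from a fixed reference line back to the cliff surface.
func meanRetreat(before: [[Double]], after: [[Double]]) -> Double {
    var loss = 0.0
    var cells = 0
    for (rowBefore, rowAfter) in zip(before, after) {
        for (b, a) in zip(rowBefore, rowAfter) {
            loss += a - b        // positive = the surface moved back
            cells += 1
        }
    }
    return cells > 0 ? loss / Double(cells) : 0
}

// Example: a 2 x 3 patch of cliff that retreated about 0.4 m on average.
let may  = [[10.0, 10.2, 10.1], [10.3, 10.0, 10.2]]
let june = [[10.5, 10.6, 10.4], [10.7, 10.5, 10.6]]
print(meanRetreat(before: may, after: june))  // ≈ 0.42
```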

So, if you have farmland or property just behind the cliff face, you need to know how long it might take for the sea to reach it. At one metre of erosion per year, for instance, a field edge 30 metres back could be gone in 30 years. You also need to know what else might happen as climate change raises sea levels.

The iPhone is hugely useful for us — in the past we’d have had to use a big, expensive terrestrial laser scanner to get the same result — but the data we get from lidar are only one piece of the puzzle. My PhD is all about combining this information with geological data, as well as knowledge about wind and rainfall, to build a bigger picture.

The thing that excites me most about this kind of work is the longer-term potential that such techniques have for citizen science. In the future, rather than relying on field trips to a piece of coastline, we could ask local people, “Hey, do you have your iPhone? Could you collect some data, and then send them to us?”