The computers that scientists use are growing ever more powerful. But in many parts of the world, reliable electricity and Internet access, much less supercomputing power, remain elusive. There are things that researchers can do, however. Some scientists with experience in resource-limited areas share their tips.

Compute on the cheap

State-of-the-art technology is nice to have, but its cost often makes it impractical. To maximize return on investment, computer scientist Nicolás Wolovick, who leads the general-purpose computing on graphics processing units group at the National University of Córdoba in Argentina, and his colleagues buy what they call ‘post-mortem’ computer parts that are a generation or two out of date, and assemble them into something greater.

In 2017, Wolovick’s team built Eulogia, a supercomputing cluster assembled from recent-model Intel Xeon Phi processors. Eulogia has a peak speed of roughly 40 teraflops (40 trillion floating-point operations per second). That’s just a fraction of the performance of the lowest-ranked member of the world’s current top 500 supercomputers — Internet Service A in China, which as of November 2018 had a peak performance of 1,817.6 teraflops — but it’s good enough for many research tasks, Wolovick says. For example, one team at Córdoba used the system to model the rotation of the dark-matter haloes that astronomers think surround galaxies and other cosmic structures.
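
For a rough sense of that gap, the two peak figures quoted above can be compared directly; the short Python snippet below is only a back-of-the-envelope check using the numbers in this article.

```python
# Back-of-the-envelope comparison of the peak figures quoted above
# (both in teraflops; values taken from this article, not remeasured).
eulogia_peak = 40.0            # Eulogia, National University of Córdoba
top500_lowest_peak = 1_817.6   # lowest-ranked top-500 system, November 2018

ratio = eulogia_peak / top500_lowest_peak
print(f"Eulogia reaches about {ratio:.1%} of that system's peak")
# prints: Eulogia reaches about 2.2% of that system's peak
```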

Pool your resources

When money is tight, pooling resources is key. That’s true for computers — Wolovick, for instance, has asked colleagues to eschew personal desktop computers in favour of a joint computing cluster. “Most of the time [those desktop computers] would be idle,” he says. It is also true for scientific instruments. In Uruguay, scientists enjoy the benefit of a national supercomputing centre as well as reliable electricity and Internet access, but scientific equipment — such as high-throughput genome sequencers, which produce large quantities of genetic data that require powerful computers for analysis — is in short supply, says Sergio Nesmachnow, director of the Uruguayan Scientific High Performance Computing Initiative at the University of the Republic in Montevideo. “The typical funding here for scientific equipment is about US$10,000,” Nesmachnow says. “With this kind of money, it is impossible to buy sophisticated equipment.” Maintenance costs, service contracts and other expenses push some instruments even further out of reach.

As a result, Uruguayan research groups often pool their money to buy new equipment, repair damaged instruments and pay technicians, Nesmachnow says. “Each group contributes according to their capabilities to make the most of the scarce resources we have.”

Go mobile

Computers are expensive, and Internet connectivity can be spotty in some countries. So, rather than buying laptops to work in the field, cybersecurity researcher Robert Rosenthal Shoniwa, head of information security and assurance at the Harare Institute of Technology in Zimbabwe, suggests using mobile phones instead.

Although broadband access can be patchy, Rosenthal Shoniwa says, “basically every person has a mobile phone”. Using a phone, even researchers in remote locations can upload their data to the Internet. “You do not have to buy a whole laptop; it’s cheaper to rely on a mobile platform.”

For many tasks, mobile phones are sufficiently powerful to process data locally, without the need to transfer it to more powerful, remote computers. Rosenthal Shoniwa’s student Blessing Sibanda created an Android app that analyses photos of tomato leaves to help farmers determine whether a plant is healthy or diseased. “All the processing was done on the phone and no data connection was used in the process,” Rosenthal Shoniwa says.
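
The details of Sibanda’s app are not spelled out here, but the general approach of running a pre-trained image classifier entirely on the handset might look roughly like the sketch below. The TensorFlow Lite model file, input size and labels are hypothetical placeholders; the point is that every step runs locally, with no network calls.

```python
# Minimal sketch of on-device image classification with TensorFlow Lite.
# The model file, input size and labels are placeholders, not the actual
# app's assets; the point is that inference needs no data connection.
import numpy as np
from PIL import Image
from tflite_runtime.interpreter import Interpreter  # or tf.lite.Interpreter

LABELS = ["healthy", "diseased"]  # hypothetical two-class model

interpreter = Interpreter(model_path="leaf_model.tflite")
interpreter.allocate_tensors()
input_detail = interpreter.get_input_details()[0]
output_detail = interpreter.get_output_details()[0]

# Load and preprocess a photo of a tomato leaf taken with the phone camera.
img = Image.open("leaf.jpg").convert("RGB").resize((224, 224))
x = np.asarray(img, dtype=np.float32)[np.newaxis, ...] / 255.0

interpreter.set_tensor(input_detail["index"], x)
interpreter.invoke()                    # runs entirely on the handset
scores = interpreter.get_tensor(output_detail["index"])[0]

print(LABELS[int(np.argmax(scores))])   # e.g. "healthy"
```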

Connect creatively

Like their colleagues around the world, students in South Africa are experts at finding and exploiting good public Wi-Fi hotspots. But on the university campus, “where the Wi-Fi is rubbish”, many students also carry Ethernet cables, “plugging into what Ethernet ports they can find”, says Anelda van der Walt, founder of electronic research training and consultancy firm Talarify in Kleinmond, South Africa.

But there’s more than one way to build a network, she notes. At one workshop in the central African nation of Gabon, van der Walt says, researchers bought routers and SIM cards to create their own hotspots from their mobile phones. “That can obviously prove quite costly,” she says.

Researchers can also ship data physically on USB keys or external hard drives. But beware: mailed storage devices can get lost, damaged, stuck in customs or simply contain the wrong files. Still, “sometimes there’s no other way”, van der Walt says.
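
One common safeguard, offered here only as an illustrative sketch rather than a tip from van der Walt, is to send a checksum manifest separately (by e-mail, say) so that the recipient can check that the files on the drive arrived intact and are the right ones. The drive path below is invented.

```python
# Minimal sketch: write a SHA-256 manifest for files on a drive before
# mailing it, so the recipient can verify nothing is missing or corrupted.
# The drive path is a made-up example.
import hashlib
from pathlib import Path

DRIVE = Path("/media/usb-drive")

def sha256(path: Path) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

with open("manifest.txt", "w") as out:
    for f in sorted(DRIVE.rglob("*")):
        if f.is_file():
            out.write(f"{sha256(f)}  {f.relative_to(DRIVE)}\n")
# Send manifest.txt separately; the recipient recomputes the hashes
# and compares them line by line.
```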

Find power

Limited computational resources mean that it can take days to process data. That can be problematic if electricity or Internet access goes down during that time, which is all too common in some parts of the world. If a blackout happens in the middle of processing data, “you just have to start over and maybe lose a day or two”, says Nyasha Thusabantu, a big-data researcher at the Harare Institute of Technology.

A back-up generator or uninterruptible power supply can help. So, too, can solar panels, says Rosenthal Shoniwa, who bought some panels of his own so that he could work at home even during blackouts. “Zimbabwe has lots of sunlight most of the year,” he says. “Most people locally have adopted solar systems.”

Or researchers could simply watch the local news. Zimbabwe sometimes schedules power outages in advance, Thusabantu notes, allowing researchers to plan their computational work accordingly.

Go open source

Scientists in the United States and Western Europe can usually count on reliable Internet access, but that’s rarely the case elsewhere. “In the developed world, the always-on paradigm for Internet is a very privileged perspective,” says Jonah Duckles, former director of membership and technology for The Carpentries in San Francisco, California, which runs scientific computing workshops around the world. “In a world where most major software products — Adobe, Microsoft, Google — require either an active Internet connection or periodic phone-homes, this broken-by-design approach to modern software tools increases the digital divide in areas with limited Internet.”

Open-source tools, he says, can fill this gap. Whereas many closed-source programs stop working entirely if they cannot connect with corporate servers, open-source tools generally keep working without an Internet link. With open-source programming languages, such as Python and R, and by stitching together software they find on the code-sharing site GitHub, “scientists can do a heck of a lot of research nowadays”, Duckles says. And they can do so without paying expensive software-licence fees.
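
As a deliberately trivial illustration of that point, a stock Python installation can summarize a local data file entirely offline using only the standard library; the file name and column below are invented.

```python
# Tiny offline analysis using only Python's standard library: no licence
# check, no network connection required. File name and column are invented.
import csv
import statistics

with open("field_measurements.csv", newline="") as f:
    values = [float(row["rainfall_mm"]) for row in csv.DictReader(f)]

print(f"n = {len(values)}")
print(f"mean = {statistics.mean(values):.2f} mm")
print(f"stdev = {statistics.stdev(values):.2f} mm")
```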

Tap your social networks

Perhaps the best strategy for scientific computing in resource-limited environments is to leverage social networks rather than electronic ones. After all, colleagues might be aware of assets that you’ve overlooked. Rafael Mayo-García, for instance, directs RICAP, the Ibero-American high-performance computing (HPC) network, based in Madrid, which aims to provide no-cost access to HPC resources for Latin American groups that do not own HPC infrastructure. Still, many researchers are unaware of the organization and what it does, Mayo-García says. “It is very common to hear that scientists are unaware of this possibility.”

Collaborations can also prove essential when getting up to speed with new technology, notes van der Walt. If researchers in South Africa lack relevant supercomputer experience and cannot find someone at their own institution to work with, they can often find partners elsewhere with the experience and resources they need. “The key is not being deterred,” she says.

In the end, Duckles says, “you have to have a mindset where you think you can solve your problems”.