Crystalline porous metal–organic frameworks (MOFs) have attracted considerable research interest because of their great potential in energy and environmental applications. This versatility originates from the wide tunability of their properties and functions, afforded by the enormous number of candidate building blocks, such as metal nodes and organic linkers, from which they can be assembled. Computationally predicting structure–property relationships is therefore crucial when designing MOFs for different applications. However, the vast search space of potential candidates and the variety of applications of interest limit the search efficiency of existing computational techniques. For example, machine learning (ML) has been used to predict electronic and kinetic properties of MOFs, but most ML models have limited transferability, meaning that they are designed specifically to predict one particular property. In a recent study, Jihan Kim and colleagues proposed the use of a pre-trained multi-modal transformer model for transfer learning in MOF design, and demonstrated that it can predict a wide range of properties.
To realize this transferability, the proposed architecture, MOFTransformer, was designed to capture both local and global features. For MOFs, local features refer to chemical bonds and specific chemical groups, whereas global features describe more macroscopic characteristics, such as geometry and topology. Specifically, the authors used a crystal graph convolutional network to model the local features and grids of methane–MOF interaction energies to model the global features. MOFTransformer was pre-trained on 1 million hypothetical MOFs for three tasks: MOF topology prediction, void-fraction prediction and metal cluster–organic linker classification. The pre-trained MOFTransformer can then be fine-tuned to predict different properties, such as gas adsorption, diffusion and electronic band gap. In each task, MOFTransformer showed improved accuracy compared with other state-of-the-art methods, such as descriptor-based ML. In addition, the authors demonstrated that MOFTransformer can provide physical insights through feature-importance analysis. Overall, MOFTransformer provides a universal computational framework for MOF design; more importantly, such a transformer model will inspire future transfer-learning frameworks that accelerate the computational design of novel materials.
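To illustrate the general idea of combining local and global feature tokens in a shared transformer encoder with a swappable task head, the following is a minimal sketch in PyTorch. It is not the authors' implementation: the module names, feature dimensions and random inputs are hypothetical placeholders, and the local (atom-graph) and global (energy-grid) features are assumed to be pre-computed.

```python
# Illustrative sketch (not the authors' code) of a multi-modal transformer for
# transfer learning on MOFs: local and global features are embedded as one
# token sequence, encoded by a shared transformer, and a small task-specific
# head is swapped in for each fine-tuning target.
import torch
import torch.nn as nn


class MultiModalMOFEncoder(nn.Module):
    def __init__(self, d_model=128, n_heads=4, n_layers=4):
        super().__init__()
        # Project pre-computed local (graph) and global (energy-grid) features
        # into a shared embedding space; a [CLS] token summarizes the MOF.
        self.local_proj = nn.Linear(64, d_model)    # hypothetical atom-graph feature size
        self.global_proj = nn.Linear(32, d_model)   # hypothetical energy-grid patch size
        self.cls = nn.Parameter(torch.zeros(1, 1, d_model))
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)

    def forward(self, local_feats, global_feats):
        # local_feats: (batch, n_atoms, 64); global_feats: (batch, n_patches, 32)
        batch = local_feats.size(0)
        tokens = torch.cat(
            [self.cls.expand(batch, -1, -1),
             self.local_proj(local_feats),
             self.global_proj(global_feats)],
            dim=1,
        )
        return self.encoder(tokens)[:, 0]  # return the [CLS] embedding


# Pre-training would attach heads for the topology, void-fraction and
# building-block tasks; fine-tuning reuses the encoder and trains a new
# regression head, for example for band gap or gas adsorption.
encoder = MultiModalMOFEncoder()
band_gap_head = nn.Linear(128, 1)

local = torch.randn(2, 50, 64)    # dummy atom-graph features for 2 MOFs
global_ = torch.randn(2, 36, 32)  # dummy energy-grid patch features
prediction = band_gap_head(encoder(local, global_))
print(prediction.shape)  # torch.Size([2, 1])
```

In this sketch the shared encoder carries the transferable knowledge acquired during pre-training, while only the lightweight prediction head changes between tasks, which is what allows one pre-trained model to serve many downstream property predictions.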