Correction to: Scientific Reports https://doi.org/10.1038/s41598-023-43986-y, published online 26 October 2023
The original version of this Article contained an error in Affiliation 1.
Affiliation 1
“School of Jiangxi, University of Science and Technology, Ganzhou, 341000, Jiangxi, China”
now reads:
Affiliation 1
“School of Electrical Engineering and Automation, Jiangxi University of Science and Technology, Ganzhou, 341000, Jiangxi, China”
The original Article has been corrected.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
Cite this article
Yang, G., Yu, S., Sheng, Y. et al. Author Correction: Attention and feature transfer based knowledge distillation. Sci Rep 13, 20296 (2023). https://doi.org/10.1038/s41598-023-47361-9