The impacts of remote learning in secondary education during the pandemic in Brazil

The transition to remote learning in the context of coronavirus disease 2019 (COVID-19) might have led to dramatic setbacks in education. Taking advantage of the fact that São Paulo State featured in-person classes for most of the first school quarter of 2020 but not thereafter, we estimate the effects of remote learning in secondary education using a differences-in-differences strategy that contrasts variation in students’ outcomes across different school quarters, before and during the pandemic. We also estimate intention-to-treat effects of reopening schools in the pandemic through a triple-differences strategy, contrasting changes in educational outcomes across municipalities and grades that resumed in-person classes or not over the last school quarter in 2020. We find that, under remote learning, dropout risk increased by 365% while test scores decreased by 0.32 s.d., as if students had only learned 27.5% of the in-person equivalent. Partially resuming in-person classes increased test scores by 20% relative to the control group.
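For concreteness, the headline double difference contrasts Q1-to-Q4 score growth in 2020 against the same growth in the pre-pandemic year; in assumed notation (sample means by quarter and year, a sketch rather than the paper's exact regression specification):

    \[
    \widehat{\beta}_{\mathrm{DiD}}
      = \left(\bar{Y}^{2020}_{Q4} - \bar{Y}^{2020}_{Q1}\right)
      - \left(\bar{Y}^{2019}_{Q4} - \bar{Y}^{2019}_{Q1}\right)
    \]

where \(\bar{Y}^{t}_{q}\) is the mean standardized test score in quarter \(q\) of year \(t\), and remote learning operates between Q1 and Q4 of 2020 but not of 2019.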

If you wish to submit a suitably revised manuscript, we would hope to receive it within 8 weeks. We understand that the COVID-19 pandemic is causing significant disruptions that may prevent you from carrying out the additional work required for resubmission of your manuscript within this timeframe. If you are unable to submit your revised manuscript within 6 months, please let us know; we will be happy to extend the submission date to enable you to complete your work on the revision.
With your revision, please:
• Include a "Response to the editors and reviewers" document detailing, point-by-point, how you addressed each editor and referee comment. If no action was taken to address a point, you must provide a compelling argument. This response will be used by the editors to evaluate your revision and will be sent back to the reviewers along with the revised manuscript.
• Highlight all changes made to your manuscript or provide us with a version that tracks changes.
Please use the link below to submit your revised manuscript and related files:

[REDACTED]
Note: This URL links to your confidential home page and associated information about manuscripts you may have submitted, or that you are reviewing for us. If you wish to forward this email to co-authors, please delete the link to your homepage.
Thank you for the opportunity to review your work. Please do not hesitate to contact me if you have any questions or would like to discuss the required revisions further.
Reviewer #1: Remarks to the Author: Following the worldwide school closures starting in March 2020, there has been an outpouring of studies collecting data on the time use and academic outcomes of children and youth who are kept at home. Many of these have been from Europe and the US, while data from other parts of the world have been lacking. This study presents evidence from Brazil (São Paulo), which offers an important case given the long duration of school closures and the developing country context.
The study is well-designed, using a differences-in-differences design that compares test scores before and after remote learning to the corresponding test score growth during the same period in the previous year. It also implements a triple-difference design comparing differences between middle- and high-school students in municipalities where the latter group was allowed to return to school. The study finds large increases in incomplete grades (which they call "dropout risk") and drops in test scores, as well as improvements in test scores among high-school students who were allowed to return.
The study appears to confirm evidence from previous work. The design is close to that of Engzell et al (2021), and better than most other studies. This study is also valuable for estimating treatment effects across the age distribution, across population groups, and by local disease intensity. The finding that disease intensity is mostly unrelated to student outcomes is important, since it counters a key objection to studies of learning loss in the wake of COVID: that they reflect wider impacts of the pandemic and not the impact of school closures as such. The added analysis of the effect of school reopening is well executed and does not have a parallel in the existing literature.
My main concern is that the study is not appropriately contextualized with respect to existing studies of COVID-induced learning loss. There are by now dozens of studies on this subject (see bibliography below). Some of them are briefly discussed in the paper, but not until the concluding discussion. The motivation of this paper with respect to other works needs to be made clear already in the introduction. Specifically, the abstract and introduction overclaim when they say that "no study has rigorously documented the educational impacts of remote learning relative to in-person classes within primary and secondary education" and that "the evidence base for the impacts of remote learning is thin".
To my mind, the main contributions of this paper are that it a) uses sound data and methods to expand the evidence on COVID-induced learning loss, b) extends previous evidence centered on Europe and the US to a Latin American country, c) studies population heterogeneity along several dimensions, and d) studies the effect of school re-opening and not just closure.
In addition, I have several smaller comments on the analysis and discussion.
More information on the tests needs to be included. How were these designed and for how long did the students sit them? Is there any sense of the reliability? Did the remote testing regime offer any opportunities for cheating? Were the tests taken at Q1 and Q4 identical in design? If not, how was a difference score calculated and what is its interpretation?
To better allow comparison with existing work, effect sizes should be discussed in light of the exact length of school closures. What is the expected loss per week of school closure, and what is the expected gain per week of opening up? How does this compare to the estimates reported by Engzell et al (2021) for a European country, or by Kuhfeld et al for the US (https://www.nwea.org/content/uploads/2020/11/Collaborative-brief-Learning-during-COVID-19.NOV2020.pdf)?

Test scores are markedly higher in 2020 compared to 2019. There are several potential reasons for this: sample selection, mode effects on the test (online/take-home vs in person), and the simplified curriculum during the pandemic. These points appear scattered throughout but need to be brought together at some point in the text. Do the authors view any of them as a more likely explanation?
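On the per-week question above, a back-of-the-envelope conversion might look as follows; the week counts are illustrative assumptions, not figures from the paper:

    # Convert headline effect sizes into per-week effects for cross-study
    # comparison. The week counts are assumptions for illustration only.
    total_loss_sd = 0.32        # estimated test-score loss under remote learning (s.d.)
    weeks_remote = 30           # assumed: roughly three school quarters of remote learning
    print(f"loss per week of closure: {total_loss_sd / weeks_remote:.4f} s.d.")

    reopening_gain_sd = 0.024   # triple-difference estimate of the reopening gain (s.d.)
    weeks_reopened = 10         # assumed: length of the partially reopened fourth quarter
    print(f"gain per week of reopening: {reopening_gain_sd / weeks_reopened:.4f} s.d.")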
The main results report difference-in-difference estimates with 2019 as a single comparison year. I assume that the results do not differ much if 2018 is used as a comparison instead, but this information could be included to gauge the robustness. What is the estimate of (Q4 2020-Q1 2020)-(Q4 2018-Q1 2018)?
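A minimal sketch of the requested robustness check, on simulated data with hypothetical column names ('year', 'quarter', 'score'); only the shape of the computation is the point:

    import numpy as np
    import pandas as pd

    # Toy long-format student-quarter panel standing in for the real data.
    rng = np.random.default_rng(0)
    df = pd.DataFrame({
        'year': rng.choice([2018, 2019, 2020], size=3000),
        'quarter': rng.choice([1, 4], size=3000),
        'score': rng.normal(size=3000),
    })

    def q4_q1_growth(data: pd.DataFrame, year: int) -> float:
        """Mean Q4-minus-Q1 test-score growth within a given year."""
        means = data[data['year'] == year].groupby('quarter')['score'].mean()
        return means.loc[4] - means.loc[1]

    did_2019 = q4_q1_growth(df, 2020) - q4_q1_growth(df, 2019)  # headline contrast
    did_2018 = q4_q1_growth(df, 2020) - q4_q1_growth(df, 2018)  # 2018 baseline variant
    print(did_2019, did_2018)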
The introduction and discussion mention that in municipalities in which high school students were allowed back, middle school students also partly returned. This information is not mentioned in the subsection "Effects of Resuming In-person Classes", so a reader who skips to that section will be puzzled about why attendance increased for both groups in Table 2.
The authors use incomplete grades as a proxy for dropout risk, arguing that enrollment was kept artificially high by school administrators. This seems fine. But incomplete grades do not appear to be a very good predictor of dropout (Figure A.1). Please report the correlation in this figure. And as a suggestion, why not call the outcome incomplete grades instead?

Figure B.1 suggests that most of the shortfall in test score growth occurred between Q1 and Q2, when classes were supposedly still in person, and that rates of learning were similar thereafter. Does this affect conclusions about the efficacy of remote learning? Should we conclude that educators and families struggled initially but eventually adapted with time?

Figures C.1-C.2: it seems unconventional to use a continuous density function to represent a discrete outcome. These graphs would be clearer if a histogram were used.
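The fix suggested for Figures C.1-C.2 is mechanical; a sketch with simulated discrete scores:

    import numpy as np
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(1)
    scores = rng.binomial(n=10, p=0.6, size=5000)  # hypothetical discrete outcome, 0..10

    # One bin per integer value, so bar heights are exact counts rather than
    # a smoothed density over a discrete support.
    bins = np.arange(scores.min() - 0.5, scores.max() + 1.5)
    plt.hist(scores, bins=bins, edgecolor='black')
    plt.xlabel('Score (discrete)')
    plt.ylabel('Number of students')
    plt.show()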
In the analysis of heterogeneous treatment effects by student demographics shown in Appendix D, no coefficients or SEs are reported, so there is little way to gauge the magnitude of these differences or their significance. Adding these numbers would be helpful.

Table E.1 reveals that non-white, poorer, and low-performing students are underrepresented in 2020. The authors use propensity score reweighting to address this, but confounding on unobservables might remain. However, the fact that adjusting for observed confounders does not alter estimates much between column (3) and columns (4)-(5) in Table 1 can be marshalled to claim that residual confounding is also likely to be small.
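For readers unfamiliar with the adjustment discussed here, a generic propensity-score reweighting sketch on simulated data follows; the column names and selection model are hypothetical, and this is not the authors' code:

    import numpy as np
    import pandas as pd
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(2)
    n = 5000
    df = pd.DataFrame({
        'non_white': rng.integers(0, 2, n),
        'low_income': rng.integers(0, 2, n),
        'baseline_score': rng.normal(size=n),
    })
    # Simulate underrepresentation of disadvantaged students in the 2020 sample.
    p_obs = 1 / (1 + np.exp(-(0.5 - 0.4 * df['non_white'] - 0.4 * df['low_income'])))
    df['in_2020_sample'] = rng.random(n) < p_obs

    # Fit the propensity to appear in the 2020 sample given observed covariates.
    X = df[['non_white', 'low_income', 'baseline_score']]
    pscore = LogisticRegression().fit(X, df['in_2020_sample']).predict_proba(X)[:, 1]

    # Inverse-probability weights: observed students who resemble the missing
    # ones get larger weights, restoring the observable composition of the cohort.
    weights = 1.0 / pscore[df['in_2020_sample'].to_numpy()]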
A key contribution of this paper is to study heterogeneity by local COVID incidence. The authors write that "risk increased with local disease activity", but Figure F.1 largely looks to me like the absence of a meaningful association. However, the detected caseload depends on testing capacity. If learning loss is larger in poorer communities and testing capacity correlates negatively with poverty, null effects might be spurious. Is this a worry?
In referring to the Supplementary Materials, please state the specific figure.

Below is a partial list of relevant work. The authors do not need to cite all of it, but they do need to revise the abstract, introduction and discussion to reflect the fact that they are not alone in studying the effect of school closures and remote learning during the pandemic.

* Ahn, Kunwon; Lee, Jun Yeong; and Winters, John V. "Employment Opportunities and High School Completion during the COVID-19 Recession." IZA Discussion Paper.
* Boruchowicz, Cynthia, et al. "Time Use of Youth during a Pandemic: Evidence from Mexico." https://www.researchgate.net/publication/350122372_Time_Use_of_Youth_during_a_Pandemic_Evidence_from_Mexico
* Curriculum Associates. "Understanding Student Needs: Early Results from Fall Assessments." https://www.curriculumassociates.com/-/media/mainsite/files/i-ready/iready-diagnostic-results-understanding-student-needs-paper-2020.pdf
* Domingue, Hough, Lang, and Yeatman. "Changing Patterns of Growth in Oral Reading Fluency During the COVID-19 Pandemic." https://edpolicyinca.org/publications/changing-patterns-growth-oral-reading-fluency-during-covid-19-pandemic
* GL Assessment. "Impact of Covid-19 on attainment - initial analysis." https://www.gl-assessment.co.uk/news-hub/research-reports/impact-of-covid-19-on-attainment-initial-analysis/
* Gore, Jennifer, Leanne Fray, Andrew Miller, Jess Harris, and Wendy Taggart. "The impact of COVID-19 on student learning in New South Wales primary schools: an empirical study." The Australian Educational Researcher (2021): 1-33.
* Juniper Education. "The impact of the Covid-19 pandemic on primary school children's learning." https://21e8jl3324au2z28ej2uho3t-wpengine.netdna-ssl.com/wp-content/uploads/juniper_folder/Juniper-Education-National-Benchmark-Dataset-Report.pdf
* Kofoed, Michael, Lucas Gebhart, Dallas Gilmore, and Ryan Moschitto. "Zooming to Class?: Experimental Evidence on College Students' Online Learning During Covid-19." IZA Discussion Paper.
* Kogan, Vladimir, and Stéphane Lavertu. "The COVID-19 Pandemic and Student Achievement on Ohio's Third-Grade English Language Arts Assessment." http://glenn.osu.edu/educational-governance/reports/reports-attributes/ODE_ThirdGradeELA_KL_1-27-2021.pdf
* Kuhfeld, Megan, Beth Tarasawa, Angela Johnson, Erik Ruzek, and Karyn Lewis. "Learning during COVID-19: Initial findings on students' reading and math achievement and growth." https://www.nwea.org/content/uploads/2020/11/Collaborative-brief-Learning-during-COVID-19.NOV2020.pdf
* Orlov, George, Douglas McKee, James Berry, Austin Boyle, Thomas DiCiccio, Tyler Ransom, Alex Rees-Jones, and Jörg Stoye. "Learning during the COVID-19 pandemic: It is not who you teach, but how you teach." Economics Letters 202 (2021).
* Pier, Hough, Christian, Bookman, Wilkenfeld, and Miller. "Evidence on Learning Loss From the CORE Data Collaborative." https://edpolicyinca.org/newsroom/covid-19-and-educational-equity-crisis
* RS Assessment. "The impact of school closures on autumn 2020 attainment." https://www.risingstars-uk.com/media/Rising-Stars/Assessment/RS_Assessment_white_paper_2021_impact_of_school_closures_on_autumn_2020_attainment.pdf
* Schult, Johannes, and Marlit Annalena Lindner. "Did Students Learn Less During the COVID-19 Pandemic? Reading and Mathematics Competencies Before and After the First Pandemic Wave." https://psyarxiv.com/pqtgf/
* Tomasik, Martin J., Laura A. Helbling, and Urs Moser. "Educational gains of in-person vs. distance learning in primary and secondary schools: A natural experiment during the COVID-19 pandemic school closures in Switzerland." International Journal of Psychology (2020).
* Weidmann, B., Allen, R., Bibby, D., Coe, R., James, L., Plaister, N., and Thomson, D. "Covid-19 disruptions: Attainment gaps and primary school responses." Education Endowment Foundation. https://educationendowmentfoundation.org.uk/public/files/Covid-19_disruptions_attainment_gaps_and_primary_school_responses_-_May_2021.pdf

Reviewer #2: Remarks to the Author:

This paper analyses the impact of remote learning versus in-person learning. To do so, the authors exploit unique variation and data in Brazil. In the context of the COVID-19 pandemic, insight into the difference between both education delivery forms is highly policy relevant. The estimated effects are large, with standardized test scores falling by 0.32 SD and dropout risk increasing by 365%. I also like that they show that the negative effects are larger if schools did not offer online academic activities prior to the pandemic. In fact, I think the latter finding deserves even more attention than it received in the submitted version of the paper.

• The authors have unique data, with quarterly standardized test scores in 2020. Unfortunately, there is no information on which schools actually reopened. The authors circumvent this issue by using IV. However, evidence from other countries shows that the leeway schools receive is used in a non-random way, with school characteristics correlating with actual reopening. To the extent that in the Brazilian education system more advantaged schools reopened sooner, the estimated effects will be upward biased.

• More attention should be paid to defining the key variables. For example, how is remote learning in Table 1 defined? For the dropout variable in Table 1: what about students without test scores in Q2-Q3 but with a score in Q4, or vice versa? In Table 2, how are 'in-person activities' defined? Why is it not possible to define this as a continuous variable for the number of days that in-person activities were possible (as municipalities probably did not all reopen on the same day)?

• The analysis in Figure 1 is valuable. However, it is counter-intuitive that the risk of dropout decreases from grade 9 onwards; I would expect students in the older grades to drop out more easily than in the younger years. This might have to do with how the dropout variable is constructed (i.e., a missing test result). Although the supplementary analysis clearly shows that a missing test result is a good predictor of dropout in earlier years, during the pandemic this might be different. More discussion and (anecdotal) evidence would be in place here. Related is the lack of an attrition analysis, which might show whether there is (selective) attrition in the sample (and hence in the dropout measure).
• For the analysis in Table 2, more discussion and evidence is needed on the characteristics of the municipalities that allowed schools to reopen. This might not be random, but correlated with the socio-economic pattern of the municipality. Although this will be partly captured in the DiD specification, differences in trends can potentially still result in biased estimates. In Table 2, student characteristics are matched (using what matching method?), but not municipality characteristics. Given that students are non-randomly allocated across municipalities, I would be more interested in the latter.
• It is unclear how the tests are standardized. Did you standardize them by quarter (and if so, how can you compare the estimates without linking questions)? Please discuss this more extensively, as it matters for the internal validity. (A sketch of the scheme in question follows below.)
• Related to the earlier comment, the authors average math and Portuguese scores because for Q4 2020 only the overall standardized test is available. However, the literature on COVID-19 learning losses shows significant differences between subjects, and in the approach taken the estimates might be subject to regression to the mean. Therefore, the authors should also provide estimates (without Q4) for the subjects separately.
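A sketch of within-quarter standardization, the scheme the first point asks about, with hypothetical column names; under this scheme, level comparisons across quarters require linked test forms, which is exactly the reviewer's concern:

    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(3)
    df = pd.DataFrame({
        'quarter': rng.choice([1, 2, 3, 4], size=4000),
        'raw_score': rng.normal(50, 10, size=4000),
    })

    # z-score each quarter separately. The resulting scale fixes the mean at 0
    # and the s.d. at 1 within every quarter, so changes across quarters are
    # only interpretable if the underlying test forms are linked.
    df['z'] = df.groupby('quarter')['raw_score'].transform(
        lambda s: (s - s.mean()) / s.std()
    )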
• The COVID-19 crisis came as a surprise, and in some education systems there was initially a lack of hardware and software. However, as time passed, education systems could adapt to the new situation. Unfortunately, this might undermine the external validity of the estimates. On the bright side, given that the authors have detailed quarterly data, they could examine how the availability of hardware and software changed the estimated impact of online versus in-person learning.
• There are significant differences (even in sign) between the naïve estimates and the DiD estimates in Table 1. Although this is briefly mentioned, a more profound discussion is needed, as similar naïve estimates have been used broadly in the earlier literature.

Reviewer #3:
Remarks to the Author:

The authors take advantage of a relatively unique situation during COVID: the quarterly application, in São Paulo, Brazil, of standardized achievement exams, as well as the combination of some in-person and some online classes, which potentially allows the effects of online schooling on learning to be isolated. The authors study both the risk of dropout and impacts on learning, and find large negative effects on both during the pandemic. While the topic is of great interest and importance, I have some concerns about the validity of the empirics, which I detail below.

1. Defining students to be at risk of dropout if they do not take a quarterly exam applied online during the pandemic strikes me as a not very convincing indicator of dropout risk. The authors provide evidence (in the supplementary material) that this variable is correlated with actual dropout using pre-pandemic data, when students were attending in-person classes. I do not believe this is a valid exercise to demonstrate that the same indicator is a predictor of dropout during the pandemic, when all school activities are remote. I thus suggest the authors drop this analysis (or call it what it is: the probability of not taking the exam), or study the correlation between this variable and returning to school later using actual data from the pandemic, to provide evidence that it effectively measures dropout risk during/after the pandemic.

2. The impacts on learning from the two experiments (the period of closure, used to measure the reduction in learning, and the period when some schools reopened, used to measure the improvement in learning) differ by an order of magnitude, and this discrepancy casts doubt on what to believe about the true impacts on learning. Table 1 (columns 3 to 5) suggests reductions in learning over 9 months of online learning of 0.3 standard deviations on standardized tests, whereas Table 2, comparing municipalities where schools returned to those where they did not, suggests that the return to in-person learning led to an increase in test scores of 0.024 standard deviations, i.e., less than one tenth of the effects implied by Table 1. What are the reasons for this enormous discrepancy, and which are we to believe represents the true learning losses due to the closure of schools? The authors need to reconcile these differences and provide guidance to the reader as to what the takeaways of the analysis are.
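The order-of-magnitude gap in point 2, restated as arithmetic using the estimates quoted in this report:

    # Ratio of the reopening gain to the closure loss, per the report's numbers.
    loss_sd = 0.32    # Table 1, columns 3-5: loss over ~9 months of remote learning
    gain_sd = 0.024   # Table 2: gain from the partial Q4 reopening
    print(gain_sd / loss_sd)  # ~0.075, i.e., less than one tenth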

Author Rebuttal to Initial Comments

Decision Letter, first revision:
2nd December 2021

Dear Dr Lichand,

Thank you once again for your manuscript, entitled "The Impacts of Remote Learning in Secondary Education during the Pandemic in Brazil," and for your patience during the peer review process.
Your manuscript has now been evaluated by 3 reviewers, whose comments are included at the end of this letter. Although two reviewers now recommend publication of your work, one reviewer has several remaining concerns. We remain interested in the possibility of publishing your study in Nature Human Behaviour, but would like to consider your response to these concerns in the form of a revised manuscript before we make a decision on publication.
Specifically, Reviewer #3 is still concerned about the reliability of your dropout proxies. While we do not believe that you should remove these analyses altogether, we ask you to be fully transparent about the limitations of your proxy, and to report full statistics and figures of your supporting analyses in Supplementary Sections A1 and A2. In your revision, we also ask you to strengthen your argument showing that matching has appropriately controlled for selection effects, as requested by Reviewer #3.
Finally, your revised manuscript must comply fully with our editorial policies and formatting requirements. Failure to do so will result in your manuscript being returned to you, which will delay its consideration. To assist you in this process, I have attached a checklist that lists all of our requirements. If you have any questions about any of our policies or formatting, please don't hesitate to contact me.
In sum, we invite you to revise your manuscript taking into account all reviewer and editor comments. We are committed to providing a fair and constructive peer-review process. Do not hesitate to contact us if there are specific requests from the reviewers that you believe are technically impossible or unlikely to yield a meaningful outcome.
We hope to receive your revised manuscript within four to eight weeks. We understand that the COVID-19 pandemic is causing significant disruption for many of our authors and reviewers. If you cannot send your revised manuscript within this time, please let us know; we will be happy to extend the submission date to enable you to complete your work on the revision.
With your revision, please:
• Include a "Response to the editors and reviewers" document detailing, point-by-point, how you addressed each editor and referee comment. If no action was taken to address a point, you must provide a compelling argument. This response will be used by the editors to evaluate your revision and will be sent back to the reviewers along with the revised manuscript.
• Highlight all changes made to your manuscript or provide us with a version that tracks changes.
Please use the link below to submit your revised manuscript and related files:

[REDACTED]
Note: This URL links to your confidential home page and associated information about manuscripts you may have submitted, or that you are reviewing for us. If you wish to forward this email to co-authors, please delete the link to your homepage.
We look forward to seeing the revised manuscript and thank you for the opportunity to review your work. Please do not hesitate to contact me if you have any questions or would like to discuss these revisions further.
Reviewer #2: Remarks to the Author:

Reading the comments and replies to the other referees, I agree with reviewer 1 that you have embedded the paper better in the literature. With respect to this, a few new papers on the theme have recently been published that you might want to include (e.g., Donnelly and Patrinos; Werner and Woessmann; Grewenig, Lergetporer, Werner, Zierow and Woessmann; Gambi and De Witte; Iterbeke and De Witte, and the references in these recent articles). Adding these will also embed the paper in the most recent literature on COVID-related learning losses. Moreover, some papers have recently been published and no longer appear as working papers; please check the reference list.
In the revised version, you discuss the selection effect on page 6. I think that you underplay this effect here. At the very least, you should refer to the recent paper by Werner and Woessmann ("The Legacy of COVID-19 in Education"), who devote significant attention to cohort effects and to how these lead the true learning losses to be underestimated.
Reviewer #3: Remarks to the Author:

1. "Defining students to be at risk of dropout if they do not take a quarterly exam applied online during the pandemic strikes me as a not very convincing indicator of dropout risk." Figure S.A.1 shows the correlation between actual dropout and the proposed proxy, taking a quarterly exam in the 4th quarter of 2019. As can be seen in the graph, there is some correlation, but hardly a very high one. Showing some correlation (0.7 is actually not that high) is not sufficient to argue that a variable is a good proxy. Overall, I consider this graph evidence that taking the exam is a poor proxy for actual dropout.
The authors provide a section in the supplementary materials on page 2 (Section A2) describing how they test that not taking the exam can proxy dropout during the pandemic. Here they do not show a correlation or a graph, but construct an alternative proxy for dropout, measured not as dropout but as the probability of not attending class, because in fact they do not have information on actual dropout.
It is difficult to understand why the authors insist on studying the effects of the pandemic on dropout when they do not have information on dropout. The proxies jointly reflect current attendance, knowledge about when the test would be applied and whether the test was perceived as important, the incidence of health problems that affect getting to school, and so on. I suggest dropping this analysis altogether.
2. There really needs to be some basic description of the characteristics of those who take the test and those who don't, before and after, in 2019 and 2020, to provide an initial idea of whether selection changes over time and during the pandemic, and whether this selection affects the results of the paper. Such an analysis is a basic ingredient in establishing whether the changes in test scores the paper describes in fact reflect effects of online learning/the pandemic, or simply reflect differences in the characteristics of the population taking the tests, i.e., selection bias. The paper simply does not provide sufficient evidence that the estimates are (at least largely) free of selection bias. Where is the evidence/argument that the matching has adequately controlled for selection?

Decision Letter, second revision:

Your manuscript has now been evaluated by Reviewer 3 and their comments are below. As you can see, Reviewer 3 finds that the paper has largely improved in revision. We will therefore be happy in principle to publish it in Nature Human Behaviour, pending minor revisions to satisfy Reviewer 3's final requests and to comply with our editorial and formatting guidelines.

Author Rebuttal, first revision:
We are now performing detailed checks on your paper and will send you a checklist detailing our editorial and formatting requirements within two weeks. Please do not upload the final materials or make any revisions until you receive this additional information from us.

Referee report:

My main critique has focused on the proxy used for dropout risk, which during the period of school closure was captured using the probability of taking an online standardized test, whereas when schools are open such tests are administered in person. I find the analysis of the proxy based on not attending any classes in Q1 2021, when in-person classes had been authorized to resume in São Paulo State, helpful although not completely convincing.
While the pandemic will likely have long-term effects on dropout, and this is an important topic, it is also important that published results are plausibly unbiased, i.e., unlikely to provide misleading estimates not just of whether an effect is positive or negative but also of the size of the magnitudes.
My suggestion is thus to soften the strong tone of the text. For instance, on p. 6, instead of "The table shows that, by all accounts, dropout risk has increased dramatically during remote learning, by roughly 365% (significant at the 1% level; Columns 3-5)", write something like "The table is suggestive of important potential effects on dropout, as measured by our proxy indicator for dropout risk." Similarly in the rest of the text.

Minor comment: Relatedly, please provide some references for the following claim on p. 6: "This proxy has been used for years by the Education Secretary and by philanthropic organizations that support quality education in Brazil to predict student dropouts, especially when it comes to identifying the schools most likely to be affected."

Final Decision Letter:
Dear Dr Lichand,

We are pleased to inform you that your Article "The Impacts of Remote Learning in Secondary Education during the Pandemic in Brazil" has now been accepted for publication in Nature Human Behaviour.
Please note that Nature Human Behaviour is a Transformative Journal (TJ). Authors whose manuscript was submitted on or after January 1st, 2021, may publish their research with us through the traditional subscription access route or make their paper immediately open access through payment of an article-processing charge (APC). Authors will not be required to make a final decision about access to their article until it has been accepted. IMPORTANT NOTE: Articles submitted before January 1st, 2021, are not eligible for Open Access publication. Find out more about Transformative Journals.

Authors may need to take specific actions to achieve compliance with funder and institutional open access mandates. If your research is supported by a funder that requires immediate open access (e.g. according to Plan S principles), then you should select the gold OA route, and we will direct you to the compliant route where possible. For authors selecting the subscription publication route, the journal's standard licensing terms will need to be accepted, including self-archiving policies. Those licensing terms will supersede any other terms that the author or any third party may assert apply to any version of the manuscript.
Once your manuscript is typeset and you have completed the appropriate grant of rights, you will receive a link to your electronic proof via email with a request to make any corrections within 48 hours. If, when you receive your proof, you cannot meet this deadline, please inform us at rjsproduction@springernature.com immediately. Once your paper has been scheduled for online publication, the Nature press office will be in touch to confirm the details.
Acceptance of your manuscript is conditional on all authors' agreement with our publication policies (see http://www.nature.com/nathumbehav/info/gta). In particular your manuscript must not be published elsewhere and there must be no announcement of the work to any media outlet until the publication date (the day on which it is uploaded onto our web site).
If you have posted a preprint on any preprint server, please ensure that the preprint details are updated with a publication reference, including the DOI and a URL to the published version of the article on the journal website.
An online order form for reprints of your paper is available at https://www.nature.com/reprints/author-reprints.html. All co-authors, authors' institutions and authors' funding agencies can order reprints using the form appropriate to their geographical region.
We welcome the submission of potential cover material (including a short caption of around 40 words) related to your manuscript; suggestions should be sent to Nature Human Behaviour as electronic files (the image should be 300 dpi at 210 x 297 mm in either TIFF or JPEG format).
Please note that such pictures should be selected more for their aesthetic appeal than for their scientific content, and that colour images work better than black and white or grayscale images. Please do not try to design a cover with the Nature Human Behaviour logo etc., and please do not submit composites of images related to your work. I am sure you will understand that we cannot make any promise as to whether any of your suggestions might be selected for the cover of the journal.
You can now use a single sign-on for all your accounts, view the status of all your manuscript submissions and reviews, access usage statistics for your published articles and download a record of your refereeing activity for the Nature journals.
To assist our authors in disseminating their research to the broader community, our SharedIt initiative provides you with a unique shareable link that will allow anyone (with or without a subscription) to read the published article. Recipients of the link with a subscription will also be able to download and print the PDF.
As soon as your article is published, you will receive an automated email with your shareable link.
In approximately 10 business days you will receive an email with a link to choose the appropriate publishing options for your paper and our Author Services team will be in touch regarding any additional information that may be required.
You will not receive your proofs until the publishing agreement has been received through our system.
If you have any questions about our publishing options, costs, Open Access requirements, or our legal forms, please contact ASJournals@springernature.com