Abstract
It is known from the fundamental work of Adams and Kohlschütter and their followers that certain pairs of lines in stellar spectra change in relative intensity with absolute luminosity, and this has formed the basis of the method of ‘spectroscopic parallaxes.’ The method has been hitherto empirical, stars of known luminosity being used as a basis to determine the luminosities of other stars from calibration curves. Saha's researches on high-temperature ionisation, whilst not removing the empirical basis, afforded a general qualitative explanation of many of the results observed. They showed that the lowered value of surface gravity g in giant stars as compared with dwarfs must cause reduced pressures in the atmospheres of giants with consequent increased ionisation and hence increased intensity of enhanced lines (Pannekoek, B.A.N., 19).
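The mechanism invoked here can be sketched with Saha's ionisation equation in its standard textbook form (not reproduced from this note itself):

$$\frac{n_{i+1}\,n_e}{n_i} \;=\; \frac{2\,g_{i+1}}{g_i}\left(\frac{2\pi m_e k T}{h^2}\right)^{3/2} e^{-\chi_i / kT},$$

where $n_i$ and $n_{i+1}$ are the number densities of an element in successive ionisation stages, $n_e$ the electron density, $g_i$, $g_{i+1}$ the statistical weights, and $\chi_i$ the ionisation potential. At fixed temperature the right-hand side is constant, so lowering the electron pressure $P_e = n_e kT$ forces the ratio $n_{i+1}/n_i$ upward. A giant's reduced surface gravity thus implies a lower atmospheric pressure, a higher ionisation fraction, and stronger enhanced (ionised) lines, as stated above.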
Cite this article
MILNE, E. Absolute Magnitude Effects in Stellar Spectra. Nature 122, 840–841 (1928). https://doi.org/10.1038/122840b0