Setting the Standard for Serum Ferritin Levels
In their study published in the Journal of Clinical Pathology in 1972, Addison and colleagues developed an immunoradiometric assay (IRMA) to measure ferritin in the serum of normal subjects and of individuals with iron deficiency or iron overload.
The researchers aimed to establish a reliable, accurate method for quantifying ferritin, the body's principal iron-storage protein, in blood. Using the IRMA technique, a sensitive and specific immunoassay, they detected and quantified ferritin in serum samples from each group of subjects.
The study compared ferritin levels across the three groups. Individuals with iron deficiency had lower serum ferritin than normal subjects, while those with iron overload exhibited higher serum ferritin, establishing the protein as a marker of body iron stores.
The immunoradiometric assay proved a valuable tool for diagnosing and monitoring iron-related conditions: by providing a precise measure of serum ferritin, it allowed earlier detection and more appropriate management of both iron deficiency and iron overload.
Overall, the work of Addison et al. demonstrated the clinical utility of the ferritin IRMA, offering insight into iron metabolism and contributing to the diagnosis of iron-related disorders.
My question stemming from this paper was: what constituted a "normal, healthy" subject? This paper is typically cited as setting the accepted reference range of serum ferritin for normal people. But can we really be sure those subjects in 1972 were not experiencing some type of low-grade inflammation? Mineral imbalance? Nutrient deficiency?