TSH measurement is the most common endocrine test performed in the clinical laboratory. As the primary analyte for assessing thyroid function, TSH indicates whether thyroid hormone production is normal, inadequate, or excessive. Although measuring TSH is an indirect approach, since TSH is produced by the pituitary rather than the thyroid gland itself, it serves as a stand-alone screening test for thyroid abnormalities, even in asymptomatic patients.
Because of its high clinical utility, TSH analysis must be sensitive, specific, and well standardized so that clinicians can confidently interpret results and decide whether additional testing or treatment is required. As immunoassay methods have evolved over the past three decades, TSH assays have become increasingly sensitive.
First-generation TSH assays employed hemagglutination inhibition, with a detection limit of 1.0 µIU/mL, but were far too insensitive for clinical purposes. Second-generation assays employed radioimmunoassay (RIA) and enzyme immunoassay (EIA) and offered much-improved sensitivity, detecting levels down to 0.1 µIU/mL. Today's third-generation immunoassays, which employ monoclonal antibodies and chemiluminescent detection, can report TSH concentrations another tenfold lower, down to 0.01 µIU/mL.
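To make the generational detection limits concrete, the following is a minimal Python sketch that checks whether a measured TSH concentration is at or above a given generation's detection limit. The function and variable names are illustrative, not from any laboratory system; the limits are the values quoted above.

```python
# Detection limits by assay generation, in µIU/mL (values quoted above).
DETECTION_LIMITS = {
    1: 1.0,   # first generation: hemagglutination inhibition
    2: 0.1,   # second generation: RIA / EIA
    3: 0.01,  # third generation: chemiluminescent immunoassay
}

def is_reportable(tsh_uiu_ml: float, generation: int) -> bool:
    """Return True if the measured TSH is at or above the assay's
    detection limit, i.e., distinguishable from zero by that method."""
    return tsh_uiu_ml >= DETECTION_LIMITS[generation]

# A suppressed TSH of 0.05 µIU/mL is quantifiable only by a
# third-generation assay.
for gen in (1, 2, 3):
    print(gen, is_reportable(0.05, gen))  # False, False, True
```

The example illustrates why assay generation matters clinically: a markedly suppressed TSH falls below the detection limits of first- and second-generation methods and can be quantified only by a third-generation assay.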
Standardization of TSH assays, with calibration against a common source of TSH antigen, has helped minimize variability between analytical methods. Standardization efforts have also led to the recommendation of universal reference intervals, helping clinicians interpret TSH results more consistently.
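As a sketch of how a universal reference interval would be applied in practice, the snippet below flags a TSH result as low, normal, or high. The interval bounds shown are placeholder values for illustration only; the source text does not specify the recommended interval, and actual bounds must come from the validated, standardized interval in use.

```python
from typing import Literal

# Illustrative reference interval in µIU/mL; placeholder values,
# not the recommended universal interval from the source text.
REF_LOW, REF_HIGH = 0.4, 4.0

def flag_tsh(tsh_uiu_ml: float) -> Literal["low", "normal", "high"]:
    """Classify a TSH result against the reference interval."""
    if tsh_uiu_ml < REF_LOW:
        return "low"    # suggests thyroid hormone overproduction
    if tsh_uiu_ml > REF_HIGH:
        return "high"   # suggests inadequate thyroid hormone production
    return "normal"

print(flag_tsh(2.1))   # normal
print(flag_tsh(0.05))  # low
print(flag_tsh(8.3))   # high
```

Note the inverse relationship encoded in the comments: because TSH responds to thyroid hormone by negative feedback, a low TSH points toward overproduction of thyroid hormone and a high TSH toward underproduction.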