Reference ranges for blood tests
from Wikipedia

Reference ranges (reference intervals) for blood tests are sets of values used by a health professional to interpret a set of medical test results from blood samples. Reference ranges for blood tests are studied within the field of clinical chemistry (also known as "clinical biochemistry", "chemical pathology" or "pure blood chemistry"), the area of pathology that is generally concerned with analysis of bodily fluids.[1][2][3]

Blood test results should always be interpreted using the reference range provided by the laboratory that performed the test.[4]

Interpretation


A reference range is usually defined as the set of values within which 95 percent of the normal population falls (that is, a 95% prediction interval).[5] It is determined by collecting data from large numbers of laboratory tests performed on reference individuals.[6][7]
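For illustration, the parametric version of this calculation (mean ± 1.96 standard deviations, assuming an approximately Gaussian distribution) can be sketched as follows; the values and the normality assumption are illustrative only, and real laboratories follow more formal protocols.

```python
# Minimal sketch: estimating a 95% reference interval from results of a
# presumed-healthy reference group. Values are illustrative, not real data.
import statistics

def reference_interval_parametric(values):
    """Central 95% interval assuming an approximately Gaussian distribution:
    mean +/- 1.96 standard deviations."""
    mean = statistics.mean(values)
    sd = statistics.stdev(values)
    return mean - 1.96 * sd, mean + 1.96 * sd

# Made-up fasting glucose results (mmol/L) from healthy volunteers
glucose = [4.6, 5.1, 4.9, 5.3, 4.4, 5.0, 4.8, 5.2, 4.7, 5.5, 4.9, 5.0]
low, high = reference_interval_parametric(glucose)
print(f"Estimated reference range: {low:.1f}-{high:.1f} mmol/L")
```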

Plasma or whole blood


In this article, all values (except the ones listed below) denote blood plasma concentration, which is approximately 60–100% larger than the whole blood concentration if the amount inside red blood cells (RBCs) is negligible. The precise factor depends on the hematocrit as well as the amount inside RBCs (a worked sketch of this relationship follows the lists below). Exceptions are mainly those values that denote total blood concentration, and in this article they are:[8]

  • All values in Hematology – red blood cells (except hemoglobin in plasma)
  • All values in Hematology – white blood cells
  • Platelet count (Plt)

A few values are for inside red blood cells only:

  • Vitamin B9 (folic acid/folate) in red blood cells
  • Mean corpuscular hemoglobin concentration (MCHC)
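As a worked sketch of the plasma-versus-whole-blood relationship described above (assuming a substance essentially absent from red blood cells), the conversion factor is simply 1 / (1 − hematocrit):

```python
def plasma_from_whole_blood(whole_blood_conc, hematocrit):
    """For a substance essentially absent from red blood cells, the plasma
    concentration exceeds the whole-blood concentration by the factor
    1 / (1 - hematocrit). Illustrative sketch only."""
    return whole_blood_conc / (1.0 - hematocrit)

# With hematocrits of roughly 0.38-0.50 the factor is about 1.6-2.0,
# i.e. plasma values roughly 60-100% larger than whole-blood values.
for hct in (0.38, 0.45, 0.50):
    print(hct, round(plasma_from_whole_blood(1.0, hct), 2))
```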

Units


Arterial or venous


If not otherwise specified, a reference range for a blood test is generally the venous range, as the standard process of obtaining a sample is by venipuncture. An exception is for acid–base and blood gases, which are generally given for arterial blood.[12]

Still, the blood values are approximately equal between the arterial and venous sides for most substances, with the exception of acid–base, blood gases and drugs (used in therapeutic drug monitoring (TDM) assays).[13] Arterial levels for drugs are generally higher than venous levels because of extraction while passing through tissues.[13]

Usual or optimal


Reference ranges are usually given as the usual (or normal) values found in the population, more specifically the prediction interval that 95% of the population falls within. This may also be called the standard range. In contrast, an optimal (health) range or therapeutic target is a reference range or limit based on concentrations or levels associated with optimal health or minimal risk of related complications and diseases. For most substances presented, the optimal levels are also the ones normally found in the population; more specifically, they are generally close to a central tendency of the values found in the population. However, usual and optimal levels may differ substantially, most notably among vitamins and blood lipids, so these tables give limits for both standard and optimal (or target) ranges. In addition, some values, including troponin I and brain natriuretic peptide, are given as the estimated appropriate cutoffs to distinguish healthy people from people with specific conditions, which here are myocardial infarction and congestive heart failure, respectively.[14][15][16]

Variability


Reference ranges may vary with age, sex, race, pregnancy,[17] diet, use of prescribed or herbal drugs, and stress. Reference ranges often depend on the analytical method used, for reasons such as inaccuracy, lack of standardisation, lack of certified reference material and differing antibody reactivity.[18] Also, reference ranges may be inaccurate when the reference groups used to establish the ranges are small.[19]

Sorted by concentration


By mass and molarity


Smaller, narrower boxes indicate tighter homeostatic regulation when measured against the standard "usual" reference range.

Hormones predominate at the left part of the scale, shown in red at ng/L or pmol/L, being in very low concentration. The greatest cluster of substances appears in the yellow part (μg/L or nmol/L), becoming sparser in the green part (mg/L or μmol/L). However, there is another cluster containing many metabolic substances like cholesterol and glucose at the limit with the blue part (g/L or mmol/L).[citation needed]

The unit conversions of substance concentrations from the molar to the mass concentration scale above are made as follows:

  • Numerically: mass concentration = molar concentration × molar mass / 1000
  • Measured directly in distance on the scales: distance = log10(molar mass / 1000) decades to the right,

where distance is the direct (not logarithmic) distance, in number of decades or "octaves", to the right at which the mass concentration is found. To translate from mass to molar concentration, the dividend (molar mass) and the divisor (1000) in the division change places, or, alternatively, the distance to the right is changed to a distance to the left. Substances with a molar mass around 1000 g/mol (e.g. thyroxine) are almost vertically aligned in the mass and molar images. Adrenocorticotropic hormone, on the other hand, with a molar mass of 4540,[20] is 0.7 decades to the right in the mass image. Substances with molar mass below 1000 g/mol (e.g. electrolytes and metabolites) would have "negative" distance, that is, masses deviating to the left. Many substances given in mass concentration are not given in molar amount because they have not been added to the article.

The diagram above can also be used as an alternative way to convert any substance concentration (not only the normal or optimal ones) from molar to mass units and vice versa for those substances appearing in both scales, by measuring how much they are horizontally displaced from one another (representing the molar mass for that substance), and using the same distance from the concentration to be converted to determine the equivalent concentration in terms of the other unit. For example, on a certain monitor, the horizontal distance between the upper limits for parathyroid hormone in pmol/L and pg/mL may be 7 cm, with the mass concentration to the right. A molar concentration of, for example, 5 pmol/L would therefore correspond to a mass concentration located 7 cm to the right in the mass diagram, that is, approximately 45 pg/mL.
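As a worked sketch of these conversions, assuming only the relationships stated above (the ACTH molar mass of 4540 g/mol is taken from the text), the arithmetic can be expressed as:

```python
import math

def molar_to_mass(molar_conc, molar_mass_g_per_mol):
    """Convert a molar concentration to the matching mass concentration one
    SI-prefix step up, e.g. pmol/L -> pg/mL or nmol/L -> ug/L: multiply by the
    molar mass and divide by 1000, as stated above."""
    return molar_conc * molar_mass_g_per_mol / 1000.0

def decades_to_the_right(molar_mass_g_per_mol):
    """Horizontal displacement on the logarithmic diagrams, in decades."""
    return math.log10(molar_mass_g_per_mol / 1000.0)

# Example: ACTH, molar mass about 4540 g/mol
print(molar_to_mass(10, 4540))        # 10 pmol/L -> 45.4 pg/mL
print(decades_to_the_right(4540))     # about 0.66 decades to the right
```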

By units


Units do not necessarily imply anything about molarity or mass.

A few substances are below this main interval, e.g. thyroid stimulating hormone, being measured in mU/L, or above, like rheumatoid factor and CA19-9, being measured in U/mL.

By enzyme activity


White blood cells


Sorted by category


Ions and trace metals


Included here are also related binding proteins, like ferritin and transferrin for iron, and ceruloplasmin for copper.

Test Lower limit Upper limit Unit* Comments
Sodium (Na) 135,[21] 137[10][22] 145,[10][22] 147[21] mmol/L or mEq/L[21] See hyponatremia or hypernatremia
310,[23] 320[23] 330,[23] 340[23] mg/dL
Potassium (K) 3.5,[10][21] 3.6[22] 5.0,[10][21][22] 5.1 mmol/L or mEq/L[21] See hypokalemia or hyperkalemia
14[24] 20[24] mg/dL
Chloride (Cl) 95,[21] 98,[25] 100[10] 105,[21] 106,[25] 110[10] mmol/L or mEq/L[21] See hypochloremia or hyperchloremia
340[26] 370[26] mg/dL
Ionized calcium (Ca) 1.03,[27] 1.10[10] 1.23,[27] 1.30[10] mmol/L See hypocalcaemia or hypercalcaemia
4.1,[28] 4.4[28] 4.9,[28] 5.2[28] mg/dL
Total calcium (Ca) 2.1,[21][29] 2.2[10] 2.5,[10][29] 2.6,[29] 2.8[21] mmol/L
8.4,[21] 8.5[30] 10.2,[21] 10.5[30] mg/dL
Total serum iron (TSI) – male 65,[31] 76[22] 176,[31] 198[22] μg/dL See hypoferremia or the following: iron overload (hemochromatosis), iron poisoning, siderosis, hemosiderosis, hyperferremia
11.6,[32][33] 13.6[33] 30,[32] 32,[33] 35[33] μmol/L
Total serum iron (TSI) – female 26,[22] 50[31] 170[22][31] μg/dL
4.6,[33] 8.9[32] 30.4[32] μmol/L
Total serum iron (TSI) – newborns 100[31] 250[31] μg/dL
18[33] 45[33] μmol/L
Total serum iron (TSI) – children 50[31] 120[31] μg/dL
9[33] 21[33] μmol/L
Total iron-binding capacity (TIBC) 240,[31] 262[22] 450,[31] 474[22] μg/dL
43,[33] 47[33] 81,[33] 85[33] μmol/L
Transferrin 190,[34] 194,[10] 204[22] 326,[10] 330,[34] 360[22] mg/dL
25[35] 45[35] μmol/L
Transferrin saturation 20[31] 50[31] %
Ferritin – Males and postmenopausal females 12[36] 300[36][37] ng/mL or μg/L
27[38] 670[38] pmol/L
Ferritin – premenopausal females 12[36] 150[36] – 200[37] ng/mL or μg/L
27[38] 330[38] – 440[38] pmol/L
Ammonia 10,[39] 20[40] 35,[39] 65[40] μmol/L See hypoammonemia and hyperammonemia
17,[41] 34[41] 60,[41] 110[41] μg/dL
Copper (Cu) 70[30] 150[30] μg/dL See hypocupremia or hypercupremia
11[42][43] 24[42] μmol/L
Ceruloplasmin 15[30] 60[30] mg/dL
1[44] 4[44] μmol/L
Phosphate (HPO₄²⁻) 0.8 1.5[45] mmol/L See hypophosphatemia or hyperphosphatemia
Inorganic phosphorus (serum) 1.0[21] 1.5[21] mmol/L
3.0[21] 4.5[21] mg/dL
Zinc (Zn) 60,[46] 72[47] 110,[47] 130[46] μg/dL See zinc deficiency or zinc poisoning
9.2,[48] 11[10] 17,[10] 20[48] μmol/L
Magnesium 1.5,[30] 1.7[49] 2.0,[30] 2.3[49] mEq/L or mg/dL See hypomagnesemia or hypermagnesemia
0.6,[50] 0.7[10] 0.82,[50] 0.95[10] mmol/L
  • Note: Although 'mEq' for mass and 'mEq/L' are sometimes used in the United States and elsewhere, they are not part of SI and are now considered redundant.

Acid–base and blood gases


If arterial/venous is not specified for an acid–base or blood gas value, then it generally refers to arterial, and not venous which otherwise is standard for other blood tests.[citation needed]

Acid–base and blood gases are among the few blood constituents that exhibit substantial difference between arterial and venous values.[13] Still, pH, bicarbonate and base excess show a high level of inter-method reliability between arterial and venous tests, so arterial and venous values are roughly equivalent for these.[51]

Test Arterial/Venous Lower limit Upper limit Unit
pH Arterial 7.34,[22] 7.35[21] 7.44,[22] 7.45[21]
Venous 7.31[52] 7.41[52]
[H+] Arterial 36[21] 44[21] nmol/L
3.6[53] 4.4[53] ng/dL
Base excess Arterial & venous[52] −3[52] +3[52] mEq/L
Oxygen partial pressure (pO2) Arterial pO2 10,[21] 11[54] 13,[54] 14[21] kPa
75,[21][22] 83[30] 100,[22] 105[21] mmHg or torr
Venous 4.0[54] 5.3[54] kPa
30[52] 40[52] mmHg or torr
Oxygen saturation Arterial 94,[52] 95,[25] 96[30] 100[25][30] %
Venous Approximately 75[25]
Carbon dioxide partial pressure (pCO2) Arterial PaCO2 4.4,[21] 4.7[54] 5.9,[21] 6.0[54] kPa
33,[21] 35[22] 44,[21] 45[22] mmHg or torr
Venous 5.5,[54] 6.8[54] kPa
41[52] 51[52] mmHg or torr
Absolute content of carbon dioxide (CO2) Arterial 23[52] 30[52] mmol/L
100[55] 132[55] mg/dL
Bicarbonate (HCO3) Arterial & venous 18[30] 23[30] mmol/L
110[56] 140[56] mg/dL
Standard bicarbonate (SBCe) Arterial & venous 21, 22[21] 27, 28[21] mmol/L or mEq/L[21]
134[56] 170[56] mg/dL

Liver function

Test Patient type Lower limit Upper limit Unit Comments
Total protein (TotPro) 60,[21] 63[22] 78,[21] 82,[22] 84[30] g/L See serum total protein Interpretation
Albumin 35[21][57] 48,[22] 55[21] g/L See hypoalbuminemia
3.5[22] 4.8,[22] 5.5[21] g/dL
540[58] 740[58] μmol/L
Globulins 23[21] 35[21] g/L
Total bilirubin 1.7,[59] 2,[21] 3.4,[59] 5[10] 17,[21][59] 22,[59] 25[10] μmol/L
0.1,[21] 0.2,[22] 0.29[60] 1.0,[21][30] 1.3,[22] 1.4[60] mg/dL
Direct/conjugated bilirubin 0.0[21] or N/A[10] 5,[21] 7[10][59] μmol/L
0[21][22] 0.3,[21][22] 0.4[30] mg/dL
Alanine transaminase (ALT/ALAT[10]) 5,[61] 7,[22] 8[21] 20,[21] 21,[25] 56[22] U/L Also called serum glutamic pyruvic transaminase (SGPT)
Female 0.15[10] 0.75[10] μkat/L
Male 0.15[10] 1.1[10]
Aspartate transaminase (AST/ASAT[10]) Female 6[62] 34[62] IU/L Also called
serum glutamic oxaloacetic transaminase (SGOT)
0.25[10] 0.60[10] μkat/L
Male 8[62] 40[62] IU/L
0.25[10] 0.75[10] μkat/L
Alkaline phosphatase (ALP) 0.6[10] 1.8[10] μkat/L
Female 42[61] 98[61] U/L
Male 53[61] 128[61]
Gamma glutamyl transferase (GGT) 5,[61] 8[22] 40,[61] 78[22] U/L
Female 0.63[63] μkat/L
Male 0.92[63] μkat/L

Cardiac tests

Test Patient type Lower limit Upper limit Unit Comments
Creatine kinase (CK) Male 24,[64] 38,[22] 60[61] 174,[30] 320[61] U/L or ng/mL
0.42[65] 1.5[65] μkat/L
Female 24,[64] 38,[22] 96[30] 140,[30] 200[61] U/L or ng/mL
0.17[65] 1.17[65] μkat/L
CK-MB 0 3,[22] 3.8,[10] 5[61] ng/mL or μg/L[10]
Myoglobin Female 1[66] 66[66] ng/mL or μg/L
Male 17[66] 106[66]
Cardiac troponin T (low sensitive) 0.1[14] ng/mL 99th percentile cutoff
Cardiac troponin I (high sensitive) 0.03[14] ng/mL 99th percentile cutoff
Cardiac troponin T (high sensitive) Male 0.022[14] ng/mL 99th percentile cutoff
Female 0.014[14] ng/mL 99th percentile cutoff
newborn/infants not established; higher than in adults[67][68]
Brain natriuretic peptide (BNP)
Interpretation Range / Cutoff
Congestive heart failure unlikely < 100 pg/mL[15][16]
"Gray zone" 100–500 pg/mL[15][16]
Congestive heart failure likely > 500 pg/mL[15][16]
NT-proBNP
Interpretation Age Cutoff
Congestive heart failure likely < 75 years > 125 pg/mL[69]
> 75 years > 450pg/mL[69]

Lipids

Test Patient type Lower limit Upper limit Unit Therapeutic target
Triglycerides 10–39 years 54[30] 110[30] mg/dL < 100 mg/dL[70]
or 1.1 mmol/L[70]
0.61[71] 1.2[71] mmol/L
40–59 years 70[30] 150[30] mg/dL
0.77[71] 1.7[71] mmol/L
> 60 years 80[30] 150[30] mg/dL
0.9[71] 1.7[71] mmol/L
Total cholesterol 3.0,[72] 3.6[21][72] 5.0,[10][73] 6.5[21] mmol/L < 3.9 mmol/L[70]
120,[22] 140[21] 200,[22] 250[21] mg/dL < 150 mg/dL[70]
HDL cholesterol Female 1.0,[74] 1.2,[10] 1.3[72] 2.2[74] mmol/L > 1.0[74] or 1.6[72] mmol/L
40[75] or 60[76] mg/dL
40,[75] 50[77] 86[75] mg/dL
HDL cholesterol Male 0.9[10][74] 2.0[74] mmol/L
35[75] 80[75] mg/dL
LDL cholesterol
(Not valid when
triglycerides >5.0 mmol/L)
2.0,[74] 2.4[73] 3.0,[10][73] 3.4[74] mmol/L < 2.5 mmol/L[74]
80,[75] 94[75] 120,[75] 130[75] mg/dL < 100 mg/dL[75]
LDL/HDL quotient n/a 5[10] (unitless)

Tumour markers

Test Patient type Cutoff Unit Comments
Alpha fetoprotein (AFP) 44[22] ng/mL or μg/L Hepatocellular carcinoma or testicular cancer
Beta human chorionic gonadotrophin (β-hCG) In males and non-pregnant females 5[22] IU/L or mU/mL choriocarcinoma
CA19-9 40[22] U/mL Pancreatic cancer
CA-125 30,[78] 35[79] kU/L or U/mL
Carcinoembryonic antigen (CEA) Non-smokers, 50 years 3.4,[10] 3.6[80] μg/L
Non-smokers, 70 years 4.1[80]
Smokers 5[81]
Prostate specific antigen (PSA) 40–49 years 1.2–2.9[82] μg/L[10][22] or ng/mL[30] More detailed cutoffs in PSA – Serum levels
70–79 years, non-African-American 4.0–9.0[82]
70–79 years, African-American 7.7–13[82]
PAP 3[30] units/dL (Bodansky units)
Calcitonin 5,[83] 15[83] ng/L or pg/mL Cutoff against medullary thyroid cancer[83]
More detailed cutoffs in Calcitonin article

Endocrinology


Thyroid hormones

Test Patient type Lower limit Upper limit Unit
Thyroid stimulating hormone
(TSH or thyrotropin)
Adults –
standard range
0.3,[10] 0.4,[22] 0.5,[30] 0.6[84] 4.0,[10] 4.5,[22] 6.0[30] mIU/L or μIU/mL
Adults –
optimal range
0.3,[85] 0.5[86] 2.0,[86] 3.0[85]
Infants 1.3[87] 19[87]
Free thyroxine (FT4)
Normal adult 0.7,[88] 0.8[22] 1.4,[88] 1.5,[22] 1.8[89] ng/dL
9,[10][90] 10,[91] 12[92] 18,[10][90] 23[92] pmol/L
Child/Adolescent
31 d – 18 y
0.8[88] 2.0[88] ng/dL
10[90] 26[90] pmol/L
Pregnant 0.5[88] 1.0[88] ng/dL
6.5[90] 13[90] pmol/L
Total thyroxine 4,[91] 5.5[22] 11,[91] 12.3[22] μg/dL
60[91][92] 140,[91] 160[92] nmol/L
Free triiodothyronine (FT3) Normal adult 0.2[91] 0.5[91] ng/dL
3.1[93] 7.7[93] pmol/L
Children 2-16 y 0.1[94] 0.6[94] ng/dL
1.5[93] 9.2[93] pmol/L
Total triiodothyronine 60,[22] 75[91] 175,[91] 181[22] ng/dL
0.9,[10] 1.1[91] 2.5,[10] 2.7[91] nmol/L
Thyroxine-binding globulin (TBG) 12[22] 30[22] mg/L
Thyroglobulin (Tg) 1.5[91] 30[91] pmol/L
1[91] 20[91] μg/L

Sex hormones


The diagrams below take inter-cycle and inter-woman variability into account in displaying reference ranges for estradiol, progesterone, FSH and LH.

Levels of estradiol (the main estrogen), progesterone, luteinizing hormone and follicle-stimulating hormone during the menstrual cycle.[95]
Test Patient type Lower limit Upper limit Unit
Dihydrotestosterone adult male 30 85 ng/dL
Testosterone Male, overall 8,[96] 10[97] 27,[96] 35[97] nmol/L
230,[98] 300[99] 780–1000[98][99] ng/dL
Male < 50 years 10[10] 45[10] nmol/L
290[98] 1300[98] ng/dL
Male > 50 years 6.2[10] 26[10] nmol/L
180[98] 740[98] ng/dL
Female 0.7[97] 2.8–3.0[97][10] nmol/L
20[99] 80–85[99][98] ng/dL
17α-Hydroxyprogesterone male 0.06[30] 3.0[30] μg/L
0.18[100] 9.1[100] nmol/L
Female (Follicular phase) 0.2[30] 1.0[30] μg/L
0.6[100] 3.0[100] nmol/L
Follicle-stimulating
hormone
(FSH)
Prepubertal <1[101] 3[101] IU/L
Adult male 1[101] 8[101]
Adult female (follicular
and luteal phase)
1[101] 11[101]
Adult female (Ovulation) 6[101] (95% PI, standard) 26[101] (95% PI)
5[102] (90% PI, used in diagram) 15[102] (90% PI)
Post-menopausal female 30[101] 118[101]
Luteinizing hormone (LH) Female, peak 20[102] (90% PI, used in diagram) 75[102] (90% PI) IU/L
Female, post-menopausal 15[103] 60[103]
Male aged 18+ 2[104] 9[104]
Estradiol
(an estrogen)
Adult male 50[105] 200[105] pmol/L
14[106] 55[106] pg/mL
Adult female (day 5 of follicular phase,
and luteal phase)
70[105] 500,[105] 600[105] pmol/L
19[106] 140,[106] 160[106] pg/mL
Adult female – free (not protein bound) 0.5[107] 9[107] pg/mL
1.7[107] 33[107] pmol/L
Post-menopausal female N/A[105] < 130[105] pmol/L
N/A[106] < 35[106] pg/mL
Progesterone
Female in mid-luteal phase (day 21–23) 17,[102] 35[108] 92[108] nmol/L
6,[102] 11[109] 29[109] ng/mL
Androstenedione Adult male and female 60[103] 270[103] ng/dL
Post-menopausal female < 180[103]
Prepubertal < 60[103]
Dehydroepiandrosterone sulfate Adult male and female 30[110] 400[110] μg/dL
SHBG
Adult female 40[111] 120[111] nmol/L
Adult male 20[111] 60[111]
Anti-Müllerian hormone (AMH)
13–45 years 0.7[112] 20[112] ng/mL
5[113] 140[113] pmol/L

Other hormones

Test Patient type Lower limit Upper limit Unit
Adrenocorticotropic hormone (ACTH) 2.2[114] 13.3[114] pmol/L
20[22] 100[22] pg/mL
Cortisol 09:00 am 140[115] 700[115] nmol/L
5[116] 25[116] μg/dL
Midnight 80[115] 350[115] nmol/L
2.9[116] 13[116] μg/dL
Growth hormone (fasting) 0 5[21] ng/mL
Growth hormone (arginine stimulation) 7[21] n/a ng/mL
IGF-1
Female, 20 yrs 110[117] 420[117] ng/mL
Female, 75 yrs 55[117] 220[117]
Male, 20 yrs 160[117] 390[117]
Male, 75 yrs 48[117] 200[117]
Prolactin
Female 71,[118] 105[118] 348,[118] 548[118] mIU/L
3.4,[118] 3.9[118] 16.4,[118] 20.3[118] μg/L
Male 58,[118] 89[118] 277,[118] 365[118] mIU/L
2.7,[118] 3.3[118] 13.0,[118] 13.5[118] μg/L
Parathyroid hormone (PTH) 10,[119] 17[120] 65,[119] 70[120] pg/mL
1.1,[10] 1.8[121] 6.9,[10] 7.5[121] pmol/L
25-hydroxycholecalciferol (a vitamin D)
Standard reference range
8,[30][122] 9[122] 40,[122] 80[30] ng/mL
20,[123] 23[124] 95,[124] 150[123] nmol/L
25-hydroxycholecalciferol
Therapeutic target range
30,[125] 40[126] 65,[126] 100[125] ng/mL
85,[70] 100[126] 120,[70] 160[126] nmol/L
Plasma renin activity 0.29,[127] 1.9[128] 3.7[127][128] ng/(mL·h)
3.3,[129] 21[130] 41[129][130] mcU/mL
Aldosterone
Adult 19,[129] 34.0[129] ng/dL
530,[131] 940[131] pmol/L
Aldosterone-to-renin ratio
Adult 13.1,[132] 35.0[132] ng/dL per ng/(mL·h)
360,[132] 970[132] pmol/liter per μg/(L·h)

Vitamins


Also including the vitamin B12-related amino acid homocysteine.

Test Patient type Standard range (lower limit / upper limit) Optimal range (lower limit / upper limit) Unit
Vitamin A 30[30] 65[30] μg/dL
Vitamin B9
(Folic acid/Folate) – Serum
Age > 1 year 3.0[133] 16[133] 5[134] ng/mL or μg/L
6.8[135] 36[135] 11[135] nmol/L
Vitamin B9
(Folic acid/Folate) – Red blood cells
200[133] 600[133] ng/mL or μg/L
450[135] 1400[135] nmol/L
Pregnant 400[133] ng/mL or μg/L
900[133] nmol/L
Vitamin B12 (Cobalamin) 130,[136] 160[137] 700,[136] 950[137] ng/L
100,[138] 120[10] 520,[138] 700[10] pmol/L
Homocysteine
3.3,[139] 5.9[139] 7.2,[139] 15.3[139] 6.3[70] μmol/L
45,[140] 80[140] 100,[140] 210[140] 85[70] μg/dL
Vitamin C (Ascorbic acid) 0.4[30] 1.5[30] 0.9[70] mg/dL
23[141] 85[141] 50[70] μmol/L
25-hydroxycholecalciferol (a vitamin D) 8,[30][122] 9[122] 40,[122] 80[30] 30,[125] 40[126] 65,[126] 100[125] ng/mL
20,[123] 23[124] 95,[124] 150[123] 85,[70] 100[126] 120,[70] 160[126] nmol/L
Vitamin E 28[70] μmol/L
1.2[70] mg/dL

Toxic Substances

Test Limit type Limit Unit
Lead Optimal health range < 20[25] or 40[30] μg/dL
Blood ethanol content Limit for drunk driving 0,[142] 0.2,[142] or 0.8[142] g/L
17.4[143] mmol/L

Hematology


Red blood cells


These values (except Hemoglobin in plasma) are for total blood and not only blood plasma.

Test Patient Lower limit Upper limit Unit Comments
Hemoglobin (Hb) Male 2.0,[144] 2.1[21][145] 2.5,[144] 2.7[21][145] mmol/L Higher in neonates, lower in children.
130,[10] 132,[22] 135[21] 162,[22] 170,[10] 175[21] g/L
Female 1.8,[144] 1.9[21][145] 2.3,[144] 2.5[21][144][145] mmol/L Sex difference negligible until adulthood.
120[10][21][22] 150,[10] 152,[22] 160[21][30] g/L
Hemoglobin subunits (sometimes displayed simply as "Hemoglobin") Male 8.0,[146] 8.4[146] 10.0,[146] 10.8[146] mmol/L 4 per hemoglobin molecule
Female 7.2,[146] 7.6[146] 9.2,[146] 10.0[146]
Hemoglobin in plasma 0.16[21] 0.62[21] μmol/L Normally negligible compared with the amount inside red blood cells
1 4 mg/dL
Glycated hemoglobin (HbA1c) < 50 years 3.6[10] 5.0[10] % of Hb
> 50 years 3.9[10] 5.3[10]
Haptoglobin < 50 years 0.35[10] 1.9[10] g/L
> 50 years 0.47[10] 2.1[10]
Hematocrit (Hct) Male 0.39,[10] 0.4,[22] 0.41,[21] 0.45[30] 0.50,[10] 0.52,[22] 0.53,[21] 0.62[30] L/L
Female 0.35,[10] 0.36,[21] 0.37[22][30] 0.46,[10][21][22] 0.48[30] L/L
Child 0.31[22] 0.43[22] L/L
Mean corpuscular volume (MCV) Male 76,[30] 82[22] 100,[30] 102[22] fL Cells are larger in neonates, though smaller in other children.
Female 78[22] 101[22] fL
Red blood cell distribution width (RDW) 11.5[22] 14.5[22] %
Mean cell hemoglobin (MCH) 0.39[21] 0.54[21] fmol/cell
25,[21] 27[10][30] 32,[30] 33,[10] 35[21] pg/cell
Mean corpuscular hemoglobin concentration (MCHC) 4.8,[147] 5.0[147] 5.4,[147] 5.6[147] mmol/L
31,[22] 32[10][30] 35,[22] 36[10][30] g/dL or %[note 1]
Erythrocytes/Red blood cells (RBC) Male 4.2,[30] 4.3[10][21][22] 5.7,[10] 5.9,[21] 6.2,[22] 6.9[30] x1012/L
or
million/mm3
Female 3.5,[21] 3.8,[22] 3.9[10] 5.1,[10] 5.5[21][22]
Infant/Child 3.8[22] 5.5[22]
Reticulocytes Adult 26[10] 130[10] x109/L
0.5[21][22] 1.5[21][22] % of RBC
Newborn 1.1[22] 4.5[22] % of RBC
Infant 0.5[22] 3.1[22] % of RBC
Immature reticulocyte fraction (IRF) Adult 1.6[148] 12.1[148] % of reticulocytes
Reticulocyte hemoglobin equivalent Adult 30.0[148] 37.6[148] %
24.1[149] 35.8[149] pg
Immature platelet fraction (IPF) Adult 0.8[148] 5.6[148] %

White blood cells


These values are for total blood and not only blood plasma.

Test Patient type Lower limit Upper limit Unit
White Blood Cell Count (WBC) Adult 3.5,[10] 3.9,[150] 4.1,[22] 4.5[21] 9.0,[10] 10.0,[150] 10.9,[22] 11[21]
  • x109/L
  • x103/mm3 or
  • x103/μL
Newborn 9[151] 30[151]
1 year old 6[151] 18[151]
Neutrophil granulocytes
(A.K.A. grans, polys, PMNs, or segs)
Adult 1.3,[10] 1.8,[150] 2[151] 5.4,[10] 7,[150] 8[151] x109/L
45–54[21] 62,[21] 74 % of WBC
Newborn 6[151] 26[151] x109/L
Neutrophilic band forms Adult 0.7[151] x109/L
3[21] 5[21] % of WBC
Lymphocytes Adult 0.7,[10] 1.0[150][151] 3.5,[150] 3.9,[10] 4.8[151] x109/L
16–25[21] 33,[21] 45 % of WBC
Newborn 2[151] 11[151] x109/L
Monocytes Adult 0.1,[10] 0.2[152][153] 0.8[10][151][153] x109/L
3,[21] 4.0 7,[21] 10 % of WBC
Newborn 0.4[151] 3.1[151] x109/L
Mononuclear leukocytes
(Lymphocytes + monocytes)
Adult 1.5 5 x109/L
20 35 % of WBC
CD4+ T cells Adult 0.4,[22] 0.5[25] 1.5,[25] 1.8[22] x109/L
Eosinophil granulocytes Adult 0.0,[10] 0.04[153] 0.44,[153] 0.45,[151] 0.5[10] x109/L
1[21] 3,[21] 7 % of WBC
Newborn 0.02[151] 0.85[151] x109/L
Basophil granulocytes Adult 40[150] 100,[10][153] 200,[151] 900[150] x106/L
0.0 0.75,[21] 2 % of WBC
Newborn 0.64[151] x109/L

Coagulation

Test Lower limit Upper limit Unit Comments
Thrombocyte/Platelet count (Plt) 140,[22] 150[10][21] 350,[10][30] 400,[21] 450[22] x109/L or
x1000/μL
Mean platelet volume (MPV) 7.2,[154] 7.4,[155] 7.5[156] 10.4,[155] 11.5,[156] 11.7[154] fL
Prothrombin time (PT) 10,[25] 11,[21][157] 12[22] 13,[25] 13.5,[157] 14,[22] 15[21] s PT reference varies between laboratory kits – INR is standardised
INR 0.9[10] 1.2[10] The INR is a corrected ratio of a patient's PT to normal
Activated partial thromboplastin time (APTT) 18,[22] 30[10][25] 28,[22] 42,[10] 45[25] s
Thrombin clotting time (TCT) 11 18 s
Fibrinogen 1.7,[22] 2.0[10] 3.6,[10] 4.2[22] g/L
Antithrombin 0.80[10] 1.2[10] kIU/L
0.15,[158] 0.17[159] 0.2,[158] 0.39[159] mg/mL
Bleeding time 2 9 minutes
Viscosity 1.5[160] 1.72[160] cP

Immunology


Acute phase proteins


Acute phase proteins are markers of inflammation.

Test Patient Lower limit Upper limit Unit Comments
Erythrocyte sedimentation rate
(ESR)
Male 0 Age÷2[161] mm/h ESR increases with age and tends to be higher in females.[162] A worked example of these age formulas follows the table.
Female (Age+10)÷2[161]
C-reactive protein (CRP) 5,[10][163] 6[164] mg/L
200,[165] 240[165] nmol/L
Alpha 1-antitrypsin (AAT) 20,[166] 22[167] 38,[167] 53[166] μmol/L
89,[168] 97[10] 170,[10] 230[168] mg/dL
Procalcitonin 0.15[169] ng/mL or μg/L
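The age-dependent ESR upper limits quoted in the table above can be expressed as a small calculation; this is only a sketch of the quoted rule of thumb, not a diagnostic tool.

```python
def esr_upper_limit(age_years, sex):
    """Age-adjusted upper reference limit for the ESR (mm/h), following the
    rule quoted in the table above: age/2 for men, (age + 10)/2 for women."""
    if sex == "male":
        return age_years / 2
    return (age_years + 10) / 2

print(esr_upper_limit(60, "male"))    # 30 mm/h
print(esr_upper_limit(60, "female"))  # 35 mm/h
```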

Isotypes of antibodies

Test Patient Lower limit Upper limit Unit
IgA Adult 70,[10] 110[170] 360,[10] 560[170] mg/dL
IgD 0.5[170] 3.0[170]
IgE 0.01[170] 0.04[170]
IgG 800[170] 1800[170]
IgM 54[170] 220[170]

Autoantibodies


Autoantibodies are usually absent or very low, so instead of being given in standard reference ranges, the values usually denote where they are said to be present, or whether the test is a positive test. There may also be an equivocal interval, where it is uncertain whether there is a significantly increased level.

Test Negative Equivocal Positive Unit
anti-SS-A (Ro) < 1.0[171] n/a ≥ 1.0[171] Units (U)
anti-SS-B (La) < 1.0[172] n/a ≥ 1.0[172]
Anti ds-DNA < 30.0[173] 30.0–75.0[173] > 75.0[173] International Units per millilitre (IU/mL)
Anti ss-DNA < 8[174] 8–10[174] > 10[174] Units per millilitre (U/mL)
Anti-histone antibodies < 25[174] n/a[174] > 25[174]
Cytoplasmic anti-neutrophil
cytoplasmic antibodies (c-ANCA)
< 20[174] 21–30[174] > 30[174]
Perinuclear anti-neutrophil
cytoplasmic antibodies (p-ANCA)
< 5[174] n/a > 5[174]
Anti-mitochondrial antibodies (AMA) < 0.1[175] 0.1-0.9[175] ≥ 1.0[175] Units (U)
Rheumatoid factor (RF) < 20 20–30 > 30[22] Units per millilitre (U/mL)
Antistreptolysin O titre (ASOT) in
preschoolers
> 100
ASOT at school age > 250[22]
ASOT in adults > 125[22]
Test Negative Low/weak positive Moderate positive High/strong positive Unit
Anti-phospholipid IgG < 20[174] 20–30[174] 31–50[174] > 51[174] GPLU/mL[174]
Anti-phospholipid IgM < 1.5[174] 1.5–2.5[174] 2–9.9[174] > 10[174] MPL /mL[174]
Anti-phospholipid IgA < 10[174] 10–20[174] 21–30[174] > 31[174] arb U/mL[174]
Anti-citrullinated protein antibodies < 20[174] 20–39[174] 40–59[174] > 60[174] EU[174]

Other immunology

Test Lower limit Upper limit Unit
Serum free light chains (FLC): kappa/lambda ratio 0.26[176] 1.65[176] (unitless)

Other enzymes and proteins

Test Lower limit Upper limit Unit Comments
Serum total protein 60,[21] 63[22] 78,[21] 82,[22] 84[30] g/L
Lactate dehydrogenase (LDH) 50[30] 150[30] U/L
0.4[61] 1.7[61] μmol/L
1.8[10] 3.4[10] μkat/L < 70 years old[10]
Amylase 25,[21] 30,[22] 53[30] 110,[22] 120,[177] 123,[30] 125,[21] 190[61] U/L
0.15[10] 1.1[10] μkat/L
D-dimer
n/a 500[178] ng/mL Higher in pregnant women[179]
0.5[10] mg/L
Lipase 7,[22] 10,[30] 23[61] 60,[22] 150,[30] 208[61] U/L
Angiotensin-converting enzyme (ACE) 23[61] 57[61] U/L
Acid phosphatase 3.0[61] ng/mL
Eosinophil cationic protein (ECP) 2.3[10] 16[10] μg/L

Other electrolytes and metabolites


Electrolytes and metabolites: For iron and copper, some related proteins are also included.

Test Patient type Lower limit Upper limit Unit Comments
Osmolality 275,[21] 280,[30] 281[10] 295,[21] 296,[30] 297[10] mOsm/kg Plasma weight excludes solutes
Osmolarity Slightly less than osmolality mOsm/L Plasma volume includes solutes
Urea 3.0[180] 7.0[180] mmol/L BUN – blood urea nitrogen
7[21] 18,[21] 21[22] mg/dL
Uric acid[22] 0.18[21] 0.48[21] mmol/L
Female 2.0[30] 7.0[30] mg/dL
Male 2.1[30] 8.5[30] mg/dL
Creatinine Male 60,[10] 68[181] 90,[10] 118[181] μmol/L May be complemented with creatinine clearance
0.7,[182] 0.8[182] 1.0,[182] 1.3[182] mg/dL
Female 50,[10] 68[181] 90,[10] 98[181] μmol/L
0.6,[182] 0.8[182] 1.0,[182] 1.1[182] mg/dL
BUN/Creatinine Ratio 5[30] 35[30]
Plasma glucose (fasting) 3.8,[21] 4.0[10] 6.0,[10] 6.1[183] mmol/L See also glycated hemoglobin (in hematology)
65,[22] 70,[21] 72[184] 100,[183] 110[30] mg/dL
Full blood glucose (fasting) 3.3[10] 5.6[10] mmol/L
60[184] 100[184] mg/dL
Random glucose 3.9[185] 7.8[185] mmol/L
70[186] 140[186] mg/dL
Lactate (Venous) 4.5[30] 19.8[30] mg/dL
0.5[187] 2.2[187] mmol/L
Lactate (Arterial) 4.5[30] 14.4[30] mg/dL
0.5[187] 1.6[187] mmol/L
Pyruvate 300[30] 900[30] μg/dL
34[188] 102[188] μmol/L
Ketones 1[189] mg/dL
0.1[189] mmol/L

Medication

Test Lower limit Upper limit Unit Comments
Digoxin 0.5[190] 2.0[190] ng/mL Narrow therapeutic window
0.6[190] 2.6[190] nmol/L
Lithium 0.4,[191] 0.5,[192][193] 0.8[194] 1.3[192][193] mmol/L Narrow therapeutic window
Paracetamol 30[195] mg/L Risk of paracetamol toxicity at higher levels
200[195] μmol/L

from Grokipedia
Reference ranges for blood tests are standardized sets of upper and lower limits for measurements of blood components, such as electrolytes, enzymes, and blood cells, established to represent normal values in healthy individuals and aid in the interpretation of test results. These ranges provide a benchmark for clinicians to assess whether a patient's results fall within expected norms, helping to detect deviations that may indicate disease, monitor treatment efficacy, or guide further diagnostic steps. Typically encompassing the central 95% of values from a reference population—calculated as the mean plus or minus two standard deviations—they exclude the outermost 2.5% at each end to account for natural variation in healthy people.

The determination of reference ranges involves rigorous statistical analysis of blood samples from a large cohort of at least 120 healthy volunteers selected to match the target population in terms of age, sex, ethnicity, and other relevant factors. Laboratories validate these ranges using additional samples, often 20 to 40, to ensure applicability to their specific testing methods and equipment, as required by accrediting bodies such as the College of American Pathologists (CAP) or under the Clinical Laboratory Improvement Amendments (CLIA). Variations in reference ranges arise from differences in analytical techniques, reagents, and instrumentation across labs, as well as physiological influences such as pregnancy, altitude, or time of day, necessitating the use of lab-specific ranges printed on reports. For example, owing to physiological adaptation to hypoxia at high altitudes, hemoglobin reference ranges are higher in locations like Quito, Ecuador (~2,850 m above sea level) than in sea-level Guayaquil. Some reported ranges in Quito include 14.9–18.3 g/dL for men and 12.7–16.2 g/dL for women (with means around 16.7 g/dL and 14.5 g/dL), while sea-level ranges are typically lower, such as ~13–17 g/dL for men and ~12–15 g/dL for women. Many laboratories in Ecuador use standardized ranges without altitude adjustment, but studies recommend establishing local, altitude-specific ranges for accurate clinical interpretation.

While reference ranges are essential for clinical decision-making, they are not absolute diagnostic tools; the central-95% design means approximately 5% of healthy individuals will have at least one result outside the range for any given test parameter due to inherent biological variability. This per-parameter rate substantially elevates the likelihood of outliers in comprehensive panels, such as complete blood counts (10–15 parameters) or comprehensive metabolic panels (~14 parameters), even among healthy 35-year-olds using standard adult ranges without specific adjustments. Minor, transient, or isolated abnormalities—often influenced by hydration, diet, stress, or lab variations—are typically inconsequential in the absence of symptoms, with clinical correlation, patterns, or repeat testing guiding interpretation. For common blood tests like complete blood counts or metabolic panels, ranges differ by demographic group—for instance, hemoglobin levels are higher in adult males (13.5–17.5 g/dL) than in females (12.0–16.0 g/dL)—highlighting the need for personalized interpretation. Ongoing research and updates to these ranges reflect advances in laboratory technology and broader population data to enhance accuracy and relevance.
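The effect of testing many parameters at once can be illustrated with a short calculation; the independence assumption below is a simplification, since many panel results are correlated.

```python
def prob_at_least_one_flag(n_tests, per_test_rate=0.05):
    """Probability that a healthy person has at least one result outside the
    central-95% reference range on an n-test panel, assuming (simplistically)
    that the individual tests are independent."""
    return 1 - (1 - per_test_rate) ** n_tests

for n in (1, 5, 14, 20):
    print(n, round(prob_at_least_one_flag(n), 2))
# A ~14-test panel gives roughly a 0.51 probability of at least one "abnormal" flag
# in a healthy person under these assumptions.
```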

Common Reference Ranges for Selected Blood Tests

The following provides approximate reference ranges for selected common blood tests. These values are illustrative only and may vary depending on the laboratory, age, sex, ethnicity, and other individual factors. Laboratories use their own validated ranges, and results must always be interpreted by a qualified physician in the context of the patient's clinical presentation and history.
  • Fasting blood glucose (FBS): 70-99 mg/dL
  • Hemoglobin (Hb): Men 13.5-17.5 g/dL, Women 12-15.5 g/dL
  • White blood cells (WBC): 4000-11000 per microliter
  • Red blood cells (RBC): Men 4.5-5.9 million per microliter, Women 4.1-5.1 million per microliter
  • Hematocrit (Hct): Men 41-50%, Women 36-44%
  • Platelets (PLT): 150000-450000 per microliter
  • Blood urea nitrogen (BUN): 7-20 mg/dL
  • Creatinine (Cr): 0.6-1.2 mg/dL (men slightly higher)
  • Total cholesterol: less than 200 mg/dL
  • Triglycerides: less than 150 mg/dL
  • Thyroid-stimulating hormone (TSH): 0.4-4.0 mIU/L
For example, the haematology reference ranges used by Gloucestershire Hospitals NHS Foundation Trust (UK), last updated in October 2025, include: Haemoglobin (adult male) 130–180 g/L, (adult female) 115–165 g/L; Total White Cell Count (adult) 3.6–11.0 ×10⁹/L; Platelet count (adult) 140–400 ×10⁹/L; Red cell count (adult male) 4.50–6.50 ×10¹²/L, (adult female) 3.80–5.80 ×10¹²/L. These are provided in SI units and may differ slightly from other sources due to methodological variations; full details are published by the trust.

General Principles

Definition and Purpose

Reference ranges, also known as reference intervals, represent the central 95% of laboratory test values obtained from a defined healthy reference population, typically encompassing the interval between the 2.5th and 97.5th percentiles of the distribution. This interval is often approximated as the mean plus or minus two standard deviations for analytes that follow a Gaussian distribution, providing a statistical benchmark for what constitutes "normal" results in clinical practice. The selection of the reference population is critical, involving the exclusion of outliers and individuals with conditions that could skew the data, ensuring the range reflects physiological norms rather than pathological states.

The primary purpose of reference ranges is to facilitate the interpretation of test results by identifying deviations that may indicate disease, thereby supporting clinical diagnosis, monitoring, and screening programs. In practice, these ranges help clinicians determine whether a patient's levels fall within expected physiological bounds, guiding decisions on further testing, treatment initiation, or therapeutic adjustments. For instance, values outside the interval prompt evaluation for underlying conditions, while those within it provide reassurance, though clinical context remains essential due to inherent biological variability.

Historically, reference ranges for blood tests were standardized in the late 1960s through large-scale population studies, with significant contributions from the National Health and Nutrition Examination Survey (NHANES) in the United States beginning in the 1970s. NHANES I (1971–1975) and subsequent cycles provided extensive data on healthy populations, enabling the establishment of robust, nationally representative intervals that have been updated periodically to account for demographic shifts and analytical advancements. These efforts underscored the need for ongoing refinement, as modern ranges incorporate larger, more diverse datasets to enhance applicability across populations.

Key statistical concepts underpinning reference ranges include measures of central tendency, such as the mean or median, which indicate the typical value in the reference population, and dispersion metrics like the standard deviation or percentiles that define the interval's width. Outlier exclusion protocols, often based on statistical tests or predefined criteria, ensure the integrity of the distribution by removing extreme values that do not represent the healthy cohort. While factors like age, sex, and ethnicity influence these ranges, the core methodology prioritizes a parametric or non-parametric approach to derive reliable intervals for clinical use.
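A minimal sketch of the non-parametric estimate described above (the 2.5th and 97.5th percentiles of results from a qualified reference group of at least 120 individuals) might look like the following; the simulated sodium values are purely illustrative.

```python
import random
import statistics

def nonparametric_reference_interval(values):
    """Central 95% of the reference distribution: 2.5th and 97.5th percentiles."""
    cuts = statistics.quantiles(values, n=40)  # cut points every 2.5%
    return cuts[0], cuts[-1]

# Illustrative only: 200 simulated healthy-donor sodium results (mmol/L)
random.seed(0)
sodium = [random.gauss(140, 2.5) for _ in range(200)]
lo, hi = nonparametric_reference_interval(sodium)
print(f"{lo:.0f}-{hi:.0f} mmol/L")   # roughly 135-145 mmol/L
```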

Sample Collection and Processing

Blood samples for laboratory testing are categorized into three primary types: whole blood, serum, and plasma, each suited to specific analytes to ensure accurate measurement and maintain reference range validity. Whole blood consists of cellular components such as red blood cells, white blood cells, and platelets suspended in plasma, and it is used without separation for tests like complete blood counts or blood gas analysis. Serum is the liquid portion obtained after allowing whole blood to clot and then centrifuging to remove the clot and cells; it lacks fibrinogen and other clotting factors, making it appropriate for assays such as electrolyte panels, liver enzymes, and hormone levels like thyroid-stimulating hormone. Plasma, in contrast, is derived from anticoagulated whole blood that is centrifuged immediately to separate cells, retaining fibrinogen and clotting factors; it is essential for tests including coagulation studies (using citrate anticoagulant) and certain cardiac markers like troponin (using heparin). The choice of sample type directly influences analyte stability and test performance, as interchanging them can lead to erroneous results due to differences in matrix composition.

Collection methods vary by site and purpose, with venous sampling being the most common for routine blood tests due to its reliability in obtaining sufficient volume. Venous blood is typically drawn via venipuncture from arm veins using evacuated tubes or syringes, with a tourniquet applied proximal to the site for no longer than 1 minute to minimize hemoconcentration and ensure patient comfort. Arterial sampling, performed for blood gas analysis to assess oxygen and carbon dioxide levels, involves direct puncture of an artery (e.g., radial) and requires specialized handling to avoid air bubbles that could alter pH and gas measurements. Capillary sampling, obtained by skin puncture (e.g., heel or fingertip lancet), is preferred for pediatric patients or settings where small volumes suffice, such as glucose monitoring, but it mixes arterial, venous, and interstitial fluids, potentially introducing variability. Fasting for 8-12 hours is often required prior to collection for tests sensitive to dietary influences, like lipid profiles or glucose, to prevent lipemia that obscures spectrophotometric readings.

Post-collection processing is critical to isolate serum or plasma while preserving integrity, beginning with allowing serum samples to clot at room temperature (20-25°C) for 30-60 minutes to form a stable clot. Both serum and plasma samples are then centrifuged to separate the liquid phase from cellular elements, typically at 1,000-2,000 × g for 10 minutes at 18-25°C, using a refrigerated centrifuge if temperature-sensitive analytes are involved. Whole blood samples bypass centrifugation for immediate analysis, such as in hematology or blood gas analyzers. To prevent hemolysis—rupture of red blood cells—tubes must be handled gently, kept upright during transport, and processed promptly without excessive shaking or temperature extremes. Following separation, samples are transferred to secondary containers to avoid contamination.

Storage conditions must align with analyte stability to uphold reference range applicability, with separated serum or plasma stable at room temperature (20-25°C) for up to 8 hours, refrigerated (2-8°C) for 24-48 hours, and frozen (≤ -20°C) for longer periods, though repeated freeze-thaw cycles should be avoided. Whole blood is generally not refrigerated unless specified, as chilling can induce metabolic changes in certain tests. Improper handling can introduce interferences like hemolysis (releasing intracellular contents such as potassium), lipemia (turbidity from lipids masking absorbance), or icterus (bilirubin absorbing at assay wavelengths), which compromise result accuracy. Laboratories often reject samples exceeding interference thresholds, such as a hemolysis index greater than 50 (indicating free hemoglobin >50 mg/dL), to prevent reporting biased values that deviate from established reference ranges.

Units of Measurement

In laboratory medicine, blood test results are reported using either the International System of Units (SI), which expresses concentrations in molar terms such as millimoles per liter (mmol/L) for analytes like electrolytes and glucose, or conventional units, which often use mass concentrations such as milligrams per deciliter (mg/dL) for glucose and cholesterol. The SI system was developed to provide a coherent framework for scientific measurements, emphasizing chemical relationships and molar quantities to facilitate international comparability, while conventional units stem from historical practices in clinical reporting that prioritize familiarity in certain regions. Many countries widely adopted SI units for laboratory reporting starting in the 1980s, following recommendations from the International Federation of Clinical Chemistry and Laboratory Medicine (IFCC), to enhance comparability and reduce errors in data interpretation across borders.

Conversion between these unit systems is essential for harmonizing results, particularly when comparing data from diverse laboratories or guidelines. For instance, to convert glucose from mg/dL to mmol/L, multiply the value by 0.0555 (or equivalently, divide by 18, as the molecular weight of glucose is approximately 180 g/mol). Similar multipliers apply to other key analytes, as shown in the table below for common blood tests. These factors are derived from molecular weights and ensure accurate translation without altering the clinical meaning.
Analyte Conventional unit Conversion factor to SI Example SI unit
Glucose mg/dL × 0.0555 mmol/L
Cholesterol mg/dL × 0.0259 mmol/L
BUN (to urea) mg/dL × 0.357 mmol/L
Creatinine mg/dL × 88.4 µmol/L
Calcium mg/dL × 0.25 mmol/L
Sodium mEq/L × 1 (already equivalent) mmol/L
For enzymes, activity is traditionally reported in international units per liter (IU/L or U/L), where 1 IU represents the amount of enzyme catalyzing the conversion of 1 micromole of substrate per minute under defined conditions, but the SI unit is the katal (kat), defined as the conversion of 1 mole of substrate per second. The conversion is 1 kat = 6 × 10^7 IU, making submultiples like the nanokatal (nkat) more practical for clinical use (1 IU = 16.67 nkat). The IFCC has advocated for katal adoption to align with SI principles, though IU/L remains prevalent in enzyme assays. Despite these standards, challenges persist due to country-specific preferences, such as the United States retaining mg/dL for analytes like glucose and cholesterol to align with established clinical guidelines, which can complicate global harmonization and patient care. The IFCC, through initiatives like the International Consortium for Harmonization of Clinical Laboratory Results, continues efforts to promote uniform reporting units worldwide, including traceability to reference materials and education on conversions to minimize discrepancies.
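A small sketch of such conversions, using the factors from the table above; the dictionary keys and function are illustrative rather than any standard library API.

```python
# Conversion factors from the table above (conventional unit -> SI unit).
TO_SI = {
    "glucose": 0.0555,       # mg/dL -> mmol/L
    "cholesterol": 0.0259,   # mg/dL -> mmol/L
    "urea_nitrogen": 0.357,  # mg/dL (BUN) -> mmol/L urea
    "creatinine": 88.4,      # mg/dL -> umol/L
    "calcium": 0.25,         # mg/dL -> mmol/L
    "sodium": 1.0,           # mEq/L -> mmol/L (monovalent ion)
}

def to_si(analyte, conventional_value):
    """Multiply a conventional-unit result by its factor to obtain SI units."""
    return conventional_value * TO_SI[analyte]

print(to_si("glucose", 90))      # 90 mg/dL  -> ~5.0 mmol/L
print(to_si("creatinine", 1.0))  # 1.0 mg/dL -> 88.4 umol/L
```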

Conventional vs Optimal Ranges

Conventional reference ranges for blood tests are statistically derived from the central 95% of values (2.5th to 97.5th percentiles) observed in a presumably healthy reference population, providing a benchmark for identifying potential abnormalities in clinical settings. These ranges do not guarantee that values within them reflect optimal health, nor do values outside them always indicate disease, as the distributions of healthy and diseased states often overlap. A key limitation is their potential to overlook subclinical issues, where an individual's result may fall within the range but deviate from their personal baseline, signaling early dysfunction that trend monitoring could detect more effectively.

In preventive medicine, optimal ranges represent narrower intervals linked to peak physiological function and minimized disease risk, often informed by longitudinal studies assessing health outcomes rather than mere population norms. For instance, lipid data from longitudinal studies and supporting trials indicate that low-density lipoprotein (LDL) cholesterol levels of 50-70 mg/dL correlate with reduced atherosclerosis and coronary heart disease events, in contrast to broader conventional targets below 100 mg/dL. While the Endocrine Society previously recommended optimal serum 25-hydroxyvitamin D (25(OH)D) levels of 30-50 ng/mL (2011) for broader benefits like immune function and fracture prevention, its 2024 guideline for disease prevention aligns with the Institute of Medicine (IOM), defining sufficiency at ≥20 ng/mL for healthy adults under 75 without routine higher targets. These optimal targets are applied in wellness contexts to guide interventions aimed at longevity and performance, rather than solely diagnosing overt pathology. As of 2025, updates like the ESC/EAS lipid guidelines maintain aggressive LDL-C goals below 70 mg/dL (or <55 mg/dL) for high-risk groups, emphasizing risk-based personalization.

Despite their utility, optimal ranges face controversies due to a lack of universal standardization, as recommendations vary by organization and outcome measured, complicating clinical adoption. Evidence from meta-analyses, such as those on vitamin D supplementation, suggests that achieving levels above conventional thresholds may yield benefits in deficient populations, though the optimal 25(OH)D level remains debated. For lipids, trials aggregated in reviews show superior cardiovascular event reduction with levels below conventional thresholds, underscoring the value of functional approaches while highlighting the need for personalized adjustments.

Factors Influencing Ranges

Biological Variability

Biological variability refers to the inherent fluctuations in blood analyte concentrations due to physiological processes within individuals and across populations, distinct from external influences like sample handling. This variability arises from homeostatic mechanisms that maintain analyte levels around a set point, but it can affect the interpretation of reference ranges by introducing natural oscillations that must be accounted for in clinical decision-making. Understanding these variations is crucial for establishing individualized reference intervals and detecting true pathological changes. Within-subject variability encompasses short-term and long-term physiological changes in an individual, such as circadian rhythms, where cortisol levels peak in the early morning due to activation of the hypothalamic-pituitary-adrenal axis and decline throughout the day. Hormonal fluctuations during the menstrual cycle also contribute, with estrogen levels rising significantly by cycle day seven to promote follicular development, while progesterone increases post-ovulation, potentially influencing related analytes like iron or inflammatory markers. Age-related changes further exemplify this, as serum creatinine concentrations rise steadily after age 40 in females and 60 in males, reflecting declines in muscle mass and renal function that alter analyte homeostasis over time. Between-subject variability stems from differences in genetic and environmental factors among individuals, leading to diverse homeostatic set points for blood analytes. Genetic polymorphisms, such as those in the G6PD gene, result in variable enzyme activity levels, with affected individuals showing reduced glucose-6-phosphate dehydrogenase concentrations that can predispose to hemolytic responses under stress, thereby widening population-level analyte distributions. Lifestyle elements, including diet, similarly impact variability; high intake of saturated fatty acids elevates low-density lipoprotein cholesterol by up to 10-15% in susceptible individuals, highlighting how nutritional patterns contribute to inter-individual differences in lipid profiles. Key indices quantify this variability: the within-subject coefficient of variation (CV_I or CV_w) measures intra-individual fluctuations, while the between-subject coefficient of variation (CV_G or CV_b) captures inter-individual differences, and analytical variation (CV_A) represents laboratory imprecision. For sodium, a tightly regulated analyte, CV_G is approximately 1.8% and CV_w 1.0%, indicating low biological fluctuation around homeostatic set points compared to more variable analytes like lipids, where CV_G can exceed 10%. Longitudinal studies, such as analyses from the National Health and Nutrition Examination Survey (NHANES), demonstrate that reference ranges for many analytes shift by 20-50% across age groups, underscoring the need to incorporate age-specific biological variation data for accurate clinical application. Pre-analytical factors, like timing of collection, can interact with these inherent variations but are addressed separately.
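One common application of these indices, not derived in the text itself, is the reference change value (RCV), which combines analytical and within-subject variation to judge whether two serial results from the same person differ meaningfully. The sketch below uses the standard RCV formula and an assumed analytical CV for illustration.

```python
import math

def reference_change_value(cv_analytical, cv_within, z=1.96):
    """Two-sided 95% reference change value (%), using the standard formula
    RCV = sqrt(2) * z * sqrt(CV_A^2 + CV_I^2). Shown as a common application
    of the variability indices described above, not as a claim from the text."""
    return math.sqrt(2) * z * math.sqrt(cv_analytical**2 + cv_within**2)

# Sodium: within-subject CV about 1.0% (from the text); assume an analytical CV of 0.8%
print(round(reference_change_value(0.8, 1.0), 1))  # ~3.5% change between serial results
```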

Pre-Analytical Factors

Pre-analytical factors encompass the variables occurring prior to laboratory analysis that can significantly influence blood test results, primarily related to patient preparation and external conditions. These factors are modifiable and must be standardized to ensure accurate reference ranges and reliable clinical interpretations. The Clinical and Laboratory Standards Institute (CLSI) provides protocols, such as those in GP41-Ed7, emphasizing patient comfort during venipuncture to minimize stress-induced alterations in analytes like cortisol and prolactin, which can transiently rise due to procedural anxiety.

Patient-related preparation plays a critical role in mitigating artifacts. Fasting for 8-12 hours is recommended for tests measuring glucose and lipid profiles to avoid postprandial elevations; for instance, blood glucose can increase by nearly 12% within one hour after a meal, while triglycerides may rise substantially, peaking 2-4 hours post-meal. Hydration status directly affects hematological parameters, with dehydration causing hemoconcentration that elevates hematocrit by reducing plasma volume, potentially leading to falsely high readings. Recent exercise, particularly strenuous activity, can elevate creatine kinase (CK) due to muscle tissue breakdown, with levels rising up to 30 times the upper limit of normal in untrained individuals shortly after intense sessions.

Timing of sample collection relative to daily or seasonal patterns also affects results. Postprandial states alter lipid profiles, as non-fasting triglycerides better reflect cardiovascular risk but require specific interpretation compared to fasting baselines. Seasonal variations influence certain analytes, such as 25-hydroxyvitamin D levels, which are typically 13-14% higher in summer than in winter due to increased sunlight exposure.

Medication use introduces interference that necessitates timing adjustments or disclosure to clinicians. Statins, commonly prescribed for hypercholesterolemia, lower total and LDL cholesterol levels, potentially masking baseline values if not accounted for during testing. Diuretics, such as thiazides, can disrupt electrolyte balance by increasing sodium excretion and altering potassium levels, with recommendations to assess timing relative to the last dose for accurate results. CLSI guidelines advocate informing patients to report all medications and adhere to withholding periods where applicable to standardize pre-analytical conditions.

Analytical Variability

Analytical variability in blood testing refers to the inconsistencies introduced by laboratory methods, instruments, and procedures that can affect the reliability of reference ranges. These variations arise from differences in analytical techniques, calibration standards, and quality assurance processes, potentially leading to discrepancies in measured analyte concentrations across laboratories. Ensuring minimal analytical variability is crucial for establishing standardized reference ranges that support consistent clinical decision-making.

Different analytical methods can introduce significant bias in results, particularly for complex analytes like hormones and cardiac biomarkers. For instance, immunoassays, commonly used for hormone measurements such as estradiol and testosterone, often exhibit higher variability compared to more specific methods like liquid chromatography-mass spectrometry, with coefficients of variation ranging from 4% to 49% and biases exceeding 100% at low concentrations. Similarly, troponin assays, which rely on immunoassay platforms, demonstrate inter-assay variances of 10-20% due to differences in antibody specificity and calibration, complicating the harmonization of reference ranges for myocardial infarction diagnosis. While spectrophotometric methods are employed for certain biochemical tests, such as enzyme activity assays, their precision can vary based on reagent stability and wavelength selection, though they generally offer lower bias than immunoassays for non-protein analytes.

Instrument calibration plays a pivotal role in mitigating analytical variability by ensuring traceability to certified reference materials. Laboratories calibrate analyzers using standards from authoritative bodies like the National Institute of Standards and Technology (NIST), such as Standard Reference Material (SRM) 3152a for sodium, which provides metrological traceability for accurate ion-selective electrode measurements. Total allowable error (TEa) limits define acceptable performance; for sodium, the Clinical Laboratory Improvement Amendments (CLIA) specify a TEa of ±4 mmol/L, guiding calibration adjustments to keep systematic bias and imprecision within bounds that preserve reference range integrity.

Quality control measures are essential to monitor and control analytical variability on an ongoing basis. Internal quality control involves daily runs of control materials at multiple concentration levels to detect shifts in precision and accuracy, with results plotted on Levey-Jennings charts to identify outliers exceeding predefined limits. External proficiency testing, such as the surveys conducted by the College of American Pathologists, evaluates inter-laboratory comparability by comparing participant results against peer groups, helping to identify method-specific biases. Sigma metrics quantify method performance by integrating bias, imprecision, and TEa; a sigma value greater than 4 indicates excellent quality, allowing fewer control rules and reduced false rejections, while values below 3 signal the need for method improvements.

Efforts to harmonize analytical practices have advanced through international collaborations, reducing variability in reference ranges. Since 2010, the International Federation of Clinical Chemistry and Laboratory Medicine (IFCC) Committee on Reference Intervals and Decision Limits (C-RIDL) has led initiatives, including global multicenter studies to standardize reference value derivation and promote traceable measurements across laboratories.
These efforts emphasize method-independent protocols and shared reference materials, fostering consistency in blood test results worldwide.
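The sigma metric mentioned above is computed as (TEa − |bias|) / CV, with all terms expressed in the same units; a minimal sketch with illustrative numbers follows.

```python
def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Sigma metric = (allowable total error - |bias|) / imprecision,
    all expressed here as percentages."""
    return (tea_pct - abs(bias_pct)) / cv_pct

# Illustrative numbers only: TEa 4%, observed bias 0.5%, observed CV 0.7%
print(round(sigma_metric(4.0, 0.5, 0.7), 1))  # 5.0 -> above the "excellent" threshold of 4
```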

Population and Demographic Differences

Reference ranges for blood tests vary significantly across populations due to demographic factors such as age, sex, ethnicity, and geography, necessitating tailored intervals to accurately interpret results and avoid misdiagnosis. Age profoundly influences reference ranges, with distinct patterns observed between pediatric, adult, and elderly populations. In children, white blood cell (WBC) counts are typically higher than in adults; for instance, medians reach 8.55 × 10³/µL in ages 0.5–4 years compared to 5.91 × 10³/µL in ages 15–82 years. Hemoglobin levels are also lower in pediatric groups, averaging 10.78 g/dL in young children versus 13.14 g/dL in adults. In the elderly, declines become evident; hemoglobin concentrations drop below WHO anemia thresholds (<13.0 g/dL for men and <12.0 g/dL for women) after age 70 and 80, respectively, with averages as low as 9.7 g/dL in men aged ≥90. Serum albumin levels similarly decrease with advancing age, estimated at 42 g/L (95% interval: 36–48 g/L) in older persons compared to higher values in younger adults. Sex-based differences arise primarily from hormonal influences and physiological states like pregnancy. Men generally exhibit higher hemoglobin levels than women due to androgen effects on erythropoiesis, with adult medians of 13.91 g/dL in males versus 12.40 g/dL in females. Pregnancy induces specific alterations through expanded plasma volume, which increases by 40–60% and exceeds red blood cell mass expansion (20–50%), resulting in physiologic anemia and lowered hematocrit from 38–45% pre-pregnancy to about 34% late in gestation. Accordingly, trimester-specific ranges adjust downward for hemoglobin, with minimum normals of 11 g/dL in the first and third trimesters and 10.5 g/dL in the second. Ethnic variations in reference ranges stem from genetic and environmental factors, including higher prevalence of conditions like . Southeast Asian populations often show lower mean corpuscular volume (MCV) due to alpha- carriers, which affect up to 1 in 20 individuals and reduce MCV below standard intervals (e.g., <81 fL in carriers versus 81–98 fL typical). Broader surveys reveal differences such as lower hemoglobin and MCV in Black individuals compared to White or Asian groups, with non-Hispanic African-Americans averaging 0.5–1.0 g/dL less in hemoglobin. Global data from WHO indicate these patterns contribute to regionally adjusted ranges for accurate anemia assessment across ethnicities. Geographic factors, including altitude and nutrition, further modify ranges to reflect environmental adaptations. At altitudes above 2000 m, reduced oxygen partial pressure stimulates erythropoiesis, elevating by approximately 3% or more; for example, means rise from 155 g/L at sea level to 160 g/L at >1800 m. A specific example occurs in Ecuador, where significant altitude differences between cities influence hematological parameters. In Quito (altitude ~2,850 m), reference ranges for hemoglobin are elevated due to physiological adaptation to hypoxia: 14.9–18.3 g/dL for men (average 16.7 g/dL) and 12.7–16.2 g/dL for women (average 14.5 g/dL). In contrast, Guayaquil (at sea level) uses standard sea-level ranges (e.g., ~13–17 g/dL for men and ~12–15 g/dL for women). Other parameters such as leukocytes and platelets may show slight variations, but the primary difference is in red blood cell parameters due to altitude. 
Many clinical laboratories in Ecuador apply standardized sea-level ranges without local adjustment, despite recommendations from studies to establish altitude-specific reference ranges for accurate interpretation. In developing regions, nutritional deficiencies such as iron scarcity, prevalent in areas with inadequate dietary intake, lower hemoglobin and contribute to higher anemia rates, affecting up to 40% of populations in low socioeconomic zones per WHO estimates.

Hematology

Red Blood Cell Parameters

Red blood cell parameters are essential components of a complete blood count (CBC) that assess the oxygen-carrying capacity and morphological characteristics of erythrocytes. These include the red blood cell count, hemoglobin concentration, and hematocrit, which provide foundational data for diagnosing anemias, polycythemias, and related disorders. Derived indices such as mean corpuscular volume (MCV), mean corpuscular hemoglobin (MCH), mean corpuscular hemoglobin concentration (MCHC), and red cell distribution width (RDW) offer insights into red blood cell size, hemoglobin content, and variability, aiding in the classification of hematologic abnormalities.

The red blood cell count measures the number of erythrocytes per unit volume of blood, typically reported in units of ×10¹²/L. In adult males, the reference range is 4.5–5.9 ×10¹²/L, while in adult females it is 4.2–5.4 ×10¹²/L. Hemoglobin, the protein responsible for oxygen transport, has a reference range of 13.5–17.5 g/dL for males and 12.0–16.0 g/dL for females. Hematocrit, representing the percentage of blood volume occupied by red blood cells, ranges from 41–50% in males and 36–44% in females.
Parameter | Male Reference Range | Female Reference Range | Units
Red blood cell count | 4.5–5.9 | 4.2–5.4 | ×10¹²/L
Hemoglobin | 13.5–17.5 | 12.0–16.0 | g/dL
Hematocrit | 41–50 | 36–44 | %
Reference ranges can vary by laboratory; for instance, a UK NHS trust (Gloucestershire Hospitals, updated October 2025) reports haemoglobin as 130–180 g/L for adult males and 115–165 g/L for adult females, slightly broader than some general sources. Always use local lab-specific ranges for interpretation.

Red blood cell indices quantify average cell properties to evaluate anemias; a worked calculation follows this paragraph. The mean corpuscular volume (MCV) indicates average cell size, calculated as MCV = (hematocrit / red blood cell count) × 10, with a reference range of 80–100 fL. Low MCV values (<80 fL) suggest microcytic anemias, such as iron deficiency anemia, where impaired hemoglobin synthesis leads to smaller erythrocytes. The mean corpuscular hemoglobin (MCH) measures average hemoglobin per red blood cell, ranging from 27–33 pg. The mean corpuscular hemoglobin concentration (MCHC) reflects hemoglobin concentration within red blood cells, with a normal range of 32–36 g/dL; reduced MCHC indicates hypochromia, often seen in iron deficiency. Red cell distribution width (RDW) assesses variation in red blood cell size (anisocytosis), expressed as a percentage with a reference range of 11.5–14.5%. Elevated RDW (>14.5%) can signal mixed anemias or early nutritional deficiencies, while normal RDW with low indices points to uniform cell abnormalities such as thalassemia trait.

Reference ranges for red blood cell parameters require adjustments for physiological and environmental factors to ensure accurate interpretation. At high altitudes, chronic hypoxia stimulates erythropoiesis, increasing hemoglobin by approximately 0.3 g/dL per 1,000 meters above sea level and elevating red blood cell count and hematocrit accordingly. For example, in Quito, Ecuador (~2,850 m above sea level), hemoglobin reference ranges are higher due to adaptation to hypoxia: 14.9–18.3 g/dL for men and 12.7–16.2 g/dL for women (averages ~16.7 g/dL and ~14.5 g/dL). In contrast, Guayaquil, Ecuador (sea level), uses ranges closer to standard sea-level values (e.g., ~13–17 g/dL for men, ~12–15 g/dL for women). While many Ecuadorian laboratories apply standardized ranges without local altitude adjustment, studies recommend establishing altitude-specific reference ranges. During pregnancy, plasma volume expansion dilutes red blood cell parameters, lowering hemoglobin by 0.5–2 g/dL in the second and third trimesters, with adjusted ranges of 11.0–14.0 g/dL. Ethnic variations also influence ranges; individuals of African descent typically exhibit lower hemoglobin levels (0.5–1.0 g/dL below Caucasian norms) and slightly reduced MCV due to benign genetic factors such as a higher prevalence of alpha-thalassemia trait.
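As a minimal illustration of the index formulas above, the following Python sketch computes MCV, MCH, and MCHC from basic CBC values. The input numbers are hypothetical examples, and the interpretive comment uses the general adult cutoffs quoted in the text rather than any laboratory's own limits.

```python
# Illustrative calculation of red cell indices from a complete blood count.
# Formulas follow the standard definitions given above; sample values are
# hypothetical, not patient data.

def red_cell_indices(rbc_million_per_ul: float, hemoglobin_g_dl: float, hematocrit_pct: float) -> dict:
    """Return MCV (fL), MCH (pg) and MCHC (g/dL) from basic CBC values."""
    mcv = (hematocrit_pct / rbc_million_per_ul) * 10    # mean corpuscular volume, fL
    mch = (hemoglobin_g_dl / rbc_million_per_ul) * 10   # mean corpuscular hemoglobin, pg
    mchc = (hemoglobin_g_dl / hematocrit_pct) * 100     # mean corpuscular Hb concentration, g/dL
    return {"MCV_fL": round(mcv, 1), "MCH_pg": round(mch, 1), "MCHC_g_dL": round(mchc, 1)}

# Example: RBC 4.0 x10^6/uL, Hb 9.0 g/dL, Hct 30% gives MCV 75 fL and MCHC
# 30 g/dL, a microcytic, hypochromic picture of the kind described above.
print(red_cell_indices(4.0, 9.0, 30.0))
```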

White Blood Cell Parameters

White blood cell (WBC) parameters, also known as leukocytes, are essential components of the complete blood count (CBC) used to assess immune function and detect infections, inflammation, or hematologic disorders. The total WBC count measures the overall number of leukocytes in circulation, while the differential count provides the relative and absolute proportions of specific subtypes: neutrophils, lymphocytes, monocytes, eosinophils, and basophils. These parameters help identify patterns such as neutrophilia in bacterial infections or lymphocytosis in viral conditions.

The reference range for total WBC count in healthy adults is typically 4.0 to 11.0 × 10^9/L (or 4,000 to 11,000 cells/µL). This range can vary slightly by laboratory and population, but deviations outside it may indicate leukopenia (below 4.0 × 10^9/L) or leukocytosis (above 11.0 × 10^9/L). Neonates exhibit higher counts, often 9.0 to 30.0 × 10^9/L in the first two weeks of life, reflecting normal neonatal physiology, with levels gradually declining to adult ranges by age 4–6 years. Sex differences are minimal in adults, though some studies note slightly higher counts in males.

The WBC differential categorizes leukocytes by subtype, reported as percentages of the total count and as absolute numbers for clinical accuracy, since percentages alone can mislead if the total WBC is abnormal (a short conversion example follows the table below). Neutrophils, the most abundant, range from 40% to 70% (absolute: 1.8 to 7.7 × 10^9/L), serving as primary responders to bacterial infections. Lymphocytes follow at 20% to 40% (absolute: 1.0 to 4.8 × 10^9/L), crucial for adaptive immunity. Monocytes constitute 2% to 8% (absolute: 0.2 to 0.8 × 10^9/L), differentiating into macrophages. Eosinophils are 1% to 4% (absolute: 0.0 to 0.4 × 10^9/L), elevated in parasitic or allergic conditions, while basophils are 0% to 1% (absolute: 0.0 to 0.1 × 10^9/L), involved in hypersensitivity reactions.

WBC counting and differentiation occur via manual or automated methods, with automated analysis preferred for efficiency and precision in routine testing. Manual counting involves microscopic examination of a stained blood film, where a technologist classifies at least 100 cells based on morphology, but it is labor-intensive and subject to inter-observer variability. Automated analyzers, such as those using flow cytometry, hydrodynamically focus cells through a narrow stream and employ light scatter, impedance, and fluorescent dyes to distinguish subtypes by size, granularity, and nucleic acid content, enabling rapid 5-part differentials. These systems flag abnormalities, such as a "left shift" indicating increased immature neutrophils (bands >5–10%) during acute infections, prompting manual review for confirmation.

Several factors influence WBC parameters, introducing variability that must be considered for accurate interpretation. Physiological stress, including emotional or physical exertion, can transiently elevate counts through demargination of cells from vascular walls, mimicking leukocytosis. Ethnic differences also play a role; for instance, individuals of African descent often have lower total WBC and neutrophil counts but higher lymphocyte percentages compared with those of European descent. These variations underscore the need for population-specific reference ranges to avoid misdiagnosis.
WBC Parameter | Adult Reference Range (Percentage) | Adult Reference Range (Absolute, ×10^9/L)
Neutrophils | 40–70% | 1.8–7.7
Lymphocytes | 20–40% | 1.0–4.8
Monocytes | 2–8% | 0.2–0.8
Eosinophils | 1–4% | 0.0–0.4
Basophils | 0–1% | 0.0–0.1
Ranges are approximate and may vary by lab; neonatal values are higher overall.
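The sketch below illustrates the conversion described above from differential percentages to absolute counts; the cutoffs are the approximate adult absolute ranges from the table, and the example inputs are hypothetical.

```python
# Minimal sketch: converting a WBC differential reported in percentages into
# absolute counts, showing why percentages alone can mislead when the total
# WBC count is abnormal. Cutoffs are the approximate ranges from the table.

ADULT_ABSOLUTE_RANGES = {          # x10^9/L, approximate adult values
    "neutrophils": (1.8, 7.7),
    "lymphocytes": (1.0, 4.8),
    "monocytes":   (0.2, 0.8),
    "eosinophils": (0.0, 0.4),
    "basophils":   (0.0, 0.1),
}

def absolute_differential(total_wbc_10e9_l: float, percentages: dict) -> dict:
    """Convert differential percentages into absolute counts (x10^9/L) with a simple flag."""
    results = {}
    for cell, pct in percentages.items():
        absolute = total_wbc_10e9_l * pct / 100.0
        low, high = ADULT_ABSOLUTE_RANGES[cell]
        flag = "low" if absolute < low else "high" if absolute > high else "normal"
        results[cell] = (round(absolute, 2), flag)
    return results

# Example: 45% lymphocytes looks unremarkable, but with a total WBC of only
# 2.0 x10^9/L the absolute lymphocyte count (0.9) is below the adult range.
print(absolute_differential(2.0, {"neutrophils": 50, "lymphocytes": 45,
                                  "monocytes": 4, "eosinophils": 1, "basophils": 0}))
```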

Platelet and Coagulation Factors

Platelets, also known as thrombocytes, are small cell fragments essential for primary hemostasis, where they adhere to damaged vessel walls, aggregate, and initiate clot formation to prevent blood loss. The reference range for platelet count in healthy adults is typically 150-450 × 10^9/L, though values may vary slightly by laboratory and population demographics. Mean platelet volume (MPV), which reflects platelet size and activation potential, normally ranges from 7 to 11 fL, with deviations indicating possible bone marrow disorders or increased platelet turnover.

Coagulation factors contribute to secondary hemostasis by forming a stable fibrin clot through enzymatic cascades. Prothrombin time (PT) measures the extrinsic pathway and is normally 11-13.5 seconds, assessing factors VII, X, V, II, and fibrinogen. Activated partial thromboplastin time (aPTT) evaluates the intrinsic pathway, with a reference range of 25-35 seconds, involving factors XII, XI, IX, VIII, X, V, II, and fibrinogen. The international normalized ratio (INR), derived from PT, standardizes results across reagents and is 0.8-1.2 in healthy individuals not on anticoagulants; it is calculated as

\text{INR} = \left( \frac{\text{patient PT}}{\text{mean normal PT}} \right)^{\text{ISI}}

where ISI (international sensitivity index) is calibrated against a World Health Organization reference thromboplastin to ensure consistency, typically ranging from 0.9 to 2.0 depending on the reagent (a worked calculation appears below).

Fibrinogen, a key coagulation factor (factor I), converts to fibrin during clotting and has a reference range of 200-400 mg/dL; levels below 100 mg/dL increase bleeding risk. D-dimer, a fibrinolysis marker reflecting breakdown of cross-linked fibrin, is normally 0-0.5 μg/mL (or <500 ng/mL fibrinogen equivalent units), with elevated values suggesting thrombosis or disseminated intravascular coagulation. Anticoagulant therapies like warfarin prolong PT and aPTT by inhibiting vitamin K-dependent factors (II, VII, IX, X), requiring INR monitoring to maintain therapeutic levels of 2.0-3.0 for conditions such as atrial fibrillation.
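A minimal sketch of the INR formula above follows; the mean normal PT and ISI are reagent- and laboratory-specific, so the numbers used here are purely hypothetical.

```python
# Illustrative INR calculation from the formula given above.
# The mean normal PT and ISI vary by reagent and laboratory.

def inr(patient_pt_seconds: float, mean_normal_pt_seconds: float, isi: float) -> float:
    """INR = (patient PT / mean normal PT) ** ISI."""
    return (patient_pt_seconds / mean_normal_pt_seconds) ** isi

# Example: a PT of 24 s against a mean normal PT of 12 s with an ISI of 1.0
# gives an INR of 2.0, at the lower end of the usual 2.0-3.0 warfarin target.
print(round(inr(24.0, 12.0, 1.0), 2))
```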

Biochemical Markers

Electrolytes and Trace Elements

Electrolytes are essential ions that maintain fluid balance, nerve function, muscle contraction, and acid-base homeostasis in the body. Reference ranges for serum electrolytes reflect concentrations typically found in healthy adults, derived from large population studies using standardized laboratory methods such as ion-selective electrodes. These ranges can vary slightly by laboratory due to methodological differences, but standard values provide a benchmark for clinical interpretation. Deviations may indicate disorders like dehydration, renal dysfunction, or endocrine imbalances, though clinical context is crucial for diagnosis.

The primary electrolytes measured in routine blood tests include sodium, potassium, chloride, and bicarbonate. Sodium, the most abundant extracellular cation, supports osmotic pressure and cellular function, with a typical serum reference range of 135-145 mmol/L in adults. Potassium, vital for cardiac and muscular activity, has a narrow range of 3.6-5.2 mmol/L, as even small shifts can lead to arrhythmias. Chloride, the major extracellular anion, aids in acid-base balance and is referenced at 98-107 mmol/L. Bicarbonate, reflecting the buffering capacity against acidosis, normally spans 22-29 mmol/L.

Calcium and magnesium are divalent cations critical for bone health, enzymatic reactions, and neuromuscular excitability. Total serum calcium, including both protein-bound and ionized forms, is typically 2.1-2.6 mmol/L (or 8.6-10.2 mg/dL), while ionized calcium, the physiologically active fraction, ranges from 1.1-1.3 mmol/L. Magnesium, involved in over 300 enzymatic processes, maintains a serum concentration of 0.7-1.0 mmol/L (1.7-2.4 mg/dL). Abnormalities in these levels often relate to parathyroid function or nutritional status.

Trace elements like iron, zinc, and copper are measured to assess nutritional adequacy and metabolic disorders. Serum iron, which fluctuates diurnally, is referenced at 10-30 μmol/L (approximately 50-170 μg/dL), but ferritin, a storage protein, better indicates iron reserves at 30-300 ng/mL for adult males and 15-150 ng/mL for females. Zinc, essential for immune function and DNA synthesis, has a reference range of 11-18 μmol/L (70-120 μg/dL). Copper, a cofactor in superoxide dismutase, is typically 12-20 μmol/L (75-125 μg/dL), often bound to ceruloplasmin.

The anion gap, calculated as [Na⁺] - ([Cl⁻] + [HCO₃⁻]), helps evaluate metabolic acidosis by estimating unmeasured anions, with a normal range of 8-16 mmol/L in serum (a worked calculation appears after the table below). This metric is particularly useful in emergency settings to differentiate causes of acid-base disturbances, such as lactic acidosis or renal failure.
Electrolyte/Element | Reference Range (Adults) | Units | Clinical Notes
Sodium | 135-145 | mmol/L | Primary extracellular cation; hyponatremia common in SIADH.
Potassium | 3.6-5.2 | mmol/L | Narrow range; hyperkalemia risks cardiac arrest.
Chloride | 98-107 | mmol/L | Parallels sodium; altered in vomiting or diarrhea.
Bicarbonate | 22-29 | mmol/L | Indicates renal compensation in acid-base disorders.
Total Calcium | 2.1-2.6 | mmol/L | 50% ionized; hypocalcemia linked to tetany.
Ionized Calcium | 1.1-1.3 | mmol/L | Free form; measured in critical care.
Magnesium | 0.7-1.0 | mmol/L | Hypomagnesemia associated with arrhythmias.
Iron (Serum) | 10-30 | μmol/L | Diurnal variation; low in deficiency anemia.
Ferritin | 30-300 (males); 15-150 (females) | ng/mL | Iron stores; elevated in inflammation.
Zinc | 11-18 | μmol/L | Deficiency impairs wound healing.
Copper | 12-20 | μmol/L | Wilson's disease shows low ceruloplasmin-bound levels.
Anion Gap | 8-16 | mmol/L | Elevated in ketoacidosis; formula excludes potassium in some labs.
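The following sketch applies the anion gap formula quoted above; the example values are hypothetical, and the optional potassium term reflects the variant some laboratories use, which shifts the reference interval upward.

```python
# Minimal sketch of the serum anion gap, [Na+] - ([Cl-] + [HCO3-]).
# Pass a potassium value to use the K+-inclusive variant. All values in mmol/L.

def anion_gap(sodium: float, chloride: float, bicarbonate: float, potassium: float = 0.0) -> float:
    """Serum anion gap in mmol/L."""
    return (sodium + potassium) - (chloride + bicarbonate)

# Example: Na 140, Cl 100, HCO3 14 mmol/L gives a gap of 26 mmol/L, well above
# the 8-16 mmol/L range in the table and consistent with a high-anion-gap
# metabolic acidosis such as ketoacidosis or lactic acidosis.
print(anion_gap(140, 100, 14))
```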

Acid-Base Balance and Blood Gases

Acid-base balance in the blood is maintained through the interplay of respiratory and metabolic processes, primarily involving the bicarbonate buffer system, carbon dioxide (CO2) as a volatile acid, and oxygen (O2) levels. Blood gas analysis measures key parameters such as pH, partial pressure of CO2 (pCO2), partial pressure of O2 (pO2), bicarbonate (HCO3-), base excess, and lactate to assess homeostasis. Deviations indicate acidosis (pH < 7.35) or alkalosis (pH > 7.45), which can be respiratory (driven by pCO2 changes) or metabolic (driven by HCO3- or base excess alterations). These measurements are crucial for diagnosing conditions like respiratory failure, shock, or metabolic disorders. Reference ranges for arterial blood gases reflect normal physiological states in healthy adults at sea level. The following table summarizes standard values:
Parameter | Reference Range | Units
pH | 7.35–7.45 | –
pCO2 (PaCO2) | 35–45 | mmHg
pO2 (PaO2) | 75–100 | mmHg
HCO3- | 22–26 | mmol/L
Base excess | −2 to +2 | mmol/L
Lactate | 0.5–2.2 | mmol/L
These ranges can vary slightly by laboratory and population, but they provide benchmarks for interpreting acid-base status. For instance, elevated lactate above 2.2 mmol/L often signals tissue hypoperfusion or anaerobic metabolism.

Venous blood gas values differ from arterial values due to tissue metabolism. Venous pCO2 is typically 4–6 mmHg higher than arterial pCO2, venous pH is about 0.03–0.05 units lower, and venous pO2 is substantially reduced (around 40 mmHg). These differences make arterial sampling preferable for accurate oxygenation and acid-base assessment, though venous samples may suffice in stable patients for pH and pCO2 estimation.

Blood gas samples must be collected in heparinized syringes to prevent clotting and minimize dilution effects from excess anticoagulant. Lithium heparin is commonly used at low concentrations (e.g., 7–10 IU/mL) to avoid altering electrolyte or gas levels; samples should be analyzed promptly, ideally within 15–30 minutes, and kept on ice if delayed to reduce metabolic changes.

The Henderson-Hasselbalch equation quantifies the relationship between pH, HCO3-, and pCO2 in the blood:

\text{pH} = 6.1 + \log_{10} \left( \frac{[\text{HCO}_3^-]}{0.03 \times \text{pCO}_2} \right)

Here, [HCO3-] is in mmol/L and pCO2 in mmHg; the constant 0.03 represents the solubility coefficient of CO2 in plasma. This equation aids in distinguishing primary disorders: respiratory acidosis features high pCO2 (>45 mmHg), lowering pH below 7.35, with a compensatory rise in HCO3-; respiratory alkalosis shows low pCO2 (<35 mmHg), raising pH above 7.45, with a compensatory fall in HCO3-. Metabolic acidosis involves low HCO3- (<22 mmol/L) or a negative base excess, often with compensatory low pCO2; metabolic alkalosis has high HCO3- (>26 mmol/L) or a positive base excess, with compensatory high pCO2. Electrolyte imbalances, such as those involving chloride, can influence the anion gap in metabolic acidosis, with further details in the Electrolytes and Trace Elements section.
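The sketch below works through the Henderson-Hasselbalch relationship above and adds a deliberately simplified labelling of the primary disturbance using the arterial reference ranges in the table; it is an illustration, not a substitute for full acid-base interpretation.

```python
import math

# Henderson-Hasselbalch: pH = 6.1 + log10([HCO3-] / (0.03 x pCO2)).
# Classification below is a crude simplification for illustration only.

def henderson_hasselbalch_ph(hco3_mmol_l: float, pco2_mmhg: float) -> float:
    """Estimate arterial pH from bicarbonate (mmol/L) and pCO2 (mmHg)."""
    return 6.1 + math.log10(hco3_mmol_l / (0.03 * pco2_mmhg))

def classify(ph: float, pco2_mmhg: float) -> str:
    """Very simplified labelling of the primary disorder."""
    if ph < 7.35:
        return "respiratory acidosis" if pco2_mmhg > 45 else "metabolic acidosis"
    if ph > 7.45:
        return "respiratory alkalosis" if pco2_mmhg < 35 else "metabolic alkalosis"
    return "pH within reference range"

# Example: HCO3 15 mmol/L with pCO2 30 mmHg gives pH ~7.32,
# consistent with a (partially compensated) metabolic acidosis.
ph = henderson_hasselbalch_ph(15, 30)
print(round(ph, 2), classify(ph, 30))
```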

Liver Enzymes and Function

Liver enzymes and function tests assess hepatocellular integrity, biliary excretion, and synthetic capacity, providing key insights into liver health. These tests measure enzymes released from damaged hepatocytes or biliary epithelium, as well as proteins synthesized by the liver and parameters reflecting its role in clotting factor production. Abnormalities in these markers can indicate acute or chronic liver injury, cholestasis, or impaired synthetic function, with reference ranges varying slightly by laboratory, age, sex, and methodology.

Alanine aminotransferase (ALT) is primarily localized in hepatocytes and serves as a sensitive marker of liver cell injury, with a typical reference range of 7-56 U/L in adults. Aspartate aminotransferase (AST), found in liver, heart, muscle, and other tissues, has a reference range of 10-40 U/L; elevations often parallel ALT but can reflect extrahepatic sources. An AST/ALT ratio greater than 2:1 is characteristic of alcoholic liver disease, attributed to pyridoxine deficiency impairing ALT synthesis and greater AST release from mitochondria in alcoholic hepatitis. Alkaline phosphatase (ALP) originates from liver, bone, intestine, and placenta, with a reference range of 44-147 U/L; isolated elevations suggest cholestasis or bone disorders. Gamma-glutamyl transferase (GGT), highly concentrated in biliary epithelium, ranges from 9-48 U/L and is particularly useful for confirming hepatic origin of ALP elevations or detecting alcohol-related damage when exceeding twice the upper limit alongside an elevated AST/ALT ratio.

Bilirubin, a byproduct of heme metabolism, reflects hepatic conjugation and excretion; total bilirubin typically ranges from 5-21 μmol/L (0.3-1.2 mg/dL), while direct (conjugated) bilirubin is less than 5 μmol/L (<0.3 mg/dL). Hyperbilirubinemia patterns, predominantly unconjugated in hemolysis or Gilbert syndrome and conjugated in cholestasis, aid in differential diagnosis. Albumin, the liver's principal synthetic protein, maintains oncotic pressure and transports substances, with a reference range of 35-50 g/L (3.5-5.0 g/dL); hypoalbuminemia indicates chronic liver disease or malnutrition. Total protein, encompassing albumin and globulins, ranges from 60-80 g/L (6.0-8.0 g/dL) and supports assessment of overall synthetic function. Prothrombin time (PT), measuring the extrinsic coagulation pathway, prolongs in liver failure due to reduced synthesis of factors II, V, VII, IX, and X, with a normal range of 11-13.5 seconds.

The Child-Pugh score integrates PT (or INR), bilirubin, albumin, ascites, and encephalopathy to classify cirrhosis severity: Class A (5-6 points) indicates compensated disease with good prognosis, Class B (7-9 points) moderate decompensation, and Class C (10-15 points) advanced failure with high mortality risk. This system, originally developed for surgical risk assessment, remains widely used for prognostic stratification in chronic liver disease; a minimal scoring sketch follows the table below.

For ALP elevations, isoenzyme fractionation distinguishes hepatic from bone origins; liver ALP is heat-stable and accounts for about 50% of total serum activity in healthy adults, while bone ALP predominates in growing children or Paget disease. Techniques such as electrophoresis or immunoassays enable this separation, guiding whether further biliary or skeletal evaluation is needed. Note that AST elevations can overlap with cardiac injury, as detailed in cardiac biomarker assessments.
Marker | Reference Range (Adults) | Clinical Significance
ALT | 7-56 U/L | Hepatocellular injury
AST | 10-40 U/L | Liver or extrahepatic damage; AST/ALT >2 in alcoholic disease
ALP | 44-147 U/L | Cholestasis or bone disorders
GGT | 9-48 U/L | Biliary obstruction, alcohol use
Total Bilirubin | 5-21 μmol/L | Conjugation/excretion impairment
Direct Bilirubin | <5 μmol/L | Cholestatic jaundice
Albumin | 35-50 g/L | Synthetic function
Total Protein | 60-80 g/L | Overall protein status
PT | 11-13.5 s | Coagulation factor synthesis
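A minimal sketch of the Child-Pugh class assignment described above follows. It assumes the five components (bilirubin, albumin, PT/INR, ascites, encephalopathy) have already been converted to 1-3 points using published cutoffs, which are not reproduced here; the sketch only maps the total score to class A, B, or C as stated in the text.

```python
# Child-Pugh class from five component scores (each 1-3 points):
# total 5-6 = class A, 7-9 = class B, 10-15 = class C.

def child_pugh_class(bilirubin_pts: int, albumin_pts: int, inr_pts: int,
                     ascites_pts: int, encephalopathy_pts: int) -> tuple:
    """Return (total points, Child-Pugh class) from five component scores."""
    components = (bilirubin_pts, albumin_pts, inr_pts, ascites_pts, encephalopathy_pts)
    if not all(1 <= p <= 3 for p in components):
        raise ValueError("each component must be scored 1, 2 or 3 points")
    total = sum(components)
    cls = "A" if total <= 6 else "B" if total <= 9 else "C"
    return total, cls

# Example: component scores of 2, 2, 1, 1, 1 give 7 points, Child-Pugh class B.
print(child_pugh_class(2, 2, 1, 1, 1))
```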

Cardiac Biomarkers

Cardiac biomarkers are proteins and enzymes released into the bloodstream in response to myocardial injury, stress, or dysfunction, aiding in the diagnosis and management of conditions such as acute myocardial infarction (AMI) and heart failure. These markers provide insights into the timing and extent of cardiac damage, with reference ranges established based on the 99th percentile upper reference limits (URLs) in healthy populations to minimize false positives. The primary biomarkers include cardiac troponins, creatine kinase-MB (CK-MB), myoglobin, and B-type natriuretic peptides (BNP and NT-proBNP), each with distinct release kinetics and clinical utility.

Troponins I and T are the most specific and sensitive indicators of myocardial injury, forming part of the contractile apparatus in cardiac muscle cells and released upon cell membrane disruption. In high-sensitivity cardiac troponin (hs-cTn) assays, the reference range is typically 0 to 0.04 ng/mL (or 0-40 ng/L), corresponding to the sex-specific 99th percentile URL, such as 17 ng/L for females and 35 ng/L for males using the Abbott assay. Conventional troponin assays (third and fourth generations) had higher detection limits (around 0.01-0.1 ng/mL) and lower sensitivity for early or minor injuries, while high-sensitivity assays (fifth generation), introduced around 2007 and widely adopted by 2017, detect troponin in over 50% of healthy individuals, enabling earlier diagnosis within 1-3 hours of symptom onset. Troponin levels rise 2-3 hours after injury, peak at approximately 24 hours post-AMI, and remain elevated for 5-14 days, with a characteristic rise-and-fall pattern required for AMI diagnosis per the Universal Definition of Myocardial Infarction.

CK-MB, the cardiac isoform of creatine kinase, is released from damaged myocardial cells and serves as an early marker of AMI, though less specific than troponins due to its presence in skeletal muscle. The reference range for CK-MB is 0-5 ng/mL, with levels above this indicating potential cardiac damage when the CK-MB fraction exceeds 5-6% of total CK. It rises 4-6 hours after injury, peaks at 12-24 hours, and returns to normal within 48-72 hours, making it useful for confirming reinfarction in the subacute phase. Myoglobin, a heme protein in cardiac and skeletal muscle, is an early but nonspecific biomarker of muscle injury, rapidly released due to its small size. The reference range is 28-72 ng/mL, with elevations above 85 ng/mL suggesting acute damage. It increases within 1-4 hours of AMI, peaks at 6-12 hours, and normalizes by 24 hours, offering value in ruling out AMI in low-risk patients presenting early but limited by lack of cardiac specificity.

BNP and its inactive precursor NT-proBNP are secreted by ventricular cardiomyocytes in response to wall stress, primarily used to diagnose and assess heart failure severity. For BNP, levels below 100 pg/mL effectively rule out acute heart failure in symptomatic patients. NT-proBNP reference ranges are age-adjusted: less than 300 pg/mL for individuals under 75 years, and less than 450 pg/mL for those 75 years and older, reflecting physiological increases with age and renal function. These peptides do not exhibit acute rise-and-fall patterns like injury markers but provide prognostic value, with elevations correlating to worse outcomes in heart failure.
Biomarker | Reference Range | Units | Key Clinical Context
High-sensitivity Troponin I/T | 0-0.04 (sex-specific 99th percentile) | ng/mL | AMI diagnosis; peaks ~24 hours post-injury
CK-MB | 0-5 | ng/mL | Early AMI confirmation; normalizes in 48-72 hours
Myoglobin | 28-72 | ng/mL | Early rule-out of AMI; normalizes in 24 hours
BNP | <100 | pg/mL | Rules out heart failure
NT-proBNP | <300 (<75 years); <450 (≥75 years) | pg/mL | Age-adjusted heart failure assessment

Lipid Profile

The lipid profile is a panel of blood tests that measures various lipids and lipoproteins to assess cardiovascular risk, including total cholesterol (TC), low-density lipoprotein cholesterol (LDL-C), high-density lipoprotein cholesterol (HDL-C), and triglycerides (TG). Desirable reference ranges for adults, based on guidelines from major cardiovascular organizations, are as follows: TC below 5.2 mmol/L (200 mg/dL), LDL-C below 3.4 mmol/L (131 mg/dL) for those at low risk, HDL-C above 1.0 mmol/L (39 mg/dL) for men and above 1.3 mmol/L (50 mg/dL) for women, and fasting TG below 1.7 mmol/L (150 mg/dL). These ranges help identify dyslipidemia, a key modifiable risk factor for atherosclerosis and coronary heart disease, with elevated LDL-C and TG promoting plaque formation while low HDL-C reduces protective effects against arterial buildup.

LDL-C is often estimated indirectly using the Friedewald equation when direct measurement is unavailable, calculated as

\text{LDL-C} = \text{TC} - \text{HDL-C} - \frac{\text{TG}}{2.2}

in mmol/L (or with TG divided by 5 when working in mg/dL). This formula assumes very low-density lipoprotein (VLDL) cholesterol is approximately TG divided by 2.2 in mmol/L, but it has limitations, including inaccuracy when TG exceeds 4.5 mmol/L (400 mg/dL), in patients with type III hyperlipoproteinemia, or at low LDL-C levels below 1.8 mmol/L (70 mg/dL), where overestimation can occur. Alternative methods, such as direct assays or newer equations like Martin-Hopkins, may be preferred in these scenarios to improve precision for risk stratification.

Traditionally, lipid profiles required fasting for 8-12 hours to minimize postprandial effects on TG and calculated LDL-C, but the 2016 European Society of Cardiology (ESC) and European Atherosclerosis Society (EAS) guidelines recommend non-fasting samples for initial screening and general cardiovascular risk assessment, as non-fasting levels provide similar predictive value for events like myocardial infarction. Fasting remains advised if TG is elevated above 5.0 mmol/L (443 mg/dL) non-fasting or for confirming hypertriglyceridemia, as food intake can transiently raise TG by 0.5-1.0 mmol/L without significantly altering TC or HDL-C. This shift enhances practicality and patient adherence in primary care settings.

As an advanced marker, apolipoprotein B (ApoB) quantifies the number of atherogenic lipoprotein particles (including LDL and VLDL), with reference ranges typically 0.6-1.1 g/L (60-110 mg/dL) in healthy adults, though desirable levels below 0.9 g/L (90 mg/dL) are often targeted for risk reduction. ApoB outperforms LDL-C in some populations for predicting cardiovascular events, particularly when discordance exists between the two, as it better reflects particle concentration. Debates on optimal versus conventional ranges continue, with some evidence suggesting stricter targets below population medians for maximal prevention.
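A minimal sketch of the Friedewald estimate described above follows, including the triglyceride validity check mentioned in the text; the example values are hypothetical.

```python
# Friedewald estimate of LDL-C in mmol/L: LDL-C = TC - HDL-C - TG/2.2.
# Unreliable when TG exceeds 4.5 mmol/L, as noted in the text.

def friedewald_ldl_mmol_l(total_chol: float, hdl: float, triglycerides: float) -> float:
    """Estimate LDL cholesterol (all inputs in mmol/L)."""
    if triglycerides > 4.5:
        raise ValueError("Friedewald formula unreliable when TG > 4.5 mmol/L; "
                         "use a direct assay or an alternative equation")
    return total_chol - hdl - triglycerides / 2.2

# Example: TC 5.5, HDL 1.2, TG 1.5 mmol/L gives an estimated LDL-C of about 3.6 mmol/L.
print(round(friedewald_ldl_mmol_l(5.5, 1.2, 1.5), 2))
```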

Tumor and Inflammatory Markers

Tumor Markers

Tumor markers are serum proteins or antigens whose levels in the blood can become elevated in the presence of certain malignancies, aiding in diagnosis, staging, prognosis, and monitoring of treatment response for specific cancers. These biomarkers are not diagnostic on their own due to their limited specificity, as elevations can occur in benign conditions or other diseases, but they are valuable when used in conjunction with imaging, clinical findings, and serial measurements. Common tumor markers include prostate-specific antigen (PSA) for prostate cancer, cancer antigen 125 (CA-125) for ovarian cancer, carcinoembryonic antigen (CEA) for colorectal and other gastrointestinal cancers, alpha-fetoprotein (AFP) for hepatocellular carcinoma and germ cell tumors, and beta-human chorionic gonadotropin (beta-hCG) for germ cell tumors and trophoblastic diseases. Reference ranges vary by laboratory, patient demographics, and assay method, but established upper limits help guide clinical interpretation.

Prostate-specific antigen (PSA) is a serine protease produced by prostate epithelial cells, with normal serum levels typically below 4 ng/mL in men without prostate cancer. Levels above this threshold may prompt further evaluation, though PSA can be elevated in benign prostatic hyperplasia (BPH), prostatitis, or recent prostate manipulation, reducing its specificity for malignancy. For instance, BPH can cause PSA elevations due to increased prostate volume and glandular disruption, often necessitating differentiation via digital rectal exam, imaging, or biopsy. Serial PSA monitoring is preferred over single measurements to detect trends, such as a rise greater than 0.75 ng/mL per year, which may indicate progression. NCCN guidelines recommend age- and risk-adjusted PSA cutoffs for early detection; for example, men aged 40-49 at average risk may use a threshold of 2.5-3.0 ng/mL, while higher-risk individuals (e.g., family history or African ancestry) should start screening earlier with lower cutoffs like 1.0 ng/mL to balance sensitivity and overdiagnosis.

Cancer antigen 125 (CA-125) is a glycoprotein expressed on ovarian epithelial cells, with reference levels generally less than 35 U/mL in healthy individuals. Elevated CA-125 is observed in about 80% of advanced epithelial ovarian cancers but has low specificity, as it can rise in endometriosis, pelvic inflammatory disease, or menstruation. It is primarily used for monitoring response to therapy and detecting recurrence rather than screening, with serial declines post-treatment indicating favorable outcomes.

Carcinoembryonic antigen (CEA) serves as a marker for colorectal cancer, with normal values generally under 5 ng/mL; smokers have somewhat higher baseline levels due to tobacco-induced inflammation. Post-resection CEA normalization (to <5 ng/mL) predicts better prognosis, while persistent elevation signals residual disease; however, it overlaps with inflammatory conditions like inflammatory bowel disease. Alpha-fetoprotein (AFP) is a fetal glycoprotein produced by yolk sac and liver cells, with adult reference ranges below 10 ng/mL. Elevations exceeding this level are associated with hepatocellular carcinoma (especially in cirrhosis) or nonseminomatous germ cell tumors, correlating with tumor burden and guiding treatment decisions. Beta-human chorionic gonadotropin (beta-hCG), a placental hormone subunit, has reference levels below 5 IU/L in non-pregnant adults.
It is markedly elevated in choriocarcinoma and some germ cell tumors, aiding in diagnosis and surveillance, though transient rises can occur in hypogonadism or marijuana use. Overall, these markers' utility lies in longitudinal tracking rather than absolute values, with NCCN emphasizing risk-stratified approaches to minimize false positives. Brief overlaps with acute phase reactants, such as CEA elevations in general inflammation, underscore the need for integrated clinical assessment.
Tumor Marker | Associated Cancers | Reference Range (Adults) | Key Notes
PSA | Prostate | <4 ng/mL | Age/risk-adjusted cutoffs; elevated in BPH
CA-125 | Ovarian | <35 U/mL | Monitoring post-treatment; low specificity
CEA | Colorectal | <5 ng/mL (non-smokers) | Higher in smokers; serial for recurrence
AFP | Liver, germ cell | <10 ng/mL | Correlates with tumor burden
Beta-hCG | Germ cell, trophoblastic | <5 IU/L (non-pregnant) | Useful in staging and prognosis

Acute Phase Reactants

Acute phase reactants are a group of plasma proteins whose concentrations change significantly—typically by at least 25%—during inflammation, infection, or tissue injury, serving as non-specific indicators of the body's systemic response. These proteins are synthesized primarily by hepatocytes under the influence of cytokines such as interleukin-6, helping to modulate immune and inflammatory processes. They are classified as positive acute phase reactants, which increase in concentration, or negative acute phase reactants, which decrease to prioritize resources for the inflammatory response.

Positive acute phase reactants include C-reactive protein (CRP), ferritin, and procalcitonin, which rise rapidly to combat pathogens and limit tissue damage. CRP, for instance, binds to phosphocholine on damaged cells and bacteria, activating complement and promoting phagocytosis. Its reference range in healthy adults is typically less than 10 mg/L, with high-sensitivity CRP (hs-CRP) used for cardiovascular risk assessment: less than 1 mg/L indicates low risk, 1–3 mg/L average risk, and greater than 3 mg/L high risk. In acute inflammation, CRP levels can increase dramatically, doubling approximately every 8 hours and peaking within 36–50 hours after onset. Ferritin, an iron-storage protein, also acts as a positive acute phase reactant, with levels elevating during inflammation to sequester iron and deprive pathogens of this essential nutrient, potentially masking underlying iron deficiency anemia. Procalcitonin, a precursor to calcitonin, surges in bacterial infections and sepsis; concentrations below 0.5 ng/mL generally rule out severe systemic infection, while levels above 0.5 ng/mL suggest a high likelihood of bacterial sepsis requiring antibiotics.

Negative acute phase reactants, such as albumin, decrease during inflammation to conserve amino acids for synthesizing positive reactants and acute phase proteins like fibrinogen. Albumin levels typically fall below the normal range of 3.5–5.7 g/dL in such states, contributing to hypoalbuminemia observed in chronic inflammation or infection. The erythrocyte sedimentation rate (ESR), while not a protein, is a related non-specific marker of inflammation influenced by acute phase proteins like fibrinogen; normal values are less than 15 mm/h for men and less than 20 mm/h for women under 50 years, though these increase with age.
Marker | Reference Range (Normal/Non-Inflammatory) | Clinical Notes
CRP | <10 mg/L | hs-CRP <1 mg/L low CV risk; rises rapidly in infection.
ESR (Men <50) | <15 mm/h | Influenced by fibrinogen; higher in women and elderly.
ESR (Women <50) | <20 mm/h | Non-specific; complements CRP.
Ferritin | Varies by age/sex (e.g., 30–300 ng/mL men) | Elevates in inflammation; interpret with iron studies.
Procalcitonin | <0.5 ng/mL | Rules out sepsis; >0.5 ng/mL prompts antibiotic use.
Albumin (Negative) | 3.5–5.7 g/dL | Decreases in acute response; signals catabolism.

Autoantibodies and Immune Markers

Autoantibodies and immune markers in blood tests are essential for evaluating autoimmune diseases, immunodeficiencies, and certain malignancies, providing reference ranges that help clinicians assess immune system dysregulation. These tests measure specific antibodies produced against self-antigens or components of the immune response, such as immunoglobulins and complement proteins, which can indicate active disease processes when levels deviate from normal. Reference ranges vary slightly by laboratory and population demographics, but standardized values guide interpretation, with elevations or reductions signaling potential pathology like rheumatoid arthritis (RA) or systemic lupus erythematosus (SLE).

Antinuclear antibodies (ANA) are autoantibodies targeting nuclear components and serve as a screening tool for connective tissue diseases. The reference range for ANA titer is typically less than 1:40, considered negative, while titers of 1:40 or higher may warrant further investigation, though low-positive results (1:40 to 1:80) can occur in healthy individuals. Rheumatoid factor (RF), an autoantibody against the Fc portion of IgG, is associated with RA and other autoimmune conditions; normal levels are below 14 IU/mL, with elevations above this threshold increasing diagnostic specificity when combined with clinical findings. Anti-cyclic citrullinated peptide (anti-CCP) antibodies offer higher specificity for RA than RF; the reference range is less than 20 U/mL for negative results, with values at or above this level supporting early diagnosis and prognosis.

Complement proteins C3 and C4 are key components of the classical and alternative pathways, consumed during immune complex-mediated inflammation. Normal serum levels for C3 range from 0.9 to 1.8 g/L, and for C4 from 0.1 to 0.4 g/L in adults, with reductions often reflecting active disease consumption. In SLE, low C3 levels (below 0.9 g/L) during flares correlate with increased disease activity, particularly renal involvement, serving as a biomarker for monitoring therapeutic response.

Immunoglobulins represent the humoral arm of immunity, with quantitative assays establishing baseline immune competence. Serum IgG levels in healthy adults typically range from 7 to 16 g/L, providing long-term protection against pathogens. IgA concentrations are 0.7 to 4 g/L, crucial for mucosal immunity, while IgM levels of 0.4 to 2.3 g/L indicate acute responses to new antigens. Deviations, such as hypogammaglobulinemia (e.g., IgG below 7 g/L), may signal primary immunodeficiencies, whereas polyclonal hypergammaglobulinemia can occur in chronic infections or autoimmunity. Serum protein immunofixation electrophoresis is used to detect monoclonal proteins in multiple myeloma, identifying abnormal immunoglobulin bands that indicate clonal plasma cell proliferation. In healthy individuals, no monoclonal bands are present; detection of a discrete M-protein spike, often IgG or IgA type, confirms monoclonality when serum protein electrophoresis shows an abnormality.
Marker | Reference Range (Adults) | Clinical Context
ANA Titer | <1:40 (negative) | Screening for autoimmune diseases like SLE
RF | <14 IU/mL | Diagnosis of RA; elevated in 70-80% of cases
Anti-CCP | <20 U/mL (negative) | Specific for RA; predicts erosive disease
Complement C3 | 0.9-1.8 g/L | Low in active SLE flares
Complement C4 | 0.1-0.4 g/L | Reduced in immune complex diseases
IgG | 7-16 g/L | Hypogammaglobulinemia if low
IgA | 0.7-4 g/L | Mucosal immunity assessment
IgM | 0.4-2.3 g/L | Acute infection response

Hormones and Vitamins

Thyroid Hormones

Thyroid hormones play a crucial role in regulating metabolism, growth, and development through the hypothalamic-pituitary-thyroid axis. Blood tests for thyroid function primarily measure thyroid-stimulating hormone (TSH) from the pituitary gland, which stimulates the thyroid to produce thyroxine (T4) and triiodothyronine (T3). These tests help diagnose conditions like hypothyroidism and hyperthyroidism by comparing levels to established reference ranges, which can vary slightly by laboratory but are generally standardized for adults.

The reference range for TSH in adults is typically 0.4-4.0 mU/L, serving as the initial screening test due to its sensitivity in detecting thyroid dysfunction. Free T4, the unbound form of thyroxine available for cellular use, has a reference range of 9-23 pmol/L in adults. Free T3, the active form influencing metabolic rate, ranges from 3.1-6.8 pmol/L. Total T4, which includes both bound and unbound thyroxine, is measured at 58-140 nmol/L, though free T4 is preferred for accuracy as it is less affected by binding proteins.
Test | Reference Range (Adults) | Unit
TSH | 0.4-4.0 | mU/L
Free T4 | 9-23 | pmol/L
Free T3 | 3.1-6.8 | pmol/L
Total T4 | 58-140 | nmol/L
Antibody tests, such as anti-thyroid peroxidase (anti-TPO), aid in identifying autoimmune thyroiditis; the reference range is less than 35 IU/mL, with elevated levels indicating potential Hashimoto's disease. The thyrotropin-releasing hormone (TRH) stimulation test, which historically assessed pituitary responsiveness by measuring a TSH rise greater than twofold above baseline, is now outdated following the development of sensitive TSH assays that better evaluate the axis without stimulation. Subclinical hypothyroidism is characterized by mildly elevated TSH levels (4-10 mU/L) with normal free T4, often warranting monitoring rather than immediate treatment unless symptoms or risk factors are present. Subclinical hyperthyroidism similarly involves suppressed TSH below 0.4 mU/L with normal T4 and T3 levels. Reference ranges require adjustments for physiological states; for instance, the TSH upper limit is 2.5 mU/L in the first trimester of pregnancy, lower than non-pregnant levels, to account for hCG-mediated thyroid stimulation. Age-related changes may also elevate the TSH upper limit in older adults, though specific thresholds vary by guideline.
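The sketch below applies the pattern-based reading described above, using the adult reference ranges from the table and the subclinical definitions in the preceding paragraph. It is a deliberate simplification: real interpretation must also account for pregnancy, age, assay differences, medications, and non-thyroidal illness.

```python
# Simplified labelling of TSH / free T4 patterns against the adult ranges
# in the table above (TSH 0.4-4.0 mU/L, free T4 9-23 pmol/L). Illustrative only.

def thyroid_pattern(tsh_mu_l: float, free_t4_pmol_l: float) -> str:
    """Label common TSH / free T4 patterns against the adult reference ranges."""
    t4_low, t4_high = 9.0, 23.0
    tsh_low, tsh_high = 0.4, 4.0
    t4_normal = t4_low <= free_t4_pmol_l <= t4_high

    if tsh_low <= tsh_mu_l <= tsh_high and t4_normal:
        return "euthyroid pattern"
    if tsh_mu_l > tsh_high:
        if free_t4_pmol_l < t4_low:
            return "overt hypothyroidism pattern"
        return "subclinical hypothyroidism pattern" if t4_normal else "discordant pattern; review clinically"
    if tsh_mu_l < tsh_low:
        if free_t4_pmol_l > t4_high:
            return "overt hyperthyroidism pattern"
        return "subclinical hyperthyroidism pattern" if t4_normal else "discordant pattern; review clinically"
    return "discordant pattern; review clinically"

# Example: TSH 6.5 mU/L with free T4 14 pmol/L fits a subclinical hypothyroid pattern.
print(thyroid_pattern(6.5, 14))
```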

Sex and Adrenal Hormones

Sex and adrenal hormones encompass key regulators of reproduction, stress response, and secondary sexual characteristics, with reference ranges varying significantly by sex, age, menstrual cycle phase, and time of day due to diurnal rhythms and physiological fluctuations. These hormones include gonadal steroids like testosterone, estradiol, and progesterone, which fluctuate across the menstrual cycle in females, and adrenal products such as cortisol, adrenocorticotropic hormone (ACTH), and dehydroepiandrosterone sulfate (DHEA-S), which are influenced by the hypothalamic-pituitary-adrenal (HPA) axis. Accurate interpretation requires consideration of timing, as levels can indicate conditions like hypogonadism, polycystic ovary syndrome, or Cushing's syndrome when outside normal ranges. Reference ranges are established through population studies and may differ slightly between laboratories, but standardized values provide clinical benchmarks.

Testosterone, the primary male androgen but present in both sexes, supports muscle mass, bone density, and libido. In adult males, total testosterone levels typically range from 8.6 to 29 nmol/L, measured in morning samples to account for diurnal variation. In adult females, levels are lower, ranging from 0.3 to 1.9 nmol/L, with minimal cycle-related changes. Elevated or low levels can signal disorders like androgen excess or deficiency, often assessed alongside free testosterone for bioavailability.

Estradiol, the predominant estrogen, drives female reproductive cycles and bone health. In females during the follicular phase (early menstrual cycle), serum estradiol ranges from 45 to 854 pmol/L, rising toward ovulation. Postmenopausal levels drop below 100 pmol/L, reflecting ovarian decline. In males, levels are consistently low, around 40-160 pmol/L, contributing to estrogen balance. Progesterone, which prepares the uterus for pregnancy, remains low in the follicular phase at less than 5 nmol/L, surging to over 20 nmol/L post-ovulation in the luteal phase. Anovulation is suggested when mid-luteal progesterone fails to rise above this post-ovulatory threshold.

Adrenal hormones like cortisol and ACTH regulate stress and metabolism via the HPA axis. Morning cortisol (8-9 AM) normally ranges from 140 to 690 nmol/L, decreasing throughout the day; evening levels below 50% of morning values indicate a healthy rhythm. ACTH, which stimulates cortisol production, has a reference range of 2 to 11 pmol/L, also peaking in the morning. The dexamethasone suppression test assesses HPA feedback, with post-test cortisol below 50 nmol/L confirming normal suppression. DHEA-S, an adrenal androgen precursor to testosterone, ranges from 2.7 to 13.5 μmol/L in adult males, declining with age after peaking in the 20s.

The gonadotropins follicle-stimulating hormone (FSH) and luteinizing hormone (LH) from the pituitary orchestrate gonadal function. In the follicular phase of premenopausal females, FSH ranges from 3 to 10 IU/L, with LH slightly lower at 2 to 10 IU/L. After menopause, FSH elevates above 30 IU/L due to lack of ovarian feedback, often exceeding 40 IU/L as a diagnostic marker. These phase-specific ranges highlight the need for timed sampling to evaluate fertility, menopause, or pituitary disorders.
Hormone | Sex/Phase | Reference Range | Units | Notes
Testosterone (total) | Adult males | 8.6–29 | nmol/L | Morning sample preferred
Testosterone (total) | Adult females | 0.3–1.9 | nmol/L | Stable across cycle
Estradiol | Females, follicular | 45–854 | pmol/L | Early cycle (days 1–14)
Progesterone | Females, follicular | <5 | nmol/L | Indicates low pre-ovulatory levels
Cortisol | Adults, AM (8–9 AM) | 140–690 | nmol/L | Diurnal peak
ACTH | Adults, morning | 2–11 | pmol/L | Stimulates adrenal cortisol
DHEA-S | Adult males (20–30 years) | 2.7–13.5 | μmol/L | Declines with age
FSH | Females, follicular | 3–10 | IU/L | Rises postmenopause >30
LH | Females, follicular | 2–10 | IU/L | Surges at midcycle
Interactions with other endocrine axes, such as minor influences on thyroid function, may modulate these hormones but are secondary to gonadal and HPA dynamics.

Vitamins and Nutritional Markers

Vitamins and nutritional markers in blood tests assess the status of essential micronutrients and proteins involved in metabolic processes, immune function, and overall health. These tests measure circulating levels of fat-soluble and water-soluble vitamins, as well as carrier proteins like prealbumin, to identify deficiencies, excesses, or functional impairments that may contribute to conditions such as anemia, neuropathy, or bone disease. Reference ranges vary by laboratory method, age, and population, but established guidelines provide benchmarks for interpretation, emphasizing the importance of clinical context alongside results.

Vitamin D, primarily evaluated through 25-hydroxyvitamin D (25-OH D), is crucial for calcium homeostasis and bone metabolism. Sufficient serum levels are generally considered 50-125 nmol/L (20-50 ng/mL), as recommended by the Institute of Medicine, with values below 50 nmol/L indicating potential deficiency and levels above 125 nmol/L risking toxicity in some cases. Vitamin B12 (cobalamin) supports red blood cell formation and neurological function, with serum reference ranges typically spanning 148-675 pmol/L (200-914 pg/mL) in adults. Levels below 148 pmol/L suggest deficiency, while functional assessment via methylmalonic acid (MMA) is useful when B12 is borderline; normal serum MMA is less than 0.40 μmol/L, with elevations above this threshold confirming impaired B12 utilization even if direct B12 measurement is normal. Folate (vitamin B9) is essential for DNA synthesis and prevents megaloblastic anemia, with serum reference ranges commonly 6.8-45 nmol/L (3-20 ng/mL); values below 6.8 nmol/L indicate deficiency, though red blood cell folate provides a longer-term status indicator.

Retinol (vitamin A) maintains vision, skin integrity, and immune response, with normal serum concentrations ranging from 1.05-2.45 μmol/L (30-70 mcg/dL). Deficiency is defined as below 0.70 μmol/L, while toxic levels exceeding 3.5 μmol/L (>100 mcg/dL) can lead to hypervitaminosis A, characterized by liver damage and raised intracranial pressure. Thiamine (vitamin B1) is vital for energy metabolism, particularly in nerve and muscle cells, with reference ranges of 70-180 nmol/L indicating adequate status; levels below 70 nmol/L signal deficiency, often seen in chronic alcohol use or malnutrition.

Prealbumin, also known as transthyretin, serves as a marker of protein nutritional status due to its short half-life and hepatic synthesis, with adult serum reference ranges typically 20-40 mg/dL (200-400 mg/L). Low levels below 15 mg/dL correlate with acute protein-energy malnutrition, though interpretation requires adjustment for non-nutritional factors like renal disease. These markers highlight the balance between deficiency and toxicity, where optimal ranges support health without excess risk; for instance, while vitamin D sufficiency aligns with conventional ranges, some guidelines advocate higher targets for specific benefits such as fracture prevention.
Marker | Specimen | Reference Range (Adults) | Key Interpretation
25-OH Vitamin D | Serum | 50-125 nmol/L | Sufficient; <50 nmol/L: deficiency risk
Vitamin B12 | Serum | 148-675 pmol/L | Normal; <148 pmol/L: deficiency
Folate | Serum | 6.8-45 nmol/L | Normal; <6.8 nmol/L: deficiency
Retinol (Vitamin A) | Serum | 1.05-2.45 μmol/L | Normal; >3.5 μmol/L: toxicity
Thiamine (Vitamin B1) | Whole Blood | 70-180 nmol/L | Adequate; <70 nmol/L: deficiency
Prealbumin | Serum | 20-40 mg/dL | Normal nutrition; <15 mg/dL: malnutrition risk
MMA (for B12 status) | Serum | <0.40 μmol/L | Normal; >0.40 μmol/L: functional B12 deficiency

Other Metabolites and Toxins

Renal Function Markers

Renal function markers are blood and urine tests used to assess the kidneys' ability to filter waste products from the blood, maintain fluid and electrolyte balance, and excrete toxins, providing essential insights into kidney health and detecting conditions like chronic kidney disease (CKD). These markers primarily evaluate glomerular filtration rate (GFR) and waste clearance, with reference ranges varying by age, sex, and laboratory methods. Abnormal levels can indicate impaired renal function, often prompting further evaluation for underlying causes such as diabetes or hypertension.

Serum creatinine, a byproduct of muscle metabolism filtered by the glomeruli, serves as a key indicator of renal filtration, with normal reference ranges of 62-106 μmol/L for adult males and 44-80 μmol/L for adult females, influenced by factors like muscle mass, diet, and age. Higher levels suggest reduced GFR, though creatinine can be misleading in individuals with low muscle mass, such as the elderly or malnourished, where it may appear normal despite kidney impairment. Serum urea, reported in some countries as blood urea nitrogen (BUN), measures another waste product from protein breakdown, with a typical range of 2.5-7.8 mmol/L in adults; elevated urea often correlates with dehydration or reduced renal perfusion, but it is less specific than creatinine due to influences from gastrointestinal bleeding or high-protein intake.

Estimated glomerular filtration rate (eGFR), calculated from serum creatinine using equations like the CKD-EPI formula, provides a more accurate assessment of kidney function than creatinine alone, with normal values exceeding 90 mL/min/1.73 m² for adults under 60 years, declining gradually with age. The 2021 CKD-EPI equation incorporates age and sex (race-free since 2021 to address inequities) to estimate GFR, offering improved precision over older formulas like MDRD, particularly for values above 60 mL/min/1.73 m². As an alternative to creatinine, cystatin C, a cysteine protease inhibitor produced at a constant rate by all nucleated cells, has a reference range of 0.6–1.2 mg/L in adults and is less affected by muscle mass, making it valuable for confirming eGFR in cases where creatinine may be unreliable.

In urine tests, the albumin-to-creatinine ratio (ACR) assesses early kidney damage by detecting albuminuria, with normal values below 3 mg/mmol indicating intact glomerular barrier function; ratios between 3-30 mg/mmol suggest moderately increased albuminuria (microalbuminuria), a risk factor for CKD progression. These markers collectively guide staging of CKD according to guidelines from organizations like KDIGO, where eGFR below 60 mL/min/1.73 m² for over three months defines stage 3 disease. Adjustments for muscle mass are crucial, as creatinine-based estimates can underestimate GFR in muscular individuals and overestimate it in those with low muscle mass. Electrolyte imbalances, such as hyperkalemia, may accompany advanced renal dysfunction but are evaluated separately.
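The sketch below illustrates the 2021 race-free CKD-EPI creatinine equation mentioned above. The coefficients are those published for the 2021 equation and should be verified against the original publication before any clinical use; the creatinine conversion factor from μmol/L to mg/dL (88.4) is standard, and the example values are hypothetical.

```python
# Illustrative 2021 CKD-EPI creatinine equation (race-free):
# eGFR = 142 x min(Scr/k, 1)^a x max(Scr/k, 1)^-1.200 x 0.9938^age x 1.012 (if female),
# with k = 0.7 (female) / 0.9 (male) and a = -0.241 (female) / -0.302 (male).

def egfr_ckd_epi_2021(creatinine_umol_l: float, age_years: float, female: bool) -> float:
    """Estimated GFR in mL/min/1.73 m^2 from serum creatinine, age and sex."""
    scr_mg_dl = creatinine_umol_l / 88.4          # convert umol/L to mg/dL
    kappa = 0.7 if female else 0.9
    alpha = -0.241 if female else -0.302
    ratio = scr_mg_dl / kappa
    egfr = (142
            * min(ratio, 1.0) ** alpha
            * max(ratio, 1.0) ** -1.200
            * 0.9938 ** age_years)
    if female:
        egfr *= 1.012
    return round(egfr, 1)

# Example: a 60-year-old man with creatinine 90 umol/L (~1.0 mg/dL)
# gives an eGFR of roughly 84 mL/min/1.73 m^2, i.e. broadly normal for age.
print(egfr_ckd_epi_2021(90, 60, female=False))
```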

Glucose and Metabolic Panels

Glucose and metabolic panels are essential laboratory assessments used to evaluate glucose metabolism, insulin secretion, and insulin resistance, aiding in the diagnosis of diabetes mellitus, prediabetes, and related metabolic disturbances. These panels typically include measurements of blood glucose under various conditions, glycated hemoglobin for long-term glycemic control, and markers of pancreatic beta-cell function such as insulin and C-peptide. They provide critical data on glucose homeostasis and help identify risks for conditions like metabolic syndrome, where impaired glucose regulation contributes to broader cardiovascular and endocrine dysfunction.

Fasting plasma glucose serves as a primary screening tool, with normal levels in healthy adults ranging from 3.9 to 5.6 mmol/L (70 to 100 mg/dL); values between 5.6 and 6.9 mmol/L indicate impaired fasting glucose, a prediabetic state. For non-symptomatic individuals, random plasma glucose below 11.1 mmol/L (200 mg/dL) is considered within normal limits, though levels at or above this threshold with symptoms suggest diabetes. Glycated hemoglobin (HbA1c) reflects average blood glucose over 2-3 months, with normal values below 5.7% (typically 4% to 5.6% in non-diabetic populations).

Insulin and C-peptide measurements assess endogenous insulin production, particularly in distinguishing type 1 from type 2 diabetes or evaluating insulinomas. Fasting serum insulin levels in healthy adults normally range from 2.6 to 24.9 μU/mL (or mIU/L). Fasting C-peptide, a byproduct of proinsulin cleavage that correlates with insulin secretion, typically measures 0.3 to 0.6 nmol/L in healthy individuals, rising postprandially to 1 to 3 nmol/L.

The metabolic syndrome panel often incorporates indices like the Homeostatic Model Assessment of Insulin Resistance (HOMA-IR) to quantify insulin sensitivity, calculated using the formula

\text{HOMA-IR} = \frac{\text{fasting glucose (mmol/L)} \times \text{fasting insulin (μU/mL)}}{22.5}

Values below 2 generally indicate normal insulin sensitivity, while higher levels suggest resistance, a key feature of metabolic syndrome; for example, scores above 2.5 are associated with increased cardiometabolic risk (a worked calculation follows the table below). The oral glucose tolerance test (OGTT) dynamically evaluates glucose handling after a 75-g glucose load, with a normal 2-hour plasma glucose level below 7.8 mmol/L (140 mg/dL); impaired glucose tolerance is defined as 7.8 to 11.0 mmol/L, and levels at or above 11.1 mmol/L indicate diabetes. These panels may briefly reference lipid components, such as triglycerides and HDL cholesterol, to contextualize overall metabolic risk without detailed lipid analysis.
Test | Normal Reference Range | Units | Notes
Fasting Plasma Glucose | 3.9–5.6 | mmol/L (70–100 mg/dL) | After 8-hour fast; impaired fasting glucose: 5.6–6.9 mmol/L.
Random Plasma Glucose (non-symptomatic) | <11.1 | mmol/L (<200 mg/dL) | Diabetes threshold ≥11.1 mmol/L with symptoms.
HbA1c | <5.7% (4–5.6%) | % | Reflects 2–3 month average; prediabetes: 5.7–6.4%.
Fasting Insulin | 2.6–24.9 | μU/mL | Measures beta-cell output; lab-specific variations apply.
Fasting C-Peptide | 0.3–0.6 | nmol/L | Indicates endogenous insulin production; rises post-meal.
OGTT (2-hour) | <7.8 | mmol/L (<140 mg/dL) | After 75-g load; impaired: 7.8–11.0 mmol/L.
HOMA-IR | <2 | Index | Normal sensitivity; >2.5 suggests resistance.
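The worked sketch below applies the HOMA-IR formula given above; the mg/dL-to-mmol/L conversion factor (divide by 18) is standard, the thresholds are those quoted in the text, and the example values are hypothetical.

```python
# HOMA-IR = fasting glucose (mmol/L) x fasting insulin (uU/mL) / 22.5.

def homa_ir(fasting_glucose_mmol_l: float, fasting_insulin_uu_ml: float) -> float:
    """Homeostatic Model Assessment of Insulin Resistance."""
    return fasting_glucose_mmol_l * fasting_insulin_uu_ml / 22.5

def homa_ir_from_mg_dl(fasting_glucose_mg_dl: float, fasting_insulin_uu_ml: float) -> float:
    """Same index with fasting glucose supplied in mg/dL."""
    return homa_ir(fasting_glucose_mg_dl / 18.0, fasting_insulin_uu_ml)

# Example: glucose 5.5 mmol/L with insulin 12 uU/mL gives HOMA-IR ~2.9,
# above the 2.5 level associated with increased cardiometabolic risk.
print(round(homa_ir(5.5, 12), 2))
```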

Toxic Substances and Drug Levels

Therapeutic drug monitoring (TDM) involves measuring blood concentrations of medications with narrow therapeutic indices to ensure efficacy while minimizing toxicity risks, guiding dose adjustments based on pharmacokinetic principles such as absorption, distribution, metabolism, and elimination. This approach is essential for drugs like digoxin, lithium, and vancomycin, where individual variability in clearance, due to factors like age, renal function, and drug interactions, can lead to subtherapeutic or supratherapeutic levels. TDM typically targets steady-state concentrations, often assessed via trough levels drawn just before the next dose, to correlate plasma levels with clinical outcomes.

For digoxin, used in heart failure and atrial fibrillation, the therapeutic range is generally 0.5-2.0 ng/mL, though lower targets (0.5-0.9 ng/mL) may suffice for heart failure to reduce toxicity risks without losing efficacy. Lithium, employed for bipolar disorder maintenance, has a therapeutic range of 0.6-1.2 mmol/L, with levels above 1.2 mmol/L increasing the risk of toxicity, which can manifest as tremor or renal impairment. Vancomycin, a glycopeptide antibiotic used for serious infections, requires trough levels of 10-20 mg/L to achieve adequate tissue penetration, particularly for MRSA infections, while avoiding nephrotoxicity at higher concentrations. Monitoring for these drugs incorporates half-life considerations to time sampling appropriately; for instance, theophylline's half-life in young adults is 4-8 hours, necessitating levels checked after steady state (about 5 half-lives) to adjust dosing for asthma or COPD management.

Environmental toxins like lead require surveillance, with the CDC blood lead reference value (BLRV) at 3.5 μg/dL (0.17 μmol/L) as of 2021 to identify elevated exposure in adults (no safe threshold exists, as higher concentrations impair neurocognitive function and hematopoiesis). Acetaminophen overdose assessment uses a 4-hour post-ingestion level below 150 μg/mL to rule out hepatotoxicity risk, per the Rumack-Matthew nomogram, prompting N-acetylcysteine therapy if exceeded (a simplified sketch of the treatment line follows the table below). Overdose thresholds for salicylates, such as aspirin, indicate toxicity above 3 mmol/L (approximately 40 mg/dL), manifesting as tinnitus, hyperventilation, and metabolic acidosis, with severe cases exceeding 7 mmol/L requiring urgent intervention such as hemodialysis. These ranges underscore TDM's role in balancing benefits against adverse effects, with serial measurements essential during acute intoxication or chronic exposure to prevent organ damage.
Substance | Therapeutic/Toxic Range | Units | Key Monitoring Note
Digoxin | Therapeutic: 0.5-2.0 ng/mL | ng/mL | Trough at steady state; lower target for heart failure
Lithium | Therapeutic: 0.6-1.2 mmol/L | mmol/L | 12-hour trough; avoid >1.2 mmol/L
Vancomycin | Therapeutic trough: 10-20 mg/L | mg/L | Before 4th dose; adjust for renal function
Lead | Reference value: <3.5 μg/dL (<0.17 μmol/L) | μg/dL (μmol/L) | Whole blood; no safe threshold (CDC BLRV, 2021)
Acetaminophen | Toxic if ≥150 μg/mL at 4 h post-ingestion | μg/mL | Nomogram-based; treat if above line
Salicylate | Toxic: >3 mmol/L | mmol/L | Serial levels; hemodialysis for severe toxicity
Theophylline | Half-life: 4-8 hours (young adults) | hours | Check steady-state level after about 20 hours (roughly 5 half-lives)
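The sketch below is a hedged simplification of the Rumack-Matthew treatment line mentioned above: the line is taken to start at 150 μg/mL four hours post-ingestion and to fall with roughly a four-hour half-life out to 24 hours. It is for illustration only; clinical decisions use the published nomogram and poisons-centre guidance.

```python
# Simplified Rumack-Matthew treatment line for acetaminophen:
# 150 ug/mL at 4 h, falling with an approximately 4-hour half-life to 24 h.

def rumack_matthew_threshold(hours_post_ingestion: float) -> float:
    """Approximate treatment-line concentration (ug/mL) between 4 and 24 hours."""
    if not 4 <= hours_post_ingestion <= 24:
        raise ValueError("the nomogram applies only between 4 and 24 hours post-ingestion")
    return 150.0 * 0.5 ** ((hours_post_ingestion - 4) / 4)

def needs_antidote(level_ug_ml: float, hours_post_ingestion: float) -> bool:
    """True if the measured level is at or above the (simplified) treatment line."""
    return level_ug_ml >= rumack_matthew_threshold(hours_post_ingestion)

# Example: 100 ug/mL at 8 hours exceeds the ~75 ug/mL line, so N-acetylcysteine
# would be indicated under this simplification.
print(needs_antidote(100, 8))
```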

Urinalysis Reference Ranges

A complete urinalysis (also known as urine analysis or UA) is a routine laboratory test that examines the physical, chemical, and microscopic properties of urine to screen for urinary tract infections, kidney disorders, metabolic conditions such as diabetes, and other abnormalities. The parameters below provide approximate normal reference ranges for a standard urinalysis. These values are approximate and may vary depending on the laboratory methods, patient age, sex, hydration status, diet, and other individual factors. Results should always be interpreted by a physician in the context of the patient's clinical presentation.
Parameter | Normal Range | Notes
Color | Pale yellow to yellow | Influenced by hydration, diet, and medications.
Clarity | Clear | Cloudiness may suggest infection, crystals, or other abnormalities.
pH | 4.5-8 | Average around 6; varies with diet and certain conditions.
Specific gravity | 1.005-1.030 | Reflects kidney concentrating ability and hydration status.
Glucose | Negative | Positive may indicate diabetes or renal threshold issues.
Protein | Negative or trace | Higher levels may suggest kidney damage.
Ketones | Negative | Positive in diabetic ketoacidosis, starvation, or prolonged fasting.
Blood | Negative | Positive may indicate hematuria from infection, stones, or other causes.
Leukocytes (WBC) | 0-5 per high-power field | Higher counts suggest urinary tract infection or inflammation.
Red blood cells (RBC) | 0-3 per high-power field | Higher indicates microscopic hematuria; further evaluation needed.
Bacteria | Negative or few | Significant presence may indicate urinary tract infection.
