Gold standard (test)

A gold standard is the best available test for determining whether a condition is present. The term is used most commonly in medicine to denote the single test that best establishes whether an individual has a particular disease process.

An ideal gold standard test has a sensitivity of 100% (it identifies every individual who has the disease process, producing no false-negative results) and a specificity of 100% (it never identifies the disease in an individual who does not have it, producing no false-positive results). In practice, no ideal gold standard test exists.
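
Both measures can be computed directly from the counts of correct and incorrect test results. The short Python sketch below is illustrative only; the counts used in the example are hypothetical, not taken from any real study.

    def sensitivity(true_pos, false_neg):
        # Proportion of diseased individuals the test correctly identifies.
        return true_pos / (true_pos + false_neg)

    def specificity(true_neg, false_pos):
        # Proportion of disease-free individuals the test correctly clears.
        return true_neg / (true_neg + false_pos)

    # Hypothetical counts: 95 true positives, 5 false negatives,
    # 92 true negatives, 8 false positives.
    print(sensitivity(95, 5))   # 0.95
    print(specificity(92, 8))   # 0.92

A perfect test would return 1.0 for both functions, which corresponds to the 100% sensitivity and 100% specificity described above.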

Because even the gold standard test can be incorrect (producing either a false-negative or a false-positive result), its results should be interpreted in the context of the history, physical findings, and other test results of the individual being tested. It is within this context that the sensitivity and specificity of the gold standard test are determined.

Quite often the gold standard test is not the test actually performed on a particular individual. In fact, many gold standard tests are not performed in clinical practice at all, because the gold standard test may be difficult to perform, may be impossible to perform on a living person (i.e., it is performed as part of an autopsy), or may take too long to yield results that are clinically useful.

As new diagnostic methods become available, the gold standard test may change over time. For instance, for the diagnosis of aortic dissection, the gold standard test used to be the aortogram, which had a sensitivity as low as 83% and a specificity as low as 87%. More recently, with advances in magnetic resonance imaging, the magnetic resonance angiogram (MRA) has become the new gold standard test for aortic dissection, with a sensitivity of 95% and a specificity of 92%.
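
To make the difference between those figures concrete, the sketch below translates the quoted sensitivity and specificity values into expected numbers of missed and spurious diagnoses. The cohort sizes (1000 patients with dissection and 1000 without) are assumptions chosen purely for illustration, not figures from the source.

    # Sensitivity/specificity values quoted above for each test.
    tests = {
        "aortogram": {"sensitivity": 0.83, "specificity": 0.87},
        "MRA":       {"sensitivity": 0.95, "specificity": 0.92},
    }

    diseased, healthy = 1000, 1000  # hypothetical cohort sizes

    for name, perf in tests.items():
        # False negatives come from the diseased group the test misses;
        # false positives come from the healthy group it wrongly flags.
        false_negatives = round((1 - perf["sensitivity"]) * diseased)
        false_positives = round((1 - perf["specificity"]) * healthy)
        print(f"{name}: ~{false_negatives} missed dissections, "
              f"~{false_positives} false alarms")

Under these assumed cohort sizes, the aortogram would miss roughly 170 dissections and raise about 130 false alarms, while the MRA would miss about 50 and raise about 80, which is why it displaced the aortogram as the gold standard.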
