Which preanalytical factors that can affect the validity of test results are not always under the phlebotomist's control?

Extraction Techniques and Applications: Biological/Medical and Environmental/Forensics

F. Bamforth, in Comprehensive Sampling and Sample Preparation, 2012

3.09.8.3 Sample Preparation for MS/MS

The pre-analytical phase is a crucial part of the analysis and is as important as the analytical component. Components of the pre-analytical phase include sample preparation and reagent preparation, including the extraction solution, mobile phase, reconstitution reagent, derivatization reagent, and internal standards. It is important that HPLC-grade reagents and solvents are used throughout the analysis. The use of sodium dodecyl sulfate (SDS) in the sample preparation of DBS for MS/MS is avoided because of its interference in ESI. Compounds present in blood or in the filter paper matrix may also cause ion suppression, and it is essential that method validation is complete before introducing assays into clinical practice.40

The sensitivity and accuracy of MS/MS allow quantification of very small concentrations of analytes, and it is important that a good-quality blood sample is collected from the newborn to facilitate optimum detection. Typically, a 3.2-mm punch (equivalent to 3.4 μl of whole blood) or a 4.8-mm punch (equivalent to 7.6 μl of whole blood) is taken from the DBS sample for analysis.36
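As a rough consistency check on the quoted punch volumes, the sketch below assumes that the whole-blood volume in a punch scales with the punched area; the function and constants are illustrative and are not taken from the chapter.

```python
# Hypothetical sketch: estimate whole-blood volume in a DBS punch by scaling the
# 3.2-mm / 3.4-uL reference quoted above by the ratio of punched areas.

REFERENCE_DIAMETER_MM = 3.2   # punch diameter quoted in the text
REFERENCE_VOLUME_UL = 3.4     # whole-blood equivalent quoted in the text

def punch_volume_ul(diameter_mm: float) -> float:
    """Scale the reference volume by the ratio of punched areas (diameter squared)."""
    return REFERENCE_VOLUME_UL * (diameter_mm / REFERENCE_DIAMETER_MM) ** 2

if __name__ == "__main__":
    # A 4.8-mm punch comes out at ~7.65 uL, in line with the ~7.6 uL figure in the text.
    print(f"4.8 mm punch ~ {punch_volume_ul(4.8):.2f} uL whole blood")
```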

Amino acids and acylcarnitines may be analyzed as their esterified derivatives or as nonderivatized compounds. Although conversion to butyl esters enhances ionization, particularly for the dicarboxylic acid carnitine derivatives, and minimizes possible interference from the solvent, it may introduce some interferences not detected when the compounds are not derivatized. Derivatization does not alter the m/z 85 daughter ion used in acylcarnitine analysis but does change the m/z of the neutral loss for amino acids.40 Derivatized methods result in neutral loss of a fragment with an m/z of 102, while nonderivatized methods result in neutral loss of a fragment with an m/z of 56. Both derivatized and nonderivatized methods are currently used in newborn screening, although a 2005 survey of methodologies for phenylalanine by MS/MS showed that of 611 laboratories using MS/MS, 542 (89%) were using derivatization, 424 of them with a nonkit method. Only 69 laboratories were using a nonderivatized method, 19 of them with a kit method.25 One disadvantage of derivatization is that it increases the time required for sample preparation. Interferences from other matrix components with the same m/z may also contribute to higher false-positive rates. While nonderivatized methods are faster, there is general agreement that derivatization improves the volatility and ionization of some acylcarnitines, in turn increasing sensitivity and specificity.

Regardless of methodology, the analytes are extracted from the DBS by elution in methanol containing known concentrations of stable-isotope (usually deuterated) amino acid and acylcarnitine internal standards, over approximately 30 min. Methanol, acetonitrile, or a mixture of both efficiently elutes analytes from the filter paper and causes protein denaturation and precipitation, giving a cleaner extract; extraction efficiency exceeds 90%.36 The extracts are transferred to a 96-well microtiter plate and dried under a stream of nitrogen with gentle warming. If derivatization is employed, the dried extracts are heated with 3N acidified butanol for 15–30 min at 60–65 °C. Longer incubation and higher temperatures may excessively hydrolyze acylcarnitines, while lower temperatures and shorter incubations may incompletely derivatize them.36 The samples are reconstituted in a suitable medium facilitating soft ionization, typically a mixture of water and an organic solvent (e.g., methanol, acetonitrile, or isopropanol), and are introduced into the MS/MS via a syringe or HPLC pump, where they are subjected to ESI.
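As a simple illustration of the incubation window described above, the hypothetical helper below flags butylation conditions outside 60–65 °C for 15–30 min; the function name and structure are assumptions made for the example, not part of the published method.

```python
# Illustrative only: check that butyl-ester derivatization conditions fall within the
# window quoted in the text (15-30 min at 60-65 degrees C).

def derivatization_conditions_ok(temp_c: float, minutes: float) -> bool:
    """True when the incubation stays inside the 60-65 C / 15-30 min window."""
    return 60.0 <= temp_c <= 65.0 and 15.0 <= minutes <= 30.0

# Hotter or longer incubations risk hydrolyzing acylcarnitines; cooler or shorter ones
# risk incomplete derivatization.
print(derivatization_conditions_ok(63, 20))   # True
print(derivatization_conditions_ok(75, 45))   # False
```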

The improved specificity of MS/MS in PKU detection was demonstrated in a retrospective analysis of DBS samples from 208 newborns collected before 24 h of age. Screening for PKU before 24 h of age may be less reliable because the phenylalanine concentration may still be below the critical limit and therefore not identified as abnormal. The samples had initially been tested by fluorometric phenylalanine measurement. There were 91 samples with an initially elevated phenylalanine level but normal phenylalanine on a second sample (i.e., a false-positive initial screen), 93 samples with an initial negative screen, and 12 samples from newborns with confirmed PKU. Both tyrosine and phenylalanine were measured by MS/MS and the molar ratio of phenylalanine to tyrosine (phe/tyr) was calculated; a phe/tyr ratio of >2.5 was deemed abnormal. By MS/MS, only 3 of the 91 false-positive results from the fluorometric analysis remained positive, demonstrating the greater specificity of MS/MS compared with fluorometric analysis. Applying the phe/tyr ratio to those three cases left only one false-positive result. All PKU cases confirmed by fluorometric analysis were also abnormal by MS/MS.41
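The ratio-based flag used in that study can be expressed compactly. The sketch below is a hypothetical illustration using the >2.5 cutoff from the text; the analyte units and example concentrations are assumptions made for demonstration.

```python
# Hypothetical illustration of the phe/tyr ratio flag with the >2.5 cutoff from the study.

def flag_pku_screen(phe_umol_l: float, tyr_umol_l: float, cutoff: float = 2.5) -> bool:
    """Return True when the phenylalanine/tyrosine molar ratio exceeds the cutoff."""
    if tyr_umol_l <= 0:
        raise ValueError("tyrosine concentration must be positive")
    return (phe_umol_l / tyr_umol_l) > cutoff

# Assumed example values: the same phenylalanine result is flagged only when tyrosine is
# low, which is how the ratio helps reduce false positives relative to phenylalanine alone.
print(flag_pku_screen(phe_umol_l=180.0, tyr_umol_l=120.0))  # False (ratio 1.5)
print(flag_pku_screen(phe_umol_l=180.0, tyr_umol_l=60.0))   # True  (ratio 3.0)
```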


URL: https://www.sciencedirect.com/science/article/pii/B9780123813732000739

Clinical Use of Bone Turnover Markers in Osteoporosis

Symeon Tournis, Konstatinos Makris, in Reference Module in Biomedical Sciences, 2018

Preanalytical Phase

The preanalytical phase is the phase over which the laboratory has no direct control. Preanalytical factors that can affect results include sample type, sampling time, sample handling, patient preparation, and the nutritional status of the patient. Several studies have investigated preanalytical factors that influence BTM measurement. The sources of preanalytical variability can be divided into technical and biological (Table 3). The numerous sources of biological variability can in turn be divided into two groups: (1) uncontrollable factors, such as age, gender, menopausal status, and disease state (recent fracture, immobility), all of which must be accounted for before interpreting levels of bone markers; and (2) controllable factors, such as circadian rhythm, menstrual cycle, food intake, and exercise, whose effects can be minimized by standardizing the timing and conditions under which samples are taken. These factors should be identified and considered before collecting a specimen in order to minimize variability.

Table 3. Sources of preanalytical variability

Technical sources
- Type of specimen (serum, plasma)
- Specimen handling and storage (effect of temperature and light)
- Mode of sample collection
- Between-laboratory variation

Biological sources
1. Uncontrollable
- Age (puberty, somatic growth, menopausal status, aging)
- Gender
- Ethnicity
- Recent fractures (up to 1 year before)
- Pregnancy/lactation
- Drugs
- Nonskeletal disease (thyroid, diabetes, renal disease, liver disease, degenerative joint disease, and systemic inflammatory disease)
- Immobility
2. Controllable
- Diet
- Menstrual cycle
- Exercise
- Season of the year
- Diurnal variation

Among the controllable sources of variation, circadian rhythm is of particular importance, especially for serum CTX. Peak CTX values usually occur early in the morning and can be twice as high as the nadir values, which usually occur in mid-afternoon. For follow-up of patients it is therefore critical that serial samples be collected consistently at the same time of day, preferably early in the morning; this allows better interpretation of results.

In most studies, exercise acutely increases markers of collagen formation and degradation by 15%–40%, and these increases may persist for up to 72 h. It is important to ask patients whether they exercise on a regular basis and to have them refrain from exercise for at least 24 h before any sample collection.

The effect of diet and food intake must be considered for certain BTMs; food consumption may lower CTX values. There is some controversy regarding seasonal variation of BTMs, with some studies suggesting significant seasonal variation in some BTMs and others suggesting that the overall seasonal variation is insignificant.

Few studies have examined the variation of BTMs during the menstrual cycle and in pregnancy. Bone turnover has been found to vary with the menstrual cycle: research suggests that osteoblastic activity is higher during the luteal phase and that bone resorption is increased during the follicular phase. Pregnancy also affects all BTMs, in part because of the increased calcium requirements of the fetus and in part because of maternal changes in GFR.

The technical sources of preanalytical variation include variation due to specimen collection, processing, and storage. BTMs can be measured in serum, plasma (EDTA), and urine. Urine is a biological fluid in which BTMs are present at higher concentrations than in serum or plasma, and it is a less complex matrix because many blood constituents are either not filtered or are reabsorbed by the tubules.

The nature of the sample (serum, or EDTA or heparin plasma) may have an impact on the results, since not all assays can be run on every medium due to obvious incompatibilities (e.g., calcium or alkaline phosphatase cannot be determined on EDTA plasma). On the other hand, some analytes have been shown to be more stable in EDTA plasma because complexation of calcium decreases the activity of proteolytic enzymes. EDTA plasma is therefore often recommended for the measurement of BTMs, but there is no hard evidence.

Stability of the analytes is another serious issue when testing is not performed on the same day the sample is collected, so appropriate control of sample collection and preparation is vital for successful BTM measurement. Several BTMs, especially OC and TRAP5b, are sensitive to thermodegradation, and levels can be significantly reduced after only a few hours at room temperature. TRAP5b activity is also reduced during storage; samples must be kept at −70°C or lower and multiple freeze–thaw cycles should be avoided. No significant decrease in CTX has been detected after storage at −20°C or lower for up to 3 years; nevertheless, it decreases rapidly in serum at both 4°C and 37°C. The molecular mechanism is unknown, but EDTA minimizes this decrease. CTX is reportedly stable in EDTA blood tubes before separation for up to 48 h; likewise, OC remains stable for up to 8 h at room temperature. Consequently, blood should be collected into EDTA tubes and separated as soon as possible, and if samples are not analyzed immediately they should be stored at −20°C or lower. Both PINP and bone ALP have been found to be stable in any sample type. Notably, current TRAP5b assays are not affected by haemolysis, but erythrocytes are known to contain proteases that degrade OC; grossly haemolyzed samples should in general always be avoided. The different requirements of each biomarker should be given special attention, especially when samples are collected prospectively for later measurement. Other, newer biomarkers (sclerostin, DKK1, OPG, RANKL) are either less well studied or not studied at all (Seibel, 2005; Delmas et al., 2000; Hlaing and Compston, 2014; Bernardi et al., 2004; Morris et al., 2017).
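To make the handling rules above concrete, here is a minimal, hypothetical sketch that encodes the room-temperature stability windows quoted in the text (CTX in unseparated EDTA tubes, OC) so a collection workflow could warn when a sample has waited too long. The data structure and function are illustrative assumptions, not an established laboratory rule set.

```python
# Illustrative sketch: room-temperature stability windows taken from the text above.
ROOM_TEMP_STABILITY_HOURS = {
    "CTX_EDTA_whole_blood": 48,  # CTX stable in EDTA tubes before separation, up to 48 h
    "OC": 8,                     # osteocalcin stable for up to 8 h at room temperature
}

def exceeds_stability_window(marker: str, hours_since_collection: float) -> bool:
    """Warn when a sample has been at room temperature longer than its quoted window."""
    limit = ROOM_TEMP_STABILITY_HOURS.get(marker)
    if limit is None:
        return True  # unknown marker: treat conservatively and process without delay
    return hours_since_collection > limit

print(exceeds_stability_window("OC", 10))                    # True: past the 8 h window
print(exceeds_stability_window("CTX_EDTA_whole_blood", 6))   # False
```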


URL: https://www.sciencedirect.com/science/article/pii/B9780128012383995382

Interferences of hemolysis, lipemia and high bilirubin on laboratory tests

Steven C. Kazmierczak, in Accurate Results in the Clinical Laboratory (Second Edition), 2019

Introduction

The preanalytical phase of the laboratory testing process includes all of the procedures prior to the start of sample analysis and typically involves a variety of healthcare professionals. A wide variety of preanalytical factors can adversely impact the integrity of specimens submitted for analysis, including improper or incorrect use of collection containers, improper tourniquet application, contamination from infusion routes, excessive delay from specimen collection to analysis, failure to store specimens at an appropriate temperature prior to analysis, failure to shield the specimen from direct light, and collection of the specimen at the wrong time of day or at an inappropriate time following the administration of certain medications [1]. Prevention of medical errors in healthcare has received a great deal of attention since the publication in 2000 of the Institute of Medicine report estimating that medical errors result in approximately 44,000 to 98,000 preventable deaths and 1,000,000 excess injuries each year in U.S. hospitals [2]. Accurate laboratory test results are therefore important for preventing medical errors, because the diagnosis of many diseases is based on laboratory test results. While medication errors are most often cited as the leading cause of medical errors, inappropriate treatment of patients due to incorrect test results caused by interfering substances has also been noted as a contributing factor [3–5]. One study found a significant decrease in the number of errors observed in the clinical laboratory between 1996 and 2006; however, the proportion of pre-analytical errors observed during this time span remained constant [6]. Interference in clinical assays is often underestimated and too often goes undetected in clinical laboratories [7]. Preanalytical errors due to endogenous interfering substances are perhaps one of the most common causes of error in laboratory testing.

In principle, interferences that affect the spectrophotometric measurement of a sample can be reduced by use of an adequately blanked analytical method; however, this is often not practical or easy to implement. In addition, endogenous interfering substances can cause not only spectral interference but also chemical interference in some assays. Hemolysis can also increase the concentrations of those analytes that are present in the erythrocytes themselves. While common endogenous interferents such as hemolysis, lipemia, and icterus are known to interfere with photometric assays, interference with turbidimetric methods and immunoassays has also been reported.

The prevalence of endogenous interfering substances in patient samples submitted for analysis can be significant, but the actual frequency at which interferences occur is difficult to estimate. One study that investigated the prevalence of endogenous interferences in outpatients found that 9.7% of samples submitted for analysis contained at least one endogenous interfering substance [8]. Of the samples considered to contain some type of endogenous interfering substance, 76% were lipemic, 16.5% were hemolyzed, and 5.5% were icteric. However, significant differences in the incidence of endogenous interferences have been noted with respect to where patients are located within a hospital setting.

Observations by both physicians and clinical laboratory staff suggest that the rate of hemolysis in samples collected in emergency departments significantly exceeds that in samples collected in other hospital locations. One study that evaluated a total of 4,021 samples found that hemolyzed samples were seen more frequently among samples collected in the emergency department than among samples obtained from the medical unit [9]: of the 2,992 specimens collected in the emergency department, 372 (12.4%) were hemolyzed, while of the 1,029 samples from the medical unit, 16 (1.6%) were hemolyzed. The use of trained phlebotomists to collect blood from patients in the medical unit, versus nurses not formally trained in phlebotomy practices collecting blood in the emergency department, was suggested to play a significant role in the differences seen in hemolysis rates. Another factor cited as a cause of higher hemolysis rates in the ED is the common use of intravenous catheters for blood collection in this setting [10].

The incidence of endogenous interfering substances in specimens submitted to the clinical laboratory depends on a number of factors, including the patient population being served (i.e., neonates, diabetics, the elderly, patients on total parenteral nutrition, inpatients vs. outpatients, etc.), the use of skilled phlebotomists versus minimally trained healthcare providers, and the elapsed time from collection of the sample to processing and analysis. Patient characteristics such as gender and age have also been cited as factors that affect the rates of hemolysis. One study found a significantly higher rate of hemolysis in samples collected from males (13.1%) than from females (10.1%); this trend toward higher rates of hemolysis in males was seen regardless of whether samples were collected in a primary healthcare setting, a nursing home, or the emergency department [11]. In addition to gender, age has been cited as a contributing factor: higher rates have been noted in patients older than 63 years compared with younger individuals [11]. However, samples collected from infants and neonates typically show higher levels of in vitro hemolysis.

Another important factor in the assessment of endogenous interferents is the mechanism used to identify the presence of interfering substances. Visual inspection of samples for identification of interfering substances is still used by some laboratories, although this practice is rapidly being supplanted by instruments capable of detecting and quantifying the amount of interference present. Manual visual detection of endogenous interferences suffers from a significant lack of agreement between individuals and also vastly underestimates the actual number of samples with levels of endogenous interfering substances high enough to cause assay interference. In addition, increased concentrations of bilirubin can result in visual underestimation of the amount of plasma hemoglobin present in hemolyzed samples; this situation is frequently seen in samples collected from newborns, who often have increased bilirubin concentrations and whose samples are often hemolyzed. Thus, the use of automated systems for detection of endogenous interfering substances is imperative if proper evaluation of these types of samples is to be accomplished. One study that utilized an algorithm for the detection and processing of clinically or analytically relevant amounts of hemolysis found that automated detection identified approximately 70-fold more relevant hemolysis than manual detection [9].

Despite the vast number of publications that have addressed the problems of endogenous interfering substances, there is still a significant lack of understanding concerning sources of endogenous interference. For example, artificial substitutes such as Intralipid, used to mimic lipemia, often do not behave the same as samples with native lipemia [12]; despite this, virtually all studies designed to assess the effect of lipemia use Intralipid to evaluate this interference. Other aspects of interference testing that are often ignored or overlooked include whether the interfering substance produces similar interference effects at different analyte concentrations. Evaluation of endogenous interfering substances should therefore be performed at several different analyte concentrations. For example, a 10% bias caused by 300 mg/dL of plasma hemoglobin at a glucose concentration of 100 mg/dL may be insignificant when the glucose concentration in the sample is 200 mg/dL.
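One way to read that example is that the interferent adds a roughly fixed absolute error, so the relative bias shrinks as the analyte concentration rises. The sketch below illustrates that reading with an assumed 10 mg/dL absolute bias (inferred from the 10%-at-100-mg/dL figure); it is an illustration, not a measured interference profile.

```python
# Hypothetical illustration: a constant absolute bias produces a smaller relative error
# at higher analyte concentrations. The 10 mg/dL figure is assumed for the example.

def relative_bias_percent(true_conc_mg_dl: float, absolute_bias_mg_dl: float) -> float:
    """Relative error (%) caused by a constant absolute bias."""
    return 100.0 * absolute_bias_mg_dl / true_conc_mg_dl

for glucose in (100.0, 200.0):
    print(f"glucose {glucose:.0f} mg/dL -> {relative_bias_percent(glucose, 10.0):.1f}% bias")
# glucose 100 mg/dL -> 10.0% bias
# glucose 200 mg/dL -> 5.0% bias
```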


URL: https://www.sciencedirect.com/science/article/pii/B9780128137765000054

Pre-analytics in biomedical metabolomics

Rainer Lehmann, in Metabolomics for Biomedical Research, 2020

1 Introduction

In the 1970s, the term pre-analytical phase was introduced in the biomedical field, and it is now part of the standard terminology in clinical chemistry. The pre-analytical phase starts even before any sample is collected; that is, the planning phase of each project is also part of the pre-analytical phase. Examples of possible error sources arising in the planning phase are vague instructions to patients; definition of impracticable, error-prone pre-analytical standard operating procedures (SOPs) that cannot be followed by nurses or physicians; selection of unsuitable collection tubes (e.g., tubes releasing interfering compounds); specification of incorrect centrifugation conditions for blood samples in the SOP; selection of cheap pipette tips that release plasticizers during sample preparation; and so on. A scheme of all the different pre-analytical steps is given in Fig. 1. All of these examples, which are discussed in detail in the following sections, can heavily affect the analyses and the metabolomes of interest and, therefore, the success and outcome of the entire metabolomics project.


Fig. 1. Schematic view of the pre-analytical phase in biomedical metabolomics projects, spanning from sample collection via processing to storage and metabolite extraction.

The major part takes place on the side of the biomedical cooperation partners (gray box); only the final pre-analytical steps are usually performed by experienced metabolomics experts (blue box).

Sample collection is often described as the simple part of a biomedical study, but in daily clinical routine diagnostics, inaccuracies in the pre-analytical phase account for up to 80% of laboratory testing errors [1–3]. Accidental as well as systematic errors in the pre-analytical phase can render samples useless for metabolomics analysis. The actual quality of the samples used in biomedical metabolomics research is an often underestimated pitfall. In the following sections, crucial aspects of the pre-analytical phase are discussed in detail.


URL: https://www.sciencedirect.com/science/article/pii/B9780128127841000037

Sample processing and specimen misidentification issues

Alison Woodworth, Amy L. Pyle-Eilola, in Accurate Results in the Clinical Laboratory (Second Edition), 2019

Conclusions

Because most errors occur in the pre-analytical phase of laboratory testing, it is important to have robust procedures in place in the laboratory to eliminate the various errors that may occur in this phase. Proper blood collection procedures, proper transport of specimens, and timely centrifugation ensure not only good specimen integrity but also accurate test results. Sample misidentification may cause serious errors in laboratory test results and may cause significant morbidity or even mortality, especially if a blood typing specimen is mislabeled. Good laboratory practice involves careful attention not only in the analytical phase but also in the pre- and post-analytical phases.


URL: https://www.sciencedirect.com/science/article/pii/B9780128137765000030

Effects of Pre-analytical Variables in Therapeutic Drug Monitoring

Valerie Bush, in Therapeutic Drug Monitoring, 2012

Introduction

Laboratory results are dependent on the quality of the specimen analyzed [1], and the pre-analytical phase accounts for the majority of laboratory errors [2]. It has been estimated that pre-analytical errors account for more than two-thirds of all laboratory errors, while errors in the analytical and post-analytical phases together account for the remaining one-third. Carraro and Plebani reported that among 51,746 clinical laboratory analyses performed over a 3-month period in their laboratory (7,615 laboratory orders, 17,514 blood collection tubes), clinicians questioned the validity of 393 test results, of which 160 were confirmed by the authors as being due to laboratory errors. Of the 160 confirmed laboratory errors, 61.9% were determined to be pre-analytical, 15% analytical, and 23.1% post-analytical. The pre-analytical phase thus showed the highest percentage of errors, the most frequent problems being mistakes in tube filling, with an incorrect blood-to-anticoagulant ratio for coagulation tests, and empty or inadequately filled tubes. Other common pre-analytical errors included use of the wrong type of blood collection tube, errors in the requested test procedure, wrong patient identification, contradictory demographic data from different information systems, missing tubes, samples diluted with intravenous infusion solution, and other problems. The authors also identified 24 errors in the analytical phase and 37 in the post-analytical phase. The majority of laboratory errors had no impact on patient care (121 of 160 errors), but 1 error (0.6%) caused an inappropriate intensive care admission, 2 errors (1.3%) caused inappropriate transfusion, 9 errors (5.6%) resulted in inappropriate investigation, and 27 errors (16.9%) required repeated laboratory tests [3]. By controlling and standardizing practices in the pre-analytical phase, test accuracy can be improved significantly, ultimately benefiting patient care and patient safety.
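For readability, the breakdown reported by Carraro and Plebani can be cross-checked against the stated counts; the short sketch below recomputes the percentages (the pre-analytical count of 99 is inferred from 160 − 24 − 37, since only the percentage is quoted in the text).

```python
# Quick arithmetic check of the error distribution quoted above.
confirmed_errors = 160
counts = {
    "pre-analytical": 99,   # inferred: 160 - 24 - 37, matching the quoted 61.9%
    "analytical": 24,       # quoted count, 15%
    "post-analytical": 37,  # quoted count, 23.1%
}

for phase, n in counts.items():
    print(f"{phase}: {n}/{confirmed_errors} = {100 * n / confirmed_errors:.1f}%")
# pre-analytical: 99/160 = 61.9%
# analytical: 24/160 = 15.0%
# post-analytical: 37/160 = 23.1%
```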

The pre-analytical phase consists of all steps from preparing the patient for specimen collection through processing of the specimen prior to the analytical step. Pre-analytical errors can occur in vivo or in vitro, and many pre-analytical factors can alter test results by producing changes that do not reflect the patient's true physiological or clinical condition. In vivo factors are more difficult for laboratory professionals to control, but some pre-analytical errors can be avoided by enforcing specimen collection and handling requirements. Certain patient populations (i.e., pediatric, geriatric, dialysis, and oncology patients) present additional challenges in obtaining a specimen for therapeutic drug monitoring or other clinical laboratory tests. Other pre-analytical patient factors include patient identification, time of dose versus time of collection, hemolysis/lipemia, drug interactions, and degree of protein binding. One of the most common pre-analytical variables associated with therapeutic drug monitoring occurs after the blood has been collected and is related to the drug's stability in blood collection tubes: in vitro drug stability depends on several factors, including the primary tube used, the fill volume of the tube, and the time and temperature of storage. Another important pre-analytical variable is the collection of hydrophobic antibiotics through intravenous lines. This chapter will focus on pre-analytical variables during collection and processing and their impact on therapeutic drug values, with a brief mention of some in vivo factors.


URL: https://www.sciencedirect.com/science/article/pii/B9780123854674000026

Laboratory automation

Mark A. Marzinke, in Contemporary Practice in Clinical Chemistry (Fourth Edition), 2020

Automation beyond the chemistry analyzer

In the current landscape, centralized laboratory automation has mainly involved the integrated connectivity of the pre- and postanalytical phases of the testing process to chemistry and immunochemistry analyzers. However, integrated automation for hematology, coagulation, urinalysis, and molecular and microbiological testing is also available. For example, Sysmex Corporation and Beckman Coulter offer track-based automation for hematology and urinalysis testing, respectively. Several centralized laboratory platforms now have the capability of automated specimen handling and transport to a variety of downstream instruments. As previously discussed, open access instrumentation can facilitate more integrated laboratory automation via track extension to nonchemistry and immunochemistry analyzers manufactured by other vendors.

Many of the benefits of centralized laboratory automation expand beyond the core laboratory setting. Automation has permeated labor-intensive and manually driven laboratory settings, including bacteriology and clinical microbiology. Automated tasks in clinical bacteriology include appropriate Petri dish selection, sample inoculation, spreading of inoculum on appropriate plates, as well as the labeling and sorting of media and plates. The automation of many of these tasks also provides a way to standardize assays across laboratories; for example, variability in the streaking of liquid or nonliquid specimens on plates is reduced by spreading inoculum via magnetic beads, an approach used by the automated specimen processing analyzer offered by BD Kiestra. Other vendors offering such automated solutions include bioMérieux Inc., Becton-Dickinson Diagnostics, and Copan Diagnostics. In addition, over the last decade, TLA systems targeting the clinical microbiology environment have been brought to market. Beyond exploiting automated barcode reading and a track-driven platform, incubators and digital equipment can be interfaced with the aforementioned microbiology and bacteriology specimen processing modules. BioMérieux Inc., BD Kiestra, and Copan Diagnostics all offer centralized automated clinical microbiology platforms. Further, a recent Q&A published in Clinical Chemistry reinforced the growing need and rationale for automation in clinical microbiology settings [10].

Fully integrated laboratory automation has also been met with success in clinical molecular laboratories, complementing work conducted in clinical microbiology. Various automation platforms offer barcode recognition of specimens, as well as automated nucleic acid extraction and polymerase chain reaction (PCR) amplification of DNA regions of interest. Laboratory automation focused on preanalytical DNA extraction or on PCR amplification and detection techniques is available through a number of vendors, including Abbott Laboratories, BD Diagnostics, bioMérieux Inc., Qiagen, Roche Diagnostics, and Thermo Fisher Scientific. However, much like automated chemistry and immunochemistry analyzers and TLA systems, specimen handling capabilities, tube-type flexibility, barcode detection, and laboratory automation software components are vendor-specific and should be considered when selecting an integrated platform.

Although many laboratories continue to look toward integrated automation solutions to reduce errors, improve efficiencies, and combat staff shortages, there are still areas that require automated preanalytical and centralized solutions. While mass spectrometry has successfully transitioned from a basic and translational tool to a clinical platform used for identification and/or quantification of small molecules and, more recently, proteins, specimen handling and management are still largely manual. Given the success of immunosuppressant and pain management testing by liquid chromatographic-mass spectrometric (LC-MS) techniques, centralized automated solutions may further integrate these methodologies into the clinical testing environment [11]. While vendors are pursuing automation-driven solutions with regard to mass spectrometry, including the automated review of mass spectra and chromatograms and interfacing of these data with the LIS, fully integrated automation for LC-MS platforms is still in its infancy.


URL: https://www.sciencedirect.com/science/article/pii/B9780128154991000144

Variables affecting endocrine tests results, errors prevention and mitigation

Hossein Sadrzadeh PhD, ... Gregory Kline MD, in Endocrine Biomarkers, 2017

1.6.1.1 Patient physiology

It is well established that most laboratory errors (48%–68%) are related to the preanalytical phase of testing [8]. Patient preparation is an additional preanalytical consideration that is highly susceptible to the introduction of errors. Fortunately, patient adherence to preparation instructions and the implementation of specific reference ranges can reduce the impact of some physiological variables on test results. The laboratory, however, has limited ability to prevent the effects of physiological variables such as biological rhythms and the nutritional status of the patient. As mentioned earlier, it is important that both medical and laboratory staff be familiar with factors that can impact patient preparation and specimen collection.


URL: https://www.sciencedirect.com/science/article/pii/B978012803412500001X

Billing and Reimbursement for Molecular Diagnostics

Michael S. Watson, Michele Schoonmaker, in Molecular Diagnostics, 2010

Billable Components of Clinical Testing

Like all clinical testing, molecular genetic testing involves several steps: pre-analytical, analytical, and post-analytical. The pre-analytical phase involves specimen collection, acquisition by the laboratory, labeling and coding, and preparation for analysis. The analytical phase involves the actual running of the test, while the post-analytical phase includes recording the results, interpreting the results, reporting the results to the ordering physician, and filing the report. Different assays have different billable components. In some instances, if specimen collection is via an invasive procedure, it may be billable separately from the assay itself. When a laboratory procedure is reported or coded as a comprehensive procedure, such as those in molecular microbiology, all of the testing elements are included in the price established for the CPT code. In contrast, molecular diagnostic laboratory procedures have a variety of separately billable CPT codes describing various billable aspects of testing. For the purposes of payment, some comprehensive CPT codes are divided into two components: a technical (analytical) component and a professional (interpretive) component. While for many testing systems the analytical complexity is decreasing as they become more automated, the clinical complexity is increasing because the laboratory professional must interpret multiple results from a single test. For many CPT codes, particularly in molecular pathology, these nuances are not captured in the payment system.

However, there are situations in which a billable service may be reimbursed only to a specific provider group. As CPT codes for genetic testing evolved over the past two decades, both the technical and professional components were reimbursed to the laboratories providing the service. In recent years, there has been a shift away from reimbursing laboratories for the professional component of test result interpretation: the laboratory is reimbursed only for the technical component, while reimbursement for the professional component is made only to physicians, regardless of their specific training in the laboratory field in question. This has caused problems in many areas of genetic testing that involve significant numbers of board-certified Ph.D. laboratory directors. In particular, only a small proportion of board-certified cytogenetics laboratory directors in the United States are physicians, and many of those physicians are not involved in heritable disease cytogenetic testing.


URL: https://www.sciencedirect.com/science/article/pii/B9780123694287000082

Preanalytical Considerations

Timothy P. Rohrig, in Postmortem Toxicology, 2019

Abstract

Toxicological analyses play a critical role in the determination of the cause and manner of death. However, as critical as the toxicological testing phase is, the preanalytical phase, that is, the specimen collection method and/or storage, can have a significant impact on the toxicological results, adding further complexity to the interpretation. Drug concentrations may change as a result of the specimen collection process, the container the specimen is held in, and the storage conditions of the specimen(s). The storage container may add, form, or remove target analytes to or from the specimen. The storage conditions can influence not only the stability of the biological matrix itself but also the stability of the drug in the specimen. Changes in drug concentrations and/or target analytes found in the specimen could lead to misinterpretation of the toxicological test results and potentially change the determination of the cause and manner of death.


URL: https://www.sciencedirect.com/science/article/pii/B9780128151631000022

Which of the following is not subject to quality control (QC) procedures by a phlebotomist?

Chapter 2 - Quality Assurance and Legal Issues.

What is an example of a QC measure in phlebotomy?

An example of a QC measure in phlebotomy is checking the expiration dates of evacuated tubes.

What phase of testing is the phlebotomist most associated with?

The preanalytical phase is everything that occurs prior to the actual performance of the test. The phlebotomist is critical in ensuring quality sample collection for blood specimens.

What is the most serious error in phlebotomy?

The most serious error is failure to properly identify the patient. Even if everything else is done perfectly, the final result will not apply to the patient who was incorrectly presumed to be the source of the specimen.