The hemostatic response that follows damage to a blood vessel is complex and highly coordinated: bleeding must first be stopped, and the blood must then clot. In the clinical world, the study of clotting factors, of bleeding and excessive bleeding, and the monitoring of these effects is a technologically advancing field.
The overall coagulation reaction, or any component involved in the reaction, may be studied in an anticoagulated plasma sample by observing the time required for a visible fibrin strand, or coagulum, to form after adding calcium chloride to the sample. Although functional techniques provide accurate results in the hands of a skilled operator, increased demand for testing led to the development of electromechanical, turbidimetric (nephelometric), and immunologic methods for use on instruments.
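The turbidimetric principle described above can be sketched in a few lines: after calcium chloride is added, the instrument samples optical density at fixed intervals and records the elapsed time at which the signal first rises a set amount above baseline. The function name, sampling interval, and threshold below are illustrative assumptions, not any vendor's implementation.

```python
def clot_time(readings, interval_s=0.1, threshold=0.2):
    """Estimate clotting time from a turbidimetric (optical-density) trace.

    readings: optical-density samples taken every `interval_s` seconds
    after calcium chloride is added to the plasma sample.
    threshold: rise above baseline OD taken to indicate fibrin formation.
    Returns elapsed time in seconds, or None if no clot is detected.
    """
    if not readings:
        return None
    baseline = readings[0]
    for i, od in enumerate(readings):
        if od - baseline >= threshold:
            return i * interval_s
    return None

# Simulated trace: OD stays flat, then rises steadily as fibrin strands form.
trace = [0.10] * 50 + [0.10 + 0.01 * k for k in range(1, 40)]
print(clot_time(trace))
```

Electromechanical and immunologic methods detect the endpoint differently, but the same idea applies: the analyzer converts a continuous signal into a single clotting time.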
Hospital coagulation labs and reference labs play a key role in patient care. Speed to result is critically important for coagulation tests used in clinical decision-making and drug monitoring. Advances in automated coagulation testing have produced high-throughput, accurate, and precise analyzers with minimal human error in measurement.
Coagulation monitoring, especially perioperatively, is important for diagnosing potential causes of hemorrhage, guiding hemostatic therapies, and predicting the risk of bleeding during subsequent surgical procedures. Some diseases reduce clot formation, while conversely, prevention of excessive blood coagulation may be required post-surgery for wounds to heal.
Early coagulation analyzers were operated mechanically, using a hook to detect a clot in the cuvette. This approach has since been replaced by simultaneous detection of clotting factors via clotting, colorimetric, and immunologic principles. Current technologies use automated platelet-function analyzers, flow cytometers, PCR, and microarrays. Combining these technologies reduces pre-analytical and post-analytical handling, improving accuracy and productivity.
Since 1994, point-of-care (POC) PT/INR systems have enabled healthcare professionals to provide closer patient monitoring and care management, especially for warfarin therapy. As most coagulation analyzers now deliver accuracy, ease of use, and timely results, focus has shifted to better bioinformatics and integrated software across multiple platforms and analyzers. Informatics is an area of growing interest for clinical laboratories; IT solutions will help them leverage data management, improve process flow in the lab, and monitor instrument events such as error flags and service events, yielding enhanced diagnostic value for clinicians.
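For PT/INR monitoring, the INR itself is a defined quantity: the patient's prothrombin time divided by the mean normal prothrombin time, raised to the power of the reagent's International Sensitivity Index (ISI). A minimal sketch, with an illustrative function name and example values:

```python
def inr(pt_patient_s, mnpt_s, isi):
    """International Normalized Ratio, as used in warfarin monitoring.

    pt_patient_s: patient's prothrombin time in seconds
    mnpt_s: mean normal prothrombin time for this reagent/analyzer pair
    isi: International Sensitivity Index of the thromboplastin reagent
    """
    return (pt_patient_s / mnpt_s) ** isi

# A patient PT of 24 s against a 12 s mean normal PT with ISI 1.0
# gives INR 2.0, at the lower end of the usual 2.0-3.0 warfarin target.
print(inr(24.0, 12.0, 1.0))  # → 2.0
```

The ISI exponent is what lets a POC device and a central-lab analyzer report comparable values despite using different thromboplastin reagents.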
In the recent past, the number of ordered coagulation screening tests has increased tremendously. This demand, in turn, has driven the development of high-throughput, accurate, and precise automated analyzers with minimal human error in measurement.
Coagulation analyzers have come a long way, from measuring the optical density of a clot in a cuvette to improved and ever more precise software that allows hospital runs to be standardized, easy to use, and quality controlled. Monitoring of perioperative patients with coagulation analyzers is now significantly faster, easier, and more accurate.
While coagulation monitoring tests and POC analyzers bring results to the clinician quickly and reliably, several concerns have been raised regarding sample processing time and blood collection site, and even how a patient's age and sex may alter the test result. Therefore, to interpret results accurately during a run, a standardized test with controls needs to be run on the same assay and analyzer.
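Running controls alongside patient samples can be sketched as a simple range check; one common laboratory convention (the Westgard 1-2s warning rule) flags a control result more than two standard deviations from its assigned mean. The function name and example values below are illustrative assumptions:

```python
def control_in_range(value, mean, sd, n_sd=2.0):
    """Check a quality-control result against mean +/- n_sd * SD.

    A control falling outside two standard deviations of its assigned
    mean is commonly treated as a warning that the run may need review
    before patient results on the same assay and analyzer are reported.
    """
    return abs(value - mean) <= n_sd * sd

# Control material with an assigned PT mean of 13.0 s and SD of 0.4 s:
print(control_in_range(13.5, 13.0, 0.4))  # within 2 SD → True
print(control_in_range(14.2, 13.0, 0.4))  # 3 SD high  → False
```

Real QC schemes layer several such rules across multiple control levels, but the underlying comparison is the same.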
The field is now moving toward integrated analysis software, more robust devices, wider measuring ranges, and increased precision of results with reduced analysis time.