The current conception of the clinical laboratory has been around for over a century now. Throughout that time, many practitioners have wished that they could “just do the job” and not worry about outside influences. They have also assumed that those who came before them had it easier. But that approach fails the test of reality.
Demands for access to universal healthcare, and worries about how to pay for it, were common during the unionization activities prior to the country's entry into World War I. Concerns about the educational level of practitioners were already being voiced in the 1920s. Worries about whether this profession would be recognized as a profession led to the formation of ASCLS in the mid-1930s. Quality control first made its appearance in the clinical laboratory at about the same time. World War II dramatically changed the clinical laboratory as advances stimulated by the war effort reached community hospitals, along with increased demand for the identification of microorganisms that were now treatable with antimicrobials. More invasive surgery, especially organ transplantation, put additional strain on transfusion services, whose specialists were also coping with newly discovered and troublesome red cell antigen families.
The sixties saw an explosion of laboratory tests and cutting-edge automation. For the first time, clinical laboratory data became critical to the care of all patients. Medicare and Medicaid brought federal interest to the laboratory. After such growth, the seventies saw a time of restriction as insurers and the government grappled with the finances of healthcare. While there was some…
© 2006 American Society for Clinical Laboratory Science Inc. All rights reserved.