Miscellaneous Procedure-Related Infections




Sharon F. Welbel

Robert A. Weinstein



Technical, computer, and radiologic advances over the past 10 to 20 years have facilitated dramatic growth in the fields of diagnostic and therapeutic procedures. With miles of vascular network, dozens of extravascular spaces, and several organ systems, any patient is a candidate for a staggering array of such interventions. Although many of these procedures can provide information that is essential for sophisticated patient care and supplant more traumatic interventions, most procedures also bypass natural host defenses and place patients at increased risk of healthcare-associated infection (HAI) (1,2). It is not surprising, then, that the introduction of any new procedure often is followed closely by case reports of procedure-associated infections. Occasionally, epidemiologic experiments of nature in the form of HAI outbreaks provide more detailed information on specific procedure-related risks, which eventually could be subjected to prospective study. This chapter discusses a variety of procedure-associated infections that have been highlighted by retrospective or prospective investigations and that have not been discussed elsewhere in this book.

Because of the seemingly eclectic contents of this chapter, it is important to recognize from the outset that the procedures to be discussed have certain themes in common. First, all of the procedures are vulnerable to these possibilities: inexperienced operators; breaks in aseptic technique; contaminated, inadequately disinfected, and/or technically difficult-to-clean equipment; and ineffective antiseptics. Second, many procedure-related infection problems unfortunately reemerge as new generations of healthcare workers (HCWs) rediscover these vulnerabilities and/or surface in developing countries where limited infection control resources heighten risks, hence the importance of reviewing hazards that at first glance could appear remote (3). Third, various procedures involving many different sites share the bloodstream as a common path of infection. However, the risk of infection differs depending on whether bloodstream contamination is transient or persistent as well as on host- and organism-specific factors. Fourth, risks of biofilm formation on surfaces of many different devices that enter normally sterile body sites and the value of anti-infective coatings (e.g., antimicrobials, antiseptics, and heavy or precious metals) for such devices are being studied actively. Finally, for many procedures the specific risks have not been defined sufficiently to determine whether certain preventive measures, such as the use of prophylactic antimicrobial therapy, are warranted.


INFECTIONS FROM PROCEDURES INVOLVING THE VASCULAR SYSTEM


NEEDLELESS AND SAFETY DEVICES


Healthcare Worker Risk

HCWs have always been at risk of needlestick injuries, but quantifying the actual number of percutaneous needlestick injuries (NSIs) sustained by U.S. HCWs is difficult. Panlilio et al. combined data for 1997 and 1998 from the National Surveillance System for Healthcare Workers (NaSH) and the Exposure Prevention Information Network (EPINet). EPINet was developed in 1991 to provide standardized methods for recording and tracking percutaneous injuries and blood and body fluid contacts. The EPINet system consists of a needlestick and sharp object injury report, a blood and body fluid exposure report, and software for entering and analyzing the data. Since its introduction in 1992, EPINet has been acquired by >1,500 hospitals in the United States and has also been adopted in other countries, including Canada, Italy, Spain, Japan, and the United Kingdom.

The Needlestick Safety and Prevention Act (NSPA) of 2000 and the 2001 revised Bloodborne Pathogens Standard require healthcare facilities to maintain a sharps injury log. The log must include, at a minimum, the type and brand of device involved in the exposure incident, the department where the exposure occurred, and an explanation of how the injury occurred. After adjusting the combined surveillance data for underreporting, Panlilio et al. estimated that U.S. HCWs sustain 384,325 percutaneous NSIs annually (4). This is in contrast to the 1,728 percutaneous NSIs reported in 2003 by 48 U.S. healthcare facilities to the EPINet surveillance program (5). The overall annual percutaneous NSI rate for all network hospitals (29 facilities) in 2007 was 27.97 per 100 occupied beds (33.49 injuries per 100 occupied beds for teaching hospitals; 16.16 for nonteaching facilities). A safety-designed device was used only 37.4% of the time and was fully activated only 11.9% of the time (6). Clearly, the risk of transmission of bloodborne pathogens, such as HIV, hepatitis B virus (HBV), and hepatitis C virus (HCV), still exists. Although much work has been done to decrease NSIs, we must continue to develop and implement devices that will be used consistently and that are not prohibitively costly.
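The surveillance figures above are normalized to occupied beds. The following is a minimal sketch of that denominator; the counts are hypothetical and chosen only to illustrate the calculation, not to reproduce EPINet data.

```python
# Illustrative sketch only; hypothetical counts, not EPINet surveillance data.
def nsi_rate_per_100_beds(injuries: int, avg_occupied_beds: float) -> float:
    """Annual percutaneous NSI rate expressed per 100 occupied beds."""
    return injuries / avg_occupied_beds * 100

# Example: a teaching hospital averaging 320 occupied beds that records
# 107 percutaneous injuries in a year has a rate of ~33.4 per 100 occupied
# beds, similar in magnitude to the 2007 teaching-hospital figure cited above.
print(round(nsi_rate_per_100_beds(107, 320), 2))  # 33.44
```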


The Bloodborne Pathogens Standard promulgated by the Occupational Safety and Health Administration (OSHA) (7) in 2001, in addition to mandating reporting, requires that engineering controls and work practices eliminate or minimize HCW exposure to blood. One means of accomplishing this goal is to use needleless systems. Phillips et al. set out to determine whether the NSPA itself had an effect on the rate of percutaneous injuries among hospital employees. By analyzing data from a multihospital sharps injury database, the investigators found evidence that the NSPA did contribute to the decline in percutaneous injuries among U.S. hospital workers (8). Conversely, Jagger et al. found that, despite the NSPA, surgical injuries continued to increase during the same period that nonsurgical injuries decreased, suggesting that attention to safer surgical devices and practices is needed. The EPINet data, in which suture needles account for a large share of sharps injuries, support this view (9).

A retrospective review of 3,297 percutaneous injuries from hollow-bore safety-engineered devices occurring between 2001 and 2009 was conducted, using the EPINet needlestick surveillance data. Nurses sustained 64.6% of all injuries from safety-engineered sharps devices (SESD); 42.9% of SESD injuries occurred after device use and may be preventable through consistent use of safety-engineered technology. Excluding injuries that occurred during device use or between procedural steps, 71.8% (28/39) of physician injuries, 58.2% (645/1,109) of injuries to nurses, and 45.8% (88/192) of injuries to phlebotomists occurred when an available SESD was not fully activated. Passive devices that do not require action on the part of the user to engage the safety feature currently represent a small portion of the SESD market. The use of innovative passive SESDs coupled with continual education of end users is necessary for an effective sharps injury prevention program (10).

Still, since the introduction of needleless systems, decreased rates of occupational needlestick exposures have been documented (11,12,13,14,15,16,17). Protected-needle intravenous (IV) systems also have decreased NSIs due to IV connectors by 62% to 88% (18,19). Again, unfortunately, the devices are not routinely activated, which appears to be related to inadequate training (13).


Patient Risk

The association between needleless devices and patient infection seems to relate to HCWs’ lack of familiarity with the device and/or its mechanics. Some have investigated the mechanics of needleless devices to determine whether new technology added to the device could reduce infection risk. Menyhay and Maki reported on a simulation study that compared the efficacy of conventional alcohol disinfection of the membranous septum of needleless Luer-activated valved connectors before access with the use of a novel antiseptic-barrier cap that, when threaded onto the connector, places a chlorhexidine-impregnated sponge in continuous contact with the membranous surface (20). After removal of the cap, there is no need to disinfect the surface with alcohol before accessing it. After contaminating, disinfecting, and then culturing the connectors, the authors demonstrated that if the membranous septum of a needleless Luer-activated connector is heavily contaminated, conventional (5- to 7-second) disinfection with 70% alcohol did not reliably prevent entry of microorganisms. In contrast, the antiseptic-barrier cap provided a high level of protection. Another study considered a recently developed needleless closed Luer access device (CLAD) (Q-Syte, Becton Dickinson, Sandy, Utah). The devices were contaminated with bacteria and then disinfected with 70% isopropyl alcohol followed by flushing with 0.9% saline. Although the devices had been accessed up to 70 times, no microorganisms were found even when challenge microorganisms were detected on the syringe tip after activation and on the compression seals before decontamination, suggesting that the Q-Syte CLAD can be activated up to 70 times with no increased risk of microbial contamination within the fluid pathway (21). An antiseptic-barrier cap of this kind, which keeps a chlorhexidine-impregnated (or alcohol-impregnated) sponge in contact with the Luer-activated valved connector, could be particularly helpful because disinfection does not depend on an action by an HCW.

Needleless systems are now almost universally used; the benefit to the HCW and the risk to the patient have been demonstrated. Education is a key intervention to prevent patient infections with devices new to an institution. Novel interventions will need to be studied further to assess their benefit and cost. A study to determine the risk factors for bloodstream infections (BSIs) in patients receiving home intravenous infusion therapy (22) revealed that receipt of total parenteral nutrition and intralipid therapy through a needleless system was a BSI risk factor. A survey of injection caps demonstrated that positive cultures were significantly more common from needleless devices than from protected-needle devices. It was concluded that when injection caps are manipulated, nutrient-rich solutions can remain in the caps of the needleless devices and become contaminated. Another study that assessed needleless systems used with Hickman catheters suggested that such systems can be associated with increased rates of catheter-related (CR) BSI (23). The investigators cultured luminal fluid from Hickman catheters of hematology patients and found that these catheters with the needleless system were twice as likely to show luminal contamination compared with catheters without the system. Four BSIs in patients with the needleless device had peripheral blood and luminal fluid cultures that yielded concordant bacterial strains based on the results of pulsed-field gel electrophoresis (PFGE) and restriction fragment length polymorphism studies. Do et al. described an increased BSI rate with the use of a needleless intravenous system in a home-care setting when caps were changed every 7 days and a subsequent decrease in BSIs when the needleless device end cap was changed every 2 days, suggesting that the mechanism for BSI could involve contamination from the end cap (24). Kellerman et al. reported an 80% increase in BSIs related to central venous catheters (CVC) in pediatric hematology-oncology patients receiving home healthcare after introduction of a needleless device for CVC access (25). At another institution, a significant increase in the BSI rate in a surgical intensive care unit (ICU) and an organ transplant unit was associated with the introduction of a needleless intravenous system. This was attributed to nurses’ lack of familiarity with the device and deviation from the manufacturers’ recommended practices (26). Finally, another study that investigated the risk factors associated with an increased rate of BSIs in a pediatric ICU found that exposure to the first-generation IVAC needleless device (IVAC, San Diego, California) was an independent BSI risk factor. The BSI rate returned to baseline after institution of a policy to replace the entire IVAC device, valve, and end cap every 24 hours (27).

Although the needleless systems have evolved, risks for the patient still exist. Before the 1980s, access to an intravenous system was accomplished by inserting a hollow-bore needle into a latex cap placed on the administration set. The first safety systems were needle devices with shielding or retracting mechanisms; these were followed by split septum needleless connectors, in which a blunt cannula was used to access the connector instead of a needle. The first rendition of this device created negative pressure when the blunt cannula was removed, making it more likely for the catheter to occlude. The next generation of such connectors incorporated an antireflux, Luer-activated valve, which helped neutralize any negative pressure when the syringe was disconnected. Eventually, mechanical valve needleless connectors that generate negative, positive, or neutral pressure during disconnection, using Luer lock connectors, became available. The third generation of connectors combined the existing Luer-activated valve concept with a displacement action, which expels a small amount of the solution used to flush the catheter when the flush syringe is disconnected from the Luer. The displacement is a passive feature; once it has occurred, the remaining solution is retained within the catheter and no further positive pressure exists. The positive-pressure Luer connector is designed to reduce retrograde flow into the catheter more effectively than can split septum or standard Luer connectors, without the use of heparin flushes, by preventing backflow and reducing the amount of blood, and hence clot, in the catheter.

One institution, however, found a 60% increase in the rate of CR-BSIs in its ICUs associated with the introduction of a positive-pressure mechanical valve (MV) port and reverted to its previously used mechanical valve (28). Others also found an increased rate of BSIs associated with positive-pressure mechanical valve needleless connectors, even after educational sessions regarding proper use of the MV (29,30,31,32). Jarvis et al. compared healthcare-associated CVC-BSI (HA-BSI) rates at five hospitals that introduced MV needleless connectors (MV-NCs) after using split septum (SS) connectors. All hospitals performed HA-BSI surveillance using Centers for Disease Control and Prevention (CDC) definitions during use of both types of connectors. The authors found that the HA-BSI rate increased significantly in all ICUs and wards when SS connectors were replaced by MV-NCs and decreased significantly when SS connectors were reintroduced (33).

These outbreaks led to the Society for Hospital Epidemiology of America (SHEA)-Infectious Disease Society of America (IDSA) recommendation against routine use of positive-pressure MV-NCs (34). In August 2010, the U.S. Food and Drug Administration (FDA) sent out a medical device safety alert and a letter to infection control practitioners summarizing the concerns regarding the safety of positive-displacement needleless connectors and required nine companies to conduct postmarket surveillance studies on positive-displacement needleless connectors to assess whether they are associated with higher rates of HA-BSIs. Of note, FDA has received three reports of death associated with BSI and positive-displacement needleless connectors (35).

Of course, understanding how to efficiently and conveniently disinfect a valve is important. Rupp et al. found that a 5-second scrub with a 70% isopropyl alcohol pledget adequately disinfected SS intravascular catheter connector valves under clinical and laboratory conditions (36).

Finally, the tourniquet could serve as a vehicle for cross-contamination with pathogens such as methicillin-resistant Staphylococcus aureus (MRSA). Leitch et al. examined the MRSA contamination rate of phlebotomy tourniquets and found that the tourniquets were contaminated with MRSA 25% of the time; they believed that the contamination occurred via the phlebotomists’ hands, not the patients’ skin (37). Practice varies widely, from single-patient use to disposal at the discretion of the phlebotomist. Given that multidrug-resistant organisms, such as community-acquired MRSA, now are ubiquitous, practices such as discarding tourniquets or disinfecting them with an alcohol wipe between uses, to prevent tourniquets from harboring pathogens, must be considered and evaluated.


LEECHES

Despite the popular appeal of highly sharpened, disposable phlebotomy needles for diagnostic bloodletting, leeches have, in fact, resurfaced as a specialized part of the reconstructive and microvascular surgeons’ armamentarium (e.g., for salvage of congested flaps) (38). However, as with many other advances discussed in this chapter, leeches have an infectious risk (39). Aeromonas hydrophila, normal gut flora of the leech, has caused wound infections in 2.4% to 20.0% of microsurgical procedures using leeches (40,41,42). Aeromonas sp. meningitis also has been associated with leech therapy (43). In an attempt to decrease infectious complications of leeches, one group tried unsuccessfully to sterilize the gut of leeches (44). Some investigators believe that aquariums filled with tap water to house leeches could contribute to the Aeromonas sp. problem (45). Infection with Serratia marcescens also has been linked to leech therapy (46). Understanding the nature of the leech’s contamination (gut flora and surrounding environment) could help direct control efforts and prophylactic antimicrobial therapy.


CARDIAC CATHETERIZATION

Serious local and systemic infections can result from cardiac catheterization procedures, particularly when contaminated instruments or ineffective antiseptics (e.g., dilute aqueous benzalkonium chloride) are used inadvertently or when breaks in technique occur in the cardiac catheterization laboratory. The major pathogens are staphylococci and gram-negative bacteria.

Up to 50% of patients undergoing cardiac catheterization could experience an increase in temperature of >1°C (1.8°F) within 24 hours after catheterization. Their fever, however, has been attributed to the use of angiocardiographic contrast material rather than to infection. In fact, bacterial endocarditis has been reported very rarely in large studies evaluating the complications of cardiac catheterization, and individual examples could have been due to initially undetected concurrent infection. Reported rates of BSI after procedures in the cardiac catheterization laboratory range from 0.11% to 18%. In a study of >22,000 patients undergoing invasive, nonsurgical coronary procedures from 1991 to 1998, BSIs occurred in 0.11% at a median of 1.7 days after the procedure; in >4,000 patients undergoing coronary intervention, bacterial infections occurred in 0.64% and septic complications in 0.24% (47,48). However, in 147 consecutive patients undergoing complex percutaneous coronary interventions (PCI), positive blood cultures were found in 18% immediately after the procedure and in 12% at 12 hours after the procedure (49).

Some studies reporting transient BSIs obtained blood cultures from the intravascular catheter or from the vessel from
which the catheter had been removed. It is possible that some of the isolates represented contamination of the external part of the catheter or the site of insertion and that the incidence of BSI was actually less frequent. A study designed to assess this possibility obtained blood for culture by using standard techniques from a vein distant from the site of catheter manipulation (50). Venous blood cultures of 106 patients, most of whom had valvular heart disease, were obtained in this manner during cardiac catheterization, and all were sterile. Of the 38 samples drawn through the catheter that was placed in the heart or aorta during the procedure, three grew diphtheroids or microaerophilic streptococci. The researchers concluded that the contamination of the hub end of the catheter with normal skin flora led to an overestimation of the BSI incidence. Removal of organisms by lung filtration also could have accounted, in part, for the failure to isolate organisms from distal sites. In either instance, it is clear that some contamination of the catheterization field had occurred.

Coronary stent placement is routinely practiced yet has been linked to few reports of coronary stent infections. When such infections do occur, the mortality and morbidity are high (Table 46.1). Once a stent has been placed, it is not easily removable; therefore, defining the risk factors for stent infection is paramount (51,52,53,54,55).

Retrospective and prospective studies have identified various risk factors for PCI-associated BSI. These factors include difficult vascular access, multiple skin punctures, repeated catheterizations at the same vascular access site, extended procedure duration, use of multiple percutaneous transluminal coronary angioplasty (PTCA) balloons, deferred removal of the arterial sheath, presence of congestive heart failure, and patient’s age >60 years (47,48,56). To decrease HAI rates in the cardiac catheterization laboratory, we should focus on nonpatient factors such as timely removal of the arterial sheath after percutaneous transluminal angioplasty. In addition, catheterization-associated infection should be infrequent with rigorous application of strict aseptic technique and adoption of the working principle that cardiac catheterization is a surgical procedure. The Laboratory Performance Standards Committee of the Society for Cardiovascular Angiography and Interventions (SCAI) has published guidelines that address the increased use of the catheterization laboratory as an interventional suite with device implantation. The guidelines are divided into sections on the patient, laboratory personnel, and the laboratory environment (57). A more recent standard discusses issues related to infection control and sterile technique while performing procedures in the cardiac catheterization laboratory (58).


INDWELLING ARTERIAL CATHETERS

Indwelling arterial catheters are used regularly in patients who require pressure monitoring or repeated blood gas determination. Although they provide information that is essential for patient care and eliminate the need for potentially traumatic repeated arterial punctures, such catheters also provide a continuing portal of entry for microbial invasion of the bloodstream. The reported incidence of arterial catheter colonization and infection varies depending on the catheter-tip culture technique used. Colonization incidence reports vary from 27% (49 episodes per 1,000 catheter-days) to 4% (11.7 episodes per 1,000 catheter-days) (59,60). Maki et al. found lower rates of arterial catheter-associated BSI in an analysis of the absolute and relative risks of BSI associated with the various types of intravascular devices, based on a review of 200 published studies of adults. Every device in the study population was evaluated prospectively for evidence of associated infection, and microbiologically based criteria were used to define device-related BSI. Arterial catheters used for hemodynamic monitoring were found to have an incidence rate of 0.8% (1.7 per 1,000 catheter-days) (61). The source organisms have not always been evaluated, and no direct relation with patient disease has been established, but the incidence of colonization of radial catheters (in contrast to umbilical catheters) did appear to be related to longer durations of catheterization (>4 days) (62). Inflammation at the catheter site and the use of a cutdown procedure to place the catheter also appear to be associated with an increased infection risk (63,64,65,66). A prospective study of 95 patients (130 catheters) in a medical-surgical ICU found a 4% risk of arterial cannula-related septicemia; 12% of all sepsis in this unit was the result of intra-arterial catheters (67). These BSIs were caused by gram-negative bacilli, enterococci, or Candida. Many surgical ICUs have policies for routine replacement of intravascular catheters; however, there are few data to support this practice. Pirracchio et al. compared arterial catheter colonization rates and arterial CR-BSI rates in all ICU patients requiring an arterial catheter from 1997 to 2004. From 1997 to 2000, arterial catheters were routinely replaced every 5 days; from 2000 to 2004, they were not routinely replaced. The investigators found that scheduled replacement of arterial catheters was associated with increased risk of BSI (68).
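Because the paragraph above quotes arterial catheter risk both as a percentage of catheters and as episodes per 1,000 catheter-days, a brief sketch of the two denominators may help. The counts below are hypothetical and are not taken from the cited studies; they simply show why the same data can look different under the two measures.

```python
# Hypothetical counts for illustration; not data from the studies cited above.
def percent_of_catheters(episodes: int, catheters: int) -> float:
    """Risk expressed per catheter (cumulative incidence)."""
    return episodes / catheters * 100

def rate_per_1000_catheter_days(episodes: int, catheter_days: int) -> float:
    """Risk expressed per unit of exposure time (incidence density)."""
    return episodes / catheter_days * 1000

# Example: 12 colonization episodes among 300 arterial catheters that were
# in place for a combined 1,500 catheter-days (mean dwell time of 5 days).
print(percent_of_catheters(12, 300))           # 4.0 (% of catheters)
print(rate_per_1000_catheter_days(12, 1500))   # 8.0 (per 1,000 catheter-days)
# With shorter average dwell times, the same 4% per-catheter risk maps to a
# higher per-1,000-catheter-day rate, which is why both figures are reported.
```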

Intuitively, it seems that a femoral catheter would engender a greater risk of infection than a radial one. One study by Lorente et al. attempted to answer this question (69); the authors performed a prospective observational study of all consecutive patients (2,018 patients with 2,049 arterial catheters) admitted to a medical-surgical ICU over 3 years. Multivariate analysis revealed that the catheter-related local infection rate was significantly higher for femoral than radial access (odds ratio [OR], 1.5; 95% confidence interval [CI], 1.10 to 2.1). The CR-BSI rate also was higher (femoral 1.92/1,000 catheter-days vs. radial 0.25/1,000 catheter-days) (OR, 1.9; 95% CI, 1.15 to 3.4) (70,71,59). Other investigators found a trend toward a higher rate of infection among femoral catheters (4.13 vs. 3.36 episodes per 1,000 catheter-days) (p = .72) when they compared 705 arterial catheters in the femoral location with 838 in the radial location. The investigators additionally found a higher rate of gram-negative infection at the femoral site (16 episodes [61.5%] vs. 7 [28%] in the radial location) (OR, 2.586; 95% CI, 1.051 to 6.363) (72). Femoral arterial catheters have also tended to be colonized more often than radial arterial catheters (hazard ratio, 5.08; 95% CI, 0.85 to 30.3; p = .075), and colonization was significantly more likely when the catheter was inserted in the operating room or emergency department (hazard ratio, 4.45; 95% CI, 1.42 to 13.9; p = .01) than in the ICU (73). As expected, an increased rate of infection is associated with increased catheter-days and length of ICU stay (74,75). Interestingly, older studies investigating arterial catheter infection rates have not found differences among radial, axillary, and femoral sites in rates of either local site infection or CR-BSI (76).
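The site-comparison studies above summarize their findings as odds ratios with 95% confidence intervals. For readers less familiar with that arithmetic, the following is a minimal sketch of how an OR and its log-based (Woolf) confidence interval are computed from a 2x2 table; the counts are hypothetical and are not taken from the cited studies.

```python
import math

# Hypothetical 2x2 counts (not data from references 69-73):
# "exposed" = femoral site, "comparator" = radial site.
def odds_ratio_with_ci(a: int, b: int, c: int, d: int, z: float = 1.96):
    """a/b = infected/uninfected exposed; c/d = infected/uninfected comparator."""
    odds_ratio = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of ln(OR)
    lower = math.exp(math.log(odds_ratio) - z * se_log_or)
    upper = math.exp(math.log(odds_ratio) + z * se_log_or)
    return odds_ratio, lower, upper

# Example: 30 of 700 femoral catheters infected vs. 15 of 700 radial catheters.
or_, lo, hi = odds_ratio_with_ci(30, 670, 15, 685)
print(f"OR {or_:.2f}, 95% CI {lo:.2f}-{hi:.2f}")  # OR 2.04, 95% CI 1.09-3.84
```

A confidence interval that excludes 1.0, as in this example, corresponds to a statistically significant difference; the hazard-ratio CI of 0.85 to 30.3 quoted above crosses 1.0, which is why that finding is described only as a trend.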

There are national guideline recommendations for arterial catheter site insertion; in adults, the use of the radial, brachial, or dorsalis pedis site is preferred over the femoral or axillary site to reduce infection. In children, the recommendation is to not use the brachial site (77).









TABLE 46.1 Published Case Reports of Coronary Stent Infections

Age in Years/Gender | Stent Type | Symptoms | Time of Presentation After Initial Procedure | Vessel; Complications | Diagnostic Tool | Organism | Therapy | Outcome
66/f | Palmaz-Schatz | Fever | 4 weeks | RCA; abscess, pericardial empyema | TEE | S. aureus | IV antibiotics, stent removal | Death
49/m | Palmaz-Schatz | Fever | 1 week | LAD; false aneurysm | Coronary angiogram | P. aeruginosa | IV antibiotics, surgery | Death
38/m | Palmaz-Schatz | Fever, chest pain | 4 days | LCX; false aneurysm | CT scan, coronary angiogram | P. aeruginosa | IV antibiotics, debridement, stent removal | Survival
54/m | AVE microstent | AMI, fever | 4 days | LAD; vessel destruction | None | S. aureus | None | Death
67/m | Not specified | Fever, chest pain, AMI | 4 days | LCX; abscess | CT scan | S. aureus | IV antibiotics | Survival
72/m | NIR | Fever, chest pain | 18 days | LAD; false aneurysm | Coronary angiogram | S. aureus | IV antibiotics, debridement, partial stent removal | Survival
55/m | Jostent Flex | Fever, chest pain | 14 days | RCA; pericarditis | TEE | CNRS, Candida spp. | IV antibiotics, IV antimycotics, stent removal | Survival
53/m | Jomed covered stent | Fever | 2 days | Vein graft; abscess | TTE, TEE | S. aureus | IV antibiotics, abscess drainage | Death
56/m | Cypher (sirolimus-eluting stent) | Fever | 4 days | LAD; mycotic aneurysm | Coronary angiogram | S. aureus | IV antibiotics | Survival
80/m | Jomed heparin-coated | Fever, chills | 5 days | LAD | CT scan | S. aureus | IV antibiotics | Survival

AMI, acute myocardial infarction; CNRS, coagulase-negative oxacillin-resistant staphylococci; CT, computed tomography; IV, intravenous; LAD, left anterior descending coronary artery; LCX, left circumflex coronary artery; MRI, magnetic resonance imaging; RCA, right coronary artery; TEE, transesophageal echocardiography; TTE, transthoracic echocardiography. Adapted with permission from Jarvis W, Murphy C, Hall K, et al. Health care-associated bloodstream infections associated with negative- or positive-pressure or displacement mechanical valve needleless connectors. Clin Infect Dis. 2009;49:1821-1827.



In addition to choosing the site that minimizes infection risk, the best aseptic technique must be used. Rijnders et al. studied colonization rates of arterial catheter tips and found no difference in the incidence of arterial catheter colonization when the catheter was inserted under maximal sterile barrier precautions (an HCW wearing a cap, mask, sterile gown, and sterile gloves and using a large sterile sheet, with the skin disinfected with 0.5% chlorhexidine in 70% alcohol) versus a standard-of-care approach in which handwashing was performed, sterile gloves were worn, and the same skin disinfection was applied (66). However, similar studies of CVC insertions have shown definite infection control value of maximal barrier precautions, whose benefit could be operator-dependent (e.g., of more benefit with less-experienced inserters). The current recommendation is to use a minimum of a cap, mask, sterile gloves, and a small sterile fenestrated drape while inserting peripheral arterial catheters. During axillary or femoral artery catheter insertion, maximal sterile barrier precautions should be used. The catheter should be replaced only when there is a clinical indication and should be removed when it is no longer needed (77). Arterial catheters are frequently accessed for blood sampling; therefore, the focus for decreasing the rate of infection should be on protecting the catheter hub during manipulation (69).

The infectious complications of arterial catheters have been studied in neonates also. In different centers, the incidence of colonization of indwelling umbilical artery catheters has varied from 6% to 60% (78,79,80,81). Neonates with very low birth weight who also received antibiotics for >10 days were at increased risk for umbilical artery catheter-related BSI (82). Unexpectedly, however, the incidence of colonization fails to increase with duration of catheterization, which suggests that catheters become contaminated initially or soon after insertion through the umbilical stump, an area that is heavily colonized and impossible to sterilize completely by local or systemic antibiotics. Indeed, the same organisms usually are isolated from both the umbilical cord and catheter in any individual patient. The most frequent contaminants are staphylococci, streptococci, and gram-negative bacilli, particularly Pseudomonas spp., Proteus spp., Escherichia coli, and Klebsiella spp. The clinical significance of umbilical catheter colonization is difficult to assess because the incidence of sepsis in most studies has been low. When serial prospective blood cultures have been obtained from umbilical-catheterized neonates, however, transient CR-BSI has been noted. In a prospective study of temporary (2 to 4 hours) umbilical catheterization for exchange transfusion, investigators documented a 60% incidence of catheter contamination and a 10% incidence of transient BSI due to Staphylococcus epidermidis (and, in one patient, Proteus spp.) that occurred 4 to 6 hours after transfusion; this study suggests that the risk of BSI from umbilical catheterization could be highest during catheter insertion and removal (83). This study and others found that prophylactic systemic antibiotics failed to reduce the incidence of catheter contamination and BSI. At present, systemic antibiotic prophylaxis does not appear to be beneficial during umbilical catheterization; instead, attention should be focused on meticulous cord preparation and care. Tincture of iodine should be avoided, to prevent the potential effect on the neonatal thyroid; topical antibiotics should be avoided as they may facilitate fungal infections and/or antimicrobial resistance (84). Other infectious complications of umbilical arterial catheters include mycotic aneurysm or pseudoaneurysm with or without hemoperitoneum (85,86,87). Umbilical catheters should be removed as soon as possible, and if possible should not be left in place for >5 days to avoid catheter-associated thrombosis (88,77).

A review of 200 published, prospective studies indicates that the pulmonary artery catheter-related BSI rate is 3.7 per 1,000 catheter-days (61). This is slightly higher than the rate for unmedicated and nontunneled CVCs (2.7 per 1,000 catheter-days). Flow-directed pulmonary artery catheters carry the added risk of right-sided endocarditis related to endocardial trauma and septic thrombosis of the great veins or pulmonary artery. One autopsy study found that 7% of 55 patients had endocarditis in association with these catheters (89). Studies that used multivariate analysis have identified a number of risk factors for infection associated with the use of pulmonary artery catheters (90). Strong independent predictors of an increased risk of catheter colonization were use of catheters in neonates and younger children, placement of the catheter without maximal barrier precautions, placement in an internal jugular (rather than a subclavian) vein, heavy cutaneous colonization at the insertion site, and prolonged catheterization, particularly >4 days (60,62,90,91,92,93,94,95,96). On the basis of at least one study (97), pulmonary artery catheters do not need to be changed more frequently than every 7 days. Catheters contained within a plastic sleeve have been shown to reduce the risk of CR-BSI (98). Pulmonary artery catheters that are inserted through a Teflon introducer are mostly heparin bonded, which reduces microbial adherence (99).


TRANSDUCERS

Pressure-monitoring devices are used regularly to monitor cardiovascular pressures of critically ill patients. Guidelines for preventing infections related to intravascular pressure monitoring have been formulated and updated by the CDC (77). Reusable transducers have been sources of HAIs in outbreaks of gram-negative BSI, candidemia, and dialysis-associated hepatitis (100). Disposable rather than reusable pressure transducers should be used; they can be used safely without being changed for 4 days, even in busy ICUs (101). Currently, it is recommended that disposable transducer assemblies be used and replaced every 96 hours and that all components of the pressure-monitoring system be kept sterile (77).


TRANSFUSION-ASSOCIATED INFECTIONS


BLOOD TRANSFUSION AND BACTEREMIA

Transfusion-associated sepsis is the leading cause of allogeneic blood transfusion-related death (102). The three main postulated mechanisms of bacterial contamination of blood products are the use of nonsterile tubing or collection bags due to improper manufacturing, bacteria from the donor’s skin or blood, and unsterile handling during preparation and/or storage (103). Now that systematic blood donor programs have greatly reduced the frequency of transfusion-transmitted viral infections by carefully screening donors and using nucleic acid testing (NAT; for HIV and HCV), transfusion-associated bacterial contamination is the most frequent transfusion-transmitted infection. The first case reports of transfusion-related sepsis appeared in the 1940s and 1950s and involved shock syndromes produced by transfusion of cold-stored blood contaminated with psychrophilic organisms able to survive and grow at 4°C (39°F), such as Achromobacter and some Pseudomonas spp. Prospective microbiologic studies soon followed these reports and documented a contamination rate of 1% to 6% in banked blood (104). Most contaminants were normal skin flora, presumably introduced with fragments of donor skin cored out during phlebotomy. Such contaminants usually were present in extremely low concentrations (several logarithmic factors below the level of ~100 organisms per milliliter of blood associated with transfusion sepsis), and multiplication of organisms during storage seemed unlikely because of the long lag phase produced by refrigeration and the antibacterial action of blood. Indeed, retrospective studies failed to document any clinical illness associated with the transfusion of blood that contained low-level skin flora contamination (105). Nevertheless, asymptomatic donors or donors with nonspecific gastrointestinal (GI) symptoms on rare occasions still could be a source of bacterial contamination, especially Yersinia enterocolitica contamination of red blood cell transfusions. Infections due to this contamination have been associated with a high mortality rate, particularly with units stored >25 days at 1° to 6°C (34° to 43°F). The donors presumably had asymptomatic bacteremia at the time of donation. An example of bacterial contamination of blood components during collection or processing is illustrated by an outbreak of S. marcescens traced to the use of blood bags intrinsically contaminated during manufacturing (106).

Investigators sought to determine the risk of bacterial contamination of blood components by combining data reported to the CDC from blood-collection facilities and transfusion services affiliated with the American Red Cross (ARC), AABB (formerly known as the American Association of Blood Banks), and Department of Defense blood programs from 1998 to 2000. A case was defined as any transfusion reaction meeting clinical criteria in which the same bacterial species was cultured from a blood component and from recipient blood and molecular typing confirmed the organism pair as identical. There were 34 episodes and nine deaths. The rate of transfusion-transmitted bacteremia (in events per million units) was 9.98 for single-donor platelets, 10.64 for pooled platelets, and 0.21 for red blood cell (RBC) units; for fatal reactions, the rates were 1.94, 2.22, and 0.13, respectively. Patients at greatest risk for death received components containing gram-negative organisms (OR, 7.5; 95% CI, 1.3 to 64.2) (107).
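The contamination and fatality figures above are expressed per million units. As a minimal, purely illustrative sketch of that conversion (the denominators below are hypothetical, since the source counts are not given here), the rate is simply the number of confirmed events divided by the number of units, scaled to one million:

```python
# Hypothetical counts for illustration; not the actual CDC/ARC/AABB denominators.
def events_per_million_units(events: int, units: int) -> float:
    return events / units * 1_000_000

# Example: 20 confirmed septic reactions after ~2,004,000 single-donor platelet
# units would correspond to ~9.98 per million units, the same order of
# magnitude as the figure cited above.
print(round(events_per_million_units(20, 2_004_000), 2))  # 9.98
```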

The French BACTHEM study assessed the determinants of transfusion-associated bacterial contamination using a matched case-control design. The cases were derived from a database of patients presenting during a 3-year period with a transfusion-related adverse event reported to the French blood agency as a suspected case of transfusion-associated bacterial contamination. Of the 158 episodes of suspected transfusion-associated bacterial contamination reported during the study period, 41 episodes and 82 matched controls were included. The bacteria were gram-negative organisms (42%), gram-positive cocci (28%), gram-positive rods (21%), or others (9%). The overall incidence rate of contamination was 6.9 per million units issued. The risk of contamination was >12 times higher after platelet pool transfusion and 5.5 times higher after apheresis platelet transfusion than after RBC transfusion. Gram-negative rods accounted for nearly 50% of the bacterial species involved and for six deaths. Risk factors included receipt of RBCs for pancytopenia, receipt of platelets for thrombocytopenia and pancytopenia, immunosuppressive treatment, a shelf life >1 day for platelets or 8 days for red blood cells, and >20 previous donations by the donor (108).

Such techniques as integration of diversion pouches into blood bags to divert the first 30 mL of blood during blood collection, in addition to skin disinfection, can significantly reduce the risk of bacterial contamination (109).


BLOOD TRANSFUSION AND PARASITEMIA

The frequent use of blood transfusions and the increased travel to and from countries where malaria is endemic have led to an increased occurrence of transfusion-related malaria. During the period 1911 to 1950, ~350 episodes of transfusion-associated malaria were reported worldwide. In contrast, during the period 1950 to 1972, the number of reported episodes was >2,000 (110). In the United States, 103 episodes of transfusion-induced malaria were reported during 1958 to 1998 (111,112). In 2010, CDC received 1,691 reported episodes of malaria; 1,688 of these episodes were classified as imported. Only one transfusion-related episode was reported among persons in the United States. The total number of episodes represents an increase of 14% from episodes reported for 2009. Plasmodium falciparum, P. vivax, P. malariae, and P. ovale were identified in 58%, 19%, 2%, and 2% of episodes, respectively. The number of episodes reported in 2010 marked the largest number reported since 1980 (113). Still, malaria transmitted through blood transfusion now is rare in the United States and occurs at a rate of <1 per 1 million units of blood transfused (114).

Although P. falciparum is the most commonly found malaria infection in the United States, P. malariae has been the most common cause of transfusion-associated malaria worldwide, accounting for almost 50% of episodes. P. vivax and P. falciparum are second and third in worldwide incidence, respectively. This ordering probably reflects the fact that although P. malariae infection can persist in an asymptomatic donor for many years, the longevity of P. vivax malaria in humans rarely exceeds 3 years and that of P. falciparum rarely 1 year. Hence, there is a higher chance for an asymptomatic donor infected with P. malariae to escape detection and become the source of a contaminated transfusion.

The AABB adopted recommended guidelines for the selection of blood donors to prevent transmission of malaria in 1970, but these were relaxed in 1974 (111,112). In the changes added to the 24th edition of the Standards for Blood Banks and Transfusion Services, prospective donors who have a definite history of malaria are deferred for 3 years after becoming asymptomatic. Individuals who have lived for at least 5 consecutive years in areas considered malaria-endemic by the CDC are deferred for 3 years after departure from the area(s). These standards are still in effect. Travelers who have visited a malaria-endemic area are deferred from donating blood for 1 year after their return (115).

To prevent blood donor deferrals, investigators are developing molecular diagnostic techniques to screen blood for parasites. One group developed a pan-Plasmodium polymerase chain reaction (PCR) test that detects all five human Plasmodium spp. This technique detected 78/78 smear film-positive and 19/101 (19%) of smear-negative samples from asymptomatic individuals in Ghana (116). Another molecular diagnostic (real-time PCR) to detect P. vivax was used to identify blood
donors infected with malaria parasites. Samples from 595 blood donors were collected in northern Brazil. The assay identified eight healthy individuals in the sample (1.34%) infected with P. vivax at the time of blood donation. The real-time PCR with TaqMan® probes enabled the identification of P. vivax in clinically healthy donors, demonstrating the need to use sensitive screening methods to detect malaria in blood banks (117). Still, the identification of donors with the potential to transmit malaria depends largely on the donor exposure history obtained during the donor interview. To minimize donor loss associated with deferrals for malaria risk, on November 16, 2009, the FDA again sought advice from the Blood Products Advisory Committee (BPAC) on an alternative strategy to minimize donor loss. The draft guidelines are available on the FDA Web site (118).

Because platelet and leukocyte preparations also have been incriminated in the transmission of malaria, the guidelines must be applied to potential donors of any formed elements of blood.

Chagas disease (American trypanosomiasis) is prevalent throughout South and Central America and is spreading into nonendemic countries. The potential for bloodborne transmission is high because some infected individuals become asymptomatic yet maintain persistent parasitemia for 10 to 30 years. After 10 days of storage, the infectivity of blood contaminated with this parasite declines, but storage is not a useful method for preventing transmission; moreover, the parasite remains viable in whole blood and RBCs stored at 4°C (39°F) for ≥21 days. Serologic screening of blood donors has become mandatory in many South American countries. Transfusion-associated Chagas disease also has become an issue for U.S. blood banks, owing to increased immigration and a growing number of potentially infectious U.S. blood donors. In the early 1990s, it was estimated that about 100,000 infected people lived in the United States (119). CDC now estimates that >300,000 persons with Trypanosoma cruzi infection live in the United States (120). In 2006, 1 in 2,000 blood donors in the Los Angeles metropolitan area was positive for T. cruzi antibodies (121), compared with 1 in 7,500 in 1998 (122).

Eight episodes of transfusion-transmitted Chagas disease have been reported in the United States and Canada since the mid-1980s (123,124,125,126,127,128,129). The ARC, Canadian Blood Services, and Spanish Red Cross reviewed transfusion-transmitted T. cruzi episodes and recipient tracing, undertaken in North America and Spain. They found T. cruzi infection in 20 transfusion recipients linked to 18 serologically confirmed donors between 1987 and 2011, including 11 identified only by recipient tracing. There were 11 definite transmissions, from implicated apheresis or whole blood-derived platelets, none by RBCs or frozen products (130).

Given the increasing rate of T. cruzi infection in the United States, sensitive detection tools are needed to prevent further transmission via blood products. An investigational enzyme-linked immunosorbent assay (ELISA) for detecting T. cruzi antibodies in blood donations, developed in 2005 and manufactured by Ortho-Clinical Diagnostics (Raritan, New Jersey), was evaluated by the ARC during August 2006 to January 2007. Of the 148,969 blood samples tested, 63 were repeat reactive for T. cruzi antibodies, and 32 donations (~1 in 4,655) were confirmed as positive for T. cruzi antibodies by a radioimmunoprecipitation assay (131). The French Blood Services introduced systematic screening of at-risk blood donors for anti-T. cruzi antibodies in May 2007. From May 2007 to December 2008, 163,740 of 4,637,479 donations (3.5%) were screened. The prevalence of anti-T. cruzi antibodies was 1 in 32,800 screened donations (132). More recently, from 2007 until 2011, the New York Blood Center screened donations for the presence of T. cruzi antibodies using an FDA-approved ELISA; 204 of 1,066,516 unique donors screened (0.019%) were T. cruzi antibody-positive. At least 154 units from 29 of the confirmed-positive donors had been transfused to 141 recipients. Forty-eight of the 141 recipients were alive for testing, and seven underwent T. cruzi screening. Two recipients were found to be immunofluorescence assay (IFA)-positive. Both IFA-positive recipients received a leukoreduced apheresis platelet unit (two separate donations) from the same confirmed-positive donor. Look-back analysis thus identified the first two episodes of probable transfusion-transmitted T. cruzi infection since the implementation of the national screening program, increasing the total number of reported episodes in the United States to eight (133).

The FDA has approved two blood donor screening tests for T. cruzi. The agency has not yet required, but does recommend, that donors be screened for antibodies. The ARC and Blood Systems, Inc., the blood-collection organizations that are responsible for ~65% of the U.S. blood supply, however, began screening all donations for T. cruzi on January 29, 2007. AABB has established the Web-based Chagas’ Biovigilance Network to track the results of the testing (134). Finally, the recommendations in the December 2010 Guidance for Industry “Use of Serological Tests to Reduce the Risk of Transmission of Trypanosoma cruzi Infection in Whole Blood and Blood Components Intended for Transfusion” are summarized as follows: ask all presenting allogeneic donors whether they have a history of Chagas disease; test each allogeneic donor one time for antibodies to T. cruzi and allow donors with nonreactive results to donate without further testing of subsequent donations for antibodies to T. cruzi; and indefinitely defer any donor with a “yes” response to the question or a repeat reactive result on an FDA-licensed test (135).

Toxoplasmosis also can be transmitted via blood transfusion. One prospective survey of thalassemia patients who were frequently transfused detected subclinical toxoplasmosis at a rate comparable to that seen in a control group and therefore was considered to be evidence against the transmission of toxoplasmosis by transfusion (136). Another study found, however, that patients treated for acute leukemia acquired toxoplasmosis after leukocyte transfusions from donors with chronic myelogenous leukemia; serologic data retrospectively obtained from donors revealed elevated anti-Toxoplasma antibody titers (137). This inferential evidence for transfusion-associated toxoplasmosis is supported by the findings that the disease can be transmitted between animals by transfusion, that Toxoplasma organisms retain their viability in stored blood for 50 days, and that organisms can be recovered from the blood buffy-coat layers of patients with toxoplasmosis. Because toxoplasmosis seems likely to be transmitted when large concentrations of leukocytes are transfused, and because all of the implicated leukocyte donors had chronic myelogenous leukemia, it is recommended that blood or leukocytes from patients with leukemia not be used, especially because the recipients’ host defenses usually are severely compromised. There are no known episodes of transmission of toxoplasmosis from RBCs or fresh frozen plasma (FFP). One possible episode of platelet transfusion toxoplasmosis has been reported (138).


Babesia microti is endemic in the United States—in Connecticut, Massachusetts, New Jersey, New York, Rhode Island, Minnesota, and Wisconsin. B. divergens is primarily a bovine parasite, but human infections have been documented in immunocompromised hosts in Europe. Before January 2011, there was not a standard case definition of Babesia infection for reporting purposes in the United States. In 2011, national surveillance for human babesiosis was begun in 18 states and one city, using a standard case definition developed by CDC and the Council of State and Territorial Epidemiologists. For the first year of babesiosis surveillance, health departments notified CDC of 1,124 confirmed and probable cases; 1,092 cases (97%) were reported by seven states (Connecticut, Massachusetts, Minnesota, New Jersey, New York, Rhode Island, and Wisconsin). Ten cases of babesiosis in transfusion recipients were classified, by the reporting health departments, as transfusion-associated (139).

There have been >100 reports of transfusion-transmitted Babesia (TTB) in the last 30 years (140,141). Donations from a group of blood donors in Babesia-endemic areas of Connecticut were seropositive 1.4% of the time, and >50% of those had demonstrable parasitemia (142). CDC summarized 159 U.S. episodes of TTB from 1979 to 2009, largely from blood centers in the U.S. Northeast. Most of the episodes were associated with red cell components, and peak TTB periods occurred from July to October. Seventy-seven percent of the episodes were reported in the last 10 years, suggesting either an increase in the frequency or better surveillance monitoring for TTB (143,144,145,146). In fact, B. microti is the most frequently reported transfusion-transmitted infectious agent in the United States.

Screening using real-time PCR and indirect IFA was effective in preventing the transmission of B. microti in a laboratory-based blood donor screening program for B. microti (147). However, there is no currently licensed test available for screening U.S. blood donors for evidence of Babesia infection (148). Donors with a history of babesiosis are indefinitely restricted from donating blood because of the possibility of ongoing asymptomatic parasitemia (149).

Visceral leishmaniasis (VL) caused by Leishmania infantum, a zoonotic disease endemic throughout the Mediterranean basin, can exist as an asymptomatic human infection and therefore there is the risk of transmission by blood transfusion. Riera et al. looked at the prevalence of Leishmania infection in 1,437 blood donors from the Balearic Islands (Majorca, Formentera, and Minorca) using immunologic (Western blot and delayed-type hypersensitivity [DTH]), parasitologic (culture), and molecular (nested PCR) methods. In addition, the efficiency of leukoreduction by filtration to remove the parasite was tested by nested PCR in the RBC units. Leishmania antibodies were detected in 44 of the 1,437 blood donors tested (3.1%). A sample of 304 donors from Majorca was selected at random. L. infantum DNA was amplified in peripheral blood mononuclear cells (PBMNCs) in 18 of the 304 (5.9%), and cultures were positive in 2 of the 304 (0.6%). DTH was performed on 73 of the 304 donors and was positive for eight of them (11%). Of the 18 donors with positive L. infantum nested PCR, only two were seropositive. All the RBC samples tested (13 of 18) from donors with a positive PBMNC nested PCR yielded negative nested PCR results after leukodepletion.

Cryptic Leishmania infection is highly prevalent in blood donors from the Balearic Islands. DTH testing and L. infantum nested PCR appear to be more sensitive than serology for detecting asymptomatic infection. The use of leukodepletion filters appears to remove parasites from RBC units efficiently (150).

The U.S. Army has reported a number of cases of cutaneous leishmaniasis among U.S. service members deployed to Iraq (150A); therefore, donors who have traveled to or lived in Iraq or Afghanistan are deferred from donating blood for 1 year from their last date of departure. Deferral for a history of leishmaniasis has been discussed, but no regulation or standard covering civilian blood banks exists (151).

As molecular technology, such as PCR, becomes more sophisticated and available, we will be better equipped to efficiently identify parasites in donated blood and prevent transfusion-associated transmission (152,153) without causing the deferral of a large number of noninfectious donors or significantly increasing costs.


PLATELET TRANSFUSION

Approximately 9 million platelet unit concentrates are estimated to be transfused in the United States each year, and 1 in 1,000 to 3,000 platelet units is estimated to be contaminated with bacteria, making contaminated platelets a potential source of transfusion-associated sepsis (154,155) and of fatalities related to transfusion-transmitted disease in developed countries. It is estimated that the risk of bacteria-related death after the transfusion of a platelet unit ranges from 1:7,500 to 1:500,000 (156,157,158).

In fact, transfusion-transmitted bacterial contamination of platelets is the most common cause of infectious complication in transfusion medicine (159). A review of data from the conventional arms of two studies (160) revealed that hematology-oncology patients receiving many units of apheresis platelets can have a risk as high as 1 in 250 of receiving a contaminated unit (161).

Because it is recommended that platelets be stored at room temperature (20° to 24°C/68° to 75°F) to increase in vivo half-life, concern over the true incidence of intrinsic contamination and the possible proliferation of contaminants during storage is justified. Of interest, platelets historically have been stored at 4°C (39°F). In 1969, Murphy and Gardner demonstrated that platelet storage at 22°C (72°F) led to improved in vivo viability and function as compared to storage at 13°C, 20°C, and 37°C (55°F, 68°F, 99°F) (162). These observations led to the current practice of storing platelets at room temperature for up to 5 days. It seems reasonable to assume that platelet concentrates are as susceptible to contamination during collection as is blood, which is routinely found to have a 1% to 6% incidence of low-level contamination. Moreover, platelet concentrates, unlike blood, have no protective antibacterial activity, and platelet transfusions are frequently obtained by pooling the contributions of several donors, which additionally increases the risk of contamination. Despite this seemingly negative picture, most bacterial contaminants isolated from platelet concentrates have been normal skin flora, such as S. epidermidis
