VTE prophylaxis in colorectal surgery patients increases over time, but VTE rates remain unchanged, study finds
Prophylaxis against venous thromboembolism (VTE) increased significantly among colorectal surgery patients, but their rate of VTE occurrence did not change, a study found.
The prospective study included 16,120 patients (mean age, 61.4 years; 54.5% female) who underwent colorectal surgery in Washington State between 2006 and 2011. The main outcome was VTE within 90 days of surgery. Results were published in JAMA Surgery on June 10.
The percentage of patients receiving VTE chemoprophylaxis increased significantly over the studied time period. Perioperative prophylaxis increased from 31.6% to 86.4%, in-hospital postoperative use went from 59.6% to 91.4%, and postdischarge prophylaxis rose from 8.6% to 11.7% (P<0.001 for trend for all). Meanwhile, VTE rates showed no significant changes over time, occurring in 2.2% of the patients. After adjustment, the researchers found that older age, nonelective surgery, history of VTE, and operations for inflammatory disease were associated with increased VTE risk.
The study can't explain why increased use of prophylaxis hasn't reduced VTE rates, the authors said. Possible explanations include that identification of VTE has increased due to closer surveillance or that appropriate patients are being treated with prophylaxis but not all VTEs are preventable. Another issue is that postoperative VTE risk peaks 3 weeks after surgery, but only 10.6% of patients were discharged on a prophylaxis regimen. However, “it becomes difficult to argue for more prophylaxis in the discharge setting” when VTE hasn't decreased despite increasing prophylaxis in the hospital, the authors said. The study's findings should influence future research and guidelines, they concluded.
The study “adds to the mounting body of evidence regarding the lack of benefit of an aggressive VTE prevention strategy,” according to an accompanying editorial. Unnecessary prophylaxis could result in bleeding complications, and financial penalties for discovering silent VTE could lead to reluctance to perform imaging, the editorialists said. Thus, CMS’ inclusion of postoperative VTE rates as a Patient Safety Indicator “may ultimately prove to be detrimental to patient care.”
Targeted perioperative intervention lowers rates of complex S. aureus SSIs, study shows
A perioperative bundled intervention, including screening and decolonizing, was associated with a modest, statistically significant decrease in complex Staphylococcus aureus surgical site infections (SSIs) in patients undergoing cardiac operations or hip or knee arthroplasties, a study found.
The Study to Optimally Prevent SSIs in Select Cardiac and Orthopedic Procedures (STOP-SSI) included 42,534 operations among 38,049 unique patients in 20 hospitals. The study was published on June 2 in the Journal of the American Medical Association.
The bundle included screening for methicillin-resistant S. aureus (MRSA) or methicillin-susceptible S. aureus (MSSA) and having positive patients apply mupirocin intranasally twice daily and bathe daily with chlorhexidine gluconate (CHG) for up to 5 days before surgery. MRSA carriers also received vancomycin plus cefazolin or cefuroxime, and the other patients received cefazolin or cefuroxime alone. Patients with negative screens were bathed with CHG the night before and the morning of their operations.
In the hospital-level time-series analysis, a Poisson regression model showed that, during the intervention, the monthly rates of complex S. aureus SSIs decreased from 36 to 21 per 10,000 operations (mean difference, −15 [95% CI, −35 to −2]; rate ratio [RR], 0.58 [95% CI, 0.37 to 0.92]). The number of months without any complex S. aureus SSIs increased from 2 of 39 months (5.1%) to 8 of 22 months (36.4%; P=0.006). After the researchers controlled for a number of factors, implementation of the bundle was associated with a significant reduction in the complex infections (odds ratio, 0.60 [95% CI, 0.37 to 0.98]).
Subgroup analyses showed that the rates of complex S. aureus SSIs decreased significantly after scheduled operations (RR, 0.55; 95% CI, 0.35 to 0.86) but did not decrease after urgent or emergent operations. Rates decreased significantly for hip or knee arthroplasties (difference per 10,000 operations, −17 [95% CI, −39 to 0]; RR, 0.48 [95% CI, 0.29 to 0.80]) but not for cardiac operations (difference per 10,000 operations, −6 [95% CI, −48 to 8]; RR, 0.86 [95% CI, 0.47 to 1.57]).
After a 3-month phase-in period, bundle adherence was 83% (39% full adherence; 44% partial adherence). Infections decreased significantly among patients in the fully adherent group compared with the pre-intervention period (RR, 0.26; 95% CI, 0.10 to 0.69) but not in the partially adherent or nonadherent group (RR, 0.80; 95% CI, 0.49 to 1.31). After implementation, the rate of complex S. aureus SSIs was 50 per 10,000 operations done by surgeons who implemented at least part of the bundle, compared with 240 per 10,000 operations among surgeons who implemented none of it.
The study authors concluded that their bundle was associated with a modest but statistically significant decrease in complex S. aureus SSIs. They noted limitations, such as variance in surveillance for SSI among the hospitals, although surveillance practices remained consistent throughout the study. The authors also noted that the results may not be generalizable to large academic health centers or hospitals without strong quality-improvement infrastructures.
“Although the absolute difference of 15 infections per 10,000 operations seems modest, each complex SSI prevented is clinically meaningful,” according to an accompanying editorial. The editorialist wrote that although S. aureus is the principal pathogen in terms of SSI prevalence and associated morbidity, many other organisms also cause SSIs. Therefore, decolonization of MRSA and MSSA is only one aspect of SSI prevention, and additional strategies are still needed in order to bring the rate closer to the goal of zero.
Coronary CT angiography shows similar outcomes, better patient experience compared to MPI
Coronary CT angiography and radionuclide myocardial perfusion imaging (MPI) had similar outcomes in patients admitted with chest pain, a recent study found.
The trial included 400 patients (63% women and 95% ethnic minorities) who were admitted to the telemetry ward of an inner-city medical center with acute chest pain and randomized to either CT angiography or MPI. Results were published online June 9 in Annals of Internal Medicine.
The 2 groups had similar rates of the study's primary outcome, cardiac catheterization not leading to percutaneous or surgical revascularization within 1 year: 15% of the CT patients versus 16% of the MPI patients. (The primary goal of noninvasive coronary imaging was to select patients who might benefit from revascularization and to avoid cardiac catheterization in the remaining patients.) Ten percent of MPI patients underwent revascularization, compared with 7.5% of the CT group (hazard ratio, 0.77; 95% CI, 0.40 to 1.49; P=0.44). The CT and MPI patients had similar results on a number of other outcomes: length of stay (28.9 vs. 30.4 hours), death (0.5% vs. 3%; P=0.12), nonfatal cardiovascular events (4.5% vs. 4.5%), rehospitalization (43% vs. 49%), ED visits (63% vs. 58%), and outpatient cardiology visits (23% vs. 21%). The CT group did have significantly lower total radiation exposure (24 vs. 29 mSv; P<0.001) and was more likely to grade the experience favorably (P=0.001) and to be willing to undergo the examination again (P=0.003).
This study was the first direct comparison of the imaging strategies in this patient population, and showed that the tests did not differ on outcomes or resource utilization, but that CT was associated with less radiation exposure and a more positive patient experience, the study authors concluded. The study's finding of no significant differences in rates of catheterization and percutaneous coronary intervention differs from previous research, the authors noted, suggesting that differing care settings could be partially responsible. The single-center setting limits the generalizability of this study, and decisions to perform cardiac catheterization and revascularization were made clinically without an algorithm, they said.
Acute burden of illness associated with earlier readmissions
Different factors may be associated with increased risk of hospital readmission within 7 days or within 30 days, a recent study found.
The single-center retrospective cohort study included 13,334 admissions of 8,078 patients in 2009 and 2010. Readmissions were considered early if they occurred within 0 to 7 days after discharge and late if they occurred between 8 and 30 days after discharge. Results were published in the June 2 Annals of Internal Medicine.
The early readmissions, unlike the late readmissions, were associated with factors related to the patient's burden of acute illness—for example, patients were more likely to be readmitted within 7 days if they had a longer length of stay (odds ratio [OR], 1.02; 95% CI, 1.00 to 1.03) or were seen by a rapid response team (OR, 1.48; 95% CI, 1.15 to 1.89). Both early and late readmissions were associated with some markers of increased burden of chronic illness, such as receiving a medication indicating organ failure, and with social issues, such as barriers to learning.
The study also found that early readmissions were less likely in patients discharged between 8 a.m. and 1 p.m. (OR, 0.75; 95% CI, 0.58 to 0.99) and that late readmissions were more common in patients with unsupplemented Medicare or Medicaid coverage (OR, 1.16; 95% CI, 1.01 to 1.33). The authors concluded that the causes of readmissions within 30 days may vary by the time since discharge, and thus, the strategies to prevent these readmissions may need to differ, since hospitals may have less ability to affect late readmissions.
An accompanying editorial expressed some doubts about the conclusions drawn by the study authors, noting that the CIs for some of their factors had substantial overlap between the early and late readmission categories. Further investigation is needed into the possibility that predictors of readmission change with time, the editorialists said, noting that “any markers that differentiate patients should have the potential to improve outcomes through higher quality care.” It is preferable to tailor interventions based on individual patient factors, rather than the time after discharge, the editorial said.
Benefit from perioperative beta-blockers in noncardiac surgery may depend on risk factors, study finds
Patients undergoing noncardiac surgery who have multiple cardiac risk factors appear to benefit from perioperative beta-blocker use, but those with fewer risk factors may not, according to a recent study.
Researchers performed an analysis of 326,489 patients with varying cardiac risk factors undergoing surgery in Veterans Affairs hospitals from Oct. 1, 2008, through July 31, 2013.
Of the patients included in the analysis, 314,114 (96.2%) underwent noncardiac surgery and 12,375 (3.8%) underwent cardiac surgery. A total of 132,614 (42.2%) of patients undergoing noncardiac surgery received a beta-blocker, whereas 8,571 (69.3%) of the cardiac surgery patients received one. A simple cardiac risk score based on the Revised Cardiac Risk Index was determined for each patient by assigning 1 point each for serum creatinine >2.0 mg/dL, coronary artery disease, diabetes mellitus, or surgery involving the thoracic or abdominal cavity.
The study, published online May 27 in JAMA Surgery, found that beta-blocker therapy was associated with significantly lower mortality (odds ratio, 0.63) in patients with 3 to 4 cardiac risk factors undergoing noncardiac surgery. Beta-blocker therapy was associated with a nonsignificant decrease in mortality (odds ratio, 0.95) in patients with 1 to 2 risk factors but with significantly higher mortality (odds ratio, 1.19) in noncardiac surgery patients with no risk factors.
The unadjusted 30-day mortality rate in patients with no cardiac risk factors undergoing noncardiac surgery who did not receive beta-blockers was 0.5%, compared to 1.0% in those who received beta-blockers. For those with 1 to 2 cardiac risk factors, unadjusted 30-day mortality rates were 1.4% without beta-blockers and 1.7% with beta-blockers. For those with 3 to 4 cardiac risk factors, receiving beta-blockers decreased the unadjusted 30-day mortality rate from 6.7% to 3.5%.
For the cardiac surgery patients, odds ratios for mortality with beta-blocker therapy versus no beta-blocker therapy were 0.70 in those with no cardiac risk factors, 1.25 in those with 1 to 2 risk factors, and negligible in those with 3 to 4 risk factors.
The authors noted that their study included mostly men, that the specific beta-blockers used were not known, and that data on when patients first took a beta-blocker (at home or in the hospital) were not available, among other limitations. However, they concluded that perioperative beta-blocker therapy is beneficial in patients undergoing noncardiac surgery who have 3 to 4 cardiac risk factors but has no efficacy in those with 1 to 2 risk factors. “Most important, the use of [beta]-blockers in patients with no cardiac risk factors appears to be associated with a higher risk of death, which has, to our knowledge, not been previously reported,” they wrote.
Short-course antimicrobial therapy yields similar outcomes to longer therapy in intra-abdominal infection, study finds
Approximately 4 days of antibiotic therapy yielded outcomes similar to those with approximately 8 days of treatment in patients with intra-abdominal infection who had undergone an adequate source-control procedure, according to a recent study.
Researchers randomly assigned patients with complicated intra-abdominal infection and adequate source control to the control group or the experimental group. Source control was defined as “procedures that eliminate infectious foci, control factors that promote ongoing infection, and correct or control anatomical derangements to restore normal physiological function.” The control group received a longer course of antibiotics, defined as lasting until 2 days after resolution of fever, leukocytosis, and ileus (10 days maximum), and the experimental group received a fixed course of treatment lasting 4±1 days. The study's primary outcome was surgical-site infection, recurrent intra-abdominal infection, or death 30 days or less after the index source-control procedure; secondary outcomes were therapy duration and rates of subsequent infections. The results were published in the May 21 New England Journal of Medicine.
From August 2008 through August 2013, 518 patients were randomly assigned to a treatment group, although 1 patient in the experimental group withdrew consent after randomization. Most patients were men, and the mean age was 52.2 years. Two hundred eleven of 258 patients in the experimental group and 189 of 260 patients in the control group adhered to the protocol (81.8% vs. 72.7%; P=0.02). The composite primary outcome occurred in 56 of 257 patients in the experimental group (21.8%) versus 58 of 260 patients in the control group (22.3%; absolute difference, −0.5 percentage point; P=0.92). Duration of therapy was 4.0 days in the experimental group versus 8.0 days in the control group (P<0.001). The researchers found no between-group differences for rates of any component of the primary outcome or in any secondary outcome.
The authors noted that their trial excluded patients without adequate source control and included only a few patients with immunosuppression, that nonadherence rates for the study protocol were moderately high, and that the original calculated sample size was not achieved, meaning that “the null hypothesis of equal efficacy cannot be rejected.” However, they concluded that outcomes appear to be similar in patients with intra-abdominal infections and a successful source-control procedure, whether they receive a fixed-dose 4-day course of antimicrobial therapy or a longer course given until signs and symptoms of sepsis resolve.
The authors of an accompanying editorial pointed out that the study did not have data on antibiotic-related adverse events, between-group differences in postoperative hospital stays, or antibiograms of organisms in patients with complications. However, they added that shorter antibiotic courses as described in this study could save the U.S. health care system $97 million per year and could decrease the burden of adverse events related to antibiotic use. The editorialists were concerned that over 20% of patients in both study groups had post-treatment complications and said it seems likely that inadequate source control was partly to blame, although some might feel that more days of antibiotics were needed.
“In the future, there may be improved approaches to source control in abdominal sepsis and safer antibiotics for limiting microbial growth,” the editorialists wrote. “In the meantime, we have encouraging data from the [current] trial that suggest cost savings and improved safety.”
Optimal duration of antiplatelet therapy after drug-eluting stent remains unclear, study finds
A systematic review and meta-analysis found that the benefit of shorter versus longer dual-antiplatelet therapy after placement of a drug-eluting stent (DES) is unclear.
Researchers searched Ovid MEDLINE and EMBASE from 1996 to March 27, 2015, for randomized, controlled trials that compared durations of dual-antiplatelet therapy after DES placement. The analysis focused on all-cause death, myocardial infarction (MI), and major bleeding, as well as cardiovascular mortality, any stroke, and recurrent revascularization. Dual-antiplatelet therapy was defined as aspirin plus a P2Y12 inhibitor.
Studies were included if they had a shorter-duration treatment arm in which patients received dual-antiplatelet therapy for at least 3 months after DES placement and a longer-duration treatment arm in which patients received such treatment for at least 6 months longer than the shorter-duration arm. The study results were published online May 26 by Annals of Internal Medicine.
Nine trials with complete data for 28,808 patients, mostly hypertensive men in their 60s, were included in the meta-analysis. The difference in duration between the longer and shorter arms ranged from 6 months to 24 months, and clopidogrel was the most commonly used P2Y12 inhibitor. According to moderate-quality evidence, longer dual-antiplatelet therapy decreased MI risk (risk ratio, 0.73 [95% CI, 0.58 to 0.92]) and increased mortality risk (risk ratio, 1.19) versus shorter therapy. According to high-quality evidence, longer dual-antiplatelet therapy increased major bleeding risk (risk ratio, 1.63 [95% CI, 1.34 to 1.99]).
The researchers said that their confidence in their estimates was affected by imprecision (especially for MI), bias due to lack of blinding, variable use of first- and second-generation stents, and off-protocol use of dual-antiplatelet therapy, among other limitations. However, they concluded that longer dual-antiplatelet therapy after DES placement is associated with approximately 8 fewer MIs and 6 more major bleeding events per 1,000 treated patients annually compared to shorter therapy.
“Because absolute effects are very small and closely balanced, decisions regarding the duration of [dual-antiplatelet therapy] must take into account patients' values and preference judgments,” the authors concluded. They noted that patients who don't feel the benefits of the drugs outweigh the risk, those who are afraid to risk bleeding events, and those who are generally risk-averse are not likely to opt for extended therapy.
Protocol for dabigatran management avoids bridging, shows low risk of bleeding
A protocol for perioperative management of dabigatran resulted in few bleeding or thromboembolic complications, a recent study found.
The study included 541 patients who were treated with dabigatran and scheduled for an invasive procedure. At a baseline visit 1 week before the procedure, their creatinine clearance was calculated and data on patient characteristics, current antithrombotic treatment, and type of surgery were collected. Based on procedure-related bleeding risk and creatinine clearance, the protocol determined whether the patient's last dose of dabigatran should be taken 24, 48, or 96 hours before surgery. The protocol also recommended a time and dose for resumption of dabigatran based on the procedure.
Sixty percent of patients underwent a procedure with standard risk of bleeding, while 40% had a procedure associated with increased bleeding risk. The last dose of dabigatran was given 24 hours before surgery in 46% of patients, while 37% stopped 48 hours before and 6% stopped 96 hours before. The protocol's recommendations were followed 89% of the time. After surgery, dabigatran was resumed according to protocol 77% of the time, with 40% of patients restarting with a 75-mg dose on the day of procedure and 73% restarting by 2 days.
Overall, 10 patients (1.8%; 95% CI, 0.7% to 3.0%) had major bleeding within 30 days of surgery and 28 (5.2%; 95% CI, 3.3% to 7.0%) had minor bleeding. The only thromboembolic complication was 1 transient ischemic attack (0.2%; 95% CI, 0% to 0.5%). There were 4 deaths unrelated to bleeding or thrombosis. No preoperative heparin bridging was used, but 9 patients (1.7%) received postoperative bridging. The study was published by Circulation on May 12.
Based on the results, the protocol appeared to be feasible and safe and showed no need for bridging except when patients could not take oral medications or when heparin was indicated (i.e., after vascular surgery), the study authors concluded. They noted that the study's incidence of major bleeding was lower than that reported by previous studies of perioperative management of vitamin K antagonists, which often entail heparin bridging.
The American Society of Regional Anesthesia currently recommends stopping dabigatran 5 days before surgery with neuraxial block because of concerns (but not evidence) about epidural hematoma risk. This study included 13 such surgeries with no related bleeding complications, although the 95% CI for this risk would be 0% to 25%, the authors noted. The study was also limited by its lack of a randomized, controlled design.
Digoxin linked to increased mortality risk
Digoxin may be associated with increased mortality risk, especially in patients with atrial fibrillation, according to a systematic review and meta-analysis.
Researchers looked at English-language studies published in peer-reviewed journals beginning in 1993. Randomized, controlled trials, case-control studies, and cohort studies were included if they examined the relationship between digoxin and all-cause mortality in patients with atrial fibrillation or congestive heart failure and reported effect sizes as hazard ratios. The study results were published online May 4 by the European Heart Journal.
Nineteen studies were included in the meta-analysis. Of these, 9 involved patients with atrial fibrillation, 7 involved patients with congestive heart failure, and 3 involved patients with both conditions. A total of 326,426 patients were included, with follow-up ranging from 0.83 to 4.7 years. The researchers analyzed adjusted mortality results and found that digoxin use was associated with an increased relative risk of all-cause mortality versus no digoxin use (hazard ratio, 1.21; 95% CI, 1.07 to 1.38; P<0.01). Among the 235,047 patients with atrial fibrillation, those taking digoxin had a higher mortality risk than those who were not (hazard ratio, 1.29; 95% CI, 1.21 to 1.39; P<0.01). A higher mortality risk associated with digoxin was also seen in the 91,379 patients with heart failure, but to a lesser extent (hazard ratio, 1.14; 95% CI, 1.06 to 1.22; P<0.01).
The authors noted that they did not have access to individual patient data, that residual confounding may have affected their results, and that data on digoxin dose or plasma levels were limited. However, they concluded that based on their findings, digoxin therapy appears to be associated with higher mortality risk in patients with atrial fibrillation and those with congestive heart failure, especially the former group. Randomized, controlled trials of dose-adjusted digoxin therapy are needed, the authors wrote. “Until such proper randomized controlled trials are being completed, digoxin should be used with great caution (including monitoring plasma levels), particularly when administered for rate control in AF,” they concluded.