Eur Urol Oncol
BACKGROUND: Patients with residual muscle-invasive urinary tract cancer after neoadjuvant chemotherapy (NAC) have a high risk of recurrence. OBJECTIVE: To retrospectively evaluate whether additional adjuvant chemotherapy (AC) improves outcomes compared with surveillance in patients with significant residual disease despite NAC. DESIGN, SETTING, AND PARTICIPANTS: We identified 474 patients who received NAC from the Retrospective International Study of Cancers of the Urothelium database, of whom 129 had adverse residual disease (≥ypT3 and/or ypN+). OUTCOME MEASUREMENTS AND STATISTICAL ANALYSIS: Time to relapse (TTR) was the primary endpoint, assessed starting from 2mo after surgery to minimize immortal time bias. Secondary endpoints included overall survival (OS), incidence of AC use, and chemotherapy patterns. Kaplan-Meier and Cox regression models estimated TTR, OS, and associations with AC, adjusting for the type of NAC, age, and pathological stage in multivariable analyses. RESULTS AND LIMITATIONS: A total of 106 patients underwent surveillance, while 23 received AC. Gemcitabine-cisplatin was the most frequent regimen in both settings (30.4%), and the majority (82.6%) of patients switched to a different regimen for AC. Median follow-up was 30mo. Over 50% of patients developed a recurrence. Median TTR was 16mo (range: <1-108mo). Longer median TTR was observed with AC than with surveillance (18 vs 10mo, p=0.06). The risk of relapse was significantly decreased with AC in adjusted multivariable analyses (p=0.01). In subgroup analyses of ypT4b/ypN+ patients (AC: 19; surveillance: 50), those who received AC had significantly longer median TTR (20 vs 9mo; hazard ratio 0.43; 95% confidence interval: 0.21-0.89). No difference in OS was found. Limitations include the retrospective design. CONCLUSIONS: The utilization of AC after NAC in patients with high-risk residual disease is not frequent in clinical practice but might reduce the risk of recurrence.
Further investigation is needed in this high-risk population to identify optimal therapy and to improve clinical outcomes, for example through the ongoing adjuvant immunotherapy trials. PATIENT SUMMARY: We found that administering additional chemotherapy to patients who had significant residual disease despite preoperative chemotherapy is not frequent in clinical practice. While it might reduce the risk of recurrence, it did not clearly increase overall survival. We encourage participation in the ongoing immunotherapy trials to see whether we can improve outcomes using a different type of therapy that stimulates the immune system.
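The abstract above anchors its primary endpoint at 2 months after surgery to minimize immortal time bias: patients must survive event-free long enough to receive adjuvant chemotherapy, so counting that pre-treatment window would artificially favor the AC group. A minimal landmark-analysis sketch of this idea, with a hypothetical patient record layout (the field names and example data are illustrative, not taken from the study):

```python
def landmark_cohort(patients, landmark_mo=2.0):
    """Restrict to patients still event-free and in follow-up at the
    landmark, and re-zero time-to-relapse at the landmark.

    Excluding the pre-landmark window avoids crediting adjuvant
    chemotherapy with the time a patient had to survive, relapse-free,
    in order to receive it (immortal time bias).
    """
    cohort = []
    for p in patients:
        if p["months"] <= landmark_mo:
            continue  # event or censoring before the landmark: excluded
        # shift the clock so time 0 is the landmark, not surgery
        cohort.append({**p, "months": p["months"] - landmark_mo})
    return cohort
```

The study itself used Kaplan-Meier and multivariable Cox models on top of this landmark; the sketch only shows the cohort restriction and clock reset that precede any such modeling.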
Allogeneic bone marrow transplantation (BMT) is curative therapy for patients with severe aplastic anemia (SAA). However, several conditioning regimens can be used for BMT. We evaluated conditioning regimens for BMT in SAA after HLA-matched sibling and unrelated donor transplantation. For recipients of HLA-matched sibling donor transplantation (n = 955), fludarabine (Flu)/cyclophosphamide (Cy)/antithymocyte globulin (ATG) or Cy/ATG led to the best survival. The 5-year probabilities of survival with Flu/Cy/ATG, Cy/ATG, Cy ± Flu, and busulfan/Cy were 91%, 91%, 80%, and 84%, respectively (P = .001). For recipients of 8/8 and 7/8 HLA allele-matched unrelated donor transplantation (n = 409), there were no differences in survival between regimens. The 5-year probabilities of survival with Cy/ATG/total body irradiation 200 cGy, Flu/Cy/ATG/total body irradiation 200 cGy, Flu/Cy/ATG, and Cy/ATG were 77%, 80%, 75%, and 72%, respectively (P = .61). Rabbit-derived ATG compared with equine-derived ATG was associated with a lower risk of grade II to IV acute graft-versus-host disease (GVHD) (hazard ratio [HR], 0.39; P < .001) but not chronic GVHD. Independent of conditioning regimen, survival was lower in patients aged >30 years after HLA-matched sibling (HR, 2.74; P < .001) or unrelated donor (HR, 1.98; P = .001) transplantation. These data support Flu/Cy/ATG and Cy/ATG as optimal regimens for HLA-matched sibling BMT. Although survival after an unrelated donor BMT did not differ between regimens, use of rabbit-derived ATG may be preferred because of lower risks of acute GVHD.
Journal of Virology
Methods Mol Biol
DNA vaccines have been licensed in veterinary medicine and have promise for humans. This format is relatively immunogenic in mice and guinea pigs, the two principal HSV-2 animal models, permitting rapid assessment of vectors, antigens, adjuvants, and delivery systems. Limitations include the relatively poor immunogenicity of naked DNA in humans and the profound differences in HSV-2 pathogenesis between host species. Herein, we detail lessons learned investigating candidate DNA vaccines in the progesterone-primed female mouse vaginal model of HSV-2 infection as a guide to investigators in the field.
J Clin Oncol
PURPOSE: Therapeutic radiation in childhood cancer has decreased over time with a concomitant increase in chemotherapy. Limited data exist on chemotherapy-associated subsequent malignant neoplasm (SMN) risk. PATIENTS AND METHODS: SMNs occurring > 5 years from diagnosis, excluding nonmelanoma skin cancers, were evaluated in survivors diagnosed when they were < 21 years old, from 1970 to 1999 in the Childhood Cancer Survivor Study (median age at diagnosis, 7.0 years; median age at last follow-up, 31.8 years). Thirty-year SMN cumulative incidence and standardized incidence ratios (SIRs) were estimated by treatment: chemotherapy only (n = 7,448), chemotherapy plus radiation (n = 10,485), radiation only (n = 2,063), or neither (n = 2,158). Multivariable models were used to assess chemotherapy-associated SMN risk, including dose-response relationships. RESULTS: Of 1,498 SMNs among 1,344 survivors, 229 occurred among 206 survivors treated with chemotherapy only. Thirty-year SMN cumulative incidence was 3.9%, 9.0%, 10.8%, and 3.4% for the chemotherapy-only, chemotherapy plus radiation, radiation-only, and neither-treatment groups, respectively. Chemotherapy-only survivors had a 2.8-fold increased SMN risk compared with the general population (95% CI, 2.5 to 3.2), with SIRs increased for subsequent leukemia/lymphoma (1.9; 95% CI, 1.3 to 2.7), breast cancer (4.6; 95% CI, 3.5 to 6.0), soft-tissue sarcoma (3.4; 95% CI, 1.9 to 5.7), thyroid cancer (3.8; 95% CI, 2.7 to 5.1), and melanoma (2.3; 95% CI, 1.5 to 3.5). SMN rate was associated with > 750 mg/m2 platinum (relative rate [RR], 2.7; 95% CI, 1.1 to 6.5), and a dose response was observed between alkylating agents and SMN rate (RR, 1.2/5,000 mg/m2; 95% CI, 1.1 to 1.3). A linear dose response was also demonstrated between anthracyclines and breast cancer rate (RR, 1.3/100 mg/m2; 95% CI, 1.2 to 1.6).
CONCLUSION: Childhood cancer survivors treated with chemotherapy only, particularly higher cumulative doses of platinum and alkylating agents, face increased SMN risk. Linear dose responses were seen between alkylating agents and SMN rates and between anthracyclines and breast cancer rates. Limiting cumulative doses and consideration of alternate chemotherapies may reduce SMN risk.
Clin Infect Dis
BACKGROUND: Patients with reported beta-lactam antibiotic allergies (BLA) are more likely to receive broad-spectrum antibiotics and experience adverse outcomes. Data describing antibiotic allergies among solid organ transplant (SOT) and hematopoietic cell transplant (HCT) recipients are limited. METHODS: We reviewed records of adult SOT or allogeneic HCT recipients from 1/1/2013-12/31/2017 to characterize reported antibiotic allergies at the time of transplant. Inpatient antibiotic use was examined for 100 days post-transplant. Incidence rate ratios (IRR) comparing antibiotic use in BLA and non-BLA groups were calculated using multivariable negative binomial models for two metrics: days of therapy (DOT)/1000 inpatient days and percentage of antibiotic exposure days. RESULTS: Among 2153 SOT (65%) and HCT (35%) recipients, 634 (29%) reported any antibiotic allergy and 347 (16%) reported BLA. Inpatient antibiotics were administered to 2020 (94%) patients during the first 100 days post-transplant; average antibiotic exposure was 41% of inpatient days (interquartile range [IQR], 16.7%-62.5%). BLA patients had significantly higher DOT for vancomycin (IRR 1.4; 95% confidence interval (CI) [1.2, 1.7]; p<0.001), clindamycin (IRR 7.6; 95% CI [2.2, 32.4]; p=0.001), aztreonam in HCT (IRR 9.7; 95% CI [3.3, 35.0]; p<0.001), and fluoroquinolones in SOT (IRR 2.9; 95% CI [2.1, 4.0]; p<0.001); these findings were consistent when using percentage of antibiotic exposure days. CONCLUSIONS: Transplant recipients are frequently exposed to antibiotics and have a high prevalence of reported antibiotic allergies. Reported BLA was associated with greater use of beta-lactam antibiotic alternatives. Pre-transplant antibiotic allergy evaluation may optimize antibiotic use in this population.
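The incidence rate ratios above came from multivariable negative binomial models; the underlying quantity, however, is the ratio of two event rates. A minimal sketch of the crude (unadjusted) IRR with a standard Wald confidence interval on the log scale, using hypothetical counts rather than the study's data:

```python
import math

def crude_irr(events_a, persontime_a, events_b, persontime_b, z=1.96):
    """Crude incidence rate ratio with a Wald 95% CI.

    The log of the IRR is approximately normal with standard error
    sqrt(1/events_a + 1/events_b), so the CI is built on the log
    scale and exponentiated back.
    """
    irr = (events_a / persontime_a) / (events_b / persontime_b)
    se_log = math.sqrt(1 / events_a + 1 / events_b)
    lower = math.exp(math.log(irr) - z * se_log)
    upper = math.exp(math.log(irr) + z * se_log)
    return irr, lower, upper

# Hypothetical: 280 vancomycin days of therapy per 1000 inpatient days
# in the BLA group vs 200 per 1000 in the non-BLA group.
irr, lower, upper = crude_irr(280, 1000, 200, 1000)
```

The study's reported IRRs additionally adjust for covariates via negative binomial regression, which this crude calculation does not capture.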
J Infect Dis
BACKGROUND: Virus infections result in a range of clinical outcomes for the host, from asymptomatic to severe or even lethal disease. Despite global efforts to prevent and treat virus infections to limit morbidity and mortality, the continued emergence and re-emergence of new outbreaks as well as common infections such as influenza persist as a health threat. Challenges to the prevention of severe disease after virus infection include a paucity of protective vaccines as well as the difficulty of early identification of the individuals at highest risk who may require supportive treatment. METHODS: We screened mice from the Collaborative Cross (CC) infected with influenza virus, SARS-coronavirus, and West Nile virus. RESULTS: CC mice exhibited a range of disease manifestations upon infection, and we used this natural variation to distinguish strains with mortality following infection from strains exhibiting no mortality. We then used comprehensive pre-infection immunophenotyping to identify global baseline immune correlates of protection from virus-induced mortality. CONCLUSIONS: These data suggest that immune phenotypes might be leveraged to identify humans at highest risk of adverse clinical outcomes upon infection, who may benefit most from intensive clinical interventions, in addition to providing insight for rational vaccine design.
Clin Infect Dis
BACKGROUND: Chemoprophylaxis vaccination with sporozoites (CVac) with chloroquine induces protection against homologous P. falciparum sporozoite (PfSPZ) challenge, but whether blood-stage parasite exposure is required for protection remains unclear. Chloroquine suppresses and clears blood-stage parasitemia, while other antimalarial drugs such as primaquine act against liver-stage parasites. Here, we evaluate CVac regimens using chloroquine or primaquine as the partner drug to discern whether blood-stage parasite exposure impacts protection against homologous controlled human malaria infection. METHODS: In a phase 1, randomized, partial double-blind, placebo-controlled study of 36 malaria-naïve adults, all CVac subjects received chloroquine prophylaxis and bites from 12-15 P. falciparum-infected mosquitoes (CVac-chloroquine arm) at 3 monthly iterations, and some received post-exposure primaquine (CVac-primaquine/chloroquine arm). Drug control subjects received primaquine, chloroquine, and uninfected mosquito bites. After chloroquine washout, subjects, including treatment-naïve infectivity controls, underwent homologous PfSPZ controlled human malaria infection and were monitored for parasitemia for 21 days. RESULTS: No serious adverse events occurred. During CVac, all but one subject in the study remained blood smear-negative, while only one subject (primaquine/chloroquine arm) remained PCR-negative. Upon challenge, compared to infectivity controls, 3/3 chloroquine arm subjects displayed delayed patent parasitemia (p=0.01) but not sterile protection, while 3/11 primaquine/chloroquine subjects remained blood smear-negative. CONCLUSIONS: CVac-primaquine/chloroquine is safe and induces sterile immunity to P. falciparum in some recipients, but a single 45 mg dose of primaquine post-exposure does not completely prevent blood-stage parasitemia. Unlike previous studies, CVac-chloroquine did not produce sterile immunity.
CLINICAL TRIALS REGISTRATION: ClinicalTrials.gov identifier NCT01500980.
Int J Cancer
In the Women's Health Initiative (WHI) Life and Longevity After Cancer (LILAC) cohort we examined predictors of guideline-concordant treatment among endometrial cancer (EC) survivors and associations between receipt of guideline-concordant treatment and survival. Receipt of guideline-concordant EC treatment was defined according to year-specific National Comprehensive Cancer Network (NCCN) guidelines. Multivariable logistic regression was used to estimate odds ratios (ORs) and 95% confidence intervals (CIs) for predictors of guideline-concordant treatment receipt. We estimated multivariable-adjusted hazard ratios (HRs) and 95% CIs for relationships between guideline-concordant treatment and overall survival using Cox proportional hazards regression. We included 629 women with EC, of whom 83.6% (n=526) received guideline-concordant treatment. Receipt of guideline-concordant treatment was less common among women with non-endometrioid histology (OR=0.24, 95% CI=0.13-0.45) but was more common among women living in the Midwest (OR=2.09, 95% CI=1.06-4.12) or West (OR=3.02, 95% CI=1.49-6.13) compared to the Northeast. In Cox regression models adjusted for age, histology, and stage, receipt of guideline-concordant EC treatment was borderline associated with improved overall survival (HR=0.80, 95% CI=0.60-1.01) in the overall population. Guideline-concordant treatment was also linked with better overall survival among women with low-grade uterine-confined endometrioid EC or widely metastatic endometrioid EC. Guideline-concordant treatment varies by some patient characteristics, and women who received guideline-concordant care had borderline improved survival. Studies evaluating regional differences in treatment, along with randomized clinical trials to determine appropriate treatment regimens for women with aggressive tumor characteristics, are warranted.
J Acquir Immune Defic Syndr
BACKGROUND: Expanded access to HIV antiretrovirals has dramatically reduced mother-to-child transmission of HIV. However, there is increasing concern around false-positive HIV test results in perinatally HIV-exposed infants but few insights into the use of an indeterminate range to improve infant HIV diagnosis. METHODS: A systematic review and meta-analysis was conducted to evaluate the use of an indeterminate range for HIV early infant diagnosis. Published and unpublished studies from 2000 to 2018 were included. Study quality was evaluated using GRADE and QUADAS-2 criteria. A random-effects model compared various indeterminate ranges for identifying true and false positives. RESULTS: The review identified 32 studies with data from over 1.3 million infants across 14 countries published from 2000 to 2018. Indeterminate results accounted for 16.5% of initial non-negative test results, and 76% of indeterminate results were negative on repeat testing. Most results were from Roche tests. In the random-effects model, an indeterminate range using a polymerase chain reaction cycle threshold value of 33 captured over 93% of false positives while classifying fewer than 9% of true positives as indeterminate. CONCLUSIONS: Without the use of an indeterminate range, over 10% of infants could be incorrectly diagnosed as HIV positive if their initial test results are not confirmed. Use of an indeterminate range appears to lead to substantial improvements in the accuracy of early infant diagnosis testing and supports current recommendations to confirm all initial positive tests.
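The indeterminate range described above works by flagging detectable but late-amplifying PCR results (high cycle threshold, i.e. low target quantity) for repeat testing instead of reporting them as positive. A minimal triage sketch of that logic, assuming a cutoff of Ct 33 per the abstract and that results at or above the cutoff are flagged; the function name and return labels are illustrative, not from any assay's actual software:

```python
def classify_eid_result(detected, ct_value, indeterminate_ct=33.0):
    """Triage an initial early-infant-diagnosis PCR result.

    Higher Ct means later amplification and lower target quantity, so
    detectable results with Ct >= indeterminate_ct are flagged for
    repeat testing rather than reported as positive outright.
    """
    if not detected:
        return "negative"
    if ct_value >= indeterminate_ct:
        return "indeterminate: repeat test"
    return "positive: confirm per guidelines"
```

Note that even clearly positive initial results are routed to confirmation, matching the recommendation to confirm all initial positive tests.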