Abstract
Background. In a global, phase III, open‐label, noninferiority trial (REFLECT), lenvatinib demonstrated noninferiority to sorafenib in overall survival and a statistically significant increase in progression‐free survival in patients with unresectable hepatocellular carcinoma (HCC). Recently, lenvatinib became the first agent in more than 10 years to receive approval as first‐line therapy for unresectable HCC, alongside the previously approved sorafenib. The objective of this study was to determine the comparative cost‐effectiveness of lenvatinib and sorafenib as first‐line therapy for unresectable HCC.
Materials and Methods. A state‐transition model of unresectable HCC was developed in the form of a cost–utility analysis. The model time horizon was 5 years; the efficacy inputs of the model were informed by the REFLECT trial, and costs and utilities were obtained from published literature. Probabilistic sensitivity analyses and subgroup analyses were performed to test the robustness of the model.
Results. Lenvatinib dominated sorafenib in the base case analysis. A probabilistic sensitivity analysis indicated that lenvatinib remained cost saving in 64.87% of the simulations. However, if the cost of sorafenib were reduced by 57%, lenvatinib would no longer be the dominant strategy.
Conclusion. Lenvatinib offered similar clinical effectiveness at a lower cost than sorafenib, suggesting that lenvatinib would be a cost‐saving alternative in treating unresectable HCC. However, lenvatinib may fail to remain cost saving if a significantly cheaper generic sorafenib becomes available.
Implications for Practice. This analysis suggests an actionable clinical policy that will achieve cost savings. This cost–utility analysis showed that lenvatinib had similar clinical effectiveness at a lower cost than sorafenib, indicating that lenvatinib may be a cost‐saving measure in patients with unresectable HCC, in whom $23,719 could be saved per patient.
The introduction of a new therapeutic option for the first time in 10 years in Canada provides an important opportunity for clinicians, researchers, and health care decision‐makers to explore potential modifications in recommendations and practice guidelines.
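The dominance logic used in cost–utility analyses like the one above can be sketched in a few lines. This is a hedged illustration only: the cost and QALY figures below are hypothetical placeholders, not the REFLECT-based model inputs, which the abstract does not report.

```python
# Minimal sketch of cost-utility dominance, assuming hypothetical inputs.
# A strategy "dominates" when it is no more costly and no less effective.

def icer(cost_new, qaly_new, cost_old, qaly_old):
    """Classify the new strategy against the comparator, or return the ICER."""
    d_cost = cost_new - cost_old
    d_qaly = qaly_new - qaly_old
    if d_cost <= 0 and d_qaly >= 0:
        return "dominant"          # cheaper and at least as effective
    if d_cost >= 0 and d_qaly <= 0:
        return "dominated"         # costlier and no more effective
    return d_cost / d_qaly         # incremental cost per QALY gained

# Hypothetical numbers: a new drug that is cheaper with similar effectiveness
# is dominant, mirroring the base-case finding reported above.
print(icer(cost_new=80_000, qaly_new=1.05, cost_old=103_719, qaly_old=1.02))
```

If the comparator's price falls far enough that `d_cost` turns positive, the verdict switches from "dominant" to a finite ICER, which is the mechanism behind the 57% price-reduction threshold described in the abstract.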
Abstract
Background. Current literature is inconsistent regarding the associations between computed tomography (CT)‐based body composition measures and adverse outcomes in older patients with colorectal cancer (CRC). Moreover, the associations with consecutive treatment modalities have not been studied. This study compared the associations of CT‐based body composition measures with surgery‐ and chemotherapy‐related complications and survival in older patients with CRC.
Materials and Methods. A retrospective single‐center cohort study was conducted in patients with CRC aged ≥65 years who underwent elective surgery between 2010 and 2014. Gender‐specific standardized scores of preoperative CT‐based skeletal muscle (SM), muscle density, intermuscular adipose tissue (IMAT), visceral adipose tissue (VAT), subcutaneous adipose tissue, IMAT percentage, SM/VAT, and body mass index (BMI) were tested for their associations with severe postoperative complications, prolonged length of stay (LOS), readmission, and dose‐limiting toxicity using logistic regression, and with 1‐year and long‐term survival (range 3.7–6.6 years) using Cox regression. Bonferroni correction was applied to account for multiple testing.
Results. The study population consisted of 378 patients with CRC with a median age of 73.4 (interquartile range 69.5–78.4) years. Severe postoperative complications occurred in 13.0% of patients, and 39.4% died during follow‐up. Dose‐limiting toxicity occurred in 77.4% of patients receiving chemotherapy (n = 53). SM, muscle density, VAT, SM/VAT, and BMI were associated with surgery‐related complications, and muscle density, IMAT, IMAT percentage, and SM/VAT were associated with long‐term survival. After Bonferroni correction, no CT‐based body composition measure was significantly associated with adverse outcomes. Higher BMI was associated with prolonged LOS.
Conclusion. The associations between CT‐based body composition measures and adverse outcomes of consecutive treatment modalities in older patients with CRC were not consistent or statistically significant.
Implications for Practice. Computed tomography (CT)‐based body composition measures, including muscle mass, muscle density, and intermuscular, visceral, and subcutaneous adipose tissue, showed inconsistent and nonsignificant associations with surgery‐related complications, dose‐limiting toxicity, and overall survival in older adults with colorectal cancer. This study underscores the need to verify whether CT‐based body composition measures are worth implementing in clinical practice.
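The Bonferroni correction applied in the study above is simple to state: with m tests, each p-value is judged against alpha/m instead of alpha. A minimal sketch, using illustrative p-values rather than the study's actual results:

```python
# Hedged sketch of a Bonferroni correction. The p-values are invented
# purely to show the mechanism; they are not data from the study.

def bonferroni_significant(p_values, alpha=0.05):
    """Return a flag per test: significant after Bonferroni correction?"""
    m = len(p_values)
    threshold = alpha / m  # each test held to alpha / (number of tests)
    return [p < threshold for p in p_values]

# e.g. eight body-composition measures tested against one outcome:
p_vals = [0.012, 0.048, 0.003, 0.20, 0.11, 0.007, 0.35, 0.09]
print(bonferroni_significant(p_vals))
# With m = 8 the threshold is 0.00625, so only p = 0.003 survives.
```

This illustrates how associations that are nominally significant at p < .05 (such as 0.012 or 0.048 here) can become nonsignificant after correction, the pattern reported in the Results.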
Abstract
Background. Malnutrition and physical inactivity are common in patients with advanced cancer and are associated with poor outcomes. There are increasing data that altered body composition is related to the pharmacokinetic properties of cancer therapies. These adverse conditions may impact outcomes in early‐phase oncology clinical trials.
Materials and Methods. We aimed to understand the relationships of baseline nutrition and exercise status with important trial endpoints, including treatment‐related toxicity and survival. Baseline assessments of nutrition and exercise status were conducted in patients prior to initiation of phase I and II oncology clinical trials. Patients were followed prospectively for the onset of adverse events. Tumor response and survival data were also obtained. Fisher's exact test and chi‐square analysis were used to determine statistical significance. Kaplan‐Meier curves were used to compare patients' duration on study and survival.
Results. One hundred patients were recruited, of whom 87 were initiating a phase I trial. Sixty percent were initiating trials studying immunotherapeutic agents. Critical malnutrition was found in 39% of patients, and 52% were sedentary. Patients who were malnourished had significantly increased rates of grade ≥3 toxicity (p = .001) and hospitalizations (p = .001) and an inferior disease control rate (p = .019). Six‐month overall survival was significantly reduced in malnourished versus nonmalnourished patients (47% vs. 84%; p = .0003), as was median duration on study (48 days vs. 105 days; p = .047). Being sedentary at baseline was associated with decreased duration on study (57 days vs. 105 days; p = .019).
Conclusion. Malnutrition and a sedentary lifestyle are highly prevalent in patients enrolling in early‐phase oncology clinical trials and are associated with poor outcomes. The quality of data from these studies may be compromised as a result of these pre‐existing conditions.
Implications for Practice. Phase I and II trials are critical steps in the development of effective cancer therapeutics, yet only a small percentage of agents are ultimately approved for human cancer care. Despite increasing awareness of the interactions between malnutrition, sarcopenia, and treatment‐related outcomes such as toxicity and response, these factors are not commonly incorporated into therapeutic decision making at the time of clinical trial consideration. Nutritional status and physical performance may be key biomarkers of mechanisms mediating treatment‐related toxicity, dose modifications, risk of hospitalizations, and success of novel agents. This study suggests that a baseline nutritional assessment and early nutritional support may improve tolerability of and response to experimental therapies.
Abstract
There are currently seven approved immune checkpoint inhibitors (ICIs) for the treatment of various cancers. These drugs are associated with profound, durable responses in a subset of patients with advanced cancers. Unfortunately, in addition to individuals whose tumors show resistance, there is a minority subgroup treated with ICIs who demonstrate a paradoxical acceleration in the rate of growth of their tumors—hyperprogressive disease. Hyperprogressive disease is associated with significantly worse outcomes in these patients. This phenomenon, though still a matter of dispute, has been recognized by multiple groups of investigators across the globe and in diverse types of cancers. There are not yet consensus standardized criteria for defining hyperprogressive disease, but most commonly a time to treatment failure of less than 2 months and an increase in pace of progression of at least twofold between pre‐immunotherapy and on‐treatment imaging have been used. In some patients, the change in rate of progression can be especially dramatic—up to 35‐ to 40‐fold. MDM2 amplification and EGFR mutations have been suggested as genomic correlates of increased risk of hyperprogression, but these correlates require validation. The underlying mechanism for hyperprogression is not known but warrants urgent investigation.
Abstract
Objective. TAS‐102 is effective for treating patients with metastatic colorectal cancer (mCRC). This study determined whether combining bevacizumab (Bmab) with TAS‐102 improves clinical outcomes in refractory mCRC.
Patients and Methods. We retrospectively analyzed data from Japanese patients with refractory mCRC who received TAS‐102 (35 mg/m², twice a day) with (T‐B group) or without Bmab (TAS‐102 monotherapy; T group) between July 2014 and December 2018. The primary endpoint was median overall survival (OS), and secondary endpoints were median time to treatment failure, overall response rate, and the incidence of adverse events. Clinical outcomes were compared using propensity score–matched analysis.
Results. Data from 57 patients were analyzed (T‐B group: 21 patients; T group: 36 patients). Median OS was significantly longer in the T‐B group than in the T group (14.4 months vs. 4.5 months, p < .001). Cox proportional hazards analysis showed that combination therapy with Bmab was significantly correlated with OS. Propensity score–matched analysis confirmed that median OS was significantly longer in the T‐B group than in the T group (14.4 months vs. 6.1 months, p = .006) and that there was a significant correlation between Bmab and OS. The incidence of hypertension (grade ≥2) as an adverse event was significantly higher in the T‐B group than in the T group (23.8% vs. 0.0%, p = .005), whereas other adverse events were comparable between the two groups.
Conclusion. Treatment with Bmab in combination with TAS‐102 is significantly associated with improved clinical outcomes in patients with mCRC refractory to standard therapies.
Implications for Practice. Combining bevacizumab (Bmab) with TAS‐102 significantly improved overall survival and several prognostic indicators in patients with metastatic colorectal cancer (mCRC) refractory to standard therapies, with manageable toxicities.
Treatment with Bmab in combination with TAS‐102 is significantly associated with improved clinical outcomes in patients with mCRC.
Abstract
Background. Human epidermal growth factor receptor 2 (HER2)‐mutant lung cancer remains an orphan of specific targeted therapy. The variable responses to anti‐HER2 therapies in these patients prompted us to examine the impact of HER2 variants and co‐mutations on responses to anti‐HER2 treatments in lung cancer.
Patients and Methods. Patients with stage IV/recurrent HER2‐mutant lung cancers identified through next‐generation sequencing were recruited from seven hospitals. The study comprised cohort A, to establish the patterns of HER2 variants and co‐mutations in lung cancer, and cohort B, to assess associations between HER2 variants, co‐mutations, and clinical outcomes.
Results. The study included 118 patients (cohort A, n = 86; cohort B, n = 32). Thirty‐one HER2 variants and 35 co‐mutations were detected. Predominant variants were A775_G776insYVMA (49/118, 42%), G778_P780dup (11/118, 9%), and G776delinsVC (9/118, 8%). TP53 was the most common co‐mutation (61/118, 52%). In cohort B, objective response rates with afatinib were 0% (0/14; 95% confidence interval [CI], 0%–26.8%), 40% (4/10; 14.7%–72.6%), and 13% (1/8; 0.7%–53.3%) in group 1 (A775_G776insYVMA, n = 14), group 2 (G778_P780dup and G776delinsVC, n = 10), and group 3 (missense mutations, n = 8), respectively (p = .018). Median progression‐free survival in group 1 (1.2 months; 95% CI, 0–2.4) was shorter than in group 2 (7.6 months; 4.9–10.4; hazard ratio [HR], 0.009; 95% CI, 0.001–0.079; p < .001) and group 3 (3.6 months; 2.6–4.5; HR, 0.184; 95% CI, 0.062–0.552; p = .003). TP53 co‐mutations (6.317; 95% CI, 2.180–18.302; p = .001) and PI3K/AKT/mTOR pathway activations (19.422; 95% CI, 4.098–92.039; p < .001) conferred additional resistance to afatinib.
Conclusion. G778_P780dup and G776delinsVC derived the greatest benefit from afatinib among HER2 variants. Co‐mutation patterns were additional response modifiers. Refining the patient population based on patterns of HER2 variants and co‐mutations may help improve the efficacy of anti‐HER2 treatment in lung cancer.
Implications for Practice. Human epidermal growth factor receptor 2 (HER2)‐mutant lung cancers are a group of heterogeneous diseases with up to 31 different variants and 35 concomitant genomic aberrations. Different HER2 variants exhibit divergent sensitivities to anti‐HER2 treatments. Certain variants, G778_P780dup and G776delinsVC, derive sustained clinical benefit from afatinib, whereas the predominant variant, A775_G776insYVMA, is resistant to most anti‐HER2 treatments. TP53 is the most common co‐mutation in HER2‐mutant lung cancers. Co‐mutations in TP53 and the PI3K/AKT/mTOR pathway confer additional resistance to anti‐HER2 treatments in lung cancer. The present data suggest that different HER2 mutations in lung cancer, like those of its sibling epidermal growth factor receptor, should be analyzed independently in future studies.
Abstract
Background. The role of the horizontal growth index of tumor size in survival prediction is still underappreciated in colon cancer because of the identification of the vertical infiltration index reflected by T stage. We sought to reveal the impact of T stage on the prognostic and predictive value of tumor size in colon cancer.
Materials and Methods. Data of patients with stage I–III colon cancer were extracted from the Surveillance, Epidemiology, and End Results Program (SEER) and Fudan University Shanghai Cancer Center (FUSCC) databases. Harrell's concordance index (c‐index) and time‐dependent receiver operating characteristic (ROC) curves were used to analyze the discriminative ability of prognostic factors.
Results. Stratified analyses based on T stage found that increasing T stage significantly weakened the effect of tumor size on death and recurrence risk. In addition, tumor size showed the greatest hazard ratio for cancer‐specific death and relapse in T1 colon cancer. Even more importantly, the discriminative ability of tumor size outperformed that of any other widely accepted prognostic clinical feature in predicting cancer‐specific survival (SEER: c‐index 0.637, area under the ROC curve [AUC] 0.649; FUSCC: c‐index 0.673, AUC 0.686) and disease‐free survival (FUSCC: c‐index 0.645, AUC 0.656) in T1 colon cancer.
Conclusion. Tumor size is a critical clinical factor with considerable prognostic and predictive value for T1 colon cancer, and it should be selectively incorporated into the current staging system to facilitate prediction of death and recurrence risk.
Implications for Practice. To date, no consensus has been reached about the prognostic and predictive value of tumor size in colon cancer. Although tumor size is an independent prognostic factor for patients with colon cancer, its impact on death or recurrence risk decreased notably with increasing T stage. More importantly, the discriminative ability of tumor size outperformed that of any other clinical factor, including N stage, in patients with T1 colon cancer. Therefore, tumor size should be incorporated into current staging systems to facilitate prognosis prediction for patients with T1 colon cancer.
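Harrell's c-index, the discrimination measure reported above, is the fraction of usable patient pairs in which the patient predicted to be at higher risk actually fails earlier. A minimal sketch with made-up survival data (not the SEER or FUSCC data):

```python
# Hedged sketch of Harrell's concordance index for right-censored data.
# All times, event flags, and risk scores below are invented for illustration.

def c_index(times, events, risk_scores):
    """Fraction of usable pairs ordered correctly by the risk score."""
    concordant = tied = usable = 0
    n = len(times)
    for i in range(n):
        for j in range(i + 1, n):
            if times[i] == times[j]:
                continue  # tied times: skipped in this simplified version
            first, second = (i, j) if times[i] < times[j] else (j, i)
            if not events[first]:
                continue  # earlier subject censored: pair not usable
            usable += 1
            if risk_scores[first] > risk_scores[second]:
                concordant += 1  # higher predicted risk failed first
            elif risk_scores[first] == risk_scores[second]:
                tied += 1
    return (concordant + 0.5 * tied) / usable

times  = [5, 8, 12, 3, 9]            # months to event or censoring
events = [1, 1, 0, 1, 1]             # 1 = death/recurrence observed
risk   = [0.9, 0.4, 0.2, 0.8, 0.5]   # e.g. larger tumor size -> higher score
print(c_index(times, events, risk))  # 8 of 10 usable pairs concordant -> 0.8
```

A c-index of 0.5 means no discrimination and 1.0 means perfect ordering, which is the scale on which the reported values of 0.637–0.673 should be read.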
Abstract
Background. Although the predictive value of immune‐related adverse events (irAEs) induced by immune checkpoint inhibitors (ICIs) has been suggested by several studies, their assessments were insufficient because patients were categorized only by the occurrence of irAEs. It has not been elucidated whether irAEs also play a significant role in responders.
Materials and Methods. Between December 2015 and September 2018, 106 patients with advanced non‐small cell lung cancer treated with ICIs were enrolled in our prospective biomarker study. Twenty‐three of these were responders, defined as those with a complete or partial response. We investigated the proportion of irAEs overall and among responders. For responders, progression‐free survival (PFS) and overall survival on ICIs were compared between those with and without irAEs. As an exploratory analysis, we measured 41 proteins from peripheral blood before and after ICI treatment.
Results. The proportion of irAEs was significantly higher in responders than in nonresponders (65.2% vs. 19.3%, p < .01). Among responders, clinical characteristics did not differ by the occurrence of irAEs. However, there was a significant difference in PFS among responders (irAE group 19.1 months vs. non‐irAE group 5.6 months; hazard ratio: 0.30 [95% confidence interval: 0.10–0.85]; p = .02). Of the 41 proteins analyzed, fibroblast growth factor‐2 at baseline and monocyte chemoattractant protein fold change showed significant differences between the groups (p < .04).
Conclusion. Although this is a small‐sample study, irAEs might be a predictive factor of durable efficacy, even in patients who responded to ICIs. Investigation into the significance of irAEs in responders will contribute to the establishment of optimal administration of ICIs.
Implications for Practice. Although the predictive value of immune‐related adverse events (irAEs) induced by immune checkpoint inhibitors (ICIs) has been suggested by several studies, it has not been elucidated whether irAEs also play a significant role in responders. This study showed that more than 60% of responders had irAEs and demonstrated a strong correlation between irAEs and efficacy even in responders. Investigation into the significance of irAEs in responders will contribute to the establishment of optimal administration of ICIs.
Abstract
Background. Ultrasound plays a critical role in evaluating thyroid nodules. We compared the performance of the two most popular ultrasound malignancy risk stratification systems, the 2015 American Thyroid Association (ATA) guidelines and the American College of Radiology Thyroid Imaging Reporting and Data System (ACR TI‐RADS).
Materials and Methods. We retrospectively identified 250 thyroid nodules that were surgically removed from 137 patients. Their ultrasound images were independently rated using both ATA and ACR TI‐RADS by six raters with expertise in ultrasound interpretation. For each system, we generated a receiver operating characteristic curve and calculated the area under the curve (AUC).
Results. Sixty‐five (26%) nodules were malignant. There was "fair agreement" among raters for both ATA and ACR TI‐RADS. Our observed malignancy risks for ATA and ACR TI‐RADS categories were similar to expected risk thresholds, with a few notable exceptions, including the intermediate ATA risk category and the three highest risk categories for ACR TI‐RADS. Biopsy of 226 of the 250 nodules would be indicated by ATA guidelines based on nodule size and mean ATA rating; 146 nodules would be biopsied based on ACR TI‐RADS. The sensitivity, specificity, and negative and positive predictive values were 92%, 10%, 79%, and 27%, respectively, for ATA and 74%, 47%, 84%, and 33%, respectively, for ACR TI‐RADS. The AUC was 0.734 for ATA and 0.718 for ACR TI‐RADS.
Conclusion. Although both systems demonstrated good diagnostic performance, ATA guidelines resulted in a greater number of thyroid biopsies and exhibited more consistent malignancy risk prediction for higher risk categories.
Implications for Practice. With the rising incidence of thyroid nodules, accurate detection of malignancy is important to avoid the overtreatment of benign nodules.
Ultrasonography is one of the key tools for the evaluation of thyroid nodules, although the use of many different ultrasound risk stratification systems is a hindrance to clinical collaboration in everyday practice and the comparison of data in research. The first step toward the development of a universal thyroid nodule ultrasound malignancy risk stratification system is to better understand the strengths and weaknesses of the current systems in use.
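The four accuracy figures quoted above all derive from a single 2×2 table of biopsy recommendation versus surgical pathology. A hedged sketch of that arithmetic; the counts below are illustrative placeholders chosen to roughly reproduce the reported ACR TI-RADS percentages (65 malignant and 185 benign nodules, 146 biopsied), not values taken from the study's tables:

```python
# Minimal sketch: sensitivity, specificity, PPV, and NPV from a 2x2 table.
# tp/fp/fn/tn counts are assumed/illustrative, not the study's raw data.

def diagnostic_metrics(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),  # malignant nodules flagged for biopsy
        "specificity": tn / (tn + fp),  # benign nodules spared biopsy
        "ppv": tp / (tp + fp),          # flagged nodules that were malignant
        "npv": tn / (tn + fn),          # unflagged nodules that were benign
    }

m = diagnostic_metrics(tp=48, fp=98, fn=17, tn=87)
print({k: round(v, 2) for k, v in m.items()})
```

Note the trade-off this makes explicit: a system that recommends more biopsies (like ATA here) gains sensitivity at the cost of specificity, because more benign nodules end up in the flagged column.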
Abstract
Background. Pancreatic ductal adenocarcinoma (PDAC) remains resistant to chemotherapy and immunotherapy individually because of its desmoplastic stroma and immunosuppressive tumor microenvironment. Synergizing cytotoxic T‐lymphocyte–associated antigen 4 (CTLA‐4) immune checkpoint blockade with chemotherapy could overcome these barriers. Here we present results of a phase Ib trial combining ipilimumab and gemcitabine in advanced PDAC.
Materials and Methods. This was a single‐institution study with a 3 + 3 dose‐escalation design. The primary objective was to determine the maximum tolerated dose (MTD). Secondary objectives included determining the toxicity profile, objective response rate (ORR), median progression‐free survival (PFS), and overall survival (OS).
Results. Twenty‐one patients were enrolled, 13 during dose escalation and 8 at the MTD. The median age was 66 years, 62% were female, 95% had stage IV disease, and 67% had received at least one prior line of therapy. The primary objective of establishing the MTD was achieved at doses of ipilimumab 3 mg/kg and gemcitabine 1,000 mg/m². The most common grade 3 or 4 adverse events were anemia (48%), leukopenia (48%), and neutropenia (43%). The ORR was 14% (3/21), and seven patients had stable disease. Median response duration for the three responders was 11 months, with one response lasting 19.8 months. Median PFS was 2.78 months (95% confidence interval [CI], 1.61–4.83 months), and median OS was 6.90 months (95% CI, 2.63–9.57 months).
Conclusion. Gemcitabine plus ipilimumab is a safe and tolerable regimen for PDAC, with a response rate similar to that of gemcitabine alone. As in other immunotherapy trials, responses were relatively durable in this study.
Implications for Practice. Gemcitabine plus ipilimumab is a safe and feasible regimen for treating advanced pancreatic cancer.
Although one patient in this study had a relatively durable response of nearly 20 months, adding ipilimumab to gemcitabine does not appear to be more effective than gemcitabine alone in advanced pancreatic cancer.
Abstract
Background. Erdheim‐Chester disease (ECD) is a rare non‐Langerhans cell histiocytosis. The BRAF inhibitor vemurafenib is approved by the U.S. Food and Drug Administration (FDA) for patients with ECD harboring a BRAF V600E mutation. Successful treatment has also been reported with MEK‐targeted therapies, likely because BRAF mutant–negative patients harbor MEK pathway alterations. In our Rare Tumor Clinic, we noted that these patients have frequent drug‐related toxicity, consistent with previous reports indicating the need to markedly lower doses of interferon‐alpha when that agent is used in these patients.
Patients and Methods. We performed a review of ten patients with ECD seen at the Rare Tumor Clinic at the University of California San Diego who received 16 regimens of targeted BRAF, MEK, or combined therapies.
Results. The median age of the ten patients with ECD was 53 years (range, 29–77); seven were men. The median dose percentage (percent of the FDA‐approved dose) tolerated was 25% (range, 25%–50%). The most common clinically significant adverse effects resulting in dose adjustments of targeted therapies were rash, arthralgias, and uveitis. Renal toxicity and congestive heart failure were each seen in one patient. In spite of these issues, eight of ten patients (80%) achieved a partial remission on therapy.
Discussion. Patients with ECD appear to require substantially reduced doses of BRAF and MEK inhibitors but are responsive to these lower doses.
Abstract
Background. Direct comparisons between Guardant360 (G360) circulating tumor DNA (ctDNA) and FoundationOne (F1) tumor biopsy genomic profiling in metastatic colorectal cancer (mCRC) are limited. We aimed to assess the concordance across overlapping genes tested by both F1 and G360 in patients with mCRC.
Materials and Methods. We retrospectively analyzed 75 patients with mCRC who underwent G360 and F1 testing. We evaluated the concordance among gene mutations tested by both G360 and F1 among three categories of patients (untreated, treated without EGFR inhibitors, and treated with EGFR inhibitors), while considering the clonal and/or subclonal nature of each genomic alteration.
Results. There was a high rate of concordance in APC, TP53, KRAS, NRAS, and BRAF mutations in the treatment‐naive and non–anti‐EGFR‐treated cohorts. There was increased discordance in the anti‐EGFR‐treated patients in three drivers of anti‐EGFR resistance: KRAS, NRAS, and EGFR somatic mutations. Based on the percentage of ctDNA, discordant somatic mutations were mostly subclonal rather than clonal and may have limited clinical significance. Most discordant amplifications noted on G360 had a magnitude below the top decile, occurred in all three cohorts of patients, and were of unknown clinical significance. Serial ctDNA in anti‐EGFR‐treated patients showed the emergence of multiple new alterations affecting the EGFR pathway: EGFR and RAS mutations and MET, RAS, and BRAF amplifications.
Conclusion. The G360 next‐generation sequencing platform may be used as an alternative to F1 to detect targetable somatic alterations in non–anti‐EGFR‐treated mCRC, but larger prospective studies are needed to further validate our findings.
Implications for Practice. Genomic analysis of tissue biopsy is currently the optimal method for identifying DNA genomic alterations to help physicians target specific genes but has many disadvantages that may be mitigated by a circulating tumor DNA (ctDNA) assay.
This study showed a high concordance rate in certain gene mutations in patients who were treatment naive and treated with non–anti‐EGFR therapy prior to ctDNA testing. This suggests that ctDNA genomic analysis may potentially be used as an alternative to tumor biopsy to identify appropriate patients for treatment selection in mCRC, but larger prospective studies are needed to further validate concordance among tissue and ctDNA tumor profiling.
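Per-gene concordance between two assays, as studied above, reduces to simple percent agreement: the fraction of patients in whom both assays either detect or miss the mutation. A hedged sketch; the per-patient calls below are invented for illustration, since the study's patient-level data are not shown here:

```python
# Minimal sketch of per-gene assay concordance as percent agreement.
# The call vectors are hypothetical, not data from the G360/F1 study.

def concordance(calls_a, calls_b):
    """Fraction of patients on whom the two assays agree."""
    assert len(calls_a) == len(calls_b)
    agree = sum(a == b for a, b in zip(calls_a, calls_b))
    return agree / len(calls_a)

# 1 = mutation detected, 0 = not detected; one entry per patient
g360 = [1, 0, 1, 1, 0, 0, 1, 0, 1, 0]
f1   = [1, 0, 1, 0, 0, 0, 1, 0, 1, 1]
print(concordance(g360, f1))  # agree on 8 of 10 patients -> 0.8
```

In practice discordant calls would be further classified by clonality (for example, by the mutant allele's percentage of total ctDNA), as the study did when judging whether discordance was clinically meaningful.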
Abstract
Lessons Learned.
Concurrent ETBX‐011, ETBX‐051, and ETBX‐061 can be safely administered to patients with advanced cancer.
All patients developed CD4+ and/or CD8+ T‐cell responses after vaccination to at least one tumor‐associated antigen (TAA) encoded by the vaccine; 5/6 patients (83%) developed MUC1‐specific T cells, 4/6 (67%) developed CEA‐specific T cells, and 3/6 (50%) developed brachyury‐specific T cells.
The presence of adenovirus 5‐neutralizing antibodies did not prevent the generation of TAA‐specific T cells.
Background. A novel adenovirus‐based vaccine targeting three human tumor‐associated antigens—CEA, MUC1, and brachyury—has demonstrated antitumor cytolytic T‐cell responses in preclinical animal models of cancer.
Methods. This open‐label, phase I trial evaluated concurrent administration of three therapeutic vaccines (ETBX‐011 = CEA, ETBX‐051 = MUC1, and ETBX‐061 = brachyury). All three vaccines used the same modified adenovirus 5 (Ad5) vector backbone and were administered at a single dose level (DL) of 5 × 10¹¹ viral particles (VP) per vector. The vaccine regimen consisting of all three vaccines was given every 3 weeks for three doses and then every 8 weeks for up to 1 year. Clinical and immune responses were evaluated.
Results. Ten patients enrolled on trial (DL1 = 6, with 4 in the DL1 expansion cohort). All treatment‐related adverse events were temporary, self‐limiting, grade 1/2, and included injection site reactions and flu‐like symptoms. Antigen‐specific T cells to MUC1, CEA, and/or brachyury were generated in all patients. There was no evidence of antigenic competition. The administration of the vaccine regimen produced stable disease as the best clinical response.
Conclusion. Concurrent ETBX‐011, ETBX‐051, and ETBX‐061 can be safely administered to patients with advanced cancer. Further studies of the vaccine regimen in combination with other agents, including immune checkpoint blockade, are planned.
Abstract
Background. From 2014 to 2017, the Palliative Medicine Working Group developed and published best practice recommendations for the integration of palliative care in Comprehensive Cancer Centers (CCCs) in Germany. To evaluate the level of implementation of these recommendations in the CCCs, an online survey was performed. Based on the results of this study, strategic tandem partnerships between CCCs should be built in order to foster further local development.
Materials and Methods. The directors of all CCCs were contacted by e‐mail between December 2017 and February 2018. At the time of the survey, 15 CCCs were funded by German Cancer Aid. The level of implementation of the recommendations in individual CCCs was established using a transtheoretical model.
Results. Between December 2017 and February 2018, all 15 contacted directors of the CCCs, or their representatives, took part in the survey. More than two thirds of the CCCs have a palliative care service as well as a day clinic and palliative outpatient clinic. Regional networking and the provision of a palliative care unit were affirmed by all CCCs.
Conclusion. The publication of best practice recommendations was a milestone for the integration of palliative care in the CCCs. The majority of the German CCCs already fulfill essential organizational and structural requirements. There is a particular need for optimization in the provision of a basic qualification for general palliative care and emergency admission personnel.
Implications for Practice. In 2017, the Palliative Medicine Working Group in the network of the German Comprehensive Cancer Centers (CCCs) published the best practice recommendations it had developed for the integration of palliative medicine in CCCs in Germany. In order to evaluate the level of implementation of the recommendations, an online survey of the CCC directors was conducted. The majority of German CCCs fulfill elementary organizational and structural requirements.
However, there is still room for improvement in the provision of a basic qualification for general palliative care and emergency admission personnel.
Abstract
On November 15, 2018, the Committee for Medicinal Products for Human Use (CHMP) recommended the extension of indication for blinatumomab to include the treatment of adults with minimal residual disease (MRD)‐positive B‐cell precursor acute lymphoblastic leukemia (ALL). Blinatumomab was previously authorized to treat relapsed or refractory B‐cell precursor ALL, and the change concerned an extension of use. On March 29, 2018, the U.S. Food and Drug Administration (FDA) granted accelerated approval to blinatumomab to treat both adults and children with B‐cell precursor ALL who are in remission but still have MRD. On July 26, 2018, the CHMP had originally adopted a negative opinion on the extension. The reason for the initial refusal was that although blinatumomab helped to reduce the amount of residual cancer cells in many patients, there was no strong evidence that it led to improved survival. During the re‐examination, the CHMP consulted the scientific advisory group. The CHMP agreed with the expert group's conclusion that, although there was no strong evidence of patients living longer, the available data from the main study (MT103‐203) indicated a good durable response to blinatumomab, with an overall complete MRD response rate in the primary endpoint full analysis set (defined as all subjects with an Ig or T‐cell receptor polymerase chain reaction MRD assay with the minimum required sensitivity of 1 × 10⁻⁴, established at a central laboratory at baseline [n = 113]) of 79.6% (90/113; 95% confidence interval, 71.0–86.6) and a median time to complete MRD response of 29.0 days (range, 5–71). Therefore, the CHMP concluded that the benefits of blinatumomab outweigh its risks and recommended granting the change to the marketing authorization.
The Committee for Orphan Medicinal Products, following reassessment, considered that significant benefit continued to be met and recommended maintaining the orphan designation and thus 10 years' market exclusivity (orphan designation is a legal procedure that allows for the designation of a medicinal substance with therapeutic potential for a rare disease, before its first administration in humans or during its clinical development). The marketing authorization holder for this medicinal product is Amgen Europe B.V.
Implications for Practice. Immunotherapy with blinatumomab has excellent and sustainable results, offering new hope for patients with minimal residual disease‐positive acute lymphoblastic leukemia, a disease with poor prognosis. New recommendations and changes of practice for the treatment of this patient group are detailed.
Abstract
Background. The loss of muscle mass, known as sarcopenia, is a natural process of aging that is associated with adverse health outcomes regardless of age. Because cancer is a disease of aging, interest in sarcopenia and its potential impact across multiple cancer populations has increased significantly. Bioelectrical impedance analysis (BIA) is a guideline-accepted method for sarcopenia detection. This systematic review assesses the literature pertaining to the use of BIA for the detection of sarcopenia in adults with cancer.
Materials and Methods. In this systematic review, a search of the literature for randomized controlled trials and observational studies was conducted using MEDLINE, Cochrane CENTRAL, and EMBASE through July 15, 2019. The study is registered at PROSPERO (CRD 42019130707). For study inclusion, patients had to be aged 18 years or older and diagnosed with a solid or hematological neoplasm, and BIA had to have been used to detect sarcopenia.
Results. The search strategy identified 5,045 articles, of which 24 studies (3,607 patients in total) were selected for inclusion in the review. In five studies, BIA was rated comparable to axial computed tomography (CT) scan, calf circumference, or grip strength for sarcopenia screening. In 14 studies, BIA-identified sarcopenia was associated with adverse clinical outcomes.
Conclusion. BIA is an accurate method for detecting sarcopenia in adults with cancer prior to treatment and is a viable alternative to CT, dual-energy x-ray absorptiometry, and magnetic resonance imaging in oncology clinical practice.
Implications for Practice. Bioelectrical impedance analysis (BIA) is an attractive method for identifying sarcopenic patients in clinical practice because it provides an affordable, noninvasive test that can be completed within a few minutes during a clinic visit. BIA does not require highly skilled personnel, and results are immediately available. This systematic review summarizes the literature pertaining to BIA assessment of sarcopenia in adults with cancer, with a focus on its use in diverse cancer populations.
Abstract
Background. Somatostatin analogs (SSAs) are the mainstay of neuroendocrine tumor (NET) treatment. Biliary stone disease is reported as a common side effect of SSAs, with a frequency ranging from 10% to 63%. Studies of patients treated with SSAs for acromegaly report an increased incidence of biliary stone disease compared with the general population, whereas data on patients with NETs are few. Guidelines are based on weak evidence, resulting in conflicting recommendations. The aim of this study was to evaluate the incidence, complications, and risk factors of biliary stone disease in a large population of SSA-treated patients with NETs.
Materials and Methods. A retrospective analysis of a prospectively collected database was performed. Patients with a diagnosis of NET at seven dedicated centers from 1995 to 2017 were included at the time of SSA initiation.
Results. A total of 754 SSA-treated patients were evaluated. After exclusion of patients with a history of cholecystectomy or with known biliary stone disease, 478 patients were included. Among them, 118 (24.7%) received prophylactic ursodeoxycholic acid (UDCA). During the study period, 129 patients (27.0%) developed biliary stone disease; of these, 36 (27.9%) developed biliary complications. On multivariate analysis, primary gastrointestinal (GI) NET (hazard ratio [HR] 1.76) and related surgery (HR 1.58) were independent risk factors for biliary stone disease.
Conclusion. We report a high incidence of biliary stone disease, particularly with GI-NET or GI surgery. UDCA prophylaxis does not seem to have a protective role. Our data suggest that all patients with a primary GI-NET or undergoing abdominal surgery should be considered for prophylactic cholecystectomy; no conclusion could be drawn on the indication for prophylactic cholecystectomy in patients with primary pancreatic or thoracic NETs for whom abdominal surgery is not planned.
Implications for Practice. The results of this study confirm an increased rate of gallstone development and related complications in patients with neuroendocrine tumors (NETs) treated with somatostatin analogs (SSAs). NETs of the gastrointestinal (GI) tract and related surgery are independent risk factors for the development of biliary stone disease. Therefore, all patients with a primary GI-NET or undergoing abdominal surgery should be considered for prophylactic cholecystectomy. Data on other subgroups are not exhaustive, and management should also take additional clinical features (life expectancy, surgical and anesthesiological risks) into account. Prophylactic treatment with ursodeoxycholic acid does not seem to protect against SSA-related biliary stone disease.
Abstract
Background. Somatic alterations in circulating tumor DNA (ctDNA) may be associated with treatment response or prognosis in prostate cancer (PCa). The goal of this study was to characterize androgen receptor gene (AR) amplifications and mutations detected in ctDNA from patients with PCa and to further understand the somatic genetic heterogeneity of advanced prostate cancer.
Patients and Methods. This study included a heterogeneous group of 892 patients with advanced PCa (predominantly castrate-resistant prostate cancer) with AR alterations detected in ctDNA who underwent next-generation sequencing of 54 to 73 genes via Guardant360 testing (Guardant Health, Inc., Redwood City, CA). The distribution and summary of AR alterations detected, the association of AR alterations with alterations in other genes, and a pathway analysis are reported.
Results. The median absolute plasma copy number of AR amplifications was 3.3 (range, 1.2–165.2). Many patients had multiple AR mutations; a total of 112 unique mutations were identified in AR, including L702H (25%), T878A (14%), H875Y (11%), W742C (8%), W742L (4%), F877L (2%), and T878S (2%). Other gene alterations detected in ctDNA in the Guardant assays included TP53 (50%), MYC (34%), BRAF (32%), PIK3CA (29%), MET (25%), CDK6 (26%), EGFR (24%), FGFR1 (21%), and APC (12%). Many of these non-AR alterations have not been tissue verified in other studies. AR amplification cosegregated with alterations in MYC (p < .001), BRAF (p < .001), PIK3CA (p < .001), MET (p < .001), CDK6 (p < .001), EGFR (p < .001), FGFR1 (p = .391), and more. Alterations in APC were significantly associated with mutations in AR (p < .001).
Conclusion. Several AR alterations and concomitant non-AR alterations that are associated with drug resistance were detected. These findings provide additional insight into the heterogeneity of advanced prostate cancer.
Implications for Practice. The goal was to characterize androgen receptor gene (AR) amplifications and mutations detected in circulating tumor DNA (ctDNA) from patients with prostate cancer in relation to non-AR gene alterations detected in the ctDNA landscape. The study included 892 patients with prostate cancer with AR alterations in ctDNA. AR alterations were significantly associated with other gene alterations detected in ctDNA. The common AR mutations found are linked to resistance to abiraterone, enzalutamide, or bicalutamide. Characterization of the circulating AR landscape and gene alterations provides potential additional insight into the somatic genetic heterogeneity of advanced prostate cancer.