Preventing Skeletal-Related Events in Veterans on Bisphosphonates for Bone Metastases
Purpose: Multiple myeloma and solid tumor metastases can cause bone disease leading to skeletal-related events (SREs) such as bone pain, fractures, and spinal cord compression. Intravenous bisphosphonate therapy, which is indicated in such cases, can lead to osteonecrosis of the jaw and hypocalcemia, further putting patients at risk for SREs. These risks can be mitigated by dental evaluation before bisphosphonate therapy and by calcium and vitamin D supplementation throughout treatment. Our study of veterans treated with bisphosphonates for bone metastases or multiple myeloma aimed to (1) assess rates of screening dental evaluation prior to treatment and (2) measure the effectiveness of calcium and vitamin D supplementation.
Methods: We performed a retrospective chart review at the James J. Peters VAMC of 117 veterans with multiple myeloma or bone metastases who received intravenous bisphosphonate therapy between January 2008 and November 2013. Patients receiving bisphosphonates for other conditions, such as osteoporosis or hypercalcemia, were excluded. We assessed whether patients received dental clearance before intravenous bisphosphonate therapy and supplementation with vitamin D and calcium. Charts were further reviewed for outcomes data on the incidence of osteonecrosis of the jaw and of SREs such as bone pain, pathologic and traumatic fractures, orthopedic surgery, and spinal cord or nerve root compression. These data were analyzed using descriptive statistics (frequencies, means/medians, and proportions). Odds ratios were calculated to compare SRE outcomes between patients who did and did not receive calcium and vitamin D supplementation.
Results: Of the 117 patients included in the study, 97% were male, aged 58 to 92 years. Of these, 55 (47%) had prostate cancer, 21 (17%) had multiple myeloma, and 16 (14%) had lung cancer. All patients receiving bisphosphonates for bone metastases had undergone a dental evaluation prior to starting therapy; none were reported to have osteonecrosis of the jaw. However, only 78% had vitamin D levels checked before therapy; 69% of these were vitamin D deficient and received vitamin D supplementation. Overall, rates of calcium and vitamin D supplementation were low (34% and 41%, respectively). Fifty-four percent of the patients reported an SRE: 49% with bone pain, 13% with pathologic fractures, 7% with traumatic fractures, and 8% with nerve root compression. Vitamin D supplementation significantly reduced the odds of an SRE (OR, 0.37; 95% CI, 0.19-0.74; P < .05).
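The odds ratio and Wald confidence interval above come from a standard 2×2 calculation. The sketch below illustrates the method only; the cell counts are hypothetical, since the abstract does not report the actual 2×2 table.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI for a 2x2 table:
    a = exposed with event, b = exposed without event,
    c = unexposed with event, d = unexposed without event."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts (NOT from the study), chosen only to show the
# arithmetic: 19 of 48 supplemented patients with an SRE vs
# 44 of 69 unsupplemented patients with an SRE.
or_, lo, hi = odds_ratio_ci(19, 29, 44, 25)
print(f"OR = {or_:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```

An OR below 1 with a confidence interval excluding 1, as reported in the abstract, indicates a statistically significant reduction in the odds of an SRE.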
Conclusions: The onset of SREs can be reduced or delayed with bisphosphonates; however, patients need prior screening for osteonecrosis of the jaw and optimized calcium and vitamin D levels. Our study showed that although dental screening for osteonecrosis of the jaw was optimal, supplementation with calcium and vitamin D was lacking in patients on bisphosphonates. In our study, vitamin D supplementation reduced the odds of an SRE by 63%. Hence, adequate prevention with vitamin D supplementation can improve bone health among veterans with multiple myeloma or bone metastases. Data-based policies and practices should be adopted to ensure adequate bone health in this population.
The Use and Beliefs of Complementary and Alternative Medicine Among Veteran Cancer Patients
Purpose: Population studies indicate that > 40% of cancer patients use complementary and alternative medicine (CAM) therapies. Data suggest that most patients are not well informed of their chosen CAM therapy and do not discuss it with their health care providers (HCPs). However, no information exists on the use of CAM among veterans with cancer. We evaluated the type and prevalence of CAM use among VA patients with cancer, the disclosure of its use, and the beliefs veterans hold regarding CAM therapies.
Methods: Cancer patients in the radiation oncology and medical oncology clinic waiting areas of a VA hospital were surveyed by convenience sampling over a 3-month period. A validated hard-copy survey on attitudes toward and beliefs about CAM was used to evaluate patient responses. Relationships between the use of and beliefs about CAM therapies were analyzed using the chi-square test and Spearman correlation.
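The two tests named above can be sketched with standard-library Python. The counts and rating vectors below are hypothetical, used only to show the mechanics: a chi-square test on a 2×2 use-versus-belief table (for 1 degree of freedom, p = erfc(√(χ²/2))), and Spearman correlation computed as the Pearson correlation of ranks.

```python
import math

def chi2_2x2(a, b, c, d):
    """Pearson chi-square test for a 2x2 table; returns (chi2, p)."""
    n = a + b + c + d
    chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    p = math.erfc(math.sqrt(chi2 / 2))  # survival function, 1 df
    return chi2, p

def spearman_rho(x, y):
    """Spearman correlation: Pearson correlation of the ranks
    (assumes no tied values, so simple ordinal ranks suffice)."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mean = (n + 1) / 2
    num = sum((a - mean) * (b - mean) for a, b in zip(rx, ry))
    den = math.sqrt(sum((a - mean) ** 2 for a in rx)
                    * sum((b - mean) ** 2 for b in ry))
    return num / den

# Hypothetical counts: CAM users vs nonusers agreeing with a belief item.
chi2, p = chi2_2x2(20, 9, 60, 107)
# Hypothetical paired Likert-style ratings from two survey items.
rho = spearman_rho([1, 2, 3, 4, 5], [2, 1, 4, 3, 5])
```

In practice, real survey data contain tied ratings, which require average ranks and a tie-corrected chi-square or an exact test for small cells; this sketch omits those refinements.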
Results: Two hundred twenty-two questionnaires were distributed to predominantly male veteran outpatient cancer patients, and 196 patients (97% male) returned completed questionnaires (88.3% response rate). Twenty-nine of the 196 patients (14.8%) reported CAM use. Dietary supplements (55.2%) and herbal supplements (41.4%) were the most commonly used forms of CAM; acupuncture (6.9%) and meditation techniques (6.9%) were the least common. Of the patients reporting CAM use, 89.7% discussed their CAM use with their physicians. Compared with nonusers, CAM users more often believed that using CAM would improve their physical health (P < .0001), boost their immune system (P < .0001), reduce stress (P < .05), reduce symptoms such as pain or fatigue related to cancer and its treatment (P < .05), help them live longer (P < .05), help cure their cancer (P < .05), prevent development of future health problems (P < .001), and help them cope with the experience of having cancer (P < .0001). Neither group felt that CAM would decrease their emotional distress. Users of CAM believed that the use of CAM was encouraged by their family (P < .001) as well as their HCPs (P < .05) and that their HCPs were open to their use of CAM (P < .05). Nonusers of CAM more often believed that CAM treatments were not based on scientific research (P < .0001), that CAM might interfere with conventional cancer treatments (P < .001), that CAM treatments cost too much money (P < .05), that they did not have time to go to CAM treatments (P < .05), and that they did not have adequate knowledge about CAM treatments (P < .05).
Conclusions: Although the prevalence of CAM use among veteran cancer patients is lower than that in the general population, veterans are more likely to report its use to their HCPs. Veterans hold divergent beliefs regarding the potential benefits of CAM therapies. Because both users and nonusers of CAM express a lack of knowledge regarding CAM treatments, veteran patients with cancer may benefit from education regarding the various CAM modalities and their utility.
Breast Cancer Treatment Among Rural and Urban Women at the Veterans Health Administration
Purpose: Women with breast cancer are increasingly being diagnosed and cared for within the VA. Breast cancer specialists are available only at large VA hospitals in urban regions, possibly impacting the outcomes of rural women. The health outcomes of rural women at the VA have not been well described and are currently a research priority. We described the differences between urban and rural women’s demographics and breast cancer characteristics. We then compared urban and rural women with nonmetastatic breast cancer on type of lymph node biopsy, type of breast surgery, adjuvant radiation, adjuvant chemotherapy, and hormone therapy.
Methods: Following IRB approval, 4,025 women with nonmetastatic breast cancer diagnosed from 1995 to 2012 were identified from the Veterans Affairs Central Cancer Registry (VACCR). This dataset contained diagnosis date, histology, tumor size, tumor grade, lymph node status, and estrogen receptor status. The VACCR also recorded type of lymph node surgery, type of breast surgery, adjuvant radiation, adjuvant chemotherapy, and adjuvant hormone therapy. Patient-specific data included date of birth, ethnicity, and zip code of residence at the time of diagnosis. Rural-Urban Commuting Area (RUCA) codes, version 2.0, were used to define rural status, collapsed further into 3 categories: urban, large rural, and small rural. Stata statistical software was used to organize and analyze the data. The associations between the 3 rural/urban categories and diagnosis year, age, ethnicity, histology, and tumor grade were assessed by ordinal logistic regression. Tumor size was compared using the rank sum test. Lymph node and estrogen receptor status were compared with logistic regression, and lymph node sampling methods with multinomial regression. All other treatments were compared between small rural and urban women using logistic regression, with further adjustment for factors that could influence treatment choices, including diagnosis year, age, ethnicity, tumor size and grade, lymph node status, and estrogen receptor status.
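The collapse of RUCA codes into three tiers might look like the sketch below. The cutoffs shown (1-3 urban, 4-6 large rural, 7-10 small rural) are one common three-tier grouping, assumed here for illustration; the abstract does not specify the exact boundaries the authors used.

```python
def ruca_category(code: float) -> str:
    """Collapse a RUCA code into urban / large rural / small rural.
    Cutoffs are an assumed, commonly used grouping, not the study's
    documented scheme."""
    if code <= 3:
        return "urban"
    if code <= 6:
        return "large rural"
    return "small rural"

# Example: classify a few zip-code-level RUCA values.
for code in (1.0, 4.2, 9.0):
    print(code, ruca_category(code))
```

Mapping each patient's zip code to a RUCA code and then to one of these categories yields the three comparison groups used in the regressions.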
Results: Most women (n = 3,192) with nonmetastatic breast cancer resided in urban regions, 423 women in large rural regions, and 410 in small rural regions. The number of women living in urban and rural regions did not shift significantly over time (P = .48). The age distributions of rural and urban women did not differ. Women with breast cancer in rural regions were more likely to be white (P ≤ .001; 69% white urban, 90% white small rural; 24% black urban, 6% black small rural). Tumor histology, size, grade, and lymph node and estrogen receptor status did not differ significantly between rural and urban women. Mastectomy was initially more common among rural women, but after adjustment for patient demographics and breast cancer characteristics, urban and rural women received similar proportions of mastectomies. After adjustment, urban and rural women received equivalent breast cancer surgery, adjuvant radiation, and adjuvant hormone therapy. However, after controlling for confounding factors, a disproportionate number of urban women received no lymph node biopsy (P = .05). Additionally, women from large rural regions were statistically more likely to receive adjuvant chemotherapy (P = .04), although chemotherapy administration did not differ statistically between women from urban and small rural regions (P = .7).
Conclusions: Most women diagnosed with breast cancer at the VA from 1995 to 2012 resided in urban areas. Rural women were much more likely to be white, but age at diagnosis did not differ. Breast cancer characteristics were similar between rural and urban women. Women living in large rural regions were more likely to receive adjuvant chemotherapy than were women from urban or small rural regions; however, reporting differences should be considered as a possible explanation. A higher proportion of urban women received no lymph node biopsy, which merits further investigation. Breast conservation therapy was administered consistently among rural and urban women veterans.
Durable Palliation of Lung Tumors Using Stereotactic Body Radiotherapy
Purpose: Stereotactic body radiotherapy (SBRT) is a safe and effective modality for treatment of early-stage non-small cell lung cancer. We report our single-institution experience using a protracted course of SBRT as palliative treatment for lung tumors.
Methods: Patients with symptomatic lung lesions treated with palliative-intent SBRT were retrospectively reviewed. These patients were not amenable to curative treatment due to previous irradiation, large lesion size, or advanced disease. Patients received 50-52 Gy in 10-12 daily fractions over 2 weeks.
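One conventional way to compare this protracted schedule against other regimens is the biologically effective dose of the linear-quadratic model, BED = n·d·(1 + d/(α/β)). The calculation below is purely illustrative and is not part of the study's analysis; the α/β value of 10 Gy is a commonly assumed figure for tumor tissue.

```python
def bed(n_fractions: int, dose_per_fraction: float, alpha_beta: float) -> float:
    """Biologically effective dose (linear-quadratic model):
    BED = n * d * (1 + d / (alpha/beta))."""
    return n_fractions * dose_per_fraction * (1 + dose_per_fraction / alpha_beta)

# 50 Gy in 10 fractions (5 Gy/fraction), assuming alpha/beta = 10 Gy:
print(bed(10, 5.0, 10.0))  # -> 75.0 (Gy)
```

By this measure, 50 Gy in 10 fractions delivers a higher biologically effective dose than the same physical dose in conventional 2-Gy fractions, which is part of the rationale for hypofractionated palliation.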
Results: Ten patients, 5 male and 5 female, were treated over 3 years: 7 primary lesions, 2 metastatic lesions, and 1 recurrent primary lesion. Patients ranged in age from 41 to 84 years, with a mean age of 72 years. With a median follow-up of 11.5 months, the median overall survival was 18 months. Of 14 symptoms treated, 9 (64%) completely resolved, 2 (14%) partially improved, 2 (14%) did not respond, and 1 (7%) worsened following treatment. The median time to response was 27 days. Symptoms relapsed in 3 cases (21%), with a median time to relapse of 3.7 months. A majority of patients (70%) remained symptom free until last follow-up. None of the patients experienced grade 3 or higher toxicity.
Conclusions: SBRT is a safe, effective, and durable treatment modality for palliating lung tumors that are not suitable for curative treatment.
Effects of Feeding Tube Placement on Weight and Treatment Breaks in Patients With Locally Advanced Head and Neck Cancer Who Undergo Definitive Radiotherapy
Purpose: To evaluate the effects of feeding tube placement on patient weight and length of treatment breaks during definitive radiotherapy delivering at least 50 Gy to the bilateral neck in patients with head and neck (H&N) cancer.
Methods: Thirty-five H&N cancer patients underwent definitive radiotherapy at the Radiation Oncology Department at Michael E. DeBakey VA Medical Center from July 23, 2012, to April 25, 2013. Twenty-three patients received doses of ≥ 50 Gy to the bilateral neck and constituted the study group. The remaining 12 patients did not receive ≥ 50 Gy and were excluded from the study. Among the 23 patients, 11 underwent feeding tube placement (group 1). Group 2 consisted of 12 patients without feeding tubes. All patients with feeding tube placement had concurrent chemotherapy. Some patients in group 2 received radiation treatment only.
Results: Twenty-two patients lost weight; 1 patient gained 6.9 lb during the course of treatment. The median weight loss for group 1 was 17.8 lb (range, 4.4-34.4 lb), compared with 18.6 lb for group 2. Patients in group 2 who received radiation therapy alone had the least median weight loss (5.4 lb). The average treatment break was 3.3 days for group 1, 3.7 days for group 2 patients receiving concurrent chemoradiotherapy (chemoRT), and 3.2 days for group 2 patients receiving radiation therapy alone.
Conclusions: In H&N cancer patients, feeding tube placement did not minimize weight loss and did not reduce average treatment breaks in those given concurrent chemoRT. An interesting additional finding of the study was that chemotherapy seems to have a greater impact on a patient's ability to tolerate radiation therapy. Our findings in this small, retrospective study, though suggestive, are insufficient to draw any definitive conclusions about the effectiveness of prophylactic feeding tube placement in the target patient population. Published studies on this subject are contradictory. Treatment decisions should be based on physician expertise and individualized to the clinical needs of patients.
Comparison of Low-Dose Platinum vs High-Dose Platinum vs Cetuximab and Intensity-Modulated Radiation Therapy in Advanced Head and Neck Cancers
Purpose: High-dose cisplatin is the standard in head and neck cancers. Recently, weekly low-dose platinum, alone or with cetuximab, has been used. All patients received 70-Gy standard-fractionation intensity-modulated radiation therapy. We compare the former regimen, used at the VAMC, with the latter regimens, used in the Department of Oncology at the University of Mississippi. We will present toxicity, response, patterns of failure, and survival by type of chemotherapy.
Blastic Plasmacytoid Dendritic Cell Neoplasm: A Case Successfully Treated With HyperCVAD Followed by Allogeneic Stem Cell Transplantation
Introduction: Blastic plasmacytoid dendritic cell neoplasm (BPDCN) is a rare, clinically aggressive tumor derived from the precursors of plasmacytoid dendritic cells, with a high frequency of cutaneous and bone marrow involvement and leukemic dissemination. The prognosis is poor, and an optimal treatment approach has not been defined. We describe a case of BPDCN that was successfully treated with acute leukemia-type induction (hyperCVAD) followed by allogeneic stem cell transplantation (ASCT).
Purpose: To evaluate the efficacy of hyperCVAD and ASCT in treatment of BPDCN.
Methods: A case report and literature review.
Results: An African American male, aged 49 years, presented with firm violaceous nodules on the left calf, back, and shoulders, accompanied by a 40-lb weight loss. The patient did not experience night sweats, fevers, or chills. Complete blood cell count and serum lactate dehydrogenase were normal. A skin biopsy revealed dense dermal infiltrates of intermediate-sized cells with a high nuclear-to-cytoplasmic (N:C) ratio. The tumor cells were strongly positive for CD4 and CD56; partially positive for TdT; weakly positive for CD45 and CD43; and negative for CD3, CD20, CD30, MPO, CD34, and CD117. Bone marrow biopsy and aspirate smear showed hypercellular marrow with predominant blastic cells with a high N:C ratio, fine chromatin, and prominent nucleoli. Flow cytometric analysis demonstrated 85% blasts that were positive for HLA-DR, CD4, CD56, CD38, and TdT (partial) but negative for CD34, CD117, CD33, CD13, CD14, CD15, CD2, CD3, CD5, CD11c, CD7, CD19, CD10, CD20, CD22, CD24, kappa, lambda, CD25, CD52, and MPO. Cytogenetic analysis reported a complex abnormal karyotype: 46,XY,add(7)(q22),add(8)(p11.2),add(9)(q13),psu dic(13;6)(p12;q16),del(13)(q12q22),-17,+21,+mar[9]. The peripheral blood smear revealed rare blasts. The patient responded well to hyperCVAD chemotherapy followed by ASCT. He has remained disease free for > 5 years.
Conclusions: Aggressive chemotherapy followed by ASCT is a favorable treatment plan for BPDCN.
Liver grafts donated after circulatory death increase early risk of diabetes
SAN FRANCISCO – The type of liver graft used in transplantation plays a large role in early development of new-onset diabetes, according to a retrospective study of 430 patients from the United Kingdom.
A team led by Dr. Hermien Hartog, an honorary clinical fellow in the Liver Unit, Queen Elizabeth Hospital, Birmingham, England, studied patients undergoing primary liver transplant between 2008 and 2012. Patients were excluded from the study if they had preexisting diabetes, had died, or had undergone retransplantation within 90 days.
The investigators assessed both the development of new-onset diabetes after transplant (NODAT), using criteria adapted from a published article (Transplantation 2013;96:58-64), and its resolution, defined as the date of cessation of antihyperglycemic therapy or the last episode of hyperglycemia.
Seventy-nine percent of the patients received grafts donated after brain death (DBD), Dr. Hartog reported at the 2014 World Transplant Congress. Among the recipients of grafts donated after circulatory death (DCD), the mean warm ischemic time was 21 minutes.
With a median follow-up of 2.5 years, the cumulative 1-year incidence of NODAT was 19% in the entire cohort, with a median time to onset of 30 days. In the 44% of affected patients whose NODAT resolved, the median time to resolution was 150 days post transplantation, Dr. Hartog reported at the congress, which was sponsored by the American Society of Transplant Surgeons.
The cumulative 1-year incidence of NODAT was 23% in DCD graft recipients and 18% in DBD graft recipients, a nonsignificant difference. But when patients were stratified by graft type, "we saw an early occurrence and high peak incidence of NODAT in DCD graft recipients. Also, a larger proportion of these patients resolved their NODAT over time," she commented.
The overall temporal pattern suggested that "the effect that we see of graft type seems to be temporary and [lessens] over time when multifactorial factors come into play," according to Dr. Hartog.
In multivariate analyses, the risk of NODAT within 90 days of transplantation was higher for patients who received a DCD graft (hazard ratio, 1.8). More detailed analysis showed that the elevation of risk was greatest within the first 15 days.
"Our study confirms known associations with NODAT after liver transplantation but identifies DCD graft as a novel risk factor. This causes a temporary effect in the early post-transplant period that is independent from known risk factors," Dr. Hartog commented.
"Based on our observations, we hypothesize that hyperglycemia may be related to liver graft function through ischemia-reperfusion–induced hepatic insulin resistance," she added. "We are currently trying to confirm our data in an independent data set, which will also include postreperfusion glucose levels and correlation with the insulin receptor pathway in time-zero liver biopsies."
"The clinical relevance of our findings is as yet unknown," she acknowledged. However, they may help inform new approaches for graft optimization and selection.
Session cochair Dr. Darius Mirza, also of the University of Birmingham, asked, "Why does the pattern of recovery seem to be different in the DCDs versus the DBDs? Also, why are the cumulative incidence and the time frame so different?"
"Actually, in the literature, I have not seen any reports looking at the early post-transplant period. So most reports look at one time point, normally 1 year," Dr. Hartog replied. "What I think is that there is an early peak caused by DCD grafts that would explain why there is an early peak, but also why those patients recover later on. I think this peak is a bit obscure because there are also other factors that come into play, maybe after a while, that will obscure that first peak. If you would take those other factors out of the equation, I think you would just see a peak in the early period."
Dr. Mirza also wondered about the role of using DCD grafts that are accepted under extended criteria. "So you start off using mainly young, fit DCD livers. Now, the vast majority are extended-criteria DCD livers. Do you think that plays a role, or is it too early to say?"
"Yes, I think so," Dr. Hartog said, while adding that this phenomenon is likely not restricted to DCD grafts. "From earlier literature, there is a clear difference between a living donated graft and deceased donation. And it might also be that the extended grafts or the more steatotic grafts may exhibit this effect more than the better grafts."
Dr. Hartog disclosed no conflicts of interest relevant to the research.
AT THE 2014 WORLD TRANSPLANT CONGRESS
Key clinical point: Recipients of liver grafts donated after circulatory death are at a slightly higher risk for post-transplant new-onset diabetes.
Major finding: The risk of new-onset diabetes within 90 days of transplantation was 1.8-fold higher for patients who received a DCD graft than for peers who received a DBD graft.
Data source: A retrospective cohort study of 430 primary liver transplant recipients.
Disclosures: Dr. Hartog disclosed no relevant conflicts of interest.
Survival of Patients With Untreated Early Stage Non-Small Cell Lung Cancer
Methods: A retrospective chart review was conducted of patients diagnosed with stage 1 and 2 NSCLC at the Samuel S. Stratton VAMC in Albany from January 1, 1999, to January 1, 2009. Patients who received no treatment were identified. Recorded data included demographic information (age at diagnosis and gender), stage at presentation, pathology, smoking history, performance status, reason for nontreatment, vital status, cause of death, and time from diagnosis to death.
Results: Of 256 patients diagnosed with early stage NSCLC, 39 received no therapy. All of these patients were male; 95% were smokers, and 35.9% had an ECOG performance status of 3 or 4. Reasons for nontreatment included poor functional status, poor cardiac or pulmonary function, other comorbidities, and refusal of treatment. Mean age at diagnosis was 76.9 ± 8.2 years. Mean survival was 24.36 ± 28.07 months, and the 5-year survival rate was 12.8%.
Conclusions: Untreated early stage NSCLC has a much lower 5-year survival rate than stage-matched resected disease (32%-63%). Newer oral molecularly targeted agents might be an option for patients who are not candidates for standard lobectomy or definitive radiation therapy. Further studies in this area are needed.
Gender Disparity in Breast Cancer: A Veteran Population Based Comparison
Introduction: Male breast cancer (MBC) comprises < 1% of all cancers in men, and its incidence continues to rise. Because of its rarity, the literature on MBC is sparse, and its management is generalized from female breast cancer (FBC). The Veterans Affairs Central Cancer Registry (VACCR) provides a unique source for the study of MBC. The objective of this retrospective analysis was to compare and contrast the characteristics and outcomes of MBC and FBC in the VA population.
Methods: VACCR data from 153 VAMCs were used to identify VA patients with breast cancer diagnosed between 1998 and 2013, using primary site codes for breast cancer (50.0-50.9). Data were entered and analyzed using biostatistical software (SAS 9.3).
Results: In total, 6,443 patient records were reviewed; 1,123 MBC patients were compared with 5,320 FBC patients. The mean age at diagnosis was 70 years for MBC and 57 years for FBC (P < .0001). More MBC diagnoses (95%) than FBC diagnoses (72%) were made in patients aged > 50 years. In both genders, 75% of patients with breast cancer were white. More MBC patients presented with higher stage (3 and 4) disease (40% vs 24%), whereas among FBC patients 21% had ductal carcinoma in situ and 53% had stage 1 disease. The dominant histology was ductal carcinoma, and no difference in laterality was observed. Estrogen and progesterone receptor-positive tumors were more common in MBC than in FBC. Forty-five percent of MBC patients and 36% of FBC patients received hormonal treatment as first course, but fewer MBC patients received chemotherapy and radiation. The mean follow-up was 754 days. As of December 2013, 355 (32%) MBC and 791 (15%) FBC patients had died during the course of the study. Males had higher unadjusted odds of death than females, but when adjusted for age, race, stage, and grade, survival was better among males.
Conclusions: To the authors’ knowledge, this is the largest series of MBC and FBC in the veteran population to date. The results suggest that males were older at presentation and had higher stage breast cancer than females. The higher mortality rate in MBC may be due to higher stage and/or tumor biology.