Evidence Suggests Optimal Intervals for Osteoporosis Screening
Based on available evidence, osteoporosis screening should take place at 15-year intervals for postmenopausal women who have normal bone density or mild osteopenia at their first assessment, at 5-year intervals for those who have moderate osteopenia, and at 1-year intervals for those who have advanced osteopenia, according to a report in the Jan. 19 New England Journal of Medicine.
Screening at shorter intervals is unlikely to improve prediction of the transition to osteoporosis, and thus won’t help clinicians judge when to start osteoporosis therapy so as to avert hip or vertebral fractures, said Dr. Margaret L. Gourlay of the department of family medicine, University of North Carolina, Chapel Hill, and her associates in the Study of Osteoporotic Fractures research group.
Current guidelines do not specify how long to wait between bone mineral density screenings with dual-energy x-ray absorptiometry (DEXA), and no U.S. study to date "has addressed this clinical uncertainty," they noted.
"To determine how the BMD testing interval relates to the timing of the transition from normal [bone mineral density] or osteopenia to the development of osteoporosis before a hip or clinical vertebral fracture occurs, we conducted competing-risk analyses of data from 4,957 women, 67 years of age or older, who did not have osteoporosis at baseline and who were followed longitudinally for up to 15 years in the [Study of Osteoporotic Fractures]," the investigators said.
The appropriate screening interval was defined as the estimated time for 10% of the study subjects in each category of osteopenia severity to make the transition from normal BMD or osteopenia to osteoporosis before fractures occurred and before treatment for osteoporosis was initiated. The three categories of severity were normal BMD/mild osteopenia (T score of greater than -1.50) at the initial assessment, moderate osteopenia (T score of -1.50 to -1.99), and advanced osteopenia (T score of -2.00 to -2.49).
This interval was found to be 15 years for normal BMD/mild osteopenia, 5 years for moderate osteopenia, and 1 year for advanced osteopenia, Dr. Gourlay and her colleagues said (N. Engl. J. Med. 2012;366:225-33).
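As a concrete illustration of that definition, the minimal sketch below (not the investigators' code) estimates a screening interval as the earliest follow-up time at which a group's cumulative incidence of transition to osteoporosis reaches 10%; the function name and the example curve are hypothetical, not data from the Study of Osteoporotic Fractures.

```python
# Minimal sketch of the interval definition used above: the screening interval
# is the earliest time at which 10% of a risk group has transitioned to
# osteoporosis. The times and cumulative incidences below are invented for
# illustration; they are not data from the Study of Osteoporotic Fractures.

def screening_interval(times, cumulative_incidence, threshold=0.10):
    """Return the first follow-up time at which cumulative incidence reaches the threshold."""
    for t, ci in zip(times, cumulative_incidence):
        if ci >= threshold:
            return t
    return None  # threshold never reached during follow-up

# Hypothetical curve for an advanced-osteopenia group
years = [0.5, 1, 2, 3, 5]
incidence = [0.06, 0.11, 0.25, 0.40, 0.62]
print(screening_interval(years, incidence))  # -> 1, i.e., a 1-year interval
```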
"Recent controversy over the harms of excessive screening for other chronic diseases reinforces the importance of developing a rational screening program for osteoporosis that is based on the best available evidence rather than on health care marketing, advocacy, and public beliefs that have encouraged overtesting and overtreatment in the U.S.," they noted.
"Our estimates for BMD testing proved to be robust after adjustment for major clinical risk factors" such as fracture history, smoking status, use of estrogen, and use of glucocorticoids. "However, clinicians may choose to reevaluate patients before our estimated screening intervals if there is evidence of decreased activity or mobility, weight loss, or other risk factors not considered in our analyses," they said.
This study was supported by the National Institutes of Health. No potential conflicts of interest were reported.
FROM THE NEW ENGLAND JOURNAL OF MEDICINE
Major Finding: DEXA screening for osteoporosis should be done at 15-year intervals for women with normal BMD or mild osteopenia (T score of greater than -1.50) at their initial screen, at 5-year intervals for those who have moderate osteopenia (T score of -1.50 to -1.99), and at 1-year intervals for those who have advanced osteopenia (T score of -2.00 to -2.49).
Data Source: Analysis of data from the longitudinal Study of Osteoporotic Fractures involving 4,957 postmenopausal women aged 67 years and older at baseline who were followed for up to 15 years.
Disclosures: This study was supported by the National Institutes of Health. No potential conflicts of interest were reported.
DXA Reimbursement Slated to Plummet March 1
Medicare payments for the use of dual-energy x-ray absorptiometry are set to drop on March 1 unless Congress acts to extend the current payment rates for the screening procedure.
Physicians performing dual-energy x-ray absorptiometry (DXA) testing currently are reimbursed at around $100 per test on average, but those payments are scheduled to drop to about $50 without congressional intervention. There is also concern that the steep cut could force more office-based physicians to stop offering bone density screening, thereby limiting access for patients.
"We will definitely be changing the way that we do business when it comes to DXA," said Dr. Christopher R. Shuhart, a family physician and medical director for bone health and osteoporosis at Swedish Medical Group in Seattle.
The large multispecialty group is considering a range of options, including whether to limit the number of sites where it offers DXA testing, in part because of the potential Medicare fee cut. The group had already been considering changes to its DXA practice for other reasons, Dr. Shuhart said, but the financial pressure that would come with a cut to $50 per test is a significant factor.
"DXA is both clinically valuable and cost effective."
Dr. Shuhart, who also is cochair for facility accreditation at the International Society for Clinical Densitometry (ISCD), said that Swedish Medical Group is in a better position to absorb the DXA cuts than most small private practices because it is part of a large health care system, which allows it to benefit from higher private insurance contract rates that help offset cuts in Medicare payments.
Congress first cut DXA payments in 2007, when it slashed Medicare payments for imaging services as part of the Deficit Reduction Act of 2005. Although DXA wasn’t one of the high-cost imaging modalities lawmakers had in their crosshairs, it still was included in the law.
Physicians took another payment hit when the Centers for Medicare and Medicaid Services reduced payments for physician work involved in interpreting the results of a DXA test. The cuts were phased in over time, but by the beginning of 2010, the average Medicare payment for DXA had dropped from a high of about $140 to about $62.
Under the Affordable Care Act, DXA payments were restored to 70% of the 2006 level, bringing them back up to nearly $100, but only for 2 years. The increase was scheduled to expire at the end of last year, when Congress granted a 2-month reprieve by including DXA in the Temporary Payroll Tax Cut Continuation Act of 2011, which also temporarily extended the 2011 Medicare physician fee schedule rates, the payroll tax holiday, and federal unemployment benefits.
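For readers keeping track of the dollar amounts, the arithmetic behind these figures is simple; the sketch below merely recomputes the approximate levels cited in this article (a roughly $140 pre-cut average, a roughly $62 trough, and restoration to 70% of the 2006 level) and is an illustration of the reported figures, not an official fee-schedule calculation.

```python
# Illustrative recomputation of the approximate payment levels cited above;
# these are the article's rounded averages, not official Medicare fee-schedule amounts.
high_2006 = 140.0               # approximate pre-cut average payment
trough_2010 = 62.0              # approximate average after the phased-in cuts
restored = 0.70 * high_2006     # ACA restored payments to 70% of the 2006 level
print(f"Restored rate: about ${restored:.0f}")      # about $98, i.e., "nearly $100"
scheduled = restored / 2        # the article reports a drop to roughly half, about $50
print(f"Post-cut rate: about ${scheduled:.0f}")
```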
What will happen next with DXA payment is uncertain and depends in large part on the fate of the larger legislative package currently under consideration in Congress.
"It looks like the [Medicare Sustainable Growth Rate formula], DXA, and payroll taxes are wrapped up together for at least the immediate future," said Dr. Jonathan D. Leffert, an endocrinologist in Dallas and chair of the Legislative and Regulatory Committee for the American Association of Clinical Endocrinologists.
Dr. Leffert said that a permanent stabilization of DXA payment rates is not likely right now and that an extension of the current payment rates could range anywhere from 2 months to 22 months. "At this point, it’s pretty much not about the policy, but about the money," he said. Specifically, lawmakers are struggling to find ways to pay not only for the increased DXA payments to physicians but also for a temporary fix for the 27% Medicare physician fee cut that is likewise slated to take effect on March 1.
But just getting lawmakers to include DXA in the Temporary Payroll Tax Cut Continuation Act was a big step, said Dr. Andrew J. Laster, a rheumatologist in private practice in Charlotte, N.C., and chair of the public policy committee for the ISCD. Only about a dozen health provisions made it into that bill, so DXA’s inclusion shows that Congress recognizes the value of identifying and treating people with osteoporosis. "Now at least we are in the room and acknowledged," he said.
"We will definitely be changing the way that we do business when it comes to DXA."
Dr. Laster added that he’s also encouraged by a recent study published in the journal Health Affairs that helps bolster the case for increasing payments for bone density testing (Health Aff. 2011;30:2362-70). Looking at a large population of Medicare beneficiaries, researchers found that as payments for DXA dropped, Medicare claims for the test plateaued. For instance, from 1996 through 2006, before the payment cuts went into effect, DXA testing was growing at a rate of about 6.5%. From 2007 through 2009, however, about 800,000 fewer tests were administered to Medicare beneficiaries than would have been expected based on the earlier growth rate.
The study also showed that DXA testing was linked to fewer fractures. The researchers found that, over a 3-year period, fracture rates were nearly 20% lower in elderly women who had received a DXA test, compared with those who did not.
Alison King, Ph.D., one of the coauthors of the study and a health care consultant, said the results make a strong case for averting the scheduled payment cut for DXA both to improve public health and to save money in the long term. "There are many valuable medical interventions that do not save money," she said. "DXA is both clinically valuable and cost effective."
Dr. Charles King, a rheumatologist in Tupelo, Miss., and chair of the American College of Rheumatology’s Committee on Rheumatologic Care, said that he thinks lawmakers are starting to hear the message about the cost effectiveness of DXA. "We are getting the word out," he said. The problem, he noted, is that Congress has been reluctant to provide legislative carve-outs for certain diseases or treatments.
Dr. King advised physicians that if they want to continue to offer DXA they need to view it as a loss leader, but he urged rheumatologists to keep performing it. Rheumatologists must continue to read and interpret these scans, he said, because the corticosteroids and other treatments they prescribe for rheumatic conditions inadvertently contribute to the development of osteoporosis. "We have to retain ownership of this disease," he said.
Pre-Anthracycline-Based Chemo Cardiac Imaging Questioned
SAN ANTONIO – The guideline-recommended practice of routinely measuring left ventricular ejection fraction before anthracycline-based chemotherapy to screen out patients at increased risk for treatment-induced heart failure has come under fire as unproductive and financially wasteful.
It’s a practice endorsed by the American Heart Association and American College of Cardiology, enshrined in Food and Drug Administration labeling, required as part of most U.S. clinical trials, and common in community-based oncology practice.
Yet there are no data to support the utility of this practice as a screening tool aimed at minimizing heart failure induced by anthracycline-based chemotherapy, according to a report at the San Antonio Breast Cancer Symposium.
Dr. Seema M. Policepatil of the Gundersen Lutheran Medical Foundation, La Crosse, Wis., and colleagues presented a retrospective study that suggested routine cardiac ejection fraction screening under these circumstances is without merit. The study included 466 patients with early-stage, HER2-negative invasive breast cancer who were under consideration for anthracycline-based chemotherapy as part of their initial therapy. None had prior heart failure.
Left ventricular ejection fraction (LVEF) was measured by echocardiography, nuclear imaging, or MRI prior to chemotherapy in 241 of the patients. This reflects institutional practice: at Gundersen, pretreatment assessment of cardiac pump function is common but not uniform.
One of the 241 patients was found to have asymptomatic left ventricular dysfunction, with a screening ejection fraction of 48%, and she therefore didn’t receive anthracycline-based chemotherapy. Thus, modification of the treatment strategy in response to ejection fraction screening occurred only rarely.
In addition, nine patients – six who had pretreatment cardiac imaging and three who did not – skipped the chemotherapy, either because of physician or patient preference or participation in clinical trials.
During a mean 5 years of follow-up, 3 of the remaining 456 women were diagnosed with heart failure: 2 among those with a pretreatment LVEF measurement, and 1 among those without it. That’s an acceptably low 0.7% event rate, she declared.
Current practice guidelines recommending pretreatment LVEF measurement are based upon expert consensus. It’s time to incorporate the available evidence, which in the case of the Gundersen experience doesn’t support the practice, Dr. Policepatil continued.
Assuming that nationally half of all patients with early-stage HER2-negative breast cancer undergo measurement of their LV ejection fraction before getting chemotherapy, eliminating this routine practice would save $7 million to $17 million annually based upon Medicare and Medicaid reimbursement rates, the physician added.
This study was funded by the Center for Cancer and Blood Disorders at the Gundersen Lutheran Medical Foundation. Dr. Policepatil declared having no financial conflicts.
FROM THE SAN ANTONIO BREAST CANCER SYMPOSIUM
Major Finding: Heart failure was diagnosed in three women within 5 years of anthracycline-based therapy – for an event rate of 0.7%.
Data Source: A single-center retrospective study of 466 breast cancer patients under consideration for anthracycline-based chemotherapy.
Disclosures: This study was funded by the Center for Cancer and Blood Disorders at the Gundersen Lutheran Medical Foundation. Dr. Policepatil declared having no financial conflicts.
Transesophageal Echocardiogram Appears Safe in Patients With Varices
SAN FRANCISCO – Transesophageal echocardiography does not cause bleeding in cirrhotic patients with small varices or with large varices that have been previously treated, according to Australian researchers at the annual meeting of the American Association for the Study of Liver Diseases.
In a retrospective case series, 75 patients with varices had undergone 78 transesophageal echocardiograms (TEEs), most of which were done to rule out endocarditis or to monitor hemodynamic stability during liver transplantation. A total of 62 patients (83%) had esophageal varices, and 19 (25%) had gastric varices.
About half of the esophageal varices were larger than 5 mm, sometimes with red wale marks, spurting, or other endoscopic stigmata. In a few cases, the varices filled more than half of the esophageal lumen. Large varices were treated before TEE, usually by banding or transjugular intrahepatic portosystemic shunt.
Twenty-two percent (14) of the esophageal varices and 33% (6) of the gastric varices were treated before TEE. The rest were deemed too small to need treatment.
None of the patients bled from the TEE procedure, which was done in one case just a week after banding, but in most cases was done several months later, said investigator and gastroenterologist Lucy Lim.
It’s an important finding because the theoretical bleeding risk means that "there’s still a lot of controversy" about whether TEEs are safe in patients with varices. Cardiologists – who most often do the procedure – sometimes "outright refuse if there are even small varices. They are the ones we need to convince," said Dr. Lim of the gastroenterology and liver transplant unit at the Austin Hospital in Melbourne.
Almost 70% (52) of the patients were men (average age, 54 years). The majority had Child-Pugh A cirrhosis; the rest had Child-Pugh B. The cirrhosis was attributed to hepatitis C or alcohol consumption in most cases. Varices are a common complication of cirrhosis-induced portal hypertension.
A smaller 2009 case series also found that TEEs were safe in patients with varices (J. Am. Soc. Echocardiogr. 2009;22:396-400). A comprehensive review of the issue was published in 2010 (J. Am. Soc. Echocardiogr. 2010;23:1115-27).
Dr. Lim said she had no relevant financial disclosures.
FROM THE ANNUAL MEETING OF THE AMERICAN ASSOCIATION FOR THE STUDY OF LIVER DISEASES
Major Finding: None of 75 patients with esophageal or gastric varices had bleeding after transesophageal echocardiograms.
Data Source: A retrospective case series.
Disclosures: Dr. Lim said she had no relevant financial disclosures.
Combined Imaging Modalities May Enable 'Optic Biopsy' in Stomach
Magnifying narrow-band imaging, when used in addition to conventional white-light imaging, greatly improves accuracy, sensitivity, and specificity in the detection of gastric mucosal cancers, Dr. Yasumasa Ezoe and colleagues reported in the December issue of Gastroenterology.
"This has enormous significance in clinical practice, because the examination with high positive predictive value and high negative predictive value might enable the clinician to make appropriate judgments as to which lesion needs pathology to confirm the diagnosis," the authors wrote (Gastroenterology 2011 Dec. 1 [10.1053/j.gastro.2011.08.007]).
Indeed, when combined, the two imaging modalities "might be the best approach for making accurate diagnoses of small gastric cancers," offering a so-called "optic biopsy" for gastric mucosal cancers, the authors suggested.
Dr. Ezoe of Kyoto University and colleagues studied patients aged 20 years or older seen at multiple institutions in Japan who had either untreated gastric cancers or a history of gastric cancer. Patients with prior surgical stomach resection were excluded from the study, although prior minimally invasive procedures, such as endoscopic mucosal resection and endoscopic submucosal dissection, were allowed.
All 353 patients included in the study initially underwent conventional white-light imaging (C-WLI). When a new, undiagnosed, small (10 mm or less), depressed gastric lesion was detected, the patient was immediately randomized in a 1:1 ratio to undergo detailed examination of the lesion with either C-WLI (n = 176) or magnifying narrow-band imaging (M-NBI).
All lesions initially evaluated with C-WLI were subsequently evaluated with M-NBI, to ascertain the predictive value of both modalities together.
In the case of multiple lesions, only the first lesion was included in the study. Small, depressed lesions with apparent erosion or ulceration were also not evaluated, "as it is difficult to visualize surface changes in these lesions," wrote the authors.
After all the target lesions were examined, at least one biopsy specimen was collected and the revised Vienna classification system was used to diagnose C4 (mucosal high-grade neoplasia) or C5 (submucosal invasion by neoplasia) specimens as cancerous.
Overall, 20 patients in each group had a newly diagnosed gastric cancer (13% for both groups).
According to the authors, when compared with biopsy results, the diagnostic accuracy of M-NBI was significantly greater than that of C-WLI (90.4% versus 64.8%, respectively; P less than .001). M-NBI also beat C-WLI on specificity (94.3% vs. 67.9%, respectively; P less than .001).
In terms of sensitivity, however, the two techniques did not differ significantly, with M-NBI at 60.0% and C-WLI at 40.0% (P = .34).
The authors then looked at M-NBI plus C-WLI, and found that the former "significantly enhanced the diagnostic performance of the latter," with accuracy increasing to 96.6% when both were used together, specificity increasing to 96.8%, and sensitivity to 95%, with P less than .001 for all values when compared with C-WLI alone.
Similarly, "C-WLI followed by M-NBI dramatically improved the positive predictive value from 13.8% to 79.2%" compared with C-WLI alone (P less than .001). The negative predictive value also increased, from 89.8% with C-WLI alone to 99.3% when both techniques were used together (P less than .001).
The authors conceded that their sample size was small, and that larger studies will be needed to confirm the diagnostic utility of each modality. Additionally, they did not compare these modalities to dye-based imaging methods.
However, dyes "are only used in a few countries and institutes, and then, the standard worldwide endoscopic method to diagnose early gastric cancer is still C-WLI without any dye use," they added.
The authors disclosed no personal conflicts of interest. The study was sponsored by a grant from the Ministry of Health, Labor, and Welfare of Japan.
FROM GASTROENTEROLOGY
Major Finding: Compared with conventional white-light imaging alone, narrow-band imaging in addition to white-light imaging was significantly better, with an accuracy of 96.6%, a specificity of 96.8%, and a sensitivity of 95% (P less than .001 for all) in the diagnosis of gastric mucosal cancer.
Data Source: A randomized, controlled, open-label, multicenter trial of 353 patients assigned to either white-light or magnified narrow-band imaging endoscopy.
Disclosures: The authors disclosed no personal conflicts of interest. The study was sponsored by a grant from the Ministry of Health, Labor, and Welfare of Japan.