HM16 Session Analysis: ICD-10 Coding Tips
Presenter: Aziz Ansari, DO, FHM
Summary: With the implementation of ICD-10, correct, specific documentation that ensures proper categorization of patient diagnoses has become increasingly important. Hospitalists are urged to understand the impact clinical documentation improvement (CDI) has on quality and reimbursement.
Quality Impact: Documentation has a direct impact on quality reporting for mortality and complication rates, risk of mortality, and severity of illness. Documenting present on admission (POA) also directly impacts hospital-acquired condition (HAC) classifications.
Reimbursement Impact: Documentation has a direct impact on expected length of stay, case mix index (CMI), cost reporting, and appropriate hospital reimbursement.
HM Takeaways:
- Be clear and specific.
- Documenting the principal diagnosis and secondary diagnoses, and their associated interactions, is critically important.
- Ensure all diagnoses are a part of the discharge summary.
- Avoid saying “History of.”
- It’s OK to document “possible,” “probable,” “likely,” or “suspected.”
- Document “why” the patient has the diagnosis.
- List all differentials, and identify if ruled in or ruled out.
- Indicate acuity, even if obvious.
The presenter also reviewed common CDI opportunities in hospital medicine.
Note: This discussion was specific to hospital diagnosis coding and facility billing, not to physician billing and CPT codes.
QUIZ: Will My COPD Patient Benefit from Noninvasive Positive Pressure Ventilation (NIPPV)?
First U.S. uterus transplant raises questions about ethics, cost
The recipient of the first uterus transplant performed in the United States is recovering well and looking forward to the next phase of the research project: pregnancy, according to the team of Cleveland Clinic surgeons who performed the groundbreaking Feb. 24 transplant.
“I’m pleased to report to you that our patient is doing very well,” lead surgeon Dr. Andreas G. Tzakis said during a press conference on March 7.
The transplant recipient, known only as “Lindsey” to protect her privacy, is a 26-year-old woman with uterine factor infertility (UFI). She was selected from among more than 250 applicants to undergo the 9-hour procedure involving transplantation of a uterus from a deceased organ donor of reproductive age. She is the first of 10 women set to undergo the procedure as part of the Cleveland Clinic study.
“Lindsey” made an appearance at the press conference and expressed, first and foremost, her “immense gratitude” to the donor’s family.
“They have provided me with a gift that I will never be able to repay,” she said in an emotional statement in which she shared about learning at age 16 that she would never be able to have children.
“From that moment on I prayed that God would allow me the opportunity to experience pregnancy, and here we are today at the beginning of that journey,” she said.
Lindsey will undergo a year of anti-rejection treatment prior to undergoing in vitro fertilization; her eggs were previously fertilized using her husband’s sperm, and the embryos are in cryogenic storage, according to Dr. Rebecca Flyckt, another member of the surgical team.
The embryos will be transferred one by one until the goal of a healthy pregnancy and healthy baby delivered by cesarean section is achieved, Dr. Flyckt said.
After one or two successful pregnancies and births, the uterus will be removed. Although substantial evidence suggests that anti-rejection medications are relatively safe in pregnancy, minimizing and ultimately eliminating the need for them is advisable; thus, uterus transplants, at least under the Cleveland Clinic research protocol, are meant to be temporary.
Uterus transplants are performed to enable patients – who have either lost their uterus to disease or who were born without a uterus or a functioning uterus – to experience pregnancy and childbirth.
Prior to the Cleveland Clinic transplant, nine had been performed successfully with healthy outcomes (the first post-transplant baby was born healthy in 2014). All were performed at the University of Gothenburg in Sweden. Dr. Tzakis traveled there to work with surgeons prior to performing the procedure at the Cleveland Clinic, where he is the Transplant Center program director.
Ethical issues
Addressing the bioethical issues surrounding uterine transplant was an important part of this project, according to team member Dr. Ruth Farrell, a bioethicist and ob.gyn. surgeon at the Cleveland Clinic.
“Despite the name uterine transplant, the focus of this procedure is not on the uterus. It’s on women and children and families,” she said, adding that to understand the ethical issues of uterine transplant, it’s important to consider the perspectives of women with UFI.
“For instance, women born without a uterus have a medical condition that affects every aspect of their lives, from the time they are diagnosed in adolescence, to when they are adults looking for relationships and trying to decide if and how to have a family,” she said, adding that “these women face the real possibility of never having children.”
But this advance in reproductive medicine and science also requires a close look at how the potential risks and benefits of uterine transplant weigh against existing options of adoption and surrogacy. While there are many successful stories involving surrogacy and adoption, these options are not available to all women because of “legal, cultural, religious, and other very personal reasons,” Dr. Farrell said.
“Our research on uterine transplant, we hope, may give women another option which may work better for them and their families,” she said, also noting that while living donor transplants have been performed, these come with some risk, thus deceased donors are being used for the current study.
Insurance coverage
As for the feasibility of uterine transplant, many questions remain to be answered, according to Dr. Tommaso Falcone, chair of the department of obstetrics and gynecology at the Cleveland Clinic, who said that the procedures done as part of the current study will be paid for by an institutional grant.
While the American Society for Reproductive Medicine contends that infertility is a disease worthy of insurance coverage, that is “outside of our hands,” he said.
Indeed, while this first uterine transplant in the United States is a “huge and exciting breakthrough,” and while “the folks at Cleveland Clinic should be congratulated,” the possibility of this procedure ever being covered by insurance and being available to women outside of the research protocol is questionable, especially given the available alternatives, such as gestational carriers and surrogacy, Dr. David A. Forstein, a reproductive endocrinologist at the University of South Carolina, Greenville, said in an interview.
In a patient-centered model, this “tremendous, wonderful gift from science” would give patients – like the 1 in 5,000 women in the United States who are born without a uterus – a very viable alternative, but the question is whether having one’s own children is a right, and whether extensive financial resources should be committed to helping women achieve that, he said.
Dr. Charles E. Miller, a reproductive endocrinologist in Chicago, and head of minimally invasive gynecologic surgery at Advocate Lutheran General Hospital in Park Ridge, Ill., agreed that this transplant is to be celebrated.
“It’s a big moment, to say the least,” he said in an interview.
But Dr. Miller also questioned the feasibility of the procedure and whether society will “look favorably upon donation,” given the availability of alternatives for which there is now great success.
“I salute the pioneering effort,” he said. “But at the end of the day, can society take on this burden of a procedure performed not to sustain life, but to help create life? That’s a tough one.”
PFS a surrogate for overall survival in soft tissue sarcoma trials
The use of progression-free survival and response rate as surrogates for overall survival was supported by significant correlations between the endpoints in randomized trials of advanced soft tissue sarcoma.
However, 3-month progression-free survival (PFS) and 6-month PFS were not significantly correlated with overall survival (OS) and were not recommended as surrogates for OS, according to the researchers.
Significant correlations were observed between overall survival and PFS (correlation coefficient, 0.61) and overall survival and response rate (0.51). Correlations between 3-month PFS and 6-month PFS with overall survival (0.27 and 0.31, respectively) were not significant.
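To make the trial-level correlation analysis concrete, here is a minimal sketch — not the authors’ actual method or data — of how treatment effects on a candidate surrogate (log hazard ratio for PFS) can be correlated with effects on the clinical endpoint (log hazard ratio for overall survival) across trials, weighting each trial by its sample size. All inputs are hypothetical placeholders.

```python
# Minimal sketch of trial-level surrogacy assessment (illustrative only;
# not the analysis or data of Zer et al.).
import numpy as np

def weighted_pearson(x, y, w):
    """Pearson correlation of x and y, weighting each observation by w."""
    x, y, w = map(np.asarray, (x, y, w))
    w = w / w.sum()
    mx, my = np.sum(w * x), np.sum(w * y)
    cov = np.sum(w * (x - mx) * (y - my))
    return cov / np.sqrt(np.sum(w * (x - mx) ** 2) * np.sum(w * (y - my) ** 2))

# Hypothetical per-trial treatment effects (log hazard ratios) and trial sizes.
log_hr_pfs = np.array([-0.35, -0.10, -0.22, 0.05, -0.41, -0.15])
log_hr_os = np.array([-0.20, -0.02, -0.15, 0.10, -0.30, -0.05])
n_patients = np.array([120, 340, 210, 95, 410, 150])

r = weighted_pearson(log_hr_pfs, log_hr_os, n_patients)
print(f"Trial-level weighted correlation of PFS and OS effects: r = {r:.2f}")
```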
“In soft tissue sarcoma, trial design is particularly challenging, owing to the rarity and heterogeneity of this disease. Time-based endpoints including PFS, 3-month PFS, and 6-month PFS are gaining popularity as primary endpoints in phase III RCTs [randomized controlled trials], despite the fact that current data support their use only to screen for effective drugs in phase II trials,” wrote Dr. Alona Zer of Princess Margaret Cancer Centre and the University of Toronto and colleagues (J Clin Oncol. 2016 March 7. doi: 10.1200/JCO.2016.66.4581).
“Data show that the assessment of outcomes as a single point in time … only rarely mirrors the hazards of the same endpoint. … As such, the odds ratio for 3-month PFS or 6-month PFS likely do not approximate the hazard ratio for PFS, making it difficult to justify the use of these endpoints in definitive phase III trials.”
PFS and overall survival have shown poor correlation in other cancer types, and evidence suggests that survival post progression may influence the association, with weaker correlations at longer survival post progression. The majority of soft tissue sarcoma reports had survival post progression of less than 12 months, likely explaining the high correlation between PFS and overall survival.
The investigators performed a systematic review of 52 randomized controlled trials, published from 1974 to 2014, that included 9,762 patients who received systemic therapy for advanced/metastatic soft tissue sarcoma.
Comprehensive toxicity assessment was included in just 20 studies (47%) and poorly reported in 6 studies (14%). Few studies included quality of life as a secondary endpoint. The authors noted that this is a concern in the soft tissue sarcoma setting, in which the purpose of systemic treatment may be palliation of symptoms.
Over the 4 decades represented by the systematic review, several trends appeared. Overall, a low proportion of studies included intent-to-treat analyses and clearly defined primary endpoints, but these characteristics improved over time. Endpoint selection has shifted away from response rate in favor of time-based events, including PFS, 3-month PFS, and 6-month PFS. Overall survival was the primary endpoint in just 4% of studies. Studies published in the last 2 decades were more likely to be supported by industry than earlier studies (35% vs. 5%).
Dr. Zer and coauthors reported having no relevant financial disclosures.
In evaluating drug efficacy in oncology clinical trials, overall survival is considered the most reliable and meaningful endpoint because it is objective, precise, and easy to measure. However, it requires a large sample size, prolonged follow-up, and may be confounded by postprogression therapies. Given these drawbacks, surrogate markers are especially useful for rare diseases and diseases with effective subsequent-line therapies.
For practical validation of surrogate endpoints, researchers use criteria that describe the association between the surrogate and the clinical endpoint at both the individual level and the trial level.
In their systematic review, Zer et al. investigated trial-level surrogacy of progression-free survival and response rate for overall survival in advanced soft tissue sarcoma, concluding that PFS and response rate are appropriate surrogates of overall survival. For trial-level surrogacy analysis, the effective sample size is the number of trials, which should be large enough to reliably estimate the correlation between treatment effects on the surrogate and clinical endpoints. Of note in this review, the sample size was 13 trials.
The influence of postprogression survival on the association between PFS and overall survival is important in this analysis. The median postprogression survival was less than 12 months in most of the studies included, and this figure might improve when more effective drugs become available. Longer postprogression survival may translate to a lower correlation between PFS and overall survival.
Another challenge to the surrogacy of an endpoint may come from new treatments in which the mode of action is substantially different from that used to validate the surrogate. Caution is required in generalizing the validation of surrogate endpoints.
In general, time-to-event outcomes assessed at a single time point (for example, 3-month and 6-month PFS) may be misleading. This is also true of the median value of a time-to-event outcome, such as median PFS. For trial-level surrogacy validation, the hazard ratio is the most appropriate measure for time-to-event outcomes.
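That point can be illustrated with a short simulation — a sketch under the assumption of exponential progression times, with all parameters invented for illustration: a fixed PFS hazard ratio yields different 3-month PFS odds ratios depending on the baseline event rate, so the two measures are not interchangeable across trial populations.

```python
# Illustrative simulation (hypothetical parameters): a fixed PFS hazard ratio
# does not map to a single 3-month PFS odds ratio across trial populations.
import numpy as np

rng = np.random.default_rng(0)
n = 200_000        # large sample so Monte Carlo noise is negligible
true_hr = 0.7      # assumed treatment effect on the hazard scale

for median_pfs_ctrl in (2.0, 8.0):           # control-arm median PFS, months
    lam = np.log(2) / median_pfs_ctrl        # exponential hazard, control arm
    t_ctrl = rng.exponential(1 / lam, n)             # control progression times
    t_trt = rng.exponential(1 / (lam * true_hr), n)  # treated progression times

    p_ctrl = np.mean(t_ctrl > 3.0)           # progression-free at 3 months
    p_trt = np.mean(t_trt > 3.0)
    or_3mo = (p_trt / (1 - p_trt)) / (p_ctrl / (1 - p_ctrl))
    print(f"control median PFS {median_pfs_ctrl:.0f} mo: "
          f"HR = {true_hr}, 3-month PFS OR = {or_3mo:.2f}")
```

With these invented inputs, the hazard ratio is 0.7 in both scenarios, yet the 3-month odds ratio differs (roughly 1.7 vs. 1.5 here), echoing the warning that a single-time-point odds ratio does not approximate the hazard ratio.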
Efforts to improve the validation of surrogate endpoints are important in this era of personalized medicine and rapid development of oncology drugs. Surrogate markers usually allow for smaller trials and shorter completion times.
Fengman Zhao, Ph.D., is a biostatistician at Dana-Farber Cancer Institute, Boston. These remarks were part of an editorial accompanying the report by Zer et al. (J Clin Oncol. 2016 March 7. doi: 10.1200/JCO.2015.64.3437).
Key clinical point: In randomized trials of advanced soft tissue sarcoma, intermediate endpoints of progression-free survival and response rate were significantly correlated with overall survival; 3-month PFS and 6-month PFS were not significantly correlated with overall survival.
Major finding: The correlation coefficient between overall survival and PFS was 0.61; overall survival and response rate, 0.51; overall survival and 3-month PFS, 0.27; and overall survival and 6-month PFS, 0.31.
Data source: A systematic review of 52 randomized controlled trials published from 1974 to 2014 involving 9,762 patients who received systemic therapy for advanced/metastatic soft tissue sarcoma.
Disclosures: Dr. Zer and coauthors reported having no relevant financial disclosures.
Children who have stem cell transplants need skin exams, sun protection
WASHINGTON – Children who have had a hematopoietic stem cell transplant (HSCT) have an increased risk of benign and atypical nevi, Dr. Johanna Sheu reported at the annual meeting of the American Academy of Dermatology.
These patients need to have routine skin exams and be educated about sun protection needs, she said. Based on her study, these needs are not routinely met.
At least 1 year after undergoing HSCT at Boston Children’s Hospital, 85 posttransplant patients had significantly more nevi and more atypical nevi than did 85 healthy controls who were matched by age, gender, and Fitzpatrick skin type. In addition, 41% of the transplant recipients had at least one actinic keratosis, a basal or squamous cell carcinoma, or a solar lentigo; 11% had at least one nevus spilus.
Moreover, “sun protection … and dermatology follow-up was poor” among the transplant recipients, said Dr. Sheu, of MassGeneral Hospital for Children, Boston. About 40% of the transplant recipients reported having a sunburn since their transplant, only 15% reported daily use of sunscreen, and 53% said they did not recall being told that sunburn could trigger graft-versus-host disease (GVHD).
About one-third of the patients had never seen a dermatologist; of those who had, two-thirds had only seen the dermatologist once, Dr. Sheu reported.
Late skin effects of HSCT are not as well described in children as they are in adults, she said. In adults, late skin effects include vitiligo, psoriasis, nonmelanoma skin cancers, and an increased nevi count.
The children in the study had undergone an HSCT between 1998 and 2013, at a median age of about 7 years (range, 1 month to 19 years). At the time of their skin exams, their mean age was 14 years, and they had been followed for a median of almost 4 years. Nevi were counted on the forearms, back, legs, palms, and soles.
The median nevi count was 44, significantly more than that seen in control subjects. Transplant recipients also had significantly more nevi in sun-exposed areas of the body, as well as on the palms and soles, and were more likely to have atypical nevi and nevi greater than 5 mm in diameter.
In addition to fair skin, factors associated with an increase in the overall nevi count included being older than age 10 at the time of the transplant and having total body irradiation, pretransplant chemotherapy, and myeloablative conditioning. Having had a sunburn since the transplant, reported by 40%, was also a risk factor.
Chronic GVHD and chronic GVHD of the skin were associated with the presence of atypical nevi; acute GVHD, the duration of immune suppression, and the use of topical steroids or calcineurin inhibitors were not associated with increased risk of atypical nevi.
She and her coinvestigators are currently analyzing the pathogenesis of these late effects in this population, as well as the autoimmune skin conditions – vitiligo and alopecia – seen in 25% of the transplant recipients in the study.
In 2013, 1,100 children under age 16 years in the United States underwent a bone marrow transplant, she noted.
Dr. Sheu had no disclosures.
AT AAD 16
Key clinical point: Children who have had a hematopoietic stem cell transplant need to have routine skin exams and be educated about sun protection needs.
Major finding: 41% of the transplant recipients had at least one actinic keratosis, a basal or squamous cell carcinoma, or a solar lentigo; 11% had at least one nevus spilus.
Data source: A single-center study of 85 posttransplant patients and 85 healthy controls who were matched by age, gender, and Fitzpatrick skin type.
Disclosures: The study was not sponsored and Dr. Sheu had no disclosures.
Evidence builds for mesenchymal stem cell therapy in MS
NEW ORLEANS – Repeated intrathecal administration of autologous mesenchymal bone marrow-derived stromal stem cells for the treatment of multiple sclerosis was safe and induced accelerated beneficial effects in some patients in an uncontrolled, prospective study.
Of 28 patients with either secondary progressive or relapsing-progressive MS who were experiencing severe clinical deterioration and failure to respond to first- and second-line immunomodulatory treatments, 25 experienced either stable or improved Expanded Disability Status Scale (EDSS) scores following autologous mesenchymal stem cell (MSC) injections. The mean score decreased from 6.76 at study entry to 6.57 at a mean follow-up of 3.6 years, Dr. Panayiota Petrou of Hadassah-Hebrew University Medical Center, Jerusalem, Israel, and her colleagues reported in a poster at a meeting held by the Americas Committee for Treatment and Research in Multiple Sclerosis.
In addition, 17 patients experienced improvements in at least one functional system of the EDSS, including 14 who experienced improved motor function, 5 who experienced improved speech/bulbar functions, 4 who experienced improved urinary functions, and 6 who experienced improved cerebellar function. Eight patients remained stable during the entire follow-up period.
In a prior pilot trial, intrathecal administration of MSCs was shown to be safe and provided “some indications of potentially clinically meaningful beneficial effects on the progression of the disease,” the investigators said.
The current study provides further support for those findings. It included patients who experienced severe clinical deterioration (an increase of at least 0.5-1 points on the EDSS) during the year prior to study enrollment, or who had at least one major relapse without sufficient recovery following steroid treatment. Study subjects had a mean age of 56 years and a mean disease duration of 15.4 years. They received at least 2 courses and up to 10 injections of 1 million cells/kg; most received 2 (8 patients) or 3 (9 patients) injections, and they were followed for up to 6 years.
No serious side effects were observed during long-term follow-up after repeated intrathecal injections. Eight patients experienced headaches and/or fever in the hours and days after injection, and two experienced symptoms of encephalopathy, which resolved within a few hours. Also, one patient experienced back pain and one had neck rigidity, but no long-term side effects were reported, the investigators said.
Immunological follow-up showed a transient up-regulation of regulatory T cells and down-regulation of the proliferative ability of lymphocytes and of several immune activation surface markers for up to 3 months, they noted.
The investigators reported having no disclosures.
AT ACTRIMS FORUM 2016
Key clinical point: Repeated intrathecal administration of autologous mesenchymal bone marrow–derived stromal stem cells stabilized or improved EDSS scores in most MS patients at up to 6 years of follow-up.
Major finding: The mean EDSS score decreased from 6.76 at study entry to 6.57 at a mean follow-up of 3.6 years.
Data source: An uncontrolled, prospective study involving 28 MS patients.
Disclosures: The investigators reported having no disclosures.
AAAAI: Early peanut consumption brings lasting protection from allergy
LOS ANGELES – A peanut allergy prevention strategy based upon regular consumption of peanut-containing foods from infancy to age 5 continued to provide protection even after peanut intake was halted for a full year from age 5 to 6, according to new results from an extension of the landmark LEAP trial, known as LEAP-On, presented at the annual meeting of the American Academy of Allergy, Asthma, and Immunology.
The impetus for LEAP-On was the investigators’ concern that a period of peanut avoidance might cause loss of the protective state. But that didn’t occur.
“I think there is no doubt that we have prevented peanut allergy so far in these high-risk children. Next, the LEAP-Ad Lib study will tell us whether we’ve prevented it by age 10,” said Dr. Gideon Lack of King’s College London, who headed LEAP-On.
A second major randomized trial known as EAT (Enquiring About Tolerance) presented at the meeting provided further support for early dietary introduction of allergenic foods. EAT differed from LEAP (Learning Early About Peanut Allergy) and LEAP-On in that it ambitiously randomized infants to early introduction or avoidance of not one but six allergenic foods: peanut, cooked egg, cow’s milk, fish, sesame, and wheat. Also, while LEAP and LEAP-On involved roughly 600 infants known to be at very high risk for allergy, EAT was conducted in a general population of 1,303 infants who weren’t at increased risk, all of whom were exclusively breast-fed until the intervention beginning at age 3 months.
The presentation of the LEAP-On and EAT results at the AAAAI annual meeting was a major event, one the National Institute of Allergy and Infectious Diseases marked with the same-day release of new NIAID-sponsored draft recommendations for the diagnosis and management of food allergies.
In a press conference held at the AAAAI annual meeting to announce the start of a 45-day public comment period for the draft update of the 2010 guidelines, Dr. Daniel Rotrosen, director of NIAID’s division of allergy, immunology, and transplantation, said the new guidelines were developed largely in response to the compelling LEAP findings. That trial demonstrated that sustained consumption of peanut starting in infancy resulted in an 81% lower rate of peanut allergy at age 5 years compared with a strategy of peanut avoidance (N Engl J Med. 2015;372:803-13).
The draft guidelines, now available on the NIAID website, represent a sharp departure from the former recommendation that physicians encourage exclusive breastfeeding for the first 6 months of life followed by cautious introduction of other foods. Whereas the former orthodoxy held that delayed introduction of allergenic foods protects against development of food allergy, the new evidence-based concept supported by the LEAP and EAT findings is just the opposite: introduction of such foods during the period of immunologic plasticity in infancy induces tolerance.
Thus, the draft guidelines recommend that infants at high risk for peanut allergy because they have severe eczema and/or egg allergy should have peanut-containing foods introduced at 4-6 months of age to reduce their risk of peanut allergy, preceded by evaluation with peanut-specific IgE or skin prick testing to make sure introduction is safe. That age window coincides with well-child visits and vaccination schedules, Dr. Rotrosen noted.
These guidelines represent the consensus of 26 organizations that participated in their development. Among them are the American Academy of Pediatrics, the American Academy of Family Physicians, the American Academy of Dermatology, the American College of Gastroenterology, and AAAAI.
“I expect the new guidelines, when finalized, to be endorsed by the leadership of all the participating organizations,” Dr. Rotrosen said.
The new paradigm will require cultural change, said Dr. James R. Baker Jr., CEO and chief medical officer of Food Allergy Research and Education, a nonprofit organization that provided partial funding for LEAP and LEAP-On.
“I think for a long time we’ve vilified these foods. There’s nothing inherently wrong with their intake, and that’s a message we need to get across to parents and physicians so they can start thinking differently,” he said.
“The good news about these studies is that they show there’s no reason not to do this,” Dr. Baker added. “There’s no harm that comes from the early introduction.”
Dr. Lack, who led the EAT trial, noted that the study didn’t meet its primary endpoint of a significantly lower prevalence of food allergy to any of the six intervention foods at age 3 years in the intention-to-treat analysis. But adherence to the demanding EAT early-introduction protocol was a problem: only 43% of participants adhered to the study protocol. In a per-protocol analysis restricted to the adherent group, however, early introduction was associated with a highly significant 67% reduction in the relative risk of food allergy at 3 years of age compared with controls. And for the two most prevalent food allergies – to peanut and egg – the relative risk reductions in the early-introduction group were 100% and 75%, respectively.
The EAT results suggest that an effective preventive dose of peanut in infants at least 3 months of age is roughly 2 g of peanut protein per week, equivalent to just under 2 tsp of peanut butter, according to Dr. Lack.
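For readers who want to translate that weekly target into household measures, the conversion is simple arithmetic. Below is a minimal sketch in Python; the protein fraction of peanut butter and the weight of a teaspoon are typical label values we have assumed, not figures from the EAT trial.

```python
# Convert a weekly peanut-protein target into teaspoons of peanut butter.
# Assumed typical label values (not from the EAT trial):
#   - peanut butter is ~22% protein by weight
#   - one teaspoon of peanut butter weighs ~5.3 g
PROTEIN_FRACTION = 0.22   # g protein per g peanut butter (assumption)
GRAMS_PER_TSP = 5.3       # g peanut butter per teaspoon (assumption)

def tsp_for_weekly_protein(grams_protein: float) -> float:
    """Teaspoons of peanut butter supplying a weekly protein dose."""
    return grams_protein / PROTEIN_FRACTION / GRAMS_PER_TSP

print(f"{tsp_for_weekly_protein(2.0):.1f} tsp")  # ~1.7 tsp per week
```

Under these assumptions, 2 g of peanut protein works out to roughly 1.7 teaspoons of peanut butter per week, consistent with Dr. Lack’s “just under 2 tsp.”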
Simultaneously with presentation of the LEAP-On and EAT trials in Los Angeles, the studies were published online at NEJM.org (doi: 10.1056/NEJMoa1514210 for LEAP-On and 10.1056/NEJMoa1514209 for EAT).
LEAP-On was supported primarily by NIAID. EAT was funded mainly by the U.K. Food Standards Agency and the Medical Research Council. Dr. Lack reported receiving grants from those agencies as well as Food Allergy Research and Education.
EXPERT ANALYSIS FROM THE 2016 AAAAI ANNUAL MEETING
U.S. Surgeon General Vivek Murthy, MD, MBA, Encourages Hospitalists to Lead, Improve Healthcare
Dr. Murthy, who previously worked as a hospitalist in Boston, said the urgency of building a foundation for health in America, where chronic illness and healthcare costs have skyrocketed, could not be greater. Health is the key to opportunity, he said. He explored the following strategies to make America healthier:
- Make the pursuit of health appealing;
- Improve the safety of our communities;
- Focus on the mind and spirit; and
- Cultivate our ability to give and receive kindness.
Specifically, hospitalists should contemplate the following questions:
- How can hospitalists leverage their leadership in the hospital to improve systems and create a culture that supports healing and health?
- How can hospitalists be a powerful force for change both inside and outside the hospital?
- How can hospitalists inspire the next generation of physicians to safeguard the health of their community by treating and preventing illness?
Dr. Murthy challenged hospitalists to commit to strengthening the foundation of health in our country and shift our culture towards the well-being of our communities through prevention.
He closed by saying, “In the end, the world gets better when people choose to come together to make it better.” TH
Comorbid depression worsens asthma outcomes in older adults
LOS ANGELES – Adults over age 55 with asthma and depression have nearly twice as many emergency department visits for asthma as do asthma patients without depression, based on findings in 402 asthma patients, Dr. Pooja O. Patel reported at the annual meeting of the American Academy of Allergy, Asthma, and Immunology.
Comorbid depression also is associated with more asthma-related sleep disturbances and worse health-related quality of life, even though spirometry findings are similar in asthma patients with and without depression, added Dr. Patel of the University of Michigan, Ann Arbor.
She analyzed data on 7,256 adults over age 55 who participated in the National Health and Nutrition Examination Survey during 2007-2012. The prevalence of physician-diagnosed asthma in this nationally representative group of older adults was 5.5%. Of those 402 asthma patients, 196 – fully 49% – had comorbid depression as defined by their scores on the Patient Health Questionnaire-9 (PHQ-9), a brief, validated, and reliable self-administered measure of depression severity.
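For context, the PHQ-9 sums nine self-reported items, each scored 0-3, into a 0-27 total; totals of 10 or more are conventionally read as at least moderate depression. The report did not specify the cutoff used in this analysis, so the severity bands in the short sketch below are the conventional ones, not necessarily the investigators’.

```python
# Minimal PHQ-9 scorer. Severity bands follow the conventional
# interpretation; the cutoff used in the study itself was not reported.
SEVERITY_BANDS = [
    (0, "minimal"), (5, "mild"), (10, "moderate"),
    (15, "moderately severe"), (20, "severe"),
]

def score_phq9(items: list[int]) -> tuple[int, str]:
    """Return (total, severity label) for nine item responses scored 0-3."""
    if len(items) != 9 or any(not 0 <= i <= 3 for i in items):
        raise ValueError("PHQ-9 requires nine responses, each scored 0-3")
    total = sum(items)
    label = next(lbl for lo, lbl in reversed(SEVERITY_BANDS) if total >= lo)
    return total, label

print(score_phq9([1, 2, 1, 2, 1, 1, 2, 1, 0]))  # (11, 'moderate')
```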
One or more emergency department visits for asthma within the last 12 months occurred in 18.8% of the group with asthma alone, compared with 28.1% of those with comorbid depression. In a multivariate regression analysis adjusted for demographic differences, this translated to a twofold increased likelihood of ED visits specifically for asthma in the group with asthma and depression.
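For readers curious how a “twofold increased likelihood” is derived, adjusted odds ratios of this kind are typically estimated with logistic regression, exponentiating the coefficient on the exposure of interest. The sketch below uses hypothetical column names; the actual covariates in Dr. Patel’s model, beyond “demographic differences,” were not detailed, and a rigorous NHANES analysis would also account for the survey’s complex sampling design.

```python
# Sketch of an adjusted odds ratio via logistic regression.
# Column names are hypothetical; a rigorous NHANES analysis would
# also incorporate survey weights and design variables.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("nhanes_asthma_55plus.csv")  # hypothetical extract

model = smf.logit(
    "ed_visit ~ depression + age + C(education) + income_to_poverty",
    data=df,
).fit()

odds_ratios = np.exp(model.params)
print(odds_ratios["depression"])  # ~2 would match the reported twofold odds
```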
These data make a compelling case for routine screening for depression in older adults with asthma, and the PHQ-9 is a good, simple tool for this purpose. Future studies are needed to determine whether early identification and treatment of comorbid depression in older adults with asthma will improve asthma outcomes, but that is a reasonable hope, Dr. Patel added.
Why is asthma control worse in older adults with depression? In an interview, Dr. Patel said she thinks demographic differences may play a role. The older asthma patients with depression had less education and lower socioeconomic status than those without depression. Also, depression could adversely affect adherence to asthma controller medications, as depression is linked to worse medication adherence.
Dr. Patel reported having no financial conflicts regarding her study.
AT 2016 AAAAI ANNUAL MEETING
Key clinical point: Screening for depression in older adults with asthma has a high yield, with a 49% prevalence found in a nationally representative population.
Major finding: Older adults with asthma and depression were twice as likely as adults with asthma alone to have one or more emergency department visits for asthma in the past year.
Data source: An analysis of cross-sectional data on a nationally representative sample of 402 adults with physician-diagnosed asthma who participated in the National Health and Nutrition Examination Survey during 2007-2012.
Disclosures: The presenter reported having no financial conflicts regarding her study.
EHR Report: How Zika virus reveals the fault in our EHRs
It is always noteworthy when the headlines in the medical and mainstream media appear to be the same.
Typically, this means one of two things: 1) Sensationalism has propelled a minor issue into the common lexicon; or 2) a truly serious issue has grown to the point where the whole world is finally taking notice.
With the recent resurgence of Zika virus, something that initially seemed to be the former has unmistakably developed into the latter, and health care providers are again facing an age-old question: How do we adequately fight an evolving and serious illness in the midst of an ever-changing battlefield?
As has been the case countless times before, the answer to this question really lies in early identification. One might think that the advent of modern technology would make this a much easier proposition, but that has not exactly been the case.
In fact, recent Ebola and Zika outbreaks have actually served to demonstrate a big problem in many modern electronic health records: poor clinical decision support.
In this column, we felt it would be helpful to highlight this shortcoming, and make the suggestion that in the world of EHRs …
Change needs to be faster than Zika
Zika virus is not new (it was first identified in the Zika Forest of Uganda in 1947), and neither is the concept of serious mosquito-borne illness. While the current Zika hot zones are South America, Central America, Mexico, and the Caribbean, case reports indicate the virus is quickly migrating. At the time of this writing, more than 150 travel-associated cases of Zika have been identified in the continental United States, and it is clear that the consequences of undiagnosed Zika in pregnancy can be devastating.
Furthermore, Zika is just the latest of many viruses to threaten the health and welfare of modern civilization (for example, Ebola, swine flu, SARS, and so on), so screening and prevention is far from a novel idea.
Unfortunately, electronic record vendors don’t seem to have gotten the message that the ability to adapt quickly to public health threats should be a core element of any modern EHR.
On the contrary, EHRs seem to be designed for fixed “best practice” workflows, and updates are often slow in coming (typically requiring a major upgrade or “patch”). This renders them fairly unable to react nimbly to change.
This fact became evident to us as we attempted to implement a reminder for staff members to perform a Zika-focused travel history on all patients. We felt it was critical for this reminder to be prominent, be easy to interact with, and appear at the most appropriate time for screening.
Despite multiple attempts, we discovered that our top-ranked, industry-leading EHR was unable to do this seemingly straightforward task, and eventually we reverted to the age-old practice of hanging signs in all of the exam rooms. These encouraged patients to inform their doctor “of worrisome symptoms or recent travel history to affected areas.”
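To make concrete what we were asking of our EHR, here is the kind of rule logic involved, written as a minimal Python sketch. It is a generic illustration, not the configuration language or API of any actual EHR product, and the field names and 30-day staleness window are our own arbitrary choices.

```python
# Generic illustration of the clinical-decision-support rule we wanted:
# prompt staff for a Zika-focused travel history when none is on file.
# Not any vendor's API; field names and the 30-day window are assumptions.
from dataclasses import dataclass
from datetime import date, timedelta

ZIKA_REGIONS = ("South America", "Central America", "Mexico", "the Caribbean")

@dataclass
class Encounter:
    visit_date: date
    last_travel_screen: date | None  # last documented travel history
    pregnant_or_planning: bool

def zika_travel_reminder(enc: Encounter) -> str | None:
    """Return a reminder to display prominently at rooming, or None."""
    stale = (
        enc.last_travel_screen is None
        or enc.visit_date - enc.last_travel_screen > timedelta(days=30)
    )
    if not stale:
        return None
    msg = "Ask about recent travel to Zika-affected areas: " + ", ".join(ZIKA_REGIONS)
    if enc.pregnant_or_planning:
        msg += " (HIGH PRIORITY: pregnancy documented)"
    return msg

print(zika_travel_reminder(Encounter(date(2016, 4, 1), None, True)))
```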
We refuse to accept that a modern electronic health record should be unable to create simple, flexible clinical decision support rules that improve on the efficacy of a paper sign – especially since one of the core requirements of the Meaningful Use (MU) program, for which all EHRs are certified, is clinical decision support!
Unfortunately, the MU guidelines are not specific, so most vendors include a standard set of rules and don’t allow customization. That just isn’t good enough. If Ebola and Zika have taught the health information technology community one thing, it’s that …
It is time for smarter EHRs!
For many people, the notion of artificial intelligence seems to be science fiction, but they don’t realize they are carrying incredible “AI” devices with them everywhere they go. We are, of course, referring to our cell phones, which seem to be getting more intelligent all the time.
If you own an iPhone, you may have noticed it often seems to know where you are about to drive and how long it will take you to get there. This can be a bit creepy at first, until you realize how helpful – and smart – it actually is.
Essentially, our devices are constantly collecting data, reading the patterns of our lives, and learning ways to enhance them. Smartphones have revolutionized how we communicate, work, and play. Why, then, can’t our electronic health record software do the same?
It will surprise exactly none of our readers that the Meaningful Use program has fallen short of its goal of promoting the true benefits of electronic records. Many critics have suggested that the incentive program has faltered because EHRs have made physicians work harder, without helping them work smarter.
Zika virus proves the critics correct. Beyond offering simple reminders like the one described above, EHRs should be able to make intelligent suggestions based on patient data and current practice guidelines.
Some EHRs get it half right. For example, they are “smart” enough to remind clinicians that women of a certain age should have mammograms, but they lack the ability to efficiently update those reminders when the U.S. Preventive Services Task Force revises its screening recommendation (as it did recently).
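One way vendors could fix this is to drive reminders from an editable rules table rather than hard-coded logic, so a task force revision becomes a data edit rather than a software patch. A minimal sketch of the idea follows; the ages and interval shown are illustrative placeholders, not the current recommendations.

```python
# Screening reminders driven by an editable rules table, so a guideline
# update means changing data, not shipping a patch.
# Ages and intervals are illustrative placeholders only.
SCREENING_RULES = {
    "mammogram": {"sex": "F", "min_age": 50, "max_age": 74, "interval_years": 2},
    # When the task force revises its recommendation, edit this row.
}

def due_screenings(sex: str, age: int, years_since: dict[str, float]) -> list[str]:
    """Return the screenings currently due for a patient."""
    due = []
    for name, rule in SCREENING_RULES.items():
        if sex != rule["sex"] or not rule["min_age"] <= age <= rule["max_age"]:
            continue
        if years_since.get(name, float("inf")) >= rule["interval_years"]:
            due.append(name)
    return due

print(due_screenings("F", 58, {"mammogram": 3.0}))  # ['mammogram']
```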
Other EHRs do allow you to customize preventive health reminders but do not place them in a position of prominence, so they are easily overlooked by providers as they care for patients.
Few products seem to get it just right, and it’s time for this to change.
Simply put, as questions in the media loom about how to stop this rising threat, we as frontline health care providers should have the tools – and the decision support – required to provide meaningful answers.
Dr. Notte is a family physician and clinical informaticist for Abington (Pa.) Memorial Hospital. He is a partner in EHR Practice Consultants, a firm that aids physicians in adopting electronic health records. Dr. Skolnik is associate director of the family medicine residency program at Abington Memorial Hospital and professor of family and community medicine at Temple University in Philadelphia.