Cannabis: Doctors tell FDA to get out of the weeds
The Food and Drug Administration held its first-ever public hearing about products containing cannabis or cannabis-derived compounds – and it got an earful.
Hearing from over 100 individuals, the all-staff FDA panel was asked repeatedly to take the lead in bringing order to a confused morass of state and local cannabis regulations. The regulatory landscape currently contains many voids that slow research and put consumers at risk, many witnesses testified.
The federal Farm Bill of 2018 legalized the cultivation of hemp – cannabis with very low delta-9-tetrahydrocannabinol (THC) content – with regulatory restrictions.
However, the Farm Bill did not legalize low-THC cannabis products, said FDA Acting Commissioner Norman Sharpless, MD. The agency has concluded that both THC and cannabidiol (CBD) are drugs – not dietary supplements – and any exception to these provisions “would be new terrain for the FDA,” he said.
And although restrictions on CBD sales have generally not been enforced, “under current law, CBD and THC cannot lawfully be added to a food or marketed as a dietary supplement,” said Dr. Sharpless.
Though the FDA could choose to carve out regulatory exceptions, it has not yet done so.
Stakeholders who gave testimony included not just physicians, scientists, consumers, and advocates, but also growers, manufacturers, distributors, and retailers – as well as the legal firms that represent these interests.
Broadly, physicians and scientists encouraged the FDA to move forward with classifying CBD and most CBD-containing products as drugs, rather than dietary supplements. In general, the opposite approach was promoted by agriculture and manufacturing representatives who testified.
However, all were united in asking the FDA for clarity – and alacrity.
Again and again, speakers asked the FDA to move posthaste in tidying up the current clutter of regulations. Ryan Vandrey, PhD, of Johns Hopkins University, Baltimore, explained that today, “Hemp-derived CBD is unscheduled, Epidiolex is Schedule V, and synthetic CBD is Schedule I in the DEA’s current framework.”
Kevin Chapman, MD, of Children’s Hospital Colorado, representing the American Epilepsy Society, called for regulation of CBD as a drug and an accelerated clinical trial path. He noted that many families of children with epilepsy other than Lennox-Gastaut or Dravet syndrome (the only approved Epidiolex indications) are dosing other CBD products, “making it up as they go along.”
Jacqueline French, MD, chief scientific officer of the Epilepsy Foundation, agreed that many families of children with epilepsy are doing their best to find consistent and unadulterated CBD products. She said her worry is that “abrupt withdrawal of CBD from the market could lead to seizure worsening, injury, or even death for patients who now rely on non-pharmaceutical CBD for seizure control.”
Dr. French and Dr. Chapman each made the point that without insurance reimbursement, Epidiolex costs families over $30,000 annually, while CBD is a small fraction of that – as little as $1,000, said Dr. Chapman.
The lack of standards for CBD preparations means the content of oils, tinctures, and e-cigarette products can vary widely. Michelle Peace, PhD, a forensic scientist at Virginia Commonwealth University, Richmond, receives funding from the U.S. Department of Justice to study licit and illicit drug use with e-cigarettes. She and her colleagues have found dextromethorphan and the potent synthetic cannabinoid 5F-ADB in vaping supplies advertised as containing only CBD.
In a recent investigation prompted by multiple consumer reports of hallucinations after vaping CBD-labeled products, “17 of 18 samples contained a synthetic cannabinoid. Clinics will not find these kinds of drugs when they just do drug testing,” Dr. Peace said.
Another sampling of CBD products available from retail outlets showed that claims often bore little relation to cannabinoid content, said Bill Gurley, PhD, co-director of the Center for Dietary Supplement Research at the University of Arkansas, Little Rock. Several products that claimed to contain CBD actually had none at all, while others had much more CBD than the labeled amount – 228 times more, in one instance. One tested product was actually 45% THC, with little CBD content.
The potential for CBD to interact with many other drugs is cause for concern, noted several presenters. Barry Gidal, PharmD, director of pharmacy practice at the University of Wisconsin, Madison, said that “CBD is a complicated molecule. It has a complicated biotransformation pathway” through at least 9 enzymes within the hepatic cytochrome P450 system.
“It wasn’t until the Epidiolex development program that we began to understand the clinical ramifications of this…. The effect of CBD on other drugs may go beyond the anti-seizure drugs that have been studied so far,” said Dr. Gidal. He pointed to recently published case reports of potential CBD-drug interactions: one described an elevated international normalized ratio in a patient taking warfarin who used CBD, and another described elevated tacrolimus levels in a patient using CBD.
The way forward
A variety of regulatory pathways were proposed at the hearing. To prevent adulteration and contamination, many advocated standardized good manufacturing practices (GMPs); product reporting, identification, or registration; and a centralized registry for reporting adverse events.
Patient advocates, physicians, and scientists called for an easing of access to cannabis for medical research. Currently, cannabis, still classified as a Schedule I substance by the Drug Enforcement Administration, is only legally available for this purpose through a supply maintained by the National Institute on Drug Abuse. A limited number of strains are available, and access requires a lengthy approval process.
Most discussion centered around CBD, though some presenters asked for smoother sailing for THC research as well, particularly as a potential adjunct or alternative to opioids for chronic pain. Cannabidiol has generally been recognized as non-psychoactive, and the FDA assigned it a very low probability of causing dependence or addiction problems in its own review of human data.
However, in his opening remarks, Dr. Sharpless warned that this fact does not make CBD a benign substance, and many questions remain unanswered.
“How much CBD is safe to consume in a day? What if someone applies a topical CBD lotion, consumes a CBD beverage or candy, and also consumes some CBD oil? How much is too much? How will it interact with other drugs the person might be taking? What if she’s pregnant? What if children access CBD products like gummy edibles? What happens when someone chronically uses CBD for prolonged periods? These and many other questions represent important and significant gaps in our knowledge,” said Dr. Sharpless.
The FDA has established a public docket where the public may submit comments and documents until July 2, 2019.
FROM AN FDA PUBLIC HEARING
Artificial intelligence advances optical biopsy
SAN DIEGO – Artificial intelligence is improving the accuracy of optical biopsies, a development that may ultimately avoid the need for tissue biopsies of many low-risk colonic polyps, Michael Byrne, MD, said at the annual Digestive Disease Week.
Dr. Byrne, chief executive officer of Satisfai Health, founder of ai4gi, and gastroenterologist at Vancouver General Hospital; Nicolas Guizard, medical imaging researcher at Imagia; and their colleagues at ai4gi developed a “full clinical workflow” for detecting colonic polyps and performing optical biopsies of the polyps.
Using narrow-band imaging (NBI) enhanced with artificial intelligence, the system reviewed 21,804 colonoscopy frames and achieved a “near-perfect” diagnostic accuracy of 99.9%. In an assessment of colonoscopy videos that included 125 polyps, the system had 95.9% sensitivity, 91.6% specificity, and a negative predictive value of 93.6%, Dr. Byrne said.
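For readers less familiar with these figures, the short sketch below (with invented confusion-matrix counts, not data from this study) illustrates how sensitivity, specificity, and negative predictive value are computed for a binary optical-biopsy call.

```python
# Minimal sketch of the diagnostic metrics quoted above. The counts are
# invented for illustration only; they are not data from this study.

def diagnostic_metrics(tp: int, fp: int, tn: int, fn: int) -> dict:
    """Sensitivity, specificity, and negative predictive value from a 2x2 table."""
    return {
        "sensitivity": tp / (tp + fn),  # true positives among all true adenomas
        "specificity": tn / (tn + fp),  # true negatives among all benign polyps
        "npv": tn / (tn + fn),          # correct negatives among all negative calls
    }

# Hypothetical example: 90 adenomas called correctly, 10 missed,
# 80 benign polyps called correctly, 5 false alarms.
print(diagnostic_metrics(tp=90, fp=5, tn=80, fn=10))
```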
The system’s decision-making is rapid, with a typical reaction time of 360 milliseconds, and it was able to make diagnostic inferences at a rate of 26 milliseconds per frame.
With exposure to more learning experiences, the artificial intelligence system improved and committed to a prediction for 97.6% of the polyps it visualized. Dr. Byrne said this result represented a 12.8% improvement from previously published data on the model’s performance.
Dr. Byrne and his colleagues found the system had a tracking accuracy of 92.8%, meaning that this percentage of polyps was both correctly detected and assigned to a unique identifier for follow-up of the site of each excised polyp over time. The interface worked even when multiple polyps were seen on the same screen.
In a video interview, Dr. Byrne discussed the implications for gastroenterology and plans for a clinical trial for rigorous testing of the model.
ai4gi is developing the AI colonoscopy technology. Dr. Byrne is founder of the ai4gi joint venture, which holds a technology codevelopment agreement with Olympus US.
REPORTING FROM DDW 2019
Coffee, tea, and soda all up GERD risk
SAN DIEGO – Coffee, tea, and soda were each associated with an increased risk of gastroesophageal reflux disease (GERD) in a large prospective cohort study presented at the annual Digestive Disease Week.
In an interview following the oral presentation, Raaj S. Mehta, MD, said that patients in his primary care panel at Massachusetts General Hospital, Boston, where he’s a senior resident, frequently came to him with GERD. In addition to questions about diet, patients frequently wanted to know which beverages might provoke or exacerbate their GERD.
In trying to help his patients, Dr. Mehta said he realized that there wasn’t a prospective evidence base to answer their questions about beverages and GERD, so he and his colleagues used data from the Nurses’ Health Study II (NHS II), a prospective cohort study, to look at the association between various beverages and the incidence of GERD.
“What’s exciting is that we were able to find that coffee, tea, and soda – all three – increase your risk for gastroesophageal reflux disease,” Dr. Mehta said in a video interview. “At the highest quintile level, so looking at people who consume six or more cups per day, you’re looking at maybe a 25%-35% increase in risk of reflux disease.”
There was a dose-response relationship as well: “You do see a slight increase as you go from one cup, to two, to three, and so on, all the way up to six cups” of the offending beverages, said Dr. Mehta.
Overall, the risk for GERD rose from 1.17 to 1.34 with coffee consumption as servings per day increased from less than one to six or more (P for trend less than .0001). Tea consumption was associated with increased GERD risk ranging from 1.08 to 1.26 as consumption rose (P for trend .001). For soda, the increased risk went from 1.12 at less than one serving daily, to 1.41 at four to five servings daily, and then fell to 1.29 at six or more daily servings (P for trend less than .0001).
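As a rough illustration of what a dose-response tabulation like this involves, the sketch below computes crude rate ratios by intake category from invented counts; the actual analysis used multivariable-adjusted models on the NHS II data, so this is only a simplified illustration.

```python
# Simplified illustration of a dose-response tabulation: crude incidence rate
# ratios by beverage-intake category, relative to the lowest category.
# The counts and person-years are invented; the study itself reported
# multivariable-adjusted estimates, not crude ratios like these.

categories   = ["<1/day", "1/day", "2-3/day", "4-5/day", ">=6/day"]
cases        = [120, 140, 160, 170, 180]                   # hypothetical incident GERD cases
person_years = [50_000, 48_000, 46_000, 44_000, 42_000]    # hypothetical follow-up time

rates = [c / py for c, py in zip(cases, person_years)]
reference = rates[0]
for category, rate in zip(categories, rates):
    print(f"{category}: crude rate ratio {rate / reference:.2f}")
```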
Whether the beverages were caffeinated or not, said Dr. Mehta, only made a “minimal difference” in GERD risk.
“In contrast, we didn’t see an association for beverages like water, juice, and milk,” he said – reassuring findings in light of fruit juice’s anecdotal status as a GERD culprit.
The NHS II collected data every 2 years from 48,308 female nurses aged 42-62 years at the beginning of the study. Every 4 years dietary information was collected, and on the opposite 4-year cycle, participants answered questions about GERD. Medication use, including the incident use of proton pump inhibitors, was collected every 2 years.
Patients with baseline GERD or use of PPIs or H2 receptor antagonists were excluded from participation.
The quantity and type of beverages were assessed by food frequency questionnaires; other demographic, dietary, and medication variables were also gathered and used to adjust the statistical analysis.
A substitution analysis answered the “what-if” question of the effect of substituting two glasses of plain water daily for either coffee, tea, or soda. Dr. Mehta and colleagues saw a modest reduction in risk for GERD with this strategy.
In addition to the prospective nature of the study (abstract 514, doi: 10.1016/S0016-5085(19)37044-1), the large sample size, high follow-up rates, and well-validated dietary data were all strengths, said Dr. Mehta. However, the study’s population is relatively homogeneous, and residual confounding couldn’t be excluded. Also, GERD was defined by self-report, though participants were asked to respond to clear, validated criteria.
For his part, Dr. Mehta is glad to have a clear answer to a common clinic question. “I think that this is one additional thing that I can recommend as a primary care provider to my patients when they come into my office,” he said.
Dr. Mehta reported no conflicts of interest.
REPORTING FROM DDW 2019
Early cholecystectomy prevents recurrent biliary events
SAN DIEGO – In a retrospective study of 234 patients admitted for gallstone pancreatitis, almost 90% of recurrent biliary events occurred in patients who did not receive a cholecystectomy within 60 days of hospital discharge. The overall rate of recurrence was 19%, and over half of patients (59%) did not receive a cholecystectomy during their index hospitalization.
Additionally, none of the recurrent biliary events occurred in those patients who did receive a cholecystectomy during the index hospitalization or within the first 30 days after discharge. “It really is the case that, ‘if you snooze, you lose,’ ” said Vijay Dalapathi, MD, presenting the findings during an oral presentation at the annual Digestive Disease Week.
Dr. Dalapathi and colleagues had observed that cholecystectomy during an index hospitalization for mild biliary pancreatitis was a far from universal practice, despite guidelines recommending early cholecystectomy.
To delve further into practice patterns, Dr. Dalapathi, first author Mohammed Ullah, MD, and their coauthors at the University of Rochester (N.Y.) conducted a single-site retrospective study of patients who were admitted with gallstone pancreatitis over a 5-year period ending December 2017. Dr. Dalapathi and Dr. Ullah are both second-year gastroenterology fellows.
The study had twin primary outcome measures: cholecystectomy rates performed during an index hospitalization for gallstone pancreatitis and recurrent biliary events after hospitalization. Adult patients were included if they had a diagnosis of acute gallstone pancreatitis, with or without recurrent cholangitis, choledocholithiasis, or acute cholecystitis. Pediatric patients and those with prior cholecystectomy were excluded.
A total of 234 patients were included in the study. Their mean age was 58.3 years, and patients were mostly female (57.3%) and white (91.5%). Mean body mass index was 29.1 kg/m2. A total of 175 patients (74.8%) had endoscopic retrograde cholangiopancreatography.
Out of the entire cohort of patients, 138 (59%) did not have a cholecystectomy during the index hospitalization. Among the patients who did not receive a cholecystectomy, 33 (24%) were deemed unsuitable candidates for the procedure, either because they were critically ill or because they were poor candidates for surgery for other reasons. No reason was provided for the nonperformance of cholecystectomy for an additional 28 patients (20%).
The remaining 75 patients (54%) were deferred to outpatient management. Looking at this subgroup of patients, Dr. Dalapathi and his coinvestigators tracked the amount of time that passed before cholecystectomy.
The researchers found that 19 patients (25%) had not had a cholecystectomy within the study period. A total of 21 patients (28%) had the procedure more than 60 days after hospitalization, and another 23 (31%) had it between 30 and 60 days after hospitalization. Just 12 patients (16%) in this subgroup had their cholecystectomy within 30 days of hospitalization.
Among patients who were discharged without a cholecystectomy, Dr. Dalapathi and his coauthors found 26 recurrent biliary events (19%): 15 were gallstone pancreatitis and 10 were cholecystitis; 1 patient developed cholangitis.
The crux of the study’s findings came when the investigators looked at the association between recurrent events and cholecystectomy timing. They found no recurrent biliary events among those who received cholecystectomy while hospitalized or within the first 30 days after discharge. Of the 26 events, 3 (12%) occurred in those whose cholecystectomies came 30-60 days after discharge. The remaining 23 events (88%) were seen in those receiving a cholecystectomy more than 60 days after discharge, or not at all.
Guidelines from the American Gastroenterological Association, the Society of American Gastrointestinal and Endoscopic Surgeons, and the American College of Gastroenterology all recommend early cholecystectomy after mild acute gallstone pancreatitis, said Dr. Dalapathi.
However, two separate systematic reviews including a total of 22 studies and over 3,000 patients showed that about half (48% and 51%) of patients admitted with mild acute biliary pancreatitis received a cholecystectomy during the index hospitalization or within 14 days of the hospitalization.
Further, he said, previous work had shown recurrent biliary event rates approaching 20% for patients whose biliary pancreatitis bout was not followed by cholecystectomy, a figure in line with the rate seen in the present study.
“Cholecystectomy should be performed during index hospitalization or as soon as possible within 30 days of mild biliary pancreatitis to minimize risk of recurrent biliary events,” said Dr. Dalapathi.
The authors reported no outside sources of funding and no conflicts of interest.
SOURCE: Ullah M. et al. DDW 2019, Abstract 24.
REPORTING FROM DDW 2019
Low-dose CT has a place in spondyloarthritis imaging toolbox
MADISON, WISC. – Low-dose CT deserves a place in the imaging toolbox for spondyloarthritis, said Robert Lambert, MD, speaking at the annual meeting of the Spondyloarthritis Research and Treatment Network (SPARTAN).
“Low-dose CT of the SI [sacroiliac] joints is probably underutilized,” said Dr. Lambert, chair of the department of radiology and diagnostic imaging at the University of Alberta, Edmonton. “Subtle bony changes are demonstrated very well, and it can be an excellent test for resolving equivocal findings on x-ray or MRI.”
An important first step in making imaging decisions is to put concerns about radiation exposure in context, said Dr. Lambert. “Today, almost all CT effective exposure doses are considered low risk.”
Older studies have shown that a conventional two-view chest radiograph delivers a dose of 0.1 mSv – the equivalent of 10 days of background radiation – whereas the highest radiation dose is delivered by a CT scan of the abdomen and pelvis with and without contrast. This examination delivers an effective dose of 20 mSv, the equivalent of 7 years of background radiation. Dr. Lambert pointed out what the “moderate” additional lifetime risk of malignancy – 1:500 – associated with this scan looks like in real-world numbers: “So the lifetime risk of cancer would increase from 20% to 20.2%.”
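As a quick sanity check on those equivalences, the short sketch below back-calculates the average background dose each comparison implies (both land near the commonly cited ~3 mSv per year, a figure not stated in the talk) and verifies the 1-in-500 arithmetic.

```python
# Rough arithmetic behind the exposure comparisons above. Each equivalence
# implies an assumed average background dose, printed here as a sanity check.
chest_xray_msv = 0.1        # two-view chest radiograph, quoted as ~10 days of background
ct_abd_pelvis_msv = 20.0    # CT abdomen/pelvis with and without contrast, quoted as ~7 years

print(chest_xray_msv / (10 / 365))   # implied background dose: ~3.7 mSv/year
print(ct_abd_pelvis_msv / 7)         # implied background dose: ~2.9 mSv/year

# A 1-in-500 additional lifetime cancer risk on top of a ~20% baseline risk:
print(0.20 + 1 / 500)                # 0.202, i.e., 20% rising to 20.2%
```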
Recently, measurements of effective doses delivered in low-dose CT (ldCT) have shown that “most doses are significantly lower than previously quoted,” said Dr. Lambert. For example, ldCT of the SI joints delivers just 0.42 mSv, a radiation dose that’s in the same minimal risk category as a chest radiograph. In fact, for patients with high body mass, the radiation dose from ldCT of the SI joints can be less than that from a conventional radiograph.
“Could low-dose CT of the spine better detect new bone formation, compared to x-ray?” Dr. Lambert asked. A recent study attempted to answer the question, looking at 40 patients with ankylosing spondylitis who received ldCT at baseline and 2 years later (Ann Rheum Dis. 2018;77:371-7). In developing a CT syndesmophyte score (CTSS), two independent readers, blinded to the time order in which images were obtained, assessed vertebral syndesmophytes in the coronal and sagittal planes for each patient. The conclusion was that “new bone formation in the spine of patients with ankylosing spondylitis can be assessed reliably,” Dr. Lambert said.
A related study directly compared the new CTSS system with the modified Stoke Ankylosing Spondylitis Spine Score, used for conventional radiographs. Both studies used data from the Sensitive Imaging in Ankylosing Spondylitis cohort.
In this latter study, whole spine ldCT tracked progression better than conventional radiographs because it detected more new and growing syndesmophytes, Dr. Lambert said. One important reason for this was that conventional radiography only has utility in the cervical and lumbar spine and the pelvis, while most progression was seen in the thoracic spine with ldCT (Ann Rheum Dis. 2018;77:293-9).
The radiation dose for ldCT of the spine – approximately 4 mSv – is about 10 times that for ldCT of the SI joints, but still one-half to three-quarters of the dose for a whole-spine CT, Dr. Lambert said. Put another way, the ldCT whole-spine dose is nearly equivalent to the dose for the three radiographic studies required to image the cervical, thoracic, and lumbar spine.
Another imaging approach using CT zooms in on the thoracolumbar spine, imaging vertebrae T10-L4. Through sophisticated computational reconstruction techniques, the researchers were able to quantify syndesmophyte height circumferentially around each vertebra (J Rheumatol. 2015;42[3]:472-8).
The study, which imaged 33 patients at baseline and then at year 1 and year 2, found that the circumferential syndesmophyte height correlated well with spinal flexibility. Variation between two scans performed on the same day was low, at 0.893% per patient. Despite these advantages of high reliability and good sensitivity to change, one consideration for clinical use is the radiation dose, estimated at about 8 mSv, Dr. Lambert noted.
Though MRI is a keystone for diagnosis and management of spondyloarthritis, Dr. Lambert pointed out that it’s more expensive than CT and still not routinely available everywhere. He also noted that reimbursement and prior authorizations may be easier to obtain for CT.
“Low-dose CT has tremendous research potential, especially in the thoracic spine,” said Dr. Lambert. “But it’s not ready for routine clinical use. First, the dose is not trivial, at about 4 mSv.” Also, it’s time consuming to interpret and not all CT scanners are compatible with ldCT techniques. “Lower dose can mean lower imaging quality,” and syndesmophytes can be harder to detect in larger individuals.
Dr. Lambert reported no relevant conflicts of interest.
MADISON, WISC. – said Robert Lambert, MD, speaking at the annual meeting of the Spondyloarthritis Research and Treatment Network (SPARTAN).
“Low-dose CT of the SI [sacroiliac] joints is probably underutilized,” said Dr. Lambert, chair of the department of radiology and diagnostic imaging at the University of Alberta, Edmonton. “Subtle bony changes are demonstrated very well, and it can be an excellent test for resolving equivocal findings on x-ray or MRI.”
An important first step in making imaging decisions is to put concerns about radiation exposure in context, said Dr. Lambert. “Today, almost all CT effective exposure doses are considered low risk.”
Older studies have shown that a conventional two-view chest radiograph delivers a dose of 0.1 mSv – the equivalent of 10 days of background radiation – whereas the highest radiation dose is delivered by a CT scan of the abdomen and pelvis with and without contrast. This examination delivers an effective dose of 20 mSv, the equivalent of 7 years of background radiation. Dr. Lambert pointed out what the “moderate” additional lifetime risk of malignancy – 1:500 – associated with this scan looks like in real-world numbers: “So the lifetime risk of cancer would increase from 20% to 20.2%.”
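For readers who want to check the arithmetic behind these comparisons, the back-of-the-envelope sketch below reproduces the figures quoted above. It assumes roughly 3 mSv of natural background radiation per year – a commonly cited average that varies by location – and that assumption is ours for illustration, not a figure from Dr. Lambert’s talk.

```python
# Back-of-the-envelope check of the dose figures quoted above.
# Assumes roughly 3 mSv/year of natural background radiation (a commonly
# cited average; the true figure varies by location).
BACKGROUND_MSV_PER_YEAR = 3.0

def background_equivalent_days(dose_msv: float) -> float:
    """Days of background radiation delivering the same effective dose."""
    return dose_msv / (BACKGROUND_MSV_PER_YEAR / 365)

print(round(background_equivalent_days(0.1)))           # chest radiograph: ~12 days ("about 10 days")
print(round(background_equivalent_days(20) / 365, 1))   # CT abdomen/pelvis: ~6.7 years ("about 7 years")

# A "moderate" 1-in-500 additional lifetime cancer risk on a ~20% baseline:
baseline, additional = 0.20, 1 / 500
print(f"{baseline:.1%} -> {baseline + additional:.1%}")  # 20.0% -> 20.2%
```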
Recently, measurements of effective doses delivered in low-dose CT (ldCT) have shown that “most doses are significantly lower than previously quoted,” said Dr. Lambert. For example, ldCT of the SI joints delivers just 0.42 mSv, a radiation dose that’s in the same minimal risk category as a chest radiograph. In fact, for patients with high body mass, the radiation dose from ldCT of the SI joints can be less than that from a conventional radiograph.
“Could low-dose CT of the spine better detect new bone formation, compared to x-ray?” Dr. Lambert asked. A recent study attempted to answer the question, looking at 40 patients with ankylosing spondylitis who received ldCT at baseline and 2 years later (Ann Rheum Dis. 2018;77:371-7). In developing a CT syndesmophyte score (CTSS), two independent readers, blinded to the time order in which images were obtained, assessed vertebral syndesmophytes in the coronal and sagittal planes for each patient. The conclusion was that “new bone formation in the spine of patients with ankylosing spondylitis can be assessed reliably,” Dr. Lambert said.
A related study directly compared the new CTSS system with the modified Stoke Ankylosing Spondylitis Spine Score, used for conventional radiographs. Both studies used data from the Sensitive Imaging in Ankylosing Spondylitis cohort.
In this latter study, whole spine ldCT tracked progression better than conventional radiographs because it detected more new and growing syndesmophytes, Dr. Lambert said. One important reason for this was that conventional radiography only has utility in the cervical and lumbar spine and the pelvis, while most progression was seen in the thoracic spine with ldCT (Ann Rheum Dis. 2018;77:293-9).
The radiation dose for ldCT of the spine – approximately 4 mSv – is about 10 times that for ldCT of the SI joints, but still one-half to three-quarters of the dose for a whole-spine CT, Dr. Lambert said. Put another way, the ldCT whole-spine dose is nearly equivalent to the dose for the three radiographic studies required to image the cervical, thoracic, and lumbar spine.
Another imaging approach using CT zooms in on the thoracolumbar spine, imaging vertebrae T10-L4. Through sophisticated computational reconstruction techniques, the researchers were able to quantify syndesmophyte height circumferentially around each vertebra (J Rheumatol. 2015;42[3]:472-8).
The study, which imaged 33 patients at baseline and again at years 1 and 2, found that circumferential syndesmophyte height correlated well with spinal flexibility. Variation between two scans performed on the same day was low, at 0.893% per patient. Despite the advantages of high reliability and good sensitivity to change, one drawback for clinical use is the radiation dose, estimated at about 8 mSv, Dr. Lambert noted.
Though MRI is a keystone for diagnosis and management of spondyloarthritis, Dr. Lambert pointed out that it’s more expensive than CT and still not routinely available everywhere. He also noted that reimbursement and prior authorizations may be easier to obtain for CT.
“Low-dose CT has tremendous research potential, especially in the thoracic spine,” said Dr. Lambert. “But it’s not ready for routine clinical use. First, the dose is not trivial, at about 4 mSv.” Also, it’s time consuming to interpret and not all CT scanners are compatible with ldCT techniques. “Lower dose can mean lower imaging quality,” and syndesmophytes can be harder to detect in larger individuals.
Dr. Lambert reported no relevant conflicts of interest.
EXPERT ANALYSIS FROM SPARTAN 2019
In duodenal neuroendocrine tumors, resection technique matters
SAN DIEGO – according to a study presented at the annual Digestive Disease Week.
In a retrospective case series of 20 patients, local recurrence was seen primarily in patients who had cold forceps, rather than deeper, excision techniques. However, most patients who had cold forceps resections also remained recurrence-free, said Jonathan Ragheb, MD, a resident physician at the Cleveland Clinic.
Duodenal neuroendocrine tumors are becoming increasingly prevalent, so Dr. Ragheb and colleagues were interested in “seeing what we should do with them when we encounter them in clinical practice – whether it be surgery or endoscopic intervention,” he said.
In an interview, Dr. Ragheb said that he and his colleagues structured the study to answer the question: “What is the impact of the margin status on the recurrence of the tumor?” This relationship is important in guiding neuroendocrine tumor (NET) management, he said. The technique used for NET removal may also have effects on recurrence rates, so Dr. Ragheb and his collaborators were also interested in answering that question.
The investigators looked at patients at two facilities with a histopathologic diagnosis of duodenal NET who had endoscopic tumor resection during 2004-2018. They excluded patients who had cold forceps endoscopic resection (ER) and clear margins, patients who had further surgical therapy, and those who were lost to endoscopic follow-up.
Assessment of resection margin status was performed independently by pathologists at each study center.
“We found that people with clear margins tend not to have any recurrence, and this is over the course of a year to a year and a half of follow-up,” said Dr. Ragheb, adding, “Those patients who did have some positive margins – whether lateral margins or vertical margins – the majority of them did not have recurrence over that time period.” However, 4 of the patients in the 20-patient cohort did have some tumor recurrence, and all of these patients had an incomplete initial resection.
The investigators took a closer look at which resection techniques were most likely to result in clear margins and no recurrences, and they found that deeper techniques were associated with fewer recurrences. These included endoscopic submucosal or mucosal resection and en bloc snare polypectomy; all were associated with fewer recurrences than resections performed with cold forceps biopsy.
In all, 7 patients had clear (R0) margins, while 13 patients had an incomplete (R1) resection from the biopsy. Of the patients who had R1 margins with local recurrence, three had received a cold forceps biopsy. The other recurrence was in a patient who had endoscopic mucosal resection.
“Margin status is not the sole contributor to recurrence rates of these duodenal neuroendocrine tumors,” said Dr. Ragheb, noting that previous work has identified other possible factors, including tumor grade and biology, that can affect recurrence.
Knowledge gaps still exist regarding best practices for biopsy and resection of duodenal NETs, acknowledged Dr. Ragheb. The present study only followed patients for about a year and a half, so longer-term recurrence patterns and their relationship with various resection techniques aren’t known.
“Larger studies considering tumor grading and ER [endoscopic resection] technique are needed to fully elucidate the risk of local recurrences after ER,” wrote Dr. Ragheb and colleagues.
Dr. Ragheb reported no outside sources of funding and no conflicts of interest.
REPORTING FROM DDW 2019
Immunostaining boosts pathologists’ accuracy in Barrett’s esophagus
SAN DIEGO – Years of experience and an academic medical center affiliation predicted the accuracy of pathologists reviewing biopsies from patients with Barrett’s esophagus, according to the results of a multinational study.
Those with 5 or more years of experience were less likely to make major diagnostic errors in reviewing Barrett’s esophagus biopsies (odds ratio [OR], 0.48; 95% confidence interval [CI], 0.31-0.74). Pathologists who worked in nonacademic settings were more likely to make a major diagnostic error (OR, 1.76; 95% CI, 1.15-2.69) when reviewing hematoxylin and eosin-stained slides alone, but the addition of p53 immunostaining greatly improved accuracy.
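The odds ratios above come from the study itself. As a rough illustration of how such an estimate is typically derived, the sketch below computes an odds ratio and 95% confidence interval from a 2×2 table using the standard log-odds formula; the counts are invented for illustration and do not reproduce the study data or its modeling.

```python
import math

# Hypothetical 2x2 table (counts invented for illustration; not the study data):
#                          major error   no major error
# >= 5 years experience         a               b
# <  5 years experience         c               d
a, b, c, d = 25, 275, 30, 170

odds_ratio = (a / b) / (c / d)                  # odds of a major error, experienced vs. less experienced
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)    # standard error of log(OR)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```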
Current guidelines recommend expert evaluation of Barrett’s esophagus biopsies that show dysplasia, but exact determination of expert review status had been lacking, according to Marnix Jansen, MD, a pathologist at University College London.
“The guidelines say that biopsies with dysplasia need to be reviewed by an expert pathologist, but don’t define what makes an expert pathologist,” Dr. Jansen said in an interview at the annual Digestive Disease Week.
“We wanted to advance the field by for the first time creating objective and quantitative standards” to delineate the characteristics of an expert pathologist in reviewing Barrett’s esophagus tissue samples, said Dr. Jansen. The study’s first author is Myrtle J. van der Wel, MD, of Amsterdam University Medical Center, the Netherlands.
More than 6,000 individual case diagnoses were used in the study, which included pathologists from more than 20 countries. Before the pathologists began reviewing the case set, they answered a questionnaire about training, practice context, years of experience, case volume, and other demographic characteristics.
“We then sent those biopsies around the world to ... 55 pathologists in the U.S., in Europe, Japan, Australia, even some in South America – so really around the whole globe,” explained Dr. Jansen. Biopsies were assessed by each pathologist before and after p53 immunostaining.
“Once we had the final dataset – which is massive, because we had 6,000 case diagnoses within our dataset – we could then regress those variables back onto the consensus data,” providing a first-ever look at “clear predictors of what the pathologist looks like that will score on a par with where the experts are,” said Dr. Jansen.
The results? “You need at least 5 years of experience. On top of that, if you are a pathologist working in a [nonacademic center], you are at a slightly increased risk of making major diagnostic errors,” said Dr. Jansen. However, the analysis convincingly showed that the addition of p53 immunostaining neutralized the risk for these pathologists – a strength of having such a large dataset, he said.
The study also affirmed the safety of digital pathology for expert review, said Dr. Jansen: “One of the reassuring points of our study was that we found that the best concordance was for nondysplastic Barrett’s, and high-grade dysplasia, which really replicates known glass slide characteristics. So we can really say that digital pathology is safe for this application – which is very relevant for pathologists that are taking in cases from outside for expert review.”
Concordance rates for nondysplastic Barrett’s esophagus and high-grade dysplasia were over 70%; for low-grade dysplasia, rates were intermediate at 42%.
Going forward, the study can inform the next iteration of guidelines for pathologist review of Barrett’s dysplasia, said Dr. Jansen. Rather than just recommending expert review, the guidelines can include a quantitative assessment of what’s needed. “You need to have at least 5 years of experience, and if you work in a [community hospital], to use a p53, and that is collectively what amounts to expertise in Barrett’s pathology.”
A follow-up study with a similar design is planned within the United Kingdom, the Netherlands, and the United States. This study, which Dr. Jansen said would enroll hundreds of pathologists, will include an intervention arm that administers a tutorial with the aim of improving concordance scoring.
Dr. Jansen reported no relevant conflicts of interest.
REPORTING FROM DDW 2019
Opioid use associated with common bile duct dilation
SAN DIEGO – Biliary duct dilation in the setting of an intact gallbladder and normal bilirubin levels was more common among those who used opioids, based on the results of a large, retrospective, single-center cohort study.
Patients were included in the study if they had a documented measurement for the diameter of the common bile duct, with no evidence of an obstructive lesion and a normal bilirubin level. The mean common bile duct diameter was significantly higher at 8.67 mm for 867 patients who used opioids, compared with 7.24 mm for 818 similar patients who did not use opioids (P less than .001). The association was strongest among opioid users with an intact gallbladder.
“Opiate use is associated with biliary dilation in the setting of normal bilirubin,” Monique Barakat, MD, a gastroenterologist at Stanford (Calif.) University, reported at the annual Digestive Disease Week. “Known opiate users with normal LFTs [liver function tests] may not require expensive and potentially risky endoscopic evaluation for biliary dilation.”
Dr. Barakat and senior author Subhas Banerjee, MD, professor of gastroenterology and hepatology at Stanford, decided to examine the relationship between biliary duct dilation and opioid use because previous small clinical studies had suggested a possible association. Along with opioid status, Dr. Barakat and her coauthor also looked at patient age, cholecystectomy status, ethnicity, weight, and height for possible associations with bile duct diameter.
The researchers took a random 20% sample of adults seen for all causes in the ED at Stanford over a 5-year period. Using a health informatic platform based on the electronic medical record, they identified all patients who had received an abdominal CT or MRI. Patients were included in the study if they had a documented measurement for the diameter of the common bile duct, with no evidence of obstructive lesion and a normal bilirubin level.
Compared with 818 patients who did not use opioids, the 867 patients who used opioids had a significantly larger common bile duct diameter. Using 7 mm as the threshold for biliary duct enlargement, 84% of patients who used opioids had an enlarged common bile duct, compared with 27% of nonopioid users (P less than .001), said Dr. Barakat, recipient of an early investigator award for the study.
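As an illustration of the kind of threshold comparison described here – not the investigators’ actual analysis – the sketch below applies a 7-mm cutoff to synthetic duct-diameter data centered on the reported group means and compares the two groups with a chi-square test. The 1.5-mm standard deviation and the test choice are our assumptions.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Synthetic data (not the Stanford dataset): diameters drawn around the
# reported group means; the 1.5-mm standard deviation is an assumption.
rng = np.random.default_rng(0)
opioid = rng.normal(8.67, 1.5, 867)
nonopioid = rng.normal(7.24, 1.5, 818)

THRESHOLD_MM = 7.0  # cutoff for calling the common bile duct "enlarged"
enlarged_opioid = int((opioid > THRESHOLD_MM).sum())
enlarged_nonopioid = int((nonopioid > THRESHOLD_MM).sum())

table = [[enlarged_opioid, opioid.size - enlarged_opioid],
         [enlarged_nonopioid, nonopioid.size - enlarged_nonopioid]]
chi2, p, _, _ = chi2_contingency(table)
print(f"enlarged: {enlarged_opioid / opioid.size:.0%} vs "
      f"{enlarged_nonopioid / nonopioid.size:.0%}, P = {p:.2g}")
```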
“We frequently get referrals for bile duct dilation with concern for more sinister causes of biliary duct dilation – stones, strictures, and malignancy,” said Dr. Barakat. Because of the increase in cross-sectional imaging via CT or MRI, bile duct dilation is being detected at increasingly higher rates.
Dr. Barakat said that about one-third of referrals to the therapeutic endoscopy clinic at Stanford are now for patients with biliary dilation and normal liver function tests. And similar increases are being “seen across all settings – so office, primary care clinic, inpatient, and most markedly, the emergency department. Coupled with this, the population is aging, and patients who present to each of these settings are more likely, if they are older, to undergo cross-sectional imaging.”
Other contributors to higher rates of bile duct dilation include increased rates of obesity and increased prevalence of nonalcoholic steatohepatitis (NASH). About 20% of individuals with NASH will also have abnormal LFTs, she said, and NASH can be the trigger for cross-sectional imaging.
For most of these patients with biliary duct dilation and normal LFTs, no obstructive process was found on endoscopic evaluation.
Although gastroenterology textbooks may say that bile duct diameter increases with age, Dr. Barakat and colleagues didn’t find this to be the case. Among nonopioid users in the study cohort, age did not predict common bile duct diameter. Among the entire cohort, “Advancing age weakly predicts increased common bile duct diameter,” she said, suggesting that factors other than age alone may drive increased bile duct diameter.
Limitations included the retrospective nature of the study, as well as the limitations of information from the electronic medical record. Also, interobserver variability may have come into play, as bile duct diameter measurements were made by multiple radiologists in the course of clinical care.
The study was funded by the National Institutes of Health. Dr. Barakat reported no relevant financial disclosures.
REPORTING FROM DDW 2019
VA system lags in getting DMARDs to veterans with inflammatory arthritis
MADISON, WISC. – Only half of United States veterans with inflammatory arthritis received disease-modifying medication within 90 days of diagnosis if they received care within the Veterans Health Administration, according to a study presented at the annual meeting of the Spondyloarthritis Research and Treatment Network (SPARTAN).
Over the study period, 58.2% of all inflammatory arthritis patients began a disease-modifying antirheumatic drug (DMARD) within 12 months of diagnosis. Rates of DMARD initiation were similar for patients with rheumatoid arthritis (RA, 57.7%) and psoriatic arthritis (PsA, 64.3%), said the first author of the poster presentation, Sogol S. Amjadi, DO, a resident physician at Bingham Memorial Hospital, Blackfoot, Idaho.
However, at 12 months after diagnosis, only 29.6% of ankylosing spondylitis (AS) patients had been started on a DMARD. “The ankylosing spondylitis group really had the lowest DMARD initiation over time,” said Dr. Amjadi in an interview.
The study used diagnosis codes and natural language processing to look for incident cases of the three inflammatory arthritides (IAs) among patients receiving care within the Veterans Health Administration from 2007 through 2015.
In all, 12,118 patients with incident IA were identified. Of these, 9,711 had RA, 1,472 had PsA, and 935 had AS. Patients were mostly (91.3%) male, with a mean age of 63.7 years.
Over the study period, 41.2% of IA patients were dispensed a DMARD within 30 days of diagnosis, and 50% received a DMARD within 90 days of diagnosis. Patients with PsA or RA had similar rates of DMARD prescription within 30 days of diagnosis (about 42% and 43%, respectively).
The investigators discovered in their analysis that another factor in prompt treatment was access to specialty care. “Timely access to a rheumatology provider is likely important for early DMARD treatment,” wrote Dr. Amjadi and her coauthors in the poster accompanying the presentation. Of patients who did receive a DMARD, 82.7% had received rheumatology specialty care before nonbiologic DMARD dispensing, as had 90.0% of patients receiving biologic DMARDs. Over the entire study period, about 10% of all IA patients had biologic DMARD exposure.
There was a trend over time for increased DMARD dispensing, said the investigators. “The percentage of IA patients with DMARD exposure during the 12-month follow-up period increased from 48.8% in 2008 to 66.4% in 2015.”
For AS patients, early DMARD prescribing rates rose from about 20% in 2007 to nearly 30% in 2015. “DMARD treatment rates during the initial 12 months after diagnosis increased between 2007 and 2015, but nontreatment remained common, particularly in patients with AS,” wrote the investigators. “Delays in treatment for inflammatory arthritis are associated with unfavorable outcomes, including impaired quality of life, irreversible joint damage, and disability.”
The authors reported no conflicts of interest and no outside sources of funding.
REPORTING FROM SPARTAN 2019
Key clinical point: Many veterans with newly diagnosed inflammatory arthritis, particularly those with ankylosing spondylitis, do not begin DMARD therapy promptly within the Veterans Health Administration.
Major finding: Overall, 58.2% of inflammatory arthritis patients received a DMARD within the first year of diagnosis.
Study details: Retrospective review of 12,118 incident cases of inflammatory arthritis in the Veterans Health Administration during the period from 2007 through 2015.
Disclosures: The authors reported no conflicts of interest and no outside sources of funding.
Source: Amjadi SS et al. SPARTAN 2019.
Ankylosing spondylitis patients taking COX-2 inhibitors may see fewer cardiovascular events
MADISON, WISC. – Patients with ankylosing spondylitis had a small but significant reduction in risk for cardiovascular events if they were taking cyclooxygenase-2 (COX-2) inhibitors, according to a new systematic review and meta-analysis.
The reduced risk observed in this analysis (risk ratio, 0.48; 95% confidence interval, 0.33-0.70) contrasts with the increased risk for cardiovascular events seen with COX-2 inhibitor use in the general population, said Paras Karmacharya, MBBS, speaking in an interview at the annual meeting of the Spondyloarthritis Research and Treatment Network (SPARTAN). The overall effect was highly statistically significant (P = .0001), and the finding provides “reassuring” data for a population that’s known to have an elevated risk for cardiovascular events, he said.
“[W]e found in the subgroup analysis that COX-2 inhibitors were associated with a reduced risk of cardiovascular events as a whole,” an association also seen when looking just at ischemic stroke, Dr. Karmacharya said. “So that was sort of surprising; in the general population, there are some concerns about using COX-2 inhibitors.”
Looking at data for the nine studies that met criteria for inclusion in the meta-analysis, Dr. Karmacharya, a rheumatology fellow at the Mayo Clinic, Rochester, Minn., and his collaborators calculated risk ratios for a composite outcome of all cardiovascular events (CVE) among individuals with ankylosing spondylitis (AS) taking NSAIDs, compared with AS patients who were not taking NSAIDs. Here, they found a relative risk of 0.94 (95% CI, 0.50-1.75; P = .84).
Next, the investigators calculated a relative risk of 0.78 for the composite CVE outcome just for those taking nonselective NSAIDs (95% CI, 0.44-1.38; P = .40).
Along with NSAIDs, Dr. Karmacharya and his coauthors also looked at the relationship between tumor necrosis factor inhibitors (TNFIs) and cardiovascular events. They found a significantly increased risk for the composite endpoint among AS patients taking TNFIs (RR, 1.60; 95% CI, 1.05-2.41; P = .03), but the comparison was limited to only one study.
In their analysis, the investigators also broke out risk of acute coronary syndrome (ACS)/ischemic heart disease. “The only place where we found some increased risk was ACS and ischemic heart disease, and that was with nonselective NSAIDs,” Dr. Karmacharya said (RR, 1.21; 95% CI, 1.06-1.39; P = .005). No significant changes in relative risk for ACS and ischemic heart disease were seen for the total group of NSAID users, for the subgroups taking COX-2 inhibitors, or for those taking TNFIs.
Finally, the investigators found a relative risk of 0.58 for stroke among the full group of NSAID users and a relative risk of 0.59 for those taking COX-2 inhibitors, but no reduced risk for the subgroup taking nonselective NSAIDs (P = .02, .04, and .37, respectively).
“While NSAIDs are known to be associated with an increased risk of CVE in the general population, whether the anti-inflammatory effects of NSAIDs reduce or modify the CVE risk in AS is controversial,” Dr. Karmacharya and his collaborators wrote. In this context, the meta-analysis provides a useful perspective for rheumatologists who care for AS patients, Dr. Karmacharya said: “I think it’s important, because most of these patients are on NSAIDs long-term.”
However, all of the studies included in the meta-analysis were observational, with no randomized, controlled trials meeting inclusion criteria. Also, some analyses presented in the poster involved as few as two studies, so findings should be interpreted with caution, he added. “We don’t have a lot of studies included in the analysis. ... so we need more data for sure, but I think what data we have so far look reassuring.”
Dr. Karmacharya reported that he had no conflicts of interest, and reported no outside sources of funding.
REPORTING FROM SPARTAN 2019
Key clinical point: Individuals with ankylosing spondylitis (AS) who took cyclooxygenase 2 (COX-2) inhibitors had a reduced risk of cardiovascular events, compared with AS patients who were not using COX-2 inhibitors.
Major finding: Individuals with AS taking COX-2 inhibitors had a risk ratio of 0.48 for cardiovascular events (95% CI, 0.33-0.70; P = .0001).
Study details: Systematic review and meta-analysis of nine observational studies that variably examined the association between NSAID use and tumor necrosis factor inhibitor use and cardiovascular events among individuals with AS.
Disclosures: The authors reported no conflicts of interest and no outside sources of funding.
Source: Karmacharya P et al. SPARTAN 2019.