Gastroenterologists among the most likely to adopt telemedicine

GI telemedicine: A need for proactive efforts to ensure equity
It’s no secret that the COVID-19 pandemic has disrupted medical practice and led to a surge in telemedicine visits. A new report issued by the health care social network Doximity in September predicts that these changes will be permanent, and that the telehealth industry will more than triple from $29 billion at the end of this year to about $106 billion by 2023.

The report, titled “2020 State of Telemedicine,” follows a similar 2019 publication and captures the changes created by the pandemic. “Obviously, telemedicine has been around for many years, but the pandemic around COVID-19 has really changed the game. Something that had been getting gradual adoption really rocketed to the forefront,” said Peter Alperin, MD, an internist in San Francisco and vice president of product at Doximity, in an interview. The report predicts that 20% of medical visits will be conducted through telemedicine by the end of 2020.

Gastroenterology is one of the top specialties to adopt telemedicine, ranking third behind endocrinology and rheumatology, and that should come as no surprise. “Chronic disease patients lend themselves well to telemedicine because they have ongoing relationships with their physicians, so they can be seen more often and it’s more convenient for them. The specialties that take care of patients with those sorts of illnesses were the ones that adopted it the most readily,” said Dr. Alperin.

That’s probably in part because specialists dealing with chronic conditions have been triaging patients with telephone calls for years, making it easier to tell when a patient needs to come in for a physical visit. “It’s a skill you learn, to tell when something is just a little bit different for a patient. It’s really a clinical judgment that has been honed over years of experience,” said Dr. Alperin. The report backs up that idea: It found that the physician age groups that most often adopted telemedicine were those in their 40s, 50s, and 60s.

Telemedicine is popular with patients once they try it, and it can greatly expand a physician’s reach, according to Dr. Alperin. “If you’re a specialist, you can perhaps see patients in areas where that specialty is underrepresented, whether that’s the inner city or a very rural area,” he said. The most important barrier is high-speed internet access, which remains a problem in many areas.

Doximity researchers surveyed more than 2,000 U.S. adults for their opinions on telemedicine, analyzed adoption data from the platform’s own set of telemedicine tools, and compared the results with data from the 2019 report. They also reviewed studies looking at disparities in medicine and patient access to telemedicine.

Telemedicine use among patients grew from 14% before the pandemic to 35% who reported at least one telemedicine visit after COVID-19 hit. A total of 23% said they planned to continue using telemedicine after the pandemic ends, and 27% said they had become more comfortable using it. Among patients, 28% said telemedicine provides the same or greater benefit than an in-person visit; this figure rose to 53% among those with chronic illnesses.

Among physicians, telemedicine adoption rose by 20% between 2015 and 2018, but increased by 38% between 2019 and 2020. The highest percentage of physician telemedicine adopters were in large metro areas and East Coast states, led by Massachusetts, North Carolina, and New Jersey. None of the top 10 adopter states were west of Illinois.

Equity concerns remain: 64.3% of households with annual incomes of $25,000 or lower have access to broadband internet, compared with 93.5% of those with incomes of $50,000 or higher. In nonmetropolitan areas, 78.1% of households have access, compared with 86.7% of metropolitan households. The good news is that many patients prefer cell phone use for telemedicine, and nearly as many Black and Hispanic Americans own cell phones as White Americans. “That has really democratized access,” said Dr. Alperin.

A key to successful telemedicine appointments is to make sure that the patient is prepared, according to Dr. Alperin. Make sure the patient is in a relatively quiet, well-lit place, and that they have thought about the questions they want to ask. It’s possible to replicate some aspects of a physical appointment with the right conditions. “You can visualize how they move their arms and legs; you can see how they’re breathing. You can gain a lot of information by just watching somebody,” said Dr. Alperin. A physician might also spot clues in the patient’s surroundings: “If a patient is asthmatic and you see cats walking all over the place, or a patient is allergic to gluten and they have loaves of bread everywhere,” he added.

A big concern for telemedicine has been reimbursement. In response to the pandemic, the Centers for Medicare & Medicaid Services created a number of waivers to requirements for billing for telemedicine services, and private insurers followed suit. In August, the agency announced it would make some of those waivers permanent, though others such as removal of restrictions on the site of care, eligible providers, and nonrural areas will likely require an act of Congress to enshrine, CMS administrator Seema Verma told reporters at an August press conference.

SOURCE: 2020 State of Telemedicine Report.


Dr. Yuval A. Patel
The COVID-19 pandemic has emphasized the importance of social determinants of health. Historically underserved populations in the United States – particularly African American, Hispanic/Latino, and Native American communities – have been disproportionately affected, suffering higher hospitalization rates and worse morbidity and mortality related to the disease. Telemedicine feels like it should be the great equalizer of access in this time of national and personal stress, a technological solution that aspires to universal reach. However, early lessons from the pandemic tell us that this is not guaranteed. A review of access metrics in the Duke University Liver Clinic during the pandemic found disparities in overall use, and suboptimal use (phone rather than video), among vulnerable populations, including older patients, underserved minorities, and those with Medicaid/Medicare insurance. Though a phone visit is better than no visit, a video visit may be ideal for certain disease states such as cirrhosis, in which exam findings such as jaundice, muscle wasting, and edema/ascites can be evaluated. Our experience underscores disparities in digital literacy and access that are likely at play throughout the country. As telemedicine becomes a staple of chronic GI and liver disease care, health care providers need proactive methods to ensure equitable access. These may include advocating for reduced-cost internet, educational outreach on digital skills, ensuring adequate language interpreter access, and monitoring access metrics.

Yuval A. Patel, MD, MHS, is assistant professor of medicine, division of gastroenterology, Duke University School of Medicine, Durham, N.C. He has no conflicts of interest.


 


Endoscopic screening for gastric cancer is cost effective in Asian Americans

A new model of gastric cancer screening suggests that, for Asian Americans, endoscopic screening alongside colonoscopy and follow-up surveillance of gastric preneoplasia is a cost-effective strategy. Incremental cost-effectiveness ratios (ICERs) were lowest for Chinese, Japanese, and Korean Americans. The model simulated results for asymptomatic 50-year-old subjects.

Gastric cancer risk is highest in Asian Pacific, Latin American, and Eastern European countries; Asia Pacific countries alone account for about half of all new cases. Helicobacter pylori–related gastritis is the strongest known risk factor for intestinal-type noncardia gastric adenocarcinoma (NCGA), the most common gastric cancer, and this chronic inflammation can lead to gastric intestinal metaplasia (GIM). Individuals with GIM have an estimated 0.16% annual risk of NCGA, which makes them good candidates for endoscopic screening that could catch new cancers at an early stage.
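To put that figure in perspective, a constant 0.16% annual risk compounds over time. Assuming the risk is the same and independent each year (a simplification for illustration only), the cumulative risk over n years is 1 − (1 − p)^n, as this back-of-the-envelope sketch shows:

```python
# Illustrative arithmetic only: cumulative NCGA risk for a patient with GIM,
# assuming a constant, independent 0.16% annual risk (a simplification).

annual_risk = 0.0016  # 0.16% annual risk of NCGA with GIM (from the text)

def cumulative_risk(p, years):
    """Probability of at least one event over `years` at annual risk `p`."""
    return 1 - (1 - p) ** years

risk_20y = cumulative_risk(annual_risk, 20)  # ≈ 0.032, i.e. about 3% over 20 years
```

The compounded 20-year figure is slightly less than the naive 20 × 0.16% = 3.2%, because a patient who has already developed cancer is no longer at risk in later years.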

In a previous study (Gastroenterology. 2018 May 17;155[3]:648-60), researchers at Vanderbilt University Medical Center in Nashville, Tenn., at Boston University School of Medicine, and at the University of Pennsylvania in Philadelphia showed that, in asymptomatic 50-year-old Asian Americans, Hispanic patients, and non-Hispanic Black patients, performing a single esophagogastroduodenoscopy (EGD) concomitantly with a colonoscopy, followed by screening EGDs if indicated (such as for a GIM diagnosis), is a cost-effective strategy. They found ongoing screening was not cost effective if the original results were normal.

In the new study, published in Clinical Gastroenterology and Hepatology, the researchers followed up this finding with an attempt to tease out the cost-effectiveness of screening in different subgroups, as well as by sex. They built a Markov decision model focusing on the six most common Asian groups in the United States: Chinese, Filipino, Southeast Asian, Vietnamese, Korean, and Japanese Americans.

Model inputs were based on the published literature, and the outputs were compared with Surveillance, Epidemiology, and End Results (SEER) data for disaggregated Asian Americans between 2001 and 2014 and, separately, with the California Cancer Registry (2011-2015). The model produced a good fit to the epidemiological data.

The model then compared the cost-effectiveness of three hypothetical screening strategies in asymptomatic 50-year-old Asian Americans: a one-time EGD with biopsies performed at the time of colonoscopy for colorectal cancer (CRC) screening, followed by EGDs every 3 years if GIM was detected (or other appropriate management of higher-grade pathology); EGD with biopsy at the time of CRC screening colonoscopy, followed by EGD biennially regardless of initial findings; and no endoscopic screening.
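A Markov decision model of this kind tracks a simulated cohort through health states with fixed annual transition probabilities, accumulating costs and quality-adjusted life-years along the way. The sketch below illustrates only the mechanics of such a cohort model; the states, transition probabilities, and time horizon are invented for illustration and are not the study's actual inputs.

```python
# Minimal Markov cohort sketch: normal mucosa -> GIM -> cancer.
# All numbers here are hypothetical, chosen only to show how a cohort
# model propagates state occupancy year by year.

def run_cohort(years=30, p_normal_to_gim=0.01, p_gim_to_cancer=0.0016):
    """Return the fraction of the cohort in each state after `years` cycles."""
    normal, gim, cancer = 1.0, 0.0, 0.0  # everyone starts with normal mucosa
    for _ in range(years):
        new_cancer = gim * p_gim_to_cancer      # GIM patients who progress
        new_gim = normal * p_normal_to_gim      # normal patients who develop GIM
        cancer += new_cancer
        gim += new_gim - new_cancer
        normal -= new_gim
    return normal, gim, cancer

normal, gim, cancer = run_cohort()
```

A real cost-effectiveness model layers per-state annual costs and utilities onto these occupancy fractions (and adds death states, screening arms, and discounting), but the state-propagation loop is the core of the method.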

The one-time EGD strategy was the most cost effective, regardless of sex, with an ICER of $75,959 per quality-adjusted life-year (QALY) in males and $74,329/QALY in females. The lowest ICER was found for Chinese Americans (males and females, $68,256/QALY), followed by Japanese Americans (males, $69,011/QALY; females, $73,748/QALY) and Korean Americans (males, $70,739/QALY; females, $70,236/QALY). The highest ICERs were among Filipino American males and females ($83,732/QALY), but the strategy was still cost effective at the predetermined willingness-to-pay threshold of $100,000/QALY.
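The ICER itself is simple arithmetic: the extra lifetime cost of a strategy relative to the comparator, divided by the extra QALYs it yields, judged against a willingness-to-pay threshold ($100,000/QALY in this study). The cost and QALY deltas below are hypothetical, chosen only to illustrate the calculation:

```python
# Hedged sketch of an incremental cost-effectiveness ratio (ICER) calculation.
# The deltas are invented for illustration; they are NOT the study's inputs.

def icer(delta_cost, delta_qaly):
    """Incremental cost per quality-adjusted life-year gained."""
    return delta_cost / delta_qaly

WTP_THRESHOLD = 100_000  # willingness-to-pay threshold, $/QALY (as in the study)

extra_cost = 3_800   # hypothetical added lifetime cost of one-time EGD screening ($)
extra_qaly = 0.05    # hypothetical QALYs gained per person screened

ratio = icer(extra_cost, extra_qaly)      # = 76,000 $/QALY
cost_effective = ratio <= WTP_THRESHOLD   # True: below the $100,000/QALY threshold
```

A strategy whose ICER falls below the threshold is deemed cost effective, which is how the $68,256-$83,732/QALY range reported above clears the $100,000/QALY bar.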

In all ethnic groups, the biennial screening strategy produced more harm than good and was costlier.

The authors believe that the strategy could be applied to other ethnic groups that come from countries with populations at higher relative risk of gastric cancer, such as Central and Latin American countries.

Asked to comment on the study, Mimi Tan, MD, an assistant professor of gastroenterology at Baylor College of Medicine in Houston, suggested that the estimates of precancerous lesions used in the Markov model were quite high because they were based on pathology databases. These sources tend to be biased toward symptomatic individuals since these are the patients typically referred for upper endoscopy biopsies. “Therefore, these probabilities may not represent true probability of these precancerous lesions among asymptomatic screening populations,” Dr. Tan said in an interview. She also questioned whether the study represented the true risk in female populations since the literature for women is sparse.

Dr. Tan suggested that a more cost-effective screening strategy might be one-time H. pylori immunoglobulin G testing in Asian Americans. The Houston Consensus Conference on Testing for H. pylori Infection already recommends testing for first-generation immigrants from high-prevalence areas and for Latino and African American racial or ethnic groups (Clin Gastroenterol Hepatol. 2018 Jul;16[7]:992-1002). “Future studies should compare cost-effectiveness of one-time upper endoscopy, which is more costly but able to detect premalignant lesions, to one-time H. pylori testing,” said Dr. Tan.

SOURCE: Shah SC et al. Clin Gastroenterol Hepatol. 2020 July 21. doi: 10.1016/j.cgh.2020.07.031.

Publications
Topics
Sections

 

A new model of gastric cancer screening suggests that, for Asian Americans, endoscopic screening alongside colonoscopy and follow-up surveillance of gastric preneoplasia is a cost-effective strategy. Incremental cost-effectiveness ratios (ICERs) were lowest for Chinese, Japanese, and Korean Americans. The model simulated results for asymptomatic 50-year-old subjects.

Gastric cancer risk is highest in Asian Pacific, Latin American, and Eastern European countries. Asia Pacific countries alone represent about half of all new cases. Helicobacter pylori–related gastritis is the strongest known risk factor for intestinal-type noncardia gastric adenocarcinoma (NCGA), which is the most common gastric cancer, and this chronic inflammation can lead to gastric intestinal metaplasia (GIM). Individuals with GIM have a 0.16% increased annual risk of NCGA, which makes them good candidates for endoscopic screening that could catch new cancers at an early stage.

In a previous study (Gastroenterology. 2018 May 17;155[3]:648-60), researchers at Vanderbilt University Medical Center in Durham, N.C., at Boston University School of Medicine, and at the University of Pennsylvania in Philadelphia showed that, in asymptomatic 50-year-old Asian Americans, Hispanic patients, and non-Hispanic Black patients, performing a single esophagogastroduodenoscopy (EGD) concomitantly with a colonoscopy, followed by screening EGDs if indicated (such as for a GIM diagnosis), is a cost-effective strategy. They found ongoing screening was not cost effective if the original results were normal.

In the new study published in Gastroenterology and Hepatology, the researchers followed up this finding with an attempt to tease out the cost-effectiveness of screening in different subgroups, as well as by sex. They built a Markov decision model focusing on the six most common Asian groups in the United States: Chinese, Filipino, Southeast Asian, Vietnamese, Korean, and Japanese Americans.

Model inputs were based on the published literature, and the outputs were compared with data from the Surveillance, Epidemiology, and End Results (SEER) data for disaggregated Asian Americans between 2001 and 2014 and separately with the California Cancer Registry (2011-2015). The model produced a good fit to the epidemiological data.

The model then compared cost-effectiveness of three hypothetical screening strategies in asymptomatic 50-year-old Asian Americans: one-time upper EGD with biopsies conducted at the time of colonoscopies for colorectal cancer screening, followed by EGDs every 3 years if GIM was detected (or other appropriate management of higher-grade pathology); EGD with biopsy at a colonoscopy for CRC screening followed by EGD biennially regardless of initial findings; and no endoscopy screening.


A new model of gastric cancer screening suggests that, for Asian Americans, endoscopic screening alongside colonoscopy and follow-up surveillance of gastric preneoplasia is a cost-effective strategy. Incremental cost-effectiveness ratios (ICERs) were lowest for Chinese, Japanese, and Korean Americans. The model simulated results for asymptomatic 50-year-old subjects.

Gastric cancer risk is highest in Asian Pacific, Latin American, and Eastern European countries. Asia Pacific countries alone represent about half of all new cases. Helicobacter pylori–related gastritis is the strongest known risk factor for intestinal-type noncardia gastric adenocarcinoma (NCGA), which is the most common gastric cancer, and this chronic inflammation can lead to gastric intestinal metaplasia (GIM). Individuals with GIM have a 0.16% increased annual risk of NCGA, which makes them good candidates for endoscopic screening that could catch new cancers at an early stage.

In a previous study (Gastroenterology. 2018 May 17;155[3]:648-60), researchers at Vanderbilt University Medical Center in Nashville, Tenn., at Boston University School of Medicine, and at the University of Pennsylvania in Philadelphia showed that, in asymptomatic 50-year-old Asian Americans, Hispanic patients, and non-Hispanic Black patients, performing a single esophagogastroduodenoscopy (EGD) concomitantly with a colonoscopy, followed by screening EGDs if indicated (such as for a GIM diagnosis), is a cost-effective strategy. They found ongoing screening was not cost effective if the original results were normal.

In the new study, published in Clinical Gastroenterology and Hepatology, the researchers followed up this finding with an attempt to tease out the cost-effectiveness of screening in different subgroups, as well as by sex. They built a Markov decision model focusing on the six most common Asian groups in the United States: Chinese, Filipino, Southeast Asian, Vietnamese, Korean, and Japanese Americans.

Model inputs were based on the published literature, and the outputs were compared with data from the Surveillance, Epidemiology, and End Results (SEER) data for disaggregated Asian Americans between 2001 and 2014 and separately with the California Cancer Registry (2011-2015). The model produced a good fit to the epidemiological data.

The model then compared the cost-effectiveness of three hypothetical screening strategies in asymptomatic 50-year-old Asian Americans: a one-time EGD with biopsies performed at the time of screening colonoscopy for colorectal cancer, followed by EGD every 3 years if GIM was detected (or other appropriate management of higher-grade pathology); EGD with biopsies at the time of screening colonoscopy, followed by EGD biennially regardless of initial findings; and no endoscopic screening.

The one-time EGD strategy was the most cost-effective, regardless of sex, with an ICER of $75,959 per quality-adjusted life-year (QALY) in males and $74,329/QALY in females. The lowest ICER was found for Chinese Americans (males and females, $68,256/QALY), followed by Japanese Americans (males, $69,011/QALY; females, $73,748/QALY), and Korean Americans (males, $70,739/QALY; females, $70,236/QALY). The highest ICERs were among Filipino American males and females, but the strategy was still cost-effective at the predetermined willingness-to-pay threshold of $100,000 ($83,732/QALY).
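
The arithmetic behind these comparisons is straightforward: an ICER is the extra cost of one strategy over another divided by the extra QALYs it yields, judged against a willingness-to-pay threshold. A minimal sketch, using hypothetical per-person numbers (not the study's model inputs):

```python
# Incremental cost-effectiveness ratio (ICER): extra dollars spent per
# extra quality-adjusted life-year (QALY) gained versus a reference strategy.
def icer(cost_new, qaly_new, cost_ref, qaly_ref):
    return (cost_new - cost_ref) / (qaly_new - qaly_ref)

# Hypothetical inputs: one-time EGD screening vs. no endoscopic screening.
ratio = icer(cost_new=12_000.0, qaly_new=14.15,
             cost_ref=8_000.0, qaly_ref=14.10)
print(f"${ratio:,.0f}/QALY")   # extra dollars per QALY gained
print(ratio <= 100_000)        # within a $100,000/QALY threshold?
```

A strategy is deemed cost-effective when its ICER falls below the chosen threshold, which is why the Filipino American estimate of $83,732/QALY still qualifies at $100,000/QALY.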

In all ethnic groups, the biennial screening strategy produced more harm than good and was costlier.

The authors believe the strategy could extend to other ethnic groups originating from countries with a higher relative risk of gastric cancer, such as those in Central and Latin America.

Asked to comment on the study, Mimi Tan, MD, an assistant professor of gastroenterology at Baylor College of Medicine in Houston, suggested that the estimates of precancerous lesions used in the Markov model were quite high because they were based on pathology databases. These sources tend to be biased toward symptomatic individuals since these are the patients typically referred for upper endoscopy biopsies. “Therefore, these probabilities may not represent true probability of these precancerous lesions among asymptomatic screening populations,” Dr. Tan said in an interview. She also questioned whether the study represented the true risk in female populations since the literature for women is sparse.

Dr. Tan suggested that a more cost-effective screening strategy might be one-time H. pylori immunoglobulin G testing in Asian Americans. The Houston Consensus Conference on Testing for H. pylori Infection already recommends testing for first-generation immigrants from high-prevalence areas and for Latino and African American racial or ethnic groups (Clin Gastroenterol Hepatol. 2018 Jul;16[7]:992-1002). “Future studies should compare cost-effectiveness of one-time upper endoscopy, which is more costly but able to detect premalignant lesions, to one-time H. pylori testing,” said Dr. Tan.

SOURCE: Shah SC et al. Clin Gastroenterol Hepatol. 2020 July 21. doi: 10.1016/j.cgh.2020.07.031.

FROM CLINICAL GASTROENTEROLOGY AND HEPATOLOGY

Blood biomarker may predict Parkinson’s disease progression


A novel biomarker could help identify progression in Parkinson’s disease, distinguish it from other neurodegenerative disorders, and monitor response to treatments. Although the biomarker, neurofilament light chain (NfL), is not especially specific, it is the first blood-based biomarker for Parkinson’s disease.

Neurofilaments are components of the neural cytoskeleton, where they maintain structure along with other functions. Following axonal damage, NfL gets released into extracellular fluids. Previously, NfL has been detected in cerebrospinal fluid (CSF) in patients with multiple sclerosis and neurodegenerative dementias. NfL in the CSF can distinguish Parkinson’s disease (PD) from multiple system atrophy and progressive supranuclear palsy.

That’s useful, but a serum marker would open new doors. “An easily accessible biomarker that will serve as an indicator of diagnosis, disease state, and progression, as well as a marker of response to therapeutic intervention is needed. A biomarker will strengthen the ability to select patients for inclusion or stratification within clinical trials,” commented Okeanis Vaou, MD, director of the movement disorders program at St. Elizabeth’s Medical Center in Brighton, Mass. Dr. Vaou was not involved in the study, which was published Aug. 15 in Movement Disorders.
 

A potential biomarker?

To determine if serum NfL levels would correlate with CSF values and had potential as a biomarker, a large, multi-institutional team of researchers led by Brit Mollenhauer, MD, of the University Medical Center Goettingen (Germany), and Danielle Graham, MD, of Biogen, drew data from a prospective, longitudinal, single-center project called the De Novo Parkinson’s disease (DeNoPa) cohort.

The researchers analyzed data from 176 subjects, including drug-naive patients with newly diagnosed PD; age-, sex-, and education-matched healthy controls; and patients who were initially diagnosed with Parkinson’s disease but whose diagnoses were changed to a cognate neurodegenerative disorder (OND). The researchers also drew 514 serum samples from the Parkinson’s Progression Marker Initiative (PPMI) cohort, a prospective, longitudinal, observational, international multicenter study.

In the DeNoPa cohort, OND patients had the highest median CSF NfL levels at baseline (839 pg/mL) followed by PD patients (562 pg/mL) and healthy controls (494 pg/mL; P = .01). There was a strong correlation between CSF and serum NfL levels in a cross-sectional exploratory study with the PPMI cohort.

Age and sex covariates in the PPMI cohort explained 51% of NfL variability. After adjustment for age and sex, baseline median blood NfL levels were highest in the OND group (16.23 pg/mL), followed by the genetic PD group (13.36 pg/mL), prodromal participants (12.20 pg/mL), PD patients (11.73 pg/mL), unaffected mutation carriers (11.63 pg/mL), and healthy controls (11.05 pg/mL; F test P < .0001). Median serum NfL increased by 3.35% per year of age (P < .0001), and median serum NfL was 6.79% higher in women (P = .0002).
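
The reported age and sex effects are multiplicative percentages, so an expected median can be projected by compounding them. A small illustration (only the 3.35%-per-year and 6.79% figures come from the study; the baseline value and function name below are hypothetical):

```python
# Project an expected median serum NfL (pg/mL) from the reported covariate
# effects: +3.35% per year of age and +6.79% for female sex, treated as
# multiplicative adjustments to a baseline median.
def expected_nfl(base_pg_ml, years_older, female=False):
    value = base_pg_ml * 1.0335 ** years_older  # compound the age effect
    if female:
        value *= 1.0679                          # apply the sex effect
    return value

# A hypothetical 11.0 pg/mL median, projected 10 years later for a woman:
print(round(expected_nfl(11.0, 10, female=True), 2))
```

Compounding, rather than adding, the per-year percentage matches how such covariates behave in log-scale regression models, but this is only an interpretive sketch of the published coefficients.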

A doubling of adjusted serum NfL levels was associated with a median increase in the Movement Disorder Society Unified Parkinson’s Disease Rating Scale total score of 3.45 points (false-discovery rate–adjusted P = .0115), a median decrease in Symbol Digit Modalities Test total score of 1.39 (FDR P = .026), a median decrease in Hopkins Verbal Learning Test discrimination recognition score of 0.3 (FDR P = .03), and a median decrease in Hopkins Verbal Learning Test retention score of 0.029 (FDR P = .04).

More specific markers needed

The findings are intriguing, said Dr. Vaou, but “we need to acknowledge that increased NfL levels are not specific enough to Parkinson’s disease and reflect neuronal and axonal damage. Therefore, there is a need for more specific markers to support diagnostic accuracy, rate of progression, and ultimate prognosis. A serum NfL assay may be useful to clinicians evaluating patients with PD or OND diagnosis and mitigate the misdiagnosis of atypical PD. NfL may be particularly useful in differentiating PD from cognate disorders such as multiple system atrophy, progressive supranuclear palsy, and dementia with Lewy bodies.”

The current success is the result of large patient databases containing phenotypic data, imaging, and tests of tissue, blood, and cerebrospinal fluid, along with collaborations between advocacy groups, academia, and industry, according to Dr. Vaou. As that work continues, it could uncover more specific biomarkers “that will allow us not only to help with diagnosis and treatment but with disease progression, inclusion, recruitment and stratification in clinical studies, as well as (be an) indicator of response to therapeutic intervention of an investigational drug.”

The study was funded by the Michael J. Fox Foundation for Parkinson’s Research. Dr. Vaou had no relevant financial disclosures.

SOURCE: Mollenhauer B et al. Mov Disord. 2020 Aug 15. doi: 10.1002/mds.28206.

Issue: Neurology Reviews 28(10)

 


FROM MOVEMENT DISORDERS

Publish date: September 10, 2020

FDA grants approval to weekly growth hormone for adults


The human growth hormone formulation somapacitan for adults with growth hormone deficiency was approved by the Food and Drug Administration on Sept. 1. The drug is injected once a week, whereas other FDA-approved human growth hormone formulations require daily injections.

Somapacitan contains an albumin-binding element attached to the growth hormone, which binds reversibly to albumin proteins in the body. This reduces clearance and increases the half-life of the hormone. The formulation has previously demonstrated safety and efficacy in children with growth hormone deficiency (J Clin Endocrinol Metab. 2020 Apr 1. doi: 10.1210/clinem/dgz310).

Growth hormone treatment can counter abdominal obesity, reduced lean body mass, fatigue, osteopenia, cardiovascular risks, and other manifestations of growth hormone deficiency in adults, but daily injections can be burdensome for patients. That makes long-acting versions attractive, but the lifelong nature of the treatment makes it important to characterize safety and tolerability.

The approval comes on the strength of REAL 1, a randomized, placebo-controlled, phase 3 trial of 300 adult patients with growth hormone deficiency in 17 countries (J Clin Endocrinol Metab. 2020 Apr 1. doi: 10.1210/clinem/dgaa049). Participants had either never received growth hormone treatment or had stopped such treatment at least 6 months before starting the trial. Subjects received once-weekly somapacitan, once-weekly placebo, or daily somatropin, which is FDA approved.

The primary endpoint was percentage change in truncal fat, which is regulated by growth hormone and, when excessive, can lead to medical problems. After 34 weeks, subjects in the somapacitan group experienced a 1.06% decrease in truncal fat, compared with a 0.47% increase in the placebo group (P = .009) and a 2.23% decrease in the daily somatropin group.

After 34 weeks, a 52-week extension trial began. The somapacitan group continued on the drug and the placebo group was offered somapacitan. Patients on daily somatropin were randomized to continue daily treatment with somatropin or to switch to somapacitan.

At the end of the extension trial, those taking somapacitan for the full 86-week duration had an average reduction of 1.52% in truncal fat. After 86 weeks, the somapacitan and daily somatropin groups had similar values for percentage change in visceral fat, lean body mass, and appendicular skeletal muscle mass.

Common side effects of somapacitan were back pain, joint pain, indigestion, a sleep disorder, dizziness, tonsillitis, swelling in the arms or lower legs, vomiting, adrenal insufficiency, hypertension, increased blood creatine phosphokinase, weight gain, and anemia.

Somapacitan, marketed as Sogroya by Novo Nordisk, is contraindicated in patients with an allergy to the drug, as well as those with an active malignancy, diabetic eye disease where increases in blood sugars could lead to retinal damage, acute critical illness, or acute respiratory failure.

The FDA recommends that providers perform an eye examination before drug initiation, as well as periodically while the patient is taking the drug, to rule out preexisting papilledema. This could be a sign of intracranial hypertension, which could be caused or worsened by growth hormones.


Researchers home in on optimal biopsy length for giant cell arteritis


 

A new retrospective analysis has found 1.5-2 cm to be the optimal length of a temporal artery biopsy for detecting giant cell arteritis. Longer lengths did not yield enough improvement in diagnosis to justify the increased risk of complications. The length calculation accounts for post-fixation shrinkage.

The study, published Aug. 20 in Lancet Rheumatology, represents an “important contribution” to help with the diagnosis of giant cell arteritis when a decision has been made to perform a temporal artery biopsy, according to authors of an editorial accompanying the study.

Giant cell arteritis is an inflammatory condition of medium and large arteries, usually affecting the aorta and its proximal branches. Diagnosis rests on a combination of clinical presentation and imaging or histology via a temporal artery biopsy, but the optimal tissue length for a biopsy has not been established. Longer lengths were initially considered best because inflammation can be nonuniform, and a shorter segment could therefore raise the risk of a false negative if it happened to contain few signs of inflammation.

Studies in the 1990s and early 2000s concluded that biopsies 2-5 cm in length were optimal. But later studies determined that a minimum of just 0.5 cm was necessary. The European League Against Rheumatism updated its recommendations in 2018 and the British Society for Rheumatology followed suit in 2020, both with a suggested minimum length of 1.0 cm. Despite this guidance, the optimal biopsy length beyond 1 cm remains unknown.

For the study, first author Raymond Chu, MD, of the University of Alberta Hospital, Edmonton, reviewed electronic medical records of all patients who underwent temporal artery biopsies in Alberta between Jan. 1, 2008, and Jan. 1, 2018. A single pathologist reviewed all positive findings to ensure uniformity of pathological interpretation. When the reviewer disagreed with the initial diagnosis, researchers removed the result from the analysis.



The study included 1,203 biopsies from 1,176 patients at 22 institutions. A total of 13 positive biopsies were removed following pathologist review. The median biopsy length was 1.3 cm. Median erythrocyte sedimentation rate (ESR) was 41 mm/hour, and median C-reactive protein (CRP) level was 14.7 mg/L. Univariate analyses found associations between positive biopsy and increased age (75.3 vs. 71.3 years; P < .0001), increased ESR (57 vs. 36 mm/hour; P < .0001), lower CRP (12.1 vs. 41.8 mg/L; P < .0001), and longer biopsy length (1.6 vs. 1.2 cm; P = .0025).

In a multivariate analysis, the only variables associated with a positive biopsy were age (adjusted odds ratio [aOR], 1.04; P = .0001), lower CRP levels (aOR, 1.01; P = .0006), and biopsy length (aOR, 1.22; P = .047). The researchers then stratified the sample by biopsy length, using categories of < 0.5 cm, 0.5-1.0 cm, 1.0-1.5 cm, 1.5-2.0 cm, 2.0-2.5 cm, and ≥ 2.5 cm. They identified the top two change points according to the Akaike information criterion as 1.5 cm and 2.0 cm, but only 1.5 cm was statistically significant (≥ 1.5 vs. < 1.5 cm; OR, 1.57; P = .011).

Accounting for an average 8% contraction following excision, the researchers recommend an optimal pre-fixation biopsy length of 1.5-2.0 cm.
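The shrinkage correction behind that recommendation is simple arithmetic. As a minimal sketch, assuming only the average 8% post-excision contraction reported in the study (the function names are illustrative, not from the paper):

```python
# Convert between pre- and post-fixation temporal artery biopsy lengths,
# assuming the study's reported average contraction of 8% after excision.
SHRINKAGE = 0.08  # average fractional contraction (assumption from the study)

def post_fixation_length(pre_cm: float, shrinkage: float = SHRINKAGE) -> float:
    """Length the pathologist measures after an in-situ segment of pre_cm shrinks."""
    return pre_cm * (1 - shrinkage)

def pre_fixation_length(post_cm: float, shrinkage: float = SHRINKAGE) -> float:
    """In-situ length the surgeon must excise to leave post_cm after fixation."""
    return post_cm / (1 - shrinkage)

print(round(post_fixation_length(1.5), 2), round(post_fixation_length(2.0), 2))  # → 1.38 1.84
```

Running the conversion both ways shows, for example, that a 1.5-2.0 cm segment excised in situ fixes down to roughly 1.38-1.84 cm, which is why the in-situ recommendation sits above the 1.5 cm post-fixation change point.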

Some previous studies had suggested no association between increased sample length and false negatives, but they were based on small sample sizes. The current study is limited by its retrospective design and lack of treatment data. The lack of marked inflammation in the sample population suggests that patients were frequently treated empirically with glucocorticoids, and this could have increased the frequency of false negative biopsies, the researchers said.

In the accompanying editorial, Frank Buttgereit, MD, of Charité University Medicine in Berlin, and Christian Dejaco, MD, PhD, of the Medical University of Graz (Austria), point out that ultrasound is now often used for the diagnosis of giant cell arteritis, following clinical examination and laboratory testing. When biopsy is deemed necessary, they said, it is imperative that it be performed by an experienced physician, and the new study makes a useful contribution through its clear recommendation for biopsy length.

The editorialists also stress the importance of experienced pathologists, noting that interpretation is subject to inter- and intraobserver variability; a previous study found that ultrasound and histology have similar reliability.

The study received no funding. Several authors reported receiving personal fees from Hoffmann-LaRoche and serving as site primary investigators for industry-sponsored vasculitis trials.

SOURCE: Chu R et al. Lancet Rheumatol. 2020 Aug 20. doi: 10.1016/S2665-9913(20)30222-8.


Choroid plexuses may play a role in migraine

Article Type
Changed

Among migraine sufferers, levels of soluble vascular cell adhesion molecule 1 (sVCAM1) in cerebrospinal fluid (CSF) are higher in those who experience more frequent attacks, according to a new study. The molecule could be a novel biomarker for the study of the mechanisms that underlie migraine. The work also suggests that the barrier between blood and CSF, sometimes described as leaky, is in fact selectively permeable.

The findings complement recent PET and dynamic contrast-enhanced MRI studies that have shown no sign of damage to the blood brain barrier (BBB) in migraine. Instead, there may be heightened transport of some molecules from blood to the CSF, evidenced by greater increases in CSF levels of fibrinogen than of albumin. sVCAM1 might influence BBB or blood-CSF barrier permeability, possibly as a protective measure against fibrinogen, according to Michael Harrington, MD, scientific director of neuroscience at the Huntington Medical Research Institutes, Pasadena, Calif., who presented the findings in a poster at the virtual annual meeting of the American Headache Society.
 

BBB disruption?

The BBB is a well-known structure that regulates what molecules enter the brain, but the blood-CSF barrier, while lesser known, is also important. It comprises choroid plexus epithelial cells that oversee selective exchange of waste products, ions, and nutrients. Acute inflammation or chronic effects from conditions like stroke, multiple sclerosis, and Alzheimer’s disease can alter the function of this barrier.

No other capillary biomarkers were different between controls and patients with migraine – only sVCAM1. “My data supports a highly selective transport change from blood to CSF, which I propose is less likely to come from brain capillaries than choroid plexuses, especially since choroid plexuses produce the bulk of the CSF. It’s a work in progress, but based on this likelihood of choroid plexus involvement, I am accumulating more data that support the choroid plexuses as the primary source of change in migraine,” said Dr. Harrington in an interview.

“The most important finding of the study is that the blood brain barrier is not compromised in people with migraine,” said Rami Burstein, PhD, professor of neuroscience at Harvard Medical School, Boston, who was asked to comment on the findings. “Most unwanted adverse events are caused by drug action inside the brain, and thus, peripherally acting drugs become more favorable as they usually have fewer side effects. Given that the headache aspect of migraine could be intercepted outside the brain, the fact that the BBB is not compromised is very good news,” Dr. Burstein added.

Dr. Harrington’s team recruited 74 subjects: 14 nonmigraine controls, 16 studied during an attack (ictal), 27 studied between attacks (interictal), and 17 with chronic migraine. The CSF/serum quotient for albumin was higher in the 60 migraineurs than in the 14 controls (5.6 g/L vs. 4.1 g/L; P = .04), as was the CSF/serum quotient for fibrinogen (161.5 g/L vs. 86.1 g/L; P = .007). CSF levels of plasminogen were also higher in patients with migraine (240.7 ng/mL vs. 186.2 ng/mL; P = .03).
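The CSF/serum quotient used in these comparisons is simply the ratio of a protein’s CSF concentration to its serum concentration, which for albumin is conventionally reported on a ×10³ scale. A minimal sketch with hypothetical concentrations (the input values are invented for illustration and are not the study’s raw measurements):

```python
# CSF/serum quotient: ratio of a protein's CSF to serum concentration.
# The concentrations below are hypothetical, chosen only to illustrate
# the calculation; they are not the study's data.
def csf_serum_quotient(csf_g_per_l: float, serum_g_per_l: float) -> float:
    return csf_g_per_l / serum_g_per_l

# e.g., CSF albumin 0.25 g/L against serum albumin 45 g/L:
q_alb = csf_serum_quotient(0.25, 45.0)
print(round(q_alb * 1000, 1))  # quotient on the conventional x10^3 scale → 5.6
```

Because both concentrations enter as a ratio, the quotient isolates barrier transport from whatever the serum level happens to be in a given patient.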

When the researchers compared ictal with interictal subjects, they found no difference in fibrinogen or albumin, suggesting that these values are generally elevated in migraine patients compared with controls rather than spiking during attacks. They also divided subjects by annual attack frequency: fewer than 24 migraines per year, 24-180 per year, and more than 180 per year. The quotient for fibrinogen was increased in migraineurs overall, compared with controls, but decreased as migraine frequency went up (198.6 g/L, 167.0 g/L, and 121.6 g/L, respectively; P = .004).

CSF levels of sVCAM1 were 4.7 ng/mL in controls, 4.5 in the group with fewer than 24 migraines per year, 5.5 in the 24-180 group, and 7.1 in the group with more than 180 (P = .004).
 

Implications for therapy

The research, though at a very early stage, could have implications for therapies. Most drugs that treat migraine remain something of a mystery because researchers don’t know for sure where they act. In the brain? Systemically? The question of permeability of various molecules through both barriers could lend insight into what’s happening. “That’s why there is interest in barrier transport, and we’re showing there is a selective change of transport in migraineurs,” said Dr. Harrington.

As for more general therapeutic implications, “I can only speculate, but clearly there is baseline altered transport, probably in the choroid plexuses of these people,” said Dr. Harrington. He added that in time researchers might test drugs to see if they alter sVCAM1 levels or even develop novel drug candidates to act directly on it.

But he also sounded a note of caution because of the exploratory nature of the study. “These are all really early speculations.”

The study was funded by NIH, the Sunstar Foundation, Wyngs Foundation, and the Higgins Family. Dr. Harrington has no relevant disclosures.

SOURCE: Harrington M et al. AHS 2020, Abstract 842752.


SGLT2 inhibitors with metformin look safe for bone

Article Type
Changed

 

The combination of sodium-glucose cotransporter-2 (SGLT-2) inhibitors and metformin is not associated with an increase in fracture risk among patients with type 2 diabetes (T2D), according to a new meta-analysis of 25 randomized, controlled trials.

Researchers at The Second Clinical College of Dalian Medical University in Jiangsu, China, compared fracture risk associated with the metformin/SLGT2 combination to metformin alone as well as other T2D therapeutics, and found no differences in risk. The study was published online Aug. 11 in Osteoporosis International.

T2D is associated with an increased risk of fracture, though causative mechanisms remain uncertain. Some lines of evidence suggest multiple factors may contribute to fractures, including hyperglycemia, oxidative stress, toxic effects of advanced glycosylation end-products, altered insulin levels, and treatment-induced hypoglycemia, as well as an association between T2D and increased risk of falls.

Antidiabetes drugs can have positive or negative effects on bone. Thiazolidinediones, insulin, and sulfonylureas may increase the risk of fractures, while dipeptidyl peptidase-4 (DPP-4) inhibitors and glucagon-like peptide-1 (GLP-1) receptor agonists may be protective. Metformin may also reduce fracture risk.

SGLT-2 inhibitors interrupt glucose reabsorption in the kidney, leading to improved glycemic control. Other benefits include improved renal and cardiovascular outcomes, weight loss, and reduced blood pressure, liver fat, and serum uric acid levels.

These properties have made SGLT-2 inhibitors combined with metformin an important therapy for patients at high risk of atherosclerotic disease, or who have heart failure or chronic kidney disease.

But SGLT-2 inhibition increases osmotic diuresis, which could alter the mineral balance within bone. Some studies have also shown that SGLT-2 inhibitors lead to changes in bone turnover markers, bone mineral density, and bone microarchitecture. Observational studies of the SGLT-2 inhibitor canagliflozin found an association with higher fracture risk in patients taking the drug.

Such studies carry the risk of confounding factors, so the researchers took advantage of the fact that many recent clinical trials have examined the impact of SGLT-2 inhibitors on T2D. They pooled data from 25 clinical trials with a total of 19,500 participants, 9,662 of whom received SGLT-2 inhibitors plus metformin; 9,838 received other active comparators.

The fracture rate was 0.91% in the SGLT-2 inhibitors/metformin group, and 0.80% among controls (odds ratio, 0.97; 95% CI, 0.71-1.32), with no heterogeneity. Metformin alone was not associated with a change in fracture rate (OR, 0.95; 95% CI, 0.44-2.08), nor were other forms of diabetes control (OR, 0.95; 95% CI, 0.69-1.31).

There were some differences in fracture risk among individual SGLT-2 inhibitors, though none differed significantly from controls. The highest risk was seen with canagliflozin/metformin (OR, 2.19; 95% CI, 0.66-7.27), followed by dapagliflozin/metformin (OR, 0.91; 95% CI, 0.50-1.64), empagliflozin/metformin (OR, 0.94; 95% CI, 0.59-1.50), and ertugliflozin/metformin (OR, 0.76; 95% CI, 0.38-1.54).

There were no differences with respect to hip or lumbar spine fractures, or other fractures. The researchers found no differences in bone mineral density or bone turnover markers.

The meta-analysis is limited by the relatively short average follow-up in the included studies, which was 61 weeks. Bone damage may occur over longer time periods. Bone fractures were also not a prespecified adverse event in most included studies.

The studies also did not provide detailed information on the fractures themselves, such as whether they were the result of a fall, where they occurred, or associated bone health parameters. Although the results support the belief that SGLT-2 inhibitors do not adversely affect bone health, “given limited information on bone health outcomes, further work is needed to validate this conclusion,” the authors wrote.

The authors did not disclose any funding and had no relevant conflicts of interest.

SOURCE: B-B Qian et al. Osteoporosis Int. 2020 Aug 11. doi: 10.1007/s00198-020-05590-y.


FROM OSTEOPOROSIS INTERNATIONAL


Cohort study finds a twofold greater psoriasis risk linked to a PCOS diagnosis


Polycystic ovarian syndrome (PCOS) was associated with a nearly doubled risk of developing psoriasis in a propensity score–matched analysis conducted in Taiwan.

PCOS is characterized by androgen elevation that can lead to insulin resistance and metabolic syndrome, which have also been associated with an increased risk of psoriasis. Previous retrospective analyses have suggested an increased risk of psoriasis associated with PCOS, and psoriasis patients with PCOS have been reported to have more severe skin lesions, compared with those who do not have PCOS.

“The incidence of psoriasis is indeed higher in the PCOS group than in the control group, and the comorbidities related to metabolic syndrome did not modify the adjusted hazard ratio,” said Ming-Li Chen, during her presentation of the study results at the virtual annual meeting of the Group for Research and Assessment of Psoriasis and Psoriatic Arthritis. Dr. Chen is at Chung Shan Medical University in Taiwan.

The researchers analyzed 1 million randomly selected records from Taiwan’s Longitudinal Health Insurance database, a subset of the country’s National Health Insurance Program. Between 2000 and 2012, they identified a case group with at least three outpatient diagnoses or one inpatient diagnosis of PCOS; they then compared each with four patients who did not have PCOS who were matched by age and index year. The mean age in both groups was about 27 years.

The mean follow-up times were 6.99 years for the 4,707 cases and 6.94 years for the 18,828 controls. Comorbidity rates were slightly higher in the PCOS group, including asthma (6.7% vs. 4.9%; P less than .001), chronic obstructive pulmonary disease (14% vs. 11%; P less than .001), chronic liver disease (8.0% vs. 5.0%; P less than .001), diabetes mellitus (3.0% vs. 1.4%; P less than .001), hypertension (2.4% vs. 1.5%; P less than .001), hyperlipidemia (5.4% vs. 2.5%; P less than .001), depression (5.4% vs. 3.9%; P less than .001), and sleep apnea (0.23% vs. 0.10%; P = .040).



There was a higher cumulative incidence of psoriasis in the PCOS group (adjusted hazard ratio, 2.07; 95% confidence interval, 1.25-3.44). Other factors associated with an increased risk of psoriasis were advanced age (older than 50 years; aHR, 14.13; 95% CI, 1.8-110.7) and a cancer diagnosis (aHR, 11.72; 95% CI, 2.87-47.9).

When PCOS patients were stratified by age, the researchers noted a higher risk of psoriasis among those 20 years or younger (aHR, 4.02; 95% CI, 1.16-13.9) than among those aged 20-50 years (aHR, 1.88; 95% CI, 1.07-3.29). Among those older than 50 years, there was no significantly increased risk, although the number of psoriasis diagnoses and population sizes were small in the latter category. Among patients with PCOS, a cancer diagnosis was not associated with a statistically significant increased risk of psoriasis.

The mechanisms underlying the association between PCOS and psoriasis should be studied further, she noted.

Following Dr. Chen’s prerecorded presentation, there was a live discussion session led by Alice Gottlieb, MD, PhD, medical director of Mount Sinai Beth Israel Dermatology, New York, and Ennio Lubrano, MD, associate professor of rheumatology at the University of Molise (Italy). Dr. Gottlieb noted that the study did not appear to account for weight in the association between PCOS and psoriasis, since heavier people are known to be at greater risk of developing psoriasis. Dr. Chen acknowledged that the study had no records of BMI or weight.

Dr. Gottlieb also wondered if treatment of PCOS led to any improvements in psoriasis in patients with the two diagnoses. “If we treat PCOS, does the psoriasis get better?” Again, the study did not address the question. “We didn’t follow up on therapies,” Dr. Chen said.

Dr. Chen reported no relevant financial disclosures. Dr. Gottlieb is a consultant, advisory board member and/or speaker for AbbVie, Allergan, Avotres Therapeutics, Beiersdorf, Bristol-Myers Squibb, Celgene, Dermira, Eli Lilly, Incyte, Janssen, Leo, Novartis, Reddy Labs, Sun Pharmaceutical Industries, UCB Pharma and Xbiotech. She has received research or educational grants from Boehringer Ingelheim, Incyte, Janssen, Novartis and Xbiotech.


FROM GRAPPA 2020 VIRTUAL ANNUAL MEETING


Biologics may delay psoriatic arthritis, study finds


Treatment of psoriasis with biologics was associated with a reduced risk of developing psoriatic arthritis, compared with conventional disease-modifying antirheumatic drugs (DMARDs), in a single-center retrospective analysis in Argentina that followed patients for almost 2 decades.

About 30%-40% of patients with psoriasis go on to develop psoriatic arthritis (PsA), on average about 10 years after the onset of psoriasis. One potential mechanism of PsA onset is through enthesitis, which has been described at subclinical levels in psoriasis.

“It could be speculated that treatment with biologics in patients with psoriasis could prevent the development of psoriatic arthritis, perhaps by inhibiting the subclinical development of enthesitis,” Luciano Lo Giudice, MD, a rheumatology fellow at Hospital Italiano de Buenos Aires, said during his presentation at the virtual annual meeting of the Group for Research and Assessment of Psoriasis and Psoriatic Arthritis.

Although these results do not prove that treatment of the underlying disease delays progression to PsA, they are suggestive and highlight an emerging field of research, according to Diamant Thaçi, MD, PhD, professor of medicine at University Hospital Schleswig-Holstein, Germany, who led a live discussion following a prerecorded presentation of the results. “We’re going in this direction – how can we prevent psoriatic arthritis, how can we delay it. We are just starting to think about this,” Dr. Thaçi said in an interview.

The researchers examined medical records of 1,626 patients with psoriasis treated at their center between 2000 and 2019, with a total of 15,152 years of follow-up. Of these patients, 1,293 were treated with topical medication, 229 with conventional DMARDs (methotrexate in 77%, cyclosporine in 13%, and both in 10%), and 104 with biologics, including etanercept (34%), secukinumab (20%), adalimumab (20%), ustekinumab (12%), ixekizumab (9%), and infliximab (5%).



They found that 11% of the topical treatment group developed PsA, as did 3.5% of the conventional DMARD group, 1.9% of the biologics group, and 9.1% overall. Treatment with biologics was associated with a significantly lower rate of developing PsA than treatment with conventional DMARDs (3 vs. 17.2 per 1,000 patient-years; incidence rate ratio [IRR], 0.17; P = .0177). There was a trend toward a reduced rate of PsA among those on biologic therapy, compared with those on topicals (3 vs. 9.8 per 1,000 patient-years; IRR, 0.3; P = .0588).

The researchers confirmed all medical encounters using electronic medical records, and the study had a long follow-up time, but it was limited by its single-center, retrospective design. It also could not associate reduced risk with specific biologics.

The findings probably reflect the presence of subclinical PsA that many clinicians don’t see, according to Dr. Thaçi. While a dermatology practice might find PsA in 2%-3%, or at most 10%, of patients with psoriasis, “in our department it’s about 50 to 60 percent of patients who have psoriatic arthritis, because we diagnose it early,” he said.

He found the results of the study encouraging. “It looks like some of the biologics, for example IL [interleukin]-17 or even IL-23 [blockers] may have an influence on occurrence or delay the occurrence of psoriatic arthritis.”

Dr. Thaçi noted that early treatment of skin lesions can increase the probability of longer remissions, especially with IL-23 blockers. Still, that’s no guarantee the same would hold true for PsA risk. “Skin is skin and joints are joints,” Dr. Thaçi said.

Dr. Thaçi and Dr. Lo Giudice had no relevant financial disclosures.

Meeting/Event
Publications
Topics
Sections
Meeting/Event
Meeting/Event

Treatment of psoriasis with biologics was associated with a reduced risk of developing psoriatic arthritis compared with conventional disease-modifying antirheumatic drugs (DMARDs), in a single center retrospective analysis in Argentina that followed patients for almost 2 decades.

About 30%-40% of patients with psoriasis go on to develop psoriatic arthritis (PsA), usually on average about 10 years after the onset of psoriasis. One potential mechanism of PsA onset is through enthesitis, which has been described at subclinical levels in psoriasis.

“It could be speculated that treatment with biologics in patients with psoriasis could prevent the development of psoriatic arthritis, perhaps by inhibiting the subclinical development of enthesitis,” Luciano Lo Giudice, MD, a rheumatology fellow at Hospital Italiano de Buenos Aires, said during his presentation at the virtual annual meeting of the Group for Research and Assessment of Psoriasis and Psoriatic Arthritis.

Treatment of psoriasis with biologics was associated with a reduced risk of developing psoriatic arthritis, compared with conventional disease-modifying antirheumatic drugs (DMARDs), in a single-center retrospective analysis in Argentina that followed patients for almost 2 decades.

About 30%-40% of patients with psoriasis go on to develop psoriatic arthritis (PsA), on average about 10 years after the onset of psoriasis. One potential mechanism of PsA onset is through enthesitis, which has been described at subclinical levels in psoriasis.

“It could be speculated that treatment with biologics in patients with psoriasis could prevent the development of psoriatic arthritis, perhaps by inhibiting the subclinical development of enthesitis,” Luciano Lo Giudice, MD, a rheumatology fellow at Hospital Italiano de Buenos Aires, said during his presentation at the virtual annual meeting of the Group for Research and Assessment of Psoriasis and Psoriatic Arthritis.

Although these results do not prove that treatment of the underlying disease delays progression to PsA, they are suggestive and highlight an emerging field of research, according to Diamant Thaçi, MD, PhD, professor of medicine at University Hospital Schleswig-Holstein, Germany, who led a live discussion following a prerecorded presentation of the results. “We’re going in this direction – how can we prevent psoriatic arthritis, how can we delay it. We are just starting to think about this,” Dr. Thaçi said in an interview.

The researchers examined medical records of 1,626 patients with psoriasis treated at their center between 2000 and 2019, with a total of 15,152 years of follow-up. Of these patients, 1,293 were treated with topical medication, 229 with conventional DMARDs (methotrexate in 77%, cyclosporine in 13%, and both in 10%), and 104 with biologics, including etanercept (34%), secukinumab (20%), adalimumab (20%), ustekinumab (12%), ixekizumab (9%), and infliximab (5%).



They found that 11% in the topical treatment group developed PsA, as did 3.5% in the conventional DMARD group, 1.9% in the biologics group, and 9.1% overall. Treatment with biologics was associated with a significantly lower incidence of PsA than treatment with conventional DMARDs (3 versus 17.2 per 1,000 patient-years; incidence rate ratio [IRR], 0.17; P = .0177). There was a trend toward a lower incidence of PsA among those on biologic therapy compared with those on topicals (3 versus 9.8 per 1,000 patient-years; IRR, 0.3; P = .0588).

The researchers confirmed all medical encounters using electronic medical records, and the study benefited from a long follow-up time, but it was limited by its single-center, retrospective design. It also could not associate reduced risk with specific biologics.

The findings probably reflect the presence of subclinical PsA that many clinicians don’t see, according to Dr. Thaçi. While a dermatology practice might find PsA in 2%-3%, or at most 10%, of patients with psoriasis, “in our department it’s about 50 to 60 percent of patients who have psoriatic arthritis, because we diagnose it early,” he said.

He found the results of the study encouraging. “It looks like some of the biologics, for example IL [interleukin]-17 or even IL-23 [blockers] may have an influence on occurrence or delay the occurrence of psoriatic arthritis.”

Dr. Thaçi noted that early treatment of skin lesions can increase the probability of longer remissions, especially with IL-23 blockers. Still, that’s no guarantee the same would hold true for PsA risk. “Skin is skin and joints are joints,” Dr. Thaçi said.

Dr. Thaçi and Dr. Lo Giudice had no relevant financial disclosures.

FROM GRAPPA 2020 VIRTUAL ANNUAL MEETING


In epilepsy, brain-responsive stimulation passes long-term tests


Two new long-term studies, one an extension trial and the other an analysis of real-world experience, show that the RNS System, a direct brain-responsive neurostimulator, reduces seizure frequency in most epilepsy patients who have it implanted. Both studies showed that the benefit from the device increased over time.

That accruing benefit may be because of improved protocols as clinicians gain experience with the device or because of network remodeling that occurs over time as seizures are controlled. “I think it’s both,” said Martha Morrell, MD, a clinical professor of neurology at Stanford (Calif.) University and chief medical officer at NeuroPace, the company that has marketed the device since it gained FDA approval in 2013.

In both studies, the slope of improvement over time was similar, but the real-world study showed greater improvement at the beginning of treatment. “I think the slopes represent physiological changes, but the fact that [the real-world study] starts with better outcomes is, I think, directly attributable to learning. When the long-term study was started in 2004, this had never been done before, and we had to make a highly educated guess about what we should do, and the initial stimulatory parameters were programmed in a way that’s very similar to what was used for movement disorders,” Dr. Morrell said in an interview.

The long-term treatment study appeared online July 20 in the journal Neurology, while the real-world analysis was published July 13 in Epilepsia.
 

An alternative option

Medications can effectively treat some seizures, but 30%-40% of patients must turn to other options for control. Surgery can sometimes be curative, but is not suitable for some patients. Other stimulation devices include vagus nerve stimulation (VNS), which sends pulses from a chest implant to the vagus nerve, reducing epileptic attacks through an unknown mechanism. Deep brain stimulation (DBS) places electrodes that deliver stimulation to the anterior nucleus of the thalamus, a structure through which initially localized seizures can spread.

The RNS device consists of a neurostimulator implanted cranially and connected to leads that are placed based on the individual patient’s seizure focus or foci. It also continuously monitors brain activity and delivers stimulation only when its signal suggests the beginning of a seizure.

That capacity for recording is a key benefit because the information can be stored and analyzed, according to Vikram Rao, MD, PhD, a coinvestigator in the real-world trial and an associate professor and the epilepsy division chief at the University of California, San Francisco, which was one of the trial centers. “You know more precisely than we previously did how many seizures a patient is having. Many of our patients are not able to quantify their seizures with perfect accuracy, so we’re better quantifying their seizure burden,” Dr. Rao said in an interview.

The ability to monitor patients can also improve clinical management. Dr. Morrell recounted an elderly patient who for many years has driven 5 hours for appointments. Recently she was able to review his data from the RNS System remotely. She determined that he was doing fine and, after a telephone consultation, told him he didn’t need to come in for a scheduled visit.

Real-world analysis

In the real-world analysis, researchers led by Babak Razavi, PhD, and Casey Halpern, MD, at Stanford University conducted a chart review of 150 patients at eight centers who underwent treatment with the RNS system between 2013 and 2018. All patients were followed at least 1 year, with a mean of 2.3 years. Patients had a median of 7.7 disabling seizures per month (mean, 52; range, 0.1-3,000). A total of 60% had abnormal brain MRI findings.

At 1 year, subjects achieved a mean 67% decrease in seizure frequency (interquartile range, 50%-94%). At 2 years, that grew to 77%; at 3 or more years, 84%. There was no significant difference in seizure reduction at 1 year according to age, age at epilepsy onset, duration of epilepsy, location of seizure foci, presence of brain MRI abnormalities, prior intracranial monitoring, prior epilepsy surgery, or prior VNS treatment. When patients who underwent a resection at the time of RNS placement were excluded, the results were similar. There were no significant differences in outcome by center.

A total of 11.3% of patients experienced a device-related serious adverse event, and 4% developed infections. The rate of infection was not significantly different between patients who had the neurostimulator and leads implanted alone (3.0%) and patients who had intracranial EEG diagnostic monitoring (ICM) electrodes removed at the same time (6.1%; P = .38).

Although about one-third of the patients who started the long-term study dropped out before completion, most left because they moved away from treatment centers, according to Dr. Morrell, and other evidence points squarely to patient satisfaction. “At the end of the battery’s longevity, the neurostimulator needs to be replaced. It’s an outpatient, 45-minute procedure. Over 90% of patients chose to have it replaced. It’s not the answer for everybody, but the substantial majority of patients choose to continue,” she said.
 

Extension trial

The open-label extension trial, led by Dileep Nair, MD, of the Cleveland Clinic Foundation and Dr. Morrell, followed 230 of the 256 patients who participated in a 2-year phase 3 study or in feasibility studies, extending device usage to 9 years. A total of 162 completed follow-up (mean, 7.5 years). The median reduction of seizure frequency was 58% at the end of year 3, and 75% by year 9 (P < .0001; Wilcoxon signed rank). Although patient population enrichment could have explained this observation, other analyses confirmed that the improvement was real.

Nearly 75% had at least a 50% reduction in seizure frequency; 35% had a 90% or greater reduction in seizure frequency. Some patients (18.4%) had at least a full year with no seizures, and 62% who had a 1-year seizure-free period experienced no seizures at the latest follow-up. Overall, 21% had no seizures in the last 6 months of follow-up.

For those with a seizure-free period of more than 1 year, the average duration was 3.2 years (range, 1.04-9.6 years). There was no difference in response among patients based on previous antiseizure medication use or previous epilepsy surgery, VNS treatment, or intracranial monitoring, and there were no differences by patient age at enrollment, age of seizure onset, brain imaging abnormality, seizure onset locality, or number of foci.

The researchers noted improvement in overall Quality of Life in Epilepsy Inventory–89 scores at 1 year (mean, +3.2; P < .0001), which continued through year 9 (mean, +1.9; P < .05). Improvements were also seen in the epilepsy-targeted (mean, +4.5; P < .001) and cognitive (mean, +2.5; P = .005) domains. Risk of infection was 4.1% per procedure, and 12.1% of subjects overall experienced a serious device-related implant infection. Of 35 infections, 16 led to device removal.

The extension study was funded by NeuroPace. NeuroPace supported data entry and institutional review board submission for the real-world trial. Dr. Morrell owns stock and is an employee of NeuroPace. Dr. Rao has received support from and/or consulted for NeuroPace.

SOURCE: Nair DR et al. Neurology. 2020 Jul 20. doi: 10.1212/WNL.0000000000010154. Razavi B et al. Epilepsia. 2020 Jul 13. doi: 10.1111/epi.16593.

Issue
Neurology Reviews- 28(9)
FROM EPILEPSIA AND FROM NEUROLOGY

Publish date: July 31, 2020