AAP guidance helps distinguish bleeding disorders from abuse

In some cases, bruising or bleeding from bleeding disorders may look like signs of child abuse, but new guidance may help clinicians distinguish one from the other.

On Sept. 19, the American Academy of Pediatrics published two reports – a clinical report and a technical report – in the October 2022 issue of Pediatrics on evaluating for bleeding disorders when child abuse is suspected.

The reports were written by the AAP Section on Hematology/Oncology and the AAP Council on Child Abuse and Neglect.
 

One doesn’t rule out the other

The reports emphasize that laboratory testing for bleeding disorders cannot always rule out abuse, just as a history of trauma (accidental or nonaccidental) may not rule out a bleeding disorder or other medical condition.

In the clinical report, led by James Anderst, MD, MSCI, with the division of child adversity and resilience, Children’s Mercy Hospital, University of Missouri–Kansas City, the researchers note that infants are at especially high risk of abusive bruising/bleeding, but bleeding disorders may also present in infancy.

The authors give an example of a situation in which taking a thorough history won’t necessarily rule out a bleeding disorder: Male infants who have been circumcised without significant bleeding may still have a bleeding disorder. Therefore, laboratory evaluations are often needed to detect disordered bleeding.

Children’s medications should be documented, the authors note, because certain drugs, such as nonsteroidal anti-inflammatory drugs, some antibiotics, antiepileptics, and herbal supplements, can affect tests that might be used to detect bleeding disorders.

Likewise, asking about restrictive or unusual diets or alternative therapies is important, as some could increase the likelihood of bleeding/bruising.

Signs that bleeding disorder is not likely

The authors advise that, if a child has any of the following, an evaluation for a bleeding disorder is generally not needed:

  • Caregivers’ description of trauma sufficiently explains the bruising.
  • The child or an independent witness can provide a history of abuse or nonabusive trauma that explains the bruising.
  • The outline of the bruising follows an object or hand pattern.
  • The location of the bruising is on the ears, neck, or genitals.

“Bruising to the ears, neck, or genitals is rarely seen in either accidental injuries or in children with bleeding disorders,” the authors write.

Specifying which injury locations are more indicative of abuse in both mobile and immobile children was among the most important information in the paper, Seattle pediatrician Timothy Joos, MD, said in an interview.

Also very helpful, he said, was the listing of which tests should be done if bruising looks like potential abuse.

The authors write that if bruising is concerning for abuse and necessitates evaluation for bleeding disorders, the following tests should be done: PT (prothrombin time); aPTT (activated partial thromboplastin time); von Willebrand factor (VWF) activity (ristocetin cofactor); factor VIII activity level; factor IX activity level; and a complete blood count, including platelets.

“I think that’s what a lot of us suspected, but there’s not a lot of summary evidence regarding that until now,” Dr. Joos said.

 

 

Case-by-case decisions on when to test

The decision on whether to evaluate for a bleeding disorder may be made case by case.

If a nonmobile child has an intracranial hemorrhage (ICH), particularly a subdural hematoma (SDH), with no obvious known trauma, abuse should be suspected, the authors write.

They acknowledge that children can have ICH, such as a small SDH or an epidural hematoma, under the point of impact from a short fall.

“However,” the authors write, “short falls rarely result in significant brain injury.”

Conditions may affect screening tests

Screening tests for bleeding disorders can be falsely positive or falsely negative, the authors caution in the technical report, led by Shannon Carpenter, MD, MS, with the department of pediatrics, University of Missouri–Kansas City.

  • If coagulation laboratory test specimens sit in a hot metal box all day, for instance, factor levels may be falsely low, the authors explain.
  • Conversely, factors such as VWF and factor VIII are acute-phase reactants, and levels will be deceptively high if blood specimens are drawn at a stressful time.
  • Patients who have a traumatic brain injury often show temporary coagulopathy that does not signal a congenital disorder.

Vitamin K deficiency

The technical report explains that if an infant, typically younger than 6 months, presents with bleeding/bruising that raises concern for abuse and has a prolonged PT, clinicians should confirm that vitamin K was given at birth and/or test for vitamin K deficiency.

Not all states require vitamin K to be administered at birth, and some parents refuse it. Deficiency can lead to bleeding in the skin or from mucosal surfaces (such as after circumcision), generalized ecchymoses, and large intramuscular hemorrhages or ICH.

When infants don’t get vitamin K at birth, vitamin K deficiency bleeding (VKDB) is seen most often in the first days of life, the technical report states. It can also occur 1-3 months after birth.

“Late VKDB occurs from the first month to 3 months after birth,” the authors write. “This deficiency is more prevalent in breast-fed babies, because human milk contains less vitamin K than does cow milk.”

Overall, the authors write, extensive lab tests are usually not necessary, given the rarity of most bleeding disorders and specific clinical factors that decrease the odds that a bleeding disorder caused the child’s findings.

Dr. Joos said the decisions described in this paper are the kind that can keep pediatricians up at night.

“Any kind of guidance is helpful in these difficult cases,” he said. “These are scenarios that can often happen in the middle of the night, and you’re often struggling with evidence or past experience that can help you make some of these decisions.”

Authors of the reports and Dr. Joos declared no relevant financial relationships.

The potential problem(s) with a once-a-year COVID vaccine

Comments from the White House this week suggesting a once-a-year COVID-19 shot for most Americans, “just like your annual flu shot,” were met with backlash from many who say COVID and influenza come from different viruses and need different schedules.

Reactions, ranging from charges of “capitulation” to concerns about too few data, hit the airwaves and social media.

Some, however, agree with the White House vision and say that asking people to get one shot in the fall instead of periodic pushes for boosters will raise public confidence and buy-in and reduce consumer confusion.  

Health leaders, including Bob Wachter, MD, chair of the department of medicine at the University of California, San Francisco, say they like the framing of the concept – that people who are not high-risk should plan each year for a COVID shot and a flu shot.

“Doesn’t mean we KNOW shot will prevent transmission for a year. DOES mean it’ll likely lower odds of SEVERE case for a year & we need strategy to bump uptake,” Dr. Wachter tweeted this week.

But the numbers of Americans seeking boosters remain low. Only one-third of all eligible people 50 years and older have gotten a second COVID booster, according to the Centers for Disease Control and Prevention. About half of those who got the original two shots got a first booster.

Meanwhile, the United States is still averaging about 70,000 new COVID cases and more than 300 deaths every day.

The suggested change in approach comes as Pfizer/BioNTech and Moderna roll out their new boosters that target Omicron subvariants BA.4 and BA.5, after the CDC recommended their use and the U.S. Food and Drug Administration granted emergency use authorization.

“As the virus continues to change, we will now be able to update our vaccines annually to target the dominant variant,” President Joe Biden said in a statement promoting the yearly approach.
 

Some say annual shot premature

Other experts say it’s too soon to tell whether an annual approach will work.

“We have no data to support that current vaccines, including the new BA.5 booster, will provide durable protection beyond 4-6 months. It would be good to aspire to this objective, and much longer duration of protection, but that will likely require next generation and nasal vaccines,” said Eric Topol, MD, Medscape’s editor-in-chief and founder and director of the Scripps Research Translational Institute.

A report in Nature Reviews Immunology states, “Mucosal vaccines offer the potential to trigger robust protective immune responses at the predominant sites of pathogen infection” and potentially “can prevent an infection from becoming established in the first place, rather than only curtailing infection and protecting against the development of disease symptoms.”

Dr. Topol tweeted after the White House statements, “[An annual vaccine] has the ring of Covid capitulation.”

William Schaffner, MD, an infectious disease expert at Vanderbilt University, Nashville, Tenn., told this news organization that he cautions against interpreting the White House comments as official policy.

“This is the difficulty of having public health announcements come out of Washington,” he said. “They ought to come out of the CDC.”

He says there is a reasonable analogy between COVID and influenza, but warns, “don’t push the analogy.”

They are both serious respiratory viruses that can cause much illness and death in essentially the same populations, he notes: older, frail people and people who have underlying illnesses or are immunocompromised.

Both viruses also mutate. But there the paths diverge.

“We’ve gotten into a pattern of annually updating the influenza vaccine because it is such a singularly seasonal virus,” Dr. Schaffner said. “Basically it disappears during the summer. We’ve had plenty of COVID during the summers.”

For COVID, he said, “We will need a periodic booster. Could this be annually? That would certainly make it easier.” But it’s too soon to tell, he said.

Dr. Schaffner noted that several manufacturers are working on a combined flu/COVID vaccine.
 

 

 

Just a ‘first step’ toward annual shot

The currently updated COVID vaccine may be the first step toward an annual vaccine, but it’s only the first step, Dr. Schaffner said. “We haven’t committed to further steps yet because we’re watching this virus.”

Syra Madad, DHSc, MSc, an infectious disease epidemiologist at Harvard University’s Belfer Center for Science and International Affairs, Cambridge, Mass., and the New York City hospital system, told this news organization that arguments on both sides make sense.

Having a single message once a year could help eliminate the considerable confusion created by people being on individual timelines with different levels of immunity and by separate campaigns for COVID and flu shots arriving at different times of the year.

“Communication around vaccines is very muddled and that shows in our overall vaccination rates, particularly booster rates,” she says. “The overall strategy is hopeful and makes sense if we’re going to progress that way based on data.”

However, she said that the data are just not there yet to show it’s time for an annual vaccine. First, scientists will need to see how long protection lasts with the Omicron-specific vaccine and how well and how long it protects against severe disease and death as well as infection.

COVID is less predictable than influenza, and the influenza vaccine has been around for decades, Dr. Madad noted. Influenza waves are more easily anticipated, following a “ladder-like pattern,” she said. “COVID-19 is not like that.”

What is hopeful, she said, “is that we’ve been in the Omicron dynasty since November of 2021. I’m hopeful that we’ll stick with that particular variant.”

Dr. Topol, Dr. Schaffner, and Dr. Madad declared no relevant financial relationships.

A version of this article first appeared on Medscape.com.

AGA clinical practice update: Expert review on managing short bowel syndrome

Caring for patients with short bowel syndrome (SBS) requires a multidisciplinary approach involving dietitians, nurses, surgeons, gastroenterologists or internists, and social workers experienced in SBS care, according to a clinical practice update expert review from the American Gastroenterological Association.

Kishore Iyer, MD, from Mount Sinai Hospital, New York; John K. DiBaise, MD, from Mayo Clinic in Scottsdale, Ariz.; and Alberto Rubio-Tapia, MD, from the Cleveland Clinic, Ohio, developed 12 best practice advice statements based on available evidence. The statements focus on adult patients with SBS, though there is some overlap with the management of pediatric SBS. The review was published online in Clinical Gastroenterology and Hepatology.
 

Defining SBS

One update concerns defining SBS. The authors recommend that surgeons performing massive resections report the residual bowel length rather than the length of bowel resected.

“It is only the former that dictates outcome,” they wrote.

There is general agreement that a residual small intestinal length of 200 cm or less meets criteria for SBS. Measurement should be taken from “along the antimesenteric border of unstretched bowel, from the duodenojejunal flexure to the ileocecal junction, the site of any small bowel–colon anastomosis, or to the end-ostomy.”

Based on the residual bowel length, patients can be classified into three groups: end-jejunostomy, jejuno-colic, and jejuno-ileo-colic.
 

Assessing nutritional status

A dietitian experienced in SBS should perform a thorough nutritional assessment on all SBS patients. Long-term monitoring should include laboratory studies checking electrolytes and liver and kidney function, fluid balance, weight change, serum micronutrients, and bone density. Bone density testing should be repeated periodically, every 2-3 years.

Fluid and electrolyte problems may affect outcomes for SBS patients, particularly for those without a colon.
 

Adjusting diets

Most adult patients with SBS have significant malabsorption, so dietary intake “must be increased by at least 50% from their estimated needs,” the authors wrote. It’s best if the patient consumes the increased quantity throughout the day in 5-6 meals, they noted.

An experienced dietitian should counsel the patient based on the patient’s eating preferences. Incorporating preferences can help increase compliance with the adjustments that may become necessary based on symptoms, stool output, and weight.
 

Using pharmacologic therapy

Antisecretory medications, including proton pump inhibitors and histamine-2 receptor antagonists, help reduce gastric secretions, limit the damage of acid on the upper gut mucosa, and preserve the function of pancreatic exocrine enzymes.

Antidiarrheals reduce intestinal motility but also cause a slight reduction in intestinal secretion. Common agents include loperamide, diphenoxylate with atropine, codeine, and tincture of opium. The review authors say loperamide should be preferred over opiate drugs because it is neither addictive nor sedating.

Use of antidiarrheals should be guided by their effect on stool output.

“Loperamide and codeine may have a synergistic effect when used together,” the authors wrote.

Clonidine, which can be given transdermally, has also shown some benefit in treating high-output stool losses, presumably because of its effects on intestinal motility and secretion.

 

 

Weighing risks and benefits of teduglutide

The glucagonlike peptide–2 (GLP-2) analogue teduglutide is of particular interest for its ability to improve intestinal absorption and potentially wean patients off parenteral nutrition; some patients will achieve enteral autonomy, the authors wrote. “The very short half-life of native GLP-2 has been extended to allow daily subcutaneous injection in the recombinant molecule, teduglutide.”

However, because teduglutide is a growth factor and can promote the growth of polyps and cancer, it is contraindicated in patients with active gastrointestinal malignancies. Patients should undergo colonoscopy before treatment and periodically thereafter, the authors advised. The benefits of its use in patients with nongastrointestinal malignancy should be weighed carefully against these risks.

“The significant side effects of teduglutide and the cost mandate that teduglutide is employed only after optimizing diet and the more conventional SBS treatments described previously in carefully selected patients with [short bowel syndrome–intestinal failure],” the authors wrote.
 

Dosing drugs effectively

Medications in tablet form need to dissolve before being absorbed. Most oral medications are absorbed within the proximal jejunum, so they can be used in patients with SBS.

“However,” the authors noted, “sustained- and delayed-release medications should be avoided.”

They suggested that, when applicable, alternatives such as liquids and topical medications should be considered, as should the monitoring of medication levels in the blood.

If a patient does not respond, approaches to consider may include increasing a dose, changing dose frequency, or changing drug formulation or route of administration, such as intravenous, subcutaneous, or transdermal.
 

Including parenteral nutrition and oral rehydration

Almost all patients with SBS will need parenteral nutrition (PN) support following resection, and few will be able to stop it before discharge from the hospital.

“Although more than 50% of adults with SBS are able to be weaned completely from PN within 5 years of diagnosis, the probability of eliminating PN use is less than 6% if not successfully accomplished in the first 2 years following the individual’s last bowel resection,” the authors wrote.

For long-term PN, tunneled central venous catheters are preferred over peripherally inserted central venous catheters because of the latter’s higher risk of thrombosis and issues related to self-administration of PN. Tunneled catheters are also preferred over totally implanted devices (ports) for long-term use, because the main benefit of the port is not realized when the device must be continually accessed and exchanged weekly.

“When calculating PN volume and content, changes in the patient’s weight, laboratory results, stool or ostomy output, urine output, and complaints of thirst should be monitored,” the authors noted.

The authors also discussed oral rehydration solution because patients lose more water and sodium from their stoma than they take in by mouth. Careful consideration of the glucose and sodium levels in oral fluids is important because inappropriate fluids will exacerbate fluid losses in SBS. For example, hypotonic (including water, tea, coffee, alcohol) and hypertonic (including fruit juices and sodas) solutions should be limited.

“A major misconception on the part of patients is that they should drink large quantities of water; however, this generally leads to an increase in ostomy output and creates a vicious cycle further exacerbating fluid and electrolyte disturbances,” they wrote, instead advising glucose–electrolyte rehydration solution to enhance absorption and reduce secretion.
 

 

 

Preventing complications

“A knowledge of these complications is critical for those caring for these patients to be able to not only identify and treat them when they occur but also to prevent their occurrence whenever possible,” the authors wrote. Although they considered it beyond the scope of the review to outline every complication, they indicated some complications and management strategies via an included table. These complications can include cirrhosis, osteoporosis, acute kidney disease, and central venous catheter–related infection or occlusion.

Considering further surgery or intestinal transplantation

The authors noted that any further surgery should be carefully considered, with the following three contexts having possible value: “(1) to recruit unused distal bowel, (2) to augment the function of residual bowel through specific lengthening and tapering operations, or (3) to slow intestinal transit.”

Surgeons involved in managing SBS may need to confront complex intra-abdominal problems such as massive desmoid tumors, mesenteric ischemia, or complex enterocutaneous fistulae; a multidisciplinary intestinal rehabilitation team may be better able to help these patients. The authors noted that care for patients starts even before the first operation, by taking every measure to avoid massive bowel resection and the resulting SBS.

The authors noted the importance of early referral for intestinal transplantation consideration for patients with refractory dependency on parenteral nutrition or even onset of parenteral nutrition failure, which refers to complications such as intestinal failure–associated liver disease.

“At present, nearly 50% of patients being considered for ITX are also requiring simultaneous liver replacement, indicating late referral for ITX,” they wrote, citing data from a report by the Centers for Medicare & Medicaid Services.

They also noted that data have shown short- and medium-term outcomes are steadily improving; however, long-term outcomes have been challenged by opportunistic infections, long-term graft attrition, and other impediments that may be preventing early referral for intestinal transplantation.
 

Educating patients, caregivers

Long-term PN may restrict activity for patients, but patients and caregivers should know about some modifications.

One is to cycle the PN over 10-14 hours overnight to allow freedom from the infusion pump during the day. Infusion pumps can be programmable, and some can be carried in a backpack for infusing during the day.

The authors recommend patient support groups, such as the Oley Foundation, which can help with issues surrounding body image and travel.

Because of the relative rarity of SBS, nonspecialist physicians may care for patients without a dedicated multidisciplinary team and may need educational support in managing patients with complex care needs. One resource the authors recommend is the Learn Intestinal Failure Tele-ECHO (Expanding Community Healthcare Outcomes), or LIFT-ECHO, project, which has become an online educational community with case-based learning in SBS, intestinal failure, and PN.

The authors disclose relationships with Takeda, Zealand, VectivBio, Napo, and Hanmi.

Publications
Topics
Sections

 

Caring for patients with short bowel syndrome (SBS) requires a multidisciplinary approach involving dietitians, nurses, surgeons, gastroenterologists or internists, and social workers experienced in SBS care, according to a clinical practice update expert review from the American Gastroenterological Association.

Kishore Iyer, MD, from Mount Sinai Hospital New York; John K. DiBaise, MD, from Mayo Clinic in Scottsdale, Ariz.; and Alberto Rubio-Tapia, MD, from the Cleveland Clinic, Ohio, developed 12 best practice advice statements based on available evidence. The items focus on adult patients with SBS; however, there was some overlap with the management of pediatric SBS. The review was published online in Clinical Gastroenterology and Hepatology.
 

Defining SBS

One update concerns defining SBS. The authors recommend that surgeons performing massive resections should report the residual bowel length, rather than the length of bowel resected.

“It is only the former that dictates outcome,” they wrote.

There is general agreement that a residual small intestinal length of 200 cm or less meets criteria for SBS. Measurement should be taken from “along the antimesenteric border of unstretched bowel, from the duodenojejunal flexure to the ileocecal junction, the site of any small bowel–colon anastomosis, or to the end-ostomy.”

Based on the residual bowel length, patients can be classified into three groups: end-jejunostomy, jejuno-colic, and jejuno-ileo-colic.
 

Assessing nutritional status

A dietitian experienced in SBS should perform a thorough nutritional assessment on all SBS patients. Long-term monitoring should include laboratory studies checking electrolytes and liver and kidney function, fluid balance, weight change, serum micronutrients, and bone density. Bone density should be repeated periodically, every 2-3 years.

Fluid and electrolyte problems may affect outcomes for SBS patients, particularly for those without a colon.
 

Adjusting diets

Most adult patients with SBS have significant malabsorption, so dietary intake “must be increased by at least 50% from their estimated needs,” the authors wrote. It’s best if the patient consumes the increased quantity throughout the day in 5-6 meals, they noted.

An experienced dietitian should counsel the patient based on the patient’s eating preferences. Incorporating preferences can help increase compliance with the adjustments that may become necessary based on symptoms, stool output, and weight.
 

Using pharmacologic therapy

Using antisecretory medications, including proton pump inhibitors or histamine-2 receptor antagonists, helps reduce gastric secretions, the damage of acid on the upper gut mucosa, and the function of pancreatic exocrine enzymes.

Antidiarrheals reduce intestinal motility but also cause a slight reduction in intestinal secretion. Common agents include loperamide, diphenoxylate with atropine, codeine, and tincture of opium. The review authors say loperamide should get preference over opiate drugs because it is not addictive or sedative.

Use of antidiarrheals should be guided by their effect on stool output.

“Loperamide and codeine may have a synergistic effect when used together,” the authors wrote.

Clonidine, which can be given transdermally, has also shown some benefit in treating high-output stool losses, presumably because of its effects on intestinal motility and secretion.

 

 

Weighing risks and benefits of teduglutide

The glucagonlike peptide–2 teduglutide is of particular interest for its ability to help improve intestinal absorption and hopefully wean patients off parenteral nutrition and some will achieve enteral autonomy, the authors wrote. “The very short half-life of native GLP-2 has been extended to allow daily subcutaneous injection in the recombinant molecule, teduglutide.”

However, because teduglutide is a growth factor and can boost the growth of polyps and cancer, it is contraindicated in patients with active gastrointestinal malignancies. Patients should undergo colonoscopy before treatment and periodically thereafter, the authors advised. The benefits of its use in patients with nongastrointestinal malignancy should be weighed carefully with these risks.

“The significant side effects of teduglutide and the cost mandate that teduglutide is employed only after optimizing diet and the more conventional SBS treatments described previously in carefully selected patients with [short bowel syndrome–intestinal failure],” the authors wrote.
 

Dosing drugs effectively

Medications in tablet form need to dissolve before being absorbed. Most oral medications are absorbed within the proximal jejunum, so they can be used in patients with SBS.

“However,” the authors noted, “sustained- and delayed-release medications should be avoided.”

They suggested that, when applicable, alternatives such as liquids and topical medications should be considered, as should the monitoring of medication levels in the blood.

If a patient does not respond, approaches to consider may include increasing a dose, changing dose frequency, or changing drug formulation or route of administration, such as intravenous, subcutaneous, or transdermal.
 

Including parenteral nutrition and oral rehydration

Almost all patients with SBS will need parenteral nutrition (PN) support following resection, and few will be able to stop it before discharge from the hospital.

“Although more than 50% of adults with SBS are able to be weaned completely from PN within 5 years of diagnosis, the probability of eliminating PN use is less than 6% if not successfully accomplished in the first 2 years following the individual’s last bowel resection,” the authors wrote.

For long-term PN, tunneled central venous catheters are preferred over peripherally inserted central venous catheters because of the higher risk of thrombosis and issues related to self-administration of PN with the central catheters. Also, tunneled catheters are preferred over totally implanted devices, or ports, for long-term patients because the main benefit of the port is not realized given that the device needs to be continually accessed and exchanged weekly.

“When calculating PN volume and content, changes in the patient’s weight, laboratory results, stool or ostomy output, urine output, and complaints of thirst should be monitored,” the authors noted.

The authors also discussed oral rehydration solution because patients lose more water and sodium from their stoma than they take in by mouth. Careful consideration of the glucose and sodium levels in oral fluids is important because inappropriate fluids will exacerbate fluid losses in SBS. For example, hypotonic (including water, tea, coffee, alcohol) and hypertonic (including fruit juices and sodas) solutions should be limited.

“A major misconception on the part of patients is that they should drink large quantities of water; however, this generally leads to an increase in ostomy output and creates a vicious cycle further exacerbating fluid and electrolyte disturbances,” they wrote, instead advising glucose–electrolyte rehydration solution to enhance absorption and reduce secretion.
 

 

 

Preventing complications

“A knowledge of these complications is critical for those caring for these patients to be able to not only identify and treat them when they occur but also to prevent their occurrence whenever possible,” the authors wrote. Although they considered it beyond the scope of the review to outline every complication, they indicated some complications and management strategies via an included table. These complications can include cirrhosis, osteoporosis, acute kidney disease, and central venous catheter–related infection or occlusion.

Considering further surgery or intestinal transplantation

The authors noted that any further surgery should be carefully considered, with the following three contexts having possible value: “(1) to recruit unused distal bowel, (2) to augment the function of residual bowel through specific lengthening and tapering operations, or (3) to slow intestinal transit.”

Surgeons involved in managing SBS may need to confront complex intra-abdominal problems such as massive desmoid tumors, mesenteric ischemia, or complex enterocutaneous fistulae; a multidisciplinary intestinal rehabilitation team may be better able to help these patients. The authors noted that care for patients starts even before the first operation, by taking every measure to avoid massive bowel resection and the resulting SBS.

The authors noted the importance of early referral for intestinal transplantation consideration for patients with refractory dependency on parenteral nutrition or even onset of parenteral nutrition failure, which refers to complications such as intestinal failure–associated liver disease.

“At present, nearly 50% of patients being considered for ITX are also requiring simultaneous liver replacement, indicating late referral for ITX,” they wrote, citing a data from a report by the Centers for Medicare and Medicaid.

They also noted that data show short- and medium-term outcomes are steadily improving; however, long-term outcomes remain challenged by opportunistic infections, graft attrition, and other impediments, which may in turn be discouraging early referral for intestinal transplantation.
 

Educating patients, caregivers

Long-term PN may restrict activity for patients, but patients and caregivers should know about some modifications.

One is to cycle the PN over 10-14 hours overnight to allow freedom from the infusion pump during the day. Infusion pumps can be programmable, and some can be carried in a backpack for infusing during the day.
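As a rough, purely illustrative calculation (the volume below is hypothetical and not taken from the review), cycling simply concentrates the same PN volume into a shorter window, which raises the hourly infusion rate:

% Illustrative only: assumes a hypothetical 2,400-mL nightly PN prescription.
\[
\text{rate} = \frac{\text{PN volume}}{\text{infusion window}}, \qquad
\frac{2400\ \text{mL}}{12\ \text{h}} = 200\ \text{mL/h}, \qquad
\frac{2400\ \text{mL}}{10\ \text{h}} = 240\ \text{mL/h}.
\]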

The authors recommend patient support groups, such as the Oley Foundation, which can help with issues surrounding body image and travel.

Because of the relative rarity of SBS, nonspecialist physicians may care for patients without a dedicated multidisciplinary team and may need educational support in managing patients with complex care needs. One resource the authors recommend is the Learn Intestinal Failure Tele-ECHO (Expanding Community Healthcare Outcomes), or LIFT-ECHO, project, which has become an online educational community offering case-based learning in SBS, intestinal failure, and PN.

The authors disclose relationships with Takeda, Zealand, VectivBio, Napo, and Hanmi.

 


Endoscopy experts review training, assessment evidence

The foundation of high-quality endoscopy
Article Type
Changed
Tue, 09/20/2022 - 16:50

Endoscopic training is increasingly complex as benchmarks for quality evolve and tools and procedures advance with innovation.

A team of experts, led by Matthew J. Whitson, MD, with Hofstra University/Northwell Health in Hempstead, N.Y., aimed to simplify challenges for educators and clinical endoscopists with a review of tools and techniques for education, as well as assessment methods.

Their review was published in Techniques and Innovations in Gastrointestinal Endoscopy.
 

Giving feedback

Key steps to effective feedback include discussing the goals for the endoscopy session up front, then observing carefully while giving minimal feedback during the endoscopy itself. Most of the feedback should come after the endoscopy, the authors wrote.

A paper by Walsh et al. demonstrated that, for beginning endoscopists, feedback given after the procedure led to long-term skill development, whereas frequent feedback in the middle of procedures yielded only short-term benefits.

Feedback should be constructive and specific with emphasis on goals for the next procedure. It should be delivered in a respectful, nonthreatening way for greatest effectiveness.
 

Mastery learning

In this model, each trainee must achieve competence in specific skills to progress to the next level.

“For example, the trainee must master retroflexion in the stomach prior to attempting clip hemostasis in the stomach,” the authors wrote.

Repetitively practicing the skill is coupled with direct feedback.

Mastery learning is often paired with simulation so trainees can practice in a safe space before working with patients.
 

Cognitive load theory

Knowing the challenges of learners can help educators with instruction techniques. An important concept is cognitive load theory (CLT). CLT is focused on the working memory of a learner and the harm that an overload of information can have on learning. A learner’s working memory can process only a few pieces of information at any given time, the theory states.

One mitigation strategy by educators may be to assign a trainee a smaller piece of a specific task appropriate to the trainee’s skill level.

“For example, an early trainee endoscopist may be able to inject epinephrine for a bleeding vessel, but not be ready to perform effective BiCap cautery,” the authors suggested.
 

Different learning styles

Learning styles include visual, aural, reading/writing, and kinesthetic styles (when learners need to touch or manipulate to learn a skill).

“A study of surgical trainees demonstrated that kinesthetic learning was the most preferred unimodal learning style of those entering the field,” the authors wrote.

Dr. Whitson and coauthors gave examples of working with trainees with different learning styles.

A trainee who learns visually, they say, might need to learn about “loop reduction” by looking at images of alpha or beta loops or using ScopeGuide during endoscopy. A kinesthetic learner may need to feel a successful loop reduction with hands on the endoscope during simulation to understand the skill better.

“There is some suggestion that the millennial generation – the demographic of the current gastroenterology fellows – may have higher preferences for kinesthetic learning,” the authors wrote.
 

Role of simulation

The Accreditation Council for Graduate Medical Education, which oversees gastroenterology fellowship training, mandates simulation in gastroenterology education but does not specifically require endoscopic simulation. The American Board of Surgery, however, does require its trainees to complete the flexible endoscopy curriculum, which is simulation based.

Simulator use appears particularly helpful early in training. One study demonstrated that colonoscopy simulation benefits the first 30 colonoscopies in terms of depth of insertion, independent completion, and ability to identify landmarks.

However, another study found no benefit from simulation beyond the first 50 colonoscopies, the authors wrote. Given the high cost of simulators, finding uses for them beyond diagnostic procedures will be important for medical centers to justify purchasing more of them.
 

Procedural volume

Dr. Whitson and colleagues wrote that using sheer procedural volume as a measure of competency is falling out of favor, and there is growing recognition in the field that competency will come at different times, and at different volumes, for individual trainees.

A study assessing competency in esophagogastroduodenoscopy (EGD), for example, demonstrated that, while most trainees achieve independent intubation of the second part of the duodenum by 150 procedures, it takes between 200 and 250 procedures for the average fellow to reach competency in all motor skills for a standard EGD, and 300 to become efficient (Gastrointest Endosc. 2019;90(4):613-20).

Assessment of skills has evolved from numbers of procedures to competency-based assessments to the development of direct observation tools.
 

Coaching for the practicing endoscopist

Most studies on coaching have focused on trainees, but coaching can be used with experienced endoscopists as well.

One study investigated the use of direct verbal coaching to train experienced practitioners in water immersion colonoscopy, “which resulted in shorter cecal intubation times, improved [adenoma detection rate], and less use of sedation during procedures,” the authors noted. Another study currently underway in the United Kingdom uses electronic feedback coupled with education and training to change behaviors and improve polyp detection performance in colonoscopy.

The authors noted that using one of these tools or strategies does not preclude using another.

“[I]n fact educators likely will recognize the utility of incorporating multiple of these techniques in the same endoscopy session with a trainee,” the authors wrote.

One author holds stock in Boston Scientific. The remaining authors disclose no conflicts.


Whitson, Williams, and Shah eloquently and thoroughly explore the principles central to successful training and review the latest understanding of best practices to apply them. Beyond fellows, this will have ongoing relevance to practicing endoscopists as they must learn new skills and in turn apply them in teaching others.
 

Dr. Jonathan Cohen
Feedback is an essential aspect of deliberative practice in my experience. The art of giving useful feedback requires one to be introspective, interactive, and iterative. Introspective: The instructor must assess the presession skill level and adjust the learning plan with observation in real time of student performance, taking primary consideration simultaneously for patient safety. From these inputs, the teacher decides a) what helpful information to convey, b) how best to deliver the message, and c) when best to do it. Interactive: “How best to deliver” will usually entail an approach to stop the action and ask the trainee to consider the present challenge and possible solutions rather than to dictate an action or demonstrate what to do. It is not always best to wait until after the procedure – contrary to what the authors favor in this article – but interruptions must be few and focused. Iterative: The lessons taught are building blocks that ideally relate to prior challenges and set the agenda for next learning goals.

The authors rightly emphasize the concern for cognitive overload. In practice, a maximum of one or two take-home lessons per mentored training session is a good rule of thumb; the converse should be equally emphasized, that every single proctored training examination ought to be mined for at least one relevant lesson, be it technical, cognitive, or another nontechnical pearl related to teamwork, professionalism, and so on.

Much simulator investigation to date, including my own, has focused on technical skills and performance outcomes – more work is needed especially in web, simulator, and even AI-based tools to teach cognitive skills for recognizing abnormalities, identifying them, and making real-time evidence-based decisions. The value of simulator-based teaching of endoscopic nontechnical skills, practice in troubleshooting common mishaps, and teaching by counter example of what not to do are other areas of great promise. There is not yet a prescribed, evidence-based guideline regarding which simulation devices should be used, at what stages of fellowship, and how often; however, the principles described in this paper provide the road map for how simulation ought to be integrated into endoscopic teaching.

Jonathan Cohen, MD, is a clinical professor of medicine at New York University. He is the editor of “Successful Training in Gastrointestinal Endoscopy,” 2nd ed. (Hoboken, N.J.: Wiley-Blackwell, 2022) and an investigator in ex vivo and computer endoscopy simulators. He is a consultant for Olympus America and Micro-Tech Endoscopy, receives royalties from Wolters Kluwer and Wiley, and holds stock in GI Windows, Virtual Health Partners, ROMtech, and MD Medical Navigators.


Exercise may counteract genetics for gestational diabetes

Article Type
Changed
Thu, 09/01/2022 - 09:26

Women giving birth for the first time have significantly higher odds of developing gestational diabetes if they have a high polygenic risk score (PRS) and low physical activity, new data suggest.

Researchers, led by Kymberleigh A. Pagel, PhD, with the department of computer science, Indiana University, Bloomington, concluded that physical activity early in pregnancy is associated with reduced risk of gestational diabetes and may help women who are at high risk because of genetic predisposition, age, family history of diabetes, and body mass index.

The researchers included 3,533 women (average age, 28.6 years) in the analysis, a subcohort of a larger study. They found that physical activity’s association with lower gestational diabetes risk “was particularly significant in individuals who were genetically predisposed to diabetes through PRS or family history,” the authors wrote.

Women with high PRS and low levels of physical activity had more than three times the odds of developing gestational diabetes (odds ratio, 3.4; 95% confidence interval, 2.3-5.3).
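As a rough, back-of-the-envelope illustration of what an odds ratio of this size can mean in absolute terms, the calculation below assumes (purely for illustration) the roughly 7% U.S. prevalence of gestational diabetes cited later in this article as the baseline risk, not the study’s actual reference-group rate:

% Illustrative only: assumes a 7% baseline risk of gestational diabetes.
\[
\text{baseline odds} = \frac{0.07}{1 - 0.07} \approx 0.075, \qquad
\text{odds}_{\text{high PRS, low activity}} \approx 3.4 \times 0.075 \approx 0.26,
\]
\[
\text{implied risk} = \frac{0.26}{1 + 0.26} \approx 0.20, \ \text{or about a 20\% absolute risk.}
\]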

Those with high PRS and moderate to high activity levels in early pregnancy (metabolic equivalents of task [METs] of at least 450) had gestational diabetes risk similar to that of the general population, according to the researchers.

The findings were published in JAMA Network Open.

Dr. Maisa Feghali

Maisa Feghali, MD, a maternal-fetal specialist at the University of Pittsburgh Medical Center, who was not part of the study, said in an interview she found the link of physical activity and compensation for high predisposition to gestational diabetes most interesting.

“That’s interesting because a lot of studies that have looked at prevention of gestational diabetes either through limited weight gain or through some form of counseling on physical activity have not really shown any benefit,” she noted. “It might just be it’s not just one size fits all and it may be that physical activity is mostly beneficial in those with a high predisposition.”

Research in this area is particularly important as 7% of pregnancies in the United States each year are affected by gestational diabetes and the risk for developing type 2 diabetes “has doubled in the past decade among patients with GD [gestational diabetes],” the authors wrote.

Researchers looked at risks for gestational diabetes in high-risk subgroups, including women who had a body mass index of more than 25 kg/m2 or were at least 35 years old. In that group, women who were either in the top 25th percentile for PRS or had low physical activity (METs less than 450) had a 25%-75% greater risk of developing gestational diabetes.

The findings are consistent with previous research and suggest exercise interventions may be important in improving pregnancy outcomes, the authors wrote.

Christina Han, MD, division director for maternal-fetal medicine at University of California, Los Angeles, who was not part of the study, pointed out several limitations of the study, however.

One of the biggest limitations, she said, was that “they excluded two-thirds of the original study. Essentially, they took only Caucasian [White] patients, which is about one-third of the study.” Additionally, the cohort was made up of people who had never had babies.

“Lots of our gestational diabetes patients are not first-time moms, so this makes the generalizability of the study very limited,” Dr. Han said.

She added that none of the sites where the study was conducted were in the South or Northwest, which also adds questions about generalizability.

Dr. Feghali and Dr. Han reported no relevant financial relationships.


Vaccine hope now for leading cause of U.S. infant hospitalizations: RSV

Article Type
Changed
Thu, 09/01/2022 - 12:34

Respiratory syncytial virus (RSV) is the leading cause of U.S. infant hospitalizations overall and across population subgroups, new data published in the Journal of Infectious Diseases confirm.

Acute bronchiolitis caused by RSV accounted for 9.6% (95% confidence interval, 9.4%-9.9%) and 9.3% (95% CI, 9.0%-9.6%) of total infant hospitalizations from January 2009 to September 2015 and October 2015 to December 2019, respectively.
 

Journal issue includes 14 RSV studies

The latest issue of the journal includes a special section with results from 14 studies related to the widespread, easy-to-catch virus, highlighting the urgency of finding a solution for all infants.

In one study, authors led by Mina Suh, MPH, with EpidStrategies, a division of ToxStrategies in Rockville, Md., reported that, in U.S. children under the age of 5 years, RSV caused an estimated 58,000 hospitalizations and 100-500 deaths annually from 2009 to 2019 (the latest year for which data were available).

Globally, in 2015, among infants younger than 6 months, an estimated 1.4 million hospital admissions and 27,300 in-hospital deaths were attributed to RSV lower respiratory tract infection (LRTI).

The researchers used the largest publicly available, all-payer database in the United States – the National (Nationwide) Inpatient Sample – to describe the leading causes of infant hospitalizations.

The authors noted that, because clinicians don’t routinely perform lab tests for RSV, the true health care burden is likely higher and its public health impact greater than these numbers show.

Immunization candidates advance

There are no preventive options currently available that substantially cut RSV infections in all infants, though immunization candidates are advancing, with safety and efficacy shown in clinical trials.

Palivizumab is currently the only option available in the United States to prevent RSV, and it is recommended only for a small group of infants with particular forms of heart or lung disease and those born prematurely (before 29 weeks’ gestational age). Further, palivizumab must be given monthly throughout the RSV season.

Another of the studies in the journal supplement concluded that a universal immunization strategy with one of the candidates, nirsevimab (Sanofi, AstraZeneca), an investigational long-acting monoclonal antibody, could substantially reduce the health burden and economic burden for U.S. infants in their first RSV season.

The researchers, led by Alexia Kieffer, MSc, MPH, with Sanofi, used static decision-analytic modeling for the estimates. Modeled RSV-related outcomes included primary care and emergency department visits, hospitalizations (including ICU admissions and mechanical ventilation), and RSV-related deaths.

“The results of this model suggested that the use of nirsevimab in all infants could reduce health events by 55% and the overall costs to the payer by 49%,” the authors of the study wrote.

According to the study, universal immunization of all infants with nirsevimab would be expected to prevent 290,174 RSV-related medically attended LRTIs (MALRTIs) and 24,986 hospitalizations, and to cut costs to the health care system by $612 million.

The authors wrote: “While this reduction would be driven by term infants, who account for most of the RSV-MALRTI burden; all infants, including palivizumab-eligible and preterm infants who suffer from significantly higher rates of disease, would benefit from this immunization strategy.”
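For readers curious how a static decision-analytic model of this general kind is put together, the sketch below is a minimal, purely illustrative outline. Every input value is a hypothetical placeholder, and the function name and structure are this article’s invention, not the parameters or code of Kieffer and colleagues, whose model distinguished outcome types, age groups, and payer perspectives in far more detail.

# Minimal sketch (illustrative only) of a static decision-analytic model comparing
# expected RSV-related events and payer costs with vs. without universal immunization.
# All numbers below are hypothetical placeholders, not the study's inputs.

def static_decision_model(cohort_size, event_rates, unit_costs, reduction):
    """Return events and costs avoided, by outcome, under a proportional reduction."""
    results = {}
    for outcome, rate in event_rates.items():
        baseline_events = cohort_size * rate          # expected events without immunization
        avoided_events = baseline_events * reduction  # events prevented by the strategy
        results[outcome] = {
            "events_avoided": avoided_events,
            "cost_avoided": avoided_events * unit_costs[outcome],
        }
    return results

# Hypothetical example inputs, for illustration only
example = static_decision_model(
    cohort_size=3_700_000,  # assumed annual U.S. birth cohort
    event_rates={"outpatient_visit": 0.10, "hospitalization": 0.02},      # events per infant-season
    unit_costs={"outpatient_visit": 300.0, "hospitalization": 15_000.0},  # payer cost per event
    reduction=0.55,         # illustrative proportional reduction, echoing the quoted 55%
)
for outcome, res in example.items():
    print(f"{outcome}: {res['events_avoided']:,.0f} events avoided, ${res['cost_avoided']:,.0f} saved")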
 

 

 

Excitement for another option

Jörn-Hendrik Weitkamp, MD, professor of pediatrics and director for patient-oriented research at Monroe Carell Jr. Children’s Hospital at Vanderbilt University, Nashville, Tenn., said in an interview there is much excitement in the field for nirsevimab as it has significant advantages over palivizumab.

Dr. Jörn-Hendrik Weitkamp

RSV “is a huge burden to the children, the families, the hospitals, and the medical system,” he said.

Ideally there would be a vaccine to offer the best protection, he noted.

“People have spent their lives, their careers trying to develop a vaccine for RSV,” he said, but a vaccine has remained elusive for more than 60 years. Therefore, passive immunization is the best of the current options, he said, and nirsevimab “seems to be very effective.”

What’s not clear, Dr. Weitkamp said, is how much nirsevimab will cost as it is not yet approved by the Food and Drug Administration. However, it has the great advantage of being given only once before the season starts instead of monthly (as required for palivizumab) through the season, “which is painful, inconvenient, and traumatizing. We limit that one to the children at highest risk.”

Rolling out an infant nirsevimab program would likely vary by geographic region, Ms. Kieffer and colleagues said, to help ensure infants are protected during the peak of their region’s RSV season.

The journal’s RSV supplement was supported by Sanofi and AstraZeneca. The studies by Ms. Suh and colleagues and Ms. Kieffer and colleagues were supported by AstraZeneca and Sanofi. Ms. Suh and several coauthors are employees of EpidStrategies. One coauthor is an employee of Sanofi and may hold shares and/or stock options in the company. Ms. Kieffer and several coauthors are employees of Sanofi and may hold shares and/or stock options in the company. Dr. Weitkamp reported no relevant financial relationships.

Watching TV, using computer have opposite ties to dementia risk

Article Type
Changed
Tue, 08/23/2022 - 13:07

Watching TV may increase your risk of dementia, while using a computer may lower it, new research suggests.

The associations between these activities and dementia remained strong no matter how much physical activity a person did, the authors wrote in the Proceedings of the National Academy of Sciences.

Both watching TV and using a computer have been linked to increased risk of chronic disease and mortality, while exercise and physical activity (PA) have shown benefit in reducing cognitive decline, structural brain atrophy, and dementia risk in older adults, the authors wrote.

The authors said they wanted to try to understand the effects of watching TV and using computers on dementia risk, because people in the United States and Europe have been engaging in both of these activities more often.

They concluded that it’s not the sitting part of sedentary behavior (SB) that potentially has the effect on dementia but what people are doing while sitting.

Some of the results were surprising, lead author David Raichlen, PhD, professor of human and evolutionary biology at the University of Southern California, Los Angeles, said in an interview.

Previous literature on sedentary behaviors has documented their negative effects on a wide range of health outcomes, rather than finding positive associations, he explained.

More than 140,000 included in study

The researchers conducted their prospective cohort study using data from the United Kingdom Biobank. After excluding people younger than 60, those with prevalent dementia at the start of follow-up, and those without complete data, 146,651 participants were included.

The participants were followed from their baseline visit until they received a dementia diagnosis, died, were lost to follow-up, or were last admitted to the hospital.

TV-watching time was linked with an increased risk of incident dementia (HR [95% confidence interval] = 1.31 [1.23-1.40]), and computer use was linked with a reduced risk of incident dementia (HR [95% CI] = 0.80 [0.76-0.85]).

TV’s link with higher dementia risk increased in those who had the highest use, compared with those who had the lowest use (HR [95% CI] = 1.28 [1.18-1.39]).

Similarly, the link with risk reduction for dementia with computer use increased with more use.

Both medium and high computer time were associated with reduced risk of incident dementia (HR [95% CI] = 0.70 [0.64-0.76] and HR [95% CI] = 0.76 [0.70-0.83], respectively).

Dr. Raichlen pointed out that high TV use in this study was 4 or more hours a day, and that computer use – which included leisure use, not work use – was linked with benefits on dementia risk after just half an hour.

These results remained significant after researchers adjusted for demographic, health, and lifestyle variables, including time spent on physical activity, sleeping, obesity, alcohol consumption, smoking status, diet scores, education level, body mass index, and employment type.
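The hazard ratios above are estimates from Cox proportional hazards models adjusted for those covariates. As a rough illustration of what such an estimate involves, the hedged sketch below fits a Cox model to simulated data with the lifelines Python library; the variable names, effect sizes, and data are invented, and this is not the authors’ analysis code.

```python
# Hedged sketch: fitting a Cox proportional hazards model to simulated data to
# obtain adjusted hazard ratios, loosely analogous to the analysis described above.
# All data and effect sizes below are simulated/hypothetical.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(seed=0)
n = 500

tv_hours = rng.uniform(0, 5, n)        # hypothetical daily leisure TV time
computer_hours = rng.uniform(0, 3, n)  # hypothetical daily leisure computer time
age = rng.uniform(60, 80, n)

# Simulate event times whose hazard rises with TV time and falls with computer time.
hazard = 0.02 * np.exp(0.25 * tv_hours - 0.20 * computer_hours + 0.03 * (age - 70))
event_time = rng.exponential(scale=1.0 / hazard)
censor_time = rng.uniform(5, 12, n)    # administrative censoring at end of follow-up

df = pd.DataFrame({
    "followup_years": np.minimum(event_time, censor_time),
    "dementia": (event_time <= censor_time).astype(int),
    "tv_hours": tv_hours,
    "computer_hours": computer_hours,
    "age": age,
})

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_years", event_col="dementia")
cph.print_summary()  # exp(coef) is the adjusted hazard ratio for each covariate
```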
 

Physical is still better than sedentary activity

One potential reason for the different effects on dementia risk in the two activities studied, the authors write, is that sitting down to watch TV is associated with “uniquely low levels of muscle activity and energy expenditure, compared with sitting to use a computer.”

Andrew Budson, MD, chief of Cognitive & Behavioral Neurology and Associate Chief of Staff for Education for the VA Boston Healthcare System, Mass., who was not part of the study, said he thinks a more likely explanation for the study findings lies in the active versus passive tasks required in the two kinds of viewing that the authors reference.

“When we’re doing cognitive activity involving using the computer, we’re using large parts of our cortex to carry out that activity, whereas when we’re watching TV, there are probably relatively small amounts of our brain that are actually active,” Dr. Budson, author of Seven Steps to Managing Your Memory, explained in an interview.

“This is one of the first times I’ve been convinced that even when the computer activity isn’t completely new and novel, it may be beneficial,” Dr. Budson said.

It would be much better to do physical activity, but if the choice is sedentary activity, active cognitive activities, such as computer use, are better than TV watching, he continued.

The results of the current study are consistent with previous work showing that the type of sedentary behavior matters, according to the authors.

“Several studies have shown that TV time is associated with mortality and poor cardiometabolic biomarkers, whereas computer time is not,” they wrote.

A limitation of the study is that sedentary behaviors were self-reported via questionnaires, and there may be errors in recall.

“The use of objective methods for measuring both SB and PA are needed in future studies,” they write.

The authors receive support from the National Institutes of Health, the State of Arizona, the Arizona Department of Health Services, and the McKnight Brain Research Foundation. Neither the authors nor Dr. Budson declared relevant financial relationships.

Atrial cardiopathy linked to 35% higher dementia risk

Article Type
Changed
Mon, 08/22/2022 - 14:09

Older adults with atrial cardiopathy may have up to 35% higher risk for dementia even before symptoms develop, new research suggests.

“We cautiously suggest that an understanding of this relationship might provide a basis for new interventional strategies to help thwart the development of dementia,” the authors write.

The research, led by Michelle C. Johansen, MD, department of neurology, Johns Hopkins University, Baltimore, was published online in the Journal of the American Heart Association.

Atrial cardiopathy, characterized by abnormal size and function of the left atrium, has been associated with an increased risk of stroke and atrial fibrillation (AFib), and because both stroke and AFib are associated with an increased dementia risk, the authors write, it was important to investigate whether atrial cardiopathy is linked to dementia.

If that’s the case, they reasoned, the next question was whether that link is independent of AFib and stroke, and their new research suggests that it is.

For this analysis, the researchers conducted a prospective cohort analysis of participants in the Atherosclerosis Risk in Communities (ARIC) study who were attending visit 5 (2011-2013). During their fifth, sixth, and seventh clinical visits, the ARIC participants were evaluated for cognitive decline indicating dementia.

They studied a diverse population of 5,078 older adults living in four U.S. communities: Washington County, Md.; Forsyth County, N.C.; the northwestern suburbs of Minneapolis; and Jackson, Miss.

Just over a third (34%) had atrial cardiopathy (average age, 75 years; 59% female; 21% Black), and 763 participants developed dementia.

Investigators found that atrial cardiopathy was significantly associated with dementia (adjusted hazard ratio, 1.35 [95% confidence interval, 1.16-1.58]).

They considered ARIC participants to have atrial cardiopathy if they had at least one of the following: P-wave terminal force greater than 5,000 μV·ms in ECG lead V1; NT-proBNP greater than 250 pg/mL; or left atrial volume index greater than or equal to 34 mL/m2 by transthoracic echocardiography.

The association with dementia was even stronger when the researchers defined cardiopathy by at least two biomarkers instead of one (aHR, 1.54 [95% CI, 1.25-1.89]).
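In practice, both definitions amount to counting how many of those three biomarker thresholds a participant crosses. The short sketch below spells that logic out in Python; the thresholds are the ones reported in the article, but the function and field names are illustrative rather than ARIC study code.

```python
# Illustrative sketch of the atrial cardiopathy definitions described above.
# Thresholds are those reported in the article; names and example values are hypothetical.

def count_cardiopathy_criteria(ptfv1_uv_ms: float,
                               ntprobnp_pg_ml: float,
                               la_volume_index_ml_m2: float) -> int:
    """Count how many of the three atrial cardiopathy criteria are met."""
    criteria = [
        ptfv1_uv_ms > 5000,           # P-wave terminal force in ECG lead V1
        ntprobnp_pg_ml > 250,         # NT-proBNP
        la_volume_index_ml_m2 >= 34,  # left atrial volume index on echocardiography
    ]
    return sum(criteria)


def has_atrial_cardiopathy(n_criteria: int, strict: bool = False) -> bool:
    """Primary definition requires >=1 criterion; the stricter definition requires >=2."""
    return n_criteria >= (2 if strict else 1)


n = count_cardiopathy_criteria(ptfv1_uv_ms=5600, ntprobnp_pg_ml=180, la_volume_index_ml_m2=35)
print(n, has_atrial_cardiopathy(n), has_atrial_cardiopathy(n, strict=True))  # 2 True True
```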

The authors point out, however, that this study is observational and cannot make a causal link.

Clifford Kavinsky, MD, PhD, head of the Comprehensive Stroke and Cardiology Clinic at Rush University Medical Center, Chicago, told this news organization that much more research would need to be done to show convincingly that atrial cardiopathy causes dementia.

He called the findings “provocative in trying to understand in a general sense how cardiac dysfunction leads to dementia.”

“We all know heart failure leads to dementia, but now we see there may be a relationship with just dysfunction of the upper chambers,” he said.
 

Unresolved questions

But it is still not clear what is mediating the connection, who is at risk, and how the increased risk can be prevented, he said.

He said he also wonders whether the analysis excluded all patients with atrial fibrillation, a point the authors acknowledge as well.

The researchers note among the limitations that “asymptomatic AFib or silent cerebral infarction may have been missed by the ARIC adjudication process.”

There is broad understanding that preventing heart disease is important for a wide array of reasons, Dr. Kavinsky noted, and one of the reasons is cognitive deterioration.

He said this study helps identify that “even dysfunction of the upper chambers of the heart contributes to the evolution of dementia.”

The study amplifies the need to shift toward prevention of heart disease in general, and of atrial dysfunction more specifically, Dr. Kavinsky said, noting that a lot of atrial dysfunction is mediated by underlying hypertension and coronary disease.

Researchers evaluated cognitive decline in all participants with a comprehensive array of neuropsychological tests and interviewed some of the patients.

“A diagnosis of dementia was generated based on testing results by a computer diagnostic algorithm and then decided upon by an expert based on the Diagnostic and Statistical Manual of Mental Disorders and the criteria outlined by the National Institutes of Health,” they write.

Dr. Johansen reported funding from the National Institute of Neurological Disorders and Stroke. Study coauthor disclosures are listed in the paper. Dr. Kavinsky has disclosed no relevant financial relationships.

A version of this article first appeared on Medscape.com.

Prenatal test can cut time, cost of finding chromosomal abnormalities

Article Type
Changed
Fri, 08/19/2022 - 14:16

A prenatal test can accurately detect an incorrect number of chromosomes more quickly and at about one-tenth the cost of current clinical genetic tests, new data suggest.

Aneuploid pregnancies are a major cause of pregnancy loss, developmental delays, and fetal structural abnormalities, so there is high interest in screening options.

Study leader Zev Williams, MD, PhD, professor of women’s health and chief of the division of reproductive endocrinology and infertility at Columbia University, New York, and colleagues describe a test they have developed and validated in a letter to the editor published in the New England Journal of Medicine.

The new test is called STORK (Short-Read Transpore Rapid Karyotyping) and can be used in doctors’ offices. The test uses a palm-sized, nanopore-based DNA sequencer to examine tissue from miscarriages or from a biopsy of the placenta or in vitro fertilization (IVF) embryo to determine if it has a normal count of chromosomes.

Results can be delivered in 2 hours, the researchers said. Sequencing times and costs range from 10 minutes and $200 for a single sample to 2 hours and less than $50 per sample when 10 samples are tested simultaneously.

Currently available tests cost thousands of dollars, and results take days to weeks.

“What’s so exciting is that STORK can be used to rapidly assess chromosomal health across all reproductive tissue types,” Dr. Williams said in a press release.

“For those patients who are trying to get pregnant through IVF, the test gives the ability to conceive sooner,” he said.

IVF embryos are typically biopsied for chromosomal testing on day 5 or 6 and are frozen for weeks until they can be implanted in a woman’s uterus. Freezing may not be necessary with rapid tests, as embryos found normal could be transferred immediately.

Dr. Williams added: “For those who are already pregnant, it gives more time to make important family-planning decisions. For those who have had a miscarriage, it can show why the loss happened so that steps can be taken to prevent future pregnancy losses.”

Existing tests take two main approaches. One is a rapid, targeted approach, which tests only a limited number of chromosomes; the other is a whole-genome approach, which takes days or weeks to return results and requires sending samples to specialized laboratories.

“The affordability of this [STORK] test also means that individuals who have suffered a miscarriage do not have to wait until a second or third loss before insurance will cover expensive lab tests, leaving many women in the dark and often blaming themselves,” Dr. Williams said.

The researchers used STORK to perform blinded testing using 218 specimens from miscarriage tissue, placenta samples, amniotic fluid, and biopsy specimens from embryos undergoing preimplantation genetic tests for aneuploidy (PGT-A).

They compared the results from STORK with those obtained using standard clinical testing.

For miscarriage tissue samples and placenta and amniotic-fluid samples, STORK calculated the number of chromosomes “with 100% accuracy (95% confidence intervals, 94.3%-100%, 93.2%-100%, and 92.9%-100%, respectively),” the authors wrote.

For PGT-A samples, STORK results matched the clinical diagnosis of the embryos 98.1% of the time (95% CI, 89.7%-100%), they reported.
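The confidence intervals quoted alongside those accuracy figures behave the way exact binomial (Clopper-Pearson) intervals do: even a perfect 100% result carries a lower bound that depends on how many samples were tested. The sketch below computes such intervals in Python; the sample counts are hypothetical, and the letter’s exact statistical method is not spelled out here.

```python
# Hedged sketch: exact binomial (Clopper-Pearson) confidence intervals for an observed
# accuracy. Sample counts below are hypothetical; this is not necessarily the method
# used in the NEJM letter, but it shows why 100% accuracy still has a lower CI bound.
from scipy.stats import beta

def clopper_pearson(successes: int, n: int, alpha: float = 0.05) -> tuple[float, float]:
    """Exact two-sided (1 - alpha) confidence interval for a binomial proportion."""
    lower = 0.0 if successes == 0 else beta.ppf(alpha / 2, successes, n - successes + 1)
    upper = 1.0 if successes == n else beta.ppf(1 - alpha / 2, successes + 1, n - successes)
    return lower, upper

print(clopper_pearson(60, 60))  # 60/60 correct: roughly (0.940, 1.0)
print(clopper_pearson(52, 53))  # 52/53 correct (~98.1%): lower bound near 0.90
```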

According to the Columbia University press release, the researchers are waiting for authorization from the New York State Department of Health before the test can be offered to Columbia patients.

Sarah Wernimont, MD, PhD, a maternal-fetal physician with the University of Minnesota, Minneapolis, who was not part of the study, said in an interview she found the results “exciting.”

“For patients I care for who have abnormal screening results, there’s the potential to receive diagnostic testing in a much faster way that can help parents make serious decisions about pregnancy care in a more timely fashion,” she said.

She said the quickest test used in her practice for detection of aneuploidy, the fluorescence in situ hybridization test, takes 3 days to return results and tests only a few chromosomes.

The STORK test, she noted, has the potential to test all the chromosomes.

She said the sample size is small and she would like to see more external validation in a larger population, but “their sensitivity and specificity compared to the current standard seems to be excellent.”

The study was supported by the National Institutes of Health, the Biomedical Engineering Technology Accelerator at Columbia University and the Wendy and John Havens Innovation fund. Dr. Williams and one study coauthor are inventors on patents filed related to this work. Dr. Wernimont declared no relevant financial relationships.

Postpartum depression risk higher with family psych history

Article Type
Changed
Mon, 08/22/2022 - 08:58

Mothers who have a family history of any psychiatric disorder have almost two times the risk of postpartum depression as do mothers without such history, according to a new study.

Mette-Marie Zacher Kjeldsen, MSc, with the National Centre for Register-based Research at Aarhus (Denmark) University, led the study, a meta-analysis that included 26 studies with information on 100,877 women.

Findings were published online in JAMA Psychiatry.

When mothers had a family history of psychiatric disorders, the odds ratio for PPD was 2.08 (95% confidence interval, 1.67-2.59). That corresponds to a risk ratio of 1.79 (95% CI, 1.52-2.09), assuming a 15% postpartum depression prevalence in the general population.
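The two figures are linked by the standard conversion of an odds ratio to a risk ratio at a given baseline prevalence (the Zhang-Yu formula, RR = OR / (1 - p0 + p0 × OR)). The study reports both numbers; the minimal sketch below simply shows the arithmetic, assuming that conversion.

```python
def or_to_rr(odds_ratio: float, baseline_risk: float) -> float:
    """Convert an odds ratio to an approximate risk ratio, given the outcome
    prevalence in the reference (unexposed) group (Zhang & Yu, JAMA 1998)."""
    return odds_ratio / (1 - baseline_risk + baseline_risk * odds_ratio)

# An OR of 2.08 with an assumed 15% baseline PPD prevalence corresponds
# to a risk ratio of about 1.79, matching the figures quoted above.
print(round(or_to_rr(2.08, 0.15), 2))  # 1.79
```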
 

Not doomed to develop PPD

Polina Teslyar, MD, a perinatal psychiatrist at Brigham and Women’s Hospital in Boston, told this news organization it’s important to point out that, though the risk is higher, women with a family psychiatric history should not feel as though they are destined to develop PPD.

“You are still more likely to not have postpartum depression, but it is important to be aware of personal risk factors so that if a person is experiencing that, they ask for help quickly rather than suffering and not knowing something is amiss,” she emphasized. Dr. Teslyar said that in her own practice she does see a higher risk for PPD, which is preventable and treatable, among women with a family history of psychiatric disorders.

Dr. Polina Teslyar

The association makes sense, she said, but the literature on why has been varied; the link likely involves both genetics and socioeconomic factors, and it is difficult to tease apart how large a part each plays.

In her perinatal practice, she sees women even before they are pregnant to discuss risk factors for PPD, so she does ask about family history of psychiatric disorders, specifically about histories of PPD and anxiety.

The researchers suggest that routine perinatal care should include an easy, low-cost, two-part question about both personal and family history of psychiatric disorders.

“As the assessment is possible even prior to conception, this would leave time for planning preventive efforts, such as psychosocial and psychological interventions targeting these at-risk women,” the authors write.
 

Asking about family history a challenge

Dr. Teslyar noted, though, that one of the challenges in asking about family history is that families may not have openly shared psychiatric history details with offspring. Family members may also report conditions they suspect a relative had rather than a documented diagnosis.

In places where there is universal health care, she noted, finding documented diagnoses is easier, but otherwise “you’re really taking a subjective interpretation.”

The researchers found that subgroup, sensitivity, and meta-regression analyses aligned with the primary findings. The overall certainty of evidence was graded as moderate.

This study was not able to make clear how the specific diagnoses of family members affect the risk of developing PPD because much of the data from the studies came from self-report and questions were not consistent across the studies.

For instance, only 7 studies asked specifically about first-degree family members and 10 asked about specific diagnoses. Diagnoses ranged from mild affective disorders to more intrusive disorders, such as schizophrenia.

And while this study doesn’t seek to determine why the family history and risk of PPD appear to be connected, the authors offer some possible explanations.

“Growing up in an environment with parents struggling with mental health problems potentially influences the social support received from these parents when going into motherhood,” the authors write. “This particular explanation is supported by umbrella reviews concluding that lack of social support is a significant PPD risk factor.”

Screening, extraction, and assessment of studies included was done independently by two reviewers, increasing validity, the authors note.

The authors state that approximately 10%-15% of new mothers experience PPD, but Dr. Teslyar pointed out that figures cited in the United States typically run as high as 20%-30%. PPD ranges from mild to severe episodes and includes symptoms like those of major depression outside the postpartum period.

Study authors received funding from The Lundbeck Foundation and the European Union’s Horizon 2020 Research and Innovation Programme. A coauthor, Vibe G. Frokjaer, MD, PhD, has served as consultant and lecturer for H. Lundbeck and Sage Therapeutics. No other disclosures were reported. Dr. Teslyar reports no relevant financial relationships.

FROM JAMA PSYCHIATRY
