Is evolution’s greatest triumph its worst blunder?
Of all the dazzling achievements of evolution, the most glorious by far is the emergence of the advanced human brain, especially the prefrontal cortex. Homo sapiens (the “wise humans”) are without doubt the most transformative development in the annals of evolution. The human brain was evolution’s spectacular “moonshot.” Ironically, it may also have been the seed of evolution’s destruction.
The unprecedented growth of the human brain over the past 7 million years (roughly tripling in size) was a monumental tipping point that ultimately disrupted the entire orderly cascade of evolution on Planet Earth. Because of their superior intelligence, Homo sapiens have substantially “tinkered” with the foundations of evolution, such as “natural selection” and “survival of the fittest,” and may eventually change the course of evolution, or even reverse it. It should also be recognized that Neanderthal DNA persists in modern humans: roughly 20% of the Neanderthal genome survives, scattered in fragments across contemporary human genomes. The 2022 Nobel Prize in Physiology or Medicine was awarded to Svante Pääbo, the founder of the field of paleogenetics, who demonstrated genetically that Homo sapiens interbred with Homo neanderthalensis (who disappeared approximately 30,000 years ago).
The majestic evolution of the human brain, in both size and complexity, led to monumental changes in the history of humankind compared with its primitive predecessors. Thanks to a superior cerebral cortex, humans developed traits and abilities that were nonexistent, even unimaginable, in the rest of the animal kingdom, including primates and other mammals. Humans generated complex thought; developed spoken and written speech (hundreds of languages) to communicate among themselves; composed music and created numerous instruments to play it; invented mathematics, physics, and chemistry; developed agriculture to sustain and feed the masses; built homes, palaces, and pyramids, with water and sewage systems; hatched hundreds of religions and built thousands of houses of worship; built machines to transport themselves (cars, trains, ships, planes, and space shuttles); paved airports and countless miles of roads and railways; established companies, universities, hospitals, and research laboratories; built sports facilities, such as stadiums for the Olympic games and all their athletic events; created hotels, restaurants, coffee shops, newspapers, and magazines; discovered the amazing DNA double helix and its genome, with approximately 23,000 coding genes containing instructions to build the brain and 200 other body tissues; developed surgeries and invented medications for diseases that would otherwise have killed millions every year; and established paper money to replace gold and silver coins. Humans established governments that included monarchies, dictatorships, democracies, and pseudodemocracies; stipulated constitutions, laws, and regulations to maintain various societies; and created several civilizations around the world that thrived and then faded. Over the past century, the advanced human brain elevated human existence to a higher level of sophistication with technologies such as electricity, telephones, computers, the internet, artificial intelligence, and machine learning. Using powerful rockets and space stations, humans have begun to extend their influence to the moon and the planets of the solar system. Humans are very likely to continue achieving what evolution could never have accomplished without evolving the human brain, which has become the most powerful force in nature.
The key ingredient of the brain that has enabled humans to achieve so much is advanced cognition, with superior functions that far exceed those of other living organisms. These include neurocognitive functions, such as memory and attention, and executive functions, such as planning, problem-solving, decision-making, abstract thinking, and insight. Those cognitive functions generate lofty prose, splendiferous poetry, and heavenly symphonies that inspire their creators and others alike. The human brain also developed social cognition, with empathy, theory of mind, recognition of facial expressions, and courtship rituals that can trigger infatuation and love. Homo sapiens can experience a wide range of emotions beyond love and attachment (necessary for procreation), including shame, guilt, surprise, embarrassment, disgust, and indifference, as well as a unique sense of right and wrong.
Perhaps the most distinctive human attribute, generated by an advanced prefrontal cortex, is a belief system that includes philosophy, politics, religion, and faith. Hundreds of different religions sprouted throughout human history (each claiming a monopoly on “the truth”), mandating rituals and behaviors, but also promoting a profound and unshakable belief in a divine “higher being” and an afterlife that mitigates the fear of death. Humans, unlike other animals, are painfully aware of mortality and the inevitability of death. Faith is an antidote for thanatophobia. Unfortunately, religious beliefs often generated severe and protracted schisms and warfare, with fatal consequences for their followers.
The anti-evolution aspect of the advanced brain
Despite remarkable talents and achievements, the unprecedented evolutionary expansion of the human brain also has a detrimental downside. The same intellectual power that led to astonishing positive accomplishments has a wicked side as well. While most animals have a predator, humans have become the “omni-predator” that preys on all living things. The balanced ecosystems of animals and plants have been dominated and disrupted by humans. Thousands of species that evolution had so ingeniously spawned became extinct because of human actions. The rainforests, jewels of nature’s plantation system, were victimized by human indifference to the deleterious effects on nature and climate. The excavation of coal and oil, exploited as necessary sources of energy for societal infrastructure, came back to haunt humans with climate consequences. In many ways, human “progress” corrupted evolution and dismantled its components. Survival of the fittest among various species was whittled down to “survival of humans” (and their domesticated animals) at the expense of all other organisms, whether animals or plants.
Among Homo sapiens, momentous scientific, medical, and technological advances completely undermined the principle of survival of the fittest. Very premature infants, who would have certainly died, were kept alive. Children with disabling genetic disorders who would have perished in childhood were kept alive into the age of procreation, perpetuating the genetic mutations. The discovery of antibiotic and antiviral medications, and especially vaccines, ensured the survival of millions of humans who would have succumbed to infections. With evolution’s natural selection, humans who survived severe infections without medications would have passed on their “infection-resistant genes” to their progeny. The triumph of human medical progress can be conceptualized as a setback for the principles of evolution.
The most malignant consequence of the exceptional human brain is the evil of which it is capable. Human ingenuity led to the development of weapons of individual killing (guns), large-scale murder (machine guns), and massive destruction (nuclear weapons). And because aggression and warfare are an inherent part of human nature, the most potent predator for a human is another human. The history of humans is riddled with conflict and death on a large scale. Ironically, many wars were instigated by various religious groups around the world, who developed intense hostility towards one another.
There are other downsides to the advanced human brain. It can channel its talents and skills into unimaginably wicked and depraved behaviors, such as premeditated and well-planned murder, slavery, cults, child abuse, domestic abuse, pornography, fascism, dictatorships, and political corruption. Astonishingly, the same brain that can be loving, kind, friendly, and empathetic can suddenly become hateful, vengeful, cruel, vile, sinister, vicious, diabolical, and capable of unimaginable violence and atrocities. The advanced human brain definitely has a very dark side.
Finally, unlike other members of the animal kingdom, the human brain generates its virtual counterpart: the highly complex human mind, which is prone to various maladies labeled “psychiatric disorders.” No other animal species develops delusions, hallucinations, thought disorders, melancholia, mania, obsessive-compulsive disorder, generalized anxiety, panic attacks, posttraumatic stress disorder, psychopathy, narcissistic and borderline personality disorders, alcohol addiction, or drug abuse. Homo sapiens are the only species whose members decide to end their own lives in large numbers. About 25% of human minds are afflicted with one or more of those psychiatric ailments.1,2 The redeeming grace of the large human brain is that it led to the development of pharmacologic and somatic treatments for most of these conditions, as well as psychotherapy, a uniquely human treatment strategy that can mend many psychiatric disorders.
Evolution may not have realized what it hath wrought when it evolved the dramatically expanded human brain, with its extraordinary cognition. This awe-inspiring “biological computer” can be creative and adaptive, with superlative survival abilities, but it can also degenerate and become nefarious, villainous, murderous, and even demonic. The human brain has essentially brought evolution to a screeching halt and may at some point end up destroying Earth and all of its Homo sapiens inhabitants, who may foolishly use their weapons of mass destruction. The historic achievement of evolution has become the ultimate example of “the law of unintended consequences.”
1. Robins LN, Regier DA, eds. Psychiatric Disorders in America: The Epidemiologic Catchment Area Study. Free Press; 1990.
2. Johns Hopkins Medicine. Mental Health Disorder Statistics. Accessed October 12, 2022. https://www.hopkinsmedicine.org/health/wellness-and-prevention/mental-health-disorder-statistics
Warning: Watch out for ‘medication substitution reaction’
Editor’s note: Readers’ Forum is a department for correspondence from readers that is not in response to articles published in
I (MZP) recently started medical school, and one of the first things we learned in our Human Dimension class was to listen to our patients. While this may seem prosaic to seasoned practitioners, I quickly realized the important, real-world consequences of doing so.
Clinicians rightfully presume that when they send a prescription to a pharmacy, the patient will receive what they have ordered or the generic equivalent unless it is ordered “Dispense as written.” Unfortunately, a confluence of increased demand and supply chain disruptions has produced nationwide shortages of generic Adderall extended-release (XR) and Adderall, which are commonly prescribed to patients with attention-deficit/hyperactivity disorder (ADHD).1 While pharmacies should notify patients when they do not have these medications in stock, we have encountered numerous cases in which, because of shortages, prescriptions for generic dextroamphetamine/amphetamine salts XR or immediate-release (IR) were filled with the same milligram strength of dextroamphetamine alone (XR or IR, respectively), without notifying the patient or the prescribing clinician. These pharmacies have included several national chains and local independent stores in the New York/New Jersey region.
Over the past several months, we have encountered patients who had been well stabilized on their ADHD medication regimen who began to report anxiety, jitteriness, agitation, fatigue, poor concentration, and/or hyperactivity, and who also reported that their pills “look different.” At first, we considered that their symptoms could be attributed to a switch between generic manufacturers. However, upon further inspection, we discovered that the medication name printed on the label was different from what had been prescribed. We confirmed this by checking the Prescription Monitoring Program database.
Pharmacists have recently won prescribing privileges for nirmatrelvir/ritonavir (Paxlovid) to treat COVID-19, but they certainly are not permitted to substitute psychoactive controlled substances that have pharmacologic profiles different from the medication the clinician ordered. Adderall contains D-amphetamine and L-amphetamine in a 3:1 ratio, which makes it different in potency from dextroamphetamine alone; achieving near equivalency requires adjusting the dosage and potentially the dosing frequency.
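To illustrate why the two products are not interchangeable milligram for milligram, consider a simplified back-of-the-envelope calculation based on the 3:1 ratio noted above (it ignores salt-versus-base weight differences and is not dosing guidance): a 20-mg dose of the 3:1 mixed product delivers approximately 20 × 0.75 = 15 mg of D-amphetamine and 20 × 0.25 = 5 mg of L-amphetamine, whereas 20 mg of dextroamphetamine delivers approximately 20 mg of the more centrally potent D-isomer, roughly one-third more at the same labeled strength.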
Once we realized the issue and helped our patients locate a pharmacy that had generic Adderall XR and Adderall in stock so they could resume their previous regimen, their symptoms resolved.
It is important for all clinicians to add “medication substitution reaction” to their differential diagnosis of new-onset ADHD-related symptoms in previously stable patients.
1. Pharmaceutical Commerce. Innovative solutions for pandemic-driven pharmacy drug shortages. Published February 28, 2022. Accessed September 8, 2022. https://www.pharmaceuticalcommerce.com/view/innovative-solutions-for-pandemic-driven-pharmacy-drug-shortages
Psychotropic medications for chronic pain
The opioid crisis presents a need to consider alternative options for treating chronic pain. There is significant overlap in neuroanatomical circuits that process pain, emotions, and motivation. Neurotransmitters modulated by psychotropic medications are also involved in regulating the pain pathways.1,2 In light of this, psychotropics can be considered for treating chronic pain in certain patients. The Table1-3 outlines various uses and adverse effects of select psychotropic medications used to treat pain, as well as their psychiatric uses.
In addition to its psychiatric indications, the serotonin-norepinephrine reuptake inhibitor duloxetine is FDA-approved for treating fibromyalgia and diabetic neuropathic pain, and it is often prescribed for multiple pain disorders. Tricyclic antidepressants (TCAs) have the largest effect size in the treatment of neuropathic pain.2 Cyclobenzaprine, which is structurally a tricyclic, is used to treat muscle spasms. Gabapentinoids (which inhibit the alpha-2 delta-1 subunit of voltage-gated calcium channels) are FDA-approved for treating postherpetic neuralgia, fibromyalgia, and diabetic neuropathy.1,2
Ketamine is an anesthetic with analgesic and antidepressant properties used as an IV infusion to manage several pain disorders.2 The alpha-2 adrenergic agonists tizanidine and clonidine are muscle relaxants2; the latter is used to treat attention-deficit/hyperactivity disorder and Tourette syndrome. Benzodiazepines (GABA-A agonists) are used for short-term treatment of anxiety disorders, insomnia, and muscle spasms.1,2 Baclofen (GABA-B receptor agonist) is used to treat spasticity.2 Medical cannabis (tetrahydrocannabinol/cannabidiol) is also gaining popularity for treating chronic pain and insomnia.1-3
1. Sutherland AM, Nicholls J, Bao J, et al. Overlaps in pharmacology for the treatment of chronic pain and mental health disorders. Prog Neuropsychopharmacol Biol Psychiatry. 2018;87(Pt B):290-297.
2. Bajwa ZH, Wootton RJ, Warfield CA. Principles and Practice of Pain Medicine. 3rd ed. McGraw Hill; 2016.
3. McDonagh MS, Selph SS, Buckley DI, et al. Nonopioid Pharmacologic Treatments for Chronic Pain. Comparative Effectiveness Review No. 228. Agency for Healthcare Research and Quality; 2020. doi:10.23970/AHRQEPCCER228
The light at the end of the tunnel: Reflecting on a 7-year training journey
Throughout my training, a common refrain from more senior colleagues was that training “goes by quickly.” At the risk of sounding cliché, and even after a 7-year journey spanning psychiatry and preventive medicine residencies as well as a consultation-liaison psychiatry fellowship, I agree without reservations that it does indeed go quickly. In the waning days of my training, reflection and nostalgia have become commonplace, as one might expect after such a meaningful pursuit. In sharing my reflections, I hope others progressing through training will also reflect on elements that added meaning to their experience and how they might improve the journey for future trainees.
Residency is a team sport
One realization that quickly struck me was that residency is a team sport, and finding supportive communities is essential to survival. Other residents, colleagues, and mentors played integral roles in making my experience rewarding. Training might be considered a shared traumatic experience, but having peers to commiserate with at each step has been among its greatest rewards. Residency automatically provided a cohort of colleagues who shared and validated my experiences. Additionally, having mentors who have been through it themselves and find ways to improve the training experience made mine superlative. Mentors assisted me in tailoring my training and developing interests that I could integrate into my future practice. The interpersonal connections I made were critical in helping me survive and thrive during training.
See one, do one, teach one
Residency and fellowship programs might be considered “see one, do one, teach one”1 at large scale. Since their inception, these programs, designed to develop junior physicians, have been inherently educational in nature. The structure is elegant, allowing trainees to continue learning while incrementally gaining more autonomy and teaching responsibility.2 Naively, I did not understand that implicit within my education was an expectation to become an educator and hone my teaching skills. Initially, being a newly minted resident receiving brand-new 3rd-year medical students filled me with apprehension. Thoughts I internalized, such as “these students probably know more than me” or “how can I be responsible for patients and students simultaneously,” may have resulted from the paucity of instruction about teaching during medical school.3,4 I quickly found, though, that teaching was among the most rewarding facets of training. Helping other learners grow became one of my passions and added to my experience.
Iron sharpens iron
Although my experience was enjoyable, I would be remiss if I did not also consider the accompanying trials and tribulations. Seemingly interminable night shifts, sleep deprivation, lack of autonomy, and system inefficiencies frustrated me. Eventually, these frustrations seemed less bothersome. The challenges likely had not vanished with time; rather, my capacity to tolerate distress improved, likely in step with increasing skill and confidence. These challenges allowed me to hone my clinical decision-making abilities while under duress. My struggles and frustrations were not unique; perhaps they were lessons in themselves.
Residency is not meant to be easy. The crucible of residency taught me that I had resilience to draw upon during challenging times. “Iron sharpens iron,” as the adage goes, and I believe adversity ultimately helped me become a better psychiatrist.
Self-reflection is part of completing training
Reminders that my journey is at an end are everywhere. Seeing notes written by past residents or fellows reminds me that soon I too will merely be a name in the chart to future trainees. Perhaps this line of thought is unfair, reducing my training experience to notes I signed—whereas my training experience was defined by connections made with colleagues and mentors, opportunities to teach junior learners, and confidence gained by overcoming adversity.
While becoming an attending psychiatrist fills me with trepidation, fear need not be an inherent aspect of new beginnings. Reflection has been a powerful practice, allowing me to realize what made my experience so meaningful, and that training is meant to be process-oriented rather than outcome-oriented. My reflection has underscored the realization that challenges are inherent in training, although not without purpose. I believe these struggles were meant to allow me to build meaningful relationships with colleagues, discover joy in teaching, and build resiliency.
The purpose of residencies and fellowships should be to produce clinically excellent psychiatrists, but I feel the journey was as important as the destination. Psychiatrists likely understand this better than most, as we were trained to thoughtfully approach the process of termination with patients.5 While the conclusion of our training journeys may seem unceremonious or anticlimactic, the termination process should include self-reflection on meaningful facets of training. For me, this reflection has itself been invaluable, while also making me hopeful to contribute value to the training journeys of future psychiatrists.
1. Gorrindo T, Beresin EV. Is “See one, do one, teach one” dead? Implications for the professionalization of medical educators in the twenty-first century. Acad Psychiatry. 2015;39(6):613-614. doi:10.1007/s40596-015-0424-8
2. Wright JR Jr, Schachar NS. Necessity is the mother of invention: William Stewart Halsted’s addiction and its influence on the development of residency training in North America. Can J Surg. 2020;63(1):E13-E19. doi:10.1503/cjs.003319
3. Dandavino M, Snell L, Wiseman J. Why medical students should learn how to teach. Med Teach. 2007;29(6):558-565. doi:10.1080/01421590701477449
4. Liu AC, Liu M, Dannaway J, et al. Are Australian medical students being taught to teach? Clin Teach. 2017;14(5):330-335. doi:10.1111/tct.12591
5. Vasquez MJ, Bingham RP, Barnett JE. Psychotherapy termination: clinical and ethical responsibilities. J Clin Psychol. 2008;64(5):653-665. doi:10.1002/jclp.20478
Throughout my training, a common refrain from more senior colleagues was that training “goes by quickly.” At the risk of sounding cliché, and even after a 7-year journey spanning psychiatry and preventive medicine residencies as well as a consultation-liaison psychiatry fellowship, I agree without reservations that it does indeed go quickly. In the waning days of my training, reflection and nostalgia have become commonplace, as one might expect after such a meaningful pursuit. In sharing my reflections, I hope others progressing through training will also reflect on elements that added meaning to their experience and how they might improve the journey for future trainees.
Residency is a team sport
Throughout my training, a common refrain from more senior colleagues was that training “goes by quickly.” At the risk of sounding cliché, and even after a 7-year journey spanning psychiatry and preventive medicine residencies as well as a consultation-liaison psychiatry fellowship, I agree without reservations that it does indeed go quickly. In the waning days of my training, reflection and nostalgia have become commonplace, as one might expect after such a meaningful pursuit. In sharing my reflections, I hope others progressing through training will also reflect on elements that added meaning to their experience and how they might improve the journey for future trainees.
Residency is a team sport
One realization that quickly struck me was that residency is a team sport, and finding supportive communities is essential to survival. Other residents, colleagues, and mentors played integral roles in making my experience rewarding. Training might be considered a shared traumatic experience, but having peers to commiserate with at each step has been among its greatest rewards. Residency automatically provided a cohort of colleagues who shared and validated my experiences. Additionally, having mentors who have been through it themselves and find ways to improve the training experience made mine superlative. Mentors assisted me in tailoring my training and developing interests that I could integrate into my future practice. The interpersonal connections I made were critical in helping me survive and thrive during training.
See one, do one, teach one
Residency and fellowship programs might be considered “see one, do one, teach one”1 at large scale. Since their inception, these programs, designed to develop junior physicians, have been inherently educational in nature. The structure is elegant, allowing trainees to continue learning while incrementally gaining more autonomy and teaching responsibility.2 Naively, I did not understand that implicit within my education was an expectation to become an educator and hone my teaching skills. Initially, being a newly minted resident receiving brand-new 3rd-year medical students filled me with apprehension. Thoughts I internalized, such as “these students probably know more than me” or “how can I be responsible for patients and students simultaneously,” may have resulted from the paucity of instruction in teaching offered during medical school.3,4 I quickly found, though, that teaching was among the most rewarding facets of training. Helping other learners grow became one of my passions and added to my experience.
Iron sharpens iron
Although my experience was enjoyable, I would be remiss not to also consider the accompanying trials and tribulations. Seemingly interminable night shifts, sleep deprivation, lack of autonomy, and system inefficiencies frustrated me. Eventually, these frustrations seemed less bothersome. These challenges likely had not vanished with time, but perhaps my capacity to tolerate distress improved, likely corresponding with increasing skill and confidence. Facing them allowed me to hone my clinical decision-making abilities under duress. My struggles and frustrations were not unique but perhaps lessons themselves.
Residency is not meant to be easy. The crucible of residency taught me that I had resilience to draw upon during challenging times. “Iron sharpens iron,” as the adage goes, and I believe adversity ultimately helped me become a better psychiatrist.
Self-reflection is part of completing training
Reminders that my journey is at an end are everywhere. Seeing notes written by past residents or fellows reminds me that soon I too will merely be a name in the chart to future trainees. Perhaps this line of thought is unfair, reducing my training experience to notes I signed—whereas my training experience was defined by connections made with colleagues and mentors, opportunities to teach junior learners, and confidence gained by overcoming adversity.
While becoming an attending psychiatrist fills me with trepidation, fear need not be an inherent aspect of new beginnings. Reflection has been a powerful practice, allowing me to realize what made my experience so meaningful, and that training is meant to be process-oriented rather than outcome-oriented. My reflection has underscored the realization that challenges are inherent in training, although not without purpose. I believe these struggles were meant to allow me to build meaningful relationships with colleagues, discover joy in teaching, and build resiliency.
The purpose of residencies and fellowships should be to produce clinically excellent psychiatrists, but I feel the journey was as important as the destination. Psychiatrists likely understand this better than most, as we were trained to thoughtfully approach the process of termination with patients.5 While the conclusion of our training journeys may seem unceremonious or anticlimactic, the termination process should include self-reflection on meaningful facets of training. For me, this reflection has itself been invaluable, while also making me hopeful to contribute value to the training journeys of future psychiatrists.
1. Gorrindo T, Beresin EV. Is “See one, do one, teach one” dead? Implications for the professionalization of medical educators in the twenty-first century. Acad Psychiatry. 2015;39(6):613-614. doi:10.1007/s40596-015-0424-8
2. Wright Jr. JR, Schachar NS. Necessity is the mother of invention: William Stewart Halsted’s addiction and its influence on the development of residency training in North America. Can J Surg. 2020;63(1):E13-E19. doi:10.1503/cjs.003319
3. Dandavino M, Snell L, Wiseman J. Why medical students should learn how to teach. Med Teach. 2007;29(6):558-565. doi:10.1080/01421590701477449
4. Liu AC, Liu M, Dannaway J, et al. Are Australian medical students being taught to teach? Clin Teach. 2017;14(5):330-335. doi:10.1111/tct.12591
5. Vasquez MJ, Bingham RP, Barnett JE. Psychotherapy termination: clinical and ethical responsibilities. J Clin Psychol. 2008;64(5):653-665. doi:10.1002/jclp.20478
Lamotrigine for bipolar depression?
In reading Dr. Nasrallah's August 2022 editorial (“Reversing depression: A plethora of therapeutic strategies and mechanisms,”
Dr. Nasrallah responds
Thanks for your message. Lamotrigine is not FDA-approved for bipolar or unipolar depression, either as monotherapy or as an adjunctive therapy. It has never been approved for mania, either (no efficacy at all). Its only FDA-approved psychiatric indication is maintenance therapy after a patient with bipolar I disorder emerges from mania with the help of one of the antimanic drugs. Yet many clinicians may perceive lamotrigine as useful for bipolar depression because more than 20 years ago the manufacturer sponsored several small studies (not FDA trials). Two studies that showed efficacy were published, but 4 other studies that failed to show efficacy were not published. As a result, many clinicians got the false impression that lamotrigine is an effective antidepressant. I hope this explains why lamotrigine was not included in the list of antidepressants in my editorial.
A heartwarming welcome
Dear colleagues,
This November issue of The New Gastroenterologist marks my official transition as the new Editor in Chief! I am humbled by this opportunity to be a part of such a unique publication and have received immense support from Dr. Vijaya Rao, the TNG staff, and my mentors and colleagues. With its foundation built by Dr. Bryson Katona and then taken to the next level by Dr. Rao, TNG has grown over the years, and I hope that I can continue to extend its reach to more trainees and early faculty.
In this issue’s In Focus, Dr. Wenfei Wang and Dr. Neil Sengupta (both from the University of Chicago) review the management of antithrombotic medications for elective endoscopic procedures, emphasizing an individualized approach and summarizing guideline recommendations on how to balance gastrointestinal bleeding risk against underlying cardiovascular disease.
With endoscopic bariatric therapy and antiobesity medications burgeoning within gastroenterology, Dr. Singrid Young (New York University), Dr. Cameron Zenger (New York University), Dr. Erik Holzwanger (Harvard Medical School in Boston), and Dr. Violeta Popov (New York University) review how their multidisciplinary approach has made their endoscopic bariatric program successful in treating patients struggling with obesity. In our Ethics section, Dr. David Ney (Thomas Jefferson University Hospital, Philadelphia) and Dr. Jason Karlawish (University of Pennsylvania, Philadelphia) delve into patient capacity, particularly when consenting for procedures.
Being involved with national society committees may seem daunting to a lot of trainees and early faculty, but Dr. Peter S. Liang (New York University Langone Health) and Dr. Stephanie D. Pointer (Tristar Hendersonville Medical Center in Tennessee) describe their journeys to becoming AGA committee chairs as early-career physicians. While you ponder whether to join a committee, it may be a good time to learn new ways to increase your financial portfolio through passive income, detailed by Dr. Latifat Alli-Akintade (Kaiser Permanente South Sacramento Medical Center in California).
Last but not least, I am excited to introduce a personal favorite in this newsletter – a piece on women supporting women gastroenterologists in career development and more. Dr. Tonya Adams outlines action items for creating a culture that fosters professional and leadership development among women, using the Gastro Health Women’s Network as an example of how such an environment can be cultivated.
If you are interested in contributing or have ideas for future TNG topics, please contact me ([email protected]), or Jillian Schweitzer ([email protected]), managing editor of TNG.
Until next time, I leave you with an interesting historical fact: William Beaumont, the father of Gastroenterology, published the first findings on the digestive system after performing experiments on Alexis St. Martin, whose abdominal gunshot wound had left him with a large gastrocutaneous fistula.
Yours truly,
Judy A. Trieu, MD, MPH
Editor in Chief
Advanced Endoscopy Fellow, University of North Carolina at Chapel Hill, Division of Gastroenterology & Hepatology
Management of antithrombotic medications in elective endoscopy
Antithrombotic therapy is increasingly used to either reduce the risk of or treat thromboembolic episodes in patients with various medical conditions such as ischemic and valvular heart disease, atrial fibrillation (AF), cerebrovascular disease, peripheral arterial disease, venous thromboembolism (VTE) and hypercoagulable diseases. Antithrombotics include medications classified as anticoagulants or antiplatelets. Anticoagulants work by interfering with the native clotting cascade and consist of four main classes: vitamin K antagonists (VKA), heparin derivatives, direct factor Xa inhibitors, and direct thrombin inhibitors. Direct oral anticoagulants (DOACs) refer to dabigatran (a direct thrombin inhibitor) and the factor Xa inhibitors (apixaban, rivaroxaban, and edoxaban).
Antiplatelets, on the other hand, work by decreasing platelet aggregation and thus preventing thrombus formation; they include P2Y12 receptor inhibitors, protease-activated receptor-1 inhibitors, glycoprotein IIb/IIIa receptor inhibitors, acetylsalicylic acid (ASA), and nonsteroidal anti-inflammatory drugs. All of these agents may directly cause or increase the risk of gastrointestinal (GI) bleeding from luminal sources such as ulcers or diverticula, as well as after endoscopic interventions such as polypectomy. However, there is also a risk of thromboembolic consequences if some of these agents are withheld. Thus, the management of patients receiving antithrombotic agents who are undergoing GI endoscopy is an important clinical challenge that every GI physician confronts routinely.
The goal of this review is to discuss the optimal strategy for managing antithrombotics in patients undergoing elective endoscopy based on current available evidence and published clinical guidelines.1-4 Much of our discussion will review recommendations from the recently published joint American College of Gastroenterology (ACG) and Canadian Association of Gastroenterology (CAG) guidelines on management of anticoagulants and antiplatelets in the periendoscopic period by Abraham et al.4
Factors that guide decision-making
The two most important considerations before performing endoscopic procedures in patients receiving antithrombotic therapy are the bleeding risk associated with the procedure and the thromboembolic risk associated with the underlying medical condition for which the antithrombotic agents are being used. In addition, the individual characteristics of the antithrombotic agent(s) should be kept in mind when making these decisions.
Estimating procedure-related bleeding risk
Various endoscopic procedures have different risks of associated bleeding. Although guidelines from GI societies may differ when classifying procedures into low or high risk, it is important to know that most of the original data on postprocedural bleeding risks are from studies conducted in patients who are not on complex antithrombotic regimens and thus may not accurately reflect the bleeding risk of patients using newer antithrombotic therapies.1,4-7
Traditionally, some of the common low-risk procedures have included diagnostic EGD and colonoscopy with or without biopsy, ERCP without sphincterotomy, biliary stent placement, and push or balloon-assisted enteroscopy. On the other hand, endoscopic procedures associated with interventions are known to have higher bleeding risk, and other procedural factors can influence this risk as well.8 For example, polypectomy, one of the most common interventions during endoscopy, is associated with bleeding risk ranging from 0.3% to 10% depending on multiple factors including polyp size, location, morphology (nonpolypoid, sessile, pedunculated), resection technique (cold or hot forceps, cold or hot snare), and type of cautery used.9 For some procedures, such as routine screening colonoscopy, however, the preprocedure estimate of bleeding risk can be uncertain because it is unclear if a high-risk intervention (e.g., polypectomy of a large polyp) will be necessary. For example, in the most recent ACG/CAG guidelines, colonoscopy with polypectomy < 1 cm is considered a low/moderate-risk bleeding procedure, whereas polypectomy > 1 cm is considered high risk for bleeding.4 In these situations, the management of antithrombotic medications may depend on the individual patient’s risk of thrombosis and the specific antithrombotic agent. In the example of a patient undergoing colonoscopy while on antithrombotic medications, the bleeding risk associated with polypectomy can potentially be reduced by procedural techniques such as preferential use of cold snare polypectomy. Further high-quality data on the optimal procedural technique to reduce postpolypectomy bleeding in patients on antithrombotic medications are needed to help guide management.
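To make the size threshold concrete, here is a minimal Python sketch (illustrative only, not clinical decision support; the function name and the handling of a polyp of exactly 1 cm are assumptions):

```python
def polypectomy_bleeding_risk(polyp_size_cm: float) -> str:
    """Apply the ACG/CAG size threshold cited above: polypectomy of a polyp
    < 1 cm is treated as low/moderate bleeding risk, > 1 cm as high risk.
    A polyp of exactly 1 cm is not addressed in the text, so this sketch
    conservatively labels it high risk."""
    return "high" if polyp_size_cm >= 1.0 else "low/moderate"

print(polypectomy_bleeding_risk(0.6))  # low/moderate
print(polypectomy_bleeding_risk(1.5))  # high
```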
Estimating thromboembolic risk
The risk of thromboembolic events in patients who are withholding their antithrombotic therapy for an endoscopic procedure depends on their underlying condition and individual characteristics. In patients on antithrombotic therapy for stroke prevention in nonvalvular AF, the risk of cerebral thromboembolism can be estimated using the CHA2DS2-VASc index.10 This scoring index includes heart failure, hypertension, age 75 years or older, diabetes mellitus, prior stroke or transient ischemic attack (TIA), vascular disease, age 65-74 years, and sex category.
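As an illustration of how these components combine into a score, the following is a minimal Python sketch (for illustration only, not a clinical tool; the field names are assumptions):

```python
from dataclasses import dataclass

@dataclass
class AFPatient:
    """Hypothetical field names used only for this illustration."""
    chf: bool                 # congestive heart failure / LV dysfunction
    hypertension: bool
    age: int
    diabetes: bool
    prior_stroke_tia: bool    # prior stroke, TIA, or thromboembolism
    vascular_disease: bool    # e.g., prior MI or peripheral arterial disease
    female: bool

def cha2ds2_vasc(p: AFPatient) -> int:
    """Sum the CHA2DS2-VASc points listed above; age >= 75 and prior
    stroke/TIA each score 2, the remaining items score 1."""
    score = 0
    score += 1 if p.chf else 0
    score += 1 if p.hypertension else 0
    score += 2 if p.age >= 75 else (1 if 65 <= p.age <= 74 else 0)
    score += 1 if p.diabetes else 0
    score += 2 if p.prior_stroke_tia else 0
    score += 1 if p.vascular_disease else 0
    score += 1 if p.female else 0
    return score

# Example: a 76-year-old woman with hypertension and diabetes scores 5.
print(cha2ds2_vasc(AFPatient(False, True, 76, True, False, False, True)))
```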
Patients with previous VTE on anticoagulation or those who have mechanical heart valves may have different risk factors for thromboembolic episodes. Among patients with VTE, the time since the initial VTE, a history of recurrent VTE during antithrombotic interruption, and the presence of underlying thrombophilia are most predictive of future thromboembolic risk. For patients with mechanical heart valves, the presence of a mitral valve prosthesis and the presence or absence of associated heart failure and AF determine the annual risk of thromboembolic events. Bioprosthetic valves are generally considered low risk.
In patients with coronary artery disease (CAD), high-thrombosis-risk scenarios for holding antiplatelets include patients within 3 months of an acute coronary syndrome (ACS) event, within 6 months of drug-eluting stent (DES) placement, or within 1 month of bare metal coronary stent (BMS) placement. In addition, patients with ACS that occurred within the past 12 months of DES placement or within 2 months of BMS placement are also considered high risk.11,12 Even beyond these periods, certain patients may still be at high risk of stent occlusion. In particular, patients with a prior history of stent occlusion, ACS or ST-elevation myocardial infarction, prior multivessel percutaneous coronary intervention, diabetes, renal failure, or diffuse CAD are at higher risk of stent occlusion or ACS events with alteration of antithrombotic therapy.13 Thus, modification of antithrombotic regimens in these patients should be approached cautiously.
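These time windows can be summarized as a simple check, sketched below under one reading of the text (the function and argument names are assumptions, and the additional patient-level risk factors listed above are not captured):

```python
from typing import Optional

def high_risk_antiplatelet_interruption(months_since_acs: Optional[float],
                                        months_since_des: Optional[float],
                                        months_since_bms: Optional[float],
                                        stent_placed_for_acs: bool) -> bool:
    """Flag the high-thrombosis-risk windows described above: within 3 months
    of an ACS event, 6 months of DES placement, or 1 month of BMS placement;
    and, on this reading of the text, within 12 months of DES placement or
    2 months of BMS placement when the stent was placed for ACS."""
    if months_since_acs is not None and months_since_acs < 3:
        return True
    if months_since_des is not None:
        if months_since_des < 6:
            return True
        if stent_placed_for_acs and months_since_des < 12:
            return True
    if months_since_bms is not None:
        if months_since_bms < 1:
            return True
        if stent_placed_for_acs and months_since_bms < 2:
            return True
    return False

# Example: a DES placed 8 months ago for ACS is still within the high-risk window.
print(high_risk_antiplatelet_interruption(None, 8, None, True))  # True
```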
Management of antithrombotics prior to elective endoscopy
In patients who need elective endoscopic procedures, if the indication for antithrombotic therapy is short-term, the procedure is probably best delayed until after that period.13 For patients on long-term or lifelong antithrombotic treatment, the decision to temporarily hold the treatment for endoscopy should occur after a discussion with the patient and all of the involved providers. In some high-risk patients, these agents cannot be interrupted; therefore, clinicians must carefully weigh the risks and benefits of the procedure before proceeding with endoscopy. For patients who are known to be undergoing an elective endoscopic procedure, antithrombotic medications may or may not need to be held prior to the procedure depending on the type of therapy. For example, according to the recent ACG/CAG guidelines, warfarin should be continued, whereas DOACs should be temporarily stopped for patients who are undergoing elective/planned endoscopic GI procedures.
Unfractionated heparin (UFH) administered as a continuous intravenous infusion can generally be held 3-4 hours before the procedure, given its short half-life. Low molecular weight heparin (LMWH), including enoxaparin and dalteparin, should be stopped 24 hours prior to the procedure.2,14 Fondaparinux is a synthetic factor Xa inhibitor that should be discontinued at least 36 hours before a high-risk procedure. For patients on warfarin who are undergoing elective endoscopic procedures that are low risk for inducing bleeding, warfarin can be continued, as opposed to temporarily interrupted, although the dose should be omitted the morning of the procedure.4 For those who are undergoing high-risk endoscopic procedures (including colonoscopy with possible polypectomy > 1 cm), 5 days of temporary interruption without periprocedural bridging is appropriate in most patients. This is contrary to previous guidelines, which had recommended bridging for patients with a CHA2DS2-VASc score ≥ 2. Recent impactful randomized trials (BRIDGE and PERIOP-2) have called into question the benefit of periprocedural bridging with LMWH. Avoiding bridging anticoagulation was generally found to be similar to bridging with regard to prevention of thromboembolic complications but, importantly, was associated with a decreased risk of major bleeding.15,16 Of note, periprocedural bridging may still be appropriate in a small subset of patients, including those with mechanical valves, AF with a CHADS2 score > 5, and previous thromboembolism during temporary interruption of VKAs. The decision to bridge or not should ideally be made in a multidisciplinary fashion.15-20
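The warfarin pathway above can be condensed into a rough decision sketch (an illustration of the logic as described, not a substitute for the guideline or for multidisciplinary discussion; the function and argument names are assumptions):

```python
def warfarin_periendoscopic_plan(high_bleed_risk_procedure: bool,
                                 high_thrombosis_risk: bool) -> str:
    """Condense the warfarin logic described above: continue (omitting the
    morning dose) for low-risk procedures; hold about 5 days without bridging
    for most high-risk procedures; consider LMWH bridging only for the small
    high-thrombosis-risk subset (e.g., mechanical valve, AF with CHADS2 > 5,
    prior thromboembolism during VKA interruption)."""
    if not high_bleed_risk_procedure:
        return "continue warfarin; omit the dose on the morning of the procedure"
    if high_thrombosis_risk:
        return "hold warfarin ~5 days; discuss periprocedural LMWH bridging"
    return "hold warfarin ~5 days; no periprocedural bridging"

print(warfarin_periendoscopic_plan(high_bleed_risk_procedure=True,
                                   high_thrombosis_risk=False))
```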
Data are lacking on the ideal strategy for periendoscopic DOAC management. As mentioned above, for patients on DOACs who are undergoing elective endoscopic GI procedures, temporarily interrupting DOACs rather than continuing them is recommended. Currently, there are no randomized controlled trials addressing the management of DOACs in the periendoscopic period. However, based on five cohort studies, the ideal duration of DOAC interruption before endoscopic procedures may be between 1 and 2 days, excluding the day of the procedure.21-25 This strategy allows for a short preprocedural duration of DOAC interruption and likely provides a balance between bleeding and thromboembolism risk. Importantly, there are no reliable laboratory assays to assess the anticoagulant effect of DOACs, and an individual patient’s degree of renal dysfunction may affect how long the DOAC should be held. In general, the anti-Xa drugs should be held for 1-2 days if the creatinine clearance (CrCl) is ≥ 60 mL/min, for 3 days if the CrCl is between 30 mL/min and 59 mL/min, and for 4 days if the CrCl is less than 30 mL/min.26 For edoxaban, the recommendation is to hold at least 24 hours before high-risk procedures. The recommendation for stopping dabigatran is 2-3 days before a high-risk procedure in patients with CrCl greater than 80 mL/min, 3-4 days prior if CrCl is between 30 and 49 mL/min, and 4-6 days prior if CrCl is less than 30 mL/min.27
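A minimal sketch of the renal-function-based hold durations quoted above (illustrative only; the function names are assumptions, and the CrCl 50-80 mL/min range for dabigatran is deliberately left unresolved because the text does not address it):

```python
def anti_xa_hold_days(crcl_ml_min: float) -> str:
    """Approximate preprocedural hold for the anti-Xa DOACs before high-risk
    procedures, keyed to creatinine clearance as described above."""
    if crcl_ml_min >= 60:
        return "1-2 days"
    if crcl_ml_min >= 30:
        return "3 days"
    return "4 days"

def dabigatran_hold_days(crcl_ml_min: float) -> str:
    """Approximate preprocedural hold for dabigatran before high-risk
    procedures, per the ranges quoted above."""
    if crcl_ml_min > 80:
        return "2-3 days"
    if 30 <= crcl_ml_min <= 49:
        return "3-4 days"
    if crcl_ml_min < 30:
        return "4-6 days"
    return "not specified above; consult the source guidance"

print(anti_xa_hold_days(45))      # 3 days
print(dabigatran_hold_days(25))   # 4-6 days
```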
In regard to antiplatelet management, ASA and the P2Y12 receptor inhibitors (e.g., clopidogrel, prasugrel, and ticagrelor) are the most commonly used antiplatelets in patients undergoing endoscopic procedures. For patients who are on ASA monotherapy, whether 81 mg or 325 mg daily, for secondary cardiovascular prevention, no interruption of ASA therapy is necessary for elective procedures. The benefit of ASA for secondary cardiovascular prevention and the possible reduction in thrombotic events seen in RCTs of nonendoscopic surgical procedures are well known. However, there may be certain exceptions in which aspirin should be temporarily held. For example, short-term interruption of ASA could be considered for high-risk procedures such as biliary or pancreatic sphincterotomy, ampullectomy, and peroral endoscopic myotomy. For patients on single antiplatelet therapy with a P2Y12 receptor inhibitor who are undergoing elective endoscopic GI procedures, the recent CAG/ACG guidelines did not provide a clear recommendation for or against temporary interruption of the P2Y12 receptor inhibitor. Although interruption of a P2Y12 receptor inhibitor should theoretically decrease a patient’s risk of bleeding, the available evidence reported a nonsignificant increase in bleeding risk in patients who stop a P2Y12 receptor inhibitor for an elective endoscopic procedure compared with those who continue the medication.28,29 Therefore, until further data are available, a reasonable strategy for patients on P2Y12 receptor inhibitor monotherapy would be to temporarily hold therapy prior to high-risk endoscopic procedures, assuming the patient is not at high cardiovascular risk. Clopidogrel and prasugrel must be stopped 5-7 days before the procedure to allow normal platelet aggregation to resume, whereas ticagrelor, a reversible P2Y12 receptor inhibitor, can be stopped 3-5 days prior.30
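For quick reference, the interruption windows above can be tabulated as follows (a sketch only; the dictionary name and structure are assumptions, not part of any guideline):

```python
# Approximate preprocedural interruption windows described above, for patients
# on P2Y12 receptor inhibitor monotherapy before a high-risk procedure.
P2Y12_HOLD_BEFORE_HIGH_RISK_PROCEDURE = {
    "clopidogrel": "5-7 days",  # irreversible inhibitor
    "prasugrel": "5-7 days",    # irreversible inhibitor
    "ticagrelor": "3-5 days",   # reversible inhibitor, shorter washout
}
# ASA monotherapy for secondary prevention is generally continued; short-term
# interruption is considered only for selected high-risk procedures
# (e.g., sphincterotomy, ampullectomy, peroral endoscopic myotomy).
```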
Lastly, for patients who are on dual antiplatelet therapy (DAPT) for secondary prevention, continuation of ASA and temporary interruption of the P2Y12 receptor inhibitor is recommended while undergoing elective endoscopy. Studies have shown that those who discontinued both had a much higher incidence of stent thrombosis compared with those who remained on aspirin alone.4,28,31
Resumption of antithrombotic therapy after endoscopy
In general, antithrombotic therapy should be resumed upon completion of the procedure unless there remains a persistent risk of major bleeding.1,14 This consensus is based on studies available on warfarin and heparin products, with minimal literature available regarding the resumption of DOACs. The benefits of immediate re-initiation of antithrombotic therapy for the prevention of thromboembolic events should be weighed against the risk of hemorrhage associated with the specific agent, the time to onset of the medication, and procedure-specific circumstances. For the small subset of patients on warfarin with a high risk of thromboembolism (e.g., mechanical heart valve), bridging with LMWH should be started at the earliest possible time when there is no risk of major bleeding and continued until the international normalized ratio (INR) reaches a therapeutic level with warfarin. For patients at a lower risk of thromboembolism, warfarin should be restarted within 24 hours of the procedure. In addition, because of the shorter duration of action of DOACs, if treatment with these agents cannot resume within 24 hours of a high-risk procedure, bridge therapy with UFH or LMWH should be considered in patients at high risk of thrombosis.18 In patients receiving DOACs for stroke prophylaxis in AF, the DOACs can be safely resumed 1 day after low-risk procedures and 2-3 days after high-risk procedures without the need for bridging.25 All antiplatelet agents should be resumed as soon as hemostasis is achieved.
Conclusion
Antithrombotic therapy is increasingly used given the aging population, the widespread burden of cardiovascular comorbidities, and the expanding indications for classes of medications such as direct oral anticoagulants. Given the association between antithrombotic medications and gastrointestinal bleeding, it is essential for gastroenterologists to understand whether, when, and for how long these medications should be held for endoscopic procedures. Even with the practice guidelines available today to help clinicians navigate these situations, each patient’s antithrombotic management may differ, and communicating with the prescribing physicians and including patients in the decision-making process are essential before planned procedures.
Dr. Wang is a gastroenterology fellow at the University of Chicago. Dr. Sengupta is an associate professor at the University of Chicago. They reported no funding or conflicts of interest.
References
1. ASGE Standards of Practice Committee, Acosta RD et al. The management of antithrombotic agents for patients undergoing GI endoscopy. Gastrointest Endosc. 2016;83(1):3-16.
2. Veitch AM et al. Endoscopy in patients on antiplatelet or anticoagulant therapy, including direct oral anticoagulants: British Society of Gastroenterology (BSG) and European Society of Gastrointestinal Endoscopy (ESGE) guidelines. Endoscopy. 2016;48(4):c1. doi: 10.1055/s-0042-122686.
3. Chan FKL et al. Management of patients on antithrombotic agents undergoing emergency and elective endoscopy: Joint Asian Pacific Association of Gastroenterology (APAGE) and Asian Pacific Society for Digestive Endoscopy (APSDE) practice guidelines. Gut. 2018;67(3):405-17.
4. Abraham NS et al. American College of Gastroenterology – Canadian Association of Gastroenterology clinical practice guideline: Management of anticoagulants and antiplatelets during acute gastrointestinal bleeding and the periendoscopic period. Am J Gastroenterol. 2022;117(4):542-58.
5. Boustière C et al. Endoscopy and antiplatelet agents. European Society of Gastrointestinal Endoscopy (ESGE) guideline. Endoscopy. 2011;43(5):445-61.
6. Fujimoto K et al. Guidelines for gastroenterological endoscopy in patients undergoing antithrombotic treatment. Dig Endosc. 2014;26(1):1-14.
7. Wilke T et al. Patient preferences for oral anticoagulation therapy in atrial fibrillation: A systematic literature review. Patient. 2017;10(1):17-37.
8. Gerson LB et al. Adverse events associated with anticoagulation therapy in the periendoscopic period. Gastrointest Endosc. 2010 Jun;71(7):1211-17.e2.
9. Horiuchi A et al. Removal of small colorectal polyps in anticoagulated patients: A prospective randomized comparison of cold snare and conventional polypectomy. Gastrointest Endosc. 2014;79(3):417-23.
10. Lip GYH et al. Refining clinical risk stratification for predicting stroke and thromboembolism in atrial fibrillation using a novel risk factor-based approach: The euro heart survey on atrial fibrillation. Chest. 2010;137(2):263-72.
11. 2012 Writing Committee Members, Jneid H et al. 2012 ACCF/AHA focused update of the guideline for the management of patients with unstable angina/non-ST-elevation myocardial infarction (Updating the 2007 guideline and replacing the 2011 focused update): A report of the American College of Cardiology Foundation/American Heart Association Task Force on practice guidelines. Circulation. 2012;126(7):875-910.
12. Douketis JD et al. Perioperative management of antithrombotic therapy: Antithrombotic therapy and prevention of thrombosis, 9th ed: American College of Chest Physicians evidence-based clinical practice guidelines. Chest. 2012 Feb;141(2 Suppl):e326S-e350S.
13. Becker RC et al. Management of platelet-directed pharmacotherapy in patients with atherosclerotic coronary artery disease undergoing elective endoscopic gastrointestinal procedures. J Am Coll Cardiol. 2009;54(24):2261-76.
14. Kwok A and Faigel DO. Management of anticoagulation before and after gastrointestinal endoscopy. Am J Gastroenterol. 2009;104(12):3085-97; quiz 3098.
15. Douketis JD et al. Perioperative bridging anticoagulation in patients with atrial fibrillation. N Engl J Med. 2015;373(9):823-33.
16. Kovacs MJ et al. Postoperative low molecular weight heparin bridging treatment for patients at high risk of arterial thromboembolism (PERIOP2): Double blind randomised controlled trial. BMJ. 2021;373:n1205.
17. Tafur A and Douketis J. Perioperative management of anticoagulant and antiplatelet therapy. Heart. 2018;104(17):1461-7.
18. Kato M et al. Guidelines for gastroenterological endoscopy in patients undergoing antithrombotic treatment: 2017 appendix on anticoagulants including direct oral anticoagulants. Dig Endosc. 2018;30(4):433-40.
19. Inoue T et al. Clinical features of postpolypectomy bleeding associated with heparin bridge therapy. Dig Endosc. 2014;26(2):243-9.
20. Takeuchi Y et al. Continuous anticoagulation and cold snare polypectomy versus heparin bridging and hot snare polypectomy in patients on anticoagulants with subcentimeter polyps: A randomized controlled trial. Ann Intern Med. 2019;171(4):229-37.
21. Ara N et al. Prospective analysis of risk for bleeding after endoscopic biopsy without cessation of antithrombotics in Japan. Dig Endosc. 2015;27(4):458-64.
22. Yanagisawa N et al. Postpolypectomy bleeding and thromboembolism risks associated with warfarin vs. direct oral anticoagulants. World J Gastroenterol. 2018;24(14):1540-9.
23. Arimoto J et al. Safety of cold snare polypectomy in patients receiving treatment with antithrombotic agents. Dig Dis Sci. 2019;64(11):3247-55.
24. Heublein V et al. Gastrointestinal endoscopy in patients receiving novel direct oral anticoagulants: Results from the prospective Dresden NOAC registry. J Gastroenterol. 2018;53(2):236-46.
25. Douketis JD et al. Perioperative management of patients with atrial fibrillation receiving a direct oral anticoagulant. JAMA Intern Med. 2019;179(11):1469-78.
26. Dubois V et al. Perioperative management of patients on direct oral anticoagulants. Thromb J. 2017;15:14.
27. Weitz JI et al. Periprocedural management and approach to bleeding in patients taking dabigatran. Circulation. 2012 Nov 13;126(20):2428-32.
28. Chan FKL et al. Risk of postpolypectomy bleeding with uninterrupted clopidogrel therapy in an industry-independent, double-blind, randomized trial. Gastroenterology. 2019;156(4):918-25.
29. Watanabe K et al. Effect of antiplatelet agent number, types, and pre-endoscopic management on postpolypectomy bleeding: Validation of endoscopy guidelines. Surg Endosc. 2021;35(1):317-25.
30. Gurbel PA et al. Randomized double-blind assessment of the ONSET and OFFSET of the antiplatelet effects of ticagrelor versus clopidogrel in patients with stable coronary artery disease: The ONSET/OFFSET study. Circulation. 2009;120(25):2577-85.
31. Eisenberg MJ et al. Safety of short-term discontinuation of antiplatelet therapy in patients with drug-eluting stents. Circulation. 2009;119(12):1634-42.
Antithrombotic therapy is increasingly used to either reduce the risk of or treat thromboembolic episodes in patients with various medical conditions such as ischemic and valvular heart disease, atrial fibrillation (AF), cerebrovascular disease, peripheral arterial disease, venous thromboembolism (VTE) and hypercoagulable diseases. Antithrombotics include medications classified as anticoagulants or antiplatelets. Anticoagulants work by interfering with the native clotting cascade and consist of four main classes: vitamin K antagonists (VKA), heparin derivatives, direct factor Xa inhibitors, and direct thrombin inhibitors. Direct oral anticoagulants (DOACs) refer to dabigatran (a direct thrombin inhibitor) and the factor Xa inhibitors (apixaban, rivaroxaban, and edoxaban).
Antiplatelets, on the other hand, work by decreasing platelet aggregation and thus preventing thrombus formation; they include P2Y12 receptor inhibitors, protease-activated receptor-1 inhibitors, glycoprotein IIb/IIIa receptor inhibitors, acetylsalicylic acid (ASA), and nonsteroidal anti-inflammatory drugs. All of these agents may directly cause or increase the risk of gastrointestinal (GI) bleeding from luminal sources such as ulcers or diverticula, as well as after endoscopic interventions such as polypectomy. However, there is also a risk of thromboembolic consequences if some of these agents are withheld. Thus, the management of patients receiving antithrombotic agents and undergoing GI endoscopy represents an important clinical challenge and something that every GI physician has to deal with routinely.
The goal of this review is to discuss the optimal strategy for managing antithrombotics in patients undergoing elective endoscopy based on current available evidence and published clinical guidelines.1-4 Much of our discussion will review recommendations from the recently published joint American College of Gastroenterology (ACG) and Canadian Association of Gastroenterology (CAG) guidelines on management of anticoagulants and antiplatelets in the periendoscopic period by Abraham et al.4
Factors that guide decision-making
The two most vital factors to consider prior to performing endoscopic procedures in patients receiving antithrombotic therapy are to assess the risk of bleeding associated with the procedure and to assess the risk of thromboembolism associated with the underlying medical condition for which the antithrombotic agents are being used. In addition, it is also important to keep in mind the individual characteristics of the antithrombotic agent(s) used when making these decisions.
Estimating procedure-related bleeding risk
Various endoscopic procedures have different risks of associated bleeding. Although guidelines from GI societies may differ when classifying procedures into low or high risk, it is important to know that most of the original data on postprocedural bleeding risks are from studies conducted in patients who are not on complex antithrombotic regimens and thus may not accurately reflect the bleeding risk of patients using newer antithrombotic therapies.1,4-7
Traditionally, some of the common low-risk procedures have included diagnostic EGD and colonoscopy with or without biopsy, ERCP without sphincterotomy, biliary stent placement, and push or balloon-assisted enteroscopy. On the other hand, endoscopic procedures associated with interventions are known to have higher bleeding risk, and other procedural factors can influence this risk as well.8 For example, polypectomy, one of the most common interventions during endoscopy, is associated with bleeding risk ranging from 0.3% to 10% depending on multiple factors including polyp size, location, morphology (nonpolypoid, sessile, pedunculated), resection technique (cold or hot forceps, cold or hot snare), and type of cautery used.9 For some procedures, such as routine screening colonoscopy, however, the preprocedure estimate of bleeding risk can be uncertain because it is unclear if a high risk intervention (e.g., polypectomy of large polyp) will be necessary. For example, in the most recent ACG/CAG guidelines, colonoscopy with polypectomy < 1cm is considered a low/moderate risk bleeding procedure, whereas polypectomy > 1cm is considered high risk for bleeding.4 In these situations, the management of antithrombotic medications may depend on the individual patient’s risk of thrombosis and the specific antithrombotic agent. In the example of a patient undergoing colonoscopy while on antithrombotic medications, the bleeding risk associated with polypectomy can potentially be reduced by procedural techniques such as preferential use of cold snare polypectomy. Further high-quality data on the optimal procedural technique to reduce postpolypectomy bleeding in patients on antithrombotic medications is needed to help guide management.
Estimating thromboembolic risk
The risk of thromboembolic events in patients who are withholding their antithrombotic therapy for an endoscopic procedure depends on their underlying condition and individual characteristics. In patients who are on antithrombotic therapy for stroke prevention in non-valvular AF, the risk of cerebral thromboembolism in these patients is predictable using the CHA2DS2Vasc index.10 This scoring index includes heart failure, hypertension, age 75 years or older, diabetes mellitus, prior stroke or transient ischemic attack (TIA), vascular disease, age 65-74 years, and sex categories.
Patients with previous VTE on anticoagulation or those who have mechanical heart valves may have different risk factors for thromboembolic episodes. Among patients with VTE, time from initial VTE, history of recurrent VTE with antithrombotic interruption, and presence of underlying thrombophilia are most predictive of future thromboembolic risk. And for patients with mechanical heart valves, presence of a mitral valve prosthesis, and the presence or absence of associated heart failure and AF determine the annual risk of thromboembolic events. Bioprosthetic valves are generally considered low risk.
In patients with coronary artery disease (CAD), high thrombosis risk scenarios with holding antiplatelets include patients within 3 months of an acute coronary syndrome (ACS) event, within 6 months of a drug-eluting stent (DES) placement, or within 1 month of a bare metal coronary stent (BMS) placement. In addition, patients with ACS that occurred within the past 12 months of DES placement or within 2 months of BMS placement are also considered high risk.11,12 Even beyond these periods, certain patients may still be at high risk of stent occlusion. In particular, patients with a prior history of stent occlusion, ACS or ST elevation myocardial infection, prior multivessel percutaneous coronary intervention, diabetes, renal failure, or diffuse CAD are at higher risk of stent occlusion or ACS events with alteration of antithrombotic therapy.13 Thus, modification of antithrombotic regimens in these patients should be cautiously approached.
Management of antithrombotics prior to elective endoscopy
In patients who need elective endoscopic procedures, if the indication for antithrombotic therapy is short-term, the procedure is probably best delayed until after that period.13 For patients on long-term or lifelong antithrombotic treatment, the decision to temporarily hold the treatment for endoscopy should occur after a discussion with the patient and all of the involved providers. In some high-risk patients, these agents cannot be interrupted; therefore, clinicians must carefully weigh the risks and benefits of the procedure before proceeding with endoscopy. For patients who are known to be undergoing an elective endoscopic procedure, antithrombotic medications may or may not need to be held prior to the procedure depending on the type of therapy. For example, according to the recent ACG/CAG guidelines, warfarin should be continued, whereas DOACs should be temporarily stopped for patients who are undergoing elective/planned endoscopic GI procedures.
Unfractionated heparin (UFH) administered as a continuous intravenous infusion can generally be held 3-4 hours before the procedure, given its short half-life. Low molecular weight heparin (LMWH), including enoxaparin and dalteparin, should be stopped 24 hours prior to the procedure.2,14 Fondaparinux is a synthetic X-a inhibitor that requires discontinuation at least 36 hours preceding a high risk procedure. For patients on warfarin who are undergoing elective endoscopic procedures that are low risk for inducing bleeding, warfarin can be continued, as opposed to temporarily interrupted, although the dose should be omitted the morning of the procedure.4 For those who are undergoing high-risk endoscopic procedures (including colonoscopy with possible polypectomy > 1 cm), 5 days of temporary interruption without periprocedural bridging is appropriate in most patients. This is contrary to previous guidelines, which had recommended bridging for patients with a CHA2DS2Vasc score ≥ 2. Recent impactful randomized trials (BRIDGE and PERIOP-2) have called into question the benefit of periprocedural bridging with LMWH. Avoiding bridging anticoagulation was generally found to be similar to bridging in regard to prevention of thromboembolic complications, but importantly was associated with a decreased risk of major bleeding.15,16 Of note, periprocedural bridging may still be appropriate in a small subset of patients, including those with mechanical valves, AF with CHADS2 score > 5, and previous thromboembolism during temporary interruption of VKAs. The decision to bridge or not should ideally be made in a multidisciplinary fashion.15-20
Data are lacking on the ideal strategy for periendoscopic DOAC management. As mentioned above, for patients on DOACs who are undergoing elective endoscopic GI procedures, temporarily interrupting DOACs rather than continuing them is recommended. Currently, there are no randomized controlled trials addressing the management of DOACs in the periendoscopic period. However, based on five cohort studies, the ideal duration of DOAC interruption before endoscopic procedures may be between 1 and 2 days, excluding the day of the procedure.21-25 This strategy allows for a short preprocedural duration of DOAC interruption and likely provides a balance between bleeding and thromboembolism risk. Importantly, there are no reliable laboratory assays to assess the anticoagulant effect of DOACs, and an individual patient’s degree of renal dysfunction may impact how long the DOAC should be held. In general, the anti-Xa drugs should be held for 1-2 days if the creatinine clearance (CrCl) is ≥ 60 mL/min, for 3 days if the CrCl is between 30 mL/min and 59 mL/min, and for 4 days if the CrCl is less than 30 mL/min.26 For edoxaban, the recommendation is to hold at least 24 hours before high-risk procedures. The recommendation for stopping dabigatran is 2-3 days before a high-risk procedure in patients with CrCl more than 80 mL/min, 3-4 days prior if between 30 and 49 mL/min, and 4-6 days prior if less than 30 mL/min respectively.27
In regard to antiplatelet management, ASA and the P2Y12 receptor inhibitors (e.g. clopidogrel, prasugrel, and ticagrelor) are the most commonly utilized antiplatelets in patients undergoing endoscopic procedures. For patients who are on ASA monotherapy, whether 81 mg or 325 mg daily, for secondary cardiovascular prevention, no interruption of ASA therapy is necessary for elective procedures. The benefit of ASA for secondary cardiovascular prevention and the possible reduction in thrombotic events seen in RCTs of nonendoscopic surgical procedures is well known. However, there may be certain exceptions in which aspirin should be temporarily held. For example, short-term interruption of ASA could be considered in high risk procedures such as biliary or pancreatic sphincterotomy, ampullectomy, and peroral endoscopic myotomy. For patients on single antiplatelet therapy with a P2Y12 receptor inhibitor who are undergoing elective endoscopic GI procedures, the recent CAG/ACG guidelines did not provide a clear recommendation for or against temporary interruption of the P2Y12 receptor inhibitor. Although interruption of a P2Y12 receptor inhibitor should theoretically decrease a patient’s risk of bleeding, the available evidence reported a nonsignificant increased bleeding risk in patients who stop a P2Y12 receptor inhibitor for an elective endoscopic procedure compared with those who continue the medication.28,29 Therefore, until further data are available, for patients on P2Y12 receptor monotherapy, a reasonable strategy would be to temporarily hold therapy prior to high risk endoscopic procedures, assuming the patients are not at high cardiovascular risk. Clopidogrel and prasugrel have to be stopped 5-7 days prior to allow normal platelet aggregation to resume as opposed to ticagrelor, a reversible P2Y12 receptor inhibitor that can be stopped 3-5 days prior.30
Lastly, for patients who are on dual antiplatelet therapy (DAPT) for secondary prevention, continuation of ASA and temporary interruption of the P2Y12 receptor inhibitor is recommended while undergoing elective endoscopy. Studies have shown that those who discontinued both had a much higher incidence of stent thrombosis compared with those who remained on aspirin alone.4,28,31
Resumption of antithrombotic therapy after endoscopy
In general, antithrombotic therapy should be resumed upon completion of the procedure unless there remains a persistent risk of major bleeding.1,14 This consensus is based on studies available on warfarin and heparin products, with minimal literature available regarding the resumption of DOACs. The benefits of immediate re-initiation of antithrombotic therapy for the prevention of thromboembolic events should be weighed against the risk of hemorrhage associated with the specific agent, the time to onset of the medication, and procedure-specific circumstances. For the small subset of patients on warfarin with a high risk of thromboembolism (e.g., mechanical heart valve), bridging with LMWH should be started at the earliest possible time when there is no risk of major bleeding and continued until the international normalized ratio (INR) reaches a therapeutic level with warfarin. For patients at a lower risk of thromboembolism, warfarin should be restarted within 24 hours of the procedure. In addition, because of the shorter duration of DOACs, if treatment with these agents cannot resume within 24 hours of a high-risk procedure, bridge therapy should be considered with UFH or LMWH in patients with a high risk of thrombosis.18 In patients receiving DOACs for stroke prophylaxis in AF, the DOACS can be safely resumed 1 day after low-risk procedures and 2-3 days after high-risk procedures without the need for bridging.25 All antiplatelet agents should be resumed as soon as hemostasis is achieved.
Conclusion
Antithrombotic therapy is increasingly used given the aging population, the widespread burden of cardiovascular comorbidities, and the expanding indications for medication classes such as direct oral anticoagulants. Given the association between antithrombotic medications and gastrointestinal bleeding, it is essential for gastroenterologists to understand whether, when, and for how long these medications should be held before endoscopic procedures. Even with the practice guidelines available today to help clinicians navigate common scenarios, each patient’s antithrombotic management may differ, and communicating with the prescribing physicians and including patients in the decision-making process are essential before planned procedures.
Dr. Wang is a gastroenterology fellow at the University of Chicago. Dr. Sengupta is an associate professor at the University of Chicago. They reported no funding or conflicts of interest.
References
1. ASGE Standards of Practice Committee, Acosta RD et al. The management of antithrombotic agents for patients undergoing GI endoscopy. Gastrointest Endosc. 2016;83(1):3-16.
2. Veitch AM et al. Endoscopy in patients on antiplatelet or anticoagulant therapy, including direct oral anticoagulants: British Society of Gastroenterology (BSG) and European Society of Gastrointestinal Endoscopy (ESGE) guidelines. Endoscopy. 2016;48(4):c1. doi: 10.1055/s-0042-122686.
3. Chan FKL et al. Management of patients on antithrombotic agents undergoing emergency and elective endoscopy: Joint Asian Pacific Association of Gastroenterology (APAGE) and Asian Pacific Society for Digestive Endoscopy (APSDE) practice guidelines. Gut. 2018;67(3):405-17.
4. Abraham NS et al. American College of Gastroenterology – Canadian Association of Gastroenterology clinical practice guideline: Management of anticoagulants and antiplatelets during acute gastrointestinal bleeding and the periendoscopic period. Am J Gastroenterol. 2022;117(4):542-58.
5. Boustière C et al. Endoscopy and antiplatelet agents. European Society of Gastrointestinal Endoscopy (ESGE) guideline. Endoscopy. 2011;43(5):445-61.
6. Fujimoto K et al. Guidelines for gastroenterological endoscopy in patients undergoing antithrombotic treatment. Dig Endosc. 2014;26(1):1-14.
7. Wilke T et al. Patient preferences for oral anticoagulation therapy in atrial fibrillation: A systematic literature review. Patient. 2017;10(1):17-37.
8. Gerson LB et al. Adverse events associated with anticoagulation therapy in the periendoscopic period. Gastrointest Endosc. 2010 Jun;71(7):1211-17.e2.
9. Horiuchi A et al. Removal of small colorectal polyps in anticoagulated patients: A prospective randomized comparison of cold snare and conventional polypectomy. Gastrointest Endosc. 2014;79(3):417-23.
10. Lip GYH et al. Refining clinical risk stratification for predicting stroke and thromboembolism in atrial fibrillation using a novel risk factor-based approach: The euro heart survey on atrial fibrillation. Chest. 2010;137(2):263-72.
11. 2012 Writing Committee Members, Jneid H et al. 2012 ACCF/AHA focused update of the guideline for the management of patients with unstable angina/non-ST-elevation myocardial infarction (Updating the 2007 guideline and replacing the 2011 focused update): A report of the American College of Cardiology Foundation/American Heart Association Task Force on practice guidelines. Circulation. 2012;126(7):875-910.
12. Douketis JD et al. Perioperative management of antithrombotic therapy: Antithrombotic therapy and prevention of thrombosis, 9th ed: American College of Chest Physicians evidence-based clinical practice guidelines. Chest. 2012 Feb;141(2 Suppl):e326S-e350S.
13. Becker RC et al. Management of platelet-directed pharmacotherapy in patients with atherosclerotic coronary artery disease undergoing elective endoscopic gastrointestinal procedures. J Am Coll Cardiol. 2009;54(24):2261-76.
14. Kwok A and Faigel DO. Management of anticoagulation before and after gastrointestinal endoscopy. Am J Gastroenterol. 2009;104(12):3085-97; quiz 3098.
15. Douketis JD et al. Perioperative bridging anticoagulation in patients with atrial fibrillation. N Engl J Med. 2015;373(9):823-33.
16. Kovacs MJ et al. Postoperative low molecular weight heparin bridging treatment for patients at high risk of arterial thromboembolism (PERIOP2): Double blind randomised controlled trial. BMJ. 2021;373:n1205.
17. Tafur A and Douketis J. Perioperative management of anticoagulant and antiplatelet therapy. Heart. 2018;104(17):1461-7.
18. Kato M et al. Guidelines for gastroenterological endoscopy in patients undergoing antithrombotic treatment: 2017 appendix on anticoagulants including direct oral anticoagulants. Dig Endosc. 2018;30(4):433-40.
19. Inoue T et al. Clinical features of postpolypectomy bleeding associated with heparin bridge therapy. Dig Endosc. 2014;26(2):243-9.
20. Takeuchi Y et al. Continuous anticoagulation and cold snare polypectomy versus heparin bridging and hot snare polypectomy in patients on anticoagulants with subcentimeter polyps: A randomized controlled trial. Ann Intern Med. 2019;171(4):229-37.
21. Ara N et al. Prospective analysis of risk for bleeding after endoscopic biopsy without cessation of antithrombotics in Japan. Dig Endosc. 2015;27(4):458-64.
22. Yanagisawa N et al. Postpolypectomy bleeding and thromboembolism risks associated with warfarin vs. direct oral anticoagulants. World J Gastroenterol. 2018;24(14):1540-9.
23. Arimoto J et al. Safety of cold snare polypectomy in patients receiving treatment with antithrombotic agents. Dig Dis Sci. 2019;64(11):3247-55.
24. Heublein V et al. Gastrointestinal endoscopy in patients receiving novel direct oral anticoagulants: Results from the prospective Dresden NOAC registry. J Gastroenterol. 2018;53(2):236-46.
25. Douketis JD et al. Perioperative management of patients with atrial fibrillation receiving a direct oral anticoagulant. JAMA Intern Med. 2019;179(11):1469-78.
26. Dubois V et al. Perioperative management of patients on direct oral anticoagulants. Thromb J. 2017;15:14.
27. Weitz JI et al. Periprocedural management and approach to bleeding in patients taking dabigatran. Circulation. 2012 Nov 13;126(20):2428-32.
28. Chan FKL et al. Risk of postpolypectomy bleeding with uninterrupted clopidogrel therapy in an industry-independent, double-blind, randomized trial. Gastroenterology. 2019;156(4):918-25.
29. Watanabe K et al. Effect of antiplatelet agent number, types, and pre-endoscopic management on postpolypectomy bleeding: Validation of endoscopy guidelines. Surg Endosc. 2021;35(1):317-25.
30. Gurbel PA et al. Randomized double-blind assessment of the ONSET and OFFSET of the antiplatelet effects of ticagrelor versus clopidogrel in patients with stable coronary artery disease: The ONSET/OFFSET study. Circulation. 2009;120(25):2577-85.
31. Eisenberg MJ et al. Safety of short-term discontinuation of antiplatelet therapy in patients with drug-eluting stents. Circulation. 2009;119(12):1634-42.
Innovation in GI: What’s the next big thing?
Dear colleagues,
Innovation is the lifeblood of our field, driving major advances in endoscopy and attracting many of us to gastroenterology. From the development of endoscopic retrograde cholangiopancreatography to the widespread adoption of third-space endoscopy, we continue to push the boundaries of our practice. But what is the next big disruption in GI, and how will it affect us? Dr. Jeremy Glissen Brown discusses the application of artificial intelligence in GI, highlighting its promise but also raising important questions. Dr. Raman Muthusamy elaborates on single-use endoscopes – are they the wave of the future in preventing infection and meeting patient preference? Or will their long-term cost and environmental impact limit their use? I welcome your own thoughts on disruptive innovation in gastroenterology – share with us on Twitter @AGA_GIHN and by email at [email protected].
Gyanprakash A. Ketwaroo, MD, MSc, is an associate professor of medicine, Yale University, New Haven, Conn., and chief of endoscopy at West Haven (Conn.) VA Medical Center. He is an associate editor for GI & Hepatology News.
The AI revolution, with some important caveats
BY JEREMY R. GLISSEN BROWN, MD, MSC
In 2018, Japan’s Pharmaceutical and Medical Device Agency approved the first artificial intelligence (AI)–based tool, a computer-aided diagnosis system (CADx) for use in clinical practice.1 Since that time, we have seen regulatory approval for a variety of deep learning and AI-based tools in endoscopy and beyond. In addition, there has been an enormous amount of commercial and research interest in AI-based tools in clinical medicine and gastroenterology, and it is almost impossible to open a major gastroenterology journal or go to an academic conference without encountering a slew of AI-based projects.
Many thought and industry leaders say that we are in the midst of an AI revolution in gastroenterology. Indeed, we are in a period of unprecedented growth for deep learning and AI for several reasons, including a recent shift toward data-driven approaches, advances in machine-learning techniques, and increased computing power. There is, however, also an unprecedented amount of scrutiny and thoughtful conversation about the role AI might play in clinical practice and how we use and regulate these tools in the clinical setting. We are thus in a unique position to ask ourselves the essential question: “Are we on the cusp of an AI revolution in gastroenterology, or are we seeing the release of medical software that is perhaps at best useful in a niche environment and at worst a hype-driven novelty without much clinical benefit?” We will use the most popular use-case, computer-aided detection (CADe) of polyps in the colon, to explore this question. In the end, I believe that deep-learning technology will fundamentally change the way we practice gastroenterology. However, this is the perfect time to explore what this means now, and what we can do to shape what it will mean for the future.
CADe: Promise and questions
CADe is a computer vision task that involves localization, such as finding a polyp during colonoscopy and highlighting it with a hollow box. CADe in colonoscopy is perhaps the most well-studied application of deep learning in GI endoscopy to date and is furthest along in the development-implementation pipeline. Because of this, it is an ideal use-case for examining both the evidence that currently supports its use as well as the questions that have come up as we are starting to see CADe algorithms deployed in clinical practice. It is honestly astounding to think that, just 5 years ago, we were talking about CADe as a research concept. While early efforts applying traditional machine learning date back at least to the 1990s, we started to see prospective studies of CADe systems with undetectable or nearly undetectable latency in 2019.2 Since that time we have seen the publication of at least 10 randomized clinical trials involving CADe.
CADe clearly has an impact on some of the conventional quality metrics we use for colonoscopy. While there is considerable heterogeneity in region and design among these trials, most show a significant increase in adenoma detection rate (ADR) and adenomas per colonoscopy. Tandem studies show decreases in adenoma miss rate, and at least one study showed a decrease in sessile serrated lesion miss rate as well. In one of the first randomized, controlled trials across multiple endoscopy centers in Italy, Repici and colleagues showed an increase in ADR from 40.4% in the control group to 54.8% in the CADe group (RR, 1.30; 95% confidence interval, 1.14-1.45).3 Because of pioneering trials such as this one, there are currently several CADe systems that have received regulatory approval in Europe, Asia, and the United States and are being deployed commercially.
It is also clear that the technology is there. In clinical practice, the Food and Drug Administration–approved systems work smoothly, with little to no detectable latency and generally low false-positive and false-negative rates. With clinical deployment, however, we have seen the emergence of healthy debate surrounding every aspect of this task-specific AI. On the development side, important questions include transparency of development data, ensuring that algorithm development is ethical and equitable (as deep learning is susceptible to exacerbating human biases) and methods of data labeling. On the deployment level, important concerns include proper regulation of locked versus “open” algorithms and downstream effects on cost.
In addition, with CADe we have seen a variety of clinical questions crop up because of the novelty of the technology. These include the concern that the increase in ADR we have seen thus far is driven in large part by diminutive and small adenomas (with healthy debate in turn as to these entities’ influence on interval colorectal cancer rates), the effect CADe might have on fellowship training to detect polyps with the human eye, and whether the technology affects sessile serrated lesion detection rates or not. The great thing about such questions is that they have inspired novel research related to CADe in the clinical setting, including how CADe affects trainee ADR, how CADe affects gaze patterns, and how CADe affects recommended surveillance intervals.
CADx, novel applications, and the future
Though there is not space to expand in this particular forum, it is safe to say that with the advancement of CADx in endoscopy and colonoscopy, we have seen similar and novel questions come up. The beautiful thing about all of this is that we are just scratching the surface of what is achievable with deep learning. We have started to see novel projects utilizing deep-learning algorithms, from detecting cirrhosis on ECG to automatically classifying stool consistency on the Bristol Stool Scale from pictures of stool. I ultimately do think that the deployment of AI tools will fundamentally change the way we practice and think about gastroenterology. We are at an incredibly exciting time where we as physicians have the power to shape what that looks like, how we think about AI deployment and regulation and where we go from here.
Dr. Glissen Brown is with the division of gastroenterology and hepatology at Duke University Medical Center, Durham, N.C. He has served as a consultant for Medtronic.
References
1. Aisu N et al. PLOS Digital Health. 2021 Jan 18. doi: 10.1371/journal.pdig.0000001.
2. Wang P et al. Gut. 2019 Oct;68(10):1813-9.
3. Repici A et al. Gastroenterology. 2020 Aug;159(2):512-20.e7.
What’s the future of single-use endoscopes?
BY V. RAMAN MUTHUSAMY, MD, MAS
Single-use endoscopes have been proposed as a definitive solution to the risk of endoscope-transmitted infections. While these infections have been reported for several decades, they have traditionally been associated with identified breaches in the reprocessing protocol. In 2015, numerous cases of duodenoscope-transmitted infections were reported after endoscopic retrograde cholangiopancreatography (ERCP) procedures. Many, if not most, of these cases were not associated with identified deviations from standard high-level disinfection protocols and occurred at high-volume, experienced facilities. A subsequent FDA postmarket surveillance study found contamination with potentially pathogenic bacteria in approximately 5% of duodenoscopes. Amid growing concerns about the ability to adequately clean these complex devices, these events prompted the development of single-use duodenoscopes. Given the multifactorial causes of duodenoscope contamination, the chief advantage of single-use devices is that they eliminate the potential for infection transmission, because the devices are never reused. Beyond this primary benefit, the ability to create single-use devices could lead to more readily available specialty scopes and allow variations in endoscope design that could improve ergonomics. Single-use devices may also expand access to endoscopic services by eliminating the need for reprocessing equipment at low-volume sites. However, several concerns have been raised about their use, especially if it were to become widespread, including device quality and performance (potentially leading to more failed cases or adverse events), cost, environmental impact, and uncertainty regarding their indications for use. Furthermore, new alternatives such as reusable devices with partially disposable components, or future low-temperature sterilization options, may minimize the need for single-use devices. These issues are discussed in more detail below.
Given that nearly all cases of GI device–transmitted infections in which standard reprocessing protocols were followed have occurred with duodenoscopes, I will focus on single-use duodenoscopes in this article. It is important that we reassure our patients and colleagues that standard reprocessing appears to be extremely effective for all other types of devices, including elevator-containing linear echoendoscopes. Studies investigating why duodenoscopes have been the devices primarily associated with transmitted outbreaks have focused on the complexity of the elevator, including its recesses, fixed end cap, and wire channels. However, culturing has shown that up to one-third of contamination may occur in the instrument channels or in the region of the biopsy cap, leaving potential residual sites of infection even when newly developed reusable devices with disposable elevators or end caps are used.1 Another challenge with reprocessing is the difficulty of proving that residual contamination does not exist. While culturing the devices after reprocessing is the most commonly used approach, it should be noted that many sites with outbreaks failed to culture the culprit bacteria from the devices, because accessing the sites of contamination can be challenging. Other markers of residual contamination, such as ATP and tests for residual blood or protein, have yielded variable results. Specifically, ATP testing has not correlated well with culture results but may be helpful in assessing the quality of manual cleaning.2
These challenges have made the concept of single-use devices more appealing, given that there is no need to reprocess devices or to validate cleaning efficacy. Currently, there are two FDA-approved devices on the market, but the published literature to date has largely involved one of them. In the four published studies that have assessed the clinical performance of single-use duodenoscopes in over 400 patients, procedural success rates have ranged from 91% to 97%, with adverse event rates and endoscopist satisfaction scores comparable to those of reusable devices. Most of these users were expert biliary endoscopists, and more data are needed regarding the devices’ performance in lower-volume and nonexpert hands. While the indications for use in these studies have varied, I feel there are four potential scenarios in which to use these devices: in patients with known multidrug-resistant organisms undergoing ERCP; to facilitate logistics and operations when a reusable device is not available; in critically ill patients who would not tolerate a scope-acquired infection; and in procedures associated with a risk of bacteremia.
While preliminary data suggest that single-use duodenoscopes are safe and effective in expert hands, concerns exist regarding their broader implementation into clinical practice. First, the devices cost between $1,500 and $3,000, making them impractical for many health systems. One study estimated the break-even cost of the device to be $800-$1,300, depending on site volume and device contamination rates.3 However, it should be noted that current enhanced reprocessing protocols for reusable devices may add $75,000-$400,000 per year depending on center volume.4 In the United States, federal and some commercial payors currently cover part or all of the device cost, but whether this will continue long term is unclear. In addition, there is significant concern regarding the environmental impact of a broader move to single-use devices. Recycling programs do exist for these devices, but detailed analyses of the environmental effects and waste generated by single-use versus reusable strategies are needed.
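The break-even logic above can be illustrated with back-of-the-envelope arithmetic. In this sketch, the dollar ranges are those quoted above, while the annual ERCP volume and the specific $150,000 reprocessing figure are assumptions chosen purely for illustration.

```python
# Back-of-the-envelope arithmetic only. The cost ranges are those quoted in the
# text; the annual volume and the specific reprocessing figure are assumptions.

def reprocessing_cost_per_ercp(annual_enhanced_reprocessing_cost, annual_ercp_volume):
    """Per-procedure cost attributable to enhanced reprocessing of reusable scopes."""
    return annual_enhanced_reprocessing_cost / annual_ercp_volume

per_case = reprocessing_cost_per_ercp(150_000, 500)  # assumed: $150k/year, 500 ERCPs/year
single_use_list_price = 2_000  # within the $1,500-$3,000 range cited above

print(f"Enhanced reprocessing cost per ERCP: ${per_case:,.0f}")
print(f"Incremental cost of a single-use scope per case: ${single_use_list_price - per_case:,.0f}")
```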
Finally, although single-use devices were created primarily to avoid device-related infection transmission, they can offer other benefits as well. Ergonomic enhancements (variable handle sizes or shaft stiffness, right- and left-handed scopes) and specialty devices (extra-long or thin devices, devices with special optical or rotational capabilities) may become more feasible with a single-use platform. Moreover, the pace of endoscopic innovation and refinement is likely to quicken with a single-use platform, and new advancements can be incorporated in a more timely manner.
Conclusion
In summary, I believe single-use devices offer the potential to improve the safety of endoscopic procedures, expand procedural access, enhance ergonomics, and foster and expedite device innovation. However, reductions in cost, refinement of their indications, and development of recycling programs to minimize their environmental impact will be essential before more widespread adoption is achieved.
Dr. Muthusamy is a professor of clinical medicine at the University of California, Los Angeles, and the medical director of endoscopy at the UCLA Health System. He reported relationships with Medtronic, Boston Scientific, Motus GI, Endogastric Solutions, and Capsovision.
References
1. Bartles RL et al. Gastrointest Endosc. 2018 Aug;88(2):306-13.e2.
2. Day LW et al. Gastrointest Endosc. 2021 Jan;93(1):11-33.e6.
3. Bang JY et al. Gut. 2019 Nov;68(11):1915-7.
4. Bomman S et al. Endosc Int Open. 2021 Aug 23;9(9):E1404-12.
Dear colleagues,
Innovation is the lifeblood of our field, driving major advances in endoscopy and attracting many of us to Gastroenterology. From the development of endoscopic retrograde cholangiopancreatography to the widespread adoption of third space endoscopy, we continue to push the boundaries of our practice. But what is the next big disruption in GI, and how will it impact us? Dr. Jeremy Glissen Brown discusses the application of artificial intelligence in GI, highlighting its promise but also raising important questions. Dr. Raman Muthusamy elaborates on single-use endoscopes – are they the wave of the future in preventing infection and meeting patient preference? Or will their long-term cost and environmental impact limit their use? I welcome your own thoughts on disruptive innovation in Gastroenterology – share with us on Twitter @AGA_GIHN and by email at [email protected].
Gyanprakash A. Ketwaroo, MD, MSc, is an associate professor of medicine, Yale University, New Haven, Conn., and chief of endoscopy at West Haven (Conn.) VA Medical Center. He is an associate editor for GI & Hepatology News.
The AI revolution, with some important caveats
BY JEREMY R. GLISSEN BROWN, MD, MSC
In 2018, Japan’s Pharmaceutical and Medical Device Agency approved the first artificial intelligence (AI)–based tool, a computer-aided diagnosis system (CADx) for use in clinical practice.1 Since that time, we have seen regulatory approval for a variety of deep learning and AI-based tools in endoscopy and beyond. In addition, there has been an enormous amount of commercial and research interest in AI-based tools in clinical medicine and gastroenterology, and it is almost impossible to open a major gastroenterology journal or go to an academic conference without encountering a slew of AI-based projects.
Many thought and industry leaders say that we are in the midst of an AI revolution in gastroenterology. Indeed, we are at a period of unprecedented growth for deep learning and AI for several reasons, including a recent shift toward data-driven approaches, advancement of machine-learning techniques, and increased computing power. There is, however, also an unprecedented amount of scrutiny and thoughtful conversation about the role AI might play in clinical practice and how we use and regulate these tools in the clinical setting. We are thus in a unique position to ask ourselves the essential question: “Are we on the cusp of an AI revolution in gastroenterology, or are we seeing the release of medical software that is perhaps at best useful in a niche environment and at worst a hype-driven novelty without much clinical benefit?” We will use the most popular use-case, computer-aided detection (CADe) of polyps in the colon, to explore this question. In the end, I believe that deep-learning technology will fundamentally change the way we practice gastroenterology. However, this is the perfect time to explore what this means now, and what we can do to shape what it will mean for the future.
CADe: Promise and questions
CADe is a computer vision task that involves localization, such as finding a polyp during colonoscopy and highlighting it with a hollow box. CADe in colonoscopy is perhaps the most well-studied application of deep learning in GI endoscopy to date and is furthest along in the development-implementation pipeline. Because of this, it is an ideal use-case for examining both the evidence that currently supports its use as well as the questions that have come up as we are starting to see CADe algorithms deployed in clinical practice. It is honestly astounding to think that, just 5 years ago, we were talking about CADe as a research concept. While early efforts applying traditional machine learning date back at least to the 1990s, we started to see prospective studies of CADe systems with undetectable or nearly undetectable latency in 2019.2 Since that time we have seen the publication of at least 10 randomized clinical trials involving CADe.
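To make the localization task concrete, the following is a minimal, hypothetical sketch of the frame-by-frame loop a CADe overlay performs: read a video frame, run a polyp detector, and draw a hollow box around each finding. The detector stub, file name, and timing are illustrative assumptions on my part, not any approved system's implementation.

```python
# Minimal sketch of a CADe overlay loop (illustrative only).
import cv2  # OpenCV for video I/O and drawing


def detect_polyps(frame):
    """Hypothetical detector: return a list of (x, y, w, h) boxes.

    A real CADe system would run a trained deep-learning object detector
    here, fast enough per frame that latency is imperceptible to the
    endoscopist. This stub simply reports no findings.
    """
    return []


def annotate_stream(video_path: str) -> None:
    cap = cv2.VideoCapture(video_path)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        for (x, y, w, h) in detect_polyps(frame):
            # Hollow (unfilled) rectangle around each suspected polyp.
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.imshow("CADe overlay", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()


if __name__ == "__main__":
    annotate_stream("colonoscopy_clip.mp4")  # hypothetical file name
```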
CADe clearly has an impact on some of the conventional quality metrics we use for colonoscopy. While there is considerable heterogeneity in region and design among these trials, most show a significant increase in adenoma detection rate (ADR) and adenomas per colonoscopy. Tandem studies show decreases in adenoma miss rate, and at least one study showed a decrease in sessile serrated lesion miss rate as well. In one of the first randomized, controlled trials across multiple endoscopy centers in Italy, Repici and colleagues showed an increase in ADR from 40.4% in the control group to 54.8% in the CADe group (RR, 1.30; 95% confidence interval, 1.14-1.45).3 Because of pioneering trials such as this one, there are currently several CADe systems that have received regulatory approval in Europe, Asia, and the United States and are being deployed commercially.
It is also clear that the technology is there. In clinical practice, the Food and Drug Administration–approved systems work smoothly, with little to no detectable latency and generally low false-positive and false-negative rates. With clinical deployment, however, we have seen the emergence of healthy debate surrounding every aspect of this task-specific AI. On the development side, important questions include transparency of development data, ensuring that algorithm development is ethical and equitable (as deep learning is susceptible to exacerbating human biases), and methods of data labeling. On the deployment side, important concerns include proper regulation of locked versus “open” algorithms and downstream effects on cost.
In addition, with CADe we have seen a variety of clinical questions crop up because of the novelty of the technology. These include the concern that the increase in ADR we have seen thus far is driven in large part by diminutive and small adenomas (with healthy debate in turn as to these entities’ influence on interval colorectal cancer rates), the effect CADe might have on fellowship training to detect polyps with the human eye, and whether the technology affects sessile serrated lesion detection rates or not. The great thing about such questions is that they have inspired novel research related to CADe in the clinical setting, including how CADe affects trainee ADR, how CADe affects gaze patterns, and how CADe affects recommended surveillance intervals.
CADx, novel applications, and the future
Though there is not space to expand in this particular forum, it is safe to say that with the advancement of CADx in endoscopy and colonoscopy, we have seen similar and novel questions come up. The beautiful thing about all of this is that we are just scratching the surface of what is achievable with deep learning. We have started to see novel projects utilizing deep-learning algorithms, from detecting cirrhosis on ECG to automatically classifying stool consistency on the Bristol Stool Scale from pictures of stool. I ultimately do think that the deployment of AI tools will fundamentally change the way we practice and think about gastroenterology. We are at an incredibly exciting time where we as physicians have the power to shape what that looks like, how we think about AI deployment and regulation and where we go from here.
Dr. Glissen Brown is with the division of gastroenterology and hepatology at Duke University Medical Center, Durham, N.C. He has served as a consultant for Medtronic.
References
1. Aisu N et al. PLOS Digital Health. 2021 Jan 18. doi: 10.1371/journal.pdig.0000001.
2. Wang P et al. Gut. 2019 Oct;68(10):1813-9.
3. Repici A et al. Gastroenterology. 2020 Aug;159(2):512-20.e7.
What’s the future of single-use endoscopes?
BY V. RAMAN MUTHUSAMY, MD, MAS
Single-use endoscopes have been proposed as a definitive solution to the risk of endoscope-transmitted infections. While these infections have been reported for several decades, they have traditionally been associated with identified breaches in the reprocessing protocol. In 2015, numerous cases of duodenoscope-transmitted infections were reported after endoscopic retrograde cholangiopancreatography (ERCP) procedures. Many, if not most, of these cases were not associated with identified deviations from standard high-level disinfection protocols and occurred at high-volume, experienced facilities. A subsequent FDA postmarket surveillance study found contamination with potentially pathogenic bacteria in approximately 5% of duodenoscopes. Amid growing concerns about the ability to adequately clean these complex devices, these events prompted the development of single-use duodenoscopes. Given the multifactorial causes of duodenoscope contamination, the chief advantage of single-use devices is that they eliminate the potential for infection transmission because they are never reused. In addition to this primary benefit, the ability to create single-use devices could lead to more easily available specialty scopes and allow variations in endoscope design that could improve ergonomics. Single-use devices may also expand the ability to provide endoscopic services by eliminating the need for device reprocessing equipment at low-volume sites. However, several concerns have been raised regarding their use, especially if it were to become widespread. These include issues of device quality and performance (potentially leading to more failed cases or adverse events), cost, environmental impact, and current uncertainty regarding their indications for use. Furthermore, new alternatives such as reusable devices with partially disposable components or future low-temperature sterilization options may minimize the need for such devices. We will briefly discuss these issues in more detail below.
Given that nearly all cases of GI device–transmitted infections where standard reprocessing protocols were followed have occurred in duodenoscopes, I will focus on single-use duodenoscopes in this article. It is important that we reassure our patients and colleagues that standard reprocessing appears to be extremely effective with all other types of devices, including elevator-containing linear echoendoscopes. Studies investigating why duodenoscopes have primarily been associated with device-transmitted outbreaks have focused on the complexity of the elevator, including its recesses, fixed end-cap, and wire channels. However, culturing has shown that up to one-third of contamination may occur in the instrument channels or in the region of the biopsy cap, leaving some potential residual sites of infection even when newly developed reusable devices with disposable elevators/end-caps are utilized.1 Another challenge with reprocessing is proving that residual contamination does not exist. While culturing devices after reprocessing is the most commonly used approach, it should be noted that many sites with outbreaks failed to culture the culprit bacteria from the devices, as accessing the sites of contamination can be challenging. The use of other markers of residual contamination, such as ATP and tests for residual blood/protein, has yielded variable results. Specifically, ATP testing has not correlated well with culture results but may be helpful in assessing the quality of manual cleaning.2
These challenges have made the concept of single-use devices more appealing, given that there is no need to reprocess devices or validate cleaning efficacy. Currently, there are two FDA-approved devices on the market, but the published literature to date has largely involved one of these devices. To date, four published studies have assessed the clinical performance of single-use duodenoscopes in over 400 patients; procedural success rates have ranged from 91% to 97%, with adverse event rates and endoscopist satisfaction scores comparable to those of reusable devices. Most of these users were expert biliary endoscopists, and more data are needed regarding device performance among lower-volume and nonexpert users. While indications for use in these studies have varied, I feel that there are four potential scenarios for utilizing these devices: in patients with known multidrug-resistant organisms undergoing ERCP; to facilitate logistics/operations when a reusable device is not available; in critically ill patients who would not tolerate a scope-acquired infection; and in procedures associated with a risk of bacteremia.
While preliminary data suggest single-use duodenoscopes are safe and effective in expert hands, concerns exist regarding their broader implementation into clinical practice. First, the devices cost $1,500-$3,000, making them impractical for many health systems. One study estimated the break-even cost of the device to be $800-$1,300, depending on site volume and device contamination rates.3 However, it should be noted that current enhanced reprocessing protocols for reusable devices may add $75,000-$400,000 per year, depending on center volume.4 In the United States, federal and some commercial payors currently provide payments that cover part or all of the device cost, but whether this will continue long term is unclear. In addition, there is significant concern regarding the environmental impact of a broader move to single-use devices. Reprocessing programs do exist for these devices, but detailed analyses regarding the environmental effects of a strategy using single-use versus reusable devices, and the waste generated by each, are needed.
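As a rough illustration of how such break-even figures are constructed, here is a back-of-the-envelope sketch with entirely hypothetical inputs (annual reprocessing cost, ERCP volume, and infection costs are my assumptions, not data from the cited studies): the single-use price at which costs equalize is roughly the avoided per-procedure reprocessing cost plus the expected per-procedure cost of a device-transmitted infection.

```python
# Back-of-the-envelope break-even sketch; all numbers below are hypothetical.

def breakeven_single_use_price(annual_reprocessing_cost: float,
                               annual_ercp_volume: int,
                               infection_cost: float,
                               infection_rate: float) -> float:
    """Single-use device price at which per-procedure costs roughly equalize.

    Assumes the avoided costs per procedure are (a) enhanced reprocessing
    spread over annual volume and (b) the expected cost of a device-
    transmitted infection; reusable-device capital and repair costs are
    ignored for simplicity.
    """
    reprocessing_per_case = annual_reprocessing_cost / annual_ercp_volume
    expected_infection_cost = infection_rate * infection_cost
    return reprocessing_per_case + expected_infection_cost


if __name__ == "__main__":
    # Hypothetical mid-volume center: $200,000/yr enhanced reprocessing,
    # 400 ERCPs/yr, $50,000 cost per infection, 1% effective transmission risk.
    price = breakeven_single_use_price(200_000, 400, 50_000, 0.01)
    print(f"Break-even single-use price: ${price:,.0f} per procedure")
    # Prints: Break-even single-use price: $1,000 per procedure
```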
Finally, while primarily created to avoid device-related infection transmission, other benefits can be realized with single-use devices. The potential for ergonomic enhancements (variable handle sizes or shaft stiffness, right- and left-handed scopes) as well as the creation of specialty devices (extra-long or thin devices, devices with special optical or rotational capabilities) may become more feasible with a single-use platform. Moreover, the pace of endoscopic innovation and refinement is likely to quicken with a single-use platform, and new advancements can be incorporated in a more timely manner.
Conclusion
In summary, I believe single-use devices offer the potential to improve the safety of endoscopic procedures as well as improve procedural access, enhance ergonomics, and foster and expedite device innovation. However, reductions in cost, refining their indications, and developing recycling programs to minimize their environmental impact will be essential before more widespread adoption is achieved.
Dr. Muthusamy is a professor of clinical medicine at the University of California, Los Angeles, and the medical director of endoscopy at the UCLA Health System. He reported relationships with Medtronic, Boston Scientific, Motus GI, Endogastric Solutions, and Capsovision.
References
1. Bartles RL et al. Gastrointest Endosc. 2018 Aug;88(2):306-13.e2.
2. Day LW et al. Gastrointest Endosc. 2021 Jan;93(1):11-33.e6.
3. Bang JY et al. Gut. 2019 Nov;68(11):1915-7.
4. Bomman S et al. Endosc Int Open. 2021 Aug 23;9(9):E1404-12.
Then and now: Gut microbiome
The Human Microbiome Project (HMP), which was supported by “only” approximately $20 million of funding in its first year, served as a catalyst for the development of computational tools, clinical protocols, and reference datasets for an emerging field that now approaches nearly $2 billion per year in market value of diagnostics and therapeutics.
Over the past 15 years, many important discoveries about the microbiome have been made, particularly in the fields of gastroenterology, hepatology, and nutrition. The transplantation of the gut microbiome from one person to another has been shown to be more than 90% effective in the treatment of recurrent C. difficile infection, disrupting our current therapeutic algorithms of repeated courses of antibiotics. Other exciting discoveries have included the relationship between the gut microbiome and the enteric nervous system, and its roles in the regulation of metabolism and obesity and in the progression of liver fibrosis and cancer.
Looking ahead, several exciting areas related to digestive health and the microbiome are being prioritized, including the role of probiotics in nutrition, the complex relationship of the bidirectional “gut-brain” axis, and further development of analytics to define and deliver precision medicine across a wide range of digestive disorders. Without a doubt, emerging microbiome discoveries will be prominently featured in the pages of GI & Hepatology News over the coming years to keep our readers informed of these cutting-edge findings.
Dr. Rosenberg is medical director of the North Shore Endoscopy Center and director of clinical research at GI Alliance of Illinois in Gurnee, Ill. Dr. Rosenberg is a consultant for Aimmune Therapeutics and performs clinical research with Ferring Pharmaceuticals.
Treatment of HER2-Low Breast Cancer
Can you talk about the evolution and treatment of human epidermal growth factor receptor 2 (HER2)-low breast cancer?
Dr. Abdou: Until recently, HER2 status had been defined as a positive or negative result, but this convention has evolved, and now a newly defined population with low levels of HER2 expression has been identified. This HER2-low population accounts for about 55% of all breast cancers. Previously, low HER2 expression levels were considered HER2-negative in clinical practice because HER2-targeted therapies had been considered ineffective in this setting. Patients with HER2-low disease therefore had limited targeted treatment options after progression on their primary therapy.
Now, new studies and clinical trials have opened the door to effective treatments for this cohort of patients. The clinical trial DESTINY-Breast04, which was presented at ASCO 2022, led to the first FDA approval in August 2022 of a targeted therapy option for patients with HER2-low breast cancer subtypes, reclassifying this cohort as a new targetable subset in breast cancer.
DESTINY-Breast04 was the first randomized clinical trial to show that targeting HER2 provides clinically meaningful benefits for patients with HER2-low metastatic breast cancer, not only patients with HER2-positive disease. The phase 3 study enrolled 557 patients with hormone receptor (HR)-negative or -positive breast cancer and centrally confirmed HER2-low expression who had received 1 or 2 prior lines of chemotherapy. Patients were randomized to receive either the antibody–drug conjugate trastuzumab deruxtecan or physician’s choice of standard chemotherapy. The risk of disease progression was about 50% lower and the risk of death was about 36% lower with trastuzumab deruxtecan compared with chemotherapy.1
These impressive and practice-changing results opened the door to a new treatment option for a substantial group of patients with HER2-low disease and significantly expanded the population of patients who can benefit from HER2-targeted therapy.
What molecular characteristics do you take into consideration to help determine whether patients are eligible for these targeted treatment options?
Dr. Abdou: As we said earlier, HER2 status should no longer be recorded as a binary result of either HER2-positive or HER2-negative. It is important to start routinely testing for the level of HER2 expression in the tumor. These levels are obtained through commonly used immunohistochemical (IHC) assays that allow direct visualization of the HER2 protein. Breast tumors considered HER2-low are classified as IHC 1+, or as IHC 2+ with negative in situ hybridization (ISH/FISH) results.
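For illustration, the categorization rule just described can be written as a simple decision function. This is a hypothetical sketch for clarity only, not a substitute for pathology reporting guidelines.

```python
# Hypothetical sketch of the HER2 categorization rule described above:
# IHC 3+ is positive; IHC 2+ is resolved by ISH/FISH (amplified = positive,
# not amplified = HER2-low); IHC 1+ is HER2-low; IHC 0 is negative.
from typing import Optional


def her2_category(ihc_score: int, ish_amplified: Optional[bool] = None) -> str:
    """Classify HER2 status from an IHC score (0-3) and an ISH/FISH result."""
    if ihc_score == 3:
        return "HER2-positive"
    if ihc_score == 2:
        if ish_amplified is None:
            return "indeterminate: reflex ISH/FISH needed"
        return "HER2-positive" if ish_amplified else "HER2-low"
    if ihc_score == 1:
        return "HER2-low"
    return "HER2-negative (IHC 0)"


if __name__ == "__main__":
    print(her2_category(1))         # HER2-low
    print(her2_category(2, False))  # HER2-low (IHC 2+ with negative ISH/FISH)
    print(her2_category(2, True))   # HER2-positive
```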
HER2-low breast cancer consists of a heterogeneous group of breast cancers, most of which are HR-positive tumors, whereas about 20% are HR-negative tumors. While these tumors may have distinct molecular profiles leading to clinicopathological and prognostic differences within these groups (HR-positive tumors represent more luminal subtypes, and HR-negative tumors tend to be predominantly basal-like subtypes), these distinctions do not necessarily affect patient eligibility for targeted therapy. The benefit of trastuzumab deruxtecan was seen in both subgroups, although the HR-positive population was much better represented in the DESTINY-Breast04 study.
Other than the HER2 expression status, I also take into consideration the presence of clinical comorbidities, particularly pulmonary comorbidities or prior lung injuries. Trastuzumab deruxtecan can cause a potentially serious type of lung toxicity called interstitial lung disease (ILD). In DESTINY-Breast04, ILD developed in about 12% of patients in the trastuzumab deruxtecan group, with 3 deaths as a result.
Therefore, it’s important for us to carefully select these patients and closely monitor them while they’re on treatment.
What is next in the treatment of HER2-low breast cancer, and what would you like to see in the future?
Dr. Abdou: The exciting new field of HER2-low breast cancer has really opened the door to novel studies and clinical trials, several of which are exploring the role of antibody–drug conjugates in patients with metastatic HER2-low disease and others that are studying early-stage HER2-low breast cancer. In early-stage HER2-low breast cancer, we may potentially see an even greater benefit with these drugs because the disease has not yet developed resistance to therapy. Other studies are examining the role of combination therapy in metastatic breast cancer, such as antibody–drug conjugates in combination with immunotherapy and other targeted agents. I look forward to results from those studies.
Also, importantly, as we start using these therapies more widely, I would like to see more accurate and sensitive ways of assessing the HER2 expression status. The current IHC assay, although widely available, fails to identify many women who have HER2 expression in their tumors. I think more sensitive tests may be able to identify even more women who can benefit from these targeted therapies.
1. Modi S, Jacot W, Yamashita T, et al. Trastuzumab deruxtecan in previously treated HER2-low advanced breast cancer. N Engl J Med. 2022;387(1):9-20. doi:10.1056/NEJMoa2203690