Lessons abound for dermatologists when animal health and human health intersect

NEW YORK – We share more than affection with our dogs and cats. We also share diseases – about which our four-legged furry friends can teach us plenty.

That was the conclusion of speakers at a session on “cases at the intersection of human and veterinary dermatology,” presented at the summer meeting of the American Academy of Dermatology.

“Human health is intimately connected to animal health,” said Jennifer Gardner, MD, of the division of dermatology, University of Washington, Seattle, and a collaborating member of the school’s Center for One Health Research. The One Health framework looks at factors involved in the human, environmental, and animal sectors from the molecular level to the individual level and even to the planetary level.

Dr. Gardner challenged her audience to think beyond their individual areas of expertise. “How does the work you’re doing with a patient or test tube connect up the line and make an impact to levels higher up?” she asked.

The One Health framework also challenges practitioners to look horizontally, at how work done in the human world connects to what’s going on in the veterinary world – that is, how treatments for dermatologic conditions in dogs may one day affect how dermatologists treat the same or similar disorders in humans.

Learning from the mighty mite

For example, the study of mites that live on the skin of animals could eventually shed light on how dermatologists treat mite-related conditions in humans.

Dirk M. Elston, MD, professor and chair of the department of dermatology at the Medical University of South Carolina, Charleston, noted that Demodex mites occur in humans and in pets.

In people, they play a role in papular eruptions in immunosuppressed patients, and in rosacea, alopecia, and blepharitis, he said. Patients with pityriasis folliculorum may look like they have rosacea, “but with little spines” – which are Demodex mites dining in. “They are so crowded in there that their backsides are sticking out,” he said. “They’re all there munching on the sebaceous glands.”

In such cases, “sulfur tends to be my most reliable” treatment, he said, noting that it releases a rotten egg smell. “You’re basically gassing the organism.” Dr. Elston said he frequently gets calls from fellow dermatologists whose antimite efforts have failed with ivermectin and permethrin and does not hesitate to give his advice. “I’m like a broken record,” he said. “Sulfur, sulfur, sulfur, sulfur.”

The Demodex mite affects dogs to varying degrees, depending on where they live, said Kathryn Rook, VMD, of the department of dermatology at the University of Pennsylvania School of Veterinary Medicine, Philadelphia. Demodicosis occurs in just 0.38%-0.58% of dogs in North America overall but in as many as 25% of dogs in Mexico, she said.

Amitraz, the only Food and Drug Administration–approved treatment for canine demodicosis, is available only as a dip. But it has fallen from favor as a result of sometimes serious side effects, which can include sedation, bradycardia, ataxia, vomiting, diarrhea, and hyperglycemia.

Daily administration of oral ivermectin – often for months – also carries a risk of side effects, including dilated pupils, ataxia, sedation, stupor, coma, hypersalivation, vomiting, diarrhea, blindness, tremors, seizures, and respiratory depression.

But the discovery of the isoxazoline class has “revolutionized” the treatment of demodicosis and other parasitic infestations in dogs, Dr. Rook said, citing quicker resolution of disease and improved quality of life for both the patient and its owner.

Isoxazolines, which Dr. Rook said carry little risk of side effects, are licensed in the United States only as flea and tick preventives.

Atopic dermatitis

Atopic dermatitis (AD) tends to be similar in people and dogs, according to Charles W. Bradley, DVM, of the University of Pennsylvania School of Veterinary Medicine, Philadelphia. About 10%-30% of children and up to 10% of adults have the disorder, the prevalence of which has more than doubled in recent years, he said.

In dogs, the prevalence is 10%-20%, making it “an extraordinarily common disorder,” he said. Lesions tend to be located on the feet, face, pinnae, ventrum, and axillary/inguinal regions. Additional sites vary by breed, with Dalmatians tending to get AD on the lips, French Bulldogs on the eyelids, German Shepherds on the elbows, Shar-Peis on the thorax, and Boxers on the ears.

In humans, Staphylococcus aureus is the chief microorganism of concern, said Elizabeth Grice, PhD, of the department of dermatology at the University of Pennsylvania, Philadelphia, who copresented the topic with Dr. Bradley.

Concern about drug resistance is “one reason why we want to better understand the entire microbiome and other organisms that are colonizing the skin,” she commented. That means better understanding the relationship among S. aureus, microbial diversity, and disease severity.

“My true love is anything to do with the skin microbiome,” she said. “The more severe the disease, the lower the skin microbiome diversity.”

Though most studies of AD use mice as animal models, dogs would be better, according to Dr. Grice and Dr. Bradley.

That’s because canine AD occurs spontaneously and exhibits immunologic and clinical features similar to those of human AD. The similarities include prevalence, environmental triggers, immunologic profiles, genetic predispositions, lesion distribution, and frequent colonization by Staphylococcus species. In addition, dogs and their owners tend to share the same environment.

A rash of itches

Among dermatology patients – man or beast – itch can outweigh rash as a key focus of concern, according to Brian Kim, MD, of the division of dermatology at Washington University in St. Louis and codirector of the university’s Center for the Study of Itch. “The problem is my patients don’t complain about their rash; they complain about their itch,” he said. “But we don’t understand the basic question of itch.” In fact, the FDA has not approved any drugs for the treatment of chronic itch, he said.

Toward that end, veterinary medicine is moving faster than human medicine, he said, citing work in mice that has succeeded in killing itch.

For dogs, advances have been made with Janus kinase (JAK) inhibitors, which “may function as immunomodulators,” Dr. Kim said. And JAK-1 selective inhibition “may be more effective than broad JAK blockade for itch.”

‘The perfect culture plate’

Lessons can be learned from studying canine AD, which “is immunophysiologically homologous to human AD,” said Daniel O. Morris, DVM, MPH, professor of dermatology at the University of Pennsylvania School of Veterinary Medicine, Philadelphia. “The main difference: My patients are covered in dense hair coats.” Because of that, systemic treatment is necessary, he said.

Canine AD primarily affects areas where hair is sparse or where the surface microclimate is moist, he said. A dog’s ear canal, which can be 10 times longer than a human’s, harbors plenty of moisture and heat, he said. “It’s the perfect culture plate.”

But, he added, the owners of his patients tend to resist using topical therapies “that could be potentially smeared on the babies and grandma’s diabetic foot ulcer.” So he has long relied on systemic treatments, initially steroids and cyclosporine. But they can have major side effects, and cyclosporine can take 60-90 days before it exerts maximum effect.

A faster-acting compound called oclacitinib has shown promise based on its high affinity for inhibiting JAK-1 enzyme-mediated activation of cytokine expression, including interleukin (IL)-31, he said. “Clinical trials demonstrate an antipruritic efficacy equivalent to both prednisolone and cyclosporine,” he noted. Contraindications include a history of neoplasia, the presence of severe infection, and age under 1 year.

Monoclonal antibody targets IL-31

The latest promising arrival is lokivetmab, a monoclonal antibody that targets canine IL-31, according to Dr. Morris. It acts rapidly (within 1 day for many dogs) and prevents binding of IL-31 to its neuronal receptor for at least a month, thereby interrupting neurotransmission of itch.

But side effects can be serious and common. Equal efficacy with a reduced side effect profile is the holy grail, he said.

Some doctors are not waiting. “People are throwing these two products at anything that itches,” he said. Unfortunately, they tend to “work miserably” for causes other than AD, he added.

Dr. Gardner, Dr. Elston, Dr. Rook, Dr. Bradley, and Dr. Morris reported no financial conflicts. Dr. Grice’s disclosures include having served as a speaker for GlaxoSmithKline and for L’Oreal France, and having received grants/research funding from Janssen Research & Development. Dr. Kim has served as a consultant to biotechnology and pharmaceutical companies.

AT THE 2017 AAD SUMMER MEETING

Lomitapide manufacturer will plead guilty to two misdemeanor charges of misbranding

Aegerion Pharmaceuticals has agreed to plead guilty in the United States District Court for the District of Massachusetts to two misdemeanor counts of violating the Federal Food, Drug, and Cosmetic Act (FD&C Act) involving the introduction of misbranded Juxtapid (lomitapide) into interstate commerce, according to a press release from the Food and Drug Administration.

Juxtapid was misbranded because Aegerion failed to comply with the requirements of the Juxtapid Risk Evaluation and Mitigation Strategy (REMS) program and because the drug’s labeling lacked adequate directions for all of Juxtapid’s intended uses, according to the charges. Aegerion also agreed to a comprehensive compliance program and to legal tools that allow the FDA to ensure the company complies with the law, subject to judicial oversight.

“By failing to follow the safety requirements that Aegerion had agreed to, the company put patients’ lives at risk and didn’t honor the safety commitments they made as a condition of gaining approval for their drug. This is unacceptable. We will continue to pursue those who skirt the law, and flout patient safety and other postmarket commitments, using all of the enforcement tools available to us. Postmarket safety requirements are a key element of the FDA’s public health protections and we will ensure that they are fulfilled,” FDA Commissioner Scott Gottlieb, MD, said in the statement.

Rather than following the REMS requirement to distribute Juxtapid only for the narrow indication of homozygous familial hypercholesterolemia, Aegerion portrayed the definition of the rare disorder as vague and indefinite in order to extend its use to lower-risk patients. Further, Aegerion filed a misleading REMS assessment report to the FDA in which the company failed to disclose that it was distributing Juxtapid using this definition, which was inconsistent with Aegerion’s preapproval filings and peer-reviewed clinical standards of diagnosis, according to the FDA release.

Once entered by the court, the plea and consent decree will be part of a global resolution of multiple government investigations into Aegerion’s conduct with respect to the marketing and distribution of Juxtapid. This resolution was the result of a coordinated effort by the U.S. Department of Justice and several government agencies, including the FDA, the press release stated.

Juxtapid was approved in December 2012 as an adjunct therapy to treat homozygous familial hypercholesterolemia. The Juxtapid REMS requires Aegerion to educate prescribers about the risks of hepatotoxicity and the need to monitor patients treated with Juxtapid and to ensure that Juxtapid is prescribed and dispensed only to those patients with a clinical or laboratory diagnosis consistent with homozygous familial hypercholesterolemia.

Publications
Topics
Sections

Aegerion Pharmaceuticals has agreed to plead guilty in the United States District Court for the District of Massachusetts to two misdemeanor counts of violating the Federal Food, Drug, and Cosmetic Act (FD&C Act) involving the introduction of misbranded Juxtapid (lomitapide) into interstate commerce, according to a press release from the Food and Drug Administration.

Juxtapid was misbranded because Aegerion failed to comply with the requirements of the Juxtapid Risk Evaluation and Mitigation Strategy (REMS) program and because the drug’s labeling lacked adequate directions for all of Juxtapid’s intended uses, according to the charges. Aegerion also agreed to a comprehensive compliance program and legal tools for the FDA to ensure that Aegerion complies with the law, subject to judicial oversight.

“By failing to follow the safety requirements that Aegerion had agreed to, the company put patients’ lives at risk and didn’t honor the safety commitments they made as a condition of gaining approval for their drug. This is unacceptable. We will continue to pursue those who skirt the law, and flout patient safety and other postmarket commitments, using all of the enforcement tools available to us. Postmarket safety requirements are a key element of the FDA’s public health protections and we will ensure that they are fulfilled,” FDA Commissioner Scott Gottlieb, MD, said in the statement.

Rather than following the REMS requirement to distribute Juxtapid only for the narrow indication of homozygous familial hypercholesterolemia, Aegerion portrayed the definition of the rare disorder as vague and indefinite in order to extend its use to lower-risk patients. Further, Aegerion filed a misleading REMS assessment report to the FDA in which the company failed to disclose that it was distributing Juxtapid using this definition, which was inconsistent with Aegerion’s preapproval filings and peer-reviewed clinical standards of diagnosis, according to the FDA release.

Once entered by the court, the plea and consent decree will be part of a global resolution of multiple government investigations into Aegerion’s conduct with respect to the marketing and distribution of Juxtapid. This resolution was the result of a coordinated effort by the U.S. Department of Justice and several government agencies, including the FDA, the press release stated.

Juxtapid was approved in December 2012 as an adjunct therapy to treat homozygous familial hypercholesterolemia. The Juxtapid REMS requires Aegerion to educate prescribers about the risks of hepatotoxicity and the need to monitor patients treated with Juxtapid and to ensure that Juxtapid is prescribed and dispensed only to those patients with a clinical or laboratory diagnosis consistent with homozygous familial hypercholesterolemia.


EC expands approval of obinutuzumab in FL

Article Type
Changed
Fri, 12/16/2022 - 12:21

 

Micrograph showing follicular lymphoma

The European Commission (EC) has expanded the marketing authorization for obinutuzumab (Gazyvaro).

The drug is now approved for use in combination with chemotherapy to treat patients with previously untreated, advanced follicular lymphoma (FL). Patients who respond to this treatment can then receive obinutuzumab maintenance.

This is the third EC approval for obinutuzumab.

The drug was first approved by the EC in 2014 to be used in combination with chlorambucil to treat patients with previously untreated chronic lymphocytic leukemia and comorbidities that make them unsuitable for full-dose fludarabine-based therapy.

In 2016, the EC approved obinutuzumab in combination with bendamustine, followed by obinutuzumab maintenance, in FL patients who did not respond to, or who progressed during or up to 6 months after, treatment with rituximab or a rituximab-containing regimen.

The EC’s latest approval of obinutuzumab is based on results of the phase 3 GALLIUM trial, which were presented at the 2016 ASH Annual Meeting.

The study enrolled 1401 patients with previously untreated, indolent non-Hodgkin lymphoma, including 1202 with FL.

Half of the FL patients (n=601) were randomized to receive obinutuzumab plus chemotherapy (followed by obinutuzumab maintenance for up to 2 years), and half were randomized to rituximab plus chemotherapy (followed by rituximab maintenance for up to 2 years).

The different chemotherapies used were CHOP (cyclophosphamide, doxorubicin, vincristine, and prednisolone), CVP (cyclophosphamide, vincristine, and prednisolone), and bendamustine.

Patients who received obinutuzumab had significantly better progression-free survival than patients who received rituximab. The 3-year progression-free survival rate was 73.3% in the rituximab arm and 80% in the obinutuzumab arm (hazard ratio [HR]=0.66, P=0.0012).

There was no significant difference between the treatment arms with regard to overall survival. The 3-year overall survival was 92.1% in the rituximab arm and 94% in the obinutuzumab arm (HR=0.75, P=0.21).

The overall incidence of adverse events (AEs) was 98.3% in the rituximab arm and 99.5% in the obinutuzumab arm. The incidence of serious AEs was 39.9% and 46.1%, respectively.

The incidence of grade 3 or higher AEs was higher among patients who received obinutuzumab.

Grade 3 or higher AEs occurring in at least 5% of patients in either arm (rituximab and obinutuzumab, respectively) included neutropenia (67.8% and 74.6%), leukopenia (37.9% and 43.9%), febrile neutropenia (4.9% and 6.9%), infections and infestations (3.7% and 6.7%), and thrombocytopenia (2.7% and 6.1%).


Swiss freedom, hidden witnesses, and beer

Article Type
Changed
Mon, 01/14/2019 - 10:09

Most of medical practice is mundane. Just a few interesting cases pass through now and then to break up the clinical routine, a rhythm that’s fine with me.

More often, patients expand my vistas by telling me about places I’ve never been and things I didn’t know and couldn’t imagine. Sometimes these tales are even more riveting than atopic dermatitis or mildly dysplastic nevi. Learning about them leaves me smiling and scratching my head. What will they tell me about next?

Dr. Alan Rockoff
Valentina, a native of Zurich, has been coming for a skin check every summer for years. She teaches engineering up in New Hampshire.

“I always come to Boston around this time,” she said. “But today I actually am celebrating Swiss National Day.”

“No kidding,” I said. “What is Swiss National Day?”

“We commemorate the founding of Switzerland in 1291,” she said.

“And how do you celebrate it?” I asked.

“Well, we are Swiss,” she said, “so we work all day. Then we have a party in the evening.

“That is how it used to be anyway,” she said. “About 40 years ago, the parties on the left and right made a deal and established two holidays: Labor Day on May 1st and the National Holiday on August 1st. Now we get those whole days off.”

“Which was the end of Swiss civilization as we know it,” I suggested.

“That’s exactly what my father said when it happened,” said Valentina, with a restrained, Swiss smile. “But somehow life goes on for us, even with 2 days a year off.”
swisshippo/Thinkstock

 

* * * * * * * * * * * * * * * * * * * * * * *

When I picked up his chart, I saw that my patient’s last name suggested that he hailed from one of the countries left after the breakup of Yugoslavia. We’ll call him Magovcevic.

As soon as I walked in, however, it was clear that wherever he came from was nowhere near Serbia. His features and accent were Brazilian.

“I come from Minas Gerais,” he said, “in the South, not far from Rio.”

“So how come you have a Slavic name?” I asked him.

“My parents had a different last name,” he said.

“Then how did you come to be called Magovcevic?” I asked.

“I’m in the witness protection program,” he said.

I had to hold onto the sink to stay upright. Of all the possible responses he could have made, that one was not on my list.

“Did you pick the name yourself?” I asked. I don’t think I’d ever given a thought to how family names are chosen for people in witness protection.

“No, they gave it to me,” he said. “I was still a minor.”

At that point I stopped asking questions. Whatever it was that he witnessed as a minor that landed him in witness protection I didn’t want to know about.
 

* * * * * * * * * * * * * * * * * * * * * * *

Myrna was very happy to tell me that her son was doing well in college and had a good summer job.

“He works in a beer garden downtown,” she said. “The tips are great.”

“What is he studying in school?” I asked.

“Fermentation studies,” she replied.

After she’d said he was moonlighting in a beer garden, I thought she was pulling my leg. I know college students have keg parties after class, but I didn’t know they studied what goes into the kegs during class.

But Myrna was serious. “He’s interested in biochemistry,” she explained. “He wants to focus on developing better beers.”

A younger colleague whom I told about this chuckled at my perplexity. “Sure,” she said, “fermentation studies is the hot new field. Lots of people are getting into it.”

I have long since resigned myself to being clueless about what younger people are into, especially social media. But I found myself bemused at how it just never occurred to me that bright young biochemists might burn with ambition to bring the world better craft beers.

I have since learned that fermentation studies have other applications too. Like wine. And wine, like cosmetics, has been around a lot longer than dermatology.
 

* * * * * * * * * * * * * * * * * * * * * * *

Skin is interesting, but the people inside it are often even more so. Who knows what I’ll run into tomorrow? I won’t even try to guess.
 

 

 

Dr. Rockoff practices dermatology in Brookline, Mass., and is a longtime contributor to Dermatology News. He serves on the clinical faculty at Tufts University, Boston, and has taught senior medical students and other trainees for 30 years. His second book, “Act Like a Doctor, Think Like a Patient,” is available at amazon.com and barnesandnoble.com.




Platelet-rich plasma treatment for hair loss continues to be refined

Article Type
Changed
Mon, 01/14/2019 - 10:09

There is currently no standard protocol for injecting autologous platelet-rich plasma to stimulate hair growth, but the technique appears to be about 50% effective, according to Marc R. Avram, MD.

“I tell patients that this is not FDA [Food and Drug Administration] approved, but we think it to be safe,” Dr. Avram, clinical professor of dermatology at Cornell University, New York, said at the annual Masters of Aesthetics Symposium. “We don’t know how well it’s going to work. There are a lot of published data on it, but none of [them are] randomized or controlled long-term.”

toeytoey2530/Thinkstock
While the precise mechanism of action of platelet-rich plasma (PRP) remains elusive, researchers hypothesize that the alpha-granules platelets release upon activation deliver platelet-derived growth factor, transforming growth factor–beta, vascular endothelial growth factor, epidermal growth factor, fibroblast growth factor, and insulinlike growth factor–1, which collectively help stimulate the hair cycle.

In Dr. Avram’s experience, PRP is a good option for patients with difficult hair loss, such as those whose hair never regrew in the same fashion after extensive chemotherapy-related loss, or patients in whom treatment with finasteride and minoxidil has failed.

Currently, there is no standard protocol for using PRP to stimulate hair growth, but the approach Dr. Avram follows is modeled on his experience of injecting thousands of patients with triamcinolone acetonide (Kenalog) for hair loss every 4-6 weeks. After 20-30 cc of blood is drawn from the patient, the vial is spun in a centrifuge for 10 minutes, a process that separates the PRP from the red blood cells. The clinician then injects the PRP into the deep dermis/superficial subcutaneous tissue of the desired treatment area, an average of 4-8 cc per session.

After three monthly treatments, patients follow up at 3 and 6 months after the last treatment to evaluate efficacy. “All patients are told if there is regrowth or thickening of terminal hair, maintenance treatments will be needed every 6-9 months,” he said.

Published clinical trials of PRP report follow-up periods of 3-12 months, and most demonstrate efficacy in the range of 50%-70%. “It seems to be more effective for earlier stages of hair loss, and there are no known side effects to date,” said Dr. Avram, who has authored five textbooks on hair and cosmetic dermatology. “I had one patient call up to say he thought he had an increase in hair loss 2-3 weeks after treatment, but that’s one patient in a couple hundred. This may be similar to the effect minoxidil has on some patients. I’ve had no other issues with side effects.”

In his opinion, future challenges in the use of PRP for restoring hair loss include better defining optimal candidates for the procedure and establishing a better treatment protocol. “How often should maintenance be done?” he asked. “Is this going to be helpful for alopecia areata and scarring alopecia? Also, we need to determine if finasteride, minoxidil, low-level light laser therapy, or any other medications can enhance PRP efficacy in combination. What’s the optimal combination for patients? We don’t know yet. But I think in the future we will.”

Dr. Avram disclosed that he is a consultant for Restoration Robotics.
 


– There is currently no standard protocol for injecting autologous platelet-rich plasma to stimulate hair growth, but the technique appears to be about 50% effective, according to Marc R. Avram, MD.

“I tell patients that this is not FDA [Food and Drug Administration] approved, but we think it to be safe,” said Dr. Avram, clinical professor of dermatology at the Cornell University, New York, said at the annual Masters of Aesthetics Symposium. “We don’t know how well it’s going to work. There are a lot of published data on it, but none of [them are] randomized or controlled long-term.”

toeytoey2530/Thinkstock

– There is currently no standard protocol for injecting autologous platelet-rich plasma to stimulate hair growth, but the technique appears to be about 50% effective, according to Marc R. Avram, MD.

“I tell patients that this is not FDA [Food and Drug Administration] approved, but we think it to be safe,” Dr. Avram, clinical professor of dermatology at Cornell University, New York, said at the annual Masters of Aesthetics Symposium. “We don’t know how well it’s going to work. There are a lot of published data on it, but none of [them are] randomized or controlled long-term.”

toeytoey2530/Thinkstock
While the precise mechanism of action of platelet-rich plasma (PRP) remains elusive, researchers hypothesize that platelets contain alpha-granules, which are released upon activation. In turn, this action releases platelet-derived growth factor, transforming growth factor–beta, vascular endothelial growth factor, epidermal growth factor, fibroblast growth factor, and insulinlike growth factor–1, which collectively help to stimulate the hair cycle.

In Dr. Avram’s experience, PRP is a good option for patients with difficult hair loss, such as those whose hair never grew back in the same fashion after extensive chemotherapy-related loss, or patients who have not responded to finasteride and minoxidil.

Currently, there is no standard protocol for using PRP to stimulate hair growth, but the approach Dr. Avram follows is modeled on his experience injecting thousands of patients with triamcinolone acetonide (Kenalog) for hair loss every 4-6 weeks. After drawing 20-30 cc of blood from the patient, the clinician places the vial in a centrifuge for 10 minutes, a process that separates the PRP from the red blood cells, then injects the PRP into the deep dermis/superficial subcutaneous tissue of the desired treatment area. An average of 4-8 cc is injected during each session.

After three monthly treatments, patients follow up at 3 and 6 months after the last treatment to evaluate efficacy. “All patients are told if there is regrowth or thickening of terminal hair, maintenance treatments will be needed every 6-9 months,” he said.

Published clinical trials of PRP include a follow-up period of 3-12 months and most demonstrate an efficacy in the range of 50%-70%. “It seems to be more effective for earlier stages of hair loss, and there are no known side effects to date,” said Dr. Avram, who has authored five textbooks on hair and cosmetic dermatology. “I had one patient call up to say he thought he had an increase in hair loss 2-3 weeks after treatment, but that’s one patient in a couple hundred. This may be similar to the effect minoxidil has on some patients. I’ve had no other issues with side effects.”

In his opinion, future challenges in the use of PRP for restoring hair loss include better defining optimal candidates for the procedure and establishing a better treatment protocol. “How often should maintenance be done?” he asked. “Is this going to be helpful for alopecia areata and scarring alopecia? Also, we need to determine if finasteride, minoxidil, low-level light laser therapy, or any other medications can enhance PRP efficacy in combination. What’s the optimal combination for patients? We don’t know yet. But I think in the future we will.”

Dr. Avram disclosed that he is a consultant for Restoration Robotics.
 


AT MOAS 2017


Sharing drug paraphernalia alone didn’t transmit HCV

Article Type
Changed
Fri, 01/18/2019 - 17:02

Hepatitis C virus couldn’t be transmitted through shared drug preparation paraphernalia, but sharing paraphernalia was associated with sharing syringes nevertheless, according to researchers at Yale University, New Haven, Conn.

That makes sharing paraphernalia a “surrogate” for HCV transmission “resulting from sharing drugs,” the investigators said, but it should not be a primary focus of harm-reduction and education programs.

PaulPaladin/thinkstock
In their experiment, Robert Heimer, PhD, and his colleagues prepared syringes contaminated with HCV and attempted to replicate the conditions in which persons injecting drugs share packages of drugs, syringes, and paraphernalia.

“Water was introduced into the barrel of a contaminated ‘input’ syringe and expelled into a ‘cooker,’ and the water was drawn up into a ‘receptive’ syringe through a cotton filter,” the study authors explained. “The ‘input’ syringe, ‘cooker,’ and filter were rinsed with tissue culture medium and introduced into the microculture assay. The water drawn into the second syringe was combined with an equal volume of double-strength medium and introduced into the microculture assay” (J Infect Dis. 2017. doi: 10.1093/infdis/jix427).

The researchers tested syringes (with fixed or detachable needles), cookers, and filters (single or pooled). They were significantly more likely to recover HCV from detachable-needle syringes than from fixed-needle syringes. In the input syringes, they recovered no HCV from 70 fixed-needle syringes while they did recover HCV from 96 of 130 (73.8%) detachable-needle syringes. HCV passed to both types of syringes in the experiment’s receptive syringes but at a much higher rate for those with detachable needles than for those with fixed needles (93.8% vs. 45.7%, respectively).
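The percentages reported above follow directly from the raw counts; a quick arithmetic check (counts taken from the article as reported; the `pct` helper is ours, not the study's):

```python
# Sanity-check the HCV recovery rates reported above from the raw counts.
def pct(positive, total):
    """Percentage of HCV-positive samples, rounded to one decimal place."""
    return round(100 * positive / total, 1)

# Detachable-needle input syringes: 96 of 130 were HCV-positive.
print(pct(96, 130))  # 73.8, matching the reported 73.8%
```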

No HCV was recovered from any of the cookers, regardless of syringe type. Some was recovered from filters, at higher rates with detachable needles than with fixed (27.1% vs. 1.4%).

“Money spent on ‘cookers’ and filters would be better spent on giving away more syringes,” Dr. Heimer and his coauthors concluded. “Because HCV and HIV transmission are more likely if the syringe has a detachable rather than a fixed needle, efforts should focus on providing more syringes with fixed needles.”




FROM THE JOURNAL OF INFECTIOUS DISEASES


With inpatient flu shots, providers’ attitude problem may outweigh parents’

Article Type
Changed
Fri, 01/18/2019 - 17:02

Provider misconceptions about parental attitudes may be a more significant barrier to inpatient flu vaccination than the parents’ actual attitudes, reported Suchitra Rao, MD, of the University of Colorado, Aurora, and her colleagues.

Surveys assessing attitudes toward inpatient influenza vaccination were given to parents/caregivers of general pediatric inpatients and to inpatient physicians, residents, nurses, physician assistants, and nurse practitioners at Children’s Hospital Colorado in Aurora between October 2014 and March 2015. Response rates were 95% of the 1,053 parents/caregivers and 58% of the 339 providers.

luiscar/Thinkstock
At the time of admission, 37% of the children had received the current season’s flu vaccine. The children’s median age was 4 years; 70% were white, 54% were male, and 44% had a high-risk medical condition. Of the 55% of parents who said their child usually gets an annual flu shot, 92% would agree to an inpatient flu shot if eligible. Of the 45% of parents who said they don’t routinely get their child an annual flu shot, 37% would agree to inpatient flu vaccination if eligible.

The parents agreed that the flu is a serious disease (92%), that flu vaccines work (58%), that flu vaccines are safe (76%), and that the vaccines are needed annually (76%), Dr. Rao and her colleagues found.

The providers thought the most common barriers to vaccination were parental refusal because of child illness (80%) and family misconceptions about the vaccine (74%). Also, 54% of providers forgot to ask about flu vaccination status and 46% forgot to order flu vaccines.

When asked what interventions might increase flu vaccination rates in the inpatient setting, 73% of providers agreed that personal reminders might help increase vaccination rates, but only 48% thought that provider education might help do so.

Read more in the journal Influenza and Other Respiratory Viruses (2017 Sep 5. doi: 10.1111/irv.12482).




Can arterial switch operation impact cognitive deficits?

An ‘important start’
Article Type
Changed
Tue, 02/14/2023 - 13:05

With dramatic advances in neonatal repair of complex cardiac disease, the population of adults with congenital heart disease (CHD) has grown substantially, and while studies have shown an increased risk of neurodevelopmental and psychological disorders in these patients, few have evaluated their cognitive and psychosocial outcomes. Now, a review of young adults in France who had an arterial switch operation for transposition of the great arteries has found that they have almost twice the rate of cognitive difficulties, and more than triple the rate of cognitive impairment, of healthy peers.

“Despite satisfactory outcomes in most adults with transposition of the great arteries (TGA), a substantial proportion has cognitive or psychologic difficulties that may reduce their academic success and quality of life,” said lead author David Kalfa, MD, PhD, of Morgan Stanley Children’s Hospital of New York-Presbyterian, Columbia University Medical Center and coauthors in the September issue of the Journal of Thoracic and Cardiovascular Surgery (2017;154:1028-35).

The study involved a review of 67 adults aged 18 and older born with TGA between 1984 and 1985 who had an arterial switch operation (ASO) at two hospitals in France: Necker Children’s Hospital in Paris and Marie Lannelongue Hospital in Le Plessis-Robinson. The researchers performed a matched analysis with 43 healthy subjects for age, gender, and education level.

The researchers found that 69% of the TGA patients had an intelligence quotient in the normal range of 85-115. The TGA patients had lower mean quotients for full-scale intelligence (94.9 ± 15.3 vs. 103.4 ± 12.3 in healthy subjects; P = .003), verbal intelligence (96.8 ± 16.2 vs. 102.5 ± 11.5; P = .033), and performance intelligence (93.7 ± 14.6 vs. 103.8 ± 14.3; P less than .001).

The TGA patients also had higher rates of cognitive difficulties, defined as an intelligence quotient at or below 1 standard deviation under the mean, and of cognitive impairment, defined as an intelligence quotient at or below 2 standard deviations under the mean: 31% vs. 16% (P = .001) for the former and 6% vs. 2% (P = .030) for the latter.
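On the conventional IQ scale (mean 100, standard deviation 15, an assumption consistent with the 85-115 normal range quoted above), those thresholds correspond to concrete cutoff scores. A minimal illustrative sketch, not taken from the study itself:

```python
# Illustrative classification using the study's thresholds on the
# conventional IQ scale (mean 100, SD 15); the scale parameters are an
# assumption consistent with the 85-115 "normal range" cited above.
MEAN, SD = 100, 15

def classify(iq):
    z = (iq - MEAN) / SD  # standard deviations from the mean
    if z <= -2:
        return "cognitive impairment"    # IQ of 70 or below
    if z <= -1:
        return "cognitive difficulties"  # IQ of 85 or below
    return "no deficit by these definitions"

print(classify(84))  # cognitive difficulties
print(classify(70))  # cognitive impairment
```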

TGA patients with cognitive difficulties had lower educational levels and were also more likely to repeat grades in school, Dr. Kalfa and coauthors noted. “Patients reported an overall satisfactory health-related quality of life,” Dr. Kalfa and coauthors said of the TGA group; “however, those with cognitive or psychologic difficulties reported poorer quality of life.” The researchers identified three predictors of worse outcomes: lower parental socioeconomic and educational status; older age at surgery; and longer hospital stays.

“Our findings suggest that the cognitive morbidities commonly reported in children and adolescents with complex CHD persist into adulthood in individuals with TGA after the ASO,” Dr. Kalfa and coauthors said. Future research should evaluate specific cognitive domains such as attention, memory, and executive functions. “This consideration is important for evaluation of the whole [adult] CHD population because specific cognitive impairments are increasingly documented into adolescence but remain rarely investigated in adulthood,” the researchers said.

Dr. Kalfa and coauthors reported having no financial disclosures.


The findings by Dr. Kalfa and coauthors may point the way to improving cognitive outcomes in children who have the arterial switch operation, said Ryan R. Davies, MD, of A.I. duPont Hospital for Children in Wilmington, Del., in his invited commentary (J Thorac Cardiovasc Surg. 2017;154:1036-7). “Modifiable factors may exist both during the perioperative stage (perfusion strategies, intensive care management) and over the longer term (early neurocognitive assessments and interventions),” Dr. Davies said.

That parental socioeconomic status is associated with cognitive performance suggests early intervention and education “may pay long-term dividends,” Dr. Davies said. Future studies should focus on the impact of specific interventions and identify modifiable developmental factors, he said.

Dr. Ryan R. Davies

Dr. Kalfa and coauthors have provided an “important start” in that direction, Dr. Davies said. “They have shown that the neurodevelopmental deficits seen early in children with CHD persist into adulthood,” he said. “There are also hints here as to where interventions may be effective in ameliorating those deficits.”

Dr. Davies reported having no financial disclosures.




FROM THE JOURNAL OF THORACIC AND CARDIOVASCULAR SURGERY

Vitals

 

Key clinical point: A substantial proportion of young adults who had transposition of the great arteries have cognitive or psychological difficulties.

Major finding: Cognitive difficulties were significantly more frequent in the study population than in the general population, 31% vs. 16%.

Data source: Age-, gender-, and education level–matched population of 67 young adults with transposition of the great arteries and 43 healthy subjects.

Disclosures: Dr. Kalfa and coauthors reported having no financial disclosures.


Diagnostic laparoscopy pinpoints postop abdominal pain in bariatric patients


 

The etiology of chronic pain after bariatric surgery can be difficult to pinpoint, but diagnostic laparoscopy can detect causes in about half of patients, findings from a small study have shown.

In an investigation conducted by Mohammed Alsulaimy, MD, a surgeon at the Bariatric and Metabolic Institute at the Cleveland Clinic, and his colleagues, 35 patients underwent diagnostic laparoscopy (DL) to identify the causes of their chronic abdominal pain after bariatric surgery. Patients included in the study had a history of abdominal pain lasting longer than 30 days after their bariatric procedure, a negative CT scan of the abdomen and pelvis, a gallstone-negative abdominal ultrasound, and an upper GI endoscopy with no abnormalities. The researchers collected patient data including age, gender, body weight, body mass index, type of previous bariatric procedure, and time between surgery and onset of pain.

These patients presented with abdominal pain a median of 26 months after surgery, although nine patients began to have abdominal pain within the first year (Obes Surg. 2017 Aug;27[8]:1924-8).

The results of DL were classified as either positive (pathology or injury detected) or negative (no pathology or injury detected).

Twenty patients (57%) had positive findings on DL, including adhesions, chronic cholecystitis, a mesenteric defect, internal hernia, and necrotic omentum; of this group, 43% had treatment that led to improvement of pain symptoms. Only 1 of the 15 patients with negative DL findings had eventual improvement of their pain symptoms. Most patients with negative DL findings had persistent abdominal pain, possibly because of nonorganic causes, and were referred to the chronic pain management service, the investigators wrote.

“About 40% of patients who undergo DL and 70% of patients with positive findings on DL experience significant symptom improvement,” the investigators said. “This study highlights the importance of offering DL as both a diagnostic and therapeutic tool in post–bariatric surgery patients with chronic abdominal [pain] of unknown etiology.”

The investigators had no relevant financial disclosures to report.



FROM OBESITY SURGERY

Vitals

 

Key clinical point: Diagnostic laparoscopy successfully identified the cause of post–bariatric surgery abdominal pain in about half of patients.

Major finding: In the study group, 57% of patients had a positive diagnostic laparoscopy results identifying the source of their chronic abdominal pain.

Data source: Retrospective review of post–bariatric surgery patients who underwent diagnostic laparoscopy (DL) during 2003-2015.

Disclosures: The investigators had no relevant financial disclosures to report.


Nature versus nurture: 50 years of a popular debate


 

Is human behavior driven by innate biological forces, or is it the product of our learning and environment? This basic question has been debated in settings ranging from scientific conferences to dinner tables for many decades. The media also have covered it in forms ranging from documentaries to the popular comedy film “Trading Places” (1983). Yet, despite so much attention and so much research devoted to resolving this timeless debate, the arguments continue to this day.

A lack of a clear answer, however, by no means implies that we have not made major advances in our understanding. This short review takes a look at the progression of this seemingly eternal question by categorizing the development of the nature versus nurture question into three main stages. While such a partitioning is somewhat oversimplified with regard to what the various positions on this issue have been at different times, it does illustrate the way that the debate has gradually evolved.
 

Part 1: Nature versus nurture

The origins of the nature versus nurture debate date back far beyond the past 50 years. The ancient Greek physician Galen postulated that personality traits were driven by the relative concentrations of four bodily fluids or “humours.” In 1874, Sir Francis Galton published “English Men of Science: Their Nature and Nurture,” in which he advanced his ideas about the dominance of hereditary factors in intelligence and character at the beginning of the eugenics movement.1 These ideas were in stark opposition to the perspective of earlier scholars, such as the philosopher John Locke, who popularized the theory that children are born a “blank slate” and from there develop their traits and intellectual abilities through their environment and experiences.

Fifty years ago, some of the same arguments were being heard and supported by early research. The behaviorism movement, started by people such as John Watson, PhD, was a prominent force at that time, with notable psychologists such as B.F. Skinner, PhD, showing evidence in many experiments with both animals and people regarding the importance of rewards and punishments in shaping behavior.

The other primary school of thought in the mid-1960s was psychoanalysis, which was based on the ideas of Sigmund Freud, MD. Psychoanalysis maintains that the way that unconscious sexual and aggressive drives were channeled through various defense mechanisms was of primary importance to the understanding of both psychopathology and typical human behavior.

While these two perspectives were often very much in opposition to each other, they shared the view that the environment and a person’s individual experiences – i.e., nurture – were the prevailing forces in development. In the background, more biologically oriented research and clinical work was slowly making its way into the field, especially at certain institutions, such as Washington University in St. Louis. Several medications of various types were then available, including chlorpromazine, imipramine, and diazepam.

Overall, however, it is probably fair to say that, 50 years ago, it was the nurture perspective that held the most sway since psychodynamic treatment and behaviorist research dominated, while the emerging fields of genetics and neuroscience were only beginning to take hold.
 

Part 2: Nature and nurture

From the 1970s to the end of the 20th century, a noticeable shift occurred as knowledge of the brain and genetics – supported by remarkable advances in research techniques – began to swing the pendulum back toward an increased appreciation of nature as a critical influence on a person’s thoughts, feelings, and behavior.

Researchers Stella Chess, MD, and Alexander Thomas, MD, for example, conducted the New York Longitudinal Study, in which they closely observed a group of young children over many years. Their studies compelled them to argue for the significance of more innate temperament traits as critical aspects of a youth’s overall adjustment.2 The Human Genome Project was launched in 1990, and the entire decade was designated as the “Decade of the Brain.” During this time, neuroscience research exploded as techniques, such as MRI and PET, allowed scientists to view the living brain like never before.

The type of research investigation that perhaps was most directly relevant to the nature-nurture debate and that became quite popular during this time was the twin study. By comparing the relative similarities among monozygotic and dizygotic twins raised in the same household, it became possible to calculate directly the degree to which a variable of interest (intelligence, height, aggressive behavior) could be attributed to genetic versus environmental factors. When it came to behavioral variables, a repeated finding was that both genetic and environmental influences are important, often at close to a 50/50 split in terms of magnitude.3,4 These studies were complemented by molecular genetic studies, which were beginning to identify specific genes that usually conveyed small amounts of risk for a wide range of psychiatric disorders.
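The logic behind those twin-study estimates can be sketched with the classic Falconer (ACE) decomposition – a textbook simplification offered here for illustration, not a formula drawn from the studies cited above. Additive genetic effects (a²), shared environment (c²), and nonshared environment (e²) are recovered from the observed twin correlations:

```latex
% ACE decomposition of twin correlations (Falconer's approximation).
% r_MZ and r_DZ are the trait correlations for monozygotic and
% dizygotic pairs raised in the same household.
r_{MZ} = a^2 + c^2, \qquad r_{DZ} = \tfrac{1}{2}a^2 + c^2
% Solving the two equations gives the variance components:
a^2 = 2\,(r_{MZ} - r_{DZ}), \qquad c^2 = 2\,r_{DZ} - r_{MZ}, \qquad e^2 = 1 - r_{MZ}
% Illustrative (hypothetical) values: r_MZ = 0.75, r_DZ = 0.50
% yield a^2 = 0.50 and c^2 + e^2 = 0.50 -
% the roughly 50/50 genetic/environmental split described above.
```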

Yet, while twin studies and many other lines of research made it increasingly difficult to argue for the overwhelming supremacy of either nature or nurture, the two domains generally were treated as being independent of each other. Specific traits or symptoms in an individual often were thought of as being the result of either psychological (nurture) or biological (nature) causes. Terms such as “endogenous depression,” for example, were used to distinguish those who had symptoms that were thought generally to be out of reach for “psychological” treatments, such as psychotherapy. Looking back, it might be fair to say that one of the principal flaws in this perspective was the commonly held belief that, if something was brain based or biological, then it therefore implied a kind of automatic “wiring” of the brain that was generally driven by genes and beyond the influence of environmental factors.
 

 

 

Part 3: Nature is nurture (and vice versa)

As the science progressed, it became increasingly clear that the nature and nurture domains were hopelessly intertwined. From early PET-scan studies showing that medications and psychotherapy not only changed the brain but did so in similar ways, to behavioral-genetic studies showing how genetically influenced behaviors actually make certain environmental events more likely to occur, research continued to demonstrate the bidirectional influences of genetic and environmental factors on development.5,6 This appreciation rose to even greater heights with advances in the field of epigenetics, which documented some of the specific mechanisms through which environmental factors turn on and off genes involved in regulating the plasticity of the brain.7

Given this modern understanding, the question of nature versus nurture ceases even to make sense in many ways. As an example, consider the developmental pathway a 10-year-old boy might have taken before eventually presenting to his pediatrician for severe general and social anxiety. He may have inherited a genetically based temperamental predisposition to being anxious from his parents, who also may struggle with anxiety. Then, those predispositions easily can evoke responses from his parents and teachers to shield and perhaps overprotect him, thereby limiting his opportunities to overcome his anxiety further. He selects friends and activities that match his more inhibited temperament. All of these environmental effects – some of which have been triggered by the boy’s genes – result in real changes with regard to his brain structure and epigenetic modifications, with the end result being an anxious child whose stress pathways in the brain have been reinforced while the circuits involved in emotional regulation are not as structurally or functionally strong as they otherwise would be.

In thinking through some of this complexity, however, it is important to remember the hopeful message contained in this rich understanding. All of these complicated, interacting genetic and environmental factors give us many avenues for positive intervention. Now we understand that not only might a medication help strengthen some of the brain connections needed to reduce and cope with that child’s anxiety, but so could mindfulness, exercise, and addressing his parents’ symptoms. When families ask me whether their child’s struggles are biological or psychological, the answer I tend to give them is “yes.”
 

Dr. Rettew is a child and adolescent psychiatrist and associate professor of psychiatry and pediatrics at the University of Vermont, Burlington.

Email him at [email protected]. Follow him on Twitter @pedipsych.

References

1. “English Men of Science: Their Nature and Nurture” (London: MacMillan & Co., 1874)

2. “Temperament: Theory and Practice” (New York: Brunner/Mazel, 1996)

3. “Nature and Nurture during Infancy and Early Childhood” (New York: Cambridge University Press, 1988)

4. Nat Genet. 2015;47(7):702-9.

5. Arch Gen Psychiatry. 1992;49(9):681-9.

6. Dev Psychopathol. 1997 Spring;9(2):335-64.

7. JAMA Psychiatry. 2017;74(6):551-2.

Publications
Topics
Sections

 

Is human behavior driven by innate biological forces or it is the product of our learning and the environment? This basic question has been debated at settings ranging from scientific conferences to dinner tables for many decades. The media also has covered it in forms ranging from documentaries to the popular comedy movie “Trading Places” (1983). Yet, despite so much attention and so much research devoted to resolving this timeless debate, the arguments continue to this day.

A lack of a clear answer, however, by no means implies that we have not made major advances in our understanding. This short review takes a look at the progression of this seemingly eternal question by categorizing the development of the nature versus nurture question into three main stages. While such a partitioning is somewhat oversimplified with regard to what the various positions on this issue have been at different times, it does illustrate the way that the debate has gradually evolved.
 

Part 1: Nature versus nurture

The origins of the nature versus nurture debate date back far beyond the past 50 years. The ancient Greek philosopher Galen postulated that personality traits were driven by the relative concentrations of four bodily fluids or “humours.” In 1874, Sir Francis Galton published “English Men of Science: Their Nature and Nurture,” in which he advanced his ideas about the dominance of hereditary factors in intelligence and character at the beginning of the eugenics movement.1 These ideas were in stark opposition to the perspective of earlier scholars, such as the philosopher John Locke, who popularized the theory that children are born a “blank slate” and from there develop their traits and intellectual abilities through their environment and experiences.

lvcandy/Thinkstock
Fifty years ago, some of the same arguments were being heard and supported by early research. The behaviorism movement, started by people such as John Watson, PhD, was a prominent force at that time, with notable psychologists such as B.F. Skinner, PhD, showing evidence in many experiments with both animals and people regarding the importance of rewards and punishments in shaping behavior.

The other primary school of thought in the mid-1960s was psychoanalysis, which was based on the ideas of Sigmund Freud, MD. Psychoanalysis maintains that the way that unconscious sexual and aggressive drives were channeled through various defense mechanisms was of primary importance to the understanding of both psychopathology and typical human behavior.

While these two perspectives were often very much in opposition to each other, they shared in common the view that the environment and a person’s individual experiences, i.e. nurture, were the prevailing forces in development. In the background, more biologically oriented research and clinical work was slowly beginning to work its way into the field, especially at certain institutions, such as Washington University in St. Louis. Several medications of various types were then available, including chlorpromazine, imipramine, and diazepam.

Overall, however, it is probably fair to say that, 50 years ago, it was the nurture perspective that held the most sway since psychodynamic treatment and behaviorist research dominated, while the emerging fields of genetics and neuroscience were only beginning to take hold.
 

Part 2: Nature and nurture

From the 1970s to the end of the 20th century, a noticeable shift occurred as knowledge of the brain and genetics – supported by remarkable advances in research techniques – began to swing the pendulum back toward an increased appreciation of nature as a critical influence on a person’s thoughts, feelings, and behavior.

Researchers Stella Chess, MD, and Alexander Thomas, MD, for example, conducted the New York Longitudinal Study, in which they closely observed a group of young children over many years. Their studies compelled them to argue for the significance of more innate temperament traits as critical aspects of a youth’s overall adjustment.2 The Human Genome Project was launched in 1990, and the entire decade was designated as the “Decade of the Brain.” During this time, neuroscience research exploded as techniques, such as MRI and PET, allowed scientists to view the living brain like never before.

The type of research investigation that perhaps was most directly relevant to the nature-nurture debate and that became quite popular during this time was the twin study. By comparing the relative similarities among monozygotic and dizygotic twins raised in the same household, it became possible to calculate directly the degree to which a variable of interest (intelligence, height, aggressive behavior) could be attributed to genetic versus environmental factors. When it came to behavioral variables, a repeated finding that emerged was that both genetic and environmental influences are important, often at close to a 50/50 split in terms of magnitude.3,4 These studies were complemented by molecular genetic studies, which were beginning to be able to identify specific genes that conveyed usually small amounts of risk for a wide range of psychiatric disorders.

Yet, while twin studies and many other lines of research made it increasingly difficult to argue for the overwhelming supremacy of either nature or nurture, the two domains generally were treated as being independent of each other. Specific traits or symptoms in an individual often were thought of as being the result of either psychological (nurture) or biological (nature) causes. Terms such as “endogenous depression,” for example, were used to distinguish those who had symptoms that were thought generally to be out of reach for “psychological” treatments, such as psychotherapy. Looking back, it might be fair to say that one of the principle flaws in this perspective was the commonly held belief that, if something was brain based or biological, then it therefore implied a kind of automatic “wiring” of the brain that was generally driven by genes and beyond the influence of environmental factors.
 

 

 

Part 3: Nature is nurture (and vice versa)

As the science progressed, it became increasingly clear that the nature and nurture domains were hopelessly intertwined with one another. From early PET-scan studies showing that both medications and psychotherapy not only changed the brain but also did so in ways similar to behavioral-genetic studies showing how genetically influenced behaviors actually cause certain environmental events to be more likely to occur, research continued to demonstrate the bidirectional influences of genetic and environmental factors on development.5,6 This appreciation rose to even greater heights with advances in the field of epigenetics, which was able to document some of the specific mechanisms through which environmental factors cause genes involved in regulating the plasticity of the brain to turn on and off.7

Dr. David C. Rettew
Given this modern understanding, the question of nature versus nurture ceases even to make sense in many ways. As an example, consider the developmental pathway a 10-year-old boy might have taken before eventually presenting to his pediatrician for severe general and social anxiety. He may have inherited a genetically based temperamental predisposition to being anxious from his parents, who also may struggle with anxiety. Then, those predispositions easily can evoke responses from his parents and teachers to shield and perhaps overprotect him, thereby limiting his opportunities to overcome his anxiety further. He selects friends and activities that match his more inhibited temperament. All of these environmental effects – some of which have been triggered by the boy’s genes – result in real changes with regard to his brain structure and epigenetic modifications, with the end result being an anxious child whose stress pathways in the brain have been reinforced while the circuits involved in emotional regulation are not as structurally or functionally strong as they otherwise would be.

In thinking through some of this complexity, however, it is important to remember the hopeful message that is contained in this rich understanding. All of these complicated, interacting genetic and environmental factors give us many avenues for positive intervention. Now we understand that not only might a medication help strengthen some of the brain connections needed to reduce and cope with that child’s anxiety, but so could mindfulness, exercise, and addressing his parents’ symptoms. When the families ask me whether their child’s struggles are behavioral or psychological, the answer I tend to give them is “yes.”
 

Dr. Rettew is a child and adolescent psychiatrist and associate professor of psychiatry and pediatrics at the University of Vermont, Burlington.

Email him at [email protected]. Follow him on Twitter @pedipsych.

References

1. “English Men of Science: Their Nature and Nurture” (London: MacMillan & Co., 1874)

2. “Temperament: Theory and Practice” (New York: Brunner/Mazel, 1996)

3. “Nature and Nurture during Infancy and Early Childhood” (New York: Cambridge University Press, 1988)

4. Nat Genet. 2015;47(7):702-9.

5. Arch Gen Psychiatry. 1992;49(9):681-9.

6. Dev Psychopathol. 1997 Spring;9(2):335-64.

7. JAMA Psychiatry. 2017;74(6):551-2.

 

Is human behavior driven by innate biological forces or it is the product of our learning and the environment? This basic question has been debated at settings ranging from scientific conferences to dinner tables for many decades. The media also has covered it in forms ranging from documentaries to the popular comedy movie “Trading Places” (1983). Yet, despite so much attention and so much research devoted to resolving this timeless debate, the arguments continue to this day.

A lack of a clear answer, however, by no means implies that we have not made major advances in our understanding. This short review takes a look at the progression of this seemingly eternal question by categorizing the development of the nature versus nurture question into three main stages. While such a partitioning is somewhat oversimplified with regard to what the various positions on this issue have been at different times, it does illustrate the way that the debate has gradually evolved.
 

Part 1: Nature versus nurture

The origins of the nature versus nurture debate date back far beyond the past 50 years. The ancient Greek philosopher Galen postulated that personality traits were driven by the relative concentrations of four bodily fluids or “humours.” In 1874, Sir Francis Galton published “English Men of Science: Their Nature and Nurture,” in which he advanced his ideas about the dominance of hereditary factors in intelligence and character at the beginning of the eugenics movement.1 These ideas were in stark opposition to the perspective of earlier scholars, such as the philosopher John Locke, who popularized the theory that children are born a “blank slate” and from there develop their traits and intellectual abilities through their environment and experiences.

lvcandy/Thinkstock
Fifty years ago, some of the same arguments were being heard and supported by early research. The behaviorism movement, started by people such as John Watson, PhD, was a prominent force at that time, with notable psychologists such as B.F. Skinner, PhD, showing evidence in many experiments with both animals and people regarding the importance of rewards and punishments in shaping behavior.

The other primary school of thought in the mid-1960s was psychoanalysis, which was based on the ideas of Sigmund Freud, MD. Psychoanalysis maintains that the way that unconscious sexual and aggressive drives were channeled through various defense mechanisms was of primary importance to the understanding of both psychopathology and typical human behavior.

While these two perspectives were often very much in opposition to each other, they shared in common the view that the environment and a person’s individual experiences, i.e. nurture, were the prevailing forces in development. In the background, more biologically oriented research and clinical work was slowly beginning to work its way into the field, especially at certain institutions, such as Washington University in St. Louis. Several medications of various types were then available, including chlorpromazine, imipramine, and diazepam.

Overall, however, it is probably fair to say that, 50 years ago, it was the nurture perspective that held the most sway since psychodynamic treatment and behaviorist research dominated, while the emerging fields of genetics and neuroscience were only beginning to take hold.
 

Part 2: Nature and nurture

From the 1970s to the end of the 20th century, a noticeable shift occurred as knowledge of the brain and genetics – supported by remarkable advances in research techniques – began to swing the pendulum back toward an increased appreciation of nature as a critical influence on a person’s thoughts, feelings, and behavior.

Researchers Stella Chess, MD, and Alexander Thomas, MD, for example, conducted the New York Longitudinal Study, in which they closely observed a group of young children over many years. Their studies compelled them to argue for the significance of more innate temperament traits as critical aspects of a youth’s overall adjustment.2 The Human Genome Project was launched in 1990, and the entire decade was designated as the “Decade of the Brain.” During this time, neuroscience research exploded as techniques, such as MRI and PET, allowed scientists to view the living brain like never before.

The type of research investigation that perhaps was most directly relevant to the nature-nurture debate and that became quite popular during this time was the twin study. By comparing the relative similarities among monozygotic and dizygotic twins raised in the same household, it became possible to calculate directly the degree to which a variable of interest (intelligence, height, aggressive behavior) could be attributed to genetic versus environmental factors. When it came to behavioral variables, a repeated finding that emerged was that both genetic and environmental influences are important, often at close to a 50/50 split in terms of magnitude.3,4 These studies were complemented by molecular genetic studies, which were beginning to be able to identify specific genes that conveyed usually small amounts of risk for a wide range of psychiatric disorders.

Yet, while twin studies and many other lines of research made it increasingly difficult to argue for the overwhelming supremacy of either nature or nurture, the two domains generally were treated as independent of each other. Specific traits or symptoms in an individual often were thought of as the result of either psychological (nurture) or biological (nature) causes. Terms such as “endogenous depression,” for example, were used to distinguish patients whose symptoms were thought generally to be out of reach for “psychological” treatments, such as psychotherapy. Looking back, it might be fair to say that one of the principal flaws in this perspective was the commonly held belief that, if something was brain based or biological, it implied a kind of automatic “wiring” of the brain, driven largely by genes and beyond the influence of environmental factors.

Part 3: Nature is nurture (and vice versa)

As the science progressed, it became increasingly clear that the nature and nurture domains were hopelessly intertwined with one another. From early PET-scan studies showing that both medications and psychotherapy not only changed the brain but did so in similar ways, to behavioral-genetic studies showing how genetically influenced behaviors actually make certain environmental events more likely to occur, research continued to demonstrate the bidirectional influences of genetic and environmental factors on development.5,6 This appreciation rose to even greater heights with advances in the field of epigenetics, which documented some of the specific mechanisms through which environmental factors turn on and off genes involved in regulating the plasticity of the brain.7

Given this modern understanding, the question of nature versus nurture ceases to make sense in many ways. As an example, consider the developmental pathway a 10-year-old boy might take before eventually presenting to his pediatrician with severe general and social anxiety. He may have inherited a genetically based temperamental predisposition to anxiety from his parents, who also may struggle with anxiety. Those predispositions can easily evoke responses from his parents and teachers to shield and perhaps overprotect him, thereby further limiting his opportunities to overcome his anxiety. He selects friends and activities that match his more inhibited temperament. All of these environmental effects – some of which were triggered by the boy’s own genes – result in real changes in his brain structure and epigenetic modifications, with the end result being an anxious child whose stress pathways in the brain have been reinforced while the circuits involved in emotional regulation are not as structurally or functionally strong as they otherwise would be.

In thinking through some of this complexity, however, it is important to remember the hopeful message contained in this rich understanding: all of these complicated, interacting genetic and environmental factors give us many avenues for positive intervention. We now understand that not only might a medication help strengthen some of the brain connections needed to reduce and cope with that child’s anxiety, but so could mindfulness, exercise, and addressing his parents’ symptoms. When families ask me whether their child’s struggles are biological or psychological, the answer I tend to give them is “yes.”

Dr. Rettew is a child and adolescent psychiatrist and associate professor of psychiatry and pediatrics at the University of Vermont, Burlington.

Email him at [email protected]. Follow him on Twitter @pedipsych.

References

1. “English Men of Science: Their Nature and Nurture” (London: MacMillan & Co., 1874)

2. “Temperament: Theory and Practice” (New York: Brunner/Mazel, 1996)

3. “Nature and Nurture during Infancy and Early Childhood” (New York: Cambridge University Press, 1988)

4. Nat Genet. 2015;47(7):702-9.

5. Arch Gen Psychiatry. 1992;49(9):681-9.

6. Dev Psychopathol. 1997 Spring;9(2):335-64.

7. JAMA Psychiatry. 2017;74(6):551-2.
