Trigeminal Neuralgia Surgical Options Compared

WASHINGTON – New evidence on the effectiveness of surgical options for trigeminal neuralgia points to greater safety and longer pain-free intervals when glycerol rhizotomy is used instead of balloon microcompression in patients with multiple sclerosis, as well as when the rhizotomy procedure is repeated, according to two studies reported at the meeting.

Rhizotomy vs. Microcompression

Trigeminal neuralgia occurs in about 2%-4% of patients with MS, said Dr. Grant W. Mallory, a fellow in neurological surgery at the Mayo Clinic, Rochester, Minn. Pain in these patients is more often bilateral than unilateral, and presents at a younger age than in the general population. While the majority of patients respond well to medical therapy with carbamazepine, some do require a surgical approach, he said.

Dr. Mallory and his mentor, Dr. Bruce Pollock, reviewed surgical outcomes and follow-up in 69 MS patients who underwent balloon microcompression and 68 who underwent glycerol rhizotomy during 1997-2010. The patients' mean age was 62 years. Their mean pain duration at baseline varied significantly, between 16 months in those undergoing microcompression and 55 months in those undergoing rhizotomy. Most patients had already undergone a prior procedure (87% of the microcompression patients and 48% of the rhizotomy patients).

The investigators defined an excellent outcome as being pain free and tapered off medications. A good outcome was freedom from pain with continued medication; a failed procedure was no pain relief, or pain recurrence within 1 month.
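These categorical definitions amount to a simple decision rule. As a minimal sketch (the study reported the categories only narratively; the function and field names here are illustrative, not the investigators' own):

```python
def classify_outcome(pain_free: bool, off_medications: bool,
                     recurrence_within_1_month: bool) -> str:
    """Classify a surgical outcome per the investigators' stated definitions.

    Hypothetical helper for illustration only.
    """
    if not pain_free or recurrence_within_1_month:
        return "failed"      # no pain relief, or recurrence within 1 month
    if off_medications:
        return "excellent"   # pain free and tapered off medications
    return "good"            # pain free but still on medication
```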

There was a nonsignificant trend for more patients to report excellent or good pain relief after rhizotomy than after microcompression (74% vs. 65%).

Overall, pain recurred in 86% of patients after a median time of 6 months; the overall median follow-up time was 13 months. Pain recurred significantly later after rhizotomy (mean of 7 months) than after balloon microcompression (5 months). This contributed to a significant difference in the percentage of patients who needed further surgical procedures: 36% with rhizotomy vs. 44% with microcompression. Patients undergoing their first procedure had a longer time to pain recurrence than did those who had a prior intervention (10 months vs. 4 months).

“Comparing our series [of MS patients] to historical controls in patients without MS, our cohort fared substantially worse with regard to initial response rate (69% vs. 90%),” Dr. Mallory said. Historical control patients also had a pain-free interval that was about twice that of patients with MS, he said.

Complications occurred significantly more often in patients treated with balloon microcompression than with rhizotomy (17% vs. 5%).

Although the study was not a randomized trial, Dr. Mallory said his mentor prefers the glycerol rhizotomy as a first-line invasive intervention in MS patients because of its low complication rate. If the patient is unable to tolerate the procedure under a local anesthetic, the surgeon may convert to a balloon microcompression under general anesthesia, Dr. Mallory said.

Benefits of Repeat Rhizotomy

Glycerol rhizotomy conferred greater benefit when it was repeated in patients with trigeminal neuralgia than when it was performed for the first time, according to a review of 547 patients conducted by Matthew Bender and his colleagues at Johns Hopkins University Medical Center, Baltimore.

Mr. Bender, a fourth-year medical student, reported that during 1998-2010, patients at the center who had undergone a prior rhizotomy were 21% less likely to have a failed procedure than were those who underwent an initial rhizotomy. These patients underwent 647 glycerol rhizotomies, including 504 first-time procedures.

The patients' mean age was 64 years, and their mean duration of trigeminal neuralgia was 7 years.

Ten patients in the repeat group and 13 in the initial group had previously undergone radiofrequency thermocoagulation. Other prior procedures included intracranial stereotactic radiosurgery (11 in the initial group and 1 in the repeat group), balloon microcompression (2 in the initial group), and microvascular decompression (19 in the initial group and 10 in the repeat group).

Rhizotomy led to at least 90% pain relief without medications for a significantly greater percentage of patients who had the procedure for a second time, compared with those who had it for the first time (74% vs. 65%). In 12% of the patients in each group there was at least 50% pain relief with minimal medications. Up to 50% pain relief with no medication change occurred in 10% of the initial group and 3% of the repeat group, whereas no pain relief occurred in 13% of the initial group and 10% of the repeat group.

The overall median time to treatment failure did not differ significantly between the first-time and repeat treatment groups (27 months vs. 19 months, respectively).

“These outcomes challenge the consensus that repeat glycerol rhizotomy has decreased pain relief and durability,” Mr. Bender said. “Not only can this be used as our first-line percutaneous procedure, it can be repeated before resorting to salvage modalities.”

Neither Dr. Mallory nor Mr. Bender had any financial declarations.

Headache Sidelines Two-Thirds of Soldiers

Major Finding: Some 66% of soldiers who were evacuated from war theaters for headache were never able to return to active duty on the front.

Data Source: A retrospective review of almost 1,000 soldiers in Operations Iraqi Freedom and Enduring Freedom who were taken off the battlefield with a primary diagnosis of headache.

Disclosures: The study was funded by a grant from the John P. Murtha Neuroscience and Pain Institute, the U.S. Army, and the Army Regional Anesthesia and Pain Medicine Initiative. Dr. Cohen had no financial declarations.

Soldiers evacuated from current war zones with a headache diagnosis are unlikely to return to duty, a new retrospective study has found.

Only about a third of these soldiers were able to return to duty, even after receiving treatment, Dr. Steven P. Cohen and his colleagues reported (Cephalalgia 2011 Oct. 12 [doi:10.1177/0333102411422382]).

Headaches account for a significant burden in units and for health care providers deployed to combat zones, wrote Dr. Cohen of Johns Hopkins University, Baltimore, and the Uniformed Services University of the Health Sciences, Bethesda, Md. “The overall [return-to-duty] rate of 33.6% is one of the lowest among all injury types, and to some degree reflects the observation that a large percentage of headaches were incurred during combat operations.”

Throughout history, most war casualties haven't been battle related, Dr. Cohen said in an interview. “Since World War I, nonbattle injuries have been by far the No. 1 reason a soldier is evacuated from the field.” Dr. Cohen is a colonel in the U.S. Army Reserve and director of pain research at the Walter Reed National Military Medical Center.

“In the earlier wars, it was respiratory and infectious disease. In these more modern conflicts, the No. 1 reason for evacuation is musculoskeletal injury, followed by psychological and neurological problems – and all of these can involve headache.”

Headache is the most common neurologic symptom in the world, he said, with some studies claiming that up to 70% of people are affected. But recent studies suggest that the headache burden among soldiers deployed in the current wars may be even larger. In addition to risking combat injury, soldiers are exposed to constantly high stress levels, and the combination is a perfect recipe for severe headaches.

“There are incredible psychosocial stressors involved in being deployed,” he said. “In addition to the daily possibility of being injured or killed, soldiers worry about family separation and about their colleagues who serve along with them. And this is happening in young people in whom sophisticated coping mechanisms have not yet been developed.”

To understand how headache might affect the strength and stability of military units, Dr. Cohen and his coauthors reviewed the records of 985 soldiers who had been evacuated from the wars during 2004-2009 with a primary diagnosis of headache.

Headache diagnoses fell into seven categories: postconcussive (33%); tension type (11%); migraine (30%); cervicogenic (9%); occipital neuralgia (5%); cluster (2%); and “other,” a category that included tumor, vascular pathology, psychogenic headache, substance abuse, and cerebrovascular events presenting as headache.

The soldiers' mean age was 30 years; most (88%) were men.

Almost half of the headaches (48%) were related to physical trauma; 3% were deemed psychological or emotional, 3% environmental or infectious, and the remainder were of other or unknown etiologies. In all, 22% of the soldiers reported a prior history of headache.

Headaches were deemed to be battle related if they were sustained in a combat operation (31%). Another 62% were not related to combat, and data were unavailable for the remainder.

Episodic headache was most common (52%); 39% had constant headache. The authors did not find frequency data for 9% of the group.

Once evacuated, treatment varied widely among the group. Nonsteroidal anti-inflammatory drugs were the most commonly used medications (77%), followed by antidepressants (64%), opioids (34%), anticonvulsants (29%), and triptans (27%).

Other medical treatment included beta-blockers (11%) and calcium channel blockers (2%). Many soldiers (36%) were on multiple therapies, and 9% received injections or nerve blocks. Another 7% received an alternative medicine treatment and 4% received no treatment.

In multivariate regression analyses of the factors associated with return to duty, the investigators controlled for age; sex; military branch; headache diagnosis and etiology; treatment; psychiatric and brain injury history; family and personal headache and pain history; and smoking.

Headache type was significantly associated with return to duty. A diagnosis of occipital neuralgia lowered the odds of returning to duty by 80%, compared with tension-type headache, the reference diagnosis.

Other diagnoses that lowered the odds of a soldier's returning to duty included postconcussive headache (by 67%), cervicogenic headache (by 60%), and coexisting traumatic brain injury (by 50%).
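An “X% lower odds” figure is an odds ratio, not a difference in percentage points. A minimal sketch of the arithmetic, using the overall 33.6% return-to-duty rate as a purely illustrative baseline (the study reports adjusted odds reductions relative to specific reference groups, not the probabilities used here):

```python
def apply_odds_reduction(baseline_prob: float, odds_reduction: float) -> float:
    """Probability implied by reducing the baseline odds by a given fraction.

    'Lowered the odds by 80%' corresponds to an odds ratio of 0.2:
    new_odds = baseline_odds * (1 - odds_reduction).
    Illustrative arithmetic only.
    """
    baseline_odds = baseline_prob / (1 - baseline_prob)
    new_odds = baseline_odds * (1 - odds_reduction)
    return new_odds / (1 + new_odds)

# With a hypothetical 33.6% baseline, an 80% odds reduction implies:
# apply_odds_reduction(0.336, 0.80)  -> about 0.092 (roughly 9%)
```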

The method of treatment also affected a soldier's return to duty. Compared with no treatment, the odds of returning to duty were significantly lowered by 59% with opioids and by 74% with beta-blockers.

Because the study examined only cases with a primary diagnosis of headache, it probably understates the disorder's true impact, Dr. Cohen said. Posttraumatic stress disorder, musculoskeletal injuries, concussions, and motor vehicle accidents – all very common wartime injuries – can cause chronic headaches.

Whether soldiers return to the battlefield, or are kept on active duty and treated outside the war theater, the cost to the military is high, he said.

“We now know that two-thirds of those who leave with headache don't come back, and even if they do, they may have limitations. They might not be able to go on foot patrol or wear their Kevlar gear – but they are still using resources. And for every soldier who is evacuated, the unit goes short and someone else has to be deployed.”

Headache treatment can last for months, he added, tying up military medical centers during active duty and after discharge.

“They continue utilizing medical and military resources the whole time they are being treated, and this costs America a huge amount of money,” he said. “Even if all our troops would pull out tomorrow, we will be paying for this for the rest of our lives, as will the soldiers who are injured.”

Headache has replaced respiratory and infectious disease as a primary reason for evacuation, said Dr. Steven P. Cohen.

Source: Courtesy Keith Weller

Article PDF
Author and Disclosure Information

Publications
Topics
Author and Disclosure Information

Author and Disclosure Information

Article PDF
Article PDF

Major Finding: Some 66% of soldiers who were evacuated from war theaters for headache were never able to return to active duty on the front.

Data Source: A retrospective review of almost 1,000 soldiers in Operations Iraqi Freedom and Enduring Freedom who were taken off the battlefield with a primary diagnosis of headache.

Disclosures: The study was funded by a grant from the John P. Murtha Neuroscience and Pain Institute, the U.S. Army, and the Army Regional Anesthesia and Pain Medicine Initiative. Dr. Cohen had no financial declarations.

Soldiers evacuated from current war zones with a headache diagnosis are unlikely to return to duty, a new retrospective study has found.

About a third of these soldiers were able to return to duty, even after receiving treatment, Dr. Steven P. Cohen and his colleagues reported (Cephalalgia 2011 Oct. 12 [doi:10.1177/0333102411422382]).

Headaches account for a significant burden in units and for health care providers deployed to combat zones, wrote Dr. Cohen of Johns Hopkins University, Baltimore, and the Uniformed Services University of the Health Sciences, Bethesda, Md. “The overall [return-to-duty] rate of 33.6% is one of the lowest among all injury types, and to some degree reflects the observation that a large percentage of headaches were incurred during combat operations.”

Throughout history, most war casualties haven't been battle related, Dr. Cohen said in an interview. “Since World War I, nonbattle injuries have been by far the No. 1 reason a soldier is evacuated from the field.” Dr. Cohen is a colonel in the U.S. Army Reserve and director of pain research at the Walter Reed National Military Medical Center.

“In the earlier wars, it was respiratory and infectious disease. In these more modern conflicts, the No. 1 reason for evacuation is musculoskeletal injury, followed by psychological and neurological problems – and all of these can involve headache.”

Headache is the most common neurologic symptom in the world, he said, with some studies claiming that up to 70% of people are affected. But recent studies of soldiers deployed in the current wars suggest that the headache burden among recently deployed soldiers may be even larger. In addition to risking a combat injury, soldiers are exposed to constantly high stress levels. The combination is a perfect recipe for severe headaches.

“There are incredible psychosocial stressors involved in being deployed,” he said. “In addition to the daily possibility of being injured or killed, soldiers worry about family separation and about their colleagues who serve along with them. And this is happening in young people in whom sophisticated coping mechanisms have not yet been developed.”

To understand how headache might affect the strength and stability of military units, Dr. Cohen and his coauthors reviewed the records of 985 soldiers who had been evacuated from the wars during 2004-2009 with a primary diagnosis of headache.

Headache diagnoses fell into seven categories: postconcussive (33%); tension type (11%); migraine (30%); cervicogenic (9%); occipital neuralgia (5%); cluster (2%), and “other,” a category that included tumor, vascular pathology, psychogenic headache, substance abuse, and cerebrovascular events presenting as headache.

The soldiers' mean age was 30 years; most (88%) were men.

Almost half of the headaches (48%) were related to physical trauma; 3% were deemed psychological or emotional, 3% as environmental or infectious, and the remainder were of other etiologies or unknown. In all, 22% of the soldiers reported a prior history of headache.

Headaches were deemed to be battle related if they were sustained in a combat operation (31%). Another 62% were not related to combat, and data were unavailable for the remainder.

Episodic headache was most common (52%); 39% had constant headache. The authors did not find frequency data for 9% of the group.

Once evacuated, treatment varied widely among the group. Nonsteroidal anti-inflammatory drugs were the most commonly used medications (77%), followed by antidepressants (64%), opioids (34%), anticonvulsants (29%), and triptans (27%).

Other medical treatment included beta-blockers (11%) and calcium channel blockers (2%). Many soldiers (36%) were on multiple therapies, and 9% received injections or nerve blocks. Another 7% received an alternative medicine treatment and 4% received no treatment.

In multivariate regression analyses of the factors associated with return to duty, the investigators controlled for age; sex; military branch; headache diagnosis and etiology; treatment; psychiatric and brain injury history; family and personal headache and pain history; and smoking.

Headache type was significantly associated with return to duty. A diagnosis of occipital neuralgia lowered the odds of returning to duty by 80%, compared with tension-type headache, the reference diagnosis.

Other diagnoses that lowered the odds of a soldier's returning to duty included postconcussive headache (by 67%), cervicogenic headache (by 60%), and coexisting traumatic brain injury (by 50%).

 

 

The method of treatment also affected a soldier's return to duty. Compared with no treatment, the odds of returning to duty were significantly lowered by 59% with opioids and by 74% with beta-blockers.

Because the study examined only cases with a primary diagnosis of headache, it probably understates the disorder's true impact, Dr. Cohen said. Posttraumatic stress disorder, musculoskeletal injuries, concussions, and motor vehicle accidents – all very common wartime injuries – can cause chronic headaches.

Whether soldiers return to the battlefield, or are kept on active duty and treated outside the war theater, the cost to the military is high, he said.

“We now know that two-thirds of those who leave with headache don't come back, and even if they do, they may have limitations. They might not be able to go on foot patrol or wear their Kevlar gear – but they are still using resources. And for every soldier who is evacuated, the unit goes short and someone else has to be deployed.”

Headache treatment can last for months, he added, tying up military medical centers during active duty and after discharge.

“They continue utilizing medical and military resources the whole time they are being treated, and this costs America a huge amount of money,” he said. “Even if all our troops would pull out tomorrow, we will be paying for this for the rest of our lives, as will the soldiers who are injured.”

Headache has replaced respiratory and infectious disease as a primary reason for evacuation, said Dr. Steven P. Cohen.

Source Courtesy Keith Weller

Major Finding: Some 66% of soldiers who were evacuated from war theaters for headache were never able to return to active duty on the front.

Data Source: A retrospective review of almost 1,000 soldiers in Operations Iraqi Freedom and Enduring Freedom who were taken off the battlefield with a primary diagnosis of headache.

Disclosures: The study was funded by a grant from the John P. Murtha Neuroscience and Pain Institute, the U.S. Army, and the Army Regional Anesthesia and Pain Medicine Initiative. Dr. Cohen had no financial declarations.

Soldiers evacuated from current war zones with a headache diagnosis are unlikely to return to duty, a new retrospective study has found.

About a third of these soldiers were able to return to duty, even after receiving treatment, Dr. Steven P. Cohen and his colleagues reported (Cephalalgia 2011 Oct. 12 [doi:10.1177/0333102411422382]).

Headaches account for a significant burden in units and for health care providers deployed to combat zones, wrote Dr. Cohen of Johns Hopkins University, Baltimore, and the Uniformed Services University of the Health Sciences, Bethesda, Md. “The overall [return-to-duty] rate of 33.6% is one of the lowest among all injury types, and to some degree reflects the observation that a large percentage of headaches were incurred during combat operations.”

Throughout history, most war casualties haven't been battle related, Dr. Cohen said in an interview. “Since World War I, nonbattle injuries have been by far the No. 1 reason a soldier is evacuated from the field.” Dr. Cohen is a colonel in the U.S. Army Reserve and director of pain research at the Walter Reed National Military Medical Center.

“In the earlier wars, it was respiratory and infectious disease. In these more modern conflicts, the No. 1 reason for evacuation is musculoskeletal injury, followed by psychological and neurological problems – and all of these can involve headache.”

Headache is the most common neurologic symptom in the world, he said, with some studies claiming that up to 70% of people are affected. But recent studies of soldiers deployed in the current wars suggest that the headache burden among recently deployed soldiers may be even larger. In addition to risking a combat injury, soldiers are exposed to constantly high stress levels. The combination is a perfect recipe for severe headaches.

“There are incredible psychosocial stressors involved in being deployed,” he said. “In addition to the daily possibility of being injured or killed, soldiers worry about family separation and about their colleagues who serve along with them. And this is happening in young people in whom sophisticated coping mechanisms have not yet been developed.”

To understand how headache might affect the strength and stability of military units, Dr. Cohen and his coauthors reviewed the records of 985 soldiers who had been evacuated from the wars during 2004-2009 with a primary diagnosis of headache.

Headache diagnoses fell into seven categories: postconcussive (33%); migraine (30%); tension type (11%); cervicogenic (9%); occipital neuralgia (5%); cluster (2%); and “other,” a category that included tumor, vascular pathology, psychogenic headache, substance abuse, and cerebrovascular events presenting as headache.

The soldiers' mean age was 30 years; most (88%) were men.

Almost half of the headaches (48%) were related to physical trauma; 3% were deemed psychological or emotional in origin, 3% environmental or infectious, and the remainder were of other or unknown etiologies. In all, 22% of the soldiers reported a prior history of headache.

Headaches were deemed to be battle related if they were sustained in a combat operation (31%). Another 62% were not related to combat, and data were unavailable for the remainder.

Episodic headache was most common (52%), and 39% of the soldiers had constant headache; frequency data were unavailable for the remaining 9%.

Treatment after evacuation varied widely. Nonsteroidal anti-inflammatory drugs were the most commonly used medications (77%), followed by antidepressants (64%), opioids (34%), anticonvulsants (29%), and triptans (27%).

Other medical treatment included beta-blockers (11%) and calcium channel blockers (2%). Many soldiers (36%) were on multiple therapies, and 9% received injections or nerve blocks. Another 7% received an alternative medicine treatment and 4% received no treatment.

In multivariate regression analyses of the factors associated with return to duty, the investigators controlled for age; sex; military branch; headache diagnosis and etiology; treatment; psychiatric and brain injury history; family and personal headache and pain history; and smoking.

Headache type was significantly associated with return to duty. A diagnosis of occipital neuralgia lowered the odds of returning to duty by 80%, compared with tension-type headache, the reference diagnosis.

Other diagnoses that lowered the odds of a soldier's returning to duty included postconcussive headache (by 67%), cervicogenic headache (by 60%), and coexisting traumatic brain injury (by 50%).


The method of treatment also affected a soldier's return to duty. Compared with no treatment, the odds of returning to duty were significantly lowered by 59% with opioids and by 74% with beta-blockers.

Because the study examined only cases with a primary diagnosis of headache, it probably understates the disorder's true impact, Dr. Cohen said. Posttraumatic stress disorder, musculoskeletal injuries, concussions, and motor vehicle accidents – all very common wartime injuries – can cause chronic headaches.

Whether soldiers return to the battlefield, or are kept on active duty and treated outside the war theater, the cost to the military is high, he said.

“We now know that two-thirds of those who leave with headache don't come back, and even if they do, they may have limitations. They might not be able to go on foot patrol or wear their Kevlar gear – but they are still using resources. And for every soldier who is evacuated, the unit goes short and someone else has to be deployed.”

Headache treatment can last for months, he added, tying up military medical centers during active duty and after discharge.

“They continue utilizing medical and military resources the whole time they are being treated, and this costs America a huge amount of money,” he said. “Even if all our troops would pull out tomorrow, we will be paying for this for the rest of our lives, as will the soldiers who are injured.”

Headache has replaced respiratory and infectious disease as a primary reason for evacuation, said Dr. Steven P. Cohen.


Headache Sidelines Two-Thirds of Soldiers

Iron for Childhood Anemia: It's Elemental

BOSTON – Popeye’s prescription won’t restore iron levels in a child with frank anemia.

By the time iron-deficiency anemia has developed, no amount of spinach – or any other iron-rich food – can bring iron levels back up to normal, at least not without medical iron therapy.

"You can give a child three steaks a day and you won’t be able to get enough iron in to normalize the level," Dr. George Buchanan said at the annual meeting of the American Academy of Pediatrics. "Iron-rich foods can prevent iron deficiency, but they cannot treat it."

The reason lies in the body’s tightly controlled iron homeostasis. Just 10% of dietary iron is absorbed, and that amount varies widely because bioavailability differs among foods.

Breast milk, while relatively low in iron, has a very high bioavailability of 50%, said Dr. Buchanan, professor of pediatrics at the University of Texas Southwestern Medical Center, Dallas. Cow’s milk contains the same amount of iron – about 1 mg/L – but humans absorb very little of it.

So while cow’s milk is a perfect food for little bovines, it’s an imperfect food for little humans. "There’s not much iron in it, it’s poorly absorbed, and when the child’s stomach is full of cow’s milk, the appetite for other foods is not good. It’s not like we want to ban it – it’s a good source of vitamin D and calcium – but 24 ounces a day is all a toddler needs."

Overreliance on cow’s milk is only one possible contributor to childhood anemia, Dr. Buchanan said. Occult bleeding from the gastrointestinal tract, esophagus, lungs, or kidneys can be just enough to tip an infant, especially a preemie, into anemia. Nosebleeds and menorrhagia contribute to anemia in older children, as can sports activity. "Teens are in a stage of rapid growth and often poor diet, combined with an increase in physical activity," Dr. Buchanan said. Jogging or other vigorous activity can cause just enough minor intestinal trauma to leach away precious hemoglobin.

Genetic diseases also can underlie anemia. Pediatricians are familiar with thalassemia, but might not know about the recently described iron-refractory iron-deficiency anemia (IRIDA).

Children with a mutation in the TMPRSS6 gene produce too much hepcidin, a hormone that regulates how much iron enters the circulation. Hepcidin normally protects the body from absorbing too much iron; an excess blocks the release of iron from intestinal and storage cells, keeping it locked in storage and unavailable for red cell production.

Proton pump inhibitors and H2 receptor antagonists can also interfere with iron absorption.

No matter what the cause, frank anemia is the last stage of iron depletion: The process starts when stored iron isn’t replaced, progresses to iron-deficient erythropoiesis, and ends in iron-deficiency anemia.

Various tests can identify each stage of the disorder, “but it may not be feasible or practical to do all of these,” Dr. Buchanan said. The iron absorption test is the simplest way to quantify the stage of iron depletion and to differentiate dietary deficiency from malabsorption syndromes.

"All of these factors can be screened for with this underutilized test," which consists of a single oral dose of 1 mg/kg iron. Serum iron is measured at baseline and 2 hours after the dosing. "The level in a normal child without iron deficiency will increase only slightly. In a child with a poor diet, it will be markedly increased. And in a child with malabsorption, the level won’t change at all."

Treatment seems simple – elemental iron in any of several forms. But each preparation has benefits and drawbacks, and each contains different amounts of iron. "Ferrous sulfate is well absorbed but has a reputation for the most side effects; but it’s the least costly. Iron polysaccharide is not as well absorbed but has fewer side effects – and costs more," he said.

Even different types of the same iron preparation can have different concentrations. For instance, ferrous sulfate drops can range from 15 mg/0.6 mL to 15 or 25 mg/mL. "It’s all over the place, so the best thing is to get to know one or two and stick with them," he said.

Getting the iron in is often more complicated than figuring out how it vanished. "It tastes bad and toddlers spit it out. Maybe parents don’t give it the way you instruct. And it’s a lengthy course of treatment: 3-4 months given twice a day, and multiple prescription refills."

Intravenous iron might someday solve the problem of compliance. Early data from a cohort of Dr. Buchanan’s patients suggest that it benefits toddlers and teens who don’t respond to oral therapy because of compliance issues. "We gave 1-7 IV doses of iron sucrose to 38 children. The toxicity was minimal, it was done in an outpatient setting, and all of them responded well," he said.


The trial prompted an ongoing prospective study of IV low-molecular-weight iron dextran, given as a single dose over 1 hour to children and teens. Data on the first 22 patients will soon be presented.

"They all responded, with only some minor reactions," Dr. Buchanan said. "This is very preliminary, and I’m not recommending it. But it may be that someday we can give a single short infusion in the office or clinic and take care of the problem right away."

Dr. Buchanan said he had no relevant financial disclosures.


EXPERT ANALYSIS FROM THE ANNUAL MEETING OF THE AMERICAN ACADEMY OF PEDIATRICS

Studies Compare Surgical Options for Trigeminal Neuralgia

WASHINGTON – New evidence on the effectiveness of surgical options for trigeminal neuralgia points to greater safety and longer pain-free intervals when glycerol rhizotomy is used instead of balloon microcompression in patients with multiple sclerosis, as well as when the rhizotomy procedure is repeated, according to two studies reported at the annual meeting of the Congress of Neurological Surgeons.

Outcomes with a third option for trigeminal neuralgia – microvascular decompression – suggest that elderly patients have more complications and higher in-hospital mortality than do younger patients.

Rhizotomy vs. Microcompression in MS

Trigeminal neuralgia occurs in about 2%-4% of patients with MS, said Dr. Grant W. Mallory, a fellow in neurological surgery at the Mayo Clinic, Rochester, Minn. Pain in these patients is more often bilateral than unilateral, and it presents at a younger age than in the general population. While the majority of patients respond well to medical therapy with carbamazepine, some do require a surgical approach, he said.

Dr. Mallory and his mentor, Dr. Bruce Pollock, reviewed surgical outcomes and follow-up in 69 MS patients who underwent balloon microcompression and 68 who underwent glycerol rhizotomy during 1997-2010. The patients’ mean age was 62 years. Their mean pain duration at baseline varied significantly, between 16 months in those undergoing microcompression and 55 months in those undergoing rhizotomy. Most patients had already undergone a prior procedure (87% of the microcompression patients and 48% of the rhizotomy patients).

The investigators defined an excellent outcome as being pain free and tapered off medications. A good outcome was freedom from pain with continued medication; a failed procedure was no pain relief, or pain recurrence within 1 month.

There was a nonsignificant trend for more patients to report excellent or good pain relief after rhizotomy than after microcompression (74% vs. 65%).

Overall, pain recurred in 86% of patients after a median time of 6 months; the overall median follow-up time was 13 months. Pain recurred significantly later after rhizotomy (mean of 7 months) than after balloon microcompression (5 months). This contributed to a significant difference in the percentage of patients who needed further surgical procedures: 36% with rhizotomy vs. 44% with microcompression. Patients undergoing their first procedure had a longer time to pain recurrence than did those who had a prior intervention (10 months vs. 4 months).

"Comparing our series [of MS patients] to historical controls in patients without MS, our cohort fared substantially worse with regard to initial response rate (69% vs. 90%)," Dr. Mallory said. Historical control patients also had a pain-free interval that was about twice that of patients with MS, he said.

Complications occurred significantly more often in patients treated with balloon microcompression than with rhizotomy (17% vs. 5%). In the microcompression group, there were two cases of corneal anesthesia, one of anesthesia dolorosa, five of jaw deviation, four of diplopia, and one each of Horner’s syndrome, dysesthesia, and thalamic hemorrhage. In the rhizotomy group, there were two cases of corneal anesthesia and one case of paresthesia.

Although the study was not a randomized trial, Dr. Mallory said his mentor prefers the glycerol rhizotomy as a first-line invasive intervention in MS patients because of its low complication rate. If the patient is unable to tolerate the procedure under a local anesthetic, the surgeon may convert to a balloon microcompression under general anesthesia, Dr. Mallory said.

Repeat Rhizotomy Better Than Expected

Glycerol rhizotomy conferred greater benefit when it was repeated in patients with trigeminal neuralgia than when it was performed for the first time, according to a review of 547 patients conducted by Matthew Bender and his colleagues at Johns Hopkins University Medical Center, Baltimore.

Mr. Bender, a fourth-year medical student, reported that during 1998-2010, patients at the center who had undergone a prior rhizotomy were 21% less likely to have a failed procedure than were those who underwent an initial rhizotomy. These patients underwent 647 glycerol rhizotomies, including 504 first-time procedures.

The patients’ mean age was 64 years, and their mean duration of trigeminal neuralgia was 7 years.

All of the repeat patients had undergone at least one other rhizotomy. Ten patients in the repeat group and 13 in the initial group had previously undergone radiofrequency thermocoagulation.

The patients also had undergone other prior procedures, including intracranial stereotactic radiosurgery (11 in the initial group and 1 in the repeat group), balloon microcompression (2 in the initial group), and microvascular decompression (19 in the initial group and 10 in the repeat group).

The investigators defined the best outcome as at least 90% pain relief with no medications. They described a good outcome as at least 50% pain relief with minimal medications. Fair was defined as up to 50% pain relief with no medication change. A poor outcome was no pain relief.


Rhizotomy resulted in the best possible outcome for a significantly greater percentage of patients who had the procedure for a second time, compared with those who had it for the first time (74% vs. 65%). Good outcomes occurred in 12% of each group. Fair outcomes occurred in 10% of the initial group and 3% of the repeat group, whereas poor outcomes occurred in 13% of the initial group and 10% of the repeat group.

The overall median time to treatment failure was similar between the first-time and repeat treatment groups (27 months vs. 19 months, respectively).

"These outcomes challenge the consensus that repeat glycerol rhizotomy has decreased pain relief and durability," Mr. Bender said. "Not only can this be used as our first-line percutaneous procedure, it can be repeated before resorting to salvage modalities."

Elderly Tolerate Decompression Less

In a third study, Dr. Anand Rughani and his colleagues at the University of Vermont, Burlington, found that elderly patients do not tolerate microvascular decompression nearly as well as younger patients, based on higher rates of in-hospital complications and mortality in the Nationwide Inpatient Sample.

Over a 10-year period in the database, Dr. Rughani and his associates found 3,273 patients who had undergone the procedure; their median age was 57 years.

The investigators examined in-hospital outcomes in patients younger than age 65 years, those who were 65-74 years old, and those who were 75 years or older.

The risk of any in-hospital complication was 7% in patients aged 65-74 years, compared with 10% in those aged 75 years or older, a significant difference. In-hospital mortality also increased significantly with age: It was 0.13% in those younger than 65 years, 0.68% in those aged 65-74 years, and about 1.5% in those aged 75 years or older.

Other complications occurred with similar frequency in patients aged 65-74 years and those aged 75 years or older: Cardiac complications occurred at a rate of 2% in both age groups; pulmonary complications in 2% and 4%, respectively; thromboembolic complications in 1% and 2%; and stroke in 3% and 4%.

"The likelihood of discharge to a nursing home or rehabilitation facility was also robustly associated with age," Dr. Rughani said. "Unfortunately, this review only captures in-hospital complications, so delayed complications like wound infections and pulmonary embolisms might not show up. This analysis could be quite an under-representation of how older patients might respond to microvascular decompression."

None of the authors of the three studies had any financial declarations.

Meeting/Event
Author and Disclosure Information

Publications
Topics
Legacy Keywords
surgical options, trigeminal neuralgia points, safety, longer pain-free intervals, glycerol rhizotomy, balloon microcompression, multiple sclerosis, the Congress of Neurological Surgeons, microvascular decompression, elderly patients, complications, Dr. Grant W. Mallory,
Author and Disclosure Information

Author and Disclosure Information

Meeting/Event
Meeting/Event

WASHINGTON – New evidence on the effectiveness of surgical options for trigeminal neuralgia points to greater safety and longer pain-free intervals when glycerol rhizotomy is used instead of balloon microcompression in patients with multiple sclerosis, as well as when the rhizotomy procedure is repeated, according to two studies reported at the annual meeting of the Congress of Neurological Surgeons.

Outcomes with a third option for trigeminal neuralgia – microvascular decompression – suggest that elderly patients have greater complications and in-hospital mortality than do younger patients.

"Comparing our series [of MS patients] to historical controls in patients without MS, our cohort fared substantially worse with regard to initial response rate (69% vs. 90%)," Dr. Mallory said. Historical control patients also had a pain-free interval that was about twice that of patients with MS, he said.

Complications occurred significantly more often in patients treated with balloon microcompression than with rhizotomy (17% vs. 5%). In the microcompression group, there were two cases of corneal anesthesia, one of anesthesia dolorosa, five jaw deviations, four diplopias, and one each of Horner’s syndrome, dysesthesia, and thalamic hemorrhage. In the rhizotomy group, there were two cases of corneal anesthesia and one case of paresthesia.

Although the study was not a randomized trial, Dr. Mallory said his mentor prefers glycerol rhizotomy as a first-line invasive intervention in MS patients because of its low complication rate. If a patient cannot tolerate the procedure under local anesthesia, the surgeon may convert to balloon microcompression under general anesthesia, he added.

Repeat Rhizotomy Better Than Expected. Glycerol rhizotomy conferred greater benefit when it was repeated in patients with trigeminal neuralgia than when it was performed for the first time, according to a review of 547 patients conducted by Matthew Bender and his colleagues at Johns Hopkins University Medical Center, Baltimore.

Mr. Bender, a fourth-year medical student, reported that during 1998-2010, patients at the center underwent 647 glycerol rhizotomies, including 504 first-time procedures. Patients who had undergone a prior rhizotomy were 21% less likely to have a failed procedure than were those who underwent an initial rhizotomy.

The patients’ mean age was 64 years, and their mean duration of trigeminal neuralgia was 7 years.

All of the repeat patients had undergone at least one prior rhizotomy. Ten patients in the repeat group and 13 in the initial group had previously undergone radiofrequency thermocoagulation.

The patients also had undergone other prior procedures, including intracranial stereotactic radiosurgery (11 in the initial group and 1 in the repeat group), balloon microcompression (2 in the initial group), and microvascular decompression (19 in the initial group and 10 in the repeat group).

The investigators defined the best outcome as at least 90% pain relief with no medications. They described a good outcome as at least 50% pain relief with minimal medications. Fair was defined as up to 50% pain relief with no medication change. A poor outcome was no pain relief.

Rhizotomy resulted in the best possible outcome for a significantly greater percentage of patients who had the procedure for a second time, compared with those who had it for the first time (74% vs. 65%). Good outcomes occurred in 12% of each group. Fair outcomes occurred in 10% of the initial group and 3% of the repeat group, whereas poor outcomes occurred in 13% of the initial group and 10% of the repeat group.

The overall median time to treatment failure did not differ significantly between the first-time and repeat treatment groups (27 months vs. 19 months, respectively).

"These outcomes challenge the consensus that repeat glycerol rhizotomy has decreased pain relief and durability," Mr. Bender said. "Not only can this be used as our first-line percutaneous procedure, it can be repeated before resorting to salvage modalities."

Elderly Tolerate Decompression Less. In a third study, Dr. Anand Rughani and his colleagues at the University of Vermont, Burlington, found that elderly patients do not tolerate microvascular decompression nearly as well as younger patients, based on a higher rate of in-hospital complications and mortality in the Nationwide Inpatient Sample.

Over a 10-year period in the database, Dr. Rughani and his associates found 3,273 patients who had undergone the procedure; their median age was 57 years.

The investigators examined in-hospital outcomes in patients younger than age 65 years, those who were 65-74 years old, and those who were 75 years or older.

The risk of any in-hospital complication was 7% in patients aged 65-74 years, compared with 10% in those aged 75 years or older, a significant difference. In-hospital mortality also significantly increased with age: It was 0.13% in patients younger than 65 years, 0.68% in those aged 65-74 years, and about 1.5% in those aged 75 years or older.

Other complications occurred with similar frequency in patients aged 65-74 years and those 75 years or older: Cardiac complications occurred at a rate of 2% in both age groups; pulmonary complications in 2% and 4%, respectively; thromboembolic complications in 1% and 2%; and stroke in 3% and 4%.

"The likelihood of discharge to a nursing home or rehabilitation facility was also robustly associated with age," Dr. Rughani said. "Unfortunately, this review only captures in-hospital complications, so delayed complications like wound infections and pulmonary embolisms might not show up. This analysis could be quite an under-representation of how older patients might respond to microvascular decompression."

None of the authors of the three studies had any relevant financial disclosures.


Display Headline
Studies Compare Surgical Options for Trigeminal Neuralgia

Article Source

FROM THE ANNUAL MEETING OF THE CONGRESS OF NEUROLOGICAL SURGEONS

AAP's New SIDS Stoppers: Cleared Cribs, No Cosleeping

Article Type
Changed
Display Headline
AAP's New SIDS Stoppers: Cleared Cribs, No Cosleeping

BOSTON – Plush, soft, fuzzy, warm, and cuddly – those seem like the perfect attributes for a newborn nursery.

Except if you’re the newborn.

A new policy from the American Academy of Pediatrics says that babies who sleep on their back on a firm, flat surface – in their own unadorned crib – are most protected from sudden infant death syndrome (SIDS) and the deadly related tragedies of suffocation, asphyxiation, and entrapment.

Michelle G. Sullivan/Elsevier Global Medical News
Dr. Rachel Moon, the primary author of a new policy on infant death, recommends that all babies sleep alone and on their backs.

The AAP released its newest guidelines for infant sleep safety and SIDS risk reduction on Oct. 18 (Pediatrics 2011 Oct. 17; doi:10.1542/peds.2011-2285). The take-home message for pediatricians and parents alike is a simple one, Dr. Rachel Moon said at a press briefing.

"Put baby on the back for every sleep. Use a firm sleep surface designed for infants, with no soft objects, wedges, positioners," or any other fashionable accoutrements such as ruffles, blankets, crib drapes, or bumper pads.

The ideal sleeping set-up? A crib, bassinet, or portable crib/play-yard in mom and dad’s room, with a firm mattress, a tight-fitting bottom sheet, and no blanket or other baby-dangerous decorative items.

Although such adornments may satisfy a parent’s fashion sense, they make no safety sense at all, said Dr. Moon, the policy’s primary author and a pediatrician at the Children’s National Medical Center, Washington.

Since 1992, when the AAP first launched its "Back to Sleep" campaign, SIDS cases in the United States have decreased by 50%. "But we’ve seen an alarming increase in other deaths," Dr. Moon said. "There has been a quadrupling of infant deaths due to suffocation and entrapment, and a lot of this is attributable to inappropriate bedding and to cosleeping" with parents.

Those deaths – grouped together as sudden unexplained infant deaths (SUID) – can’t always be distinguished from SIDS, she noted. SIDS infants probably have some vulnerability that predisposes them to an unexplained death, whether that is an inborn error of metabolism, prematurity, or exposure to cigarette smoke. SUID may occur either among those infants or among those who have no identifiable risk factors. Other than a coroner’s exam – almost universally unhelpful – there’s no way to tell these deaths apart.

The best course is to make sure that infants have the safest possible sleep accommodations. A bare crib eliminates a number of dangerous factors that can cause an accidental infant death.

The new policy also tackles the controversial subject of cosleeping. The family bed has been promoted among many circles as the most natural way to care for a newborn. Some groups – and even physicians – have suggested that cosleeping may help prevent SIDS.

There are no data to support those claims, Dr. Moon said. In fact, cosleeping can put the infant at risk of smothering under heavy covers, airway obstruction if an adult limb falls across its face, and even overheating – a recognized SIDS risk factor.

Bed sharing is even more dangerous with adults who are medicated or have consumed alcohol or drugs, Dr. Moon added. Those adults will be less aware of their movements and whether they might endanger the sleeping infant.

"There has been a quadrupling of infant deaths due to suffocation and entrapment, and a lot of this is attributable to inappropriate bedding and to cosleeping."

Parents shouldn’t worry that babies might choke on their own secretions when sleeping on their backs, Dr. Moon said. Babies have built-in protective physical guards against choking. There’s also no evidence that placing newborns on their sides helps drain amniotic fluid or other secretions from their lungs. Moms who choose rooming-in after delivery should also put their baby to sleep in the supine position and request that nurses do the same.

Preterm babies and those with low birth weights are especially at risk for SIDS, Dr. Moon said. Even infants in the neonatal intensive care unit should sleep supine as soon as they are medically stable.

The AAP policy stresses the protective influence of breastfeeding, but notes that infants who come to the adults’ bed for nighttime nursing should go back to their own crib after feeding.

"Because of the extremely high risk of SIDS and suffocation on couches and armchairs, infants should never be fed on a couch or armchair when there is a high risk that the parent might fall asleep," according to the policy’s authors.

The AAP policy gives the pacifier its proper place as well. Pacifiers seem to protect against SIDS, although the mechanism isn’t really understood, Dr. Moon said. "It seems to have something to do with stimulating arousal," as the babies suck during sleep.

But if the plug comes unplugged during the night, don’t worry, she said. "Parents don’t need to worry about putting the pacifier back in the baby’s mouth, especially if the baby doesn’t seem to want it."

But Dr. Moon warned parents to never, ever attach a pacifier to an infant’s clothing in any way, especially with a string or ribbon around the baby’s neck.

Immunizations also protect against SIDS, so it’s critical to keep babies up to date with vaccinations, she said. And adults should never smoke around infants. Infants exposed to cigarette smoke are at a significantly increased risk of unexplained infant death.

Despite all the talk of supine positioning, supervised "tummy time," in which infants are allowed to lie prone for short periods, is also important, Dr. Moon added. Tummy time is an important way for infants to develop neck, back, and arm muscles and to prevent positional plagiocephaly.



Article Source

FROM THE ANNUAL MEETING OF THE AMERICAN ACADEMY OF PEDIATRICS

Toys or TV? AAP Says It's a 'No Brainer'

Article Type
Changed
Display Headline
Toys or TV? AAP Says It's a 'No Brainer'

BOSTON – Can videos create Baby Einstein? Not likely at all. New research on babies and toddlers suggests that media screen time will never replace play time with toys or interactions with actual human beings.

The American Academy of Pediatrics is taking a strong stance on this issue, releasing a new policy statement that warns against exposing little people to the big screen.

According to the policy, released Oct. 18, TV programs and videos – even those touted as educational – are likely to do more harm than good for young children. Screen time can limit creative play time and reduce interactions with parents and other children and disrupt sleep and meal routines – all critical processes in a baby’s developmental journey, said Dr. Ari Brown, the paper’s primary author.

"The key concerns here are that infants and toddlers who get ‘screen time’ get less ‘talk time,’ " Dr. Brown said during a press briefing. "Even though parents may view videos and programs as safe, educational and entertaining, these are marketing claims," without data to back them up. "Studies have already shown that 84% of parents talk less to their babies when the television is on and that they use 74% fewer new words," a pattern that definitely affects language development, she said.

Dr. Brown, a pediatrician from Austin, Tex., stressed the paper’s take-home message: Unstructured play time is the best way to stimulate the developing brain. "When babies are engaged in unstructured free play with toys, they are learning to problem-solve, to think creatively, and develop reasoning and motor skills," she said. "Free play also teaches children how to entertain themselves, which is certainly a valuable skill."

In an achievement-driven society, parents often feel pressured to provide their child with every possible "leg up" on intellectual development. But videos don’t fit that bill – at least for babies younger than 2 years.

She cited a study in which children aged 6, 12, and 18 months watched a "Teletubbies" video both forward and backward. The younger children watched the video with the same attention whichever direction it played, showing that they made no real cognitive connection. "Only the 18-month-olds started following it more as the video went forward, paying attention to some content, and to the fades and special effects," Dr. Brown said. "Around 2 years, children may actually begin to learn from a program that has a proven educational benefit," especially if watched with an engaged adult.

Household media use also decreases reading time, the report noted. Children in households with lots of media use get an average of 25% less time reading with an adult and have a lower likelihood of being able to read, compared with children from households with lower media use.

The policy also addressed unsupervised screen time in bedrooms. By age 3 years, about a third of American children have TVs in their bedrooms, with many parents considering a bedtime video to be a calming sleep aid (Pediatrics 2011 [doi:10.1542/peds.2011-1753]).

Not so, said Dr. Brown, asserting that TVs have no place in babies’ bedrooms. "Studies have found that TV as part of [the] bedtime routine can shorten sleep duration and provoke irregular sleep cycles."

Among the new policy’s other key recommendations:

• Although the AAP discourages any media use for children younger than 2 years, parents who do use it should set firm limits and have a strategy for sticking to them.

• Instead of screen time, opt for independent play supervised by a nearby adult.

• Recognize that adult media use can have a negative impact on children. "Even if the program isn’t intended for children to watch, research has found that children playing nearby will look up from their play about three times each minute instead of focusing on their own activity, and they interact less with adults when a TV is on, perhaps because the adult’s attention is focused on the program."

While pediatricians can stress all of these points to parents, they can also offer an alternative to worried moms and dads: Don’t feel guilty about putting your child down on the floor with toys.

"Look, we all live in reality. If you want your child to learn and do well, give [her] the skill set of learning through play. Not only is it OK to put your child in a room with toys, it is a good thing. Don’t feel guilty about it. We know you can’t be with [your child] 24 hours a day, and now we know there is real value in this independent play."

FROM THE ANNUAL MEETING OF THE AMERICAN ACADEMY OF PEDIATRICS

Neuromodulation Implants Quelled Craniofacial Pain for Most Patients

Article Type
Changed
Display Headline
Neuromodulation Implants Quelled Craniofacial Pain for Most Patients

WASHINGTON – Peripheral neuromodulation effectively managed a variety of craniofacial and headache pain syndromes, with 82% of patients reporting significant pain relief up to 65 months after the surgery in a retrospective study.

While percutaneous neuromodulation surgery is by no means a primary therapy for facial pain or headache, "it is something to keep in mind for patients with intractable pain," Dr. Antonios Mammis said at the annual meeting of the Congress of Neurological Surgeons.

Dr. Mammis, a resident at the Neurological Institute of New Jersey in Newark, presented a review of 99 patients who underwent occipital or trigeminal branch stimulation for a variety of craniofacial pain syndromes. The study won the group’s Ronald Tasker Award for Research in Pain Management. He conducted the study while he was a resident at Long Island Jewish Medical Center, New Hyde Park, N.Y.

The review encompassed procedures done by a single neurosurgeon from 2004 to 2011. During the review, Dr. Mammis reclassified each patient’s symptoms according to the International Classification of Headache Disorders, Second Edition. Of the 99 patients, 74 were female. The mean age at surgery was 43 years (range, 11-68 years).

Chiari malformation type 1 was the most common classification (28 patients). This was not a surprise, since the neurosurgeon worked at a Chiari referral center.

Other pain classifications included migraine with or without aura (24), chronic posttraumatic headache attributed to mild head injury (11), occipital neuralgia (8), postcraniotomy headache (7), chronic cluster headache (5), headache secondary to ischemic stroke (5), other terminal branch neuralgias (5), cervicogenic headache (4), hemicrania continua (1), and acromegaly (1).

All patients underwent a 4- to 7-day treatment trial, with a bilateral lead placement performed under local anesthesia with intravenous sedation. The surgeon used surface anatomy and fluoroscopy to determine lead placement. During the trial, patients kept a headache diary noting frequency, duration, and severity.

After the trial period, 79 patients (80%) reported significant pain relief, which was defined as at least a 50% decrease in pain as rated on a visual analog scale. These patients went on to have a permanent neuromodulator implanted. Of these, 56 received only occipital leads, 12 received only trigeminal leads, and 11 had leads implanted on both nerve branches.

Most of the headache syndromes responded equally well to the neurostimulators. At the last follow-up, which ranged from 1 to 65 months after surgery, 65 of the 79 implanted patients (82%) were still using the stimulator and continued to report significant pain improvement.

At that time, stimulators were still being used in 15 of 18 Chiari malformation patients, 19 of 21 migraine patients, 7 of 7 occipital neuralgia patients, 6 of 7 postcraniotomy patients, 4 of 4 cluster headache patients, 1 of 3 ischemic stroke patients, 1 of 4 terminal branch neuralgia patients, 3 of 3 cervicogenic headache patients, and the one patient with hemicrania continua. The single acromegaly patient received a stimulator, but did not have long-term pain relief.

Complications arose in 10 patients. Four were lead migrations that required revision, which is "a problem that is very easily corrected," Dr. Mammis said. Six patients developed infections: three were wound erosions and three were surgical site infections without erosion. "All of these leads were explanted and revised," he said. There were no infections after revision.

Nearly a quarter of the patients (22%) asked for a cosmetic revision. "This was primarily done because they could see or feel the lead," Dr. Mammis said. "These were not infected leads, and there were no infections after these revisions."

Dr. Mammis had no financial disclosures.

FROM THE ANNUAL MEETING OF THE CONGRESS OF NEUROLOGICAL SURGEONS

Vitals

Major Finding: Up to 65 months after receiving an implanted neuromodulator, 82% of patients with craniofacial pain syndromes still reported significant pain relief.

Data Source: Retrospective study of 99 surgeries performed from 2004 to 2011.

Disclosures: Dr. Mammis had no financial disclosures.

Headaches, Not Battle Wounds, Keep Soldiers Sidelined

Article Type
Changed
Display Headline
Headaches, Not Battle Wounds, Keep Soldiers Sidelined

Soldiers evacuated from current war zones with a headache diagnosis are unlikely to return to duty, a new retrospective study has found.

Only about a third of these soldiers were able to return to duty, even after receiving treatment, Dr. Steven P. Cohen and his colleagues reported (Cephalalgia 2011 Oct. 12 [doi:10.1177/0333102411422382]).

Headaches impose a significant burden on military units and on the health care providers deployed to combat zones, wrote Dr. Cohen of both Johns Hopkins University, Baltimore, and the Uniformed Services University of the Health Sciences, Bethesda, Md. "The overall [return-to-duty] rate of 33.6% is one of the lowest among all injury types, and to some degree reflects the observation that a large percentage of headaches were incurred during combat operations."

Throughout history, most war casualties haven’t been battle related, Dr. Cohen said in an interview: "Since World War I, nonbattle injuries have been by far the No. 1 reason a soldier is evacuated from the field." Dr. Cohen is a colonel in the U.S. Army Reserve and director of pain research at the Walter Reed National Military Medical Center.

"In the earlier wars, it was respiratory and infectious disease. In these more modern conflicts, the No. 1 reason for evacuation is musculoskeletal injury, followed by psychological and neurological problems – and all of these can involve headache."

"We now know that two-thirds of [soldiers] who leave with headache don’t come back, and even if they do, they may have limitations."

Headache is the most common neurologic symptom in the world, he said, with some studies estimating that up to 70% of people are affected. But recent studies suggest that the headache burden among soldiers deployed in the current wars may be even larger: in addition to the daily risk of combat injury, these young people are exposed to constantly high stress levels, a combination that is a perfect recipe for severe headaches.

"There are incredible psychosocial stressors involved in being deployed," he said. "In addition to the daily possibility of being injured or killed, soldiers worry about family separation and about their colleagues who serve along with them. And this is happening in young people in whom sophisticated coping mechanisms have not yet been developed."

To understand how headache might affect the strength and stability of military units, Dr. Cohen and his coauthors reviewed the records of 985 soldiers who had been evacuated from the wars during 2004-2009 with a primary diagnosis of headache.

Headache diagnoses fell into seven categories: postconcussive (33%); tension type (11%); migraine (30%); cervicogenic (9%); occipital neuralgia (5%); cluster (2%); and "other," a category that included tumor, vascular pathology, psychogenic headache, substance abuse, and cerebrovascular events presenting as headache.

The soldiers’ mean age was 30 years; most (88%) were men.

Almost half of the headaches (48%) were related to physical trauma; 3% were deemed psychological or emotional, 3% environmental or infectious, and the remainder were of other or unknown etiologies. In all, 22% of the soldiers reported a prior history of headache.

Headaches were deemed battle related if they were sustained during a combat operation; 31% met this definition. Another 62% were not related to combat, and data were unavailable for the remainder.

Episodic headache was most common (52%), while 39% of soldiers had constant headache; frequency data were unavailable for the remaining 9% of the group.

After evacuation, treatment varied widely among the group. NSAIDs were the most commonly used medications (77%), followed by antidepressants (64%), opioids (34%), anticonvulsants (29%), and triptans (27%).

Other medical treatment included beta-blockers (11%) and calcium channel blockers (2%). Many soldiers (36%) were on multiple therapies, and 9% received injections or nerve blocks.

Overall, 33.6% of the soldiers were able to return to duty. In multivariate regression analyses of the factors associated with return to duty, the investigators controlled for age; sex; military branch; headache diagnosis and etiology; treatment; psychiatric and brain injury history; family and personal headache and pain history; and smoking.

Headache type was significantly associated with return to duty. A diagnosis of occipital neuralgia lowered the odds of returning to duty by 80%, compared with tension-type headache, the reference diagnosis. Other diagnoses that lowered the odds of returning to duty included postconcussive headache (by 67%), cervicogenic headache (by 60%), and coexisting traumatic brain injury (by 50%).

"Since World War I, nonbattle injuries have been by far the No. 1 reason a soldier is evacuated from the field."

The method of treatment also affected return to duty. Compared with no treatment, the odds of returning to duty were significantly lowered by 59% with opioids and by 74% with beta-blockers.

Because the study examined only cases with a primary diagnosis of headache, it probably understates the disorder’s true impact, Dr. Cohen said. Posttraumatic stress disorder, musculoskeletal injuries, concussions, and motor vehicle accidents – all very common wartime injuries – can cause chronic headaches.

Whether soldiers return to the battlefield, or are kept on active duty and treated outside the war theater, the cost to the military is high, he said. "We now know that two-thirds of those who leave with headache don’t come back, and even if they do, they may have limitations. They might not be able to go on foot patrol or wear their Kevlar gear – but they are still using resources. And for every soldier who is evacuated, the unit goes short and someone else has to be deployed."

Headache treatment can last for months, he added, tying up military medical centers during active duty and after discharge.

"They continue utilizing medical and military resources the whole time they are being treated, and this costs America a huge amount of money," he said. "Even if all our troops would pull out tomorrow, we will be paying for this for the rest of our lives, as will the soldiers who are injured."

The study was funded by a grant from the John P. Murtha Neuroscience and Pain Institute, the U.S. Army, and the Army Regional Anesthesia and Pain Medicine Initiative. Dr. Cohen had no financial declarations.

Display Headline
Headaches, Not Battle Wounds, Keep Soldiers Sidelined
Article Source

FROM CEPHALALGIA

Vitals

Major Finding: Some 66% of soldiers who were evacuated from war theaters for headache were never able to return to active duty on the front.

Data Source: A retrospective review of almost 1,000 soldiers in Operations Iraqi Freedom and Enduring Freedom who were taken off the battlefield with a primary diagnosis of headache.

Disclosures: The study was funded by a grant from the John P. Murtha Neuroscience and Pain Institute, the U.S. Army, and the Army Regional Anesthesia and Pain Medicine Initiative. Dr. Cohen had no financial declarations.

Small Changes Count in Type 2 Diabetes Patients

Article Type
Changed
Display Headline
Small Changes Count in Type 2 Diabetes Patients

LISBON – Even small changes in hemoglobin A1c and blood pressure could significantly reduce the risk of heart attack, stroke, and other cardiovascular complications in people with type 2 diabetes, according to the findings of a population-based observational study.

A 0.5% decrease in HbA1c and a 10-mmHg decrease in systolic blood pressure could avert 10% of such events over 5 years, Dr. Edith Heintjes said at the annual meeting of the European Association for the Study of Diabetes. Greater changes could reduce cardiovascular events by as much as 21%, said Dr. Heintjes of the PHARMO Institute for Drug Research, Utrecht, the Netherlands.

Although her study of population attributable risk was theoretical, it adds weight to the emerging view that small changes can make a big difference to the health of people with type 2 diabetes.

"Even when we examined only modest incremental reductions, which could be achieved in the clinical setting, we found the possibility of significant benefit," she said. Those patients with the greatest risk factors – elevated HbA1c, high blood pressure, and higher body mass index – stand to gain the most when they improve those factors, she said.

Dr. Heintjes’ analysis included 5,841 Dutch patients with a diagnosis of type 2 diabetes for at least 2 years. The patients were all taking some form of treatment – oral medications, insulin, or both – for at least 6 months to be included in the study. After examining both baseline data and 5-year outcomes, she was able to extrapolate how improvements in the three risk factors might impact the expected number of cardiovascular events.

Patient data were drawn from the PHARMO record linkage system, which includes community pharmaceutical dispensing information, laboratory information, national hospitalization information, and statistics from the Dutch national diabetes monitoring program.

Patients were treated with the aim of achieving the country’s national targets: an HbA1c of below 7%, a systolic blood pressure of 140 mmHg or lower, and a body mass index of 25 kg/m2 or less.

"Even when we examined only modest incremental reductions, we found the possibility of significant benefit."

At baseline, the patients’ average age was 66 years; the average HbA1c was 7%, systolic blood pressure 149 mmHg, and body mass index 29.5 kg/m2. Most (92%) were taking only oral medications; the remainder were also taking insulin.

Some comorbidity was already present in the group, including peripheral artery disease (0.5%), renal impairment (11%), neuropathy (51%), and retinopathy (7%). About half of the group (45%) had a family history of cardiovascular disease.

Dr. Heintjes divided the group according to the number of risk factors each patient exhibited. A quarter (24%) had just one elevated risk factor; 47% had two elevated risk factors, and 26% had elevations in all three risk factors.

A multivariable analysis allowed her to extrapolate that 796 cardiovascular events (heart attack, ischemic heart disease, stroke, and chronic heart failure) would occur if all of the patients were followed for 5 years.

If every patient in this population were able to correct each one of the risk factors to the national recommendations, she said, 687 events would occur – a 14% decrease. Correcting HbA1c and blood pressure accounted for this change, she said; changing BMI did nothing to increase the benefit.

Theoretically, she said, patients with the most risk factors would reap the greatest benefit. The 24% with one elevated risk factor would experience a 5% reduction in cardiovascular events, while those with all three elevated risk factors, upon correcting them, would see a 21% reduction.

Considering the group’s baseline measurements, correcting to the national Dutch standards would mean an average HbA1c reduction of 0.8%, a 26-mmHg reduction in systolic blood pressure, and a weight loss of 16 kg (equivalent to a BMI decrease of 5.7 kg/m2). However, Dr. Heintjes said, it might not be realistic to expect such changes. Her second analysis explored the improvements that could arise from smaller changes: a 0.5% reduction in HbA1c, a 10-mmHg reduction in systolic blood pressure, and a 10% reduction in total body weight (a 2.6-kg/m2 decrease in BMI).

"With this analysis, we saw in the overall population that 6% of the risk could be averted," she said. Among those in the subpopulation with three risk factors, applying the smaller changes could cut the number of events by 10%.

It’s not exactly clear how the results can change clinical practice, Dr. Heintjes acknowledged. "But this does allow us to understand how small changes can translate into bigger benefits for people with type 2 diabetes."

Dr. Heintjes reported having no conflicts of interest. Her employer, PHARMO, however, receives funding from numerous pharmaceutical companies, including AstraZeneca, which sponsored the current study.

Meeting/Event
Author and Disclosure Information

Publications
Topics
Legacy Keywords
type 2 diabetes patients, hemoglobin a1c levels, diabetes low blood pressure, cardiovascular events, population attributable risk, diabetes risk factors
Author and Disclosure Information

Author and Disclosure Information

Meeting/Event
Meeting/Event

LISBON – Even small changes in hemoglobin A1c and blood pressure could significantly reduce the risk of heart attack, stroke, and other cardiovascular complications in people with type 2 diabetes, according to the findings of a population-based observational study.

A 0.5% decrease in HbA1c and a 10 Hg/mm decrease in systolic blood pressure could avert 10% of such events over 5 years, Dr. Edith Heintjes said at the annual meeting of the European Society for the Study of Diabetes. Greater changes could reduce cardiovascular events by as much as 21%, said Dr. Heintjes of the PHARMO Institute for Drug Research, Utrecht, the Netherlands.

While her study on population attributable risk was albeit theoretical, it still adds weight to the emerging theory that small changes can make a big difference to the health of people with type 2 diabetes.

"Even when we examined only modest incremental reductions, which could be achieved in the clinical setting, we found the possibility of significant benefit," she said. Those patients with the greatest risk factors – elevated HbA1c, high blood pressure, and higher body mass index – stand to gain the most when they improve those factors, she said.

Dr. Heintjes’ analysis included 5,841 Dutch patients with a diagnosis of type 2 diabetes for at least 2 years. The patients were all taking some form of treatment – oral medications, insulin, or both – for at least 6 months to be included in the study. After examining both baseline data and 5-year outcomes, she was able to extrapolate how improvements in the three risk factors might impact the expected number of cardiovascular events.

Patient data were drawn from the PHARMO record linkage system, which includes community pharmaceutical dispensing information, laboratory information, national hospitalization information, and statistics from the Dutch national diabetes monitoring program.

Patients were treated with the aim of achieving the country’s national targets: an HbA1c of below 7%, a systolic blood pressure of 140 mmHg or lower, and a body mass index of 25 kg/m2 or less.

"Even when we examined only modest incremental reductions, we found the possibility of significant benefit."

At baseline, the patients’ average age was 66 years. The average HbA1c was 7%; systolic blood pressure 149 mmHg, and body mass index, 29.5 kg/m2. Most (92%) were taking only oral medications; the remainder was also taking insulin.

Some cardiovascular morbidity was already present in the group, including peripheral artery disease (0.5%), renal impairment (11%), neuropathy (51%), and retinopathy (7%). About half of the group (45%) had a family history of cardiovascular disease.

Dr. Heintjes divided the group according to the number of risk factors each patient exhibited. A quarter (24%) had just one elevated risk factor; 47% had two elevated risk factors, and 26% had elevations in all three risk factors.

A multivariable analysis allowed her to extrapolate that 796 cardiovascular events (heart attack, ischemic heart disease, stroke, and chronic heart failure) would occur if all of the patients were followed for 5 years.

If every patient in this population were able to correct each one of the risk factors to the national recommendations, she said, 687 events would occur – a 14% decrease. Correcting HbA1c and blood pressure accounted for this change, she said; changing BMI did nothing to increase the benefit.

Theoretically, she said, patients with the most risk factors would reap the greatest benefit. The 24% with one elevated risk factor would experience a 5% reduction in cardiovascular events, while those with all three elevated risk factors, upon correcting them, would see a 21% reduction.

Considering the group’s baseline measurements, correcting to national Dutch standards would mean an average HbA1c reduction of 0.8%, a 26-mmHg reduction in systolic blood pressure, and a weight loss of 16 kg (equivalent to a BMI decrease of 5.7 kg/m2). However, Dr. Heintjes said, it might not be realistic to expect such changes. Her second analysis explored the improvements that could arise from smaller changes: a 0.5% reduction in HbA1c, a 10-mmHg reduction in systolic blood pressure and a 10% reduction in total body weight (2.6 kg/m2 decrease in BMI).

"With this analysis, we saw in the overall population that 6% of the risk could be averted," she said. Among those in the subpopulation with three risk factors, applying the smaller changes could cut the number of events by 10%.

It’s not exactly clear how the results can change clinical practice, Dr. Heintjes acknowledged. "But this does allow us to understand how small changes can translate into bigger benefits for people with type 2 diabetes."

 

 

Dr. Heintjes reported having no conflicts of interest. Her employer, PHARMO, however, receives funding from numerous pharmaceutical companies, including Astra Zeneca, which sponsored the current study.

LISBON – Even small changes in hemoglobin A1c and blood pressure could significantly reduce the risk of heart attack, stroke, and other cardiovascular complications in people with type 2 diabetes, according to the findings of a population-based observational study.

A 0.5% decrease in HbA1c and a 10 Hg/mm decrease in systolic blood pressure could avert 10% of such events over 5 years, Dr. Edith Heintjes said at the annual meeting of the European Society for the Study of Diabetes. Greater changes could reduce cardiovascular events by as much as 21%, said Dr. Heintjes of the PHARMO Institute for Drug Research, Utrecht, the Netherlands.

While her study on population attributable risk was albeit theoretical, it still adds weight to the emerging theory that small changes can make a big difference to the health of people with type 2 diabetes.

"Even when we examined only modest incremental reductions, which could be achieved in the clinical setting, we found the possibility of significant benefit," she said. Those patients with the greatest risk factors – elevated HbA1c, high blood pressure, and higher body mass index – stand to gain the most when they improve those factors, she said.

Dr. Heintjes’ analysis included 5,841 Dutch patients with a diagnosis of type 2 diabetes for at least 2 years. The patients were all taking some form of treatment – oral medications, insulin, or both – for at least 6 months to be included in the study. After examining both baseline data and 5-year outcomes, she was able to extrapolate how improvements in the three risk factors might impact the expected number of cardiovascular events.

Patient data were drawn from the PHARMO record linkage system, which includes community pharmaceutical dispensing information, laboratory information, national hospitalization information, and statistics from the Dutch national diabetes monitoring program.

Patients were treated with the aim of achieving the country’s national targets: an HbA1c of below 7%, a systolic blood pressure of 140 mmHg or lower, and a body mass index of 25 kg/m2 or less.

"Even when we examined only modest incremental reductions, we found the possibility of significant benefit."

At baseline, the patients’ average age was 66 years. The average HbA1c was 7%; systolic blood pressure 149 mmHg, and body mass index, 29.5 kg/m2. Most (92%) were taking only oral medications; the remainder was also taking insulin.

Some cardiovascular morbidity was already present in the group, including peripheral artery disease (0.5%), renal impairment (11%), neuropathy (51%), and retinopathy (7%). About half of the group (45%) had a family history of cardiovascular disease.

Dr. Heintjes divided the group according to the number of risk factors each patient exhibited. A quarter (24%) had just one elevated risk factor; 47% had two elevated risk factors, and 26% had elevations in all three risk factors.

A multivariable analysis allowed her to extrapolate that 796 cardiovascular events (heart attack, ischemic heart disease, stroke, and chronic heart failure) would occur if all of the patients were followed for 5 years.

If every patient in this population were able to correct each one of the risk factors to the national recommendations, she said, 687 events would occur – a 14% decrease. Correcting HbA1c and blood pressure accounted for this change, she said; changing BMI did nothing to increase the benefit.

Theoretically, she said, patients with the most risk factors would reap the greatest benefit. The 24% with one elevated risk factor would experience a 5% reduction in cardiovascular events, while those with all three elevated risk factors, upon correcting them, would see a 21% reduction.

Considering the group’s baseline measurements, correcting to national Dutch standards would mean an average HbA1c reduction of 0.8%, a 26-mmHg reduction in systolic blood pressure, and a weight loss of 16 kg (equivalent to a BMI decrease of 5.7 kg/m2). However, Dr. Heintjes said, it might not be realistic to expect such changes. Her second analysis explored the improvements that could arise from smaller changes: a 0.5% reduction in HbA1c, a 10-mmHg reduction in systolic blood pressure and a 10% reduction in total body weight (2.6 kg/m2 decrease in BMI).

"With this analysis, we saw in the overall population that 6% of the risk could be averted," she said. Among those in the subpopulation with three risk factors, applying the smaller changes could cut the number of events by 10%.

It’s not exactly clear how the results can change clinical practice, Dr. Heintjes acknowledged. "But this does allow us to understand how small changes can translate into bigger benefits for people with type 2 diabetes."

Dr. Heintjes reported having no conflicts of interest. Her employer, PHARMO, however, receives funding from numerous pharmaceutical companies, including AstraZeneca, which sponsored the current study.

Publications
Topics
Article Type
Display Headline
Small Changes Count in Type 2 Diabetes Patients
Legacy Keywords
type 2 diabetes patients, hemoglobin a1c levels, diabetes low blood pressure, cardiovascular events, population attributable risk, diabetes risk factors
Article Source

FROM THE ANNUAL MEETING OF THE EUROPEAN ASSOCIATION FOR THE STUDY OF DIABETES

Vitals

Major Finding: Reducing HbA1c, blood pressure, and weight could avert up to 21% of cardiovascular events in patients with type 2 diabetes.

Data Source: A population-based observational study comprising 5,841 patients.

Disclosures: Dr. Heintjes reported having no conflicts of interest. Her employer, PHARMO, however, receives funding from numerous pharmaceutical companies, including AstraZeneca, which sponsored the current study.

Home Urine Test Classifies Juvenile Diabetes Types

Article Type
Changed
Display Headline
Home Urine Test Classifies Juvenile Diabetes Types

Major Finding: An in-home urine test discriminated type 1 juvenile diabetes from maturity-onset diabetes of the young with 100% sensitivity and 85% specificity.

Data Source: A confirmatory study of 125 children, 96 of whom had been diagnosed with type 1 diabetes and 29 with MODY.

Disclosures: Dr. Besser had no financial disclosures.

LISBON – A single in-home test of urinary C-peptide creatinine ratio appears to differentiate type 1 diabetes from a genetic form of the disease – maturity-onset diabetes of the young.

The test saves children and parents the stress and inconvenience of a blood test, and discriminates maturity-onset diabetes of the young (MODY) from type 1 diabetes with 100% sensitivity and 85% specificity, Dr. Rachel Besser said at the meeting.

“MODY is frequently misdiagnosed as type 1 diabetes in children, and inappropriately treated with insulin,” said Dr. Besser of Peninsula Medical School, Exeter (England). “This test is clinically useful even in patients with a very short duration of disease.”

In addition to guiding treatment, the test, a simple kit administered after a normal, diabetes-appropriate evening meal, can pinpoint which children should undergo genetic testing for MODY, she said. “We ask the children to empty their bladders before eating, have a dinner that contains healthy carbohydrates, and then take the test about 2 hours later.”

Parents mail the sample to a laboratory where the urinary C-peptide creatinine ratio (UCPCR) is measured. A boric acid solution preserves the biomarker for up to 72 hours while en route to the lab.

Dr. Besser and her colleagues examined the test's efficacy in 96 children who had been diagnosed with type 1 diabetes and 29 children who had confirmed MODY (10 with the HNF1A/4A subtype and 19 with the GCK subtype). Mean disease duration was 3 years in both groups. The mean age of the type 1 patients was 13 years; the MODY patients had a mean age of 14 years.

The test differentiated the two disorders quite well, Dr. Besser said. UCPCR was significantly lower in the type 1 samples than in the MODY samples (median 0.05 vs. 3.41 nmol/mmol). With the use of a cutoff of at least 1.4 nmol/mmol, the test correctly discriminated MODY from type 1 diabetes with 100% sensitivity and 85% specificity. Fourteen of the patients diagnosed with type 1 diabetes met the cutoff point of at least 1.4 nmol/mmol.
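The reported accuracy figures can be reproduced from the counts in the article. A minimal sketch, treating MODY as the "positive" class; the 29/29 count is implied by the 100% sensitivity, and the 14 type 1 patients above the cutoff are stated in the text:

```python
# Counts at the >=1.4 nmol/mmol UCPCR cutoff, with MODY as the positive class.
mody_total, mody_above_cutoff = 29, 29      # 100% sensitivity implies all 29 met the cutoff
type1_total, type1_above_cutoff = 96, 14    # 14 type 1 patients met the cutoff (false positives)

sensitivity = mody_above_cutoff / mody_total                     # true-positive rate
specificity = (type1_total - type1_above_cutoff) / type1_total   # true-negative rate

print(f"sensitivity {sensitivity:.0%}, specificity {specificity:.0%}")
# sensitivity 100%, specificity 85%
```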


Publications
Topics
Article Type
Display Headline
Home Urine Test Classifies Juvenile Diabetes Types
Article Source

From the Annual Meeting of the European Association for the Study of Diabetes
