Who to Blame for Surgical Readmissions?
When too many surgery patients are readmitted, the hospital can be fined by the federal government - but a new study suggests many of those readmissions are not the hospital's fault.
Many readmissions were due to issues like drug abuse or homelessness, the researchers found. Less than one in five patients returned to the hospital due to something doctors could have managed better.
"Very few were due to reasons we could control with better medical care at the index admission," said lead author Dr. Lisa McIntyre, of Harborview Medical Center in Seattle.
McIntyre and her colleagues noted June 15 in JAMA Surgery that the U.S. government began fining hospitals in 2015 for surgery readmission rates that are higher than expected. Fines had already been imposed since 2012 for readmissions following treatment of various medical conditions.
The researchers studied the medical records of patients who were discharged from their hospital's general surgery department in 2014 or 2015 and readmitted within 30 days.
Out of the 2,100 discharges during that time, there were 173 unplanned readmissions. About 17% of those readmissions were due to injection drug use and about 15% were due to issues like homelessness or difficulty getting to follow-up appointments.
Only about 18% of readmissions - about 2% of all discharges - were due to potentially avoidable problems following surgery.
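As a quick sanity check on the figures above, the approximate counts behind the reported percentages can be reconstructed. This is a back-of-envelope sketch using only the numbers in the article (the study reports percentages, not these exact counts):

```python
# Reported figures from the study
discharges = 2100
readmissions = 173

# Approximate counts implied by the reported percentages
drug_use = round(0.17 * readmissions)    # ~29 readmissions tied to injection drug use
social = round(0.15 * readmissions)      # ~26 tied to homelessness/follow-up barriers
avoidable = round(0.18 * readmissions)   # ~31 potentially avoidable readmissions

# Avoidable readmissions as a share of all discharges
share = avoidable / discharges * 100     # ~1.5%, consistent with the article's "about 2%"
print(drug_use, social, avoidable, round(share, 1))
```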
While the results are only from a single hospital, that hospital is also a safety-net facility for the local area - and McIntyre pointed out that all hospitals treat some proportion of disadvantaged patients.
"To be able to affect this rate, there are going to need to be new interventions that require money and a more global care package of each individual patient that doesn't stop at discharge," said McIntyre, who is also affiliated with the University of Washington.
Being female, having diabetes, having sepsis upon admission, being in the ICU and being discharged to respite care were all tied to an increased risk of readmission, the researchers found.
The results raise the question of whether readmission rates are valuable measures of surgical quality, write Drs. Alexander Schwed and Christian de Virgilio of the University of California, Los Angeles, in an editorial.
Some would argue that readmitting patients is a sound medical decision that is tied to lower risks of death, they write.
"Should such an inexact marker of quality be used to financially penalize hospitals?" they ask. "Health services researchers (need to find) a better marker for surgical quality that is reliably calculable and clinically useful."
SOURCE: http://bit.ly/28Km3aH and http://bit.ly/28Km3Ye, JAMA Surgery, online June 15, 2016.
Benefits of lifestyle intervention only brief in some patients with type 2 diabetes
NEW ORLEANS – Underserved African Americans with type 2 diabetes mellitus who participated in a year-long intensive self-management program did not experience sustained serum glucose control, compared with a control group receiving only two diabetes education classes.
“Relative to non-Hispanic whites, African Americans with type 2 diabetes experience more diabetes-related complications and higher rates of diabetes hospitalization,” lead study author Elizabeth B. Lynch, Ph.D., said at the annual scientific sessions of the American Diabetes Association. “These disparities are even greater for underserved disadvantaged African American populations.”
Dr. Lynch, a psychologist who directs the section of community health in the department of preventive medicine at Rush University Medical Center, Chicago, noted that several self-management interventions for diabetes have demonstrated efficacy at improving glucose control at 6 months. “However, there have not been any diabetes self-management interventions specifically targeting African Americans that have achieved sustained blood glucose control,” she said.
In a trial known as Lifestyle Intervention Through Food and Exercise (LIFE), the researchers examined the effect of a group-based intervention on glucose control at 12 months in a population of low-income African Americans. The intervention components consisted of cognitively tailored nutrition education taught by a registered dietitian, behavioral modification, social support, and peer support. “This education curriculum was based on a series of studies that were done using cognitive anthropological methods with low-income African Americans looking at beliefs and knowledge about the relationship between food and health,” Dr. Lynch said. “We used those studies to design an intervention with the aim of reducing cognitive load among participants when they’re learning new information about nutrition, so essentially making the information easier for people to understand.” Behavioral modification techniques included goal setting, self-monitoring, and problem solving. “We also had social support, and there was a peer supporter who was an individual from the community with type 2 diabetes who was assigned to each of the participants and called them on a regular basis to check in with them on their goals and encourage them,” she said.
The LIFE program consisted of 20 group sessions in the first 6 months and 8 sessions in the second 6 months, while a control group received 2 group-based education classes in the first 6 months only. The researchers conducted assessments at baseline, 6 months, and 12 months.
Individuals were eligible for the trial if they were African American, were patients of a community clinic affiliated with Cook County Healthcare System, had a clinical diagnosis of type 2 diabetes, and had a hemoglobin A1c level of 7% or greater. Of 1,403 individuals initially screened for the trial, 603 were found to be eligible. Of these, 211 were randomized and enrolled: 106 to the treatment group and 105 to the control group. There was 94% follow-up at 6 and 12 months.
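The enrollment figures above can be checked with simple arithmetic. This is a sketch inferred from the reported numbers, not from the trial dataset:

```python
# LIFE trial enrollment funnel, as reported
screened, eligible = 1403, 603
treatment, control = 106, 105

randomized = treatment + control        # 211, matching the reported total
followed_up = round(0.94 * randomized)  # 94% follow-up -> roughly 198 patients retained
print(randomized, followed_up)
```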
At baseline, the mean age of study participants was 55 years, 70% were female, 46% had a high school education or less, 60% had an annual income of less than $24,000, 65% were uninsured, and 39% had limited health literacy. Baseline food intake as reported by two 24-hour food recalls consisted of a diet high in saturated fat and low in fiber, with a moderate intake of carbohydrates and underconsumption of fruits, vegetables, and dairy products. The baseline level of daily physical activity as measured by accelerometry revealed sedentary activity that exceeded 7 hours per day, 3,614 steps per day, and only 14 minutes per day of moderate-level activity. Study enrollees had a baseline HbA1c level of 9% and a diabetes duration of 11 years; 45% used insulin, and 48% had poor medication adherence. Their mean body mass index was 35.6 kg/m2, and 91% had hypertension.
More than half of individuals in the intervention group attended each of the 20 group sessions, and 90% attended at least 1. At the same time, 68% of individuals in the control group attended both educational sessions. Dr. Lynch reported that compared with the control group, the intervention group had a significantly greater reduction in HbA1c at 6 months (–0.76 vs. –0.21, respectively; P = .026) but not at 12 months (–0.63 vs. –0.45; P = .47). In addition, a higher percentage of individuals in the treatment group had a 0.5% or more decline in HbA1c level at 6 months (63% vs. 42%, P = .005) but not at 12 months (53% vs. 51%, P = .89). The fact that the control group also had a reduction in HbA1c presented a conundrum for the researchers. “One possible explanation for the decrease in A1c in the control group is that medication adherence increased in this group, relative to the intervention group,” Dr. Lynch explained in a press release. “Additional research is needed to identify the most effective strategies to achieve sustained A1c control in African Americans with type 2 diabetes.”
No changes were observed in blood pressure, weight, or physical activity over the course of 12 months in either group.
Although LIFE lacked a third study arm that received usual care, one possible implication of the current findings “may be that diabetes education of any type may be helpful in improving glycemic control, especially in a population that does not normally receive any education,” she said. “Medication adherence may be an easier and more effective strategy to improve glycemic control in this population.”
LIFE was supported by grants from the National Institutes of Health. Dr. Lynch reported having no relevant financial disclosures.
AT THE ADA ANNUAL SCIENTIFIC SESSIONS
Key clinical point: A lifestyle intervention led to a pronounced reduction in hemoglobin A1c level after 6 months but not after 1 year for African Americans with type 2 diabetes.
Major finding: Compared with patients in the control group, those in the intensive intervention group had a significantly more sizable reduction in HbA1c level at 6 months (P = .026) but not at 12 months (P = .47).
Data source: A trial of 211 patients with type 2 diabetes who were randomized to either a year-long diabetes self-management training program or to two diabetes education classes.
Disclosures: LIFE was supported by grants from the National Institutes of Health. Dr. Lynch reported having no relevant financial disclosures.
Condom basics
In discussions of pregnancy prevention, a great deal of time is spent on female birth control, while condoms get treated almost as a tagline: "Oh! And, of course, use a condom." But are we really educating our teens on condoms, their proper use, and their role in preventing sexually transmitted infections (STIs)? Condoms are our basic, entry-level birth control, but despite their relative ease of use and accessibility, their use is on the decline.
The unintended pregnancy rate among teens is down almost 20% since 1981, according to data from the Guttmacher Institute, but our rates remain high compared with other developed nations.
In 2013, the AAP published a statement encouraging schools and pediatricians to discuss condoms and make them more accessible. Fifty-four studies of early condom education reported a 48% increase in condom use and a 42% delay in the initiation of sexual activity by 6 months.1 Despite these positive findings, condom use is still declining.
Factors that discourage condom use include lack of formal education, limited availability, and the perception that condoms decrease sexual pleasure. Condom manufacturers have started campaigns that promote an image of heightened sexual pleasure, but they are competing with the media and music industries, which inundate teens with all kinds of sexual imagery. In one review of media content, 77% depicted sexual content, compared with 14% that depicted the risks and responsibilities of sexual activity.
Before educating teens about condoms, we first must educate ourselves. Condoms are available in three forms: latex (80%), lamb's cecum (15%), and synthetic (5%). Latex is considered the most effective type for both STI protection and contraception. Synthetic condoms are a good alternative when a latex allergy is present, but they are more prone to breakage and slippage; they also have a longer shelf life than latex and are compatible with more lubricants. Lamb's cecum condoms prevent the spread of some STIs, but because of their porous nature they do not protect against viruses such as HIV, hepatitis B, and herpes simplex virus (HSV).
Spermicide-coated condoms have fallen out of favor because the spermicide shortens the condom's shelf life and can cause mucosal irritation. When the mucosa is irritated, there is an increased risk of contracting STIs, particularly HIV.2

Make patients aware that condoms do expire and that it is important to check the expiration date. Educating patients on the proper technique for putting on a condom is likely an awkward conversation for most, so consider directing them to the website Bedsider.org, which has clear explanations of how to put on and use condoms, along with comparisons of all types of birth control to help patients decide which one is best for them.
Making condoms easily accessible is the most important intervention. Studies show that when condoms are readily available, their use increases. The AAP advocates that schools and primary care doctors hand them out freely.
Education is key to increasing condom use and reducing the spread of STIs. Encouraging parents to talk with their teens and provide access to condoms is crucial to lowering STI rates. Although abstinence is the only 100% effective protection, proper use of contraception is the practical alternative.
References
1. “Condom Use by Adolescents,” Pediatrics. 2013 Nov. doi: 10.1542/peds.2013-2821
2. “Contraception for Adolescents,” Pediatrics. 2014 Oct 10;134:e1244-56.
Dr. Pearce is a pediatrician in Frankfort, Ill. Email her at [email protected].
In discussion of pregnancy prevention, a great deal of time is spent on female birth control, and the use of condoms is like a tag line “Oh! And, of course, use a condom.” But are we really educating our teens on condoms, their proper use, and their prevention of sexually transmitted infections (STIs). Condoms are our basic entry level birth control, but despite their relative ease of use and accessibility, their use is on the decline.
When we evaluate unintended pregnancies among teens, it is down almost 20% since 1981, according to data from the Guttmacher Institute, but compared with other developed nations, our rates are high.
In 2013, the AAP published a statement encouraging schools and pediatricians to discuss and make condoms more accessible. Fifty-four studies were done on early education of condom use, and reported a 48% increase in their use and a 42% delay in the initiation of sexual activity by 6 months.1 Despite these positive findings , condom use is still declining.
Many factors that affect their use are lack of formal education, availability, and the perception that they decrease sexual pleasure. Condom manufacturers have started campaigns that promote the image of escalating sexual pleasure, but they are competing with the media and music industry, which inundate teens with all kinds of sexual images. In review of the media, 77% showed sexual content, compared with 14% that showed risk and responsible sexual activity.
Before educating teens about condoms, we first must educate ourselves. Condoms are available in three forms: latex (80%), lamb’s cecum (15%), and synthetic (5%). Latex is considered the most effective condom in protecting against STIs and birth control. Synthetic is a good alternative to latex when a latex allergy is present, but is more prone to breakage and slippage. Lamb’s cecum prevents spread of some STIs, but because of its porous nature it does not provide protection against viruses such as HIV, hepatitis B, and herpes simplex virus (HSV). Nonlatex has a longer shelf life and is more compatible with lubricants.
Spermicide-coated condoms have fallen out of favor because the spermicide shortens the shelf life of the condom, as well as can cause mucosal irritation. When the mucosa is irritated, there is an associated increase of contracting STIs, particularly HIV.2

Make patients aware that condoms do expire and that it is important to check the expiration date. Educating patients on the proper technique for putting on a condom is likely an awkward conversation for most, so consider directing them to the website Bedsiders.org, which offers clear explanations of how to put on and use condoms, along with comparisons of all types of birth control to help patients decide which one is best for them.
Making condoms easily accessible is the most important intervention. Studies show that when condoms are readily available, their use increases. The AAP advocates for schools and primary care doctors to hand them out freely.
Education is key to increasing condom use and reducing the spread of STIs. Encouraging parents to talk with their teens and to provide access to condoms is crucial to lowering STI rates. Although abstinence is the only 100% effective protection, proper use of contraception is an alternative.
References
1. “Condom Use by Adolescents,” Pediatrics. 2013 Nov. doi: 10.1542/peds.2013-2821
2. “Contraception for Adolescents,” Pediatrics. 2014 Oct 10;134:e1244-56.
Dr. Pearce is a pediatrician in Frankfort, Ill. Email her at [email protected].
Single rituximab dose slows rheumatoid arthritis development
LONDON – A single 1,000-mg intravenous infusion of rituximab, given to people with arthralgia who were at high risk of developing rheumatoid arthritis, roughly halved the rate of subsequent rheumatoid arthritis development during more than 18 months of follow-up in a proof-of-concept, placebo-controlled study that randomized 81 people.
“This is the first study to evaluate the effects of a biopharmaceutical in subjects at risk of developing RA [rheumatoid arthritis],” Dr. Daniëlle M. Gerlag said at the European Congress of Rheumatology. “These results strongly support the rationale for future clinical trials aimed at prevention of RA by a targeted intervention,” added Dr. Gerlag, a rheumatologist at the Academic Medical Center in Amsterdam.
Additional studies are needed to confirm this effect and to examine whether the period of protection against RA development can be extended by administration of additional rituximab (Rituxan) doses. In the current study, the protective effect from the single dose administered appeared to wane over time, she noted.
The idea behind this strategy is that a window of opportunity exists in people at high risk for developing RA to prevent the disease by blocking production of the autoantibodies that trigger the development of a subclinical synovitis that eventually leads to RA. Rituximab is a cytolytic antibody directed against the CD20 antigen on B cells that already has regulatory approval for treating moderately to severely active RA as well as certain other diseases.
Dr. Gerlag and her associates recently published an analysis that detailed their rationale for hypothesizing that prophylactic treatment with rituximab might prove effective at delaying or preventing the development of RA in susceptible people (Rheumatology [Oxford]. 2016 April;55[4]:607-14).
The Prevention of RA by Rituximab (PRAIRI) study ran at three Dutch centers. The investigators enrolled people with arthralgia who had never been diagnosed with arthritis, had never used a disease-modifying antirheumatic drug, and had at least one of these two risk factors: a serum level of IgM rheumatoid factor of more than 12.5 IU/mL and a serum level of anticitrullinated peptide antibodies of more than 25 IU/mL. Enrolled participants also needed to have at least one of the following: a serum level of C-reactive protein greater than 3 mg/L, an erythrocyte sedimentation rate of greater than 28 mm/hr, and evidence of subclinical synovitis identified by either ultrasound or MRI.
The researchers found these participants largely through screening sessions run at health fairs and by publicizing the study during television appearances, Dr. Gerlag said. About three-quarters of the participants were first-degree relatives of patients already diagnosed with RA, but this was not a criterion for enrollment. The participants averaged about 53 years old, and nearly two-thirds were women.
Among the 81 people who underwent treatment, 41 received a single, 1,000-mg infusion of rituximab, and 40 received a placebo infusion. The researchers then followed the participants with scheduled, periodic examinations during a median of 29 months.
During follow-up, 16 of the 40 people in the placebo group (40%) developed RA after a median of 12 months, and 14 of the 41 in the treated arm (34%) developed RA after a median of 17 months.
The researchers performed two different statistical analyses on these outcomes. They used a Kaplan-Meier survival analysis to determine the time until 25% of people in each arm developed RA. Among the placebo patients, this occurred after 12 months, while in the intervention arm, it did not occur until 24 months, a statistically significant doubling of the time to this outcome with rituximab treatment, Dr. Gerlag reported.
The second analysis used a Cox proportional hazards model based on the time to development of rheumatoid arthritis in each of the treatment groups. This showed a 55% reduction in the hazard of developing RA after 12 months among people treated with rituximab, compared with the placebo-treated controls, and a 53% reduction after 18 months, both statistically significant differences.
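The Kaplan-Meier calculation described above (the time by which 25% of an arm had developed RA) can be illustrated with a minimal sketch. The event times and censoring flags below are invented for demonstration and are not the trial's data:

```python
def kaplan_meier_quantile_time(times, events, quantile=0.25):
    """Return the earliest time at which the Kaplan-Meier estimate of
    cumulative incidence (1 - survival) reaches `quantile`, or None if
    it never does. `events[i]` is True if the event (RA development)
    occurred at times[i], False if the observation was censored."""
    data = sorted(zip(times, events))  # order observations by follow-up time
    n_at_risk = len(data)
    survival = 1.0
    for t, event_occurred in data:
        if event_occurred:
            # Each event multiplies survival by the fraction still event-free
            survival *= (n_at_risk - 1) / n_at_risk
        n_at_risk -= 1  # censored subjects also leave the risk set
        if 1.0 - survival >= quantile:
            return t
    return None

# Hypothetical follow-up times in months; True = developed RA, False = censored.
times = [3, 6, 9, 12, 15, 18, 21, 24, 27, 30]
events = [True, False, True, True, False, False, True, False, False, False]
print(kaplan_meier_quantile_time(times, events))  # prints 12
```

In the actual study this quantity was 12 months in the placebo arm and 24 months in the rituximab arm; the sketch only shows the mechanics of the estimator on made-up data.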
A safety analysis showed that some people treated with rituximab had mild infusion-related symptoms, but no participants had serious infections. Serious adverse events occurred in 11 people in the rituximab group and in 3 in the placebo arm, but none of these serious adverse events was judged to be related to treatment by the study’s data safety monitoring board, said Dr. Gerlag, who is also on the staff of GlaxoSmithKline in Cambridge, England.
The PRAIRI study received no commercial funding. Dr. Gerlag is also a shareholder in GlaxoSmithKline, but the company played no role in the study.
On Twitter @mitchelzoler
AT THE EULAR 2016 CONGRESS
Key clinical point: A single, intravenous dose of 1,000 mg rituximab to people with arthralgia and a high risk for developing rheumatoid arthritis halved the incidence of rheumatoid arthritis during the 18 months after treatment in a placebo-controlled study.
Major finding: Rituximab cut the rheumatoid arthritis incidence, compared with placebo, by 55% after 12 months and 53% after 18 months.
Data source: PRAIRI, a multicenter, placebo-controlled, randomized trial with 81 people at high risk for developing rheumatoid arthritis.
Disclosures: PRAIRI received no commercial funding. Dr. Gerlag is an employee of and shareholder in GlaxoSmithKline, but the company played no role in the study.
Investigational Wnt inhibitor shows promise in knee osteoarthritis
LONDON – Early clinical data show that a novel injectable drug holds promise for becoming the first disease-modifying osteoarthritis drug.
The results of a randomized, placebo-controlled, double-blind phase I trial involving 61 patients showed that a single intra-articular injection of SM04690 was associated with improved Western Ontario and McMaster Universities Arthritis Index (WOMAC) function and pain scores. The investigational drug also seemed to slow joint-space narrowing, compared with baseline values, with the suggestion that it may even increase joint space width.
However, those were exploratory efficacy analyses because the primary objective of the trial was to examine the safety of SM04690, a small molecule that inhibits the Wnt signaling pathway.
“The Wnt pathway has been implicated in the development of osteoarthritis [OA],” said Dr. Yusuf Yazici during a poster presentation at the European Congress of Rheumatology.
“Overactivity of Wnt signaling leads to stem cells constantly differentiating into osteoblasts, leading to osteophyte formation,” he explained, noting that Wnt signaling also stimulates the secretion of cartilage-destroying metalloproteases (Osteoarthritis Cartilage. 2012;20:162-71). “It has been very well established in the literature that if you could somehow turn that off, you could maybe improve some of the things that are happening in osteoarthritis.”
SM04690 works by “pushing the lineage fate of progenitor stem cells in the knee towards chondrocyte formation and away from osteoblast formation,” said Dr. Yazici of New York University Langone Medical Center, New York, and the chief medical officer of Samumed, the San Diego–based company developing the novel Wnt inhibitor.
He noted preclinical data had been presented orally at the EULAR congress showing that there was cartilage growth, suppressed protease production, and reduced proinflammatory cytokine (interleukin-6 and tumor necrosis factor–alpha) production.
The phase I data represent the first in-human results, with three doses of SM04690 evaluated (0.03 mg, 0.07 mg, and 0.23 mg) versus placebo in patients with moderate to severe symptomatic OA. For inclusion, patients had to have a WOMAC total score of between 36 and 72 and Kellgren-Lawrence (KL) grade 2 or 3 knee OA, and be willing to forgo pain medication for 24 hours prior to pain assessments being performed.
At baseline, the mean age of patients ranged from 60 to 64 years, their body mass index ranged from 28.7 to 31.4 kg/m², and 41%-69% had KL grade 3 knee OA.
In terms of safety, the primary objective of the trial, there were no reports of serious adverse events related to the study drug. One patient who reported increased knee pain and paroxysmal tachycardia 2 months after the injection was found to have a history of the tachycardia. After unblinding, none of the patients had detectable drug levels outside of the knee.
Overall, the number of adverse events was low and no different from placebo, Dr. Yazici said. The percentages of patients reporting an adverse event with the three ascending doses of SM04690 were 53%, 35%, and 44%, respectively, compared with 55% of those given placebo.
WOMAC function scores for the 0.03-mg dose changed by a mean of –18.4 at week 12 and –20.1 at week 24 from a baseline of 39.1; for 0.07 mg, by –19.5 at week 12 and –18.9 at week 24 from 37.5; for 0.23 mg, by –17.8 at week 12 and –12.4 at week 24 from 40.4; and for placebo, by –14.9 at week 12 and –16.0 at week 24 from 34.4.
WOMAC pain scores at baseline were a respective 10.8, 10.8, 11.4, and 9.9, and the mean changes at week 12 were –4.4, –5.8, –5.7, and –4.2. At week 24, the mean changes were –5.6, –5.3, –4.3, and –4.8.
Medial joint space width was a mean of 4.5, 3.72, 3.62, and 3.74 mm at baseline in the four treatment groups, with mean changes from baseline to 24 weeks of 0.00, 0.49, –0.15, and –0.33 for the 0.03-mg, 0.07-mg, and 0.23-mg SM04690 and placebo groups, respectively.
Although the trial was not powered to detect statistically significant differences between the active doses and placebo, there was an indication that more patients treated with SM04690 than with placebo achieved an OMERACT-OARSI strict response.
These data support the ongoing phase II trial that is being conducted in 455 patients, Dr. Yazici said. The results of that trial are expected around October 2016, in time for presentation at the annual meeting of the American College of Rheumatology.
LONDON – Early clinical data show that a novel injectable drug holds promise for becoming the first disease-modifying osteoarthritis drug.
The results of a randomized, placebo-controlled, double-blind phase I trial involving 61 patients showed that a single intra-articular injection of SM04690 was associated with improved Western Ontario and McMaster Universities Arthritis Index (WOMAC) function and pain scores. The investigational drug also seemed to slow joint-space narrowing, compared with baseline values, with the suggestion that it may even increase joint space width.
However, those were exploratory efficacy analyses because the primary objective of the trial was to examine the safety of SM04690, a small molecule that inhibits the Wnt signaling pathway.
“The Wnt pathway has been implicated in the development of osteoarthritis [OA],” said Dr. Yusuf Yazici during a poster presentation at the European Congress of Rheumatology.
“Overactivity of Wnt signaling leads to stem cells constantly differentiating into osteoblasts, leading to osteophyte formation,” he explained, noting that Wnt signaling also stimulates the secretion of cartilage-destroying metalloproteases (Osteoarthritis Cartilage. 2012;20:162-71). “It has been very well established in the literature that if you could somehow turn that off you, could maybe improve some of the things that are happening in osteoarthritis.”
SM04690 works by “pushing the lineage fate of progenitor stem cells in the knee towards chondrocyte formation and away from osteoblast formation,” said Dr. Yazici of New York University Langone Medical Center, New York, and the chief medical officer of Samumed, the San Diego–based company developing the novel Wnt inhibitor.
He noted preclinical data had been presented orally at the EULAR congress showing that there was cartilage growth, suppressed protease production, and reduced proinflammatory cytokine (interleukin-6 and tumor necrosis factor–alpha) production.
The phase I data represent the first in-human results, with three doses of SM04690 evaluated (0.03 mg, 0.07 mg, and 0.23 mg) versus placebo in patients with moderate to severe symptomatic OA. For inclusion, patients had to have a WOMAC total score of between 36 and 72 and Kellgren-Lawrence (KL) grade 2 or 3 knee OA, and be willing to forgo pain medication for 24 hours prior to pain assessments being performed.
At baseline, the mean age of patients ranged from 60 to 64 years, their body mass index ranged from 28.7 kg/m2 to 31.4 kg/m2, and 41%-69% had KL grade 3 knee OA.
In terms of safety, the primary objective of the trial, there were no reports of serious adverse events related to the study drug. One patient who had reported increased knee pain and paroxysmal tachycardia 2 months after the injection was found to have a history of the condition, and after unblinding, none of the patients had detectable drug levels outside of the knee.
Overall, the number of adverse events was low and no different from placebo, Dr. Yazici said. The percentage of patients reporting an adverse event with the three rising doses of SM04690 were 53%, 35%, and 44%, respectively, compared with 55% of those given placebo.
WOMAC function scores for the 0.03-mg dose declined by a mean of –18.4 at week 12 and by –20.1 at week 24 from a baseline of 39.1; for 0.07 mg, by –19.5 at week 12 and by –18.9 at 24 weeks from 37.5; for 0.23 mg, by –17.8 at week 12 and by –12.4 at week 24 from 40.4; and for placebo, by –14.9 at week 12 and by –16.0 at week 24 from 34.4.
WOMAC pain scores at baseline were a respective 10.8, 10.8, 11.4, and 9.9, and the mean changes at week 12 were –4.4, –5.8, –5.7, and –4.2. At week 24, the mean declines were –5.6, –5.3, –4.3, and –4.8.
Medial joint space width was a mean of 4.5, 3.72, 3.62, and 3.74 mm at baseline in the four treatment groups, with mean changes from baseline to 24 weeks of 0.00, 0.49, –0.15, and –0.33 for the 0.03-mg, 0.07-mg, and 0.23-mg SM04690 and placebo groups, respectively.
Although the trial was not powered to detect any statistically significant differences between the active treatment dose and placebo, there was an indication that more patients treated with SM04690 than with placebo were likely to achieve an OMERACT-OARSI strict response.
These data support the ongoing phase II trial that is being conducted in 455 patients, Dr. Yazici said. The results of that trial are expected around October 2016, which should be in time for their presentation at the annual meeting of the American College of Rheumatology.
LONDON – Early clinical data show that a novel injectable drug holds promise for becoming the first disease-modifying osteoarthritis drug.
The results of a randomized, placebo-controlled, double-blind phase I trial involving 61 patients showed that a single intra-articular injection of SM04690 was associated with improved Western Ontario and McMaster Universities Arthritis Index (WOMAC) function and pain scores. The investigational drug also seemed to slow joint-space narrowing, compared with baseline values, with the suggestion that it may even increase joint space width.
However, those were exploratory efficacy analyses because the primary objective of the trial was to examine the safety of SM04690, a small molecule that inhibits the Wnt signaling pathway.
“The Wnt pathway has been implicated in the development of osteoarthritis [OA],” said Dr. Yusuf Yazici during a poster presentation at the European Congress of Rheumatology.
“Overactivity of Wnt signaling leads to stem cells constantly differentiating into osteoblasts, leading to osteophyte formation,” he explained, noting that Wnt signaling also stimulates the secretion of cartilage-destroying metalloproteases (Osteoarthritis Cartilage. 2012;20:162-71). “It has been very well established in the literature that if you could somehow turn that off you, could maybe improve some of the things that are happening in osteoarthritis.”
SM04690 works by “pushing the lineage fate of progenitor stem cells in the knee towards chondrocyte formation and away from osteoblast formation,” said Dr. Yazici of New York University Langone Medical Center, New York, and the chief medical officer of Samumed, the San Diego–based company developing the novel Wnt inhibitor.
He noted preclinical data had been presented orally at the EULAR congress showing that there was cartilage growth, suppressed protease production, and reduced proinflammatory cytokine (interleukin-6 and tumor necrosis factor–alpha) production.
The phase I data represent the first in-human results, with three doses of SM04690 evaluated (0.03 mg, 0.07 mg, and 0.23 mg) versus placebo in patients with moderate to severe symptomatic OA. For inclusion, patients had to have a WOMAC total score of between 36 and 72 and Kellgren-Lawrence (KL) grade 2 or 3 knee OA, and be willing to forgo pain medication for 24 hours prior to pain assessments being performed.
At baseline, the mean age of patients ranged from 60 to 64 years, their body mass index ranged from 28.7 kg/m2 to 31.4 kg/m2, and 41%-69% had KL grade 3 knee OA.
In terms of safety, the primary objective of the trial, there were no reports of serious adverse events related to the study drug. One patient who had reported increased knee pain and paroxysmal tachycardia 2 months after the injection was found to have a history of the condition, and after unblinding, none of the patients had detectable drug levels outside of the knee.
Overall, the number of adverse events was low and no different from placebo, Dr. Yazici said. The percentage of patients reporting an adverse event with the three rising doses of SM04690 were 53%, 35%, and 44%, respectively, compared with 55% of those given placebo.
WOMAC function scores for the 0.03-mg dose fell by a mean of 18.4 points at week 12 and by 20.1 at week 24 from a baseline of 39.1; for 0.07 mg, by 19.5 at week 12 and 18.9 at week 24 from 37.5; for 0.23 mg, by 17.8 at week 12 and 12.4 at week 24 from 40.4; and for placebo, by 14.9 at week 12 and 16.0 at week 24 from 34.4.
WOMAC pain scores at baseline were a respective 10.8, 10.8, 11.4, and 9.9, and the mean declines at week 12 were 4.4, 5.8, 5.7, and 4.2. At week 24, the mean declines were 5.6, 5.3, 4.3, and 4.8.
Medial joint space width was a mean of 4.5, 3.72, 3.62, and 3.74 mm at baseline in the four treatment groups, with mean changes from baseline to 24 weeks of 0.00, 0.49, –0.15, and –0.33 for the 0.03-mg, 0.07-mg, and 0.23-mg SM04690 and placebo groups, respectively.
Although the trial was not powered to detect statistically significant differences between the active doses and placebo, there was an indication that more patients treated with SM04690 than with placebo achieved an OMERACT-OARSI strict response.
These data support the ongoing phase II trial that is being conducted in 455 patients, Dr. Yazici said. The results of that trial are expected around October 2016, in time for presentation at the annual meeting of the American College of Rheumatology.
AT THE EULAR 2016 CONGRESS
Key clinical point: Early clinical data show that a novel injectable drug holds promise for becoming the first disease-modifying osteoarthritis drug.
Major finding: SM04690 was well tolerated, and exploratory efficacy analyses showed improved function, pain, and joint space width.
Data source: A multicenter, randomized, placebo-controlled, double-blind phase I trial involving 61 patients with knee osteoarthritis.
Disclosures: Dr. Yazici is chief medical officer of Samumed, the company that funded the study.
Long-term metformin use protective against neurodegenerative disease
NEW ORLEANS – Use of metformin for at least 2 years was associated with a reduced incidence of neurodegenerative disease among elderly veterans, according to results from a large analysis of Veterans Affairs data.
At the annual scientific sessions of the American Diabetes Association, lead study author Qian Shi said that according to the current medical literature, diabetes increases one’s risk of Alzheimer’s disease (by 1.46- to 1.56-fold), all types of dementia (by 1.51- to 1.73-fold), vascular dementia (by 2.27- to 2.48-fold), and mild cognitive impairment (by 1.21-fold). “Metformin can cross the blood-brain barrier having specific effects on the central nervous system. But the exact mechanism and sites of its action remain unknown, and there are conflicting results,” said Ms. Shi, a PhD candidate in the department of global health policy and management at Tulane University School of Public Health and Tropical Medicine, New Orleans.
In an effort to examine the impact of receiving metformin treatment on the incidence of neurodegenerative disease and the association between length of metformin exposure and the risk of neurodegenerative diseases, the researchers used the Veterans Affairs database from 2004 to 2010 to study 6,046 patients who were at least 50 years of age with type 2 diabetes mellitus and were receiving long-term insulin treatment.
The length of metformin exposure was categorized by exposure years over the study period from baseline to the time of the first diagnosis of neurodegenerative disease, which included Alzheimer’s disease, Parkinson’s disease, Huntington’s disease, dementia, and cognitive impairment. The five categories of metformin exposure time were no metformin treatment, less than 1 year, 1-2 years, 2-4 years, and 4 years or more. The mean age of patients was 63 years, 98% were male, and they were followed for a median of 5.3 years.
Of the 6,046 patients, 433 developed neurodegenerative disease during the study period, primarily dementia (334 cases). Other diagnoses included Parkinson’s disease (100 cases), Alzheimer’s disease (71 cases), and cognitive impairment (19 cases).
Ms. Shi reported that the adjusted incidence rates of neurodegenerative disease by cohort were 2.08 cases per 100 person-years for those who received no metformin treatment, 2.47 per 100 person-years for those treated with metformin for less than 1 year, 1.61 per 100 person-years for those treated 1-2 years, 1.30 per 100 person-years for those treated 2-4 years, and 0.49 cases per 100 person-years for those treated 4 years or more. The longer patients took metformin, the less likely they were to develop neurodegenerative disease, she said.
When comparing patients who received metformin treatment with those who did not on Cox regression analysis, the hazard ratio was 0.686 for neurodegenerative disease, 0.644 for dementia, and 0.611 for Parkinson’s disease. The risk reduction was not as robust for those with Alzheimer’s disease and cognitive impairment, most likely because of the limited number of cases, Ms. Shi said. Renal disease had no significant association with the risk of neurodegenerative disease, and it was balanced across metformin exposure groups.
She acknowledged certain limitations of the study, including its retrospective design, the high proportion of males, and the fact that data on diabetes duration and serum vitamin B level were not available.
The researchers reported having no relevant financial disclosures.
AT THE ADA ANNUAL SCIENTIFIC SESSIONS
Key clinical point: Use of metformin for at least 2 years was protective against the onset of neurodegenerative disease.
Major finding: The adjusted incidence rates of neurodegenerative disease ranged from 2.47 cases per 100 person-years for those treated with metformin for less than 1 year to 0.49 cases per 100 person-years for those treated for 4 years or longer.
Data source: A longitudinal study of 6,046 patients at least 50 years of age with type 2 diabetes mellitus who were receiving long-term insulin treatment.
Disclosures: The researchers reported having no relevant financial disclosures.
The balm of reassurance
A toddler has had several brief episodes of mild perioral cyanosis noticed at day care. The parents see the primary care provider. The exam is normal. The child is admitted for a work-up. The CBC and comprehensive metabolic profile are unremarkable. The chest x-ray is normal. An ECG is normal. An echocardiogram is normal. The EEG is normal. Now what?
I was taught that an uncommon presentation of a common disease is still more common than a common presentation of an uncommon disease. Or simply, odd-looking horses are more common than zebras unless you practice in the savanna. There is a point in any safari at which you have to decide whether you are hunting a zebra or chasing a shadow. Clinical judgment is balancing the risk of missing something preventable that will actually harm the child against the harms of more tests.
The modern hospital has an array of equipment available. They are Greek Sirens calling to us. There is the video EEG room, the MR angiogram, the cardiac cath lab, and an endless list of blood tests. We are not even stopped by the walls of the hospital. We can order a follow-up ambulatory Holter ECG to search for intermittent arrhythmias. But are these Sirens really good medicine? At what point should we simply reassure the parents that the child is fine?
Physicians all worry about missing something. This fear was instilled when we were medical students and reinforced with the stress of residency. With years of experience, all physicians acquire a list of missed diagnoses. But I also have collected a list of times when the diagnostic tests themselves have caused harm, including death. I have a list of cases where nonspecific diagnostic testing has mislabeled a child with an obscure diagnosis that was later proven false, but not before harm was caused. There were patients with Stevens-Johnson syndrome who suffered serious harm from treatments for minor illnesses. Then there were the terrified families who, after extensive testing, became convinced that the child must have some horrible unknown disease because surely we wouldn’t have traumatized the child with all this work-up if there wasn’t really something seriously wrong. Each new test stoked their fear rather than soothed it.
A careful history is still the weapon of choice in the zebra hunt. On a first presentation of mild cyanosis, sepsis is the charging rhinoceros of preventable harm that will run you over if you are too slow to react. But this child has now had several episodes that have occurred: 1. in multiple settings, 2. with no distress, 3. while the child remained playful, and 4. that were self-limited. That history is incompatible with sepsis, so reflexively ordering a blood culture is an illogical choice. The history should be progressively explored using the differential diagnosis and an organ-based systematic approach to guide it. The thoroughness and thoughtfulness I put into the history taking can be key to finding the correct diagnosis. They also are a means of building trust and rapport with the parents. That will be important later if no definitive diagnosis is found.
Unclear and unusual presentations may merit a consult. The cardiologist knows the limits of an echo. Along with the technical expertise comes a new set of eyes and the additional perspective of a second opinion. It is great when a colleague can tell you that he or she too had a case like this years ago that was never solved, but did resolve on its own.
One advantage of being an office-based pediatrician, with an established relationship with a family over several years and a couple other children, is that parents do value and trust your clinical judgment. As a pediatrician, I know the most common product I sell is reassurance. It is not snake oil. There is a bedside manner in selling it. Put a positive spin on all the negative tests. Indicate that you and the parents can be vigilant in watching for new signs. Instruct the parents to bring the child back if the events are distressing for the child, which could justify more invasive testing. Arrange to recheck the child in the office in a week, then a month, then at the next well visit. Parents know that medicine isn’t perfect. Humans deal with fear and uncertainty better by knowing we aren’t facing the future alone. And that is why even with all this technology, I still lay a stethoscope on every child.
Dr. Powell is a pediatric hospitalist and clinical ethics consultant living in St. Louis. Dr. Powell said he had no relevant financial disclosures.
Finding Synchronous Cancers
Up to 6% of patients with head and neck squamous cell carcinoma (SCC) also have synchronous second primary cancers (SPCs). However, the synchronous cancers may be missed in a usual examination that relies on CT and MRI scans.
Clinicians from Odense University Hospital in Denmark report on a patient who presented with only tongue pain as a symptom but was found to have 4 SPCs. The CT and MRI results were inconclusive due to artifacts from metal dental fillings. However, a positron emission tomography (PET)-CT scan “easily revealed” the 3 coinciding malignancies because of their increased metabolic activity, the authors say.
Their patient had 4 primary cancers: 1 SCC on the left side of the tongue, 1 in the fold between the tongue and the floor of the mouth (the 2 tumors were near each other but separate entities), a third SCC in the right aryepiglottic fold, and a grade 2 follicular lymphoma diagnosed “by coincidence” in the lymph nodes of the neck.
The 3 SCCs in the upper aerodigestive tract were in line with the concept of field cancerization, the clinicians note. Multiple adjacent but independent tumors in the mucosa may arise from exposure to carcinogens, which can induce dysplastic changes that lead to malignancy. Moreover, although synchronous cancer of the head and neck regions and follicular lymphoma are rare, one of the potential risk factors for follicular lymphoma is smoking, the authors say. Their patient had been a smoker for 56 years.
The authors recommend a “more liberal approach” to examination and a “generous use” of PET-CT for patients with malignancies of the head and neck regions, particularly in patients with obvious risk factors, such as a long history of smoking or alcohol abuse. They add that PET-CT is also a useful tool in assessing tumor dissemination and prognosis of individual carcinomas—an important benefit in planning different treatments.
Source:
Heidemann LN, Johansen J, Larsen SR, Sørensen JA. BMJ Case Rep. 2016;pii:bcr2015214047. doi:10.1136/bcr-2015-214047.
The promise of peanut allergy prevention lies in draft guidelines
Updated guidelines from the National Institute of Allergy and Infectious Diseases for the early introduction of peanut-containing foods to children at increased risk for peanut allergies are on the horizon, pending final approval.
“Two studies recently showed that infants at high risk of developing peanut allergy [infants with egg allergy and/or severe eczema] were much less likely to have peanut allergy at age 5 years if they were able to incorporate peanut regularly into the diet between 4 and 11 months of age,” said Dr. Scott H. Sicherer, the Elliot and Roslyn Jaffe Professor of Pediatrics, Allergy and Immunology, and chief of the division of allergy and immunology in the department of pediatrics at the Icahn School of Medicine at Mount Sinai, New York.
“However, adding peanut to the diet at this age requires caution because these infants may already be allergic to peanut, and so allergy testing and care in adding peanut to the diet with medical supervision is needed in this high-risk group,” noted Dr. Sicherer, a member of the expert panel that worked on the guidelines.
The draft guidelines include 43 clinical recommendations for the diagnosis and management of food allergies in children, according to the NIAID website. In particular, the draft guidelines recommend introducing peanut-containing foods to infants aged 4-6 months who are at increased risk for peanut allergy because of severe eczema and/or egg allergies, after an evaluation with skin prick testing or peanut-specific IgE testing.
“Peanut allergy is relatively common and often persistent, and so a strategy that could prevent the allergy is very important,” Dr. Sicherer said in an interview. “However, peanut can be a choking hazard as peanuts or peanut butter, and so families should talk to their pediatrician about how and when to incorporate peanut into the diet, and whether allergy testing and referral to an allergist is needed.”
Support for the guidelines comes from several large studies with promising results, notably the LEAP (Learning Early About Peanut Allergy) trial. A recent extension of that study, known as LEAP-On (Persistence of Oral Tolerance to Peanut), showed in 550 children that the protection conferred by regular consumption of peanut-containing foods from infancy to age 5 years persisted even after peanut consumption was discontinued for a 1-year period (N Engl J Med. 2016 Apr 14;374:1435-43).
In the original LEAP study, 640 infants aged 4-11 months with severe eczema, egg allergy, or both were randomized to dietary peanut consumption or avoidance (N Engl J Med. 2015 Feb 26;372[9]:803-13). The prevalence of peanut allergy at 5 years of age was approximately 2% in the peanut-consumption group, compared with 14% in the peanut-avoidance group.
Another significant randomized trial, the EAT (Enquiring About Tolerance) study, tested not only peanut but also the early introduction of cooked egg, cow’s milk, sesame, wheat, and fish in 1,303 infants aged 3 months and older from the general population. The study’s strict protocol made adherence difficult, but researchers found a significant 67% reduction in the prevalence of food allergies at age 3 years among the children who followed the protocol, compared with controls, with relative risk reductions of 100% and 75%, respectively, for peanut and egg allergies (N Engl J Med. 2016 May 5;374:1733-43).
The next steps for research should make early introduction of peanut-containing foods even more effective at allergy prevention, Dr. Sicherer noted.
“We need to learn more about how much peanut should be incorporated into the diet, how long the protein has to be kept in the diet to have the best preventative effect, and whether this strategy applies to other foods,” he said.