Begin Your Journey as an SHM Ambassador

Beginning March 1 and running through December 31, 2016, all active SHM members can earn 2017–2018 dues credits and special recognition for recruiting new physician, physician assistant, nurse practitioner, pharmacist, or affiliate members. Active members will be eligible for:

  • A $35 credit toward 2017–2018 dues when recruiting 1 new member
  • A $50 credit toward 2017–2018 dues when recruiting 2–4 new members
  • A $75 credit toward 2017–2018 dues when recruiting 5–9 new members
  • A $125 credit toward 2017–2018 dues when recruiting 10+ new members

For each member recruited, individuals will receive one entry into a grand-prize drawing for complimentary registration to HM17 in Las Vegas.

To count as a referral, the new member must:

  • Be a brand-new member to SHM (past members whose membership has lapsed do not qualify)
  • Register as a physician, physician assistant, nurse practitioner, pharmacist, or affiliate member
  • Include an active member’s name in the “referred by” field on a printed application or the online join form
  • Join between March 1, 2016, and December 31, 2016

Referrals attributed to free memberships received as a result of HM17 registrations do not qualify for dues credits under this program.

Begin your journey as an SHM Ambassador today at www.hospitalmedicine.org/MAP.

FDA calls hospital-based Zika test ‘high risk’

Blood sample (Photo by Juan D. Alfonso)

The US Food and Drug Administration (FDA) has deemed a hospital-based test for the Zika virus “high risk,” as the test has not been cleared by the FDA.

The test was developed by scientists at Texas Children’s Hospital and Houston Methodist Hospital. It has been available at both hospitals since last month.

The FDA has requested more information on the test but has not asked the hospitals to stop using it.

According to the hospitals, the test detects the Zika virus by identifying virus-specific RNA sequences. It can distinguish Zika infection from dengue, West Nile, or chikungunya infections, and it can be performed on blood, amniotic fluid, urine, or spinal fluid.

In a letter to the hospitals, the FDA said the test appears to meet the definition of a device, as defined in section 201(h) of the Federal Food, Drug, and Cosmetic Act. Yet the test has not received premarket clearance, approval, or Emergency Use Authorization from the FDA.

Therefore, the FDA has asked for information on the test’s design, validation, and performance characteristics. The agency said the Centers for Disease Control and Prevention (CDC) and the Centers for Medicare & Medicaid Services asked the FDA to review the science behind the test.

The FDA has not asked the hospitals to stop using the test while the review is underway, according to a statement from Texas Children’s Hospital.

Nevertheless, the Association for Molecular Pathology (AMP) said it is “concerned and disappointed” to see the FDA taking enforcement action regarding this test. The AMP said these types of tests are critical for patient care and should be made available to patients in need.

In fact, the AMP said this is an example of how FDA regulation of laboratory-developed procedures would hinder patient access to vital medical services. That’s because the FDA’s Emergency Use Authorization for antibody testing at the CDC or state public health labs does not provide results quickly enough for immediate patient care.

The FDA recently issued Emergency Use Authorization for the Zika IgM Antibody Capture Enzyme-Linked Immunosorbent Assay (Zika MAC-ELISA), which was developed by the CDC.

The test was distributed to labs in the US and abroad, but it was not made available in US hospitals or other primary care settings.

Severe hemophilia still tough to manage, study shows

Antihemophilic factor

A large study suggests that, despite treatment advances, men with severe hemophilia have not seen great reductions in bleeding events.

Data on more than 7000 men with hemophilia revealed substantial differences in the care received by men born before 1958 and those born between 1983 and 1992.

However, the data also showed that frequent bleeding was common in patients with severe hemophilia, regardless of when they were born.

These data were published in Blood.

“Our analysis provides a snapshot of how improvements in care have translated into outcomes across different generations of men with hemophilia,” said study author Paul E. Monahan, MD, of the University of North Carolina at Chapel Hill.

“While there is reason to be pleased with the progress we’ve made, our data show some surprising deficits and suggest that efforts are needed to more consistently apply the integration of standard-of-care multidisciplinary services and preventive blood clotting factor treatments to further normalize the lives of men living with hemophilia.”

Dr Monahan and his colleagues analyzed data on 7486 men—4899 with severe hemophilia (65.4%), 2587 with mild hemophilia (34.6%), 6094 with hemophilia A (81.4%), and 1392 with hemophilia B (18.6%).

The data were collected prospectively by the US Centers for Disease Control and Prevention and 130 federally supported Hemophilia Treatment Centers (HTCs) between 1998 and 2011. This represents the largest database of men living with hemophilia.

The researchers grouped the men into 4 eras (birth cohorts) to evaluate how outcomes—access to care, physical and social functioning, complications, and mortality—have changed over the last 50 years.

The cohorts were as follows (see the classification sketch after the list):

  • Era A: born before 1958 (median age 58)
  • Era B: born between 1958 and 1975 (median age 40)
  • Era C: born between 1976 and 1982 (median age 28)
  • Era D: born between 1983 and 1992 (median age 21).
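
To make the grouping concrete, here is a minimal sketch of how a birth year maps onto these eras. The cutoff years come from the list above; the function name and output labels are hypothetical illustrations, not part of the study’s own analysis code.

```python
def assign_era(birth_year: int) -> str:
    """Assign a birth cohort ('era') using the cutoffs listed above.

    Cutoff years follow the study description; the function itself is
    a hypothetical illustration, not the authors' code.
    """
    if birth_year < 1958:
        return "A"   # born before 1958
    elif birth_year <= 1975:
        return "B"   # born 1958-1975
    elif birth_year <= 1982:
        return "C"   # born 1976-1982
    elif birth_year <= 1992:
        return "D"   # born 1983-1992
    else:
        return "outside study range"

# Example: a man born in 1980 falls into Era C.
print(assign_era(1980))  # -> "C"
```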

Access to care

The researchers found that the proportion of men who started home infusions before age 6 was far greater in Era D than Era A—50.4% and 2%, respectively. And the proportion of patients reporting a first HTC visit before age 2 rose more than 10-fold from Era A to Era D—8.8% and 100%, respectively.

In addition, the use of a continuous prophylactic regimen was nearly 3-fold greater in Era D than Era A—46.7% and 16%, respectively.

However, patients in the youngest 2 birth cohorts were more likely than their older counterparts to be uninsured. The proportion of uninsured patients was 16.4% in Era D, 20.5% in Era C, 11.1% in Era B, and 5.7% in Era A.

Bleeding events

The proportion of patients reporting frequent bleeds decreased from Era A to Era D. However, frequent bleeding was common in men with severe hemophilia regardless of when they were born.

Even men from Era D—who had access to effective, safe clotting therapies and multidisciplinary care throughout their lifetimes—reported frequent bleeds.

The proportion of patients with severe hemophilia reporting 5 or more joint bleeds in the last 6 months was 42.6% in Era A and 35.5% in Era D. The proportion of patients with a joint affected by recurrent bleeding was 32.6% and 24.9%, respectively.

Functioning

The researchers also discovered that patients with severe hemophilia were 3 times as likely to report limitations in their activities or to be disabled, when compared to patients with mild hemophilia, regardless of when they were born.

Still, men from Era A were more likely to report limitations in their overall activity level—68.8% of severe hemophilia patients and 21.1% of mild hemophilia patients—than men from Era D—14.9% of severe hemophilia patients and 4.3% of mild hemophilia patients.

Men with severe hemophilia were more likely than men with mild hemophilia to report missing more than 10 days of work or school during the previous year. In Era A, the proportions were 6.9% and 2.6%, respectively. In Era D, the proportions were 5.6% and 3%, respectively.

“Clear disparities remain in terms of frequent bleeding and disability between men with severe hemophilia and mild hemophilia across every decade of adult life,” Dr Monahan said.

“We thought the difference in functional outcomes would have narrowed over the years. That is, men with severe hemophilia should look more like those with mild disorder, given improved therapeutics and access to care, but this wasn’t the case.”

“What needs examination is why, despite widespread availability of preventive and on-demand therapies for home use, we still see disparities. It speaks to the need for continued disease surveillance to monitor and inform hemophilia interventions and outcomes.”

Mortality

There were 551 deaths during the study period. The Era A and B cohorts accounted for 82% of the deaths in the severe hemophilia population and 96% of the deaths in the mild hemophilia population.

The researchers noted that liver failure has surpassed bleeding issues and HIV as the leading cause of death among US men with hemophilia.

Although there were no liver-related deaths in the 2 youngest cohorts, liver failure was the most commonly reported cause of death overall, for both severe hemophilia (33% of deaths) and mild hemophilia (26% of deaths).

The researchers said this finding underscores the need to swiftly evaluate and treat hepatitis C virus (HCV) infections.

“Liver disease worsens bleeding, so eradicating hepatitis C infections needs to be a priority, especially as we now have remarkably effective therapies,” Dr Monahan said.

Across all the birth cohorts, hemophilia-related deaths accounted for 14.6% of deaths in patients with severe hemophilia and 10.7% of deaths in those with mild hemophilia.

Financial burdens reduce QOL for cancer survivors

Cancer patient receiving treatment (Photo by Rhoda Baer)

An analysis of nearly 20 million cancer survivors showed that almost 29% had financial burdens as a result of their cancer diagnosis and/or treatment.

In other words, they borrowed money, declared bankruptcy, worried about paying large medical bills, were unable to cover the cost of medical visits, or made other financial sacrifices.

Furthermore, such hardships could have lasting effects on a cancer survivor’s quality of life (QOL).

Hrishikesh Kale and Norman Carroll, PhD, both of Virginia Commonwealth University School of Pharmacy in Richmond, reported these findings in Cancer.

The pair analyzed 2011 Medical Expenditure Panel Survey data on 19.6 million cancer survivors, assessing financial burden and QOL.

Subjects were considered to have a financial burden if they reported at least 1 of the following problems: borrowing money or declaring bankruptcy, worrying about paying large medical bills, being unable to cover the cost of medical care visits, or making other financial sacrifices.

Nearly 29% of the cancer survivors reported at least 1 financial problem resulting from cancer diagnosis, treatment, or lasting effects of that treatment.

Of all the cancer survivors in the analysis, 20.9% worried about paying large medical bills, 11.5% were unable to cover the cost of medical care visits, 7.6% reported borrowing money or going into debt, 1.4% declared bankruptcy, and 8.6% reported other financial sacrifices.

Cancer survivors who faced such financial difficulties had lower physical and mental health-related QOL and a higher risk for depressed mood and psychological distress, and they were more likely to worry about cancer recurrence than survivors who did not face financial problems.

In addition, as the number of financial problems reported by cancer survivors increased, their QOL continued to decrease. And their risk for depressed mood, psychological distress, and worries about cancer recurrence continued to increase.

“Our results suggest that policies and practices that minimize cancer patients’ out-of-pocket costs can improve survivors’ health-related quality of life and psychological health,” Dr Carroll said.

“Reducing the financial burden of cancer care requires integrated efforts, and the study findings are useful for survivorship care programs, oncologists, payers, pharmaceutical companies, and patients and their family members.”

Team identifies gaps in anticoagulant use

Bottles of warfarin (Photo courtesy of NIGMS)

Many patients with atrial fibrillation (AF) who are at the highest risk of stroke are not receiving the recommended oral anticoagulant therapy, according to a new study.

Researchers did find that AF patients who had a higher risk of stroke according to CHADS2 score or CHA2DS2-VASc score were more likely to receive an oral anticoagulant.

But the prevalence of oral anticoagulant use did not exceed 50%, even among high-risk patients.

Jonathan C. Hsu, MD, of the University of California, San Diego, and his colleagues reported these findings in JAMA Cardiology.

The researchers said it has not been clear if the prescription of oral anticoagulants increases as the risk of stroke increases in AF patients.

To gain some insight, the team studied 429,417 outpatients with AF enrolled in the American College of Cardiology National Cardiovascular Data Registry’s PINNACLE Registry between January 2008 and December 2012.

The researchers calculated the CHADS2 score and the CHA2DS2-VASc score for all patients and examined the association between increased stroke risk score and prescription of an oral anticoagulant.
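
For readers less familiar with these instruments, the sketch below tallies a CHA2DS2-VASc score from its standard published components and shows one way to read the per-point odds ratio reported below. The function and the example patient are hypothetical illustrations, not part of the PINNACLE registry analysis, and the odds calculation assumes the per-point association compounds multiplicatively, as in a standard logistic model.

```python
def cha2ds2_vasc(age: int, female: bool, chf: bool, hypertension: bool,
                 diabetes: bool, stroke_tia: bool, vascular_disease: bool) -> int:
    """Tally a CHA2DS2-VASc stroke-risk score from its standard components."""
    score = 0
    score += 1 if chf else 0               # C: congestive heart failure
    score += 1 if hypertension else 0      # H: hypertension
    score += 2 if age >= 75 else 0         # A2: age 75 or older
    score += 1 if diabetes else 0          # D: diabetes mellitus
    score += 2 if stroke_tia else 0        # S2: prior stroke/TIA/thromboembolism
    score += 1 if vascular_disease else 0  # V: vascular disease
    score += 1 if 65 <= age <= 74 else 0   # A: age 65-74
    score += 1 if female else 0            # Sc: sex category (female)
    return score

# Hypothetical patient: a 78-year-old woman with hypertension and diabetes
# scores 2 (age) + 1 (sex) + 1 (hypertension) + 1 (diabetes) = 5.
score = cha2ds2_vasc(age=78, female=True, chf=False, hypertension=True,
                     diabetes=True, stroke_tia=False, vascular_disease=False)
print(score)  # -> 5

# Using the study's adjusted odds ratio of 1.163 per 1-point increase
# (reported below), her odds of an oral anticoagulant prescription versus
# aspirin only are multiplied by about 1.163 ** 5, roughly 2.1, relative
# to a score of 0.
print(round(1.163 ** 5, 2))  # -> 2.13
```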

In the entire cohort, 44.9% of patients were prescribed an oral anticoagulant, 25.9% aspirin only, 5.5% aspirin plus a thienopyridine, and 23.8% no antithrombotic therapy.

The researchers found that each 1-point increase in risk score was associated with increased odds of oral anticoagulant prescription when compared with aspirin-only prescription.

When using CHADS2 score, the adjusted odds ratio was 1.158 (95% CI, 1.144-1.172, P<0.001). When using CHA2DS2-VASc score, the adjusted odds ratio was 1.163 (95% CI, 1.157-1.169, P<0.001).

Still, the researchers said they observed a plateau in oral anticoagulant prescription.

The prevalence of oral anticoagulant prescription did not exceed 50%, even in patients with a CHADS2 score exceeding 3 or a CHA2DS2-VASc score exceeding 4.

Dr Hsu and his colleagues said these findings draw attention to important gaps in the appropriate treatment of patients with AF at the highest risk of stroke and highlight opportunities to understand the reasons behind these gaps.

Aesthetic Dermatology: Eyelash extensions

The obsession with longer, fuller, darker eyelashes has become a mainstay in our culture – initially with the ever-growing options of mascaras and glue-on eyelashes, and now with longer-lasting options, including eyelash extensions (semipermanent eyelashes) and topical eyelash growth enhancers (such as bimatoprost).

Eyelash extensions are not the same as the glue-on strip or individual lashes bought at the drugstore or makeup counter, which last 1-2 days. These are silk, mink, or synthetic poly-nylon lashes that typically last approximately 4 weeks, with refills often required at 2- to 4-week intervals as the natural eyelashes shed. They are adhered to the person’s natural eyelashes via an adhesive bonding process that can take 1-2 hours for the initial application. Generally, a single lash is applied to each natural lash.

When applied properly, neither the extension eyelash nor the glue should touch the eyelid. The bond is designed to last until the lashes naturally fall out, although the extensions may fall out faster if one uses oil-based eye-makeup remover or rubs the eyes regularly, as oil weakens the bond between the glue and the lash. Eyelash extensions are waterproof and give the appearance of wearing mascara without it. In the United States, eyelash extension services range from $100 to $500 for the initial application, with lower costs for refills. Lash extensions are popular for special occasions and vacations, and even more so now for everyday wear.

Potential adverse effects of eyelash extensions include ocular hyperemia, keratoconjunctivitis, allergic blepharitis, and allergic contact dermatitis in the patient. Keratoconjunctivitis is thought to be due to formaldehyde contained in some of the glues used for application.1 Eyelash extensions have also been associated with occupational allergic contact dermatitis, allergic rhinitis, and occupational asthma in the practitioner applying the eyelash extensions, particularly with the cyanoacrylate-based glues.2,3

In a national survey of eyelash extensions and their health-related problems in Japan, 10% (205) of respondents had experience with eyelash extensions. Of those women, 27% (55) experienced problems that included ocular hyperemia, pain, and itchy, swollen eyelids.4 Conjunctival erosion from the eyelid-fixing tape used during application and subconjunctival hemorrhage from compression during removal of the extensions have also been reported.1 Hair breakage and even traction alopecia may occur, especially in patients who accidentally or intentionally pull the extensions off.

If permanent eyelash damage occurs, eyelash transplantation may be required to replace the eyelash, as eyelash growth medications such as bimatoprost may not be effective if the follicle is missing or severely damaged. Eyelash transplants often grow long enough that they require trimming, especially if donor sites are taken from the scalp.5

Eyelash extensions offer a nice alternative to daily use of mascara, temporary glue-on eyelashes, and daily application of topical eyelash growth products. As this procedure becomes more common, dermatologists may be consulted for recommendations and for treatment of any potential adverse events associated with it.

References

1. Cornea. 2012 Feb;31(2):121-5.

2. Contact Dermatitis. 2012 Nov;67(5):307-8.

3. Occup Med (Lond). 2013 Jun;63(4):294-7.

4. Nihon Eiseigaku Zasshi. 2013;68(3):168-74.

5. Plast Reconstr Surg Glob Open. 2015 Apr 7;3(3):e324.

Dr. Wesley and Dr. Talakoub are co-contributors to this column. Dr. Talakoub is in private practice in McLean, Va. Dr. Wesley practices dermatology in Beverly Hills, Calif. This month’s column is by Dr. Wesley.

CV health may prevent cognitive decline

The closer that older adults come to meeting the American Heart Association’s “ideal” targets for seven factors that determine cardiovascular health, the lower their risk for cognitive decline, according to a report published online March 16 in the Journal of the American Heart Association.

A secondary analysis of data from a prospective population-based cohort study of stroke risk demonstrated that better alignment with the AHA’s “Life’s Simple 7” cardiovascular health metrics correlated with less decline in mental processing speed, and, to a lesser extent, in executive function and episodic memory. “The results of this study suggest that achievement of the AHA’s ideal cardiovascular health metrics may have benefits for brain health, in addition to preventing strokes and myocardial infarctions ... underscoring the importance of public health initiatives aimed to better control these seven factors,” said Hannah Gardener, Sc.D., of the department of neurology, University of Miami, and her associates.

The AHA recently defined ideal target levels for seven modifiable cardiovascular (CV) risk factors: smoking status, body mass index, physical activity level, diet, blood pressure, total cholesterol level, and fasting glucose level. Meeting or closely approaching these ideals has already been linked to a decreased risk of stroke and MI. To examine a possible association with brain health, Dr. Gardener and her colleagues assessed these seven metrics in an ethnically diverse cohort of 722 participants aged 50 years and older in the Northern Manhattan Study who underwent serial comprehensive neuropsychological testing as well as brain MRI.
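
To make the composite metric concrete, here is a minimal sketch, assuming approximate ideal cutoffs as commonly cited for Life’s Simple 7 (the study’s exact operational definitions may differ), that counts how many of the seven factors a single hypothetical participant meets; the field names and example values are invented for illustration.

```python
# Hypothetical sketch: count how many of the AHA "Life's Simple 7" factors meet
# an approximately "ideal" threshold for one participant. The cutoffs are
# commonly cited approximations, not the study's exact operational criteria.

def count_ideal_factors(p):
    checks = [
        not p["current_smoker"],                                    # ideal: not smoking
        p["bmi"] < 25,                                              # kg/m^2
        p["moderate_activity_min_wk"] >= 150
        or p["vigorous_activity_min_wk"] >= 75,                     # physical activity
        p["diet_components_met"] >= 4,                              # of 5 diet components
        p["total_cholesterol"] < 200 and not p["lipid_treatment"],  # mg/dL, untreated
        p["systolic_bp"] < 120 and p["diastolic_bp"] < 80
        and not p["bp_treatment"],                                  # mm Hg, untreated
        p["fasting_glucose"] < 100 and not p["glucose_treatment"],  # mg/dL, untreated
    ]
    return sum(checks)

participant = {
    "current_smoker": False, "bmi": 27.4,
    "moderate_activity_min_wk": 180, "vigorous_activity_min_wk": 0,
    "diet_components_met": 3,
    "total_cholesterol": 185, "lipid_treatment": False,
    "systolic_bp": 118, "diastolic_bp": 76, "bp_treatment": False,
    "fasting_glucose": 104, "glucose_treatment": False,
}

print(count_ideal_factors(participant))  # 4 of 7 factors at ideal levels
```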

Of the total cohort, 3% had zero ideal factors, 15% had one factor, 33% had two factors, 30% had three factors, 14% had four factors, 14% had five factors, 1% had six factors, and none had all seven factors.

“An increasing number of ideal cardiovascular health factors was positively associated with processing speed,” and the association was particularly strong for three of the factors: ideal body mass index, lack of smoking, and ideal fasting glucose level. This association persisted when the data were adjusted to account for MRI markers of subclinical vascular damage, such as abnormalities in white matter volume, brain atrophy, and previous infarctions. A similar but less strong association was seen between an increasing number of ideal cardiovascular health factors and performance on measures of episodic memory and executive function.

These seven CV factors also were associated with less decline over time in these three areas of cognitive function. In contrast, the CV factors showed no association with measures of semantic memory, the investigators said (J Am Heart Assoc. 2016 Mar 16).

The associations remained unchanged in sensitivity analyses that controlled for the presence and severity of depression.

“The results of our study add to a growing body of literature suggesting the effects of smoking and blood glucose levels on cognitive health in particular,” and support the role of vascular damage and metabolic processes in the etiology of cognitive aging and dementia, they added.

Vitals

Key clinical point: The closer adults come to meeting “ideal” American Heart Association targets for seven factors related to cardiovascular health, the lower their risk for cognitive decline.

Major finding: An increasing number of the seven ideal cardiovascular health factors was positively associated with mental processing speed.

Data source: A secondary analysis of data from the Northern Manhattan Study, a prospective population-based cohort study of stroke risk, involving 722 people aged 50 years and older at baseline in 1993-2001.

Disclosures: This study was funded by the Evelyn F. McKnight Brain Institute and the National Institutes of Health. Dr. Gardener and her associates reported having no relevant financial disclosures.

Receiving the Flu Vaccine While at the Hospital Does Not Increase Adverse Effects

NEW YORK (Reuters Health) - Receiving the seasonal flu vaccine while in the hospital does not increase surgical patients' health care utilization or their likelihood of being evaluated for infection after discharge, according to a new retrospective cohort study.

The Advisory Committee on Immunization Practices recommends that hospitalized patients who are eligible for the flu vaccine receive it before discharge, but rates of vaccination remain low in surgical patients, Dr. Sara Tartof of Kaiser Permanente Southern California in Pasadena and her colleagues note in their report, published online March 14 in the Annals of Internal Medicine.

This could be due to surgeons' concerns that adverse effects of influenza vaccine such as myalgia or fever could be attributed to surgical complications, or could complicate post-surgical care, they add.

"When we searched in the literature, we really just couldn't find any data that really speak to this question," Dr. Tartof told Reuters Health in a telephone interview.

She and her colleagues looked at Kaiser Permanente Southern California patients aged six months or older who had inpatient surgery between September 2010 and March 2013. Of the 42,777 surgeries in their analysis, 6,420 included seasonal flu vaccination during hospitalization.

The researchers found no differences between the vaccinated and unvaccinated groups in the risk of inpatient visits, emergency department visits, post-discharge fever, or clinical evaluation for infection. There was a marginal increase in the risk of outpatient visits (relative risk 1.05, p=0.032).
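
For readers who want to see what a relative risk of 1.05 corresponds to, the minimal sketch below computes a relative risk from a two-by-two comparison. Only the group sizes (6,420 vaccinated of 42,777 surgeries) come from the report; the event counts are invented for illustration and are not the study’s data.

```python
# Minimal sketch of how a relative risk (RR) is computed from a 2x2 table.
# Group sizes come from the report (6,420 vaccinated of 42,777 surgeries);
# the event counts are invented for illustration and are NOT the study's data.

def relative_risk(events_exposed, n_exposed, events_unexposed, n_unexposed):
    risk_exposed = events_exposed / n_exposed        # proportion with the outcome
    risk_unexposed = events_unexposed / n_unexposed
    return risk_exposed / risk_unexposed

# Outcome: at least one outpatient visit after discharge (illustrative counts)
rr = relative_risk(events_exposed=2_100, n_exposed=6_420,
                   events_unexposed=11_320, n_unexposed=36_357)
print(f"RR = {rr:.2f}")  # RR = 1.05, i.e., about a 5% higher relative risk
```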

"We feel that the benefits of vaccination outweigh this risk," Dr. Tartof said. "For high-risk patients, this is a health care contact, this is an opportunity to vaccinate, and we don't want to miss those."

Many patients in the study who were vaccinated against the flu received the shot when they were discharged, the researcher noted. "This may be a more comfortable time for patients and for their clinicians to vaccinate," she said.

Dr. Tartof and her colleagues are now planning to repeat the study in a larger population of nonsurgical inpatients, including children.

The Centers for Disease Control and Prevention funded this research. Five coauthors reported disclosures.

Nicotinamide Prevents Actinic Keratoses, Basal Cell Carcinomas, and Squamous Cell Carcinomas

Chen et al (N Engl J Med. 2015;373:1618-1626) performed a multicenter, phase 3, double-blind, randomized, placebo-controlled trial. Results demonstrated that nicotinamide effectively decreased the rates of new nonmelanoma skin cancers (NMSCs) and actinic keratoses (AKs) in high-risk patients who had at least 2 histologically confirmed NMSCs in the preceding 5 years. Compared with participants who received placebo, individuals who received nicotinamide 500 mg twice daily during the 12-month study (aptly branded with the acronym ONTRAC [oral nicotinamide to reduce actinic cancer]) had rates of AKs that were lower by up to 20%, basal cell carcinomas by 20%, squamous cell carcinomas by 30%, and NMSCs overall by 23%. However, the effect of nicotinamide on NMSCs was not maintained at evaluation 6 months after discontinuation; the number of basal cell carcinomas was similar, and the number of squamous cell carcinomas was greater, in participants who had received nicotinamide than in those who had received placebo.
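
As a quick aside on how such percentages are derived, the minimal sketch below converts two event rates into a relative reduction; the rates shown are invented for illustration and are not the ONTRAC data.

```python
# Minimal sketch: relative (percent) reduction in skin-cancer rate vs. placebo.
# The rates below are invented for illustration; they are NOT the ONTRAC data.

def percent_rate_reduction(rate_treated, rate_placebo):
    return (1 - rate_treated / rate_placebo) * 100

placebo_rate = 2.00        # e.g., new NMSCs per patient-year (illustrative)
nicotinamide_rate = 1.54   # illustrative
print(f"{percent_rate_reduction(nicotinamide_rate, placebo_rate):.0f}% reduction")  # 23% reduction
```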

What’s the issue?

The risk for skin cancer is increased by UV radiation, which damages DNA, suppresses cutaneous immunity, and inhibits DNA repair by depleting cellular adenosine triphosphate. Nicotinamide, an amide form of vitamin B3, has been demonstrated not only to reduce UV radiation–induced immunosuppression but also to prevent UV radiation–induced adenosine triphosphate depletion and glycolytic blockade. Nicotinamide, which is classified as a food additive, also has neuroprotective and antioxidant functions and reduces pigmentation, wrinkles, and sebum production. In addition to oral nicotinamide reducing NMSCs and AKs, topical application has been shown to improve many skin conditions such as acne, atopic dermatitis, isoniazid-induced pellagra, and rosacea.

In contrast to nicotinic acid (niacin), nicotinamide is not associated with headaches, hypotension, flushing, itching, or vasodilatation. At high oral doses, side effects of nicotinamide that have been hypothesized or observed in animals, humans, or both have included the development of Parkinson disease, insulin sensitivity and diabetes mellitus, and liver toxicity. Although there are no reports in humans of growth retardation, teratogenicity, or oncogenicity, Rolfe (J Cosmet Dermatol. 2014;13:324-328) noted that nicotinamide crosses the placenta and that fetal blood levels are greater than corresponding maternal blood levels. However, according to Chen et al, no clinically significant between-group differences were found in the number or types of adverse events between the placebo participants and the individuals who received 1000 mg of nicotinamide daily. Chen et al also suggested additional benefits for nicotinamide recipients with regard to cognitive function and transepidermal water loss.

Perhaps all patients with a history of AKs, basal cell carcinomas, or squamous cell carcinomas should receive lifelong nicotinamide. It might also be reasonable for nonpregnant, nonbreastfeeding individuals older than 18 years with substantial sun exposure but no history of AKs or NMSCs to add nicotinamide to their daily regimen as a proactive chemopreventive measure. Would you suggest nicotinamide to your patients?

We want to know your views! Tell us what you think.

Author and Disclosure Information

Dr. Cohen is from the Department of Dermatology, University of California San Diego.

Dr. Cohen reports no conflicts of interest in relation to this post.

RPS15 mutations prevalent in aggressive chronic lymphocytic leukemia

Mutations in the RPS15 gene occurred in 8 of 41 patients with relapsing chronic lymphocytic leukemia (CLL), and the mutations were present before treatment in 7 of the 8, a possible indication that the aberrations are early genetic events in aggressive CLL pathobiology.

RPS15 mutations may lead to defective p53 stability and increased degradation, representing a potential novel mechanism in CLL pathobiology. The findings suggest “RPS15-mutant cases should be treated with alternative regimens that act independently of the p53 pathway,” wrote Dr. Viktor Ljungström of the department of immunology, genetics, and pathology, Uppsala (Sweden) University, and colleagues (Blood 2016 Feb 25. doi: 10.1182/blood-2015-10-674572).

In their study, the researchers performed whole exome sequencing of 110 samples collected before and after treatment from 41 patients with aggressive CLL that relapsed after a median of 2 years; 7 patients had mutations in RPS15 before treatment, and 8 had RPS15 mutations after treatment. The findings suggest that standard therapy with fludarabine, cyclophosphamide, and rituximab was not intrinsically mutagenic.

High frequencies of mutations linked to poor outcome were seen in both pretreatment and relapse samples; these included mutations in NOTCH1, TP53, ATM, SF3B1, MGA, and BIRC3. At least one such mutation was present before treatment in 26 of the 41 patients, rising to 33 of 41 at relapse. Two or more mutations were present before treatment in 12 of 41 patients, rising to 15 of 41 at relapse.

In response to their findings, the researchers next performed targeted resequencing of the RPS15 hot spot (exon 4) in an extended series of 790 patients with CLL, intentionally enriched with 605 cases with adverse prognostic profiles. They found an additional 36 mutations in RPS15 (36/605, 6%). In contrast, none of the 185 patients with more favorable prognostic, IGHV-mutated CLL carried RPS15 mutations. RPS15-mutant patients without concomitant TP53 aberrations had an overall survival similar to other aggressive CLL subgroups, but none of the patients with both mutations survived at 10 years, compared with 59% of patients with wild-type RPS15 and wild-type TP53, “pointing to a dismal prognosis for RPS15-mutated CLL,” they wrote.

They also analyzed 30 cases of Richter syndrome (CLL transformed into diffuse large B-cell lymphoma); only a single case carried an RPS15 mutation, and that mutation was also observed in the preceding CLL phase. This finding indicates that RPS15 mutation probably does not underlie the transformation of CLL to Richter syndrome, according to the researchers.

Dr. Ljungström and coauthors reported having no relevant financial disclosures.

Ribosomal revelation

In support of the authors’ hypothesis that RPS15 mutations may be an early-acquired driver in high-risk disease, the variant allele frequency in eight serially analyzed cases remained static, with only one case gaining a mutation in RPS15, whereas the variant allele frequency increased at relapse for other well-characterized mutations in ATM, BIRC3, NFKBIE, and TP53.

Pilot experiments demonstrated specific interactions between TP53 and RPS15, and p53 stability was reduced in the presence of mutant RPS15.

The findings should prompt further investigation to determine if the consequences of RPS15 mutations depend on its interaction with TP53, or if the mutations found in other ribosomal proteins indicate a different mechanism related to the 40S subunit.

Given that RPS15 is not included in common academic or commercial sequencing panels, the presence of RPS15 mutations in other diseases may be underestimated as well.

More generally, are there other cancers with subgroups enriched for mutations in other seemingly benign genes?

Dr. James Blachly is with Wexner Medical Center, the Ohio State University, Columbus. These remarks were part of an editorial accompanying a report in Blood (2016 Feb 25. doi: 10.1182/blood-2015-10-674572).

Vitals

Key clinical point: Aberrations in the RPS15 gene before therapy may be an indicator of aggressive pathobiology in chronic lymphocytic leukemia.

Major finding: Mutations in the RPS15 gene occurred in 8 of 41 patients with relapsing CLL, and the mutations were present before treatment in 7 of the 8.

Data sources: Whole exome sequencing of 110 samples collected before and after fludarabine, cyclophosphamide, and rituximab therapy from 41 patients with relapsed CLL.

Disclosures: Dr. Ljungström and coauthors reported having no relevant financial disclosures.