Use psychoeducational family therapy to help families cope with autism
Treating a family in crisis because of a difficult-to-manage family member with autism spectrum disorder (ASD) can be overwhelming. The family often is desperate and exhausted and, therefore, can be overly needy, demanding, and disorganized. Psychiatrists often are asked to intervene with medication, even though there are no drugs to treat core symptoms of ASD. At best, medication can ease associated symptoms, such as insomnia. However, when coupled with reasonable medication management, psychoeducational family therapy can be an effective, powerful intervention during initial and follow-up medication visits.
Families of ASD patients often show dysfunctional patterns: poor interpersonal and generational boundaries, closed family systems, pathological triangulations, fused and disengaged relationships, and resentments. It is easy to assume that an autistic patient’s behavior problems are related to these dysfunctional patterns and that the patterns are caused by psychopathology within the family. In the 1970s and 1980s, researchers began to challenge this assumption in families of patients with schizophrenia and found that the illness shaped family patterns, not the reverse. Illness exacerbations could be minimized by teaching families to reduce their expressed emotion. In addition, research clinicians stopped blaming family members and began describing family dysfunction as a “normal response” to severe psychiatric illness.1
Families of autistic individuals should learn to avoid coercive patterns and clarify interpersonal boundaries. Family members also should understand that dysfunctional patterns are a normal response to illness, that these patterns can be corrected, and that correcting them can lead to improved management of ASD.
Psychoeducational family therapy provides an excellent framework for this family-psychiatrist interaction. Time-consuming, complex, expressive family therapies are not recommended because they tend to heighten expressed emotions.
Consider the following tips when providing psychoeducational family therapy:
• Remember that the extreme stress these families experience is based in reality. Lower functioning ASD patients might not sleep, require constant supervision, and cannot tolerate even minor frustrations.
• Respect the family’s ego defenses as a normal response to stress. Expect to feel some initial frustration and anxiety when working with overwhelmed families.
• Normalize negative feelings within the family. Everyone goes through anger, grief, and hopelessness when handling such a stressful situation.
• Avoid blaming dysfunctional patterns on individuals. Dysfunctional behavior is a normal response to the stress of caring for a family member with ASD.
• Empower the family. Remind the family that they know the patient best, so help them to find their own solutions to behavioral problems.
• Focus on the basics, including establishing normal sleeping patterns and regular household routines.
• Educate the family about maintaining low sensory stimulation in the home. ASD patients are easily overwhelmed by sensory stimulation, which can lower their frustration tolerance.
Disclosure
The author reports no financial relationships with any company whose products are mentioned in this article or with manufacturers of competing products.
Reference
1. Nichols MP. Family therapy: concepts and methods. 7th ed. Boston, MA: Pearson Education; 2006.
Educate patients about proper disposal of unused Rx medications—for their safety
Patients often tell clinicians that they used their “left-over” medications from previous refills, or that a family member shared medication with them. Other patients, who are non-adherent or have had a recent medication change, might reveal that they have some unused pills at home. As clinicians, what does this practice by our patients mean for us?
Prescription drug abuse is an emerging crisis, and drug diversion is a significant contributing factor.1 According to the Substance Abuse and Mental Health Services Administration’s National Survey on Drug Use and Health,2 in 2011 and 2012, on average, more than one-half of participants age ≥12 who used a pain reliever, tranquilizer, stimulant, or sedative non-medically obtained their most recently used drug “from a friend or relative for free.”
Unused, expired, and “extra” medications pose a significant risk for diversion, abuse, and accidental overdose.3 According to the Prescription Drug Abuse Prevention Plan,1 proper medication disposal is a major problem that needs action to help reduce prescription drug abuse.
Regrettably, fewer than 20% of patients receive advice on medication disposal from their health care provider,4 even though clinicians have an opportunity to educate patients and their caregivers on the appropriate use and safe disposal of medications, in particular controlled substances.
What should we emphasize to our patients about disposing of medications when it’s necessary?
Teach responsible use
Stress that medications prescribed for the patient are for his or her use alone and should not be shared with friends or family. Sharing might seem kind and generous, but it can be dangerous. Medications should be used only at the prescribed dosage and frequency and for the recommended duration. If the medication causes an adverse effect or other problem, instruct the patient to talk to you before making any changes to the established regimen.
Emphasize safe disposal
Follow instructions. The label on medication bottles or other containers often has specific instructions on how to properly store, and even dispose of, the drug. Advise your patient to follow instructions on the label carefully.
Participate in a take-back program. The U.S. Drug Enforcement Administration (DEA) sponsors several kinds of drug take-back programs, including permanent locations where unused prescriptions are collected; 1-day events; and mail-in/ship-back programs.
The National Prescription Drug Take-Back Initiative is one such program that collects unused or expired medications on “Take Back Days.” On such days, DEA-coordinated collection sites nationwide accept unneeded pills, including prescription painkillers and other controlled substances, for disposal only when law enforcement personnel are present. In 2014, this program collected 780,158 lb of prescribed controlled medications.5
Patients can get more information about these programs by contacting a local pharmacy or their household trash and recycling service division.1,6
Discard medications properly in trash. An acceptable household strategy for disposing of prescription drugs is to mix the medication with an undesirable substance, such as used cat litter or coffee grounds, place the mixture in a sealed plastic bag or disposable container with a lid, and then place it in the trash.
Don’t flush, with exceptions. People sometimes flush unused medications down the toilet or drain. The current recommendation is against flushing unless the instructions on the bottle specifically say to do so. Flushing is advised for a small number of medications, such as certain opiates, precisely because immediate disposal minimizes the risk of accidental overdose or misuse.6 It is important to remember that most municipal sewage treatment plants do not have the ability to extract pharmaceuticals from wastewater.7
Discard empty bottles. It is important to discard pill bottles once they are empty and to remove any identifiable personal information from the label. Educate patients not to use empty pill bottles to store or transport other medications; this practice might result in accidental ingestion of the wrong medication or dose.
These methods of disposal are in accordance with federal, state, and local regulations, as well as human and environmental safety standards. Appropriate disposal decreases contamination of soil and bodies of water with active pharmaceutical ingredients, thereby minimizing people’s and aquatic animals’ chronic exposure to low levels of drugs.3
Encourage patients to seek drug safety information. Patients might benefit from the information and services provided by:
• National Council on Patient Information and Education (www.talkaboutrx.org)
• Medication Use Safety Training for Seniors (www.mustforseniors.org), a nationwide initiative to promote medication education and safety in the geriatric population through an interactive program.
Remember: Although prescribing medications is strictly regulated, particularly for controlled substances, those regulations do little to prevent diversion of medications after they’ve been prescribed. Educating patients and their caregivers about safe disposal can help protect them, their family, and others.
Disclosures
The authors report no financial relationships with any company whose products are mentioned in this article or with manufacturers of competing products.
References
1. Epidemic: responding to America’s prescription drug abuse crisis. http://www.whitehouse.gov/sites/default/files/ondcp/issues-content/prescription-drugs/rx_abuse_plan.pdf. Published 2011. Accessed January 29, 2015.
2. Results from the 2012 National Survey on Drug Use and Health: summary of national findings and detailed tables. http://archive.samhsa.gov/data/NSDUH/2012SummNatFindDetTables/Index.aspx. Updated October 12, 2013. Accessed February 12, 2015.
3. Daughton CG, Ruhoy IS. Green pharmacy and pharmEcovigilance: prescribing and the planet. Expert Rev Clin Pharmacol. 2011;4(2):211-232.
4. Seehusen DA, Edwards J. Patient practices and beliefs concerning disposal of medications. J Am Board Fam Med. 2006;19(6):542-547.
5. DEA’s National Prescription Drug Take-Back Days meet a growing need for Americans. Drug Enforcement Administration. http://www.dea.gov/divisions/hq/2014/hq050814.shtml. Published May 8, 2014. Accessed January 29, 2015.
6. How to dispose of unused medicines. FDA Consumer Health Information. http://www.fda.gov/downloads/Drugs/ResourcesForYou/Consumers/BuyingUsingMedicineSafely/UnderstandingOver-the-CounterMedicines/ucm107163.pdf. Published April 2011. Accessed January 29, 2015.
7. Herring ME, Shah SK, Shah SK, et al. Current regulations and modest proposals regarding disposal of unused opioids and other controlled substances. J Am Osteopath Assoc. 2008;108(7):338-343.
Impaired self-assessment in schizophrenia: Why patients misjudge their cognition and functioning
Lack of insight, or “unawareness of illness,” occurs within a set of self-assessment problems commonly seen in schizophrenia.1 In the clinical domain, people who do not realize they are ill typically are unwilling to accept treatment, including medication, which can worsen the illness. They also may have difficulty self-assessing everyday function and functional potential, cognition, social cognition, and attitude, often to a variable degree across these domains (Table 1).1-3
Self-assessment of performance can be clinically helpful whether performance is objectively good or bad. Poor performers could be helped to match their aspirations to their accomplishments and to improve over time. Good performers could have their functioning bolstered by recognizing their competence. Thus, even a population whose performance often is poor could benefit from accurate self-assessment, or experience additional challenges from inaccurate self-evaluation.
This article discusses patient characteristics associated with impairments in self-assessment and the most accurate sources of information for clinicians about patient functioning. Our research shows that an experienced psychiatrist is well positioned to make accurate judgments of functional potential and cognitive abilities for people with schizophrenia.
Patterns in patients with impaired self-assessment
Healthy individuals routinely overestimate their abilities and their attractiveness to others.4 Feedback that deflates these exaggerated estimates increases the accuracy of their self-assessments. Mildly depressed individuals typically are the most accurate judges of their true functioning; those with more severe levels of depression tend to underestimate their competence. Thus, simply being an inaccurate self-assessor is not “abnormal.” These response biases are consistent and predictable in healthy people.
People with severe mental illness pose a different challenge. As in the following cases, their reports manifest minimal correlation with other sources of information, including objective information about performance.
CASE 1
JR, age 28, is referred for occupational therapy because he has not worked since graduating from high school. He tells the therapist his cognitive abilities are average and intact, although his scores on a comprehensive cognitive assessment suggest performance at or below the first percentile of the normal distribution. His self-reported Beck Depression Inventory (BDI) score is 4. He says he would like to work as a certified public accountant because he believes he has an aptitude for math. He admits he has no idea what the job entails, but he is quite motivated to set up an interview as soon as possible.
CASE 2
LM, age 48, says his “best job” was managing an auto parts store for 18 months after he earned an associate’s degree and until his second psychotic episode. His most recent work was approximately 12 years ago at an oil-change facility. He agrees to discuss employment but feels his vocational skills are too deteriorated for him to succeed and requests an assessment for Alzheimer’s disease. His cognitive performance averages at the 10th percentile of the general population, and his BDI score is 18. Tests of his ability to perform vocational skills suggest he is qualified for multiple jobs, including his previous technician position.
Individuals with schizophrenia who report no depression and no work history routinely overestimate their functional potential, whereas those with a history of unsuccessful vocational attempts often underestimate their functional potential. Inaccurate self-assessment can contribute to reduced functioning—in JR’s case because of unrealistic assessment of the match between skills and vocational potential, and in LM’s case because of overly pessimistic self-evaluation. For people with schizophrenia, inability to self-evaluate can have a bidirectional adverse impact on functioning: overestimation may lead to trying tasks that are too challenging, and underestimation may lead to reduced effort and motivation to take on functional tasks.
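The size of JR’s self-assessment gap can be made concrete with simple arithmetic: under the standard normal model, a first-percentile score falls more than 2 standard deviations below the population mean. A minimal sketch using only the Python standard library (the conversion to an IQ-style scale with mean 100 and SD 15 is illustrative, not part of the cited battery):

```python
# Illustrative only: convert a percentile rank to a z-score and an
# IQ-style standard score (mean 100, SD 15) via the standard normal model.
from statistics import NormalDist

def percentile_to_scores(percentile: float) -> tuple[float, float]:
    """Return (z-score, IQ-style standard score) for a percentile rank."""
    z = NormalDist().inv_cdf(percentile / 100.0)
    return z, 100 + 15 * z

z, standard = percentile_to_scores(1.0)  # JR's first-percentile performance
print(f"z = {z:.2f}, standard score = {standard:.0f}")  # → z = -2.33, standard score = 65
```

A first-percentile performance thus corresponds to a standard score of roughly 65, which makes the distance between JR’s test results and his stated career plan easy to appreciate.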
Metacognition and introspective accuracy
“Metacognition” refers to self-assessment of the quality and accuracy of performance on cognitive tests.5-7 Problem-solving tests, such as the Wisconsin Card Sorting Test (WCST), in which the person being assessed must discover the correct strategy through performance feedback, are metacognition tests. When errors are made, the strategy in use needs to be discarded; when responses are correct, the strategy is retained. People with schizophrenia have disproportionate difficulties with the WCST, and deficits are especially salient when the test is modified to measure self-assessment of performance and the ability to use feedback to change strategies.
“Introspective accuracy” describes the wide-ranging self-assessment impairments seen in severe mental illness. Theories of metacognition implicate a broad spectrum of processes, of which self-assessment is 1 component, whereas introspective accuracy refers more specifically to the accuracy of self-judgments. Because self-assessment is focused on the self, and hence is introspective, this conceptualization can be applied to self-evaluations of:
• achievement in everyday functioning (“Did I complete that task well?”)
• potential for achievement in everyday functioning (“I could do that job”)
• cognitive performance (“Yes, I remembered all of those words”)
• social cognition (“He really is angry”).
Domains of impaired introspective accuracy
Everyday functioning. The 3 global domains of everyday functioning are social outcomes, productive/vocational outcomes, and everyday activities, including residential independence/support for people with severe mental illness. Two areas of inquiry are used in self-assessing everyday functioning: (1) what are you doing now and (2) what could you do in the future? For people with schizophrenia, a related question is how perceived impairments in everyday functioning are associated with subjective illness burden.
People with schizophrenia report illness burden consistent with their self-reported disability, suggesting their reports in these domains are not random.8 Studies have consistently found, however, that these patients report:
• less impairment on average in their everyday functioning than observed by clinicians
• less subjective illness burden compared with individuals with much less severe illnesses.
Their reports also fail to correlate with clinicians’ observations.9 Patients with schizophrenia who have never been employed may report greater vocational potential than those employed full-time. Interestingly, patients who were previously—but not currently—employed reported the least vocational potential.10 These data suggest that experience may be a factor: individuals who have never worked have no context for their self-assessments, whereas people who are persistently unemployed may have a perspective on the challenges associated with employment.
In our research,9 high-contact clinicians (ie, case manager, psychiatrist, therapist, or residential facility manager) were better able than family or friends to generate ratings from an assessment questionnaire that correlated with performance-based measures of patients’ ability to perform everyday functional skills. The ratings were generated across multiple functional status scales, suggesting that the rater was more important than the specific scale. We concluded that high-contact clinicians can generate ratings of everyday functioning that are convergent with patients’ abilities, even when they have no information about actual performance scores.
Cognitive performance. When self-reported cognitive abilities are correlated with performance on neuropsychological assessments, the findings are quite consistent: patients provide reports that do not correlate with their objective performance.11 Interestingly, when clinicians were asked to use the same strategies as patients to generate ratings of cognitive impairment, clinician ratings had considerably greater evidence of validity. In several studies, patients’ ratings of their cognitive performance did not correlate with their neuropsychological test performance, even though they had just completed the assessment battery. Ratings by clinicians or other high-contact informants (who were unaware of patients’ test performance) were much more strongly related to patients’ objective test performance than patient self-reports were.12
The convergence of clinician ratings of cognitive performance with objective test data has been impressive. Correlation coefficients of at least r = 0.5, reflecting moderate to large relationships between clinician ratings and objective performance, have been detected. Individual cognitive test domains, such as working memory and processing speed, often do not correlate with each other or with aspects of everyday functioning to that extent.13 These data suggest that a clinician assessment of cognitive performance, when focused on the correct aspects of cognitive functioning, can be a highly useful proxy for extensive neuropsychological testing.
Social cognitive performance. Introspective accuracy for social cognitive judgments can be assessed similarly to the strategies used to assess the domains of everyday functioning and cognitive performance. Patients are asked to complete a typical social cognitive task, such as determining emotions from facial stimuli or examining the eye region of the face, to determine the mental state of the depicted person. Immediately after responding to each stimulus, participants rate their confidence in the correctness of that response.
Consistent with the pattern of introspective accuracy for everyday functioning, patients with schizophrenia tend to make more high-confidence errors than healthy individuals on social cognitive tasks. That is, the patients are less likely to realize when they are wrong in their judgments of social stimuli. A similar pattern has been found for mental state attribution,14 recognition of facial emotion from the self,15 and recognition of facial emotion from others.16 These high-confidence errors also are more likely to occur for more difficult stimuli, such as faces that display only mildly emotional expressions. These difficulties appear to be specific to judgments in an immediate evaluation situation. When asked to determine if the behavior of another individual is socially appropriate, individuals with schizophrenia are as able as healthy individuals to recognize social mistakes.17 This work suggests that, at least within the domain of social cognition, introspective accuracy impairment is not caused by generalized poor judgment, just as self-assessments of disability and illness burden are not generated at random.
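The confidence-rating paradigm reduces to straightforward scoring. A sketch with invented trial data, where the trial format and the high-confidence cutoff are assumptions for illustration rather than details from the cited studies:

```python
# Sketch: scoring high-confidence errors from confidence-rated trials.
# Each trial records (correct: bool, confidence: int on a 1-5 scale).
# The data and the cutoff of 4 are illustrative assumptions.
trials = [
    (True, 5), (False, 5), (True, 4), (False, 4),
    (True, 3), (False, 5), (True, 5), (False, 2),
]

HIGH_CONFIDENCE = 4  # treat ratings of 4 or 5 as "high confidence"

errors = [conf for correct, conf in trials if not correct]
high_conf_errors = [c for c in errors if c >= HIGH_CONFIDENCE]

# Proportion of all errors made with high confidence
rate = len(high_conf_errors) / len(errors)
print(f"high-confidence error rate: {rate:.2f}")  # 3 of 4 errors → 0.75
```

A higher proportion of errors endorsed with high confidence is the operational signature of the impairment described above: the patient is wrong and does not know it.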
Choosing a reliable informant
If a clinician has not had adequate time or exposure to a patient to make a cognitive or functional judgment, what should the strategy be? If asking the patient is uninformative, who should be asked? Our group has gathered information that may help clinicians identify informants who can provide ratings of cognitive performance and everyday functioning that are convergent with objective evidence.
In a systematic study of validity of reports of various informants, we compared correlations between reports of competence of everyday functioning with objective measures of cognitive test performance and ability to perform everyday functional skills. Our findings:
• Patient reports of everyday functioning were not correlated with performance-based measures for any of 6 rating scales.9
• Clinician reports of everyday functioning were correlated with objective performance across 4 of 6 rating scales.
• Correlations between ratings generated by friend or relative informants and other information were almost shocking in their lack of validity (Table 2).9
We concluded that ratings generated by a generic informant—someone who simply knows the patient and is willing to provide ratings—are highly likely to be uninformative. If a friend or relative provides information of limited usefulness, the report could easily lead to clinical decisions with high potential for bad outcomes. For example, attempts could fail to transition someone with impaired everyday living skills to independent living, or a patient whose potential is underestimated might not be offered opportunities to achieve attainable functional goals.
We found that the closer the rater was to a full caregiver role, the better and more accurate the information obtained. Caregivers who had regular contact with patients provided ratings that were much more valid when compared against performance on functionally relevant objective measures. Patients with caregivers had greater impairments in everyday outcomes, however, suggesting that this subset was more impaired than the overall sample. For patients without caregivers, other sources of information—including careful observation by high-contact clinicians—seem to be required to generate a valid assessment of functioning.
Direct functional implications of impaired introspective accuracy
Clinical effects of reduced awareness of illness include reduced adherence to medication, followed by relapse; disturbed behavior leading to emergency room treatment or acute admission; and, more rarely, disturbed behavior associated with violence or self-harm. Such relapses can adversely affect brain structure and function, with declines in cognitive functioning early in the illness.
Our recent study18 quantifies the direct impact of impairments in introspective accuracy on everyday functioning. We asked 214 individuals with schizophrenia to self-evaluate their cognitive ability with a systematic rating scale and to self-report their everyday functioning in social, vocational, and everyday activities domains. We used performance-based measures to assess their cognitive abilities and everyday functional skills. Concurrently, high-contact clinicians rated these same abilities with the same rating scales. We then predicted everyday functioning, as rated by the clinicians, with the discrepancies between self-assessed and clinician-assessed functioning, and patients’ scores on the performance-based measures.
Impaired introspective accuracy, as indexed by difference scores between clinician ratings and self-reports, was a more potent predictor of everyday functional deficits in social, vocational, and everyday activities domains than scores on performance-based measures of cognitive abilities and functional skills. Even when we analyzed only deficits in introspective accuracy for cognition as the predictor of everyday outcomes in these 3 real-world functional domains, the results were the same. Impaired introspective accuracy was the single best predictor of everyday functioning in all 3 domains, with actual abilities considerably less important.
Patient characteristics that predict introspective accuracy
Patient characteristics associated with impairments in introspective accuracy (Table 3)19,20 are easy to identify and assess. Subjective reports of depression have a bell-shaped relationship with introspective accuracy. A self-reported score of 0 by a disabled schizophrenia patient suggests some unawareness of an unfortunate life situation; mild to moderate scores are associated with more accurate self-assessment; and more severe scores, as seen in other conditions, often predict overestimation of disability.19
Psychosis and negative symptoms are associated with reduced introspective accuracy and global over-reporting of functional competence.20 Patients who have never worked have no way to comprehend the specific challenges associated with obtaining and sustaining employment. Patients who had a job and have not been able to return to work may perceive barriers as more substantial than they are.
Tips to manage impairments in introspective accuracy
Ensure that assessment information is valid. If a patient has limited ability to self-assess, seek other sources of data. If a patient has psychotic symptoms, denies being depressed, or has limited life experience, the clinician should adjust her (his) interpretation of the self-report accordingly, because these factors are known to adversely affect the accuracy of self-assessment. Consider informants’ level and quality of contact with the patient, as well as any motivation or bias that might influence the accuracy of their reports. Other professionals, such as occupational therapists, can provide useful information as reference points for treatment planning.
Consider treatments aimed at increasing introspective accuracy, such as structured training and exposure to self-assessment situations,6 and interventions aimed at increasing organization and skills performance. Cognitive remediation therapies, although not widely available, have potential to improve functioning, with excellent persistence over time.21
Related Resources
• Harvey PD, ed. Cognitive impairment in schizophrenia: characteristics, assessment and treatment. Cambridge, United Kingdom: Cambridge University Press; 2013.
• Gould F, McGuire LS, Durand D, et al. Self-assessment in schizophrenia: accuracy of assessment of cognition and everyday functioning [published online February 2, 2015]. Neuropsychology.
• Dunning D. Self-insight: detours and roadblocks on the path to knowing thyself. New York, NY: Psychology Press; 2012.
Acknowledgment
This paper was supported by Grants MH078775 to Dr. Harvey and MH093432 to Drs. Harvey and Pinkham from the National Institute of Mental Health.
Disclosures
Dr. Harvey has received consulting fees from AbbVie, Boehringer Ingelheim, Forum Pharmaceuticals, Genentech, Otsuka America Pharmaceuticals, Roche, Sanofi, Sunovion Pharmaceuticals, and Takeda Pharmaceuticals. Dr. Pinkham has served as a consultant for Otsuka America Pharmaceuticals.
1. Amador XF, Flaum M, Andreasen NC, et al. Awareness of illness in schizophrenia and schizoaffective and mood disorders. Arch Gen Psychiatry. 1994;51(10):826-836.
2. Medalia A, Thysen J. A comparison of insight into clinical symptoms versus insight into neuro-cognitive symptoms in schizophrenia. Schizophr Res. 2010;118(1-3):134-139.
3. Beck AT, Baruch E, Balter JM, et al. A new instrument for measuring insight: the Beck Cognitive Insight Scale. Schizophr Res. 2004;68(2-3):319-329.
4. Kruger J, Dunning D. Unskilled and unaware of it: how difficulties in recognizing one’s own incompetence lead to inflated self-assessments. J Pers Soc Psychol. 1999;77(6):1121-1134.
5. Lysaker P, Vohs J, Ballard R, et al. Metacognition, self-reflection and recovery in schizophrenia. Future Neurology. 2013;8(1):103-115.
6. Lysaker PH, Dimaggio G. Metacognitive capacities for reflection in schizophrenia: implications for developing treatments. Schizophr Bull. 2014;40(3):487-491.
7. Koren D, Seidman LJ, Goldsmith M, et al. Real-world cognitive—and metacognitive—dysfunction in schizophrenia: a new approach for measuring (and remediating) more “right stuff.” Schizophr Bull. 2006;32(2):310-326.
8. McKibbin C, Patterson TL, Jeste DV. Assessing disability in older patients with schizophrenia: results from the WHODAS-II. J Nerv Ment Dis. 2004;192(6):405-413.
9. Sabbag S, Twamley EW, Vella L, et al. Assessing everyday functioning in schizophrenia: not all informants seem equally informative. Schizophr Res. 2011;131(1-3):250-255.
10. Gould F, Sabbag S, Durand D, et al. Self-assessment of functional ability in schizophrenia: milestone achievement and its relationship to accuracy of self-evaluation. Psychiatry Res. 2013;207(1-2):19-24.
11. Keefe RS, Poe M, Walker TM, et al. The Schizophrenia Cognition Rating Scale: an interview-based assessment and its relationship to cognition, real-world functioning, and functional capacity. Am J Psychiatry. 2006;163(3):426-432.
12. Durand D, Strassnig M, Sabbag S, et al. Factors influencing self-assessment of cognition and functioning in schizophrenia: implications for treatment studies [published online July 25, 2014]. Eur Neuropsychopharmacol. doi: 10.1016/j.euroneuro.2014.07.008.
13. McClure MM, Bowie CR, Patterson TL, et al. Correlations of functional capacity and neuropsychological performance in older patients with schizophrenia: evidence for specificity of relationships? Schizophr Res. 2007;89(1-3):330-338.
14. Köther U, Veckenstedt R, Vitzthum F, et al. “Don’t give me that look” - overconfidence in false mental state perception in schizophrenia. Psychiatry Res. 2012;196(1):1-8.
15. Demily C, Weiss T, Desmurget M, et al. Recognition of self-generated facial emotions is impaired in schizophrenia. J Neuropsychiatry Clin Neurosci. 2011;23(2):189-193.
16. Moritz S, Woznica A, Andreou C, et al. Response confidence for emotion perception in schizophrenia using a Continuous Facial Sequence Task. Psychiatry Res. 2012;200(2-3):202-207.
17. Langdon R, Connors MH, Connaughton E. Social cognition and social judgment in schizophrenia. Schizophrenia Research: Cognition. 2014;1(4):171-174.
18. Gould F, McGuire LS, Durand D, et al. Self-assessment in schizophrenia: accuracy of evaluation of cognition and everyday functioning [published online February 2, 2015]. Neuropsychology.
19. Bowie CR, Twamley EW, Anderson H, et al. Self-assessment of functional status in schizophrenia. J Psychiatr Res. 2007;41(12):1012-1018.
20. Sabbag S, Twamley EW, Vella L, et al. Predictors of the accuracy of self-assessment of everyday functioning in people with schizophrenia. Schizophr Res. 2012;137(1-3):190-195.
21. McGurk SR, Mueser KT, Feldman K, et al. Cognitive training for supported employment: 2-3 year outcomes of a randomized controlled trial. Am J Psychiatry. 2007;164(3):437-441.
Impaired introspective accuracy, as indexed by difference scores between clinician ratings and self-reports, was a more potent predictor of everyday functional deficits in social, vocational, and everyday activities domains than scores on performance-based measures of cognitive abilities and functional skills. Even when we analyzed only deficits in introspective accuracy for cognition as the predictor of everyday outcomes in these 3 real-world functional domains, the results were the same. Impaired introspective accuracy was the single best predictor of everyday functioning in all 3 domains, with actual abilities considerably less important.
Patient characteristics that predict introspective accuracy
Patient characteristics associated with impairments in introspective accuracy (Table 3)19,20 are easy to identify and assess. Subjective reports of depression have a bell-shaped relationship with introspective accuracy. A self-reported score of 0 by a disabled schizophrenia patient suggests some unawareness of an unfortunate life situation; mild to moderate scores are associated with more accurate self-assessment; and more severe scores, as seen in other conditions, often predict overestimation of disability.19
Psychosis and negative symptoms are associated with reduced introspective accuracy and global over-reporting of functional competence.20 Patients who have never worked have no way to comprehend the specific challenges associated with obtaining and sustaining employment. Patients who had a job and have not been able to return work may perceive barriers as more substantial than they are.
Tips to manage impairments in introspective accuracy
Ensure that assessment information is valid. If a patient has limited ability to self-assess, seek other sources of data. If a patient has psychotic symptoms, denies being depressed, or has limited life experience, the clinician should adjust her (his) interpretation of the self-report accordingly, because these factors are known to adversely affect the accuracy of self-assessment. Consider informants’ level and quality of contact with the patient, as well as any motivation or bias that might influence the accuracy of their reports. Other professionals, such as occupational therapists, can provide useful information as reference points for treatment planning.
Consider treatments aimed at increasing introspective accuracy, such as structured training and exposure to self-assessment situations,6 and interventions aimed at increasing organization and skills performance. Cognitive remediation therapies, although not widely available, have potential to improve functioning, with excellent persistence over time.21
Related Resources
• Harvey PD, ed. Cognitive impairment in schizophrenia: characteristics, assessment and treatment. Cambridge, United Kingdom: Cambridge University Press; 2013.
• Gould F, McGuire LS, Durand D, et al. Self-assessment in schizophrenia: accuracy of assessment of cognition and everyday functioning [published online February 2, 2015]. Neuropsychology.
• Dunning D. Self-insight: detours and roadblocks on the path to knowing thyself. New York, NY: Psychology Press; 2012.
Acknowledgment
This paper was supported by Grants MH078775 to Dr. Harvey and MH093432 to Drs. Harvey and Pinkham from the National Institute of Mental Health.
Disclosures
Dr. Harvey has received consulting fees from AbbVie, Boehringer Ingelheim, Forum Pharmaceuticals, Genentech, Otsuka America Pharmaceuticals, Roche, Sanofi, Sunovion Pharmaceuticals, and Takeda Pharmaceuticals. Dr. Pinkham has served as a consultant for Otsuka America Pharmaceuticals.
Lack of insight or “unawareness of illness” occurs within a set of self-assessment problems commonly seen in schizophrenia.1 In the clinical domain, people who do not realize they are ill typically are unwilling to accept treatment, including medication, with potential for worsened illness. They also may have difficulty self-assessing everyday function and functional potential, cognition, social cognition, and attitude, often to a variable degree across these domains (Table 1).1-3
Self-assessment of performance can be clinically helpful whether performance is objectively good or bad. Those with poor performance could be helped to match their aspirations to their accomplishments and to improve over time. Good performers could have their functioning bolstered by recognizing their competence. Thus, even a population whose performance often is poor could benefit from accurate self-assessment, or experience additional challenges from inaccurate self-evaluation.
This article discusses patient characteristics associated with impairments in self-assessment and the most accurate sources of information for clinicians about patient functioning. Our research shows that an experienced psychiatrist is well positioned to make accurate judgments of functional potential and cognitive abilities for people with schizophrenia.
Patterns in patients with impaired self-assessment
Healthy individuals routinely overestimate their abilities and their attractiveness to others.4 Feedback that deflates these exaggerated estimates increases the accuracy of their self-assessments. Mildly depressed individuals typically are the most accurate judges of their true functioning; those with more severe levels of depression tend to underestimate their competence. Thus, simply being an inaccurate self-assessor is not “abnormal.” These response biases are consistent and predictable in healthy people.
People with severe mental illness pose a different challenge. As in the following cases, their reports manifest minimal correlation with other sources of information, including objective information about performance.
CASE 1
JR, age 28, is referred for occupational therapy because he has never worked since graduating from high school. He tells the therapist his cognitive abilities are average and intact, although his scores on a comprehensive cognitive assessment suggest performance at or below the first percentile of the normal distribution. His self-reported Beck Depression Inventory (BDI) score is 4. He says he would like to work as a certified public accountant, because he believes he has an aptitude for math. He admits he has no idea what the job entails, but he is quite motivated to set up an interview as soon as possible.
CASE 2
LM, age 48, says his “best job” was managing an auto parts store for 18 months after he earned an associate’s degree and until his second psychotic episode. His most recent work was approximately 12 years ago at an oil-change facility. He agrees to discuss employment but feels his vocational skills are too deteriorated for him to succeed and requests an assessment for Alzheimer’s disease. His cognitive performance averages in the 10th percentile of the overall population, and his BDI score is 18. Tests of his ability to perform vocational skills suggest he is qualified for multiple jobs, including his previous technician position.
Individuals with schizophrenia who report no depression and no work history routinely overestimate their functional potential, whereas those with a history of unsuccessful vocational attempts often underestimate their functional potential. Inaccurate self-assessment can contribute to reduced functioning—in JR’s case because of unrealistic assessment of the match between skills and vocational potential, and in LM’s case because of overly pessimistic self-evaluation. For people with schizophrenia, inability to self-evaluate can have a bidirectional adverse impact on functioning: overestimation may lead to trying tasks that are too challenging, and underestimation may lead to reduced effort and motivation to take on functional tasks.
Metacognition and introspective accuracy
“Metacognition” refers to self-assessment of the quality and accuracy of performance on cognitive tests.5-7 Problem-solving tests, such as the Wisconsin Card Sorting Test (WCST), in which the person being assessed must solve the test through performance feedback, are metacognition tests. When errors are made, the strategy in use needs to be discarded; when responses are correct, the strategy is retained. People with schizophrenia have disproportionate difficulties with the WCST, and deficits are especially salient when the test is modified to measure self-assessment of performance and ability to use feedback to change strategies.
“Introspective accuracy” is used to describe the wide-ranging self-assessment impairments in severe mental illness. Theories of metacognition implicate a broad spectrum, of which self-assessment is 1 component, whereas introspective accuracy more specifically indicates judgments of accuracy. Because self-assessment is focused on the self, and hence is introspective, this conceptualization can be applied to self-evaluations of:
• achievement in everyday functioning (“Did I complete that task well?”)
• potential for achievement in everyday functioning (“I could do that job”)
• cognitive performance (“Yes, I remembered all of those words”)
• social cognition (“He really is angry”).
Domains of impaired introspective accuracy
Everyday functioning. The 3 global domains of everyday functioning are social outcomes, productive/vocational outcomes, and everyday activities, including residential independence/support for people with severe mental illness. Two areas of inquiry are used in self-assessing everyday functioning: (1) what are you doing now and (2) what could you do in the future? For people with schizophrenia, a related question is how perceived impairments in everyday functioning are associated with subjective illness burden.
People with schizophrenia report illness burden consistent with their self-reported disability, suggesting their reports in these domains are not random.8 Studies have consistently found, however, that these patients report:
• less impairment on average in their everyday functioning than observed by clinicians
• less subjective illness burden compared with individuals with much less severe illnesses.
Their reports also fail to correlate with clinicians’ observations.9 Patients with schizophrenia who have never been employed may report greater vocational potential than those employed full-time. Interestingly, patients who were previously—but not currently—employed reported the least vocational potential.10 These data suggest that experience may be a factor: individuals who have never worked have no context for their self-assessments, whereas people who are persistently unemployed may have a perspective on the challenges associated with employment.
In our research,9 high-contact clinicians (ie, case manager, psychiatrist, therapist, or residential facility manager) were better able than family or friends to generate ratings from an assessment questionnaire that correlated with performance-based measures of patients’ ability to perform everyday functional skills. The ratings were generated across multiple functional status scales, suggesting that the rater was more important than the specific scale. We concluded that high-contact clinicians can generate ratings of everyday functioning that are convergent with patients’ abilities, even when they have no information about actual performance scores.
Cognitive performance. When self-reported cognitive abilities are correlated with the results of performance on neuropsychological assessments, the results are quite consistent. Patients provide reports that do not correlate with their objective performance.11 Interestingly, when clinicians were asked to use the same strategies as patients to generate ratings of cognitive impairment, clinician ratings had considerably greater evidence of validity. In several studies, patients’ ratings of their cognitive performance did not correlate with their neuropsychological test performance, even though they had just been tested on the assessment battery. Ratings by clinicians or other high-contact informants (who were unaware of patients’ test performance) were much more strongly related to patients’ objective test performance, compared with patient self-reports.12
The convergence of clinician ratings of cognitive performance with objective test data has been impressive. Correlation coefficients of at least r = 0.5, reflecting moderate to large relationships between clinician ratings and objective performance, have been detected. Individual cognitive test domains, such as working memory and processing speed, often do not correlate with each other or with aspects of everyday functioning to that extent.13 These data suggest that a clinician assessment of cognitive performance, when focused on the correct aspects of cognitive functioning, can be a highly useful proxy for extensive neuropsychological testing.
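The strength of convergence described above can be made concrete with a simple correlation computation. The sketch below uses entirely hypothetical clinician ratings and test scores (none of these numbers come from the studies cited) to show how a Pearson r in this range is calculated:

```python
def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical data: clinician ratings of cognitive ability (1-7 scale)
# and composite neuropsychological test scores for 8 patients.
clinician_ratings = [2, 3, 5, 4, 6, 3, 5, 2]
test_scores = [40, 45, 60, 52, 70, 48, 63, 38]

r = pearson_r(clinician_ratings, test_scores)
print(f"r = {r:.2f}")
```

In this toy example the ratings track the objective scores closely, so r comfortably exceeds the 0.5 benchmark; real clinician-objective correlations are of course weaker than invented data.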
Social cognitive performance. Introspective accuracy for social cognitive judgments can be assessed similarly to the strategies used to assess the domains of everyday functioning and cognitive performance. Patients are asked to complete a typical social cognitive task, such as determining emotions from facial stimuli or examining the eye region of the face, to determine the mental state of the depicted person. Immediately after responding to each stimulus, participants rate their confidence in the correctness of that response.
Consistent with the pattern of introspective accuracy for everyday functioning, patients with schizophrenia tend to make more high-confidence errors than healthy individuals on social cognitive tasks. That is, the patients are less likely to realize when they are wrong in their judgments of social stimuli. A similar pattern has been found for mental state attribution,14 recognition of facial emotion from the self,15 and recognition of facial emotion from others.16 These high-confidence errors also are more likely to occur for more difficult stimuli, such as faces that display only mildly emotional expressions. These difficulties appear to be specific to judgments in an immediate evaluation situation. When asked to determine if the behavior of another individual is socially appropriate, individuals with schizophrenia are as able as healthy individuals to recognize social mistakes.17 This work suggests that, at least within the domain of social cognition, introspective accuracy impairment is not caused by generalized poor judgment, just as self-assessments of disability and illness burden are not generated at random.
Choosing a reliable informant
If a clinician has not had adequate time or exposure to a patient to make a cognitive or functional judgment, what should the strategy be? If asking the patient is uninformative, who should be asked? Our group has gathered information that may help clinicians identify informants who can provide ratings of cognitive performance and everyday functioning that are convergent with objective evidence.
In a systematic study of the validity of reports from various informants, we compared informant reports of competence in everyday functioning with objective measures of cognitive test performance and the ability to perform everyday functional skills. Our findings:
• Patient reports of everyday functioning were not correlated with performance-based measures for any of 6 rating scales.9
• Clinician reports of everyday functioning were correlated with objective performance across 4 of 6 rating scales.
• Correlations between ratings generated by friend or relative informants and other information were almost shocking in their lack of validity (Table 2).9
We concluded that ratings generated by a generic informant (someone who simply knows the patient and is willing to provide ratings) are highly likely to be uninformative. Because a friend or relative may provide information of limited usefulness, relying on such a report could easily lead to clinical decisions with high potential for bad outcomes. For example, attempts could fail to transition someone with impaired everyday living skills to independent living, or a patient whose potential is underestimated might not be offered opportunities to achieve attainable functional goals.
We found that the closer the rater was to a full caregiver role, the better and more accurate the information obtained. Caregivers who had regular contact with patients had much more valid ratings when performance on functionally relevant objective measures was considered. Patients with caregivers had greater impairments in everyday outcomes, however, suggesting that this subset was more impaired than the overall sample. For patients without caregivers, other sources of information—including careful observation by high-contact clinicians—seem to be required to generate a valid assessment of functioning.
Direct functional implications of impaired introspective accuracy
Clinical effects of reduced awareness of illness include reduced adherence to medication, followed by relapse; disturbed behavior leading to emergency room treatment or acute admission; and, more rarely, disturbed behavior associated with violence or self-harm. Relapses such as these can adversely affect brain structure and function, with declines in cognitive functioning early in the illness.
Our recent study18 quantifies the direct impact of impairments in introspective accuracy on everyday functioning. We asked 214 individuals with schizophrenia to self-evaluate their cognitive ability with a systematic rating scale and to self-report their everyday functioning in social, vocational, and everyday activities domains. We used performance-based measures to assess their cognitive abilities and everyday functional skills. Concurrently, high-contact clinicians rated these same abilities with the same rating scales. We then predicted everyday functioning, as rated by the clinicians, with the discrepancies between self-assessed and clinician-assessed functioning, and patients’ scores on the performance-based measures.
Impaired introspective accuracy, as indexed by difference scores between clinician ratings and self-reports, was a more potent predictor of everyday functional deficits in social, vocational, and everyday activities domains than scores on performance-based measures of cognitive abilities and functional skills. Even when we analyzed only deficits in introspective accuracy for cognition as the predictor of everyday outcomes in these 3 real-world functional domains, the results were the same. Impaired introspective accuracy was the single best predictor of everyday functioning in all 3 domains, with actual abilities considerably less important.
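The discrepancy-score logic of this analysis can be illustrated with a short sketch: compute self-minus-clinician difference scores as the index of introspective accuracy, then compare how strongly the discrepancies and the performance-based scores each correlate with clinician-rated outcomes. All values below are invented for illustration; they are not data from the study:

```python
def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Invented ratings for 6 patients (higher = better functioning/ability).
self_ratings      = [6, 5, 6, 6, 5, 7]          # patients' self-assessments
clinician_ratings = [3, 5, 2, 6, 4, 3]          # clinician assessments of the same abilities
performance       = [50, 60, 45, 55, 70, 40]    # performance-based measure
outcome           = [40, 75, 30, 80, 65, 32]    # clinician-rated everyday functioning

# Introspective-accuracy index: positive discrepancy = overestimation.
discrepancy = [s - c for s, c in zip(self_ratings, clinician_ratings)]

r_discrepancy = pearson_r(discrepancy, outcome)
r_performance = pearson_r(performance, outcome)
print(f"discrepancy vs outcome: r = {r_discrepancy:.2f}")
print(f"performance vs outcome: r = {r_performance:.2f}")
```

In these invented numbers, greater overestimation correlates negatively with outcome and more strongly than the performance measure does, mirroring the pattern the study reports.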
Patient characteristics that predict introspective accuracy
Patient characteristics associated with impairments in introspective accuracy (Table 3)19,20 are easy to identify and assess. Subjective reports of depression have a bell-shaped relationship with introspective accuracy. A self-reported score of 0 by a disabled schizophrenia patient suggests some unawareness of an unfortunate life situation; mild to moderate scores are associated with more accurate self-assessment; and more severe scores, as seen in other conditions, often predict overestimation of disability.19
Psychosis and negative symptoms are associated with reduced introspective accuracy and global over-reporting of functional competence.20 Patients who have never worked have no way to comprehend the specific challenges associated with obtaining and sustaining employment. Patients who had a job and have not been able to return to work may perceive barriers as more substantial than they are.
Tips to manage impairments in introspective accuracy
Ensure that assessment information is valid. If a patient has limited ability to self-assess, seek other sources of data. If a patient has psychotic symptoms, denies being depressed, or has limited life experience, the clinician should adjust his or her interpretation of the self-report accordingly, because these factors are known to adversely affect the accuracy of self-assessment. Consider informants’ level and quality of contact with the patient, as well as any motivation or bias that might influence the accuracy of their reports. Other professionals, such as occupational therapists, can provide useful information as reference points for treatment planning.
Consider treatments aimed at increasing introspective accuracy, such as structured training and exposure to self-assessment situations,6 and interventions aimed at increasing organization and skills performance. Cognitive remediation therapies, although not widely available, have potential to improve functioning, with excellent persistence over time.21
Related Resources
• Harvey PD, ed. Cognitive impairment in schizophrenia: characteristics, assessment and treatment. Cambridge, United Kingdom: Cambridge University Press; 2013.
• Gould F, McGuire LS, Durand D, et al. Self-assessment in schizophrenia: accuracy of assessment of cognition and everyday functioning [published online February 2, 2015]. Neuropsychology.
• Dunning D. Self-insight: detours and roadblocks on the path to knowing thyself. New York, NY: Psychology Press; 2012.
Acknowledgment
This paper was supported by Grants MH078775 to Dr. Harvey and MH093432 to Drs. Harvey and Pinkham from the National Institute of Mental Health.
Disclosures
Dr. Harvey has received consulting fees from AbbVie, Boehringer Ingelheim, Forum Pharmaceuticals, Genentech, Otsuka America Pharmaceuticals, Roche, Sanofi, Sunovion Pharmaceuticals, and Takeda Pharmaceuticals. Dr. Pinkham has served as a consultant for Otsuka America Pharmaceuticals.
1. Amador XF, Flaum M, Andreasen NC, et al. Awareness of illness in schizophrenia and schizoaffective and mood disorders. Arch Gen Psychiatry. 1994;51(10):826-836.
2. Medalia A, Thysen J. A comparison of insight into clinical symptoms versus insight into neuro-cognitive symptoms in schizophrenia. Schizophr Res. 2010;118(1-3):134-139.
3. Beck AT, Baruch E, Balter JM, et al. A new instrument for measuring insight: the Beck Cognitive Insight Scale. Schizophr Res. 2004;68(2-3):319-329.
4. Kruger J, Dunning D. Unskilled and unaware of it: how difficulties in recognizing one’s own incompetence lead to inflated self-assessments. J Pers Soc Psychol. 1999;77(6):1121-1134.
5. Lysaker P, Vohs J, Ballard R, et al. Metacognition, self-reflection and recovery in schizophrenia. Future Neurology. 2013;8(1):103-115.
6. Lysaker PH, Dimaggio G. Metacognitive capacities for reflection in schizophrenia: implications for developing treatments. Schizophr Bull. 2014;40(3):487-491.
7. Koren D, Seidman LJ, Goldsmith M, et al. Real-world cognitive—and metacognitive—dysfunction in schizophrenia: a new approach for measuring (and remediating) more “right stuff.” Schizophr Bull. 2006;32(2):310-326.
8. McKibbin C, Patterson TL, Jeste DV. Assessing disability in older patients with schizophrenia: results from the WHODAS-II. J Nerv Ment Dis. 2004;192(6):405-413.
9. Sabbag S, Twamley EW, Vella L, et al. Assessing everyday functioning in schizophrenia: not all informants seem equally informative. Schizophr Res. 2011;131(1-3):250-255.
10. Gould F, Sabbag S, Durand D, et al. Self-assessment of functional ability in schizophrenia: milestone achievement and its relationship to accuracy of self-evaluation. Psychiatry Res. 2013;207(1-2):19-24.
11. Keefe RS, Poe M, Walker TM, et al. The Schizophrenia Cognition Rating Scale: an interview-based assessment and its relationship to cognition, real-world functioning, and functional capacity. Am J Psychiatry. 2006;163(3):426-432.
12. Durand D, Strassnig M, Sabbag S, et al. Factors influencing self-assessment of cognition and functioning in schizophrenia: implications for treatment studies [published online July 25, 2014]. Eur Neuropsychopharmacol. doi: 10.1016/j.euroneuro.2014.07.008.
13. McClure MM, Bowie CR, Patterson TL, et al. Correlations of functional capacity and neuropsychological performance in older patients with schizophrenia: evidence for specificity of relationships? Schizophr Res. 2007;89(1-3):330-338.
14. Köther U, Veckenstedt R, Vitzthum F, et al. “Don’t give me that look” - overconfidence in false mental state perception in schizophrenia. Psychiatry Res. 2012;196(1):1-8.
15. Demily C, Weiss T, Desmurget M, et al. Recognition of self-generated facial emotions is impaired in schizophrenia. J Neuropsychiatry Clin Neurosci. 2011;23(2):189-193.
16. Moritz S, Woznica A, Andreou C, et al. Response confidence for emotion perception in schizophrenia using a Continuous Facial Sequence Task. Psychiatry Res. 2012;200(2-3):202-207.
17. Langdon R, Connors MH, Connaughton E. Social cognition and social judgment in schizophrenia. Schizophrenia Research: Cognition. 2014;1(4):171-174.
18. Gould F, McGuire LS, Durand D, et al. Self-assessment in schizophrenia: accuracy of evaluation of cognition and everyday functioning [published online February 2, 2015]. Neuropsychology.
19. Bowie CR, Twamley EW, Anderson H, et al. Self-assessment of functional status in schizophrenia. J Psychiatr Res. 2007;41(12):1012-1018.
20. Sabbag S, Twamley EW, Vella L, et al. Predictors of the accuracy of self-assessment of everyday functioning in people with schizophrenia. Schizophr Res. 2012;137(1-3):190-195.
21. McGurk SR, Mueser KT, Feldman K, et al. Cognitive training for supported employment: 2-3 year outcomes of a randomized controlled trial. Am J Psychiatry. 2007;164(3):437-441.
Substance use disorders in adolescents with psychiatric comorbidity: When to screen and how to treat
Substance use during adolescence is common in the United States. Data from the 2014 Monitoring the Future Survey estimated that among 12th graders, 60.2% used alcohol, 35.1% used marijuana, and 13.9% used a prescription drug for nonmedical use within the previous year.1 An estimated 11.4% of adolescents meet DSM-IV threshold criteria for a substance use disorder (SUD).2 Substance use in adolescents often co-occurs with psychological distress and psychiatric illness. Adolescents with a psychiatric disorder are at increased risk for developing a SUD; conversely, high rates of psychiatric illness are seen in adolescents with a SUD.3,4 In one study, 82% of adolescents hospitalized for SUD treatment were found to have a co-occurring Axis I disorder.5 Furthermore, co-occurring psychiatric illness and SUD complicate treatment course and prognosis. Adolescents with co-occurring psychiatric illness and SUD often benefit from an integrated, multimodal treatment approach that includes psychotherapy, pharmacologic interventions, family involvement, and collaboration with community supports.
In this article, we focus on pharmacologic management of non-nicotinic SUDs in adolescents, with an emphasis on those with comorbid psychiatric illness.
Screening and assessment of substance use
It is important to counsel children with a psychiatric illness and their parents about the increased risk for SUD before a patient transitions to adolescence. Discussions about substance abuse should begin during the 5th grade because data suggest that adolescent substance use often starts in middle school (6th to 9th grade). Clinicians should routinely screen adolescent patients for substance use. Nonproprietary screening tools available through the National Institute on Alcohol Abuse and Alcoholism and the National Institute on Drug Abuse are listed in Table 1.6-8 The Screening to Brief Intervention (S2BI) is a newer tool that has been shown to be highly effective in identifying adolescents at risk for substance abuse and in differentiating severity of illness.8 The S2BI includes screening questions that assess for use of 8 substances in the past year.
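The S2BI triage described above can be sketched as a simple mapping from a past-year frequency response to a risk category. This is a minimal illustrative sketch, not the validated instrument: the response categories and risk labels below are assumptions for illustration, and clinicians should consult the published tool (Levy et al, reference 8) for the validated wording and scoring.

```python
# Hedged sketch of S2BI-style triage. The frequency categories and
# risk mapping are illustrative assumptions, not the published scoring.

RISK_BY_FREQUENCY = {
    "never": "no reported use",
    "once or twice": "lower risk; brief advice",
    "monthly": "moderate risk; further SUD assessment",
    "weekly or more": "high risk; full SUD evaluation",
}

def s2bi_triage(past_year_frequency: str) -> str:
    """Map a past-year use-frequency response to a triage category."""
    key = past_year_frequency.strip().lower()
    if key not in RISK_BY_FREQUENCY:
        raise ValueError(f"unrecognized response: {past_year_frequency!r}")
    return RISK_BY_FREQUENCY[key]
```

In practice the screen is repeated per substance (the S2BI covers 8), with the highest-risk response driving follow-up.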
Adolescents with psychiatric illness who are identified as at risk for problems associated with substance use should be evaluated further for the presence or absence of a SUD. The number of criteria a patient endorses over the past year is used to assess SUD severity—mild, moderate, or severe (Table 2).9 Additional considerations include substance use patterns, such as type, site, quantity, frequency, context, and combinations of substances.
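The DSM-5 criterion-count approach to severity can be expressed as a short function. The thresholds below (2 to 3 criteria for mild, 4 to 5 for moderate, 6 or more for severe, out of 11 criteria) follow DSM-5; the function name is ours.

```python
def sud_severity(criteria_met: int) -> str:
    """DSM-5 SUD severity from the number of criteria endorsed
    in the past year (DSM-5 lists 11 criteria per substance)."""
    if not 0 <= criteria_met <= 11:
        raise ValueError("DSM-5 lists 11 SUD criteria")
    if criteria_met < 2:
        return "no diagnosis"   # fewer than 2 criteria does not meet threshold
    if criteria_met <= 3:
        return "mild"
    if criteria_met <= 5:
        return "moderate"
    return "severe"
```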
It is important to be curious and nonjudgmental when evaluating substance use patterns with adolescents to obtain a comprehensive assessment. Teenagers often are creative and inventive in their efforts to maximize intoxication, which can put them at risk for complications associated with acute intoxication. Rapidly evolving methods of ingesting highly concentrated forms of tetrahydrocannabinol (“wax,” “dabs”) are an example of use patterns that are ahead of what is reported in the literature.
Any substance use in an adolescent with a psychiatric illness is of concern and should be monitored closely because of the potential impact of substance use on the co-occurring psychiatric illness and possible interactions between the abused substance and prescribed medication.
Treatment interventions
Although this review will focus on pharmacotherapy, individual, group, and family psychotherapies are a critical part of a treatment plan for adolescents with comorbid psychiatric illness and SUD (Table 3). Collaboration with community supports, including school and legal officials, can help reinforce contingencies and assist with connecting a teen with positive prosocial activities. Involvement with mutual help organizations, such as Alcoholics Anonymous, can facilitate adolescent engagement with a positive sober network.10
Pharmacologic strategies for treating co-occurring psychiatric illness and SUD include medication to:
• decrease substance use and promote abstinence
• alleviate withdrawal symptoms (medication to treat withdrawal symptoms and agonist treatments)
• block the effect of substance use (antagonist agents)
• decrease likelihood of substance use with aversive agents
• target comorbid psychiatric illness.
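The five strategies above can be grouped with the example agents this article discusses. This is an illustrative grouping of medications named later in the text, not a formulary or an indication list.

```python
# Illustrative grouping of the article's five pharmacologic strategies
# and the agents it discusses under each; not prescribing guidance.
STRATEGIES = {
    "decrease use / promote abstinence": [
        "naltrexone", "acamprosate", "N-acetylcysteine",
    ],
    "alleviate withdrawal (agonists)": [
        "methadone", "buprenorphine", "clonidine",
    ],
    "block substance effects (antagonists)": [
        "naltrexone",  # also appears above; antagonism is its mechanism
    ],
    "aversive agents": [
        "disulfiram",
    ],
    "treat comorbid psychiatric illness": [
        "SSRIs", "mood stabilizers", "ADHD medications",
    ],
}
```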
Medication to decrease substance use and promote abstinence. One strategy is to use medication to target cravings and urges to use substances. Naltrexone is an opiate antagonist that is FDA-approved for treating alcohol and opioid use disorders in adults; it is available as a daily oral medication and as a monthly injectable depot preparation (extended-release naltrexone). Two small open-label studies showed decreased alcohol use with naltrexone treatment in adolescents with alcohol use disorder.11,12 In a randomized, double-blind, placebo-controlled crossover trial (RCT) of 22 adolescent problem drinkers, naltrexone, 50 mg/d, reduced the likelihood of drinking and heavy drinking (P ≤ .03).13 For acamprosate, another anti-craving medication FDA-approved for treating alcohol use disorder in adults, there are no data on safety or efficacy in adolescent alcohol use disorder.
There is limited research on agents that decrease use of and promote abstinence from non-nicotinic substances other than alcohol. One pilot RCT evaluated N-acetylcysteine (NAC)—an over-the-counter supplement that modulates the glutamate system—for treating adolescent Cannabis dependence. Treatment with NAC, 2,400 mg/d, was well tolerated and was associated with more than twice the odds of a negative urine cannabinoid test during treatment compared with placebo.14 Although NAC treatment was associated with decreased Cannabis use, it did not significantly decrease cravings compared with placebo.15
Medication to alleviate withdrawal symptoms. Some patients find the physical discomfort and psychological distress associated with substance withdrawal so intolerable that they continue to use drugs or alcohol to avoid it. Medication to treat withdrawal symptoms and agonist treatments can be used to alleviate the discomfort and distress associated with withdrawal. Agonist treatments, such as methadone and buprenorphine, bind to the same receptors as the target substance, which allows the patient to shift to controlled use of a prescribed substitute. Agonist treatments are used both for short-term detoxification and over longer periods for maintenance treatment. Methadone, which decreases craving and withdrawal symptoms from opiates by binding to the μ-opiate receptor and blocking other substances from binding, is frequently used for detoxification and maintenance treatment in adults. There are limited data on methadone substitution therapy for adolescents in the United States.16 Methadone maintenance for adolescents in the United States is restricted to severe cases of opioid use disorder. Federal guidelines specify that adolescents age <18 can receive methadone only if they have had 2 unsuccessful detoxification attempts or outpatient psychosocial treatments and have met DSM criteria for an opioid use disorder for 1 year.17
Buprenorphine is a partial μ-opiate receptor agonist that is FDA-approved for use in adolescents age ≥16 with opioid dependence. Although a waiver from the U.S. Drug Enforcement Administration is required to prescribe buprenorphine, it generally can be administered in outpatient settings with relative ease compared with methadone.
Marsch et al18 examined the efficacy of buprenorphine compared with clonidine for detoxification over 1 month in 36 adolescents with opioid dependence. Clonidine is an α-2 adrenergic agonist that often is used during detoxification from opioids.19 Although both buprenorphine and clonidine relieved withdrawal symptoms, a significantly higher percentage of patients receiving buprenorphine completed treatment (72%) compared with those taking clonidine (39%) (P < .05).18 Detoxification with buprenorphine also was associated with a higher percentage of negative urine drug screens (64% vs 32%, P = .01), and those receiving buprenorphine were more likely to continue on naltrexone maintenance for continued medication-assisted treatment after detoxification compared with those randomized to clonidine.
Woody et al20 compared buprenorphine/naloxone for opioid detoxification vs short-term maintenance. Patients age 16 to 21 were randomized to detoxification over 2 weeks vs stabilization and maintenance for 9 weeks followed by a taper over 3 weeks. Maintenance treatment with buprenorphine/naloxone was associated with less opioid use, less injection drug use, and less need for addiction treatment outside of that received through the study compared with detoxification treatment. When buprenorphine/naloxone was discontinued, both the detoxification and maintenance groups had high rates of positive urine toxicology screens at 1-year follow-up (mean, 48% to 72%). These data suggest that maintenance with buprenorphine/naloxone for adolescents and young adults is more effective than short-term detoxification for stabilizing opioid use disorders, although the optimal treatment duration is unclear. Clinically, it is important to continue buprenorphine/naloxone maintenance until the patient has stabilized in recovery and has acquired coping skills to manage the urges, cravings, and psychological distress (eg, anger, stress) that often arise during a slow taper of agonist treatment.
Antagonist treatment to block the effect of substance use
As an opioid receptor antagonist, naltrexone is effective for treating opioid use disorder because it blocks the action of opioids. Fishman et al21 published a descriptive series of 16 adolescents and young adults who received the injectable depot preparation of (extended-release) naltrexone while in residential treatment, were discharged to outpatient care, and were followed over 4 months. Most patients who received extended-release naltrexone remained in outpatient treatment (63%) and reduced their opioid use or were abstinent at 4 months (56%). One barrier to naltrexone treatment is the need to be abstinent from opioids for 7 to 10 days to prevent precipitated opioid withdrawal. Therefore, naltrexone is a good option for adolescents who present for treatment early and are not physiologically dependent on opioids, or who are receiving treatment in a structured environment after detoxification, such as residential treatment or sober living.
Aversive agents to diminish substance use. Aversive agents produce an unpleasant reaction when a target substance is consumed. Disulfiram, the prototypic aversive agent, prevents the breakdown of acetaldehyde, a toxic metabolite of alcohol. Patients who drink alcohol while taking disulfiram may experience adverse effects, including tachycardia, shortness of breath, nausea, dizziness, and confusion. Two studies have examined the efficacy of disulfiram in adolescents with alcohol use disorder. Niederhofer et al22 found that disulfiram treatment significantly increased cumulative abstinence in a small RCT (P = .012). In another small randomized, open-label, 3-month study of adolescents who received disulfiram or naltrexone in addition to weekly psychotherapy, disulfiram was superior to naltrexone in mean days abstinent from alcohol (84 days vs 51 days, respectively; P = .0001).23 Adolescents often are unwilling to adhere to disulfiram because they are concerned about the aversive reaction when it is combined with alcohol. Consider prescribing disulfiram for adolescents who are about to go “on pass” from a therapeutic school or residential SUD treatment center and will be returning to an environment where they may be tempted to use alcohol.
Pharmacotherapy to treat co-occurring psychiatric illness
Continued treatment of a psychiatric illness that co-occurs with a SUD is important. As recommended above, consider psychosocial treatments for both the SUD and the comorbid psychopathology. Several single-site RCTs have evaluated the efficacy of the selective serotonin reuptake inhibitors (SSRIs) fluoxetine and sertraline for depressive disorders in adolescents with a co-occurring SUD.24-28 Most studies have shown improvement in depressive symptoms and substance use in both the medication and placebo groups.24,25,27,28 However, compared with placebo, treatment with fluoxetine, 20 mg/d, or sertraline, 100 mg/d, was associated with improved depressive symptoms in only 1 of 3 studies and with no significant difference in SUD outcomes. The authors of these studies attributed the general improvement in depression and the SUD to the concurrent use of cognitive-behavioral therapy (CBT) and/or motivational enhancement therapy.24,25,27,28
Research on the use of mood stabilizers for adolescents with mood dysregulation and a SUD is limited but has suggested benefit associated with pharmacotherapy (Table 4).29-32 Two RCTs and 1 open-label study demonstrated reductions in substance use with mood stabilizer treatment in adolescents with co-occurring SUD and mood dysregulation.29-32 The effect of pharmacotherapy on mood dysregulation ratings is less clear: no change in severity of affective symptoms was observed in a small RCT of lithium (average blood level, 0.9 mEq/L),29 and improvement in affective symptoms was noted in both the topiramate (300 mg/d) and placebo groups when both were treated with concurrent quetiapine.32 Because of the high risk of SUD and severe morbidity in juvenile bipolar disorder and severe mood dysregulation,33 larger RCTs are warranted.
Several studies have evaluated the impact of stimulant and nonstimulant treatments for attention-deficit/hyperactivity disorder (ADHD) in adolescents with a co-occurring SUD.34-39 The largest and only multisite study evaluated the efficacy of osmotic (extended) release methylphenidate (OROS-MPH) vs placebo for adolescents who also were receiving CBT for SUD.36 In this 16-week RCT, the OROS-MPH and placebo groups showed improvement in self-reported ADHD symptoms with no difference between groups. Parent report of ADHD symptoms did indicate a greater reduction in symptoms in the OROS-MPH group compared with placebo. Both groups had a decrease in self-reported days of substance use over the past month with no differences between groups. Pharmacotherapy trials for ADHD that have included psychotherapy highlight the effectiveness of CBT for SUD and co-occurring psychiatric illness.36,39,40
Although conduct disorder and anxiety disorders commonly co-occur with SUD, there has been less research evaluating the impact of pharmacotherapy on treating these disorders. Riggs et al25,34,35,41 evaluated the impact of pharmacotherapy targeted to co-occurring ADHD and major depressive disorder in the context of conduct disorder and SUD. When evaluated in an outpatient setting, the presence of a treatment intervention to address the co-occurring SUD was an important component that led to a reduction in conduct symptoms.25,35 There have been no comprehensive studies on the impact of pharmacotherapy for treating anxiety and SUD in adolescents.
Recommendations for clinical management
Although more research is needed to evaluate the role of pharmacotherapy for adolescents with co-occurring psychiatric illness and a SUD, recommended practice is to continue pharmacotherapy and closely monitor response to treatment when at-risk substance use begins in patients with co-occurring psychiatric illness. In adolescents with a threshold SUD, continue pharmacotherapy for unstable mood disorders, with first-line choices of SSRIs for unipolar depression and second-generation antipsychotics for bipolar spectrum illness. Suggested conservative pharmacologic interventions for anxiety disorders include SSRIs and buspirone, which have been shown to be effective for treating anxiety in children and adolescents.42,43 For patients with comorbid ADHD and SUD, if possible, first stabilize substance use (low-level use or abstinence) and consider treating the ADHD immediately thereafter with a nonstimulant such as atomoxetine, which has efficacy and safety data in the context of substance use, and/or an α-agonist or an extended-release stimulant. Because of the potential for misuse and toxicity associated with concurrent substance use, benzodiazepines should be considered a treatment of last resort for adolescents with anxiety disorders and a SUD. Similarly, immediate-release stimulants should be avoided in patients with ADHD and a SUD. When prescribing medications that could be misused or that are toxic when combined with a substance, evaluate the risks and benefits of continued use of the particular medication and consider prescribing smaller quantities (a 1- to 2-week supply) to decrease the risk of misuse. Adolescents often are reluctant to engage in SUD treatment; one strategy to consider is making continued prescription of any medication contingent on engagement in SUD treatment.
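The sequencing recommendation for comorbid ADHD and SUD can be sketched as simple decision logic: stabilize substance use first, then prefer nonstimulants or extended-release formulations and avoid immediate-release stimulants. This is an illustrative sketch of the recommendation in this article, not a clinical protocol; the function and option labels are ours.

```python
# Hedged sketch of the ADHD + SUD medication-sequencing recommendation
# described above. Illustrative only; not prescribing guidance.

def adhd_medication_options(substance_use_stabilized: bool) -> list:
    """Return candidate ADHD pharmacotherapy options given SUD status."""
    if not substance_use_stabilized:
        # Recommendation: stabilize substance use (low-level use or
        # abstinence) before starting ADHD pharmacotherapy, if possible.
        return ["prioritize SUD treatment before starting ADHD pharmacotherapy"]
    return [
        "atomoxetine (nonstimulant)",
        "alpha-agonist",
        "extended-release stimulant (eg, OROS-MPH)",
        # immediate-release stimulants are avoided: misuse potential
    ]
```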
Enlist parents in helping to monitor, store, and administer their child’s medication to improve adherence and decrease the potential for misuse, diversion, and complications associated with substance intoxication.
Bottom Line
It is important to screen for substance use in adolescents with co-occurring psychiatric illness and vice versa. When at-risk or hazardous substance use is detected, there are effective psychosocial and pharmacologic interventions that can be used to treat adolescent substance use disorders alone and in combination with certain psychiatric disorders.
Related Resources
• National Institute on Drug Abuse. www.drugabuse.gov.
• National Institute on Alcohol Abuse and Alcoholism. www.niaaa.nih.gov.
• Substance Abuse and Mental Health Services Administration. www.samhsa.gov.
Drug Brand Names
Acamprosate • Campral
Atomoxetine • Strattera
Buprenorphine • Subutex
Buprenorphine/naloxone • Suboxone
Buspirone • Buspar
Clonidine • Catapres
Disulfiram • Antabuse
Fluoxetine • Prozac
Lithium • Lithobid, Eskalith
Methadone • Dolophine
Naltrexone • ReVia, Vivitrol
Osmotic (extended) release methylphenidate • Concerta
Quetiapine • Seroquel
Sertraline • Zoloft
Topiramate • Topamax
Valproic acid • Depakote
Disclosures
Dr. Yule received grant support from the 2012 American Academy of Child and Adolescent Psychiatry Pilot Research Award for Junior Faculty supported by Lilly USA, LLC, and receives grant support from the 2014 Louis V. Gerstner III Research Scholar Award. Dr. Wilens has received grant support from the National Institute on Drug Abuse (NIDA); has been a consultant for Euthymics/Neurovance, NIDA, Ironshore Pharmaceuticals and Development, Theravance Biopharma, Tris Pharma, the U.S. National Football League (ERM Associates), U.S. Minor/Major League Baseball, and Bay Cove Human Services (Clinical Services).
1. Johnston LD, Miech RA, O’Malley PM, et al. Monitoring the future, Table 2: trends in annual prevalence of use of various drugs in grades 8, 10, and 12. http://www.monitoringthefuture.org/data/14data.html#2014data-drugs. Published December 16, 2014. Accessed January 6, 2015.
2. Merikangas KR, He JP, Burstein M, et al. Lifetime prevalence of mental disorders in U.S. adolescents: results from the National Comorbidity Survey Replication--Adolescent Supplement (NCS-A). J Am Acad Child Adolesc Psychiatry. 2010;49(10):980-989.
3. Kandel DB, Johnson JG, Bird HR, et al. Psychiatric disorders associated with substance use among children and adolescents: findings from the Methods for the Epidemiology of Child and Adolescent Mental Disorders (MECA) Study. J Abnorm Child Psychol. 1997;25(2):122-132.
4. Roberts RE, Roberts CR, Xing Y. Comorbidity of substance use disorders and other psychiatric disorders among adolescents: evidence from an epidemiologic survey. Drug Alcohol Depend. 2007;88(suppl 1):S4-S13.
5. Stowell R, Estroff TW. Psychiatric disorders in substance-abusing adolescent inpatients: a pilot study. J Am Acad Child Adolesc Psychiatry. 1992;31(6):1036-1040.
6. National Institute on Alcohol Abuse and Alcoholism. Alcohol screening and brief intervention for youth: a practitioner’s guide. http://www.niaaa.nih.gov/Publications/EducationTrainingMaterials/Pages/YouthGuide.aspx. Accessed March 11, 2015.
7. Children’s Hospital Boston. The CRAFFT screening interview. http://www.integration.samhsa.gov/clinical-practice/sbirt/CRAFFT_Screening_interview.pdf. Published 2009. Accessed March 11, 2015.
8. Levy S, Weiss R, Sherritt L, et al. An electronic screen for triaging adolescent substance use by risk levels. JAMA Pediatr. 2014;168(9):822-828.
9. Diagnostic and statistical manual of mental disorders, 5th ed. Washington, DC: American Psychiatric Association; 2013.
10. Kelly JF, Myers MG. Adolescents’ participation in Alcoholics Anonymous and Narcotics Anonymous: review, implications and future directions. J Psychoactive Drugs. 2007;39(3):259-269.
11. Lifrak PD, Alterman AI, O’Brien CP, et al. Naltrexone for alcoholic adolescents. Am J Psychiatry. 1997;154(3):439-441.
12. Deas D, May MP, Randall C, et al. Naltrexone treatment of adolescent alcoholics: an open-label pilot study. J Child Adolesc Psychopharmacol. 2005;15(5):723-728.
13. Miranda R, Ray L, Blanchard A, et al. Effects of naltrexone on adolescent alcohol cue reactivity and sensitivity: an initial randomized trial. Addict Biol. 2014;19(5):941-954.
14. Gray KM, Carpenter MJ, Baker NL, et al. A double-blind randomized controlled trial of N-acetylcysteine in cannabis-dependent adolescents. Am J Psychiatry. 2012;169(8):805-812.
15. Roten AT, Baker NL, Gray KM. Marijuana craving trajectories in an adolescent marijuana cessation pharmacotherapy trial. Addict Behav. 2013;38(3):1788-1791.
16. Hopfer CJ, Khuri E, Crowley TJ, et al. Adolescent heroin use: a review of the descriptive and treatment literature. J Subst Abuse Treat. 2002;23(3):231-237.
17. Center for Substance Abuse Treatment. Medication-assisted treatment for opioid addiction in opioid treatment programs. Treatment Improvement Protocol (TIP) Series 43. HHS Publication No. (SMA) 12-4214. Rockville, MD: Substance Abuse and Mental Health Services Administration; 2005.
18. Marsch LA, Bickel WK, Badger GJ, et al. Comparison of pharmacological treatments for opioid-dependent adolescents: a randomized controlled trial. Arch Gen Psychiatry. 2005;62(10):1157-1164.
19. Gowing L, Farrell MF, Ali R, et al. Alpha2-adrenergic agonists for the management of opioid withdrawal. Cochrane Database Syst Rev. 2014;3:CD002024.
20. Woody GE, Poole SA, Subramaniam G, et al. Extended vs short-term buprenorphine-naloxone for treatment of opioid-addicted youth: a randomized trial. JAMA. 2008; 300(17):2003-2011.
21. Fishman MJ, Winstanley EL, Curran E, et al. Treatment of opioid dependence in adolescents and young adults with extended release naltrexone: preliminary case-series and feasibility. Addiction. 2010;105(9):1669-1676.
22. Niederhofer H, Staffen W. Comparison of disulfiram and placebo in treatment of alcohol dependence of adolescents. Drug Alcohol Rev. 2003;22(3):295-297.
23. De Sousa AA, De Sousa J, Kapoor H. An open randomized trial comparing disulfiram and naltrexone in adolescents with alcohol dependence. J Subst Use. 2008;13(6):382-388.
24. Deas D, Randall CL, Roberts JS, et al. A double-blind, placebo-controlled trial of sertraline in depressed adolescent alcoholics: a pilot study. Hum Psychopharmacol. 2000;15(6):461-469.
25. Riggs PD, Mikulich-Gilbertson SK, Davies RD, et al. A randomized controlled trial of fluoxetine and cognitive behavioral therapy in adolescents with major depression, behavior problems, and substance use disorders. Arch Pediatr Adolesc Med. 2007;161(11):1026-1034.
26. Findling RL, Pagano ME, McNamara NK, et al. The short-term safety and efficacy of fluoxetine in depressed adolescents with alcohol and cannabis use disorders: a pilot randomized placebo-controlled trial. Child Adolesc Psychiatry Ment Health. 2009;3(1):11.
27. Cornelius JR, Bukstein OG, Douaihy AB, et al. Double-blind fluoxetine trial in comorbid MDD-CUD youth and young adults. Drug Alcohol Depend. 2010;112(1-2):39-45.
28. Cornelius JR, Bukstein OG, Wood DS, et al. Double-blind placebo-controlled trial of fluoxetine in adolescents with comorbid major depression and an alcohol use disorder. Addict Behav. 2009;34(10):905-909.
29. Geller B, Cooper TB, Sun K, et al. Double-blind and placebo controlled study of lithium for adolescent bipolar disorders with secondary substance dependency. J Am Acad Child Adolesc Psychiatry. 1998;37(2):171-178.
30. Donovan SJ, Susser ES, Nunes E. Divalproex sodium for use with conduct disordered adolescent marijuana users. Am J Addict. 1996;5(2):181.
31. Donovan SJ, Susser ES, Nunes EV, et al. Divalproex treatment of disruptive adolescents: a report of 10 cases. J Clin Psychiatry. 1997;58(1):12-15.
32. DelBello M. Topiramate plus quetiapine cut Cannabis use in bipolar teens. Paper presented at: American Academy of Child and Adolescent Psychiatry Annual Meeting; November 2011; Toronto, Ontario, Canada.
33. Wilens TE, Biederman J, Adamson JJ, et al. Further evidence of an association between adolescent bipolar disorder with smoking and substance use disorders: a controlled study. Drug Alcohol Depend. 2008;95(3):188-198.
34. Riggs PD, Leon SL, Mikulich SK, et al. An open trial of bupropion for ADHD in adolescents with substance use disorders and conduct disorder. J Am Acad Child Adolesc Psychiatry. 1998;37(12):1271-1278.
35. Riggs PD, Hall SK, Mikulich-Gilbertson SK, et al. A randomized controlled trial of pemoline for attention-deficit/hyperactivity disorder in substance-abusing adolescents. J Am Acad Child Adolesc Psychiatry. 2004;43(4):420-429.
36. Riggs PD, Winhusen T, Davies RD, et al. Randomized controlled trial of osmotic-release methylphenidate with cognitive-behavioral therapy in adolescents with attention-deficit/hyperactivity disorder and substance use disorders. J Am Acad Child Adolesc Psychiatry. 2011;50(9):903-914.
37. Szobot CM, Rohde LA, Katz B, et al. A randomized crossover clinical study showing that methylphenidate- SODAS improves attention-deficit/hyperactivity disorder symptoms in adolescents with substance use disorder. Braz J Med Biol Res. 2008;41(3):250-257.
38. Solhkhah R, Wilens TE, Daly J, et al. Bupropion SR for the treatment of substance-abusing outpatient adolescents with attention-deficit/hyperactivity disorder and mood disorders. J Child Adolesc Psychopharmacol. 2005;15(5): 777-786.
39. Thurstone C, Riggs PD, Salomonsen-Sautel S, et al. Randomized, controlled trial of atomoxetine for attention-deficit/hyperactivity disorder in adolescents with substance use disorder. J Am Acad Child Adolesc Psychiatry. 2010;49(6):573-582.
40. Zulauf CA, Sprich SE, Safren SA, et al. The complicated relationship between attention deficit/hyperactivity disorder and substance use disorders. Curr Psychiatry Rep. 2014;16(3):436.
41. Riggs PD, Mikulich SK, Coffman LM, et al. Fluoxetine in drug-dependent delinquents with major depression: an open trial. J Child Adolesc Psychopharmacol. 1997;7(2):87-95.
42. Mohatt J, Bennett SM, Walkup JT. Treatment of separation, generalized, and social anxiety disorders in youths. Am J Psychiatry. 2014;171(7):741-748.
43. Strawn JR, Sakolsky DJ, Rynn MA. Psychopharmacologic treatment of children and adolescents with anxiety disorders. Child Adolesc Psychiatr Clin N Am. 2012; 21(3):527-539.
Woody et al20 compared use of buprenorphine/naloxone for opioid detoxification vs short-term maintenance. Patients age 16 to 21 were randomized to detoxification over 2 weeks vs stabilization and maintenance for 9 weeks and taper over 3 weeks. Maintenance treatment with buprenorphine/naloxone was associated with less opioid use, less injection drug use, and less need for addiction treatment outside of that received through the study compared with detoxification treatment. When buprenorphine/naloxone was discontinued both the detoxification and maintenance groups had high rates of positive urine toxicology screens at 1-year follow up (mean 48% to 72%). These data suggests maintenance with buprenorphine/ naloxone for adolescents and young adults is more effective than short-term detoxification for stabilizing opioid use disorders, although optimal treatment duration is unclear. Clinically, it is important to continue buprenorphine/naloxone maintenance until the patient has stabilized in recovery and has acquired coping skills to manage urges, cravings, and psychological distress (eg, anger, stress) that often arise during a slow taper of agonist treatment.
Antagonist treatment to block the effect of substance use
As an opioid receptor antagonist, naltrexone is effective for treating opioid use disorder because it blocks the action of opioids. Fishman et al21 published a descriptive series of 16 adolescents and young adults followed over 4 months who received the injectable depot preparation (extended-release) naltrexone while in residential treatment, and then discharged to outpatient care. Most patients who received extended-release naltrexone remained in outpatient treatment (63%) and reduced their opioid use or were abstinent at 4 months (56%). One barrier to naltrexone treatment is the need to be abstinent from opioids for 7 to 10 days to prevent precipitated opioid withdrawal. Therefore, naltrexone is a good option for adolescents who present for treatment early and are not physiologically dependent on opioids or are receiving treatment in a structured environment after detoxification, such as residential treatment or sober living.
Aversive agents to diminish substance use. Aversive agents produce an unpleasant reaction when a target substance is consumed. Disulfiram is prototypic aversive agent that prevents the breakdown of acetaldehyde, a toxic metabolite of alcohol. Patients who drink alcohol while taking disulfiram may experience adverse effects, including tachycardia, shortness of breath, nausea, dizziness, and confusion. There have been 2 studies examining the efficacy of disulfiram in adolescents with alcohol use disorder. Niederhofer et al22 found that disulfiram treatment significantly increased cumulative abstinence in a small RCT (P = .012). In another small randomized, open-label, 3-month study of adolescents who received disulfiram or naltrexone in addition to weekly psychotherapy, disulfiram was superior to naltrexone in mean days abstinent from alcohol, 84 days vs 51 days, respectively (P = .0001).23 Often adolescents are not willing to adhere to disulfiram because they are concerned about the aversive reaction when combined with alcohol use. Consider prescribing disulfiram for adolescents who are about to go “on pass” from a therapeutic school or residential SUD treatment center and will be returning to an environment where they may be tempted to use alcohol.
Pharmacotherapy to treat co-occurring psychiatric illness
Continued treatment of a psychiatric illness that co-occurs with SUD is important. As we recommended, consider psychosocial treatments for both the SUD and comorbid psychopathology. Several single-site RCTs have evaluated the efficacy of the selective serotonin reuptake inhibitors (SSRIs) fluoxetine and sertraline for depressive disorders in adolescents with a co-occurring SUD.24-28 Most studies have shown improvement in depressive symptoms and substance use in medication and placebo groups.24,25,27,28 However, treatment with fluoxetine, 20 mg/d, or sertraline, 100 mg/d, when compared with placebo was associated with improved depressive symptoms in 1 of 3 studies and had no significant difference in SUD outcome. The authors of these studies believe that the general improvement in depression and the SUD was related to use of cognitive-behavioral therapy (CBT) and/or motivational enhancement therapy.24,25,27,28
Research on the use of mood stabilizers for adolescents with mood dysregulation and a SUD is limited but has suggested benefit associated with pharmacotherapy (Table 4).29-32 Two RCTs and 1 open-label study demonstrated reductions in substance use with mood stabilizer treatment in adolescents with co-occurring SUD and mood dysregulation.29-32 The effect of pharmacotherapy on mood dysregulation ratings are less clear because there was no change in severity of affective symptoms observed in a small RCT of lithium (average blood level 0.9 mEq/L)29; and improvement in affective symptoms was noted in topiramate (300 mg/d) and placebo groups when both groups were treated with concurrent quetiapine.32 Because of the high risk of SUD and severe morbidity in juvenile bipolar disorder and severe mood dysregulation,33 larger RCTs are warranted.
Several studies have evaluated the impact of stimulant and nonstimulant treatments for attention-deficit/hyperactivity disorder (ADHD) in adolescents with a co-occurring SUD.34-39 The largest and only multisite study evaluated the efficacy of osmotic (extended) release methylphenidate (OROS-MPH) vs placebo for adolescents who also were receiving CBT for SUD.36 In this 16-week RCT, the OROS-MPH and placebo groups showed improvement in self-reported ADHD symptoms with no difference between groups. Parent report of ADHD symptoms did indicate a greater reduction in symptoms in the OROS-MPH group compared with placebo. Both groups had a decrease in self-reported days of substance use over the past month with no differences between groups. Pharmacotherapy trials for ADHD that have included psychotherapy highlight the effectiveness of CBT for SUD and co-occurring psychiatric illness.36,39,40
Although conduct disorder and anxiety disorders commonly co-occur with SUD, there has been less research evaluating the impact of pharmacotherapy on treating these disorders. Riggs et al25,34,35,41 evaluated the impact of pharmacotherapy targeted to co-occurring ADHD and major depressive disorder in the context of conduct disorder and SUD. When evaluated in an outpatient setting, the presence of a treatment intervention to address the co-occurring SUD was an important component that led to a reduction in conduct symptoms.25,35 There have been no comprehensive studies on the impact of pharmacotherapy for treating anxiety and SUD in adolescents.
Recommendations for clinical management
Although more research is needed to evaluate the role of pharmacotherapy for adolescents with co-occurring psychiatric illness and a SUD, recommended practice is to continue pharmacotherapy and closely monitor response to treatment when at-risk substance use begins in patients with co-occurring psychiatric illness. In adolescents with a threshold SUD, continue pharmacotherapy for unstable mood disorders with first-line choices of SSRIs for unipolar depression and second-generation antipsychotics for bipolar spectrum illness. Suggested conservative pharmacological interventions for anxiety disorders include SSRIs and buspirone, which have been shown to be effective for treating anxiety in children and adolescents.42,43 For patients with comorbid ADHD and SUD, if possible, it is recommended to first stabilize substance use (low-level use or abstinence) and consider treating ADHD immediately thereafter with a nonstimulant such as atomoxetine, which has data on efficacy and safety in context to substance use; and/or an α-agonist or an extended-release stimulant. Because of the potential for misuse and toxicity associated with concurrent substance use, benzodiazepines should be considered a last treatment of choice for adolescents with anxiety disorders and a SUD. Similarly, the use of immediate-release stimulants should be avoided in patients with ADHD and a SUD. When prescribing medications that could be misused or toxic when combined with a substance, it is important to evaluate the risk and benefit of continued use of a particular medication and consider prescribing lower quantities to decrease risk for misuse (1- to 2-week supply). Adolescents often are reluctant to engage in SUD treatment and one strategy to consider is to make continued prescription of any medication contingent on engaging in SUD treatment. 
Enlist parents in helping to monitor, store, and administer their child’s medication to improve adherence and decrease the potential for misuse, diversion, and complications associated with substance intoxication.
Bottom Line
It is important to screen for substance use in adolescents with co-occurring
psychiatric illness and vice versa. When at-risk or hazardous substance use is
detected there are effective psychosocial and pharmacologic interventions that
can be used to treat adolescent substance use disorders alone and in combination
with certain psychiatric disorders.
Related Resources
• National Institute on Drug Abuse. www.drugabuse.gov.
• National Institute on Alcohol Abuse and Alcoholism. www.niaaa.nih.gov.
• Substance Abuse and Mental Health Services Administration. www.samhsa.gov.
Drug Brand Names
Acamprosate • Campral
Atomoxetine • Strattera
Buprenorphine• Subutex
Buprenorphine/naloxone • Suboxone
Buspirone • Buspar
Clonidine • Catapres
Disulfiram • Antabuse
Fluoxetine • Prozac
Lithium • Lithobid, Eskalith
Methadone • Dolophine
Naltrexone • ReVia, Vivitrol
Osmotic (extended) release methylphenidate • Concerta
Sertraline • Zoloft
Topiramate • Topamax
Quetiapine • Seroquel
Valproic acid • Depakote
Disclosures
Dr. Yule received grant support from the 2012 American Academy of Child and Adolescent Psychiatry Pilot Research Award for Junior Faculty supported by Lilly USA, LLC, and receives grant support from the 2014 Louis V. Gerstner III Research Scholar Award. Dr. Wilens has received grant support from the National Institute on Drug Abuse (NIDA); has been a consultant for Euthymics/Neurovance, NIDA, Ironshore Pharmaceuticals and Development, Theravance Biopharma, Tris Pharma, the U.S. National Football League (ERM Associates), U.S. Minor/Major League Baseball, and Bay Cove Human Services (Clinical Services).
Substance use during adolescence is common in the United States. Data from the 2014 Monitoring the Future Survey estimated that among 12th graders, 60.2% used alcohol, 35.1% used marijuana, and 13.9% used a prescription drug for nonmedical use within the previous year.1 An estimated 11.4% of adolescents meet DSM-IV threshold criteria for a substance use disorder (SUD).2 Substance use in adolescents often co-occurs with psychological distress and psychiatric illness. Adolescents with a psychiatric disorder are at increased risk for developing a SUD; conversely, high rates of psychiatric illness are seen in adolescents with a SUD.3,4 In one study, 82% of adolescents hospitalized for SUD treatment were found to have a co-occurring axis I disorder.5 Furthermore, co-occurring psychiatric illness and SUD complicates treatment course and prognosis. Adolescents with co-occurring psychiatric illness and SUD often benefit from an integrated, multimodal treatment approach that includes psychotherapy, pharmacologic interventions, family involvement, and collaboration with community supports.
In this article, we focus on pharmacologic management of non-nicotinic SUDs in adolescents, with an emphasis on those with comorbid psychiatric illness.
Screening and assessment of substance use
It is important to counsel children with a psychiatric illness and their parents about the increased risk for SUD before a patient transitions to adolescence. Discussions about substance use should begin during 5th grade because data suggest that adolescent substance use often starts in middle school (6th to 9th grade). Clinicians should routinely screen adolescent patients for substance use. Nonproprietary screening tools available through the National Institute on Alcohol Abuse and Alcoholism and the National Institute on Drug Abuse are listed in Table 1.6-8 The Screening to Brief Intervention (S2BI) is a newer tool that has been shown to be highly effective in identifying adolescents at risk for substance abuse and differentiating severity of illness.8 The S2BI includes screening questions that assess for use of 8 substances in the past year.
Adolescents with psychiatric illness who are identified to be at risk for problems associated with substance use should be evaluated further for the presence or absence of a SUD. The number of criteria a patient endorses over the past year (Table 2)9 is used to assess SUD severity—mild, moderate, or severe. Additional considerations include substance use patterns, such as the type, site, quantity, frequency, context, and combinations of substances used.
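The criteria-count approach above follows the DSM-5 convention of 2 to 3 criteria for mild, 4 to 5 for moderate, and 6 or more for severe. A minimal sketch of that mapping (the function name and structure are illustrative, not from the article or any clinical software):

```python
# Illustrative only: maps a DSM-5 criteria count to a severity label.
# Thresholds follow DSM-5 (2-3 mild, 4-5 moderate, >=6 severe).

def sud_severity(criteria_met: int) -> str:
    """Classify SUD severity from the number of DSM-5 criteria
    endorsed over the past year (0 to 11)."""
    if not 0 <= criteria_met <= 11:
        raise ValueError("DSM-5 defines 11 SUD criteria")
    if criteria_met < 2:
        return "no diagnosis"
    if criteria_met <= 3:
        return "mild"
    if criteria_met <= 5:
        return "moderate"
    return "severe"
```

Fewer than 2 criteria does not meet the diagnostic threshold, which is why the sketch returns "no diagnosis" rather than a severity label.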
It is important to be curious and nonjudgmental when evaluating substance use patterns with adolescents to obtain a comprehensive assessment. Teenagers often are creative and inventive in their efforts to maximize intoxication, which can put them at risk for complications associated with acute intoxication. Rapidly evolving methods of ingesting highly concentrated forms of tetrahydrocannabinol (“wax,” “dabs”) are an example of use patterns that are ahead of what is reported in the literature.
Any substance use in an adolescent with a psychiatric illness is of concern and should be monitored closely because of the potential impact of substance use on the co-occurring psychiatric illness and possible interactions between the abused substance and prescribed medication.
Treatment interventions
Although this review will focus on pharmacotherapy, individual, group, and family psychotherapies are a critical part of a treatment plan for adolescents with comorbid psychiatric illness and SUD (Table 3). Collaboration with community supports, including school and legal officials, can help reinforce contingencies and assist with connecting a teen with positive prosocial activities. Involvement with mutual help organizations, such as Alcoholics Anonymous, can facilitate adolescent engagement with a positive sober network.10
Pharmacologic strategies for treating co-occurring psychiatric illness and SUD include medication to:
• decrease substance use and promote abstinence
• alleviate withdrawal symptoms (medication to treat withdrawal symptoms and agonist treatments)
• block the effect of substance use (antagonist agents)
• decrease likelihood of substance use with aversive agents
• target comorbid psychiatric illness.
Medication to decrease substance use and promote abstinence. One strategy is to target cravings and urges to use substances with medication. Naltrexone is an opiate antagonist FDA-approved for treating alcohol and opioid use disorders in adults; it is available as a daily oral medication and a monthly injectable depot preparation (extended-release naltrexone). Two small open-label studies showed decreased alcohol use with naltrexone treatment in adolescents with alcohol use disorder.11,12 In a randomized, double-blind, placebo-controlled crossover trial (RCT) of 22 adolescent problem drinkers, naltrexone, 50 mg/d, reduced the likelihood of drinking and heavy drinking (P ≤ .03).13 Acamprosate, another anti-craving medication FDA-approved for treating alcohol use disorder in adults, has no safety or efficacy data for adolescent alcohol use disorder.
There is limited research on agents that decrease use of, and promote abstinence from, non-nicotinic substances other than alcohol. One pilot RCT evaluated N-acetylcysteine (NAC)—an over-the-counter supplement that modulates the glutamate system—for treating adolescent Cannabis dependence. Treatment with NAC, 2,400 mg/d, was well tolerated, and patients receiving NAC had twice the odds of a negative urine cannabinoid test during treatment compared with placebo.14 Although NAC treatment was associated with decreased Cannabis use, it did not significantly decrease cravings compared with placebo.15
Medication to alleviate withdrawal symptoms. Some patients find the physical discomfort and psychological distress associated with substance withdrawal so intolerable that they continue to use drugs or alcohol to avoid it. Medication to treat withdrawal symptoms and agonist treatments can be used to alleviate this discomfort and distress. Agonist treatments, such as methadone and buprenorphine, bind to the same receptors as the target substance, which allows the patient to shift to controlled use of a prescribed substitute. Agonist treatments are used for short-term detoxification and, over longer periods, for maintenance treatment. Methadone, which decreases craving and withdrawal symptoms from opiates by binding to the μ-opiate receptor and blocking other substances from binding, is frequently used for detoxification and maintenance treatment in adults. There are limited data on methadone substitution therapy for adolescents in the United States.16 Methadone maintenance for adolescents in the United States is restricted to severe cases of opioid use disorder. Federal guidelines specify that adolescents age <18 can receive methadone only if they have had 2 unsuccessful detoxification attempts or outpatient psychosocial treatments and have met DSM criteria for an opioid use disorder for 1 year.17
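The federal rule summarized above is a simple conjunction of conditions for patients age <18. A hedged sketch of that check (names and structure are illustrative, not a clinical or regulatory tool):

```python
# Illustrative encoding of the federal guideline summarized above for
# methadone maintenance in patients age <18: two or more unsuccessful
# detoxification attempts or outpatient psychosocial treatments, plus
# at least 1 year meeting DSM criteria for an opioid use disorder.

def meets_minor_methadone_criteria(failed_treatment_attempts: int,
                                   years_meeting_oud_criteria: float) -> bool:
    """Both conditions must hold for an adolescent age <18."""
    return (failed_treatment_attempts >= 2
            and years_meeting_oud_criteria >= 1)
```

For example, an adolescent with 3 years of opioid use disorder but only 1 failed treatment attempt would not meet the guideline's threshold.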
Buprenorphine is a partial μ-opiate receptor agonist that is FDA-approved for use in adolescents age ≥16 with opioid dependence. Although a waiver from the U.S. Drug Enforcement Administration is required to prescribe buprenorphine, it generally can be administered in outpatient settings with relative ease compared with methadone.
Marsch et al18 examined the efficacy of buprenorphine compared with clonidine for detoxification over 1 month in 36 adolescents with opioid dependence. Clonidine is an α-2 adrenergic agonist that often is used during detoxification from opioids.19 Although both buprenorphine and clonidine relieved withdrawal symptoms, a significantly higher percentage of patients receiving buprenorphine completed treatment (72%) compared with those taking clonidine (39%) (P < .05).18 Detoxification with buprenorphine also was associated with a higher percentage of negative urine drug screens (64% vs 32%, P = .01), and those receiving buprenorphine were more likely to continue on naltrexone maintenance for continued medication-assisted treatment after detoxification compared with those randomized to clonidine.
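The completion-rate difference above (72% vs 39%, P < .05) can be sanity-checked with an uncorrected Pearson chi-square on the 2×2 table. Assuming 18 patients per arm (so roughly 13/18 vs 7/18 completers; the per-arm counts are our assumption, and the trial's actual statistical test is not specified here), the statistic clears the df = 1 critical value of 3.84 at α = .05:

```python
# Illustrative significance check, not the trial's actual analysis.
# Assumes 18 patients per arm: ~72% -> 13/18 completers (buprenorphine),
# ~39% -> 7/18 completers (clonidine).

def chi2_2x2(a: int, b: int, c: int, d: int) -> float:
    """Uncorrected Pearson chi-square statistic for the table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# rows: treatment arm; columns: completed vs did not complete
stat = chi2_2x2(13, 5, 7, 11)  # 4.05, above the 3.84 critical value
```

With these assumed counts the result is consistent with the reported P < .05, although a small-sample exact test could give a more conservative answer.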
Woody et al20 compared buprenorphine/naloxone for opioid detoxification vs short-term maintenance. Patients age 16 to 21 were randomized to detoxification over 2 weeks or to stabilization and maintenance for 9 weeks followed by a 3-week taper. Maintenance treatment with buprenorphine/naloxone was associated with less opioid use, less injection drug use, and less need for addiction treatment outside of that received through the study compared with detoxification treatment. When buprenorphine/naloxone was discontinued, both the detoxification and maintenance groups had high rates of positive urine toxicology screens at 1-year follow-up (mean 48% to 72%). These data suggest that maintenance with buprenorphine/naloxone is more effective than short-term detoxification for stabilizing opioid use disorders in adolescents and young adults, although the optimal treatment duration is unclear. Clinically, it is important to continue buprenorphine/naloxone maintenance until the patient has stabilized in recovery and has acquired coping skills to manage the urges, cravings, and psychological distress (eg, anger, stress) that often arise during a slow taper of agonist treatment.
Antagonist treatment to block the effect of substance use. As an opioid receptor antagonist, naltrexone is effective for treating opioid use disorder because it blocks the action of opioids. Fishman et al21 published a descriptive series of 16 adolescents and young adults who received the injectable depot preparation (extended-release) naltrexone while in residential treatment, were then discharged to outpatient care, and were followed over 4 months. Most patients who received extended-release naltrexone remained in outpatient treatment (63%) and reduced their opioid use or were abstinent at 4 months (56%). One barrier to naltrexone treatment is the need to be abstinent from opioids for 7 to 10 days to prevent precipitated opioid withdrawal. Therefore, naltrexone is a good option for adolescents who present for treatment early and are not physiologically dependent on opioids, or who are receiving treatment in a structured environment after detoxification, such as residential treatment or sober living.
Aversive agents to diminish substance use. Aversive agents produce an unpleasant reaction when a target substance is consumed. Disulfiram is a prototypic aversive agent; it prevents the breakdown of acetaldehyde, a toxic metabolite of alcohol. Patients who drink alcohol while taking disulfiram may experience adverse effects, including tachycardia, shortness of breath, nausea, dizziness, and confusion. Two studies have examined the efficacy of disulfiram in adolescents with alcohol use disorder. Niederhofer et al22 found that disulfiram treatment significantly increased cumulative abstinence in a small RCT (P = .012). In another small randomized, open-label, 3-month study of adolescents who received disulfiram or naltrexone in addition to weekly psychotherapy, disulfiram was superior to naltrexone in mean days abstinent from alcohol (84 vs 51 days, P = .0001).23 Adolescents often are unwilling to adhere to disulfiram because they are concerned about the aversive reaction when it is combined with alcohol. Consider prescribing disulfiram for adolescents who are about to go “on pass” from a therapeutic school or residential SUD treatment center and will be returning to an environment where they may be tempted to use alcohol.
Pharmacotherapy to treat co-occurring psychiatric illness
Continued treatment of a psychiatric illness that co-occurs with SUD is important. As noted above, consider psychosocial treatments for both the SUD and the comorbid psychopathology. Several single-site RCTs have evaluated the efficacy of the selective serotonin reuptake inhibitors (SSRIs) fluoxetine and sertraline for depressive disorders in adolescents with a co-occurring SUD.24-28 Most studies have shown improvement in depressive symptoms and substance use in both the medication and placebo groups.24,25,27,28 However, compared with placebo, treatment with fluoxetine, 20 mg/d, or sertraline, 100 mg/d, was associated with improved depressive symptoms in only 1 of 3 studies and with no significant difference in SUD outcomes. The authors of these studies attribute the general improvement in depression and the SUD to the concurrent use of cognitive-behavioral therapy (CBT) and/or motivational enhancement therapy.24,25,27,28
Research on the use of mood stabilizers for adolescents with mood dysregulation and a SUD is limited but suggests benefit associated with pharmacotherapy (Table 4).29-32 Two RCTs and 1 open-label study demonstrated reductions in substance use with mood stabilizer treatment in adolescents with co-occurring SUD and mood dysregulation.29-32 The effect of pharmacotherapy on mood dysregulation ratings is less clear: no change in severity of affective symptoms was observed in a small RCT of lithium (average blood level 0.9 mEq/L),29 and improvement in affective symptoms was noted in both the topiramate (300 mg/d) and placebo groups when both were treated with concurrent quetiapine.32 Because of the high risk of SUD and severe morbidity in juvenile bipolar disorder and severe mood dysregulation,33 larger RCTs are warranted.
Several studies have evaluated the impact of stimulant and nonstimulant treatments for attention-deficit/hyperactivity disorder (ADHD) in adolescents with a co-occurring SUD.34-39 The largest of these, and the only multisite study, evaluated the efficacy of osmotic (extended) release methylphenidate (OROS-MPH) vs placebo for adolescents who also were receiving CBT for SUD.36 In this 16-week RCT, both the OROS-MPH and placebo groups showed improvement in self-reported ADHD symptoms, with no difference between groups. Parent report of ADHD symptoms, however, indicated a greater reduction in symptoms in the OROS-MPH group compared with placebo. Both groups had a decrease in self-reported days of substance use over the past month, with no difference between groups. Pharmacotherapy trials for ADHD that have included psychotherapy highlight the effectiveness of CBT for SUD and co-occurring psychiatric illness.36,39,40
Although conduct disorder and anxiety disorders commonly co-occur with SUD, there has been less research evaluating the impact of pharmacotherapy on treating these disorders. Riggs et al25,34,35,41 evaluated the impact of pharmacotherapy targeting co-occurring ADHD and major depressive disorder in the context of conduct disorder and SUD. When evaluated in an outpatient setting, a treatment intervention addressing the co-occurring SUD was an important component that led to a reduction in conduct symptoms.25,35 There have been no comprehensive studies of pharmacotherapy for treating anxiety and SUD in adolescents.
Recommendations for clinical management
Although more research is needed on the role of pharmacotherapy for adolescents with co-occurring psychiatric illness and a SUD, recommended practice is to continue pharmacotherapy and closely monitor response to treatment when at-risk substance use begins in patients with co-occurring psychiatric illness. In adolescents with a threshold SUD, continue pharmacotherapy for unstable mood disorders, with first-line choices of SSRIs for unipolar depression and second-generation antipsychotics for bipolar spectrum illness. Suggested conservative pharmacologic interventions for anxiety disorders include SSRIs and buspirone, which have been shown to be effective for treating anxiety in children and adolescents.42,43 For patients with comorbid ADHD and SUD, if possible, first stabilize substance use (low-level use or abstinence) and consider treating the ADHD immediately thereafter with a nonstimulant such as atomoxetine, which has efficacy and safety data in the context of substance use, and/or an α-agonist or an extended-release stimulant. Because of the potential for misuse and toxicity associated with concurrent substance use, benzodiazepines should be considered a treatment of last resort for adolescents with anxiety disorders and a SUD. Similarly, immediate-release stimulants should be avoided in patients with ADHD and a SUD. When prescribing medications that could be misused or could be toxic when combined with a substance, evaluate the risks and benefits of continued use of the particular medication and consider prescribing smaller quantities (a 1- to 2-week supply) to decrease the risk for misuse. Adolescents often are reluctant to engage in SUD treatment; one strategy is to make continued prescription of any medication contingent on engaging in SUD treatment.
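The ADHD-with-SUD sequence described above (stabilize substance use first, then prefer nonstimulants or extended-release formulations, avoiding immediate-release stimulants) can be sketched as ordered decision logic. This is a hedged illustration of the article's recommendation, not a clinical algorithm; names and strings are hypothetical:

```python
# Illustrative encoding of the prescribing sequence recommended above
# for comorbid ADHD and SUD. Not a decision-support tool.

def adhd_sud_plan(substance_use_stabilized: bool) -> list:
    """Return ordered treatment considerations for ADHD with co-occurring SUD."""
    if not substance_use_stabilized:
        # substance use is addressed first (low-level use or abstinence)
        return ["stabilize substance use before treating ADHD"]
    return [
        "nonstimulant (eg, atomoxetine)",
        "alpha-agonist",
        "extended-release stimulant",
        # immediate-release stimulants are avoided because of misuse potential
    ]
```

Encoding the sequence as an ordered list mirrors the text's preference ranking rather than implying the options are interchangeable.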
Enlist parents in helping to monitor, store, and administer their child’s medication to improve adherence and decrease the potential for misuse, diversion, and complications associated with substance intoxication.
Bottom Line
It is important to screen for substance use in adolescents with co-occurring psychiatric illness and vice versa. When at-risk or hazardous substance use is detected, there are effective psychosocial and pharmacologic interventions that can be used to treat adolescent substance use disorders alone and in combination with certain psychiatric disorders.
Related Resources
• National Institute on Drug Abuse. www.drugabuse.gov.
• National Institute on Alcohol Abuse and Alcoholism. www.niaaa.nih.gov.
• Substance Abuse and Mental Health Services Administration. www.samhsa.gov.
Drug Brand Names
Acamprosate • Campral
Atomoxetine • Strattera
Buprenorphine • Subutex
Buprenorphine/naloxone • Suboxone
Buspirone • Buspar
Clonidine • Catapres
Disulfiram • Antabuse
Fluoxetine • Prozac
Lithium • Lithobid, Eskalith
Methadone • Dolophine
Naltrexone • ReVia, Vivitrol
Osmotic (extended) release methylphenidate • Concerta
Quetiapine • Seroquel
Sertraline • Zoloft
Topiramate • Topamax
Valproic acid • Depakote
Disclosures
Dr. Yule received grant support from the 2012 American Academy of Child and Adolescent Psychiatry Pilot Research Award for Junior Faculty supported by Lilly USA, LLC, and receives grant support from the 2014 Louis V. Gerstner III Research Scholar Award. Dr. Wilens has received grant support from the National Institute on Drug Abuse (NIDA); has been a consultant for Euthymics/Neurovance, NIDA, Ironshore Pharmaceuticals and Development, Theravance Biopharma, Tris Pharma, the U.S. National Football League (ERM Associates), U.S. Minor/Major League Baseball, and Bay Cove Human Services (Clinical Services).
1. Johnston LD, Miech RA, O’Malley PM, et al. Monitoring the future, Table 2: trends in annual prevalence of use of various drugs in grades 8, 10, and 12. http://www.monitoringthefuture.org/data/14data.html#2014data-drugs. Published December 16, 2014. Accessed January 6, 2015.
2. Merikangas KR, He JP, Burnstein M, et al. Lifetime prevalence of mental disorders in U.S. adolescents: results from the National Comorbidity Survey Replication-- Adolescent Supplement (NCS-A). J Am Acad Child Adolesc Psychiatry. 2010;49(10):980-989.
3. Kandel DB, Johnson JG, Bird HR, et al. Psychiatric disorders associated with substance use among children and adolescents: findings from the Methods for the Epidemiology of Child and Adolescent Mental Disorders (MECA) Study. J Abnorm Child Psychol. 1997;25(2):122-132.
4. Roberts RE, Roberts CR, Xing Y. Comorbidity of substance use disorders and other psychiatric disorders among adolescents: evidence from an epidemiologic survey. Drug Alcohol Depend. 2007;88(suppl 1):S4-S13.
5. Stowell R, Estroff TW. Psychiatric disorders in substance-abusing adolescent inpatients: a pilot study. J Am Acad Child Adolesc Psychiatry. 1992;31(6):1036-1040.
6. National Institute on Alcohol Abuse and Alcoholism. Alcohol screening and brief intervention for youth: a practitioner’s guide. http://www.niaaa.nih.gov/Publications/EducationTrainingMaterials/Pages/YouthGuide.aspx. Accessed March 11, 2015.
7. Children’s Hospital Boston. The CRAFFT screening interview. http://www.integration.samhsa.gov/clinical-practice/sbirt/CRAFFT_Screening_interview.pdf. Published 2009. Accessed March 11, 2015.
8. Levy S, Weiss R, Sherritt L, et al. An electronic screen for triaging adolescent substance use by risk levels. JAMA Pediatr. 2014;168(9):822-828.
9. Diagnostic and statistical manual of mental disorders, 5th ed. Washington, DC: American Psychiatric Association; 2013.
10. Kelly JF, Myers MG. Adolescents’ participation in Alcoholics Anonymous and Narcotics Anonymous: review, implications and future directions. J Psychoactive Drugs. 2007;39(3):259-269.
11. Lifrak PD, Alterman AI, O’Brien CP, et al. Naltrexone for alcoholic adolescents. Am J Psychiatry. 1997;154(3):439-441.
12. Deas D, May MP, Randall C, et al. Naltrexone treatment of adolescent alcoholics: an open-label pilot study. J Child Adolesc Psychopharmacol. 2005;15(5):723-728.
13. Miranda R, Ray L, Blanchard A, et al. Effects of naltrexone on adolescent alcohol cue reactivity and sensitivity: an initial randomized trial. Addict Biol. 2014;19(5):941-954.
14. Gray KM, Carpenter MJ, Baker NL, et al. A double-blind randomized controlled trial of N-acetylcysteine in cannabis-dependent adolescents. Am J Psychiatry. 2012;169(8):805-812.
15. Roten AT, Baker NL, Gray KM. Marijuana craving trajectories in an adolescent marijuana cessation pharmacotherapy trial. Addict Behav. 2013;38(3):1788-1791.
16. Hopfer CJ, Khuri E, Crowley TJ, et al. Adolescent heroin use: a review of the descriptive and treatment literature. J Subst Abuse Treat. 2002;23(3):231-237.
17. Center for Substance Abuse Treatment. Medication-assisted treatment for opioid addiction in opioid treatment programs. Treatment Improvement Protocol (TIP) Series 43. HHS Publication No. (SMA) 12-4214. Rockville, MD: Substance Abuse and Mental Health Services Administration; 2005.
18. Marsch LA, Bickel WK, Badger GJ, et al. Comparison of pharmacological treatments for opioid-dependent adolescents: a randomized controlled trial. Arch Gen Psychiatry. 2005;62(10):1157-1164.
19. Gowing L, Farrell MF, Ali R, et al. Alpha2-adrenergic agonists for the management of opioid withdrawal. Cochrane Database Syst Rev. 2014;3:CD002024.
20. Woody GE, Poole SA, Subramaniam G, et al. Extended vs short-term buprenorphine-naloxone for treatment of opioid-addicted youth: a randomized trial. JAMA. 2008;300(17):2003-2011.
21. Fishman MJ, Winstanley EL, Curran E, et al. Treatment of opioid dependence in adolescents and young adults with extended release naltrexone: preliminary case-series and feasibility. Addiction. 2010;105(9):1669-1676.
22. Niederhofer H, Staffen W. Comparison of disulfiram and placebo in treatment of alcohol dependence of adolescents. Drug Alcohol Rev. 2003;22(3):295-297.
23. De Sousa AA, De Sousa J, Kapoor H. An open randomized trial comparing disulfiram and naltrexone in adolescents with alcohol dependence. J Subst Abuse Treat. 2008;13(6):382-388.
24. Deas D, Randall CL, Roberts JS, et al. A double-blind, placebo-controlled trial of sertraline in depressed adolescent alcoholics: a pilot study. Hum Psychopharmacol. 2000;15(6):461-469.
25. Riggs PD, Mikulich-Gilbertson SK, Davies RD, et al. A randomized controlled trial of fluoxetine and cognitive behavioral therapy in adolescents with major depression, behavior problems, and substance use disorders. Arch Pediatr Adolesc Med. 2007;161(11):1026-1034.
26. Findling RL, Pagano ME, McNamara NK, et al. The short-term safety and efficacy of fluoxetine in depressed adolescents with alcohol and cannabis use disorders: a pilot randomized placebo-controlled trial. Child Adolesc Psychiatry Ment Health. 2009;3(1):11.
27. Cornelius JR, Bukstein OG, Douaihy AB, et al. Double-blind fluoxetine trial in comorbid MDD-CUD youth and young adults. Drug Alcohol Depend. 2010;112(1-2):39-45.
28. Cornelius JR, Bukstein OG, Wood DS, et al. Double-blind placebo-controlled trial of fluoxetine in adolescents with comorbid major depression and an alcohol use disorder. Addict Behav. 2009;34(10):905-909.
29. Geller B, Cooper TB, Sun K, et al. Double-blind and placebo controlled study of lithium for adolescent bipolar disorders with secondary substance dependency. J Am Acad Child Adolesc Psychiatry. 1998;37(2):171-178.
30. Donovan SJ, Susser ES, Nunes E. Divalproex sodium for use with conduct disordered adolescent marijuana users. Am J Addict. 1996;5(2):181.
31. Donovan SJ, Susser ES, Nunes EV, et al. Divalproex treatment of disruptive adolescents: a report of 10 cases. J Clin Psychiatry. 1997;58(1):12-15.
32. DelBello, M. Topiramate plus quetiapine cut Cannabis use in bipolar teens. Paper presented at: American Academy of Child and Adolescent Psychiatry’s Annual Meeting. November 2011; Toronto, Ontario, Canada.
33. Wilens TE, Biederman J, Adamson JJ, et al. Further evidence of an association between adolescent bipolar disorder with smoking and substance use disorders: a controlled study. Drug Alcohol Depend. 2008;95(3):188-198.
34. Riggs PD, Leon SL, Mikulich SK, et al. An open trial of bupropion for ADHD in adolescents with substance use disorders and conduct disorder. J Am Acad Child Adolesc Psychiatry. 1998;37(12):1271-1278.
35. Riggs PD, Hall SK, Mikulich-Gilbertson SK, et al. A randomized controlled trial of pemoline for attention-deficit/hyperactivity disorder in substance-abusing adolescents. J Am Acad Child Adolesc Psychiatry. 2004;43(4):420-429.
36. Riggs PD, Winhusen T, Davies RD, et al. Randomized controlled trial of osmotic-release methylphenidate with cognitive-behavioral therapy in adolescents with attention-deficit/hyperactivity disorder and substance use disorders. J Am Acad Child Adolesc Psychiatry. 2011;50(9):903-914.
37. Szobot CM, Rohde LA, Katz B, et al. A randomized crossover clinical study showing that methylphenidate- SODAS improves attention-deficit/hyperactivity disorder symptoms in adolescents with substance use disorder. Braz J Med Biol Res. 2008;41(3):250-257.
38. Solhkhah R, Wilens TE, Daly J, et al. Bupropion SR for the treatment of substance-abusing outpatient adolescents with attention-deficit/hyperactivity disorder and mood disorders. J Child Adolesc Psychopharmacol. 2005;15(5): 777-786.
39. Thurstone C, Riggs PD, Salomonsen-Sautel S, et al. Randomized, controlled trial of atomoxetine for attention-deficit/hyperactivity disorder in adolescents with substance use disorder. J Am Acad Child Adolesc Psychiatry. 2010;49(6):573-582.
40. Zulauf CA, Sprich SE, Safren SA, et al. The complicated relationship between attention deficit/hyperactivity disorder and substance use disorders. Curr Psychiatry Rep. 2014;16(3):436.
41. Riggs PD, Mikulich SK, Coffman LM, et al. Fluoxetine in drug-dependent delinquents with major depression: an open trial. J Child Adolesc Psychopharmacol. 1997;7(2):87-95.
42. Mohatt J, Bennett SM, Walkup JT. Treatment of separation, generalized, and social anxiety disorders in youths. Am J Psychiatry. 2014;171(7):741-748.
43. Strawn JR, Sakolsky DJ, Rynn MA. Psychopharmacologic treatment of children and adolescents with anxiety disorders. Child Adolesc Psychiatr Clin N Am. 2012;21(3):527-539.
Cloud-based systems can help secure patient information
Physicians hardly need the Health Insurance Portability and Accountability Act (HIPAA) to remind them how important it is to safeguard their patients’ records. Physicians understand that patient information is sensitive, and it would be disastrous if their files became public or fell into the wrong hands. However, the use of health information technology to record patient information, although beneficial for medical professionals and patients, poses risks to patient privacy.1
HIPAA requires clinicians and health care systems to protect patient information, whether it is maintained in an electronic health records system, stored on a mobile device, or transmitted via e-mail to another physician. The U.S. Department of Health and Human Services will increase HIPAA audits this year to make sure that medical practices have taken measures to protect their patients’ health information. Physicians and other clinicians can take advantage of cloud-based file-sharing services, such as Dropbox, without running afoul of HIPAA.
Mobile computing, the cloud, and patient information: A risky combination
Mobile computing and cloud-based file-sharing sites such as Dropbox and Google Drive allow physicians to take notes on a tablet, annotate those notes on a laptop, and share them with a colleague who views them on a desktop. However, this free flow of information makes it more difficult to stay compliant with HIPAA.
Dropbox and other file-sharing services encrypt documents while they’re stored in the cloud, but the files are unprotected when downloaded to a device. E-mail, which isn’t as versatile or useful as these services, also is not HIPAA-compliant unless the files are encrypted.
Often, small psychiatric practices use these online services and e-mail even if they’re aware of the risks because they don’t have time to research a better solution. Or they might resort to faxing or even snail-mailing documents, losing out on the increased productivity that the cloud can provide.
Secure technologies satisfy auditors
A number of tools exist to help physicians seamlessly integrate the encryption necessary to keep their patients’ records safe and meet HIPAA security requirements. Here’s a look at 3 options.
Sookasa (plus Dropbox). One option is to invest in a software product designed to encrypt documents shared through cloud-based services. This type of software creates a compliance “shield” around files stored on the cloud, converting files into HIPAA safe havens. The files are encrypted when synced to new devices or shared with other users, meaning they’re protected no matter where they reside.2
Sookasa is an online service that encrypts files shared and stored in Dropbox. The company plans to extend its support to other popular cloud services such as Google Drive and Microsoft OneDrive. Sookasa also audits and controls access to encrypted files, so that patient data can be blocked even if a device is lost or stolen. Sookasa users also can share files via e-mail with added encryption and authentication to make sure only the authorized receiver gets the documents.2
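The idea behind this kind of product is client-side encryption: the file is encrypted before it ever leaves the device, so a copy synced to the cloud or downloaded to a lost laptop is unreadable without the key. The Python sketch below illustrates only that concept; it uses a simple XOR one-time pad to stay dependency-free and is not Sookasa’s actual implementation, which would use a vetted cipher such as AES.

```python
# Illustrative sketch of client-side encryption: the record is encrypted
# BEFORE syncing, so what the cloud provider (or a thief with a stolen
# device) sees is only ciphertext. XOR with a one-time random key is used
# here purely for illustration; real products rely on vetted ciphers.
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """XOR is its own inverse, so the same call encrypts and decrypts."""
    return bytes(d ^ k for d, k in zip(data, key))

record = b"Patient note: follow-up visit, medication adjusted."
key = secrets.token_bytes(len(record))   # key stays on the practice's devices

ciphertext = xor_cipher(record, key)     # this is what would sync to the cloud
assert ciphertext != record              # provider stores only ciphertext
assert xor_cipher(ciphertext, key) == record  # key holder recovers the note
```

The point of the sketch is where the key lives: because it never leaves the authorized devices, losing the synced or downloaded file alone does not expose patient data.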
TigerText. Regular SMS text messages on your mobile phone aren’t compliant with HIPAA, but TigerText replicates the texting experience in a secure way. Instead of being stored on your mobile phone, messages sent through TigerText are stored on the company’s servers. Messages sent through the application can’t be saved, copied, or forwarded to other recipients. TigerText messages also are deleted, either after a set time period or after they’ve been read. Because the messages aren’t stored on phones, a lost or stolen phone won’t result in a data breach and a HIPAA violation.3
Secure text messaging won’t help physicians store and manage large amounts of patient files, but it’s a must-have if they use texting to communicate about patient care.
DataMotion SecureMail provides e-mail encryption services to health care organizations and other enterprises. Using a decryption key, authorized users can open and read the encrypted e-mails, which are HIPAA-compliant.4 This method is superior to other services that encrypt e-mails on the server. Several providers, such as Google’s e-mail encryption service Postini, ensure that e-mails are encrypted when they are stored on the server; however, the body text and attachments included in specific e-mails are not encrypted on the senders’ and receivers’ devices. If you lose a connected device, you would still be at risk of a HIPAA breach.
DataMotion’s SecureMail provides detailed tracking and logging of e-mails, which is necessary for auditing purposes. The product also works on mobile devices.
E-mail is a helpful tool for quickly sharing files and an e-mail encryption product such as SecureMail makes it possible to do so securely. Other e-mail encryption products do not securely store and back up all files in a centralized way.
Disclosure
Dr. Cidon is CEO and co-founder of Sookasa.
1. U.S. Department of Health and Human Services. HIPAA privacy, security, and breach notification audit program. http://www.hhs.gov/ocr/privacy/hipaa/enforcement/audit. Accessed February 12, 2015.
2. Sookasa Web site. How it works. https://www.sookasa.com/how-it-works. Accessed February 12, 2015.
3. TigerText Web site. http://www.tigertext.com. Accessed February 12, 2015.
4. DataMotion Web site. http://datamotion.com/products/securemail/securemail-desktop. Accessed February 12, 2015.
Physicians hardly need the Health Insurance Portability and Accountability Act (HIPAA) to remind them how important it is to safeguard their patients’ records. Physicians understand that patient information is sensitive and it would be disastrous if their files became public or fell into the wrong hands. However, the use of health information technology to record patient information, although beneficial for medical professionals and patients, poses risks to patient privacy.1
HIPAA requires clinicians and health care systems to protect patient information, whether it is maintained in an electronic health records system, stored on a mobile device, or transmitted via e-mail to another physician. The U.S. Department of Health and Human Services will increase HIPAA audits this year to make sure that medical practices have taken measures to protect their patients’ health information. Physicians and other clinicians can take advantage of cloud-based file-sharing services, such as Dropbox, without running afoul of HIPAA.
Mobile computing, the cloud, and patient information: A risky combination
Although mobile computing and cloud-based file-sharing sites such as Dropbox and Google Drive allow physicians to take notes on a tablet, annotate those notes on a laptop, and share them with a physician who views them on his (her) desktop, this free flow of information makes it more difficult to stay compliant with HIPAA.
Dropbox and other file-sharing services encrypt documents while they’re stored in the cloud but the files are unprotected when downloaded to a device. E-mail, which isn’t as versatile or useful as these services, also is not HIPAA-compliant unless the files are encrypted.
Often, small psychiatric practices use these online services and e-mail even if they’re aware of the risks because they don’t have time to research a better solution. Or they might resort to faxing or even snail-mailing documents, losing out on the increased productivity that the cloud can provide.
Secure technologies satisfy auditors
A number of tools exist to help physicians seamlessly integrate the encryption necessary to keep their patients’ records safe and meet HIPAA security requirements. Here’s a look at 3 options.
Sookasa (plus Dropbox). One option is to invest in a software product designed to encrypt documents shared through cloud-based services. This type of software creates a compliance “shield” around files stored on the cloud, converting files into HIPAA safe havens. The files are encrypted when synced to new devices or shared with other users, meaning they’re protected no matter where they reside.2
Sookasa is an online service that encrypts files shared and stored in Dropbox. The company plans to extend its support to other popular cloud services such as Google Drive and Microsoft OneDrive. Sookasa also audits and controls access to encrypted files, so that patient data can be blocked even if a device is lost or sto len. Sookasa users also can share files via e-mail with added encryption and authentication to make sure only the authorized receiver gets the documents.2
TigerText. Regular SMS text messages on your mobile phone aren’t compliant with HIPAA, but TigerText replicates the texting experience in a secure way. Instead of being stored on your mobile phone, messages sent through TigerText are stored on the company’s servers. Messages sent through the application can’t be saved, copied, or forwarded to other recipients. TigerText messages also are deleted, either after a set time period or after they’ve been read. Because the messages aren’t stored on phones, a lost or stolen phone won’t result in a data breach and a HIPAA violation.3
Secure text messaging won’t help physicians store and manage large amounts of patient files, but it’s a must-have if they use texting to communicate about patient care.
DataMotion SecureMail provides e-mail encryption services to health care organizations and other enterprises. Using a decryption key, authorized users can open and read the encrypted e-mails, which are HIPAA-compliant.4 This method is superior to other services that encrypt e-mails on the server. Several providers, such as Google’s e-mail encryption service Postini, ensure that e-mails are encrypted when they are stored on the server; however, the body text and attachments included in specific e-mails are not encrypted on the senders’ and receivers’ devices. If you lose a connected device, you would still be at risk of a HIPAA breach.
DataMotion’s SecureMail provides detailed tracking and logging of e-mails, which is necessary for auditing purposes. The product also works on mobile devices.
E-mail is a helpful tool for quickly sharing files and an e-mail encryption product such as SecureMail makes it possible to do so securely. Other e-mail encryption products do not securely store and back up all files in a centralized way.
DisclosureDr. Cidon is CEO and Co-founder of Sookasa.
Physicians hardly need the Health Insurance Portability and Accountability Act (HIPAA) to remind them how important it is to safeguard their patients’ records. Physicians understand that patient information is sensitive and it would be disastrous if their files became public or fell into the wrong hands. However, the use of health information technology to record patient information, although beneficial for medical professionals and patients, poses risks to patient privacy.1
HIPAA requires clinicians and health care systems to protect patient information, whether it is maintained in an electronic health records system, stored on a mobile device, or transmitted via e-mail to another physician. The U.S. Department of Health and Human Services will increase HIPAA audits this year to make sure that medical practices have taken measures to protect their patients’ health information. Physicians and other clinicians can take advantage of cloud-based file-sharing services, such as Dropbox, without running afoul of HIPAA.
Mobile computing, the cloud, and patient information: A risky combination
Although mobile computing and cloud-based file-sharing sites such as Dropbox and Google Drive allow physicians to take notes on a tablet, annotate those notes on a laptop, and share them with a physician who views them on his (her) desktop, this free flow of information makes it more difficult to stay compliant with HIPAA.
Dropbox and other file-sharing services encrypt documents while they’re stored in the cloud but the files are unprotected when downloaded to a device. E-mail, which isn’t as versatile or useful as these services, also is not HIPAA-compliant unless the files are encrypted.
Often, small psychiatric practices use these online services and e-mail even if they’re aware of the risks because they don’t have time to research a better solution. Or they might resort to faxing or even snail-mailing documents, losing out on the increased productivity that the cloud can provide.
Secure technologies satisfy auditors
A number of tools exist to help physicians seamlessly integrate the encryption necessary to keep their patients’ records safe and meet HIPAA security requirements. Here’s a look at 3 options.
Sookasa (plus Dropbox). One option is to invest in a software product designed to encrypt documents shared through cloud-based services. This type of software creates a compliance “shield” around files stored on the cloud, converting files into HIPAA safe havens. The files are encrypted when synced to new devices or shared with other users, meaning they’re protected no matter where they reside.2
Sookasa is an online service that encrypts files shared and stored in Dropbox. The company plans to extend its support to other popular cloud services such as Google Drive and Microsoft OneDrive. Sookasa also audits and controls access to encrypted files, so that patient data can be blocked even if a device is lost or sto len. Sookasa users also can share files via e-mail with added encryption and authentication to make sure only the authorized receiver gets the documents.2
TigerText. Regular SMS text messages on your mobile phone aren’t compliant with HIPAA, but TigerText replicates the texting experience in a secure way. Instead of being stored on your mobile phone, messages sent through TigerText are stored on the company’s servers. Messages sent through the application can’t be saved, copied, or forwarded to other recipients. TigerText messages also are deleted, either after a set time period or after they’ve been read. Because the messages aren’t stored on phones, a lost or stolen phone won’t result in a data breach and a HIPAA violation.3
Secure text messaging won’t help physicians store and manage large amounts of patient files, but it’s a must-have if they use texting to communicate about patient care.
DataMotion SecureMail provides e-mail encryption services to health care organizations and other enterprises. Using a decryption key, authorized users can open and read the encrypted e-mails, which are HIPAA-compliant.4 This method is superior to other services that encrypt e-mails on the server. Several providers, such as Google’s e-mail encryption service Postini, ensure that e-mails are encrypted when they are stored on the server; however, the body text and attachments included in specific e-mails are not encrypted on the senders’ and receivers’ devices. If you lose a connected device, you would still be at risk of a HIPAA breach.
DataMotion’s SecureMail provides detailed tracking and logging of e-mails, which is necessary for auditing purposes. The product also works on mobile devices.
E-mail is a helpful tool for quickly sharing files and an e-mail encryption product such as SecureMail makes it possible to do so securely. Other e-mail encryption products do not securely store and back up all files in a centralized way.
Disclosure: Dr. Cidon is CEO and Co-founder of Sookasa.
1. U.S. Department of Health and Human Services. HIPAA privacy, security, and breach notification audit program. http://www.hhs.gov/ocr/privacy/hipaa/enforcement/audit. Accessed February 12, 2015.
2. Sookasa Web site. How it works. https://www.sookasa.com/how-it-works. Accessed February 12, 2015.
3. TigerText Web site. http://www.tigertext.com. Accessed February 12, 2015.
4. DataMotion Web site. http://datamotion.com/products/securemail/securemail-desktop. Accessed February 12, 2015.
Clozapine Management for Internists
Clozapine is a second‐generation antipsychotic (SGA) medication that was developed in 1959, introduced in Europe in 1971, and withdrawn from the market in 1975 because of concerns about potentially fatal agranulocytosis. In 1989, the US Food and Drug Administration (FDA) approved the use of clozapine for the management of treatment‐resistant schizophrenia, under strict parameters for complete blood count (CBC) monitoring. Clozapine has since gained an additional FDA indication for reducing suicidal behavior in patients with schizophrenia and schizoaffective disorder,[1, 2, 3] and has displayed superiority to both first‐generation antipsychotics and other SGA agents in reducing symptom burden.[2, 4, 5]
Clozapine's clinical benefits include lowering mortality in schizophrenia,[6] reducing deaths from ischemic heart disease,[7] curtailing substance use in individuals with psychotic disorders,[8] increasing rates of independent living and meaningful occupational activity, and reducing psychiatric hospitalizations and the need for involuntary treatment.[9] Because schizophrenia itself is associated with a 15‐ to 20‐year decrease in average lifespan,[10] these benefits of clozapine are particularly salient. Yet the mechanism by which clozapine mitigates otherwise‐refractory psychotic symptoms is a conundrum. Structurally a tricyclic dibenzodiazepine, clozapine has relatively little effect on the dopamine D2 receptor, which has classically been thought to mediate the treatment effect of antipsychotics.[11, 12]
The unique nature of clozapine extends to its adverse effect profile. A significant percentage of patients who discontinue clozapine (17% to 35.4%) cite medical complications, the most common being seizures, constipation, sedation, and neutropenia.[13, 14] Yet several studies, including the landmark Clinical Antipsychotic Trials of Intervention Effectiveness (CATIE) study, have found that patients were more likely to adhere to clozapine therapy than to other antipsychotics.[2, 15] In the CATIE study, 44% of subjects taking clozapine continued the medication for 18 months, compared to 29% of individuals on olanzapine, 14% on risperidone, and 7% on quetiapine. Median time until discontinuation of clozapine was 10.5 months, significantly longer than for quetiapine (2.8 months) and olanzapine (2.7 months).[2] Because patients who experience clozapine‐related medical complications are likely to present first to the primary care or general hospital setting, internists must be aware of potential iatrogenic effects and of their implications for psychiatric and medical care. Using case examples, we will examine both common and serious complications associated with clozapine and discuss recommendations for management, including indications for clozapine discontinuation.
NEUROLOGICAL
Case Vignette 1
Mr. A is a 29‐year‐old man with asthma and schizophrenia who experienced a generalized tonic‐clonic seizure during treatment at a psychiatric facility. The patient started clozapine therapy 5 weeks prior, with gradual titration to 425 mg daily. Mr. A's previous medication trials included olanzapine and chlorpromazine, which rendered little improvement to his chronic auditory hallucinations. Clozapine was temporarily withheld during further neurologic workup, in which both electroencephalogram (EEG) and brain magnetic resonance imaging were unremarkable. After 60 hours, clozapine titration was reinitiated, and valproic acid was started for mood stabilization and seizure prophylaxis. Mr. A was discharged 6 weeks later on clozapine, 600 mg at bedtime, and extended‐release divalproate, 2500 mg at bedtime. The patient suffered no further seizure activity throughout hospitalization and for at least 1 year postdischarge.
Seizures complicate clozapine use in up to 5% of cases, with a dose‐dependent risk pattern.[16] Seizures are most commonly associated with serum clozapine levels above 500 μg/L, but have also been reported with lower levels of clozapine and its metabolite norclozapine.[17] Though nonspecific EEG changes (ie, focal or generalized spikes, spike‐wave and polyspike discharges) have been associated with clozapine administration, they do not reliably predict seizure tendency.[17] Prophylaxis with antiepileptic drugs (AEDs) is not recommended, though AED treatment may be undertaken for patients who experience a seizure while on clozapine. When seizures occur in the context of elevated serum levels, reducing clozapine to the lowest effective dose is preferred over initiating an AED. Although this reduces the potential for exposure to anticonvulsant‐associated adverse effects, it may also introduce the risk of relapsed psychotic symptoms, and therefore requires close monitoring by a psychiatrist. For those who opt to initiate AED therapy, we recommend consideration of each medication's therapeutic and side‐effect profiles based on the patient's medical history and active symptoms. For example, in the case of Mr. A, valproate was used to target concomitant mood symptoms; likewise, patients who experience troublesome weight gain, as well as seizures, may benefit from topiramate. The occurrence of seizures does not preclude continuation of clozapine therapy, in conjunction with an AED[18] and after consideration of potential risks and benefits of use. Clozapine is not contraindicated in patients with well‐controlled epilepsy.[19]
Sedation, the most common neurologic side effect of clozapine, is also dose dependent and often abates during titration.[20] Though clozapine may induce extrapyramidal symptoms, including rigidity, tremor, and dystonia, the risk is considerably lower with clozapine than with other antipsychotics, owing to its lesser affinity for D2 receptors. Associated parkinsonism should prompt consideration of dose reduction, in discussion with a psychiatrist, with concurrent monitoring of serum clozapine levels and close follow‐up for emergence of psychotic symptoms. If dose reduction is ineffective, not indicated, or not preferred by the patient, the addition of an anticholinergic medication may be considered (eg, diphenhydramine 25 to 50 mg, benztropine 1 to 2 mg). Neuroleptic malignant syndrome, although rare, is life‐threatening and warrants immediate discontinuation of clozapine, though successful rechallenge has been reported in case reports.[21]
CARDIAC
Case Vignette 2
Mr. B is a 34‐year‐old man with sinus tachycardia, a benign adrenal tumor, and chronic paranoid schizophrenia that had been poorly responsive to numerous antipsychotic trials. During a psychiatric hospitalization for paranoid delusions with aggressive threats toward family, Mr. B was started on clozapine and titrated to 250 mg daily. On day 16 of clozapine therapy, the patient began to experience cough, and several days later, diffuse rhonchi were noted on examination. Complete blood count revealed WBC 20.3 × 10³/μL, with 37% eosinophils and an absolute eosinophil count of 7.51 × 10³/μL (increased from 12% and 1.90 × 10³/μL the week before), and an electrocardiogram showed sinus tachycardia with ST‐segment changes. Mr. B was transferred to the general medical hospital for workup of presumed myocarditis.
Approximately one‐quarter of patients who take clozapine experience sinus tachycardia, which may be related to clozapine's anticholinergic effects causing rebound noradrenergic elevations[22]; persistent or problematic tachycardia may be treated using a cardioselective β‐blocker. Clozapine has also been linked to significant increases in systolic and diastolic blood pressure in 4% of patients; the risk of hypertension increases with the duration of clozapine treatment and appears to be independent of the patient's weight.[23] Orthostatic hypotension has been reported in 9% of patients on clozapine therapy, though effects can be mitigated with gradual titration, adequate hydration, compression stockings, and patient education. Sinus tachycardia, hypertension, and orthostatic hypotension are not absolute indications to discontinue clozapine; rather, we advocate for treating these side effects while continuing clozapine treatment.[24]
Myocarditis represents the most serious cardiac side effect of clozapine.[25, 26] Although the absolute risk appears to be lower than 0.1%,[24] Kilian et al. calculated a 1000‐ to 2000‐fold increase in the relative risk of myocarditis among patients who take clozapine, compared to the general population.[26] Most cases occur within the first month of treatment, with a median time to onset of 15 days. This time course is consistent with an acute immunoglobulin E‐mediated (type I) hypersensitivity reaction, and eosinophilic infiltrates have been found on autopsy, consistent with an acute drug reaction.[20]
Because of this early onset, the physician should maintain a particularly high index of suspicion in the first months of treatment, rigorously questioning patients and families about signs and symptoms of cardiac disease. If patients on clozapine present with flu‐like symptoms, fever, myalgia, dizziness, chest pain, dyspnea, tachycardia, palpitations, or other signs or symptoms of heart failure, evaluation for myocarditis should be undertaken.[25] Several centers have utilized cardiac enzymes (eg, troponin I, troponin T, creatine kinase‐myocardial band) as a universal screen for myocarditis, though this practice is not widespread.[24] Both tachycardia and flu‐like symptoms may be associated with clozapine, particularly during the titration period, and these are normally benign symptoms requiring no intervention. If the diagnosis of myocarditis is made, however, clozapine should be stopped immediately. Myocarditis is often considered a contraindication to restarting clozapine, though cases of successful clozapine rechallenge have been reported in patients who had previously experienced myocarditis.[21]
Recommendations for clozapine‐associated electrocardiography (ECG) monitoring have not been standardized. Based on common clinical practice and the time course of serious cardiac complications, we recommend baseline ECG prior to the start of clozapine, with follow‐up ECG 2 to 4 weeks after clozapine initiation, and every 6 months thereafter.
GASTROINTESTINAL
Case Vignette 3
Mr. C is a 61‐year‐old man with chronic paranoid schizophrenia and a history of multiple state hospital admissions. He had been maintained on clozapine for 15 years, allowing him to live independently and avoid psychiatric hospitalization. Mr. C was admitted to the general medical hospital with nausea, vomiting, and an inability to tolerate oral intake. He was found to have a high‐grade small‐bowel obstruction, and all oral medications were initially discontinued. After successful management of his acute gastrointestinal presentation and discussion of potential risks and benefits of various treatment options, clozapine was reinitiated along with bulk laxative and stool softening agents.
Affecting 14% to 60% of individuals who are prescribed clozapine, constipation represents the most common associated gastrointestinal complaint.[27] For most patients, this condition is uncomfortable but nonlethal, though it has been implicated in several deaths by aspiration pneumonia and small‐bowel perforation.[28, 29] Providers must screen regularly for constipation and treat aggressively with stimulant laxatives and stool softeners,[18] while reviewing medication lists and, when possible, streamlining extraneous anticholinergic contributors. Clozapine‐prescribed individuals also frequently suffer from gastrointestinal reflux disease (GERD), for which behavioral interventions (eg, smoking cessation or remaining upright for 3 hours after meals) should be considered in addition to pharmacologic treatment with proton pump inhibitors. Clozapine therapy may be continued while constipation and GERD are managed medically.
Potentially fatal gastrointestinal hypomotility and small‐bowel obstruction are rare but well‐described complications that occur in up to 0.3% of patients who take clozapine.[27] This effect appears to be dose dependent, and higher blood levels are associated with greater severity of constipation and risk for serious hypomotility.[27] Clozapine should be withheld during treatment for such serious adverse events as ileus or small‐bowel perforation; however, once these conditions have stabilized, clozapine therapy may be reconsidered based on an analysis of potential benefits and risks. If clozapine is withheld, the internist must monitor for acute worsening of mental status, inattention, and disorientation, as clozapine withdrawal‐related delirium has been reported.[30] Ultimately, aggressive treatment of constipation in conjunction with continued clozapine therapy is the recommended course of action.[28]
Given the increased risk of ileus in the postoperative period, it is particularly important for physicians to inquire about preoperative bowel habits and assess for any existing constipation. Careful monitoring of postoperative bowel motility, along with early and aggressive management of constipation, is recommended. Concurrent administration of other constipating agents (eg, opiates, anticholinergics) should be limited to the lowest effective dose.[27] Although transaminitis, hepatitis, and pancreatitis have all been associated with clozapine in case reports, these are rare,[31] and the approach to management should be considered on a case‐by‐case basis.
HEMATOLOGIC
Case Vignette 4
Ms. D is a 38‐year‐old woman with a schizoaffective disorder who was started on clozapine after 3 other agents had failed to control her psychotic symptoms and alleviate chronic suicidal thoughts. Baseline CBC revealed serum white blood cell count (WBC) of 7800/mm3 and absolute neutrophil count (ANC) of 4700/mm3. In Ms. D's third week of clozapine use, WBC dropped to 4400/mm3 and ANC to 2200/mm3. Repeat lab draw confirmed this, prompting the treatment team to initiate twice‐weekly CBC monitoring. Ms. D's counts continued to fall, and 10 days after the initial drop, WBC was calculated at 1400/mm3 and ANC at 790/mm3. Clozapine was discontinued, and though the patient was asymptomatic, broad‐spectrum antibiotics were initiated. She received daily CBC monitoring until WBC >3000/mm3 and ANC >1500/mm3. An alternate psychotropic medication was initiated several weeks thereafter.
Neutropenia (white blood cell count <3000/mm3) is a common complication that affects approximately 3% of patients who take clozapine.[32] This may be mediated by clozapine's selective impact on the precursors of polymorphonuclear leukocytes, though the mechanism remains unknown.[33] Although neutropenia is not an absolute contraindication for clozapine therapy, guidelines recommend cessation of clozapine when the ANC drops below 1000/mm3.[34] A meta‐analysis of 112 patients who were rechallenged following neutropenia found that 69% tolerated a rechallenge without development of a subsequent dyscrasia.[21]
In the case of chemotherapy‐induced neutropenia, several case reports support the continued use of clozapine during cancer treatment[35]; this requires a written request to the pharmaceutical company that manufactures clozapine and documentation of the expected time course and contribution of chemotherapy to neutropenia.[36] Clozapine's association with neutropenia warrants close monitoring in individuals with human immunodeficiency virus (HIV) and other causes of immune compromise. Reports of clozapine continuation in HIV‐positive individuals underscore the importance of close collaboration between infectious disease and psychiatry, with specific focus on potential interactions between clozapine and antiretroviral agents and close monitoring of viral load and ANC.[37]
The most feared complication of clozapine remains agranulocytosis, defined as ANC<500/mm3,[33] which occurs in up to 1% of monitored patients. In 1975, clozapine was banned worldwide after 8 fatal cases of agranulocytosis were reported in Finland.[38] The drug was reintroduced for treatment‐resistant schizophrenia with strict monitoring parameters, which has sharply reduced the death rate. One study found 12 actual deaths between 1990 and 1994, compared to the 149 predicted deaths without monitoring.[39]
The risk of agranulocytosis appears to be higher in older adults and in patients with a lower baseline WBC count. Although there are reports of delayed agranulocytosis occurring in patients after up to 19 years of treatment,[40] the incidence of leukopenia is greatest in the first year. Given this high‐risk period, mandatory monitoring is as follows: weekly WBC and neutrophil counts for the first 26 weeks, biweekly counts for the second 26 weeks, and every 4 weeks thereafter. Of note, many of the later cases of agranulocytosis appear to be related to medication coadministration, particularly with valproic acid, though no definitive link has been established.[40]
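The mandatory monitoring schedule noted above is a simple step function of time on therapy. As an illustrative sketch only (the function name and the wording of the return values are ours, not taken from any registry software), it can be written as:

```python
def cbc_monitoring_interval(weeks_on_clozapine: int) -> str:
    """Return the mandated CBC monitoring interval for a patient who has
    completed the given number of weeks on clozapine.

    Schedule described in the text: weekly counts for the first 26 weeks,
    every 2 weeks for the second 26 weeks, then every 4 weeks thereafter.
    Illustrative only; defer to current registry requirements.
    """
    if weeks_on_clozapine < 26:
        return "weekly"
    elif weeks_on_clozapine < 52:
        return "every 2 weeks"
    else:
        return "every 4 weeks"
```

For example, a patient in week 30 of therapy would fall into the "every 2 weeks" tier.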
Treatment of clozapine‐induced agranulocytosis consists of immediate clozapine cessation, and consideration of initiation of prophylactic broad‐spectrum antibiotics and granulocyte colony‐stimulating factor (such as filgrastim) until the granulocyte count normalizes.[41, 42] Although few case reports describe successful clozapine rechallenge in patients with a history of agranulocytosis, the data are sparse, and current practice is to permanently discontinue clozapine if ANC falls below 1000/mm3.[21, 41]
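Gathering the ANC cut-points discussed in this section, the decision logic can be sketched as follows. The thresholds come from the text, but the function itself is a hypothetical illustration and is no substitute for registry guidelines or clinical judgment:

```python
def anc_action(anc_per_mm3: int) -> str:
    """Map an absolute neutrophil count (cells/mm^3) to the management
    step described in the text (illustrative sketch only).

    - ANC < 500/mm^3: agranulocytosis; stop clozapine immediately and
      consider broad-spectrum antibiotics and G-CSF (eg, filgrastim).
    - ANC < 1000/mm^3: guidelines recommend cessation of clozapine.
    - Otherwise: continue clozapine with scheduled CBC monitoring.
    """
    if anc_per_mm3 < 500:
        return "agranulocytosis: stop clozapine; consider antibiotics and G-CSF"
    elif anc_per_mm3 < 1000:
        return "stop clozapine per guidelines"
    else:
        return "continue clozapine with scheduled monitoring"
```

In the case of Ms. D, an ANC of 790/mm3 falls into the cessation tier, consistent with the decision to discontinue the drug.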
ADDITIONAL COMPLICATIONS (METABOLIC, RENAL, URINARY)
Moderate to marked weight gain occurs in over 50% of patients treated with clozapine, with average gains of nearly 10% of body weight.[43] In a 10‐year follow‐up study of patients treated with clozapine, Henderson et al. reported an average weight gain of 13 kg, with 34% of studied patients developing diabetes mellitus. Metabolic side effects of second‐generation antipsychotics, including clozapine, are a well‐documented and troubling phenomenon.[44] Limited evidence supports use of metformin, alongside behavioral therapy, for concerns related to glucose dysregulation.[45] Some patients have also experienced weight loss with adjunctive topiramate use, particularly if they have also suffered seizures.[46]
Urinary incontinence and nocturnal enuresis are both associated with clozapine, but are likely under‐reported because of patient and provider embarrassment; providers also may not think to ask about these specific symptoms. First‐line treatment for nocturnal enuresis is to limit fluids in the evening. Desmopressin has a controversial role in treating nocturnal enuresis owing to its risk of hyponatremia; appropriate monitoring should be implemented if this agent is used.[18]
Clozapine has been associated with acute interstitial nephritis (AIN), although this is thought to be a relatively rare side effect. Drug‐induced AIN typically appears soon after initiation and presents with the clinical triad of rash, fever, and eosinophilia. Given that weekly CBC is mandatory in the initiation phase, eosinophilia is easily detectible and may serve as a marker for potential AIN.[47]
Sialorrhea, particularly during sleep, is a bothersome condition affecting up to one‐third of patients who take clozapine.[48] Although clozapine is strongly anticholinergic, its agonist activity at the M4 muscarinic receptor and antagonism of the alpha‐2 adrenergic receptor are postulated as the mechanisms underlying hypersalivation. Sialorrhea is frequently seen early in treatment and does not appear to be dose dependent.[48] Excessive salivation is typically managed with behavioral interventions (eg, utilizing towels or other absorbent materials on top of bedding). If hypersalivation occurs during the day, chewing sugar‐free gum may increase the rate of swallowing and make symptoms less bothersome. If this does not provide adequate relief, practitioners may consider use of atropine 1% solution administered directly to the oral cavity.[49]
DRUG‐DRUG INTERACTIONS
For hospitalists, who must frequently alter existing medications or add new ones, awareness of potential drug‐drug interactions is crucial. Clozapine is metabolized by the cytochrome P450 system, with predominant metabolism through the isoenzymes 1A2, 3A4, and 2D6.[50] Common medications that induce clozapine metabolism (thereby decreasing clozapine levels) include phenytoin, phenobarbital, carbamazepine, oxcarbazepine, and corticosteroids. Conversely, stopping these medications after long‐term therapy will raise clozapine levels. Substances that inhibit clozapine metabolism (thereby increasing clozapine levels) include ciprofloxacin, erythromycin, clarithromycin, fluvoxamine, fluoxetine, paroxetine, protease inhibitors, verapamil, and grapefruit juice. We recommend caution when concurrently administering other agents that increase risk for agranulocytosis, including carbamazepine, trimethoprim‐sulfamethoxazole, sulfasalazine, and tricyclic antidepressants.
Cigarette smoking decreases clozapine blood levels by induction of CYP1A2. Patients require a 10% to 30% reduction to clozapine dose during periods of smoking cessation, including when smoking is stopped during inpatient hospitalization.[51] Nicotine replacement therapy does not induce CYP1A2 and therefore does not have a compensatory effect on clozapine levels. On discharge or resumption of smoking, patients may require an increase of their dose of clozapine to maintain adequate antipsychotic effect.
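As an arithmetic illustration of the 10% to 30% adjustment described above (the 400 mg example dose is hypothetical; actual dosing must be guided by serum levels and the treating psychiatrist):

```python
def dose_range_after_smoking_cessation(current_dose_mg: float) -> tuple[float, float]:
    """Return the (low, high) bounds of a 10% to 30% dose reduction, as
    suggested in the text for patients who stop smoking while on clozapine.
    Illustrative arithmetic only; not dosing advice.
    """
    # A 30% reduction gives the low bound; a 10% reduction gives the high bound.
    return (current_dose_mg * 0.70, current_dose_mg * 0.90)

# Hypothetical example: a patient on 400 mg/day stops smoking on admission.
low, high = dose_range_after_smoking_cessation(400)
# The adjusted dose would fall between 280 mg (30% reduction)
# and 360 mg (10% reduction) per day.
```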
SUMMARY OF RECOMMENDATIONS
Medical complications are cited as the cause in 20% of clozapine discontinuations; most commonly, these include seizures, severe constipation, somnolence, and neutropenia. Given the high risk of psychiatric morbidity posed by discontinuation, we recommend managing mild to moderate symptoms and side effects while continuing the drug, when possible (Table 1). We encourage hospitalists to confer with the patient's psychiatrist or the inpatient psychiatry consultation service when making changes to clozapine therapy. Specific recommendations are as follows:
- We advocate withholding clozapine administration pending medical optimization for several conditions, including small‐bowel obstruction, neuroleptic malignant syndrome, venous thromboembolism, diabetic ketoacidosis, and hyperosmolar coma.
- Clinical scenarios requiring acute discontinuation of clozapine include agranulocytosis and myocarditis. Successful rechallenge with clozapine has been described after both conditions; at the same time, given the high morbidity and mortality of myocarditis and agranulocytosis, re‐initiation of clozapine requires an extensive risk‐benefit discussion with the patient and family, informed consent, and, in the case of agranulocytosis, approval from the national clozapine registry (Table 2).
- Although adjunctive therapy with filgrastim was initially thought to permit a clozapine rechallenge in patients with a history of agranulocytosis, case reports on this strategy have been equivocal, and further research is necessary to determine the most effective strategy for management.
| System | Clinical Lab/Study | Frequency of Monitoring |
|---|---|---|
| Cardiac | Electrocardiogram | Baseline, 2 to 4 weeks after initiation, every 6 months thereafter |
| Cardiac | Cardiac enzymes (eg, troponin I), echocardiogram | No standard guidelines, unless clinically indicated |
| Hematologic | Complete blood count with differential | Baseline, then weekly for 26 weeks, then every 2 weeks for 26 weeks, then every 4 weeks thereafter |
| Metabolic | Body mass index; waist circumference | Baseline, then every 3 to 6 months |
| Metabolic | Fasting glucose | Baseline, then every 6 months |
| Metabolic | Fasting lipid panel | Baseline, then yearly |
| Neurologic | Electroencephalogram | No standard guidelines, unless clinically indicated |
| Vital signs | Heart rate, blood pressure, temperature | Baseline and at each follow‐up visit |
| Requires Acute Clozapine Discontinuation* | Clozapine Interruption During Management | Does Not Typically Require Clozapine Discontinuation |
|---|---|---|
| Agranulocytosis (ANC <1.0 × 10⁹/L) | Diabetic complications (eg, ketoacidosis, hyperosmolar coma) | Constipation |
| Cardiomyopathy (severe) | Gastrointestinal obstruction, ileus | Diabetes mellitus |
| Myocarditis | Neuroleptic malignant syndrome | Gastroesophageal reflux |
| | Venous thromboembolism | Hyperlipidemia |
| | | Hypertension |
| | | Orthostatic hypotension |
| | | Sedation |
| | | Seizures |
| | | Sialorrhea |
| | | Sinus tachycardia |
| | | Urinary changes (eg, enuresis, incontinence) |
| | | Weight gain |
CONCLUSION
Clozapine has been a very successful treatment for patients with schizophrenia who have failed other antipsychotic therapies. However, fears of potential side effects and frequent monitoring have limited its use and led to unnecessary discontinuation. To mitigate risk for serious complications, we hope to increase hospitalists' awareness of prevention, monitoring, and treatment of side effects, and to promote comfort with circumstances that warrant continuation or discontinuation of clozapine (Table 3). The hospitalist plays a crucial role in managing these complications as well as conveying information and recommendations to primary care providers; as such, their familiarity with the medication is essential for proper management of individuals who take clozapine.
| Take‐Home Points |
|---|
| 1. Clozapine is the gold standard for treatment‐resistant schizophrenia; however, its use is limited by side effects, many of which can be successfully treated by internists. |
| 2. There are few indications for discontinuing clozapine (myocarditis, small‐bowel obstruction, agranulocytosis). The psychiatry service should be consulted in the event that clozapine is discontinued. |
| 3. Seizures are not an indication for discontinuing clozapine; instead, we recommend adding an antiepileptic drug. |
| 4. All second‐generation antipsychotics are associated with diabetes mellitus and significant weight gain. Clozapine is more highly associated with metabolic side effects than many other medications in this class. |
| 5. Sedation, sialorrhea, and constipation are common and can be managed pharmacologically and with behavioral interventions. |
Disclosure: Nothing to report.
- , , , . Clozapine versus typical neuroleptic medication for schizophrenia. Cochrane Database Syst Rev. 2009(1):CD000059.
- , , , et al. Effectiveness of clozapine versus olanzapine, quetiapine, and risperidone in patients with chronic schizophrenia who did not respond to prior atypical antipsychotic treatment. Am J Psychiatry. 2006;163(4):600–610.
- , , , et al. Randomized controlled trial of effect of prescription of clozapine versus other second‐generation antipsychotic drugs in resistant schizophrenia. Schizophr Bull. 2006;32(4):715–723.
- , , , et al. Effects of clozapine on positive and negative symptoms in outpatients with schizophrenia. Am J Psychiatry. 1994;151(1):20–26.
- , , , . Clozapine for the treatment‐resistant schizophrenic. A double‐blind comparison with chlorpromazine. Arch Gen Psychiatry. 1988;45(9):789–796.
- , , , , , . Clozapine treatment for suicidality in schizophrenia: International Suicide Prevention Trial (InterSePT). Arch Gen Psychiatry. 2003;60(1):82–91.
- , , , et al. 11‐year follow‐up of mortality in patients with schizophrenia: a population‐based cohort study (FIN11 study). Lancet. 2009;374(9690):620–627.
- , , , , . Clozapine use and relapses of substance use disorder among patients with co‐occurring schizophrenia and substance use disorders. Schizophr Bull. 2006;32(4):637–643.
- , , . Outcomes for schizophrenia patients with clozapine treatment: how good does it get? J Psychopharmacol. 2009;23(8):957–965.
- , , , . Morbidity and mortality in people with serious mental illness. National Association of State Mental Health Program Directors (NASMHPD) Medical Directors Council. Available at: http://www.nasmhpd.org/docs/publications/MDCdocs/Mortality%20and%20Morbidity%20Final%20Report%208.18.08.pdf. Accessed February 3, 2015.
- , . Pharmacological actions of the atypical antipsychotic drug clozapine: a review. Synapse. 1996;24(4):349–394.
- , . Clozapine. A novel antipsychotic agent. N Engl J Med. 1991;324(11):746–754.
- , . Reason for clozapine cessation. Acta Psychiatr Scand. 2012;125(1):39–44.
- , , , . Termination of clozapine treatment due to medical reasons: when is it warranted and how can it be avoided? J Clin Psychiatry. 2013;74(6):603–613.
- , , , , , . Time to discontinuation of antipsychotic drugs in a schizophrenia cohort: influence of current treatment strategies. Ther Adv Psychopharmacol. 2014;4(6):228–239.
- , , . Clozapine‐related seizures. Neurology. 1991;41(3):369–371.
- Clozapine‐related EEG changes and seizures: dose and plasma‐level relationships. Ther Adv Psychopharmacol. 2011;1(2):47–66.
- Review and management of clozapine side effects. J Clin Psychiatry. 2000;61(suppl 8):14–17; discussion 18–19.
- Epilepsy, psychosis and clozapine. Human Psychopharmacol Clin Exp. 2002;17:115–119.
- Response of patients with treatment‐refractory schizophrenia to clozapine within three serum level ranges. Am J Psychiatry. 1996;153(12):1579–1584.
- When can patients with potentially life‐threatening adverse effects be rechallenged with clozapine? A systematic review of the published literature. Schizophr Res. 2012;134(2–3):180–186.
- Clinical profile of clozapine: adverse reactions and agranulocytosis. Psychiatr Q. 1992;63(1):51–70.
- Clozapine and hypertension: a chart review of 82 patients. J Clin Psychiatry. 2004;65(5):686–689.
- Adverse cardiac effects associated with clozapine. J Clin Psychopharmacol. 2005;25(1):32–41.
- Clozapine induced myocarditis: a rare but fatal complication. Int J Cardiol. 2006;112(2):e5–e6.
- Myocarditis and cardiomyopathy associated with clozapine. Lancet. 1999;354(9193):1841–1845.
- Life‐threatening clozapine‐induced gastrointestinal hypomotility: an analysis of 102 cases. J Clin Psychiatry. 2008;69(5):759–768.
- Fatalities associated with clozapine‐related constipation and bowel obstruction: a literature review and two case reports. Psychosomatics. 2009;50(4):416–419.
- Death from clozapine‐induced constipation: case report and literature review. Psychosomatics. 2002;43(1):71–73.
- Clozapine: a clinical review of adverse effects and management. Ann Clin Psychiatry. 2003;15(1):33–48.
- Beyond white blood cell monitoring: screening in the initial phase of clozapine therapy. J Clin Psychiatry. 2012;73(10):1307–1312.
- Clozapine [package insert]. Sellersville, PA: TEVA Pharmaceuticals USA; 2013. Available at: https://www.clozapineregistry.com/insert.pdf.ashx. Accessed October 27, 2014.
- Clozapine‐induced agranulocytosis. Incidence and risk factors in the United States. N Engl J Med. 1993;329(3):162–167.
- Clozaril (clozapine) prescribing information. Washington, DC: U.S. Food and Drug Administration; 2013. Available at: http://www.accessdata.fda.gov/drugsatfda_docs/label/2013/019758s069s071lbl.pdf. Accessed February 4, 2015.
- Clozapine therapy during cancer treatment. Am J Psychiatry. 2004;161(1):175.
- Continuation of clozapine during chemotherapy: a case report and review of literature. Psychosomatics. 2014;55(6):673–679.
- Clozapine use in HIV‐infected schizophrenia patients: a case‐based discussion and review. Psychosomatics. 2009;50(6):626–632.
- Letter: clozapine and agranulocytosis. Lancet. 1975;2(7935):611.
- Effects of the clozapine national registry system on incidence of deaths related to agranulocytosis. Psychiatr Serv. 1996;47(1):52–56.
- White blood cell monitoring during long‐term clozapine treatment. Am J Psychiatry. 2013;170(4):366–369.
- Add‐on filgrastim during clozapine rechallenge in patients with a history of clozapine‐related granulocytopenia/agranulocytosis. Am J Psychiatry. 2009;166(2):236.
- Add‐on filgrastim during clozapine rechallenge unsuccessful in preventing agranulocytosis. Gen Hosp Psychiatry. 2013;35(5):576.e11–12.
- Clozapine‐induced weight gain: prevalence and clinical relevance. Am J Psychiatry. 1992;149(1):68–72.
- Clozapine, diabetes mellitus, hyperlipidemia, and cardiovascular risks and mortality: results of a 10‐year naturalistic study. J Clin Psychiatry. 2005;66(9):1116–1121.
- Effects of adjunctive metformin on metabolic traits in nondiabetic clozapine‐treated patients with schizophrenia and the effect of metformin discontinuation on body weight: a 24‐week, randomized, double‐blind, placebo‐controlled study. J Clin Psychiatry. 2013;74(5):e424–e430.
- Topiramate for clozapine‐induced seizures. Am J Psychiatry. 2001;158(6):968–969.
- Clozapine‐induced acute interstitial nephritis. Lancet. 1999;354(9185):1180–1181.
- Update on the clinical efficacy and side effects of clozapine. Schizophr Bull. 1991;17(2):247–261.
- Clozapine‐induced sialorrhea: pathophysiology and management strategies. Psychopharmacology. 2006;185(3):265–273.
- Clozapine drug‐drug interactions: a review of the literature. Hum Psychopharm Clin. 1997;12(1):5–20.
- The effect of smoking and cytochrome P450 CYP1A2 genetic polymorphism on clozapine clearance and dose requirement. Pharmacogenetics. 2003;13(3):169–172.
Clozapine is a second‐generation antipsychotic (SGA) medication that was developed in 1959, introduced in Europe in 1971, and withdrawn from the market in 1975 due to concerns for potentially fatal agranulocytosis. In 1989, the US Food and Drug Administration (FDA) approved the use of clozapine for the management of treatment‐resistant schizophrenia, under strict parameters for complete blood count (CBC) monitoring. Clozapine has since gained an additional FDA indication for reducing suicidal behavior in patients with schizophrenia and schizoaffective disorder,[1, 2, 3] and has displayed superiority to both first‐generation antipsychotics and other SGA agents in reducing symptom burden.[2, 4, 5]
Clozapine's clinical benefits include lowering mortality in schizophrenia,[6] reducing deaths from ischemic heart disease,[7] curtailing substance use in individuals with psychotic disorders,[8] increasing rates of independent living and meaningful occupational activity, and reducing psychiatric hospitalizations and need for involuntary treatment.[9] Because schizophrenia, itself, is associated with a 15‐ to 20‐year decrease in average lifespan,[10] these benefits of clozapine are particularly salient. Yet the mechanism by which clozapine mitigates otherwise‐refractory psychotic symptoms is a conundrum. Structurally a tricyclic dibenzodiazepine, clozapine has relatively little effect on the dopamine D2 receptor, which has classically been thought to mediate the treatment effect of antipsychotics.[11, 12]
The unique nature of clozapine extends to its adverse effect profile. A significant percentage of patients who discontinue clozapine (17% to 35.4%) cite medical complications, the most common being seizures, constipation, sedation, and neutropenia.[13, 14] Yet several studies, including the landmark Clinical Antipsychotic Trials of Intervention Effectiveness (CATIE) study, have found that patients were more likely to adhere to clozapine therapy than to other antipsychotics.[2, 15] In the CATIE study, 44% of subjects taking clozapine continued the medication for 18 months, compared to 29% of individuals on olanzapine, 14% on risperidone, and 7% on quetiapine. Median time until discontinuation of clozapine was 10.5 months, significantly longer than for quetiapine (2.8 months) and olanzapine (2.7 months).[2] Because patients who experience clozapine‐related medical complications are likely to present first to the primary care or general hospital setting, internists must be aware of potential iatrogenic effects, and of their implications for psychiatric and medical care. Using case examples, we will examine both common and serious complications associated with clozapine, and discuss recommendations for management, including indications for clozapine discontinuation.
NEUROLOGICAL
Case Vignette 1
Mr. A is a 29‐year‐old man with asthma and schizophrenia who experienced a generalized tonic‐clonic seizure during treatment at a psychiatric facility. The patient started clozapine therapy 5 weeks prior, with gradual titration to 425 mg daily. Mr. A's previous medication trials included olanzapine and chlorpromazine, which rendered little improvement to his chronic auditory hallucinations. Clozapine was temporarily withheld during further neurologic workup, in which both electroencephalogram (EEG) and brain magnetic resonance imaging were unremarkable. After 60 hours, clozapine titration was reinitiated, and valproic acid was started for mood stabilization and seizure prophylaxis. Mr. A was discharged 6 weeks later on clozapine, 600 mg at bedtime, and extended‐release divalproate, 2500 mg at bedtime. The patient suffered no further seizure activity throughout hospitalization and for at least 1 year postdischarge.
Seizures complicate clozapine use in up to 5% of cases, with a dose‐dependent risk pattern.[16] Seizures are most commonly associated with serum clozapine levels above 500 µg/L, but have also been reported with lower levels of clozapine and its metabolite norclozapine.[17] Though nonspecific EEG changes (ie, focal or generalized spikes, spike‐wave and polyspike discharges) have been associated with clozapine administration, they do not reliably predict seizure tendency.[17] Prophylaxis with antiepileptic drugs (AEDs) is not recommended, though AED treatment may be undertaken for patients who experience a seizure while on clozapine. When seizures occur in the context of elevated serum levels, reducing clozapine to the lowest effective dose is preferred over initiating an AED. Although this reduces the potential for exposure to anticonvulsant‐associated adverse effects, it may also introduce the risk of relapsed psychotic symptoms, and therefore requires close monitoring by a psychiatrist. For those who opt to initiate AED therapy, we recommend consideration of each medication's therapeutic and side‐effect profiles based on the patient's medical history and active symptoms. For example, in the case of Mr. A, valproate was used to target concomitant mood symptoms; likewise, patients who experience troublesome weight gain, as well as seizures, may benefit from topiramate. The occurrence of seizures does not preclude continuation of clozapine therapy, in conjunction with an AED[18] and after consideration of potential risks and benefits of use. Clozapine is not contraindicated in patients with well‐controlled epilepsy.[19]
Sedation, the most common neurologic side effect of clozapine, is also dose dependent and often abates during titration.[20] Though clozapine may induce extrapyramidal symptoms, including rigidity, tremor, and dystonia, the risk is considerably lower with clozapine than with other antipsychotics, owing to its lesser affinity for D2 receptors. Associated parkinsonism should prompt consideration of dose reduction, in discussion with a psychiatrist, with concurrent monitoring of serum clozapine levels and close follow‐up for emergence of psychotic symptoms. If dose reduction is ineffective, not indicated, or not preferred by the patient, the addition of an anticholinergic medication may be considered (eg, diphenhydramine 25–50 mg, benztropine 1–2 mg). Neuroleptic malignant syndrome, although rare, is life‐threatening and warrants immediate discontinuation of clozapine, though successful rechallenge has been reported in case reports.[21]
CARDIAC
Case Vignette 2
Mr. B is a 34‐year‐old man with sinus tachycardia, a benign adrenal tumor, and chronic paranoid schizophrenia that had been poorly responsive to numerous antipsychotic trials. During a psychiatric hospitalization for paranoid delusions with aggressive threats toward family, Mr. B was started on clozapine and titrated to 250 mg daily. On day 16 of clozapine therapy, the patient began to experience cough, and several days later, diffuse rhonchi were noted on examination. Complete blood count revealed WBC 20.3 × 10³/µL, with 37% eosinophils and an absolute eosinophil count of 7.51 × 10³/µL (increased from 12% and 1.90 × 10³/µL the week before), and an electrocardiogram showed sinus tachycardia with ST‐segment changes. Mr. B was transferred to the general medical hospital for workup of presumed myocarditis.
Approximately one‐quarter of patients who take clozapine experience sinus tachycardia, which may be related to clozapine's anticholinergic effects causing rebound noradrenergic elevations[22]; persistent or problematic tachycardia may be treated with a cardioselective β‐blocker. Clozapine has also been linked to significant increases in systolic and diastolic blood pressure in 4% of patients in monitoring data; the risk of hypertension increases with the duration of clozapine treatment, and appears to be independent of the patient's weight.[23] Orthostatic hypotension has been reported in 9% of patients on clozapine therapy, though its effects can be mitigated with gradual titration, adequate hydration, compression stockings, and patient education. Sinus tachycardia, hypertension, and orthostatic hypotension are not absolute indications to discontinue clozapine; rather, we advocate for treating these side effects while continuing clozapine treatment.[24]
Myocarditis represents the most serious cardiac side effect of clozapine.[25, 26] Although the absolute risk appears to be lower than 0.1%,[24] Kilian et al. calculated a 1000‐ to 2000‐fold increase in the relative risk of myocarditis among patients who take clozapine, compared to the general population.[26] Most cases occur within the first month of treatment, with a median time to onset of 15 days. This time course is consistent with an acute immunoglobulin E‐mediated hypersensitivity (type 1) reaction, and eosinophilic infiltrates have been found on autopsy, consistent with an acute drug reaction.[20]
Because of this early onset, the physician should maintain a particularly high index of suspicion in the first months of treatment, rigorously questioning patients and families about signs and symptoms of cardiac disease. If patients on clozapine present with flu‐like symptoms, fever, myalgia, dizziness, chest pain, dyspnea, tachycardia, palpitations, or other signs or symptoms of heart failure, evaluation for myocarditis should be undertaken.[25] Several centers have utilized cardiac enzymes (eg, troponin I, troponin T, creatine kinase‐myocardial band) as a routine screen for myocarditis, though this is not universal practice.[24] Both tachycardia and flu‐like symptoms may be associated with clozapine, particularly during the titration period, and these are normally benign symptoms requiring no intervention. If the diagnosis of myocarditis is made, however, clozapine should be stopped immediately. Myocarditis is often considered to be a contraindication to restarting clozapine, though cases have been reported of successful clozapine rechallenge in patients who had previously experienced myocarditis.[21]
Recommendations for clozapine‐associated electrocardiography (ECG) monitoring have not been standardized. Based on common clinical practice and the time course of serious cardiac complications, we recommend baseline ECG prior to the start of clozapine, with follow‐up ECG 2 to 4 weeks after clozapine initiation, and every 6 months thereafter.
GASTROINTESTINAL
Case Vignette 3
Mr. C is a 61‐year‐old man with chronic paranoid schizophrenia and a history of multiple state‐hospital admissions. He had been maintained on clozapine for 15 years, allowing him to live independently and avoid psychiatric hospitalization. Mr. C was admitted to the general medical hospital with nausea, vomiting, and an inability to tolerate oral intake. He was found to have a high‐grade small‐bowel obstruction, and all oral medications were initially discontinued. After successful management of his acute gastrointestinal presentation and discussion of potential risks and benefits of various treatment options, clozapine was reinitiated along with bulk laxative and stool softening agents.
Affecting 14% to 60% of individuals who are prescribed clozapine, constipation represents the most common associated gastrointestinal complaint.[27] For most patients, this condition is uncomfortable but nonlethal, though it has been implicated in several deaths by aspiration pneumonia and small‐bowel perforation.[28, 29] Providers must screen regularly for constipation and treat aggressively with stimulant laxatives and stool softeners,[18] while reviewing medication lists and, when possible, streamlining extraneous anticholinergic contributors. Clozapine‐prescribed individuals also frequently suffer from gastrointestinal reflux disease (GERD), for which behavioral interventions (eg, smoking cessation or remaining upright for 3 hours after meals) should be considered in addition to pharmacologic treatment with proton pump inhibitors. Clozapine therapy may be continued while constipation and GERD are managed medically.
Potentially fatal gastrointestinal hypomotility and small‐bowel obstruction are rare but well‐described complications that occur in up to 0.3% of patients who take clozapine.[27] This effect appears to be dose dependent, and higher blood levels are associated with greater severity of constipation and risk for serious hypomotility.[27] Clozapine should be withheld during treatment for such serious adverse events as ileus or small‐bowel perforation; however, once these conditions have stabilized, clozapine therapy may be reconsidered based on an analysis of potential benefits and risks. If clozapine is withheld, the internist must monitor for acute worsening of mental status, inattention, and disorientation, as clozapine withdrawal‐related delirium has been reported.[30] Ultimately, aggressive treatment of constipation in conjunction with continued clozapine therapy is the recommended course of action.[28]
Given the increased risk of ileus in the postoperative period, it is particularly important for physicians to inquire about preoperative bowel habits and assess for any existing constipation. Careful monitoring of postoperative bowel motility, along with early and aggressive management of constipation, is recommended. Concurrent administration of other constipating agents (eg, opiates, anticholinergics) should be limited to the lowest effective dose.[27] Although transaminitis, hepatitis, and pancreatitis have all been associated with clozapine in case reports, these are rare,[31] and the approach to management should be considered on a case‐by‐case basis.
HEMATOLOGIC
Case Vignette 4
Ms. D is a 38‐year‐old woman with a schizoaffective disorder who was started on clozapine after 3 other agents had failed to control her psychotic symptoms and alleviate chronic suicidal thoughts. Baseline CBC revealed serum white blood cell count (WBC) of 7800/mm3 and absolute neutrophil count (ANC) of 4700/mm3. In Ms. D's third week of clozapine use, WBC dropped to 4400/mm3 and ANC to 2200/mm3. Repeat lab draw confirmed this, prompting the treatment team to initiate twice‐weekly CBC monitoring. Ms. D's counts continued to fall, and 10 days after the initial drop, WBC was calculated at 1400/mm3 and ANC at 790/mm3. Clozapine was discontinued, and though the patient was asymptomatic, broad‐spectrum antibiotics were initiated. She received daily CBC monitoring until WBC >3000/mm3 and ANC >1500/mm3. An alternate psychotropic medication was initiated several weeks thereafter.
Neutropenia (white blood cell count <3000/mm3) is a common complication that affects approximately 3% of patients who take clozapine.[32] This may be mediated by clozapine's selective impact on the precursors of polymorphonuclear leukocytes, though the mechanism remains unknown.[33] Although neutropenia is not an absolute contraindication for clozapine therapy, guidelines recommend cessation of clozapine when the ANC drops below 1000/mm3.[34] A meta‐analysis of 112 patients who were rechallenged following neutropenia found that 69% tolerated a rechallenge without development of a subsequent dyscrasia.[21]
In the case of chemotherapy‐induced neutropenia, several case reports support the continued use of clozapine during cancer treatment[35]; this requires a written request to the pharmaceutical company that manufactures clozapine and documentation of the expected time course and contribution of chemotherapy to neutropenia.[36] Clozapine's association with neutropenia warrants close monitoring in individuals with human immunodeficiency virus (HIV) and other causes of immune compromise. Reports of clozapine continuation in HIV‐positive individuals underscore the importance of close collaboration between infectious disease and psychiatry, with specific focus on potential interactions between clozapine and antiretroviral agents and close monitoring of viral load and ANC.[37]
The most feared complication of clozapine remains agranulocytosis, defined as ANC<500/mm3,[33] which occurs in up to 1% of monitored patients. In 1975, clozapine was banned worldwide after 8 fatal cases of agranulocytosis were reported in Finland.[38] The drug was reintroduced for treatment‐resistant schizophrenia with strict monitoring parameters, which has sharply reduced the death rate. One study found 12 actual deaths between 1990 and 1994, compared to the 149 predicted deaths without monitoring.[39]
The risk of agranulocytosis appears to be higher in older adults and in patients with a lower baseline WBC count. Although there are reports of delayed agranulocytosis occurring in patients after up to 19 years of treatment,[40] the incidence of leukopenia is greatest in the first year. Given this high‐risk period, mandatory monitoring is as follows: weekly WBC and neutrophil counts for the first 26 weeks, biweekly counts for the second 26 weeks, and every 4 weeks thereafter. Of note, many of the later cases of agranulocytosis appear to be related to medication coadministration, particularly with valproic acid, though no definitive link has been established.[40]
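The mandated CBC schedule above reduces to a simple rule on weeks of therapy. The sketch below is purely illustrative of that schedule (the function and variable names are ours, not from any registry software) and is not a clinical tool:

```python
def cbc_monitoring_interval(weeks_on_clozapine: int) -> int:
    """Return the CBC monitoring interval, in weeks, for a given
    point in clozapine therapy, per the schedule described above:
    weekly for the first 26 weeks, every 2 weeks for the second
    26 weeks, and every 4 weeks thereafter. Illustrative only."""
    if weeks_on_clozapine < 0:
        raise ValueError("weeks_on_clozapine must be non-negative")
    if weeks_on_clozapine < 26:   # first 26 weeks: weekly counts
        return 1
    if weeks_on_clozapine < 52:   # second 26 weeks: biweekly counts
        return 2
    return 4                      # beyond 1 year: every 4 weeks
```

For example, a patient at week 30 of therapy would be drawn every 2 weeks under this schedule.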
Treatment of clozapine‐induced agranulocytosis consists of immediate clozapine cessation, and consideration of initiation of prophylactic broad‐spectrum antibiotics and granulocyte colony‐stimulating factor (such as filgrastim) until the granulocyte count normalizes.[41, 42] Although few case reports describe successful clozapine rechallenge in patients with a history of agranulocytosis, the data are sparse, and current practice is to permanently discontinue clozapine if ANC falls below 1000/mm3.[21, 41]
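The ANC thresholds discussed in this section (discontinuation below 1000/mm3, agranulocytosis below 500/mm3, and the 1500/mm3 recovery target from the vignette) can be summarized as a small decision sketch. This is our paraphrase for illustration only; the action strings and the 1000–1500/mm3 "heightened monitoring" band are not registry language, and any real decision belongs to the treating team:

```python
def anc_action(anc_per_mm3: int) -> str:
    """Map an absolute neutrophil count (cells/mm3) to the management
    step described above. Illustrative paraphrase, not clinical advice."""
    if anc_per_mm3 < 500:
        return "agranulocytosis: stop clozapine; consider antibiotics and G-CSF"
    if anc_per_mm3 < 1000:
        return "stop clozapine per guidelines; monitor CBC closely"
    if anc_per_mm3 < 1500:
        return "neutropenia range: increase CBC monitoring frequency"
    return "continue clozapine with routine monitoring"
```

Applied to Ms. D's course, an ANC of 790/mm3 falls in the "stop clozapine" band, consistent with the vignette.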
ADDITIONAL COMPLICATIONS (METABOLIC, RENAL, URINARY)
Moderate to marked weight gain occurs in over 50% of patients treated with clozapine, with average gains of nearly 10% body weight.[43] In a 10‐year follow‐up study of patients treated with clozapine, Henderson et al. reported an average weight gain of 13 kg, with 34% of studied patients developing diabetes mellitus. Metabolic side effects of second‐generation antipsychotics, including clozapine, are a well‐documented and troubling phenomenon.[44] Limited evidence supports use of metformin, alongside behavioral therapy, for concerns related to glucose dysregulation.[45] Some patients have also experienced weight loss with adjunctive topiramate use, particularly if they have also suffered seizures.[46]
Urinary incontinence and nocturnal enuresis are both associated with clozapine, but are likely under‐reported because of patient and provider embarrassment; providers also may not think to ask about these specific symptoms. First‐line treatment for nocturnal enuresis is to limit fluids in the evening. Desmopressin has a controversial role in treating nocturnal enuresis owing to its risk of hyponatremia; appropriate monitoring should be implemented if this agent is used.[18]
Clozapine has been associated with acute interstitial nephritis (AIN), although this is thought to be a relatively rare side effect. Drug‐induced AIN typically appears soon after initiation and presents with the clinical triad of rash, fever, and eosinophilia. Given that weekly CBC is mandatory in the initiation phase, eosinophilia is easily detectible and may serve as a marker for potential AIN.[47]
Sialorrhea, particularly during sleep, is a bothersome condition affecting up to one‐third of patients who take clozapine.[48] Although clozapine is strongly anticholinergic, its agonist activity at the M4 muscarinic receptor and antagonism of the alpha‐2 adrenergic receptor are postulated as the mechanisms underlying hypersalivation. Sialorrhea is frequently seen early in treatment and does not appear to be dose dependent.[48] Excessive salivation is typically managed with behavioral interventions (eg, utilizing towels or other absorbent materials on top of bedding). If hypersalivation occurs during the day, chewing sugar‐free gum may increase the rate of swallowing and make symptoms less bothersome. If this does not provide adequate relief, practitioners may consider use of atropine 1% solution administered directly to the oral cavity.[49]
DRUG‐DRUG INTERACTIONS
For hospitalists, who must frequently alter existing medications or add new ones, awareness of potential drug‐drug interactions is crucial. Clozapine is metabolized by the cytochrome p450 system, with predominant metabolism through the isoenzymes 1A2, 3A4, and 2D6.[50] Common medications that induce clozapine metabolism (thereby decreasing clozapine levels) include phenytoin, phenobarbital, carbamazepine, oxcarbazepine, and corticosteroids. Conversely, stopping these medications after long‐term therapy will raise clozapine levels. Substances that inhibit clozapine metabolism (thereby increasing clozapine levels) include ciprofloxacin, erythromycin, clarithromycin, fluvoxamine, fluoxetine, paroxetine, protease inhibitors, verapamil, and grapefruit juice. We recommend caution when concurrently administering other agents that increase risk for agranulocytosis, including carbamazepine, trimethoprim‐sulfamethoxazole, sulfasalazine, and tricyclic antidepressants.
Cigarette smoking decreases clozapine blood levels by induction of CYP1A2. Patients require a 10% to 30% reduction to clozapine dose during periods of smoking cessation, including when smoking is stopped during inpatient hospitalization.[51] Nicotine replacement therapy does not induce CYP1A2 and therefore does not have a compensatory effect on clozapine levels. On discharge or resumption of smoking, patients may require an increase of their dose of clozapine to maintain adequate antipsychotic effect.
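The 10% to 30% dose reduction on smoking cessation is simple arithmetic. A hypothetical helper (the name and rounding are ours) that returns the suggested range:

```python
def dose_range_on_smoking_cessation(current_dose_mg: float) -> tuple:
    """Suggested clozapine dose range (mg) after smoking cessation,
    applying the 10%-30% reduction described above. Illustrative only;
    real adjustment should be guided by serum levels and a psychiatrist."""
    return (round(current_dose_mg * 0.70), round(current_dose_mg * 0.90))
```

For a patient maintained on 400 mg daily, this suggests a range of roughly 280 to 360 mg while the patient is not smoking, to be re‐escalated if smoking resumes.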
SUMMARY OF RECOMMENDATIONS
Medical complications are cited as the cause in 20% of clozapine discontinuations; most commonly, these include seizures, severe constipation, somnolence, and neutropenia. Given the high risk of psychiatric morbidity posed by discontinuation, we recommend managing mild to moderate symptoms and side effects while continuing the drug, when possible (Table 1). We encourage hospitalists to confer with the patient's psychiatrist or the inpatient psychiatry consultation service when making changes to clozapine therapy. Specific recommendations are as follows:
- We advocate withholding clozapine administration pending medical optimization for several conditions, including small‐bowel obstruction, neuroleptic malignant syndrome, venous thromboembolism, diabetic ketoacidosis, and hyperosmolar coma.
- Clinical scenarios requiring acute discontinuation of clozapine include agranulocytosis and myocarditis. Successful rechallenge with clozapine has been described after both conditions; at the same time, given the high morbidity and mortality of myocarditis and agranulocytosis, re‐initiation of clozapine requires an extensive risk‐benefit discussion with the patient and family, informed consent, and, in the case of agranulocytosis, approval from the national clozapine registry (Table 2).
- Although adjunctive therapy with filgrastim was initially thought to permit a clozapine rechallenge in patients with a history of agranulocytosis, case reports on this strategy have been equivocal, and further research is necessary to determine the most effective strategy for management.
| | Clinical Lab/Study | Frequency of Monitoring |
|---|---|---|
| Cardiac | Electrocardiogram | Baseline, 2 to 4 weeks after initiation, every 6 months thereafter |
| | Cardiac enzymes (eg, troponin I), echocardiogram | No standard guidelines, unless clinically indicated |
| Hematologic | Complete blood count with differential | Baseline, then weekly for 26 weeks, then every 2 weeks for 26 weeks, then every 4 weeks thereafter |
| Metabolic | Body mass index; waist circumference | Baseline, then every 3 to 6 months |
| | Fasting glucose | Baseline, then every 6 months |
| | Fasting lipid panel | Baseline, then yearly |
| Neurologic | Electroencephalogram | No standard guidelines, unless clinically indicated |
| Vital signs | Heart rate, blood pressure, temperature | Baseline and at each follow‐up visit |
| Requires Acute Clozapine Discontinuation* | Clozapine Interruption During Management | Does Not Typically Require Clozapine Discontinuation |
|---|---|---|
| Agranulocytosis (ANC <1000/mm3) | Diabetic complications (eg, ketoacidosis, hyperosmolar coma) | Constipation |
| Cardiomyopathy (severe) | Gastrointestinal obstruction, ileus | Diabetes mellitus |
| Myocarditis | Neuroleptic malignant syndrome | Gastroesophageal reflux |
| | Venous thromboembolism | Hyperlipidemia |
| | | Hypertension |
| | | Orthostatic hypotension |
| | | Sedation |
| | | Seizures |
| | | Sialorrhea |
| | | Sinus tachycardia |
| | | Urinary changes (eg, enuresis, incontinence) |
| | | Weight gain |
CONCLUSION
Clozapine has been a very successful treatment for patients with schizophrenia who have failed other antipsychotic therapies. However, fears of potential side effects and frequent monitoring have limited its use and led to unnecessary discontinuation. To mitigate risk for serious complications, we hope to increase hospitalists' awareness of prevention, monitoring, and treatment of side effects, and to promote comfort with circumstances that warrant continuation or discontinuation of clozapine (Table 3). Hospitalists play a crucial role in managing these complications, as well as conveying information and recommendations to primary care providers; as such, their familiarity with the medication is essential for proper management of individuals who take clozapine.
| Take‐Home Points |
|---|
| 1. Clozapine is the gold standard for treatment‐resistant schizophrenia; however, its use is limited by side effects, many of which can be successfully treated by internists. |
| 2. There are few indications for discontinuing clozapine (myocarditis, small‐bowel obstruction, agranulocytosis). The psychiatry service should be consulted in the event that clozapine is discontinued. |
| 3. Seizures are not an indication for discontinuing clozapine; instead, we recommend adding an antiepileptic drug. |
| 4. All second‐generation antipsychotics are associated with diabetes mellitus and significant weight gain. Clozapine is more highly associated with metabolic side effects than many other medications in this class. |
| 5. Sedation, sialorrhea, and constipation are common and can be managed pharmacologically and with behavioral interventions. |
Disclosure: Nothing to report.
Clozapine is a second‐generation antipsychotic (SGA) medication that was developed in 1959, introduced to Europe in 1971, and withdrawn from the market in 1975 due to associated concerns for potentially fatal agranulocytosis. In 1989, the US Food and Drug Administration (FDA) approved use of clozapine for the management of treatment‐resistant schizophrenia, under strict parameters for complete blood count (CBC) monitoring. Clozapine has since gained an additional FDA indication for reducing suicidal behavior in patients with schizophrenia and schizoaffective disorder,[1, 2, 3] and displayed superiority to both first generation antipsychotics and other SGA agents in reducing symptom burden.[2, 4, 5]
Clozapine's clinical benefits include lowering mortality in schizophrenia,[6] reducing deaths from ischemic heart disease,[7] curtailing substance use in individuals with psychotic disorders,[8] increasing rates of independent living and meaningful occupational activity, and reducing psychiatric hospitalizations and need for involuntary treatment.[9] Because schizophrenia, itself, is associated with a 15‐ to 20‐year decrease in average lifespan,[10] these benefits of clozapine are particularly salient. Yet the mechanism by which clozapine mitigates otherwise‐refractory psychotic symptoms is a conundrum. Structurally a tricyclic dibenzodiazepine, clozapine has relatively little effect on the dopamine D2 receptor, which has classically been thought to mediate the treatment effect of antipsychotics.[11, 12]
The unique nature of clozapine extends to its adverse effect profile. A significant percentage of patients who discontinue clozapine (17%35.4%) cite medical complications, the most common being seizures, constipation, sedation, and neutropenia.[13, 14] Yet several studies, including the landmark Clinical Antipsychotic Trials for Interventions Effectiveness (CATIE) study, have found that patients were more likely to adhere to clozapine therapy than to other antipsychotics.[2, 15] In the CATIE study, 44% of subjects taking clozapine continued the medication for 18 months, compared to 29% of individuals on olanzapine, 14% on risperidone, and 7% on quetiapine. Median time until discontinuation of clozapine was 10.5 months, significantly longer than for quetiapine (2.8 months) and olanzapine (2.7 months).[2] Because patients who experience clozapine‐related medical complications are likely to present first to the primary care or general hospital setting, internists must be aware of potential iatrogenic effects, and of their implications for psychiatric and medical care. Using case examples, we will examine both common and serious complications associated with clozapine, and discuss recommendations for management, including indications for clozapine discontinuation.
NEUROLOGICAL
Case Vignette 1
Mr. A is a 29‐year‐old man with asthma and schizophrenia who experienced a generalized tonic‐clonic seizure during treatment at a psychiatric facility. The patient started clozapine therapy 5 weeks prior, with gradual titration to 425 mg daily. Mr. A's previous medication trials included olanzapine and chlorpromazine, which rendered little improvement to his chronic auditory hallucinations. Clozapine was temporarily withheld during further neurologic workup, in which both electroencephalogram (EEG) and brain magnetic resonance imaging were unremarkable. After 60 hours, clozapine titration was reinitiated, and valproic acid was started for mood stabilization and seizure prophylaxis. Mr. A was discharged 6 weeks later on clozapine, 600 mg at bedtime, and extended‐release divalproate, 2500 mg at bedtime. The patient suffered no further seizure activity throughout hospitalization and for at least 1 year postdischarge.
Seizures complicate clozapine use in up to 5% of cases, with a dose‐dependent risk pattern.[16] Seizures are most commonly associated with serum clozapine levels above 500 µg/L, but have also been reported with lower levels of clozapine and its metabolite norclozapine.[17] Though nonspecific EEG changes (ie, focal or generalized spikes, spike‐wave and polyspike discharges) have been associated with clozapine administration, they do not reliably predict seizure tendency.[17] Prophylaxis with antiepileptic drugs (AEDs) is not recommended, though AED treatment may be undertaken for patients who experience a seizure while on clozapine. When seizures occur in the context of elevated serum levels, reducing clozapine to the lowest effective dose is preferred over initiating an AED. Although this reduces the potential for exposure to anticonvulsant‐associated adverse effects, it may also introduce the risk of relapsed psychotic symptoms and therefore requires close monitoring by a psychiatrist. For those who opt to initiate AED therapy, we recommend consideration of each medication's therapeutic and side‐effect profiles based on the patient's medical history and active symptoms. For example, in the case of Mr. A, valproate was used to target concomitant mood symptoms; likewise, patients who experience troublesome weight gain, as well as seizures, may benefit from topiramate. The occurrence of seizures does not preclude continuation of clozapine therapy, in conjunction with an AED[18] and after consideration of potential risks and benefits of use. Clozapine is not contraindicated in patients with well‐controlled epilepsy.[19]
Sedation, the most common neurologic side effect of clozapine, is also dose dependent and often abates during titration.[20] Though clozapine may induce extrapyramidal symptoms, including rigidity, tremor, and dystonia, the risk is considerably lower with clozapine than with other antipsychotics, owing to a lesser affinity for D2 receptors. Associated parkinsonism should prompt consideration of dose reduction, in discussion with a psychiatrist, with concurrent monitoring of serum clozapine levels and close follow‐up for emergence of psychotic symptoms. If dose reduction is ineffective, not indicated, or not preferred by the patient, the addition of an anticholinergic medication may be considered (eg, diphenhydramine 25 to 50 mg, benztropine 1 to 2 mg). Neuroleptic malignant syndrome, although rare, is life‐threatening and warrants immediate discontinuation of clozapine, though successful rechallenge has been reported in case reports.[21]
CARDIAC
Case Vignette 2
Mr. B is a 34‐year‐old man with sinus tachycardia, a benign adrenal tumor, and chronic paranoid schizophrenia that had been poorly responsive to numerous antipsychotic trials. During a psychiatric hospitalization for paranoid delusions with aggressive threats toward family, Mr. B was started on clozapine and titrated to 250 mg daily. On day 16 of clozapine therapy, the patient began to experience cough, and several days later, diffuse rhonchi were noted on examination. Complete blood count revealed a WBC of 20.3 × 10³/µL with 37% eosinophils and an absolute eosinophil count of 7.51 × 10³/µL (increased from 12% and 1.90 × 10³/µL the week before), and an electrocardiogram showed sinus tachycardia with ST‐segment changes. Mr. B was transferred to the general medical hospital for workup of presumed myocarditis.
Approximately one‐quarter of patients who take clozapine experience sinus tachycardia, which may be related to clozapine's anticholinergic effects causing rebound noradrenergic elevations[22]; persistent or problematic tachycardia may be treated with a cardioselective β‐blocker. Clozapine has also been linked to significant increases in systolic and diastolic blood pressure in 4% of patients; the risk of hypertension increases with the duration of clozapine treatment and appears to be independent of the patient's weight.[23] Orthostatic hypotension has been reported in 9% of patients on clozapine therapy, though effects can be mitigated with gradual titration, adequate hydration, compression stockings, and patient education. Sinus tachycardia, hypertension, and orthostatic hypotension are not absolute indications to discontinue clozapine; rather, we advocate treating these side effects while continuing clozapine treatment.[24]
Myocarditis represents the most serious cardiac side effect of clozapine.[25, 26] Although the absolute risk appears to be lower than 0.1%,[24] Kilian et al. calculated a 1000‐ to 2000‐fold increase in relative risk of myocarditis among patients who take clozapine, compared to the general population.[26] Most cases occur within the first month of treatment, with a median time to onset of 15 days. This time course is consistent with an acute immunoglobulin E‐mediated (type I) hypersensitivity reaction, and eosinophilic infiltrates have been found on autopsy, consistent with an acute drug reaction.[20]
Because of this early onset, the physician should maintain a particularly high index of suspicion in the first months of treatment, rigorously questioning patients and families about signs and symptoms of cardiac disease. If patients on clozapine present with flu‐like symptoms, fever, myalgia, dizziness, chest pain, dyspnea, tachycardia, palpitations, or other signs or symptoms of heart failure, evaluation for myocarditis should be undertaken.[25] Several centers have utilized cardiac enzymes (eg, troponin I, troponin T, creatine kinase‐myocardial band) as a routine screen for myocarditis, though this is not universal practice.[24] Both tachycardia and flu‐like symptoms may be associated with clozapine, particularly during the titration period, and these are normally benign symptoms requiring no intervention. If the diagnosis of myocarditis is made, however, clozapine should be stopped immediately. Myocarditis is often considered to be a contraindication to restarting clozapine, though cases have been reported of successful clozapine rechallenge in patients who had previously experienced myocarditis.[21]
Recommendations for clozapine‐associated electrocardiography (ECG) monitoring have not been standardized. Based on common clinical practice and the time course of serious cardiac complications, we recommend baseline ECG prior to the start of clozapine, with follow‐up ECG 2 to 4 weeks after clozapine initiation, and every 6 months thereafter.
GASTROINTESTINAL
Case Vignette 3
Mr. C is a 61‐year‐old man with chronic paranoid schizophrenia and a history of multiple state hospital admissions. He had been maintained on clozapine for 15 years, allowing him to live independently and avoid psychiatric hospitalization. Mr. C was admitted to the general medical hospital with nausea, vomiting, and an inability to tolerate oral intake. He was found to have a high‐grade small‐bowel obstruction, and all oral medications were initially discontinued. After successful management of his acute gastrointestinal presentation and discussion of potential risks and benefits of various treatment options, clozapine was reinitiated along with bulk laxative and stool‐softening agents.
Affecting 14% to 60% of individuals who are prescribed clozapine, constipation represents the most common associated gastrointestinal complaint.[27] For most patients, this condition is uncomfortable but nonlethal, though it has been implicated in several deaths by aspiration pneumonia and small‐bowel perforation.[28, 29] Providers must screen regularly for constipation and treat aggressively with stimulant laxatives and stool softeners,[18] while reviewing medication lists and, when possible, streamlining extraneous anticholinergic contributors. Individuals prescribed clozapine also frequently suffer from gastroesophageal reflux disease (GERD), for which behavioral interventions (eg, smoking cessation or remaining upright for 3 hours after meals) should be considered in addition to pharmacologic treatment with proton pump inhibitors. Clozapine therapy may be continued while constipation and GERD are managed medically.
Potentially fatal gastrointestinal hypomotility and small‐bowel obstruction are rare but well‐described complications that occur in up to 0.3% of patients who take clozapine.[27] This effect appears to be dose dependent, and higher blood levels are associated with greater severity of constipation and risk for serious hypomotility.[27] Clozapine should be withheld during treatment for such serious adverse events as ileus or small‐bowel perforation; however, once these conditions have stabilized, clozapine therapy may be reconsidered based on an analysis of potential benefits and risks. If clozapine is withheld, the internist must monitor for acute worsening of mental status, inattention, and disorientation, as clozapine withdrawal‐related delirium has been reported.[30] Ultimately, aggressive treatment of constipation in conjunction with continued clozapine therapy is the recommended course of action.[28]
Given the increased risk of ileus in the postoperative period, it is particularly important for physicians to inquire about preoperative bowel habits and assess for any existing constipation. Careful monitoring of postoperative bowel motility, along with early and aggressive management of constipation, is recommended. Concurrent administration of other constipating agents (eg, opiates, anticholinergics) should be limited to the lowest effective dose.[27] Although transaminitis, hepatitis, and pancreatitis have all been associated with clozapine in case reports, these are rare,[31] and the approach to management should be considered on a case‐by‐case basis.
HEMATOLOGIC
Case Vignette 4
Ms. D is a 38‐year‐old woman with a schizoaffective disorder who was started on clozapine after 3 other agents had failed to control her psychotic symptoms and alleviate chronic suicidal thoughts. Baseline CBC revealed serum white blood cell count (WBC) of 7800/mm3 and absolute neutrophil count (ANC) of 4700/mm3. In Ms. D's third week of clozapine use, WBC dropped to 4400/mm3 and ANC to 2200/mm3. Repeat lab draw confirmed this, prompting the treatment team to initiate twice‐weekly CBC monitoring. Ms. D's counts continued to fall, and 10 days after the initial drop, WBC was calculated at 1400/mm3 and ANC at 790/mm3. Clozapine was discontinued, and though the patient was asymptomatic, broad‐spectrum antibiotics were initiated. She received daily CBC monitoring until WBC >3000/mm3 and ANC >1500/mm3. An alternate psychotropic medication was initiated several weeks thereafter.
Neutropenia (white blood cell count <3000/mm3) is a common complication that affects approximately 3% of patients who take clozapine.[32] This may be mediated by clozapine's selective impact on the precursors of polymorphonuclear leukocytes, though the mechanism remains unknown.[33] Although neutropenia is not an absolute contraindication for clozapine therapy, guidelines recommend cessation of clozapine when the ANC drops below 1000/mm3.[34] A meta‐analysis of 112 patients who were rechallenged following neutropenia found that 69% tolerated a rechallenge without development of a subsequent dyscrasia.[21]
In the case of chemotherapy‐induced neutropenia, several case reports support the continued use of clozapine during cancer treatment[35]; this requires a written request to the pharmaceutical company that manufactures clozapine and documentation of the expected time course and contribution of chemotherapy to neutropenia.[36] Clozapine's association with neutropenia warrants close monitoring in individuals with human immunodeficiency virus (HIV) and other causes of immune compromise. Reports of clozapine continuation in HIV‐positive individuals underscore the importance of close collaboration between infectious disease and psychiatry, with specific focus on potential interactions between clozapine and antiretroviral agents and close monitoring of viral load and ANC.[37]
The most feared complication of clozapine remains agranulocytosis, defined as ANC<500/mm3,[33] which occurs in up to 1% of monitored patients. In 1975, clozapine was banned worldwide after 8 fatal cases of agranulocytosis were reported in Finland.[38] The drug was reintroduced for treatment‐resistant schizophrenia with strict monitoring parameters, which has sharply reduced the death rate. One study found 12 actual deaths between 1990 and 1994, compared to the 149 predicted deaths without monitoring.[39]
The risk of agranulocytosis appears to be higher in older adults and in patients with a lower baseline WBC count. Although there are reports of delayed agranulocytosis occurring in patients after up to 19 years of treatment,[40] the incidence of leukopenia is greatest in the first year. Given this high‐risk period, mandatory monitoring is as follows: weekly WBC and neutrophil counts for the first 26 weeks, counts every 2 weeks for the second 26 weeks, and counts every 4 weeks thereafter. Of note, many of the later cases of agranulocytosis appear to be related to medication coadministration, particularly with valproic acid, though no definitive link has been established.[40]
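The tiered monitoring schedule above reduces to a simple interval lookup. The sketch below is purely an illustration of the intervals stated in this section (the function name and week-based interface are our own); it is not a clinical decision tool and does not replace registry requirements.

```python
def cbc_monitoring_interval_weeks(weeks_on_clozapine: float) -> int:
    """Interval between mandated CBC draws, per the schedule above:
    weekly for the first 26 weeks, every 2 weeks for the second
    26 weeks, and every 4 weeks thereafter. Illustrative only."""
    if weeks_on_clozapine < 26:
        return 1   # first 26 weeks: weekly counts
    if weeks_on_clozapine < 52:
        return 2   # second 26 weeks: counts every 2 weeks
    return 4       # beyond 1 year: counts every 4 weeks

# A patient roughly 8 months (about 35 weeks) into therapy falls
# in the every-2-weeks tier.
print(cbc_monitoring_interval_weeks(35))  # 2
```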
Treatment of clozapine‐induced agranulocytosis consists of immediate clozapine cessation, and consideration of initiation of prophylactic broad‐spectrum antibiotics and granulocyte colony‐stimulating factor (such as filgrastim) until the granulocyte count normalizes.[41, 42] Although few case reports describe successful clozapine rechallenge in patients with a history of agranulocytosis, the data are sparse, and current practice is to permanently discontinue clozapine if ANC falls below 1000/mm3.[21, 41]
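The count thresholds scattered through this section (neutropenia at WBC <3000/mm3, the guideline cutoff for stopping clozapine at ANC <1000/mm3, agranulocytosis at ANC <500/mm3) can be gathered into one sketch. This is an illustration of the numbers quoted in the text only, with labels of our own choosing, and is not a substitute for current registry algorithms.

```python
def hematologic_action(wbc_per_mm3: float, anc_per_mm3: float) -> str:
    """Map blood counts to the actions described in the text
    (illustrative only; thresholds per this review)."""
    if anc_per_mm3 < 500:
        # Agranulocytosis: stop clozapine; consider broad-spectrum
        # antibiotics and granulocyte colony-stimulating factor
        return "stop clozapine: agranulocytosis"
    if anc_per_mm3 < 1000:
        # Guideline cutoff for cessation of clozapine
        return "stop clozapine: ANC below guideline cutoff"
    if wbc_per_mm3 < 3000:
        # Neutropenia range: intensify CBC monitoring
        return "continue with intensified monitoring"
    return "continue routine monitoring"

# Ms. D's final counts (WBC 1400/mm3, ANC 790/mm3) trigger cessation:
print(hematologic_action(1400, 790))  # stop clozapine: ANC below guideline cutoff
```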
ADDITIONAL COMPLICATIONS (METABOLIC, RENAL, URINARY)
Moderate to marked weight gain occurs in over 50% of patients treated with clozapine, with average gains of nearly 10% of body weight.[43] In a 10‐year follow‐up study of patients treated with clozapine, Henderson et al. reported an average weight gain of 13 kg, with 34% of studied patients developing diabetes mellitus. Metabolic side effects of second‐generation antipsychotics, including clozapine, are a well‐documented and troubling phenomenon.[44] Limited evidence supports use of metformin, alongside behavioral therapy, for concerns related to glucose dysregulation.[45] Some patients have also experienced weight loss with adjunctive topiramate use, particularly if they have also suffered seizures.[46]
Urinary incontinence and nocturnal enuresis are both associated with clozapine, but are likely under‐reported because of patient and provider embarrassment; providers also may not think to ask about these specific symptoms. First‐line treatment for nocturnal enuresis is to limit fluids in the evening. Desmopressin has a controversial role in treating nocturnal enuresis owing to its risk of hyponatremia; appropriate monitoring should be implemented if this agent is used.[18]
Clozapine has been associated with acute interstitial nephritis (AIN), although this is thought to be a relatively rare side effect. Drug‐induced AIN typically appears soon after initiation and presents with the clinical triad of rash, fever, and eosinophilia. Given that weekly CBC is mandatory in the initiation phase, eosinophilia is easily detectable and may serve as a marker for potential AIN.[47]
Sialorrhea, particularly during sleep, is a bothersome condition affecting up to one‐third of patients who take clozapine.[48] Although clozapine is strongly anticholinergic, its agonist activity at the M4 muscarinic receptor and antagonism of the alpha‐2 adrenergic receptor are postulated as the mechanisms underlying hypersalivation. Sialorrhea is frequently seen early in treatment and does not appear to be dose dependent.[48] Excessive salivation is typically managed with behavioral interventions (eg, utilizing towels or other absorbent materials on top of bedding). If hypersalivation occurs during the day, chewing sugar‐free gum may increase the rate of swallowing and make symptoms less bothersome. If this does not provide adequate relief, practitioners may consider use of atropine 1% solution administered directly to the oral cavity.[49]
DRUG‐DRUG INTERACTIONS
For hospitalists, who must frequently alter existing medications or add new ones, awareness of potential drug‐drug interactions is crucial. Clozapine is metabolized by the cytochrome P450 system, with predominant metabolism through the isoenzymes 1A2, 3A4, and 2D6.[50] Common medications that induce clozapine metabolism (thereby decreasing clozapine levels) include phenytoin, phenobarbital, carbamazepine, oxcarbazepine, and corticosteroids. Conversely, stopping these medications after long‐term therapy will raise clozapine levels. Substances that inhibit clozapine metabolism (thereby increasing clozapine levels) include ciprofloxacin, erythromycin, clarithromycin, fluvoxamine, fluoxetine, paroxetine, protease inhibitors, verapamil, and grapefruit juice. We recommend caution when concurrently administering other agents that increase risk for agranulocytosis, including carbamazepine, trimethoprim‐sulfamethoxazole, sulfasalazine, and tricyclic antidepressants.
Cigarette smoking decreases clozapine blood levels by induction of CYP1A2. Patients require a 10% to 30% reduction in clozapine dose during periods of smoking cessation, including when smoking is stopped during inpatient hospitalization.[51] Nicotine replacement therapy does not induce CYP1A2 and therefore does not have a compensatory effect on clozapine levels. On discharge or resumption of smoking, patients may require an increase in their dose of clozapine to maintain adequate antipsychotic effect.
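As a rough arithmetic illustration of the 10% to 30% adjustment described above (the function and its 20% default are our own assumptions for the sketch, not a dosing recommendation; actual retitration must be individualized and psychiatrist-guided):

```python
def adjusted_dose_on_cessation(current_dose_mg: float,
                               percent_reduction: int = 20) -> float:
    """Estimate a reduced clozapine dose when a smoker stops smoking,
    reflecting loss of CYP1A2 induction. The 10%-30% range comes from
    the text; the 20% default is an arbitrary illustrative midpoint."""
    if not 10 <= percent_reduction <= 30:
        raise ValueError("text describes a 10% to 30% reduction")
    return current_dose_mg * (100 - percent_reduction) / 100

# A hypothetical patient on 400 mg/day who quits smoking on admission:
print(adjusted_dose_on_cessation(400))  # 320.0
```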
SUMMARY OF RECOMMENDATIONS
Medical complications are cited as the cause in 20% of clozapine discontinuations; most commonly, these include seizures, severe constipation, somnolence, and neutropenia. Given the high risk of psychiatric morbidity posed by discontinuation, we recommend managing mild to moderate symptoms and side effects while continuing the drug, when possible (Table 1). We encourage hospitalists to confer with the patient's psychiatrist or the inpatient psychiatry consultation service when making changes to clozapine therapy. Specific recommendations are as follows:
- We advocate withholding clozapine administration pending medical optimization for several conditions, including small‐bowel obstruction, neuroleptic malignant syndrome, venous thromboembolism, diabetic ketoacidosis, and hyperosmolar coma.
- Clinical scenarios requiring acute discontinuation of clozapine include agranulocytosis and myocarditis. Successful rechallenge with clozapine has been described after both conditions; at the same time, given the high morbidity and mortality of myocarditis and agranulocytosis, re‐initiation of clozapine requires an extensive risk‐benefit discussion with the patient and family, informed consent, and, in the case of agranulocytosis, approval from the national clozapine registry (Table 2).
- Although adjunctive therapy with filgrastim was initially thought to permit a clozapine rechallenge in patients with a history of agranulocytosis, case reports on this strategy have been equivocal, and further research is necessary to determine the most effective strategy for management.
| System | Clinical Lab/Study | Frequency of Monitoring |
|---|---|---|
| Cardiac | Electrocardiogram | Baseline, 2 to 4 weeks after initiation, every 6 months thereafter |
| Cardiac | Cardiac enzymes (eg, troponin I), echocardiogram | No standard guidelines; obtain when clinically indicated |
| Hematologic | Complete blood count with differential | Baseline, then weekly for 26 weeks, then every 2 weeks for 26 weeks, then every 4 weeks thereafter |
| Metabolic | Body mass index; waist circumference | Baseline, then every 3 to 6 months |
| Metabolic | Fasting glucose | Baseline, then every 6 months |
| Metabolic | Fasting lipid panel | Baseline, then yearly |
| Neurologic | Electroencephalogram | No standard guidelines; obtain when clinically indicated |
| Vital signs | Heart rate, blood pressure, temperature | Baseline and at each follow‐up visit |
| Requires Acute Clozapine Discontinuation* | Clozapine Interruption During Management | Does Not Typically Require Clozapine Discontinuation |
|---|---|---|
| Agranulocytosis (ANC <1000/mm3) | Diabetic complications (eg, ketoacidosis, hyperosmolar coma) | Constipation |
| Cardiomyopathy (severe) | Gastrointestinal obstruction, ileus | Diabetes mellitus |
| Myocarditis | Neuroleptic malignant syndrome | Gastroesophageal reflux |
|  | Venous thromboembolism | Hyperlipidemia |
|  |  | Hypertension |
|  |  | Orthostatic hypotension |
|  |  | Sedation |
|  |  | Seizures |
|  |  | Sialorrhea |
|  |  | Sinus tachycardia |
|  |  | Urinary changes (eg, enuresis, incontinence) |
|  |  | Weight gain |
CONCLUSION
Clozapine has been a very successful treatment for patients with schizophrenia who have failed other antipsychotic therapies. However, fears of potential side effects and frequent monitoring have limited its use and led to unnecessary discontinuation. To mitigate risk for serious complications, we hope to increase hospitalists' awareness of prevention, monitoring, and treatment of side effects, and to promote comfort with circumstances that warrant continuation or discontinuation of clozapine (Table 3). The hospitalist plays a crucial role in managing these complications as well as conveying information and recommendations to primary care providers; as such, familiarity with the medication is essential for proper management of individuals who take clozapine.
| Take‐Home Points |
|---|
| 1. Clozapine is the gold standard for treatment‐resistant schizophrenia; however, its use is limited by side effects, many of which can be successfully treated by internists. |
| 2. There are few indications for discontinuing clozapine (myocarditis, small‐bowel obstruction, agranulocytosis). The psychiatry service should be consulted in the event that clozapine is discontinued. |
| 3. Seizures are not an indication for discontinuing clozapine; instead, we recommend adding an antiepileptic drug. |
| 4. All second‐generation antipsychotics are associated with diabetes mellitus and significant weight gain. Clozapine is more highly associated with metabolic side effects than many other medications in this class. |
| 5. Sedation, sialorrhea, and constipation are common and can be managed pharmacologically and with behavioral interventions. |
Disclosure: Nothing to report.
- , , , . Clozapine versus typical neuroleptic medication for schizophrenia. Cochrane Database Syst Rev. 2009(1):CD000059.
- , , , et al. Effectiveness of clozapine versus olanzapine, quetiapine, and risperidone in patients with chronic schizophrenia who did not respond to prior atypical antipsychotic treatment. Am J Psychiatry. 2006;163(4):600–610.
- , , , et al. Randomized controlled trial of effect of prescription of clozapine versus other second‐generation antipsychotic drugs in resistant schizophrenia. Schizophr Bull. 2006;32(4):715–723.
- , , , et al. Effects of clozapine on positive and negative symptoms in outpatients with schizophrenia. Am J Psychiatry. 1994;151(1):20–26.
- , , , . Clozapine for the treatment‐resistant schizophrenic. A double‐blind comparison with chlorpromazine. Arch Gen Psychiatry. 1988;45(9):789–796.
- , , , , , . Clozapine treatment for suicidality in schizophrenia: International Suicide Prevention Trial (InterSePT). Arch Gen Psychiatry. 2003;60(1):82–91.
- , , , et al. 11‐year follow‐up of mortality in patients with schizophrenia: a population‐based cohort study (FIN11 study). Lancet. 2009;374(9690):620–627.
- , , , , . Clozapine use and relapses of substance use disorder among patients with co‐occurring schizophrenia and substance use disorders. Schizophr Bull. 2006;32(4):637–643.
- , , . Outcomes for schizophrenia patients with clozapine treatment: how good does it get? J Psychopharmacol. 2009;23(8):957–965.
- , , , . Morbidity and mortality in people with serious mental illness. National Association of State Mental Health Program Directors (NASMHPD) Medical Directors Council. Available at: http://www.nasmhpd.org/docs/publications/MDCdocs/Mortality%20and%20Morbidity%20Final%20Report%208.18.08.pdf. Accessed February 3, 2015.
- , . Pharmacological actions of the atypical antipsychotic drug clozapine: a review. Synapse. 1996;24(4):349–394.
- , . Clozapine. A novel antipsychotic agent. N Engl J Med. 1991;324(11):746–754.
- , . Reason for clozapine cessation. Acta Psychiatr Scand. 2012;125(1):39–44.
- , , , . Termination of clozapine treatment due to medical reasons: when is it warranted and how can it be avoided? J Clin Psychiatry. 2013;74(6):603–613.
- , , , , , . Time to discontinuation of antipsychotic drugs in a schizophrenia cohort: influence of current treatment strategies. Ther Adv Psychopharmacol. 2014;4(6):228–239.
- , , . Clozapine‐related seizures. Neurology. 1991;41(3):369–371.
- , , , . Clozapine‐related EEG changes and seizures: dose and plasma‐level relationships. Ther Adv Psychopharmacol. 2011;1(2):47–66.
- . Review and management of clozapine side effects. J Clin Psychiatry. 2000;61(suppl 8):14–17; discussion 18–19.
- , . Epilepsy, psychosis and clozapine. Human Psychopharmacol Clin Exp. 2002;17:115–119.
- , , , , , . Response of patients with treatment‐refractory schizophrenia to clozapine within three serum level ranges. Am J Psychiatry. 1996;153(12):1579–1584.
- , , , , . When can patients with potentially life‐threatening adverse effects be rechallenged with clozapine? A systematic review of the published literature. Schizophr Res. 2012;134(2–3):180–186.
- , . Clinical profile of clozapine: adverse reactions and agranulocytosis. Psychiatr Q. 1992;63(1):51–70.
- , , , , , . Clozapine and hypertension: a chart review of 82 patients. J Clin Psychiatry. 2004;65(5):686–689.
- , , . Adverse cardiac effects associated with clozapine. J Clin Psychopharmacol. 2005;25(1):32–41.
- , , , , . Clozapine induced myocarditis: a rare but fatal complication. Int J Cardiol. 2006;112(2):e5–e6.
- , , , . Myocarditis and cardiomyopathy associated with clozapine. Lancet. 1999;354(9193):1841–1845.
- , , , . Life‐threatening clozapine‐induced gastrointestinal hypomotility: an analysis of 102 cases. J Clin Psychiatry. 2008;69(5):759–768.
- , , , . Fatalities associated with clozapine‐related constipation and bowel obstruction: a literature review and two case reports. Psychosomatics. 2009;50(4):416–419.
- , , . Death from clozapine‐induced constipation: case report and literature review. Psychosomatics. 2002;43(1):71–73.
- , , , , , . Clozapine: a clinical review of adverse effects and management. Ann Clin Psychiatry. 2003;15(1):33–48.
- , , , , . Beyond white blood cell monitoring: screening in the initial phase of clozapine therapy. J Clin Psychiatry. 2012;73(10):1307–1312.
- Clozapine [package insert]. Sellersville, PA: TEVA Pharmaceuticals USA; 2013. Available at: https://www.clozapineregistry.com/insert.pdf.ashx. Accessed October 27, 2014.
- , , , , . Clozapine‐induced agranulocytosis. Incidence and risk factors in the United States. N Engl J Med. 1993;329(3):162–167.
- Clozaril (clozapine) prescribing information. Washington, DC: U.S. Food and Drug Administration; 2013. Available at: http://www.accessdata.fda.gov/drugsatfda_docs/label/2013/019758s069s071lbl.pdf. Accessed February 4, 2015.
- . Clozapine therapy during cancer treatment. Am J Psychiatry. 2004;161(1):175.
- , , , , . Continuation of clozapine during chemotherapy: a case report and review of literature. Psychosomatics. 2014;55(6):673–679.
- , , . Clozapine use in HIV‐infected schizophrenia patients: a case‐based discussion and review. Psychosomatics. 2009;50(6):626–632.
- , , , . Letter: clozapine and agranulocytosis. Lancet. 1975;2(7935):611.
- . Effects of the clozapine national registry system on incidence of deaths related to agranulocytosis. Psychiatr Serv. 1996;47(1):52–56.
- , . White blood cell monitoring during long‐term clozapine treatment. Am J Psychiatry. 2013;170(4):366–369.
- , , . Add‐on filgrastim during clozapine rechallenge in patients with a history of clozapine‐related granulocytopenia/agranulocytosis. Am J Psychiatry. 2009;166(2):236.
- , , . Add‐on filgrastim during clozapine rechallenge unsuccessful in preventing agranulocytosis. Gen Hosp Psychiatry. 2013;35(5):576.e11–12.
- , , , , , . Clozapine‐induced weight gain: prevalence and clinical relevance. Am J Psychiatry. 1992;149(1):68–72.
- , , , et al. Clozapine, diabetes mellitus, hyperlipidemia, and cardiovascular risks and mortality: results of a 10‐year naturalistic study. J Clin Psychiatry. 2005;66(9):1116–1121.
- , , , et al. Effects of adjunctive metformin on metabolic traits in nondiabetic clozapine‐treated patients with schizophrenia and the effect of metformin discontinuation on body weight: a 24‐week, randomized, double‐blind, placebo‐controlled study. J Clin Psychiatry. 2013;74(5):e424–e430.
- , , , . Topiramate for clozapine‐induced seizures. Am J Psychiatry. 2001;158(6):968–969.
- , , , , . Clozapine‐induced acute interstitial nephritis. Lancet. 1999;354(9185):1180–1181.
- , , , , . Update on the clinical efficacy and side effects of clozapine. Schizophr Bull. 1991;17(2):247–261.
- , , . Clozapine‐induced sialorrhea: pathophysiology and management strategies. Psychopharmacology. 2006;185(3):265–273.
- , , . Clozapine drug‐drug interactions: a review of the literature. Hum Psychopharm Clin. 1997;12(1):5–20.
- , , . The effect of smoking and cytochrome P450 CYP1A2 genetic polymorphism on clozapine clearance and dose requirement. Pharmacogenetics. 2003;13(3):169–172.
- , , , , . Clozapine‐induced agranulocytosis. Incidence and risk factors in the United States. N Engl J Med. 1993;329(3):162–167.
- Clozaril (clozapine) prescribing information. Washington, DC: U.S. Food and Drug Administration; 2013. Available at: http://www.accessdata.fda.gov/drugsatfda_docs/label/2013/019758s069s071lbl.pdf. Accessed February 4, 2015.
- . Clozapine therapy during cancer treatment. Am J Psychiatry. 2004;161(1):175.
- , , , , . Continuation of clozapine during chemotherapy: a case report and review of literature. Psychosomatics. 2014;55(6):673–679.
- , , . Clozapine use in HIV‐infected schizophrenia patients: a case‐based discussion and review. Psychosomatics. 2009;50(6):626–632.
- , , , . Letter: clozapine and agranulocytosis. Lancet. 1975;2(7935):611.
- . Effects of the clozapine national registry system on incidence of deaths related to agranulocytosis. Psychiatr Serv. 1996;47(1):52–56.
- , . White blood cell monitoring during long‐term clozapine treatment. Am J Psychiatry. 2013;170(4):366–369.
- , , . Add‐on filgrastim during clozapine rechallenge in patients with a history of clozapine‐related granulocytopenia/agranulocytosis. Am J Psychiatry. 2009;166(2):236.
- , , . Add‐on filgrastim during clozapine rechallenge unsuccessful in preventing agranulocytosis. Gen Hosp Psychiatry. 2013;35(5):576.e11–12.
- , , , , , . Clozapine‐induced weight gain: prevalence and clinical relevance. Am J Psychiatry. 1992;149(1):68–72.
- , , , et al. Clozapine, diabetes mellitus, hyperlipidemia, and cardiovascular risks and mortality: results of a 10‐year naturalistic study. J Clin Psychiatry. 2005;66(9):1116–1121.
- , , , et al. Effects of adjunctive metformin on metabolic traits in nondiabetic clozapine‐treated patients with schizophrenia and the effect of metformin discontinuation on body weight: a 24‐week, randomized, double‐blind, placebo‐controlled study. J Clin Psychiatry. 2013;74(5):e424–e430.
- , , , . Topiramate for clozapine‐induced seizures. Am J Psychiatry. 2001;158(6):968–969.
- , , , , . Clozapine‐induced acute interstitial nephritis. Lancet. 1999;354(9185):1180–1181.
- , , , , . Update on the clinical efficacy and side effects of clozapine. Schizophr Bull. 1991;17(2):247–261.
- , , . Clozapine‐induced sialorrhea: pathophysiology and management strategies. Psychopharmacology. 2006;185(3):265–273.
- , , . Clozapine drug‐drug interactions: a review of the literature. Hum Psychopharm Clin. 1997;12(1):5–20.
- , , . The effect of smoking and cytochrome P450 CYP1A2 genetic polymorphism on clozapine clearance and dose requirement. Pharmacogenetics. 2003;13(3):169–172.
Automated Sepsis Alert Systems
Sepsis is the most expensive condition treated in the hospital, resulting in an aggregate cost of $20.3 billion or 5.2% of total aggregate cost for all hospitalizations in the United States.[1] Rates of sepsis and sepsis‐related mortality are rising in the United States.[2, 3] Timely treatment of sepsis, including adequate fluid resuscitation and appropriate antibiotic administration, decreases morbidity, mortality, and costs.[4, 5, 6] Consequently, the Surviving Sepsis Campaign recommends timely care with the implementation of sepsis bundles and protocols.[4] Though effective, sepsis protocols require dedicated personnel with specialized training, who must be highly vigilant and constantly monitor a patient's condition for the course of an entire hospitalization.[7, 8] As such, delays in administering evidence‐based therapies are common.[8, 9]
Automated electronic sepsis alerts are being developed and implemented to facilitate the delivery of timely sepsis care. Electronic alert systems synthesize electronic health data routinely collected for clinical purposes in real time or near real time to automatically identify sepsis based on prespecified diagnostic criteria, and immediately alert providers that their patient may meet sepsis criteria via electronic notifications (eg, through electronic health record [EHR], e‐mail, or pager alerts).
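To make the trigger logic concrete, the threshold most often used by the reviewed systems (2 or more SIRS criteria, sometimes combined with hypotension) can be sketched as below. This is a simplified illustration for exposition only, not a clinical tool; the data structure and function names are hypothetical, and the thresholds follow the standard SIRS definition rather than any specific study's implementation.

```python
from dataclasses import dataclass

@dataclass
class Vitals:
    temp_c: float      # body temperature, degrees Celsius
    heart_rate: int    # beats per minute
    resp_rate: int     # breaths per minute
    wbc: float         # white blood cell count, thousands per microliter

def sirs_count(v: Vitals) -> int:
    """Count how many of the four standard SIRS criteria are met."""
    criteria = [
        v.temp_c > 38.0 or v.temp_c < 36.0,  # abnormal temperature
        v.heart_rate > 90,                   # tachycardia
        v.resp_rate > 20,                    # tachypnea
        v.wbc > 12.0 or v.wbc < 4.0,         # leukocytosis or leukopenia
    ]
    return sum(criteria)

def sepsis_alert(v: Vitals, sbp: int) -> bool:
    """Trigger when >=2 SIRS criteria are met plus hypotension
    (SBP <90 mm Hg), mirroring the stricter 'SIRS plus shock'
    thresholds described in this review."""
    return sirs_count(v) >= 2 and sbp < 90
```

In a deployed system, such a rule would run continuously against EHR vital-sign and laboratory feeds and page a provider on the first transition to `True`.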
However, few data exist to describe whether automated, electronic systems achieve their intended goal of earlier, more effective sepsis care. To examine this question, we performed a systematic review on automated electronic sepsis alerts to assess their suitability for clinical use. Our 2 objectives were: (1) to describe the diagnostic accuracy of alert systems in identifying sepsis using electronic data available in real time or near real time, and (2) to evaluate the effectiveness of sepsis alert systems on sepsis care process measures and clinical outcomes.
MATERIALS AND METHODS
Data Sources and Search Strategies
We searched PubMed MEDLINE, Embase, The Cochrane Library, and the Cumulative Index to Nursing and Allied Health Literature from database inception through June 27, 2014, for all studies that contained the following 3 concepts: sepsis, electronic systems, and alerts (or identification). All citations were imported into an electronic database (EndNote X5; Thomson‐Reuters Corp., New York, NY) (see Supporting Information, Appendix, in the online version of this article for our complete search strategy).
Study Selection
Two authors (A.N.M. and O.K.N.) reviewed the citation titles, abstracts, and full‐text articles of potentially relevant references identified from the literature search for eligibility. References of selected articles were hand searched to identify additional eligible studies. Inclusion criteria for eligible studies were: (1) adult patients (aged ≥18 years) receiving care either in the emergency department or hospital, (2) outcomes of interest including diagnostic accuracy in identification of sepsis, and/or effectiveness of sepsis alerts on process measures and clinical outcomes evaluated using empiric data, and (3) sepsis alert systems that used real‐time or near real‐time electronically available data to enable proactive, timely management. We excluded studies that: (1) tested the effect of electronic interventions other than sepsis alerts (eg, computerized order sets) for sepsis management; (2) focused solely on detecting and treating central line‐associated bloodstream infections, shock (not otherwise specified), bacteremia, or other device‐related infections; or (3) evaluated the effectiveness of sepsis alerts without a control group.
Data Extraction and Quality Assessment
Two reviewers (A.N.M. and O.K.N.) extracted data on the clinical setting, study design, dates of enrollment, definition of sepsis, details of the identification and alert systems, diagnostic accuracy of the alert system, and the incidence of process measures and clinical outcomes using a standardized form. Discrepancies between reviewers were resolved by discussion and consensus. Data discrepancies identified in 1 study were resolved by contacting the corresponding author.[10]
For studies assessing the diagnostic accuracy of sepsis identification, study quality was assessed using the Quality Assessment of Diagnostic Accuracy Studies revised tool.[11] For studies evaluating the effectiveness of sepsis alert systems, studies were considered high quality if a contemporaneous control group was present to account for temporal trends (eg, randomized controlled trial or observational analysis with a concurrent control). Fair‐quality studies were before‐and‐after studies that adjusted for potential confounders between time periods. Low‐quality studies included those that did not account for temporal trends, such as before‐and‐after studies using only historical controls without adjustment. Studies that did not use an intention‐to‐treat analysis were also considered low quality. The strength of the overall body of evidence, including risk of bias, was guided by the Grading of Recommendations Assessment, Development, and Evaluation Working Group Criteria adapted by the Agency of Healthcare Research and Quality.[12]
Data Synthesis
To analyze the diagnostic accuracy of automated sepsis alert systems to identify sepsis and to evaluate the effect on outcomes, we performed a qualitative assessment of all studies. We were unable to perform a meta‐analysis due to significant heterogeneity in study quality, clinical setting, and definition of the sepsis alert. Diagnostic accuracy of sepsis identification was measured by sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV), and likelihood ratio (LR). Effectiveness was assessed by changes in sepsis care process measures (ie, time to antibiotics) and outcomes (length of stay, mortality).
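These measures all derive from the 2×2 table of alert status versus gold‐standard sepsis status. A minimal sketch of the definitions, using illustrative counts rather than data from any included study:

```python
def diagnostic_accuracy(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Standard diagnostic accuracy measures from a 2x2 confusion matrix:
    tp/fp = alerts with/without true sepsis; fn/tn = non-alerts with/without."""
    sens = tp / (tp + fn)        # sensitivity: fraction of sepsis cases alerted
    spec = tn / (tn + fp)        # specificity: fraction of non-sepsis not alerted
    ppv = tp / (tp + fp)         # positive predictive value
    npv = tn / (tn + fn)         # negative predictive value
    lr_pos = sens / (1 - spec)   # positive likelihood ratio
    lr_neg = (1 - sens) / spec   # negative likelihood ratio
    return {"sens": sens, "spec": spec, "ppv": ppv, "npv": npv,
            "lr+": lr_pos, "lr-": lr_neg}

# Illustrative example: 90 true alerts, 60 false alerts,
# 10 missed cases, 140 correct non-alerts.
metrics = diagnostic_accuracy(tp=90, fp=60, fn=10, tn=140)
```

Sensitivity and specificity characterize the alert rule itself, while PPV and NPV additionally depend on sepsis prevalence in the screened population, which is why they are reported per clinical setting below.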
RESULTS
Description of Studies
Of 1293 titles, 183 qualified for abstract review, 84 for full‐text review, and 8 articles met our inclusion criteria (see Supporting Figure in the online version of this article). Five articles evaluated the diagnostic accuracy of sepsis identification,[10, 13, 14, 15, 16] and 5 articles[10, 14, 17, 18, 19] evaluated the effectiveness of automated electronic sepsis alerts on sepsis process measures and patient outcomes. All articles were published between 2009 and 2014 and were single‐site studies conducted at academic medical centers (Tables 1 and 2). The clinical settings in the included studies varied and included the emergency department (ED), hospital wards, and the intensive care unit (ICU).
| Source | Site No./Type | Setting | Alert Threshold | Gold Standard Definition | Gold Standard Measurement | No. | Study Qualitya |
|---|---|---|---|---|---|---|---|
| |||||||
| Hooper et al., 2012[10] | 1/academic | MICU | ≥2 SIRS criteriab | Reviewer judgment, not otherwise specified | Chart review | 560 | High |
| Meurer et al., 2009[13] | 1/academic | ED | ≥2 SIRS criteria | Reviewer judgment whether diagnosis of infection present in ED plus SIRS criteria | Chart review | 248 | Low |
| Nelson et al., 2011[14] | 1/academic | ED | ≥2 SIRS criteria and ≥2 SBP measurements <90 mm Hg | Reviewer judgment whether infection present, requiring hospitalization with at least 1 organ system involved | Chart review | 1,386 | High |
| Nguyen et al., 2014[15] | 1/academic | ED | ≥2 SIRS criteria and ≥1 sign of shock (SBP ≤90 mm Hg or lactic acid ≥2.0 mmol/L) | Reviewer judgment to confirm SIRS, shock, and presence of a serious infection | Chart review | 1,095 | Low |
| Thiel et al., 2010[16] | 1/academic | Wards | Recursive partitioning tree analysis including vitals and laboratory resultsc | Admitted to the hospital wards and subsequently transferred to the ICU for septic shock and treated with vasopressor therapy | ICD‐9 discharge codes for acute infection, acute organ dysfunction, and need for vasopressors within 24 hours of ICU transfer | 27,674 | Low |
| Source | Design | Site No./ Type | Setting | No. | Alert System Type | Alert Threshold | Alert Notificationa | Treatment Recommendation | Study Qualityb |
|---|---|---|---|---|---|---|---|---|---|
| |||||||||
| Berger et al., 2010[17] | Before‐after (6 months pre and 6 months post) | 1/academic | ED | 5796c | CPOE system | ≥2 SIRS criteria | CPOE passive alert | Yes: lactate collection | Low |
| Hooper et al., 2012[10] | RCT | 1/academic | MICU | 443 | EHR | ≥2 SIRS criteriad | Text page and EHR passive alert | No | High |
| McRee et al., 2014[18] | Before‐after (6 months pre and 6 months post) | 1/academic | Wards | 171e | EHR | ≥2 SIRS criteria | Notified nurse, specifics unclear | No, but the nurse completed a sepsis risk evaluation flow sheet | Low |
| Nelson et al., 2011[14] | Before‐after (3 months pre and 3 months post) | 1/academic | ED | 184f | EHR | ≥2 SIRS criteria and 2 or more SBP readings <90 mm Hg | Text page and EHR passive alert | Yes: fluid resuscitation, blood culture collection, antibiotic administration, among others | Low |
| Sawyer et al., 2011[19] | Prospective, nonrandomized (2 intervention and 4 control wards) | 1/academic | Wards | 300 | EHR | Recursive partitioning regression tree algorithm including vitals and lab valuesg | Text page to charge nurse who then assessed patient and informed treating physicianh | No | High |
Among the 8 included studies, there was significant heterogeneity in threshold criteria for sepsis identification and subsequent alert activation. The most commonly defined threshold was the presence of 2 or more systemic inflammatory response syndrome (SIRS) criteria.[10, 13, 17, 18]
Diagnostic Accuracy of Automated Electronic Sepsis Alert Systems
The prevalence of sepsis varied substantially between the studies depending on the gold standard definition of sepsis used and the clinical setting (ED, wards, or ICU) of the study (Table 3). The 2 studies[14, 16] that defined sepsis as requiring evidence of shock had a substantially lower prevalence (0.8%–4.7%) compared to the 2 studies[10, 13] that defined sepsis as having only 2 or more SIRS criteria with a presumed diagnosis of an infection (27.8%–32.5%).
| Source | Setting | Alert Threshold | Prevalence, % | Sensitivity, % (95% CI) | Specificity, % (95% CI) | PPV, % (95% CI) | NPV, % (95% CI) | LR+ (95% CI) | LR− (95% CI) |
|---|---|---|---|---|---|---|---|---|---|
| |||||||||
| Hooper et al., 2012[10] | MICU | ≥2 SIRS criteriaa | 36.3 | 98.9 (95.7–99.8) | 18.1 (14.2–22.9) | 40.7 (36.1–45.5) | 96.7 (87.5–99.4) | 1.21 (1.14–1.27) | 0.06 (0.01–0.25) |
| Meurer et al., 2009[13] | ED | ≥2 SIRS criteria | 27.8 | 36.2 (25.3–48.8) | 79.9 (73.1–85.3) | 41.0 (28.8–54.3) | 76.5 (69.6–82.2) | 1.80 (1.17–2.76) | 0.80 (0.67–0.96) |
| Nelson et al., 2011[14] | ED | ≥2 SIRS criteria and ≥2 SBP measurements <90 mm Hg | 0.8 | 63.6 (31.6–87.8) | 99.6 (99.0–99.8) | 53.8 (26.1–79.6) | 99.7 (99.2–99.9) | 145.8 (58.4–364.1) | 0.37 (0.17–0.80) |
| Nguyen et al., 2014[15] | ED | ≥2 SIRS criteria and ≥1 sign of shock (SBP ≤90 mm Hg or lactic acid ≥2.0 mmol/L) | Unable to estimateb | Unable to estimateb | Unable to estimateb | 44.7 (41.2–48.2) | 100.0c (98.8–100.0) | Unable to estimateb | Unable to estimateb |
| Thiel et al., 2010[16] | Wards | Recursive partitioning tree analysis including vitals and laboratory resultsd | 4.7 | 17.1 (15.1–19.3) | 96.7 (96.5–96.9) | 20.5 (18.2–23.0) | 95.9 (95.7–96.2) | 5.22 (4.56–5.98) | 0.86 (0.84–0.88) |
All alert systems had suboptimal PPV (20.5%‐53.8%). The 2 studies that designed the sepsis alert to activate by SIRS criteria alone[10, 13] had a positive predictive value of 41% and a positive LR of 1.21 to 1.80. The ability to exclude the presence of sepsis varied considerably depending on the clinical setting. The study by Hooper et al.[10] that examined the alert among patients in the medical ICU appeared more effective at ruling out sepsis (NPV=96.7%; negative LR=0.06) compared to a similar alert system used by Meurer et al.[13] that studied patients in the ED (NPV=76.5%, negative LR=0.80).
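This setting dependence is expected from Bayes' theorem: with sensitivity and specificity held fixed, PPV and NPV are determined by the prevalence of sepsis in the screened population. A brief sketch (the sensitivity, specificity, and prevalence values here are illustrative, not taken from the included studies):

```python
def ppv_npv(sens: float, spec: float, prev: float) -> tuple:
    """Post-test probabilities from sensitivity, specificity, and
    prevalence via Bayes' theorem."""
    ppv = sens * prev / (sens * prev + (1 - spec) * (1 - prev))
    npv = spec * (1 - prev) / (spec * (1 - prev) + (1 - sens) * prev)
    return ppv, npv

# The same hypothetical alert (sens=0.90, spec=0.80) yields very different
# predictive values in a low-prevalence ED population than in a
# high-prevalence ICU population.
for prev in (0.01, 0.05, 0.30):
    ppv, npv = ppv_npv(0.90, 0.80, prev)
    print(f"prevalence {prev:.0%}: PPV {ppv:.1%}, NPV {npv:.1%}")
```

At low prevalence even a reasonably specific alert generates mostly false positives (low PPV) while ruling out disease well (high NPV), which is consistent with the pattern across the ED and ICU studies above.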
There were also differences in the diagnostic accuracy of the sepsis alert systems depending on how the threshold for activating the sepsis alert was defined and applied in the study. Two studies evaluated a sepsis alert system among patients presenting to the ED at the same academic medical center.[13, 14] The alert system (Nelson et al.) that was triggered by a combination of SIRS criteria and hypotension (PPV=53.8%, LR+=145.8; NPV=99.7%, LR−=0.37) outperformed the alert system (Meurer et al.) that was triggered by SIRS criteria alone (PPV=41.0%, LR+=1.80; NPV=76.5%, LR−=0.80). Furthermore, the study by Meurer and colleagues evaluated the accuracy of the alert system only among patients who were hospitalized after presenting to the ED, rather than all consecutive patients presenting to the ED. This selection bias likely falsely inflated the diagnostic accuracy of the alert system used by Meurer et al., suggesting the alert system that was triggered by a combination of SIRS criteria and hypotension was comparatively even more accurate.
Two studies evaluating the diagnostic accuracy of the alert system were deemed to be high quality (Table 4). Three studies were considered low quality: 1 study did not include all patients in their assessment of diagnostic accuracy[13]; 1 study consecutively selected alert cases but randomly selected nonalert cases, greatly limiting the assessment of diagnostic accuracy[15]; and the other study applied a gold standard that was unlikely to correctly classify sepsis (septic shock requiring ICU transfer with vasopressor support in the first 24 hours was defined by discharge International Classification of Diseases, Ninth Revision diagnoses without chart review), with a considerable delay from the alert system trigger (alert identification was compared to the discharge diagnosis rather than physician review of real‐time data).[16]
| Study | Patient Selection | Index Test | Reference Standard | Flow and Timing |
|---|---|---|---|---|
| ||||
| Hooper et al., 2012[10] | +++ | +++ | ++b | +++ |
| Meurer et al., 2009[13] | +++ | +++ | ++b | +c |
| Nelson et al., 2011[14] | +++ | +++ | ++b | +++ |
| Nguyen et al., 2014[15] | +d | +++ | +e | +++ |
| Thiel et al., 2010[16] | +++ | +++ | +f | +g |
Effectiveness of Automated Electronic Sepsis Alert Systems
Characteristics of the studies evaluating the effectiveness of automated electronic sepsis alert systems are summarized in Table 2. Regarding activation of the sepsis alert, 2 studies notified the provider directly by an automated text page and a passive EHR alert (not requiring the provider to acknowledge the alert or take action),[10, 14] 1 study notified the provider by a passive electronic alert alone,[17] and 1 study only employed an automated text page.[19] Furthermore, if the sepsis alert was activated, 2 studies suggested specific clinical management decisions,[14, 17] 2 studies left clinical management decisions solely to the discretion of the treating provider,[10, 19] and 1 study assisted the diagnosis of sepsis by prompting nurses to complete a second manual sepsis risk evaluation.[18]
Table 5 summarizes the effectiveness of automated electronic sepsis alert systems. Two studies evaluating the effectiveness of the sepsis alert system were considered to be high‐quality studies based on the use of a contemporaneous control group to account for temporal trends and an intention‐to‐treat analysis.[10, 19] The 2 studies evaluating the effectiveness of a sepsis alert system in the ED were considered low quality due to before‐and‐after designs without an intention‐to‐treat analysis.[14, 17]
| Source | Outcomes Evaluated | Key Findings | Quality |
|---|---|---|---|
| |||
| Hooper et al., 2012[10] | Primary: time to receipt of antibiotic (new or changed) | No difference (6.1 hours for control vs 6.0 hours for intervention, P=0.95) | High |
| Secondary: sepsis‐related process measures and outcomes | No difference in amount of 6‐hour IV fluid administration (964 mL vs 1,019 mL, P=0.6), collection of blood cultures (adjusted HR 1.01; 95% CI, 0.76 to 1.35), collection of lactate (adjusted HR 0.84; 95% CI, 0.54 to 1.30), ICU length of stay (3.0 vs 3.0 days, P=0.2), hospital length of stay (4.7 vs 5.7 days, P=0.08), and hospital mortality (10% for control vs 14% for intervention, P=0.3) | ||
| Sawyer et al., 2011[19] | Primary: sepsis‐related process measures (antibiotic escalation, IV fluids, oxygen therapy, vasopressor initiation, diagnostic testing [blood culture, CXR]) within 12 hours of alert | Increases in receiving ≥1 measure (56% for control vs 71% for intervention, P=0.02), antibiotic escalation (24% vs 36%, P=0.04), IV fluid administration (24% vs 38%, P=0.01), and oxygen therapy (8% vs 20%, P=0.005). There was a nonsignificant increase in obtaining diagnostic tests (40% vs 52%, P=0.06) and vasopressor initiation (3% vs 6%, P=0.4) | High |
| Secondary: ICU transfer, hospital length of stay, hospital length of stay after alert, in‐hospital mortality | Similar rate of ICU transfer (23% for control vs 26% for intervention, P=0.6), hospital length of stay (7 vs 9 days, median, P=0.8), hospital length of stay after alert (5 vs 6 days, median, P=0.7), and in‐hospital mortality (12% vs 10%, P=0.7) | ||
| Berger et al., 2010[17] | Primary: lactate collection in ED | Increase in lactate collection in the ED (5.2% before vs 12.7% after alert implemented, absolute increase of 7.5%, 95% CI, 6.0% to 9.0%) | Low |
| Secondary: lactate collection among hospitalized patients, proportion of patients with abnormal lactate (≥4 mmol/L), and in‐hospital mortality among hospitalized patients | Increase in lactate collection among hospitalized patients (15.3% vs 34.2%, absolute increase of 18.9%, 95% CI, 15.0% to 22.8%); decrease in the proportion of abnormal lactate values (21.9% vs 14.8%, absolute decrease of 7.6%, 95% CI, −15.8% to −0.6%), and no significant difference in mortality (5.7% vs 5.2%, absolute decrease of 0.5%, 95% CI, −1.6% to 2.6%, P=0.6) | ||
| McRee et al., 2014[18] | Stage of sepsis, length of stay, mortality, discharge location | Nonsignificant decrease in stage of sepsis (34.7% with septic shock before vs 21.9% after, P>0.05); no difference in length of stay (8.5 days before vs 8.7 days after, P>0.05). Decrease in mortality (9.3% before vs 1.0% after, P<0.05) and increase in proportion of patients discharged home (25.3% before vs 49.0% after, P<0.05) | Low |
| Nelson et al., 2011[14] | Frequency and time to completion of process measures: lactate, blood culture, CXR, and antibiotic initiation | Increases in blood culture collection (OR 2.9; 95% CI, 1.1 to 7.7) and CXR (OR 3.2; 95% CI, 1.1 to 9.5); nonsignificant increases in lactate collection (OR 1.7; 95% CI, 0.9 to 3.2) and antibiotic administration (OR 2.8; 95% CI, 0.9 to 8.3). Only blood cultures were collected in a more timely manner (median of 86 minutes before vs 81 minutes after alert implementation, P=0.03). | Low |
Neither of the 2 high‐quality studies that included a contemporaneous control found evidence for improving inpatient mortality or hospital and ICU length of stay.[10, 19] The impact of sepsis alert systems on improving process measures for sepsis management depended on the clinical setting. In a randomized controlled trial of patients admitted to a medical ICU, Hooper et al. did not find any benefit of implementing a sepsis alert system on improving intermediate outcome measures such as antibiotic escalation, fluid resuscitation, and collection of blood cultures and lactate.[10] However, in a well‐designed observational study, Sawyer et al. found significant increases in antibiotic escalation, fluid resuscitation, and diagnostic testing in patients admitted to the medical wards.[19] Both studies that evaluated the effectiveness of sepsis alert systems in the ED showed improvements in various process measures,[14, 17] but without improvement in mortality.[17] The single study that showed improvement in clinical outcomes (in‐hospital mortality and disposition location) was of low quality due to its before‐and‐after design without adjustment for potential confounders and lack of an intention‐to‐treat analysis (only individuals with a discharge diagnosis of sepsis were included, rather than all individuals who triggered the alert).[18] Additionally, the preintervention group had a higher proportion of individuals with septic shock compared to the postintervention group, raising the possibility that the observed improvement was due to a difference in severity of illness between the 2 groups rather than due to the intervention.
None of the studies included in this review explicitly reported on the potential harms (eg, excess antimicrobial use or alert fatigue) after implementation of sepsis alerts, but Hooper et al. found a nonsignificant increase in mortality, and Sawyer et al. showed a nonsignificant increase in length of stay in the intervention group compared to the control group.[10, 19] Berger et al. showed an overall increase in the number of lactate tests performed, but with a decrease in the proportion of abnormal lactate values (21.9% vs 14.8%, absolute decrease of 7.6%; 95% confidence interval, −15.8% to −0.6%), suggesting potential overtesting in patients at low risk for septic shock. In the study by Hooper et al., 88% (442/502) of the patients in the medical intensive care unit triggered an alert, raising the concern for alert fatigue.[10] Furthermore, 3 studies did not perform intention‐to‐treat analyses; rather, they included only patients who triggered the alert and also had provider‐suspected or confirmed sepsis,[14, 17] or had a discharge diagnosis of sepsis.[18]
DISCUSSION
The use of sepsis alert systems derived from electronic health data and targeting hospitalized patients improves a subset of sepsis process‐of‐care measures, but at the cost of poor positive predictive value and with no clear improvement in mortality or length of stay. There is insufficient evidence for the effectiveness of automated electronic sepsis alert systems in the emergency department.
We found considerable variability in the diagnostic accuracy of automated electronic sepsis alert systems. There was moderate evidence that alert systems designed to identify severe sepsis (eg, SIRS criteria plus measures of shock) had greater diagnostic accuracy than alert systems that detected sepsis based on SIRS criteria alone. Given that SIRS criteria are highly prevalent among hospitalized patients with noninfectious diseases,[20] sepsis alert systems triggered by standard SIRS criteria may have poorer predictive value, with an increased risk of alert fatigue: excessive electronic warnings that lead physicians to disregard clinically useful alerts.[21] The potential for alert fatigue is even greater in critical care settings. A retrospective analysis of physiological alarms in the ICU estimated an average of 6 alarms per hour, with only 15% of alarms considered to be clinically relevant.[22]
The fact that sepsis alert systems improve intermediate process measures among ward and ED patients but not ICU patients likely reflects differences in both the patients and the clinical settings.[23] First, patients in the ICU may already be prescribed broad spectrum antibiotics, aggressively fluid resuscitated, and have other diagnostic testing performed before the activation of a sepsis alert, so it would be less likely to see an improvement in the rates of process measures assessing initiation or escalation of therapy compared to patients treated on the wards or in the ED. The apparent lack of benefit of these systems in the ICU may merely represent a ceiling effect. Second, nurses and physicians are already vigilantly monitoring patients in the ICU for signs of clinical deterioration, so additional alert systems may be redundant. Third, patients in the ICU are connected to standard bedside monitors that continuously monitor for the presence of abnormal vital signs. An additional sepsis alert system triggered by SIRS criteria alone may be superfluous to the existing infrastructure. Fourth, the majority of patients in the ICU will trigger the sepsis alert system,[10] so there likely is a high noise‐to‐signal ratio with resultant alert fatigue.[21]
In addition to the need for alert systems of greater diagnostic accuracy and effectiveness, our review notes several important gaps that limit the evidence supporting the usefulness of automated sepsis alert systems. First, there are few data describing the optimal design of sepsis alerts[24, 25] or the frequency with which they are appropriately acted upon or dismissed. In addition, we found few data on whether the effectiveness of alert systems differed based on whether clinical decision support was included with the alert itself (eg, direct prompting with specific clinical management recommendations) or on the configuration of the alert (eg, interruptive or informational).[24, 25] Most of the studies we reviewed employed alerts primarily targeting physicians; we found little evidence for systems that also alerted other providers (eg, nurses or rapid response teams). Few studies provided data on harms of these systems (eg, excess antimicrobial use, fluid overload due to aggressive fluid resuscitation) or on how often these treatments were administered to patients who did not ultimately have sepsis. Few studies employed study designs that limited biases (eg, randomized or quasiexperimental designs) or used an intention‐to‐treat approach. Studies that exclude false positive alerts from their analyses could bias estimates toward making sepsis alert systems appear more effective than they actually are. Finally, although deploying automated sepsis alerts in the ED would presumably facilitate more timely recognition and treatment, more rigorously conducted studies are needed to determine whether these alerts are of greater value in the ED than on the wards or in the ICU. Given the limited number of studies included in this review, we were unable to make strong conclusions regarding the clinical benefits and cost‐effectiveness of implementing automated sepsis alerts.
Our review has certain limitations. First, despite our extensive literature search strategy, we may have missed studies published in the grey literature or in non‐English languages. Second, there is potential publication bias given the number of abstracts that we identified addressing 1 of our prespecified research questions compared to the number of peer‐reviewed publications identified by our search strategy.
CONCLUSION
Automated electronic sepsis alert systems have promise in delivering early goal‐directed therapies to patients. However, at present, automated sepsis alerts derived from electronic health data may improve care processes but tend to have poor PPV and have not been shown to improve mortality or length of stay. Future efforts should develop and study methods for sepsis alert systems that avoid the potential for alert fatigue while improving outcomes.
Acknowledgements
The authors thank Gloria Won, MLIS, for her assistance with developing and performing the literature search strategy and wish her a long and joyous retirement.
Disclosures: Part of Dr. Makam's work on this project was completed while he was a primary care research fellow at the University of California, San Francisco, funded by a National Research Service Award (training grant T32HP19025‐07‐00). Dr. Makam is currently supported by the National Center for Advancing Translational Sciences of the National Institutes of Health (KL2TR001103). Dr. Nguyen was supported by the Agency for Healthcare Research and Quality (R24HS022428‐01). Dr. Auerbach was supported by an NHLBI K24 grant (K24HL098372). Dr. Makam had full access to the data in the study and takes responsibility for the integrity of the date and accuracy of the data analysis. Study concept and design: all authors. Acquisition of data: Makam and Nguyen. Analysis and interpretation of data: all authors. Drafting of the manuscript: Makam. Critical revision of the manuscript: all authors. Statistical analysis: Makam and Nguyen. The authors have no conflicts of interest to disclose.
1. National inpatient hospital costs: the most expensive conditions by payer, 2011: statistical brief #160. Healthcare Cost and Utilization Project (HCUP) Statistical Briefs. Rockville, MD: Agency for Healthcare Research and Quality; 2013.
2. Inpatient care for septicemia or sepsis: a challenge for patients and hospitals. NCHS Data Brief. 2011;(62):1–8.
3. The epidemiology of sepsis in the United States from 1979 through 2000. N Engl J Med. 2003;348(16):1546–1554.
4. Surviving sepsis campaign: international guidelines for management of severe sepsis and septic shock: 2012. Crit Care Med. 2013;41(2):580–637.
5. Early goal‐directed therapy in the treatment of severe sepsis and septic shock. N Engl J Med. 2001;345(19):1368–1377.
6. A randomized trial of protocol‐based care for early septic shock. N Engl J Med. 2014;370(18):1683–1693.
7. Implementation of early goal‐directed therapy for septic patients in the emergency department: a review of the literature. J Emerg Nurs. 2013;39(1):13–19.
8. Factors influencing variability in compliance rates and clinical outcomes among three different severe sepsis bundles. Ann Pharmacother. 2007;41(6):929–936.
9. Improvement in process of care and outcome after a multicenter severe sepsis educational program in Spain. JAMA. 2008;299(19):2294–2303.
10. Randomized trial of automated, electronic monitoring to facilitate early detection of sepsis in the intensive care unit. Crit Care Med. 2012;40(7):2096–2101.
11. QUADAS‐2: a revised tool for the quality assessment of diagnostic accuracy studies. Ann Intern Med. 2011;155(8):529–536.
12. AHRQ series paper 5: grading the strength of a body of evidence when comparing medical interventions: Agency for Healthcare Research and Quality and the Effective Health‐Care Program. J Clin Epidemiol. 2010;63(5):513–523.
13. Real‐time identification of serious infection in geriatric patients using clinical information system surveillance. J Am Geriatr Soc. 2009;57(1):40–45.
14. Prospective trial of real‐time electronic surveillance to expedite early care of severe sepsis. Ann Emerg Med. 2011;57(5):500–504.
15. Automated electronic medical record sepsis detection in the emergency department. PeerJ. 2014;2:e343.
16. Early prediction of septic shock in hospitalized patients. J Hosp Med. 2010;5(1):19–25.
17. A computerized alert screening for severe sepsis in emergency department patients increases lactate testing but does not improve inpatient mortality. Appl Clin Inform. 2010;1(4):394–407.
18. The impact of an electronic medical record surveillance program on outcomes for patients with sepsis. Heart Lung. 2014;43(6):546–549.
19. Implementation of a real‐time computerized sepsis alert in nonintensive care unit patients. Crit Care Med. 2011;39(3):469–473.
20. The epidemiology of the systemic inflammatory response. Intensive Care Med. 2000;26(suppl 1):S64–S74.
21. Overrides of medication‐related clinical decision support alerts in outpatients. J Am Med Inform Assoc. 2014;21(3):487–491.
22. Intensive care unit alarms: how many do we need? Crit Care Med. 2010;38(2):451–456.
23. How can we best use electronic data to find and treat the critically ill? Crit Care Med. 2012;40(7):2242–2243.
24. Identifying best practices for clinical decision support and knowledge management in the field. Stud Health Technol Inform. 2010;160(pt 2):806–810.
25. Best practices in clinical decision support: the case of preventive care reminders. Appl Clin Inform. 2010;1(3):331–345.
Sepsis is the most expensive condition treated in the hospital, resulting in an aggregate cost of $20.3 billion or 5.2% of total aggregate cost for all hospitalizations in the United States.[1] Rates of sepsis and sepsis‐related mortality are rising in the United States.[2, 3] Timely treatment of sepsis, including adequate fluid resuscitation and appropriate antibiotic administration, decreases morbidity, mortality, and costs.[4, 5, 6] Consequently, the Surviving Sepsis Campaign recommends timely care with the implementation of sepsis bundles and protocols.[4] Though effective, sepsis protocols require dedicated personnel with specialized training, who must be highly vigilant and constantly monitor a patient's condition for the course of an entire hospitalization.[7, 8] As such, delays in administering evidence‐based therapies are common.[8, 9]
Automated electronic sepsis alerts are being developed and implemented to facilitate the delivery of timely sepsis care. Electronic alert systems synthesize electronic health data routinely collected for clinical purposes in real time or near real time to automatically identify sepsis based on prespecified diagnostic criteria, and immediately alert providers that their patient may meet sepsis criteria via electronic notifications (eg, through electronic health record [EHR], e‐mail, or pager alerts).
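As an illustration of how such a prespecified trigger can be encoded, the sketch below checks the standard SIRS criteria against a single set of observations and fires at the 2-criteria threshold most commonly used by the systems in this review. It is a minimal, hypothetical example: the field names and the data layout are illustrative, not taken from any specific alert system.

```python
# Hypothetical sketch of a rule-based sepsis alert trigger using >=2 SIRS
# criteria. Field names and the dictionary layout are illustrative only.

def count_sirs_criteria(obs):
    """Count how many SIRS criteria a set of observations meets.

    Standard SIRS cutoffs: temperature >38.0 or <36.0 C, heart rate >90/min,
    respiratory rate >20/min, WBC count >12,000 or <4,000 per uL.
    """
    n = 0
    t = obs.get("temp_c")
    if t is not None and (t > 38.0 or t < 36.0):
        n += 1
    hr = obs.get("heart_rate")
    if hr is not None and hr > 90:
        n += 1
    rr = obs.get("resp_rate")
    if rr is not None and rr > 20:
        n += 1
    wbc = obs.get("wbc_k")  # in thousands per uL
    if wbc is not None and (wbc > 12.0 or wbc < 4.0):
        n += 1
    return n

def should_alert(obs, threshold=2):
    """Fire the alert when the SIRS count reaches the prespecified threshold."""
    return count_sirs_criteria(obs) >= threshold

# A febrile, tachycardic patient meets 2 criteria and triggers the alert:
patient = {"temp_c": 38.6, "heart_rate": 104, "resp_rate": 18, "wbc_k": 9.5}
print(should_alert(patient))  # -> True
```

A production system would evaluate such a rule continuously against streaming EHR data and route the notification (page, e-mail, or in-record alert) to the treating team; the studies reviewed here differ mainly in the threshold rule and the notification route.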
However, few data describe whether automated electronic systems achieve their intended goal of earlier, more effective sepsis care. To examine this question, we performed a systematic review of automated electronic sepsis alerts to assess their suitability for clinical use. Our 2 objectives were: (1) to describe the diagnostic accuracy of alert systems in identifying sepsis using electronic data available in real time or near real time, and (2) to evaluate the effectiveness of sepsis alert systems on sepsis care process measures and clinical outcomes.
MATERIALS AND METHODS
Data Sources and Search Strategies
We searched PubMed MEDLINE, Embase, The Cochrane Library, and the Cumulative Index to Nursing and Allied Health Literature from database inception through June 27, 2014, for all studies that contained the following 3 concepts: sepsis, electronic systems, and alerts (or identification). All citations were imported into an electronic database (EndNote X5; Thomson‐Reuters Corp., New York, NY) (see Supporting Information, Appendix, in the online version of this article for our complete search strategy).
Study Selection
Two authors (A.N.M. and O.K.N.) reviewed the citation titles, abstracts, and full‐text articles of potentially relevant references identified from the literature search for eligibility. References of selected articles were hand searched to identify additional eligible studies. Inclusion criteria were: (1) adult patients (aged ≥18 years) receiving care either in the emergency department or hospital; (2) outcomes of interest including diagnostic accuracy in identification of sepsis and/or effectiveness of sepsis alerts on process measures and clinical outcomes, evaluated using empiric data; and (3) sepsis alert systems that used real‐time or near‐real‐time electronically available data to enable proactive, timely management. We excluded studies that: (1) tested the effect of electronic interventions other than sepsis alerts (ie, computerized order sets) for sepsis management; (2) focused solely on detecting and treating central line‐associated bloodstream infections, shock (not otherwise specified), bacteremia, or other device‐related infections; or (3) evaluated the effectiveness of sepsis alerts without a control group.
Data Extraction and Quality Assessment
Two reviewers (A.N.M. and O.K.N.) extracted data on the clinical setting, study design, dates of enrollment, definition of sepsis, details of the identification and alert systems, diagnostic accuracy of the alert system, and the incidence of process measures and clinical outcomes using a standardized form. Discrepancies between reviewers were resolved by discussion and consensus. Data discrepancies identified in 1 study were resolved by contacting the corresponding author.[10]
For studies assessing the diagnostic accuracy of sepsis identification, study quality was assessed using the Quality Assessment of Diagnostic Accuracy Studies revised tool.[11] For studies evaluating the effectiveness of sepsis alert systems, studies were considered high quality if a contemporaneous control group was present to account for temporal trends (eg, randomized controlled trial or observational analysis with a concurrent control). Fair‐quality studies were before‐and‐after studies that adjusted for potential confounders between time periods. Low‐quality studies included those that did not account for temporal trends, such as before‐and‐after studies using only historical controls without adjustment. Studies that did not use an intention‐to‐treat analysis were also considered low quality. The strength of the overall body of evidence, including risk of bias, was guided by the Grading of Recommendations Assessment, Development, and Evaluation Working Group Criteria adapted by the Agency of Healthcare Research and Quality.[12]
Data Synthesis
To analyze the diagnostic accuracy of automated sepsis alert systems to identify sepsis and to evaluate the effect on outcomes, we performed a qualitative assessment of all studies. We were unable to perform a meta‐analysis due to significant heterogeneity in study quality, clinical setting, and definition of the sepsis alert. Diagnostic accuracy of sepsis identification was measured by sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV), and likelihood ratio (LR). Effectiveness was assessed by changes in sepsis care process measures (ie, time to antibiotics) and outcomes (length of stay, mortality).
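The accuracy measures above can all be derived from a standard 2×2 table of alert results against the gold standard. The sketch below uses made-up counts purely to illustrate the calculations:

```python
# Worked example of the diagnostic accuracy measures used in this review,
# computed from a 2x2 table of alert results versus a gold standard.
# The counts below are invented for illustration only.

def diagnostic_accuracy(tp, fp, fn, tn):
    sens = tp / (tp + fn)        # sensitivity: alerts among true sepsis cases
    spec = tn / (tn + fp)        # specificity: non-alerts among non-sepsis cases
    ppv = tp / (tp + fp)         # positive predictive value
    npv = tn / (tn + fn)         # negative predictive value
    lr_pos = sens / (1 - spec)   # positive likelihood ratio
    lr_neg = (1 - sens) / spec   # negative likelihood ratio
    return {"sens": sens, "spec": spec, "ppv": ppv, "npv": npv,
            "lr+": lr_pos, "lr-": lr_neg}

# Hypothetical counts: 90 true positives, 110 false positives,
# 10 false negatives, 790 true negatives.
m = diagnostic_accuracy(tp=90, fp=110, fn=10, tn=790)
print(round(m["sens"], 2), round(m["ppv"], 2))  # -> 0.9 0.45
```

Note that even a highly sensitive alert can have a PPV below 50% when false positives outnumber true positives, which is the pattern seen across the studies in this review.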
RESULTS
Description of Studies
Of 1293 titles, 183 qualified for abstract review, 84 for full‐text review, and 8 articles met our inclusion criteria (see Supporting Figure in the online version of this article). Five articles evaluated the diagnostic accuracy of sepsis identification,[10, 13, 14, 15, 16] and 5 articles[10, 14, 17, 18, 19] evaluated the effectiveness of automated electronic sepsis alerts on sepsis process measures and patient outcomes. All articles were published between 2009 and 2014 and were single‐site studies conducted at academic medical centers (Tables 1 and 2). The clinical settings in the included studies varied and included the emergency department (ED), hospital wards, and the intensive care unit (ICU).
Table 1. Characteristics of Studies Evaluating the Diagnostic Accuracy of Automated Electronic Sepsis Alert Systems

| Source | Site No./Type | Setting | Alert Threshold | Gold Standard Definition | Gold Standard Measurement | No. | Study Quality |
|---|---|---|---|---|---|---|---|
| Hooper et al., 2012[10] | 1/academic | MICU | ≥2 SIRS criteria | Reviewer judgment, not otherwise specified | Chart review | 560 | High |
| Meurer et al., 2009[13] | 1/academic | ED | ≥2 SIRS criteria | Reviewer judgment whether diagnosis of infection present in ED plus SIRS criteria | Chart review | 248 | Low |
| Nelson et al., 2011[14] | 1/academic | ED | ≥2 SIRS criteria and ≥2 SBP measurements <90 mm Hg | Reviewer judgment whether infection present, requiring hospitalization with at least 1 organ system involved | Chart review | 1,386 | High |
| Nguyen et al., 2014[15] | 1/academic | ED | ≥2 SIRS criteria and ≥1 sign of shock (SBP <90 mm Hg or lactic acid ≥2.0 mmol/L) | Reviewer judgment to confirm SIRS, shock, and presence of a serious infection | Chart review | 1,095 | Low |
| Thiel et al., 2010[16] | 1/academic | Wards | Recursive partitioning tree analysis including vitals and laboratory results | Admitted to the hospital wards and subsequently transferred to the ICU for septic shock and treated with vasopressor therapy | ICD‐9 discharge codes for acute infection, acute organ dysfunction, and need for vasopressors within 24 hours of ICU transfer | 27,674 | Low |
Table 2. Characteristics of Studies Evaluating the Effectiveness of Automated Electronic Sepsis Alert Systems

| Source | Design | Site No./Type | Setting | No. | Alert System Type | Alert Threshold | Alert Notification | Treatment Recommendation | Study Quality |
|---|---|---|---|---|---|---|---|---|---|
| Berger et al., 2010[17] | Before‐after (6 months pre and 6 months post) | 1/academic | ED | 5,796 | CPOE system | ≥2 SIRS criteria | CPOE passive alert | Yes: lactate collection | Low |
| Hooper et al., 2012[10] | RCT | 1/academic | MICU | 443 | EHR | ≥2 SIRS criteria | Text page and EHR passive alert | No | High |
| McRee et al., 2014[18] | Before‐after (6 months pre and 6 months post) | 1/academic | Wards | 171 | EHR | ≥2 SIRS criteria | Notified nurse, specifics unclear | No, but the nurse completed a sepsis risk evaluation flow sheet | Low |
| Nelson et al., 2011[14] | Before‐after (3 months pre and 3 months post) | 1/academic | ED | 184 | EHR | ≥2 SIRS criteria and ≥2 SBP readings <90 mm Hg | Text page and EHR passive alert | Yes: fluid resuscitation, blood culture collection, antibiotic administration, among others | Low |
| Sawyer et al., 2011[19] | Prospective, nonrandomized (2 intervention and 4 control wards) | 1/academic | Wards | 300 | EHR | Recursive partitioning regression tree algorithm including vitals and lab values | Text page to charge nurse, who then assessed the patient and informed the treating physician | No | High |
Among the 8 included studies, there was significant heterogeneity in threshold criteria for sepsis identification and subsequent alert activation. The most commonly defined threshold was the presence of 2 or more systemic inflammatory response syndrome (SIRS) criteria.[10, 13, 17, 18]
Diagnostic Accuracy of Automated Electronic Sepsis Alert Systems
The prevalence of sepsis varied substantially between the studies depending on the gold standard definition of sepsis used and the clinical setting (ED, wards, or ICU) of the study (Table 3). The 2 studies[14, 16] that defined sepsis as requiring evidence of shock had a substantially lower prevalence (0.8%–4.7%) compared to the 2 studies[10, 13] that defined sepsis as having only 2 or more SIRS criteria with a presumed diagnosis of an infection (27.8%–32.5%).
Table 3. Diagnostic Accuracy of Automated Electronic Sepsis Alert Systems

| Source | Setting | Alert Threshold | Prevalence, % | Sensitivity, % (95% CI) | Specificity, % (95% CI) | PPV, % (95% CI) | NPV, % (95% CI) | LR+ (95% CI) | LR− (95% CI) |
|---|---|---|---|---|---|---|---|---|---|
| Hooper et al., 2012[10] | MICU | ≥2 SIRS criteria | 36.3 | 98.9 (95.7–99.8) | 18.1 (14.2–22.9) | 40.7 (36.1–45.5) | 96.7 (87.5–99.4) | 1.21 (1.14–1.27) | 0.06 (0.01–0.25) |
| Meurer et al., 2009[13] | ED | ≥2 SIRS criteria | 27.8 | 36.2 (25.3–48.8) | 79.9 (73.1–85.3) | 41.0 (28.8–54.3) | 76.5 (69.6–82.2) | 1.80 (1.17–2.76) | 0.80 (0.67–0.96) |
| Nelson et al., 2011[14] | ED | ≥2 SIRS criteria and ≥2 SBP measurements <90 mm Hg | 0.8 | 63.6 (31.6–87.8) | 99.6 (99.0–99.8) | 53.8 (26.1–79.6) | 99.7 (99.2–99.9) | 145.8 (58.4–364.1) | 0.37 (0.17–0.80) |
| Nguyen et al., 2014[15] | ED | ≥2 SIRS criteria and ≥1 sign of shock (SBP <90 mm Hg or lactic acid ≥2.0 mmol/L) | Unable to estimate | Unable to estimate | Unable to estimate | 44.7 (41.2–48.2) | 100.0 (98.8–100.0) | Unable to estimate | Unable to estimate |
| Thiel et al., 2010[16] | Wards | Recursive partitioning tree analysis including vitals and laboratory results | 4.7 | 17.1 (15.1–19.3) | 96.7 (96.5–96.9) | 20.5 (18.2–23.0) | 95.9 (95.7–96.2) | 5.22 (4.56–5.98) | 0.86 (0.84–0.88) |
All alert systems had suboptimal PPV (20.5%‐53.8%). The 2 studies that designed the sepsis alert to activate by SIRS criteria alone[10, 13] had a positive predictive value of 41% and a positive LR of 1.21 to 1.80. The ability to exclude the presence of sepsis varied considerably depending on the clinical setting. The study by Hooper et al.[10] that examined the alert among patients in the medical ICU appeared more effective at ruling out sepsis (NPV=96.7%; negative LR=0.06) compared to a similar alert system used by Meurer et al.[13] that studied patients in the ED (NPV=76.5%, negative LR=0.80).
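Bayes' rule makes this setting dependence concrete: holding sensitivity and specificity fixed, PPV is driven largely by disease prevalence, so the same alert rule performs very differently in a low-prevalence ED population than in a high-prevalence ICU. The sensitivity, specificity, and prevalence values in this sketch are illustrative, not taken from the included studies:

```python
# Illustration of how PPV depends on prevalence for a fixed alert rule.
# sens/spec and the three prevalence values are hypothetical.

def ppv_from_prevalence(sens, spec, prev):
    """PPV via Bayes' rule: P(sepsis | alert)."""
    true_pos = sens * prev               # alerts among patients with sepsis
    false_pos = (1 - spec) * (1 - prev)  # alerts among patients without sepsis
    return true_pos / (true_pos + false_pos)

sens, spec = 0.90, 0.80
for prev in (0.008, 0.05, 0.30):  # e.g., ED shock, wards, ICU-like prevalences
    print(f"prevalence {prev:.1%}: PPV = {ppv_from_prevalence(sens, spec, prev):.1%}")
```

With these assumed inputs, PPV rises from a few percent at 0.8% prevalence to roughly two-thirds at 30% prevalence, which is why requiring additional shock criteria (lowering the alerting rate in low-risk patients) improved PPV in the ED studies.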
There were also differences in the diagnostic accuracy of the sepsis alert systems depending on how the threshold for activating the sepsis alert was defined and applied in the study. Two studies evaluated a sepsis alert system among patients presenting to the ED at the same academic medical center.[13, 14] The alert system (Nelson et al.) that was triggered by a combination of SIRS criteria and hypotension (PPV=53.8%, LR+=145.8; NPV=99.7%, LR−=0.37) outperformed the alert system (Meurer et al.) that was triggered by SIRS criteria alone (PPV=41.0%, LR+=1.80; NPV=76.5%, LR−=0.80). Furthermore, the study by Meurer and colleagues evaluated the accuracy of the alert system only among patients who were hospitalized after presenting to the ED, rather than all consecutive patients presenting to the ED. This selection bias likely falsely inflated the diagnostic accuracy of the alert system used by Meurer et al., suggesting the alert system that was triggered by a combination of SIRS criteria and hypotension was comparatively even more accurate.
Two studies evaluating the diagnostic accuracy of the alert system were deemed to be high quality (Table 4). Three studies were considered low quality: 1 study did not include all patients in its assessment of diagnostic accuracy[13]; 1 study consecutively selected alert cases but randomly selected nonalert cases, greatly limiting the assessment of diagnostic accuracy[15]; and the other study applied a gold standard that was unlikely to correctly classify sepsis (septic shock requiring ICU transfer with vasopressor support in the first 24 hours was defined by discharge International Classification of Diseases, Ninth Revision diagnoses without chart review), with a considerable delay from the alert system trigger (alert identification was compared to the discharge diagnosis rather than physician review of real‐time data).[16]
Table 4. Quality Assessment (QUADAS‐2) of Studies Evaluating Diagnostic Accuracy

| Study | Patient Selection | Index Test | Reference Standard | Flow and Timing |
|---|---|---|---|---|
| Hooper et al., 2012[10] | +++ | +++ | ++ | +++ |
| Meurer et al., 2009[13] | +++ | +++ | ++ | + |
| Nelson et al., 2011[14] | +++ | +++ | ++ | +++ |
| Nguyen et al., 2014[15] | + | +++ | + | +++ |
| Thiel et al., 2010[16] | +++ | +++ | + | + |
Effectiveness of Automated Electronic Sepsis Alert Systems
Characteristics of the studies evaluating the effectiveness of automated electronic sepsis alert systems are summarized in Table 2. Regarding activation of the sepsis alert, 2 studies notified the provider directly by an automated text page and a passive EHR alert (not requiring the provider to acknowledge the alert or take action),[10, 14] 1 study notified the provider by a passive electronic alert alone,[17] and 1 study only employed an automated text page.[19] Furthermore, if the sepsis alert was activated, 2 studies suggested specific clinical management decisions,[14, 17] 2 studies left clinical management decisions solely to the discretion of the treating provider,[10, 19] and 1 study assisted the diagnosis of sepsis by prompting nurses to complete a second manual sepsis risk evaluation.[18]
Table 5 summarizes the effectiveness of automated electronic sepsis alert systems. Two studies evaluating the effectiveness of the sepsis alert system were considered to be high‐quality studies based on the use of a contemporaneous control group to account for temporal trends and an intention‐to‐treat analysis.[10, 19] The 2 studies evaluating the effectiveness of a sepsis alert system in the ED were considered low quality due to before‐and‐after designs without an intention‐to‐treat analysis.[14, 17]
Table 5. Effectiveness of Automated Electronic Sepsis Alert Systems

| Source | Outcomes Evaluated | Key Findings | Quality |
|---|---|---|---|
| Hooper et al., 2012[10] | Primary: time to receipt of antibiotic (new or changed) | No difference (6.1 hours for control vs 6.0 hours for intervention, P=0.95) | High |
| | Secondary: sepsis‐related process measures and outcomes | No difference in amount of 6‐hour IV fluid administration (964 mL vs 1,019 mL, P=0.6), collection of blood cultures (adjusted HR 1.01; 95% CI, 0.76 to 1.35), collection of lactate (adjusted HR 0.84; 95% CI, 0.54 to 1.30), ICU length of stay (3.0 vs 3.0 days, P=0.2), hospital length of stay (4.7 vs 5.7 days, P=0.08), and hospital mortality (10% for control vs 14% for intervention, P=0.3) | |
| Sawyer et al., 2011[19] | Primary: sepsis‐related process measures (antibiotic escalation, IV fluids, oxygen therapy, vasopressor initiation, diagnostic testing [blood culture, CXR]) within 12 hours of alert | Increases in receiving ≥1 measure (56% for control vs 71% for intervention, P=0.02), antibiotic escalation (24% vs 36%, P=0.04), IV fluid administration (24% vs 38%, P=0.01), and oxygen therapy (8% vs 20%, P=0.005). There was a nonsignificant increase in obtaining diagnostic tests (40% vs 52%, P=0.06) and vasopressor initiation (3% vs 6%, P=0.4) | High |
| | Secondary: ICU transfer, hospital length of stay, hospital length of stay after alert, in‐hospital mortality | Similar rate of ICU transfer (23% for control vs 26% for intervention, P=0.6), hospital length of stay (7 vs 9 days, median, P=0.8), hospital length of stay after alert (5 vs 6 days, median, P=0.7), and in‐hospital mortality (12% vs 10%, P=0.7) | |
| Berger et al., 2010[17] | Primary: lactate collection in ED | Increase in lactate collection in the ED (5.2% before vs 12.7% after alert implemented, absolute increase of 7.5%, 95% CI, 6.0% to 9.0%) | Low |
| | Secondary: lactate collection among hospitalized patients, proportion of patients with abnormal lactate (≥4 mmol/L), and in‐hospital mortality among hospitalized patients | Increase in lactate collection among hospitalized patients (15.3% vs 34.2%, absolute increase of 18.9%, 95% CI, 15.0% to 22.8%); decrease in the proportion of abnormal lactate values (21.9% vs 14.8%, absolute decrease of 7.6%, 95% CI, −15.8% to −0.6%); and no significant difference in mortality (5.7% vs 5.2%, absolute decrease of 0.5%, 95% CI, −1.6% to 2.6%, P=0.6) | |
| McRee et al., 2014[18] | Stage of sepsis, length of stay, mortality, discharge location | Nonsignificant decrease in stage of sepsis (34.7% with septic shock before vs 21.9% after, P>0.05); no difference in length of stay (8.5 days before vs 8.7 days after, P>0.05). Decrease in mortality (9.3% before vs 1.0% after, P<0.05) and increase in the proportion of patients discharged home (25.3% before vs 49.0% after, P<0.05) | Low |
| Nelson et al., 2011[14] | Frequency and time to completion of process measures: lactate, blood culture, CXR, and antibiotic initiation | Increases in blood culture collection (OR 2.9; 95% CI, 1.1 to 7.7) and CXR (OR 3.2; 95% CI, 1.1 to 9.5); nonsignificant increases in lactate collection (OR 1.7; 95% CI, 0.9 to 3.2) and antibiotic administration (OR 2.8; 95% CI, 0.9 to 8.3). Only blood cultures were collected in a more timely manner (median of 86 minutes before vs 81 minutes after alert implementation, P=0.03) | Low |
Neither of the 2 high‐quality studies that included a contemporaneous control found evidence of improvement in inpatient mortality or hospital and ICU length of stay.[10, 19] The impact of sepsis alert systems on improving process measures for sepsis management depended on the clinical setting. In a randomized controlled trial of patients admitted to a medical ICU, Hooper et al. did not find any benefit of implementing a sepsis alert system on intermediate outcome measures such as antibiotic escalation, fluid resuscitation, and collection of blood cultures and lactate.[10] However, in a well‐designed observational study, Sawyer et al. found significant increases in antibiotic escalation, fluid resuscitation, and diagnostic testing in patients admitted to the medical wards.[19] Both studies that evaluated the effectiveness of sepsis alert systems in the ED showed improvements in various process measures,[14, 17] but without improvement in mortality.[17] The single study that showed improvement in clinical outcomes (in‐hospital mortality and disposition location) was of low quality due to its before‐and‐after design without adjustment for potential confounders and its lack of an intention‐to‐treat analysis (only individuals with a discharge diagnosis of sepsis were included, rather than all individuals who triggered the alert).[18] Additionally, the preintervention group had a higher proportion of individuals with septic shock compared to the postintervention group, raising the possibility that the observed improvement was due to differences in severity of illness between the 2 groups rather than to the intervention.
None of the studies included in this review explicitly reported on the potential harms (eg, excess antimicrobial use or alert fatigue) after implementation of sepsis alerts, but Hooper et al. found a nonsignificant increase in mortality, and Sawyer et al. showed a nonsignificant increase in length of stay in the intervention group compared to the control group.[10, 19] Berger et al. showed an overall increase in the number of lactate tests performed, but with a decrease in the proportion of abnormal lactate values (21.9% vs 14.8%, absolute decrease of 7.6%, 95% confidence interval, −15.8% to −0.6%), suggesting potential overtesting in patients at low risk for septic shock. In the study by Hooper et al., 88% (442/502) of the patients in the medical intensive care unit triggered an alert, raising the concern for alert fatigue.[10] Furthermore, 3 studies did not perform intention‐to‐treat analyses; rather, they included only patients who triggered the alert and also had provider‐suspected or confirmed sepsis,[14, 17] or had a discharge diagnosis of sepsis.[18]
DISCUSSION
Sepsis alert systems derived from electronic health data and targeting hospitalized patients improve a subset of sepsis process‐of‐care measures, but at the cost of poor positive predictive value and with no clear improvement in mortality or length of stay. There is insufficient evidence for the effectiveness of automated electronic sepsis alert systems in the emergency department.
We found considerable variability in the diagnostic accuracy of automated electronic sepsis alert systems. There was moderate evidence that alert systems designed to identify severe sepsis (eg, SIRS criteria plus measures of shock) had greater diagnostic accuracy than alert systems that detected sepsis based on SIRS criteria alone. Given that SIRS criteria are highly prevalent among hospitalized patients with noninfectious diseases,[20] sepsis alert systems triggered by standard SIRS criteria may have poorer predictive value and an increased risk of alert fatigue (excessive electronic warnings that lead physicians to disregard clinically useful alerts).[21] The potential for alert fatigue is even greater in critical care settings: a retrospective analysis of physiological alarms in the ICU estimated an average of 6 alarms per hour, with only 15% of alarms considered clinically relevant.[22]
The fact that sepsis alert systems improve intermediate process measures among ward and ED patients but not ICU patients likely reflects differences in both the patients and the clinical settings.[23] First, patients in the ICU may already be prescribed broad‐spectrum antibiotics, aggressively fluid resuscitated, and have other diagnostic testing performed before the activation of a sepsis alert, so improvements in rates of process measures assessing initiation or escalation of therapy are less likely than among patients treated on the wards or in the ED. The apparent lack of benefit of these systems in the ICU may merely represent a ceiling effect. Second, nurses and physicians already vigilantly monitor patients in the ICU for signs of clinical deterioration, so additional alert systems may be redundant. Third, patients in the ICU are connected to standard bedside monitors that continuously check for abnormal vital signs; an additional sepsis alert system triggered by SIRS criteria alone may be superfluous to the existing infrastructure. Fourth, the majority of patients in the ICU will trigger the sepsis alert system,[10] so there is likely a high noise‐to‐signal ratio with resultant alert fatigue.[21]
Beyond the need for alert systems with greater diagnostic accuracy and demonstrated effectiveness, our review notes several important gaps that limit the evidence supporting the usefulness of automated sepsis alert systems. First, few data describe the optimal design of sepsis alerts[24, 25] or the frequency with which they are appropriately acted upon or dismissed. In addition, we found little data to determine whether the effectiveness of alert systems differed based on whether clinical decision support was included with the alert itself (eg, direct prompting with specific clinical management recommendations) or on the configuration of the alert (eg, interruptive vs informational).[24, 25] Most of the studies we reviewed employed alerts primarily targeting physicians; we found little evidence for systems that also alerted other providers (eg, nurses or rapid response teams). Few studies provided data on the harms of these systems (eg, excess antimicrobial use, fluid overload due to aggressive fluid resuscitation) or on how often these treatments were administered to patients who did not eventually have sepsis. Few studies employed designs that limited bias (eg, randomized or quasiexperimental designs) or used an intention‐to‐treat approach; studies that exclude false positive alerts from analyses could bias estimates toward making sepsis alert systems appear more effective than they actually were. Finally, although deploying automated sepsis alerts in the ED would presumably facilitate more timely recognition and treatment, more rigorously conducted studies are needed to determine whether these alerts are of greater value in the ED than on the wards or in the ICU. Given the limited number of studies included in this review, we were unable to draw strong conclusions regarding the clinical benefits and cost‐effectiveness of implementing automated sepsis alerts.
Our review has certain limitations. First, despite our extensive literature search strategy, we may have missed studies published in the grey literature or in non‐English languages. Second, there is potential publication bias given the number of abstracts that we identified addressing 1 of our prespecified research questions compared to the number of peer‐reviewed publications identified by our search strategy.
CONCLUSION
Automated electronic sepsis alert systems have promise in delivering early goal‐directed therapies to patients. However, at present, automated sepsis alerts derived from electronic health data may improve care processes but tend to have poor PPV and have not been shown to improve mortality or length of stay. Future efforts should develop and study methods for sepsis alert systems that avoid the potential for alert fatigue while improving outcomes.
Acknowledgements
The authors thank Gloria Won, MLIS, for her assistance with developing and performing the literature search strategy and wish her a long and joyous retirement.
Disclosures: Part of Dr. Makam's work on this project was completed while he was a primary care research fellow at the University of California, San Francisco, funded by a National Research Service Award (training grant T32HP19025‐07‐00). Dr. Makam is currently supported by the National Center for Advancing Translational Sciences of the National Institutes of Health (KL2TR001103). Dr. Nguyen was supported by the Agency for Healthcare Research and Quality (R24HS022428‐01). Dr. Auerbach was supported by an NHLBI K24 grant (K24HL098372). Dr. Makam had full access to the data in the study and takes responsibility for the integrity of the data and accuracy of the data analysis. Study concept and design: all authors. Acquisition of data: Makam and Nguyen. Analysis and interpretation of data: all authors. Drafting of the manuscript: Makam. Critical revision of the manuscript: all authors. Statistical analysis: Makam and Nguyen. The authors have no conflicts of interest to disclose.
Sepsis is the most expensive condition treated in the hospital, resulting in an aggregate cost of $20.3 billion or 5.2% of total aggregate cost for all hospitalizations in the United States.[1] Rates of sepsis and sepsis‐related mortality are rising in the United States.[2, 3] Timely treatment of sepsis, including adequate fluid resuscitation and appropriate antibiotic administration, decreases morbidity, mortality, and costs.[4, 5, 6] Consequently, the Surviving Sepsis Campaign recommends timely care with the implementation of sepsis bundles and protocols.[4] Though effective, sepsis protocols require dedicated personnel with specialized training, who must be highly vigilant and constantly monitor a patient's condition for the course of an entire hospitalization.[7, 8] As such, delays in administering evidence‐based therapies are common.[8, 9]
Automated electronic sepsis alerts are being developed and implemented to facilitate the delivery of timely sepsis care. Electronic alert systems synthesize electronic health data routinely collected for clinical purposes in real time or near real time to automatically identify sepsis based on prespecified diagnostic criteria, and immediately alert providers that their patient may meet sepsis criteria via electronic notifications (eg, through electronic health record [EHR], e‐mail, or pager alerts).
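To make the mechanism concrete, the rule logic behind the most common alert threshold among systems of this kind (2 or more SIRS criteria) can be sketched as follows. This is an illustration only, not code from any system evaluated in this review; the function and parameter names are hypothetical, and the thresholds are the standard SIRS criteria.

```python
# Illustrative sketch of a SIRS-based sepsis screen (hypothetical names;
# not taken from any alert system evaluated in this review).

def sirs_criteria_met(temp_c, heart_rate, resp_rate, wbc_k):
    """Count how many of the 4 standard SIRS criteria are satisfied."""
    criteria = [
        temp_c > 38.0 or temp_c < 36.0,   # temperature >38 or <36 degrees C
        heart_rate > 90,                  # heart rate >90 beats/min
        resp_rate > 20,                   # respiratory rate >20 breaths/min
        wbc_k > 12.0 or wbc_k < 4.0,      # WBC >12,000 or <4,000 cells/uL
    ]
    return sum(criteria)

def should_alert(temp_c, heart_rate, resp_rate, wbc_k, threshold=2):
    """Fire an alert when at least `threshold` SIRS criteria are met."""
    return sirs_criteria_met(temp_c, heart_rate, resp_rate, wbc_k) >= threshold
```

A production system would evaluate such a rule continuously against streaming EHR data and route the notification (EHR banner, e-mail, or page) when it first becomes true.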
However, little data exist to describe whether automated, electronic systems achieve their intended goal of earlier, more effective sepsis care. To examine this question, we performed a systematic review on automated electronic sepsis alerts to assess their suitability for clinical use. Our 2 objectives were: (1) to describe the diagnostic accuracy of alert systems in identifying sepsis using electronic data available in real‐time or near real‐time, and (2) to evaluate the effectiveness of sepsis alert systems on sepsis care process measures and clinical outcomes.
MATERIALS AND METHODS
Data Sources and Search Strategies
We searched PubMed MEDLINE, Embase, The Cochrane Library, and the Cumulative Index to Nursing and Allied Health Literature from database inception through June 27, 2014, for all studies that contained the following 3 concepts: sepsis, electronic systems, and alerts (or identification). All citations were imported into an electronic database (EndNote X5; Thomson‐Reuters Corp., New York, NY) (see Supporting Information, Appendix, in the online version of this article for our complete search strategy).
Study Selection
Two authors (A.N.M. and O.K.N.) reviewed the citation titles, abstracts, and full‐text articles of potentially relevant references identified from the literature search for eligibility. References of selected articles were hand searched to identify additional eligible studies. Inclusion criteria for eligible studies were: (1) adult patients (aged ≥18 years) receiving care either in the emergency department or hospital; (2) outcomes of interest including diagnostic accuracy in the identification of sepsis and/or effectiveness of sepsis alerts on process measures and clinical outcomes, evaluated using empiric data; and (3) sepsis alert systems that used real‐time or near real‐time electronically available data to enable proactive, timely management. We excluded studies that: (1) tested the effect of other electronic interventions that were not sepsis alerts (ie, computerized order sets) for sepsis management; (2) focused solely on detecting and treating central line‐associated bloodstream infections, shock (not otherwise specified), bacteremia, or other device‐related infections; or (3) evaluated the effectiveness of sepsis alerts without a control group.
Data Extraction and Quality Assessment
Two reviewers (A.N.M. and O.K.N.) extracted data on the clinical setting, study design, dates of enrollment, definition of sepsis, details of the identification and alert systems, diagnostic accuracy of the alert system, and the incidence of process measures and clinical outcomes using a standardized form. Discrepancies between reviewers were resolved by discussion and consensus. Data discrepancies identified in 1 study were resolved by contacting the corresponding author.[10]
For studies assessing the diagnostic accuracy of sepsis identification, study quality was assessed using the Quality Assessment of Diagnostic Accuracy Studies revised tool.[11] For studies evaluating the effectiveness of sepsis alert systems, studies were considered high quality if a contemporaneous control group was present to account for temporal trends (eg, randomized controlled trial or observational analysis with a concurrent control). Fair‐quality studies were before‐and‐after studies that adjusted for potential confounders between time periods. Low‐quality studies included those that did not account for temporal trends, such as before‐and‐after studies using only historical controls without adjustment. Studies that did not use an intention‐to‐treat analysis were also considered low quality. The strength of the overall body of evidence, including risk of bias, was guided by the Grading of Recommendations Assessment, Development, and Evaluation Working Group criteria adapted by the Agency for Healthcare Research and Quality.[12]
Data Synthesis
To analyze the diagnostic accuracy of automated sepsis alert systems to identify sepsis and to evaluate the effect on outcomes, we performed a qualitative assessment of all studies. We were unable to perform a meta‐analysis due to significant heterogeneity in study quality, clinical setting, and definition of the sepsis alert. Diagnostic accuracy of sepsis identification was measured by sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV), and likelihood ratio (LR). Effectiveness was assessed by changes in sepsis care process measures (ie, time to antibiotics) and outcomes (length of stay, mortality).
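For reference, the diagnostic accuracy measures listed above are all derived from the 2×2 table of alert results against the gold standard. A minimal sketch (hypothetical function name, illustrative counts):

```python
# Diagnostic accuracy measures from a 2x2 table of alert vs gold standard.
# tp/fp/fn/tn = true/false positives and negatives (illustrative sketch).

def diagnostic_metrics(tp, fp, fn, tn):
    sensitivity = tp / (tp + fn)              # true positive rate
    specificity = tn / (tn + fp)              # true negative rate
    ppv = tp / (tp + fp)                      # positive predictive value
    npv = tn / (tn + fn)                      # negative predictive value
    lr_pos = sensitivity / (1 - specificity)  # positive likelihood ratio
    lr_neg = (1 - sensitivity) / specificity  # negative likelihood ratio
    return {"sensitivity": sensitivity, "specificity": specificity,
            "ppv": ppv, "npv": npv, "lr_pos": lr_pos, "lr_neg": lr_neg}
```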
RESULTS
Description of Studies
Of 1293 titles, 183 qualified for abstract review, 84 for full‐text review, and 8 articles met our inclusion criteria (see Supporting Figure in the online version of this article). Five articles evaluated the diagnostic accuracy of sepsis identification,[10, 13, 14, 15, 16] and 5 articles[10, 14, 17, 18, 19] evaluated the effectiveness of automated electronic sepsis alerts on sepsis process measures and patient outcomes. All articles were published between 2009 and 2014 and were single‐site studies conducted at academic medical centers (Tables 1 and 2). The clinical settings in the included studies varied and included the emergency department (ED), hospital wards, and the intensive care unit (ICU).
| Source | Site No./Type | Setting | Alert Threshold | Gold Standard Definition | Gold Standard Measurement | No. | Study Qualitya |
|---|---|---|---|---|---|---|---|
| Hooper et al., 2012[10] | 1/academic | MICU | ≥2 SIRS criteriab | Reviewer judgment, not otherwise specified | Chart review | 560 | High |
| Meurer et al., 2009[13] | 1/academic | ED | ≥2 SIRS criteria | Reviewer judgment whether diagnosis of infection present in ED plus SIRS criteria | Chart review | 248 | Low |
| Nelson et al., 2011[14] | 1/academic | ED | ≥2 SIRS criteria and ≥2 SBP measurements <90 mm Hg | Reviewer judgment whether infection present, requiring hospitalization with at least 1 organ system involved | Chart review | 1,386 | High |
| Nguyen et al., 2014[15] | 1/academic | ED | ≥2 SIRS criteria and ≥1 sign of shock (SBP ≤90 mm Hg or lactic acid ≥2.0 mmol/L) | Reviewer judgment to confirm SIRS, shock, and presence of a serious infection | Chart review | 1,095 | Low |
| Thiel et al., 2010[16] | 1/academic | Wards | Recursive partitioning tree analysis including vitals and laboratory resultsc | Admitted to the hospital wards and subsequently transferred to the ICU for septic shock and treated with vasopressor therapy | ICD‐9 discharge codes for acute infection, acute organ dysfunction, and need for vasopressors within 24 hours of ICU transfer | 27,674 | Low |
| Source | Design | Site No./Type | Setting | No. | Alert System Type | Alert Threshold | Alert Notificationa | Treatment Recommendation | Study Qualityb |
|---|---|---|---|---|---|---|---|---|---|
| Berger et al., 2010[17] | Before‐after (6 months pre and 6 months post) | 1/academic | ED | 5,796c | CPOE system | ≥2 SIRS criteria | CPOE passive alert | Yes: lactate collection | Low |
| Hooper et al., 2012[10] | RCT | 1/academic | MICU | 443 | EHR | ≥2 SIRS criteriad | Text page and EHR passive alert | No | High |
| McRee et al., 2014[18] | Before‐after (6 months pre and 6 months post) | 1/academic | Wards | 171e | EHR | ≥2 SIRS criteria | Notified nurse, specifics unclear | No, but the nurse completed a sepsis risk evaluation flow sheet | Low |
| Nelson et al., 2011[14] | Before‐after (3 months pre and 3 months post) | 1/academic | ED | 184f | EHR | ≥2 SIRS criteria and ≥2 SBP readings <90 mm Hg | Text page and EHR passive alert | Yes: fluid resuscitation, blood culture collection, antibiotic administration, among others | Low |
| Sawyer et al., 2011[19] | Prospective, nonrandomized (2 intervention and 4 control wards) | 1/academic | Wards | 300 | EHR | Recursive partitioning regression tree algorithm including vitals and lab valuesg | Text page to charge nurse, who then assessed the patient and informed the treating physicianh | No | High |
Among the 8 included studies, there was significant heterogeneity in threshold criteria for sepsis identification and subsequent alert activation. The most commonly defined threshold was the presence of 2 or more systemic inflammatory response syndrome (SIRS) criteria.[10, 13, 17, 18]
Diagnostic Accuracy of Automated Electronic Sepsis Alert Systems
The prevalence of sepsis varied substantially between the studies depending on the gold standard definition of sepsis used and the clinical setting (ED, wards, or ICU) of the study (Table 3). The 2 studies[14, 16] that defined sepsis as requiring evidence of shock had a substantially lower prevalence (0.8%–4.7%) compared to the 2 studies[10, 13] that defined sepsis as having only 2 or more SIRS criteria with a presumed diagnosis of an infection (27.8%–32.5%).
| Source | Setting | Alert Threshold | Prevalence, % | Sensitivity, % (95% CI) | Specificity, % (95% CI) | PPV, % (95% CI) | NPV, % (95% CI) | LR+ (95% CI) | LR− (95% CI) |
|---|---|---|---|---|---|---|---|---|---|
| Hooper et al., 2012[10] | MICU | ≥2 SIRS criteriaa | 36.3 | 98.9 (95.7–99.8) | 18.1 (14.2–22.9) | 40.7 (36.1–45.5) | 96.7 (87.5–99.4) | 1.21 (1.14–1.27) | 0.06 (0.01–0.25) |
| Meurer et al., 2009[13] | ED | ≥2 SIRS criteria | 27.8 | 36.2 (25.3–48.8) | 79.9 (73.1–85.3) | 41.0 (28.8–54.3) | 76.5 (69.6–82.2) | 1.80 (1.17–2.76) | 0.80 (0.67–0.96) |
| Nelson et al., 2011[14] | ED | ≥2 SIRS criteria and ≥2 SBP measurements <90 mm Hg | 0.8 | 63.6 (31.6–87.8) | 99.6 (99.0–99.8) | 53.8 (26.1–79.6) | 99.7 (99.2–99.9) | 145.8 (58.4–364.1) | 0.37 (0.17–0.80) |
| Nguyen et al., 2014[15] | ED | ≥2 SIRS criteria and ≥1 sign of shock (SBP ≤90 mm Hg or lactic acid ≥2.0 mmol/L) | Unable to estimateb | Unable to estimateb | Unable to estimateb | 44.7 (41.2–48.2) | 100.0c (98.8–100.0) | Unable to estimateb | Unable to estimateb |
| Thiel et al., 2010[16] | Wards | Recursive partitioning tree analysis including vitals and laboratory resultsd | 4.7 | 17.1 (15.1–19.3) | 96.7 (96.5–96.9) | 20.5 (18.2–23.0) | 95.9 (95.7–96.2) | 5.22 (4.56–5.98) | 0.86 (0.84–0.88) |
All alert systems had suboptimal PPV (20.5%‐53.8%). The 2 studies that designed the sepsis alert to activate by SIRS criteria alone[10, 13] had a positive predictive value of 41% and a positive LR of 1.21 to 1.80. The ability to exclude the presence of sepsis varied considerably depending on the clinical setting. The study by Hooper et al.[10] that examined the alert among patients in the medical ICU appeared more effective at ruling out sepsis (NPV=96.7%; negative LR=0.06) compared to a similar alert system used by Meurer et al.[13] that studied patients in the ED (NPV=76.5%, negative LR=0.80).
There were also differences in the diagnostic accuracy of the sepsis alert systems depending on how the threshold for activating the sepsis alert was defined and applied in the study. Two studies evaluated a sepsis alert system among patients presenting to the ED at the same academic medical center.[13, 14] The alert system (Nelson et al.) that was triggered by a combination of SIRS criteria and hypotension (PPV=53.8%, LR+=145.8; NPV=99.7%, LR−=0.37) outperformed the alert system (Meurer et al.) that was triggered by SIRS criteria alone (PPV=41.0%, LR+=1.80; NPV=76.5%, LR−=0.80). Furthermore, the study by Meurer and colleagues evaluated the accuracy of the alert system only among patients who were hospitalized after presenting to the ED, rather than among all consecutive patients presenting to the ED. This selection bias likely falsely inflated the diagnostic accuracy of the alert system used by Meurer et al., suggesting that the alert system triggered by a combination of SIRS criteria and hypotension was comparatively even more accurate.
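The sensitivity of PPV to disease prevalence, which drives much of the variation above, follows from Bayes' theorem. A small sketch with illustrative numbers (not drawn from the included studies) shows how the same sensitivity and specificity yield sharply different PPVs at ED-level versus ICU-level prevalence:

```python
# PPV as a function of prevalence via Bayes' theorem (illustrative
# sensitivity/specificity values; not from the included studies).

def ppv(prevalence, sensitivity, specificity):
    """P(sepsis | positive alert) for a given pretest prevalence."""
    true_pos = prevalence * sensitivity
    false_pos = (1 - prevalence) * (1 - specificity)
    return true_pos / (true_pos + false_pos)

# The same alert characteristics give a much lower PPV when sepsis is rare:
low_prev = ppv(0.01, 0.80, 0.90)   # ~0.075 at 1% prevalence
high_prev = ppv(0.30, 0.80, 0.90)  # ~0.77 at 30% prevalence
```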
Two studies evaluating the diagnostic accuracy of the alert system were deemed to be high quality (Table 4). Three studies were considered low quality: 1 study did not include all patients in its assessment of diagnostic accuracy[13]; 1 study consecutively selected alert cases but randomly selected nonalert cases, greatly limiting the assessment of diagnostic accuracy[15]; and the other study applied a gold standard that was unlikely to correctly classify sepsis (septic shock requiring ICU transfer with vasopressor support in the first 24 hours was defined by discharge International Classification of Diseases, Ninth Revision diagnoses without chart review), with a considerable delay from the alert system trigger (alert identification was compared to the discharge diagnosis rather than physician review of real‐time data).[16]
| Study | Patient Selection | Index Test | Reference Standard | Flow and Timing |
|---|---|---|---|---|
| Hooper et al., 2012[10] | +++ | +++ | ++b | +++ |
| Meurer et al., 2009[13] | +++ | +++ | ++b | +c |
| Nelson et al., 2011[14] | +++ | +++ | ++b | +++ |
| Nguyen et al., 2014[15] | +d | +++ | +e | +++ |
| Thiel et al., 2010[16] | +++ | +++ | +f | +g |
Effectiveness of Automated Electronic Sepsis Alert Systems
Characteristics of the studies evaluating the effectiveness of automated electronic sepsis alert systems are summarized in Table 2. Regarding activation of the sepsis alert, 2 studies notified the provider directly by an automated text page and a passive EHR alert (not requiring the provider to acknowledge the alert or take action),[10, 14] 1 study notified the provider by a passive electronic alert alone,[17] and 1 study only employed an automated text page.[19] Furthermore, if the sepsis alert was activated, 2 studies suggested specific clinical management decisions,[14, 17] 2 studies left clinical management decisions solely to the discretion of the treating provider,[10, 19] and 1 study assisted the diagnosis of sepsis by prompting nurses to complete a second manual sepsis risk evaluation.[18]
Table 5 summarizes the effectiveness of automated electronic sepsis alert systems. Two studies evaluating the effectiveness of the sepsis alert system were considered to be high‐quality studies based on the use of a contemporaneous control group to account for temporal trends and an intention‐to‐treat analysis.[10, 19] The 2 studies evaluating the effectiveness of a sepsis alert system in the ED were considered low quality due to before‐and‐after designs without an intention‐to‐treat analysis.[14, 17]
| Source | Outcomes Evaluated | Key Findings | Quality |
|---|---|---|---|
| Hooper et al., 2012[10] | Primary: time to receipt of antibiotic (new or changed) | No difference (6.1 hours for control vs 6.0 hours for intervention, P=0.95) | High |
| | Secondary: sepsis‐related process measures and outcomes | No difference in amount of IV fluid administered within 6 hours (964 mL vs 1,019 mL, P=0.6), collection of blood cultures (adjusted HR 1.01; 95% CI, 0.76 to 1.35), collection of lactate (adjusted HR 0.84; 95% CI, 0.54 to 1.30), ICU length of stay (3.0 vs 3.0 days, P=0.2), hospital length of stay (4.7 vs 5.7 days, P=0.08), and hospital mortality (10% for control vs 14% for intervention, P=0.3) | |
| Sawyer et al., 2011[19] | Primary: sepsis‐related process measures (antibiotic escalation, IV fluids, oxygen therapy, vasopressor initiation, diagnostic testing [blood culture, CXR]) within 12 hours of alert | Increases in receiving ≥1 measure (56% for control vs 71% for intervention, P=0.02), antibiotic escalation (24% vs 36%, P=0.04), IV fluid administration (24% vs 38%, P=0.01), and oxygen therapy (8% vs 20%, P=0.005). There were nonsignificant increases in obtaining diagnostic tests (40% vs 52%, P=0.06) and vasopressor initiation (3% vs 6%, P=0.4) | High |
| | Secondary: ICU transfer, hospital length of stay, hospital length of stay after alert, in‐hospital mortality | Similar rate of ICU transfer (23% for control vs 26% for intervention, P=0.6), hospital length of stay (7 vs 9 days, median, P=0.8), hospital length of stay after alert (5 vs 6 days, median, P=0.7), and in‐hospital mortality (12% vs 10%, P=0.7) | |
| Berger et al., 2010[17] | Primary: lactate collection in ED | Increase in lactate collection in the ED (5.2% before vs 12.7% after alert implemented, absolute increase of 7.5%, 95% CI, 6.0% to 9.0%) | Low |
| | Secondary: lactate collection among hospitalized patients, proportion of patients with abnormal lactate (≥4 mmol/L), and in‐hospital mortality among hospitalized patients | Increase in lactate collection among hospitalized patients (15.3% vs 34.2%, absolute increase of 18.9%, 95% CI, 15.0% to 22.8%); decrease in the proportion of abnormal lactate values (21.9% vs 14.8%, absolute decrease of 7.6%, 95% CI, −15.8% to −0.6%); and no significant difference in mortality (5.7% vs 5.2%, absolute decrease of 0.5%, 95% CI, −1.6% to 2.6%, P=0.6) | |
| McRee et al., 2014[18] | Stage of sepsis, length of stay, mortality, discharge location | Nonsignificant decrease in stage of sepsis (34.7% with septic shock before vs 21.9% after, P>0.05); no difference in length of stay (8.5 days before vs 8.7 days after, P>0.05). Decrease in mortality (9.3% before vs 1.0% after, P<0.05) and increase in the proportion of patients discharged home (25.3% before vs 49.0% after, P<0.05) | Low |
| Nelson et al., 2011[14] | Frequency and time to completion of process measures: lactate, blood culture, CXR, and antibiotic initiation | Increases in blood culture collection (OR 2.9; 95% CI, 1.1 to 7.7) and CXR (OR 3.2; 95% CI, 1.1 to 9.5); nonsignificant increases in lactate collection (OR 1.7; 95% CI, 0.9 to 3.2) and antibiotic administration (OR 2.8; 95% CI, 0.9 to 8.3). Only blood cultures were collected in a more timely manner (median of 86 minutes before vs 81 minutes after alert implementation, P=0.03) | Low |
Neither of the 2 high‐quality studies that included a contemporaneous control found evidence of improvement in inpatient mortality or in hospital or ICU length of stay.[10, 19] The impact of sepsis alert systems on improving process measures for sepsis management depended on the clinical setting. In a randomized controlled trial of patients admitted to a medical ICU, Hooper et al. did not find any benefit of implementing a sepsis alert system on intermediate outcome measures such as antibiotic escalation, fluid resuscitation, and collection of blood cultures and lactate.[10] However, in a well‐designed observational study, Sawyer et al. found significant increases in antibiotic escalation, fluid resuscitation, and diagnostic testing in patients admitted to the medical wards.[19] Both studies that evaluated the effectiveness of sepsis alert systems in the ED showed improvements in various process measures,[14, 17] but without improvement in mortality.[17] The single study that showed improvement in clinical outcomes (in‐hospital mortality and disposition location) was of low quality due to its before‐and‐after design without adjustment for potential confounders and its lack of an intention‐to‐treat analysis (only individuals with a discharge diagnosis of sepsis were included, rather than all individuals who triggered the alert).[18] Additionally, the preintervention group had a higher proportion of individuals with septic shock compared to the postintervention group, raising the possibility that the observed improvement was due to differences in severity of illness between the 2 groups rather than to the intervention.
None of the studies included in this review explicitly reported on the potential harms (eg, excess antimicrobial use or alert fatigue) after implementation of sepsis alerts, but Hooper et al. found a nonsignificant increase in mortality, and Sawyer et al. showed a nonsignificant increase in length of stay, in the intervention group compared to the control group.[10, 19] Berger et al. showed an overall increase in the number of lactate tests performed, but with a decrease in the proportion of abnormal lactate values (21.9% vs 14.8%, absolute decrease of 7.6%, 95% confidence interval, −15.8% to −0.6%), suggesting potential overtesting in patients at low risk for septic shock. In the study by Hooper et al., 88% (442/502) of the patients in the medical intensive care unit triggered an alert, raising concern for alert fatigue.[10] Furthermore, 3 studies did not perform intention‐to‐treat analyses; rather, they included only patients who triggered the alert and also had provider‐suspected or confirmed sepsis,[14, 17] or had a discharge diagnosis of sepsis.[18]
DISCUSSION
Sepsis alert systems derived from electronic health data and targeting hospitalized patients improve a subset of sepsis process‐of‐care measures, but at the cost of poor positive predictive value and with no clear improvement in mortality or length of stay. There is insufficient evidence for the effectiveness of automated electronic sepsis alert systems in the emergency department.
We found considerable variability in the diagnostic accuracy of automated electronic sepsis alert systems. There was moderate evidence that alert systems designed to identify severe sepsis (eg, SIRS criteria plus measures of shock) had greater diagnostic accuracy than alert systems that detected sepsis based on SIRS criteria alone. Given that SIRS criteria are highly prevalent among hospitalized patients with noninfectious diseases,[20] sepsis alert systems triggered by standard SIRS criteria may have poorer predictive value, with an increased risk of alert fatigue: excessive electronic warnings that lead physicians to disregard clinically useful alerts.[21] The potential for alert fatigue is even greater in critical care settings. A retrospective analysis of physiological alarms in the ICU estimated an average of 6 alarms per hour, with only 15% of alarms considered clinically relevant.[22]
The fact that sepsis alert systems improve intermediate process measures among ward and ED patients but not ICU patients likely reflects differences in both the patients and the clinical settings.[23] First, patients in the ICU may already be prescribed broad spectrum antibiotics, aggressively fluid resuscitated, and have other diagnostic testing performed before the activation of a sepsis alert, so it would be less likely to see an improvement in the rates of process measures assessing initiation or escalation of therapy compared to patients treated on the wards or in the ED. The apparent lack of benefit of these systems in the ICU may merely represent a ceiling effect. Second, nurses and physicians are already vigilantly monitoring patients in the ICU for signs of clinical deterioration, so additional alert systems may be redundant. Third, patients in the ICU are connected to standard bedside monitors that continuously monitor for the presence of abnormal vital signs. An additional sepsis alert system triggered by SIRS criteria alone may be superfluous to the existing infrastructure. Fourth, the majority of patients in the ICU will trigger the sepsis alert system,[10] so there likely is a high noise‐to‐signal ratio with resultant alert fatigue.[21]
In addition to the need for alert systems of greater diagnostic accuracy and effectiveness, our review notes several important gaps that limit the evidence supporting the usefulness of automated sepsis alert systems. First, there are few data describing the optimal design of sepsis alerts[24, 25] or the frequency with which they are appropriately acted upon or dismissed. In addition, we found little data to indicate whether the effectiveness of alert systems differed based on whether clinical decision support was included with the alert itself (eg, direct prompting with specific clinical management recommendations) or on the configuration of the alert (eg, interruptive or informational).[24, 25] Most of the studies we reviewed employed alerts primarily targeting physicians; we found little evidence for systems that also alerted other providers (eg, nurses or rapid response teams). Few studies provided data on the harms of these systems (eg, excess antimicrobial use, fluid overload due to aggressive fluid resuscitation) or on how often these treatments were administered to patients who did not ultimately have sepsis. Few studies employed designs that limited bias (eg, randomized or quasiexperimental designs) or used an intention‐to‐treat approach. Studies that exclude false positive alerts from analyses could bias estimates toward making sepsis alert systems appear more effective than they actually were. Finally, although deploying automated sepsis alerts in the ED would presumably facilitate more timely recognition and treatment, more rigorously conducted studies are needed to determine whether using these alerts in the ED is of greater value than using them on the wards or in the ICU. Given the limited number of studies included in this review, we were unable to draw strong conclusions regarding the clinical benefits and cost‐effectiveness of implementing automated sepsis alerts.
Our review has certain limitations. First, despite our extensive literature search strategy, we may have missed studies published in the grey literature or in non‐English languages. Second, there is potential publication bias given the number of abstracts that we identified addressing 1 of our prespecified research questions compared to the number of peer‐reviewed publications identified by our search strategy.
CONCLUSION
Automated electronic sepsis alert systems have promise in delivering early goal‐directed therapies to patients. However, at present, automated sepsis alerts derived from electronic health data may improve care processes but tend to have poor PPV and have not been shown to improve mortality or length of stay. Future efforts should develop and study methods for sepsis alert systems that avoid the potential for alert fatigue while improving outcomes.
Acknowledgements
The authors thank Gloria Won, MLIS, for her assistance with developing and performing the literature search strategy and wish her a long and joyous retirement.
Disclosures: Part of Dr. Makam's work on this project was completed while he was a primary care research fellow at the University of California, San Francisco, funded by a National Research Service Award (training grant T32HP19025‐07‐00). Dr. Makam is currently supported by the National Center for Advancing Translational Sciences of the National Institutes of Health (KL2TR001103). Dr. Nguyen was supported by the Agency for Healthcare Research and Quality (R24HS022428‐01). Dr. Auerbach was supported by an NHLBI K24 grant (K24HL098372). Dr. Makam had full access to the data in the study and takes responsibility for the integrity of the date and accuracy of the data analysis. Study concept and design: all authors. Acquisition of data: Makam and Nguyen. Analysis and interpretation of data: all authors. Drafting of the manuscript: Makam. Critical revision of the manuscript: all authors. Statistical analysis: Makam and Nguyen. The authors have no conflicts of interest to disclose.
1. National inpatient hospital costs: the most expensive conditions by payer, 2011: statistical brief #160. Healthcare Cost and Utilization Project (HCUP) Statistical Briefs. Rockville, MD: Agency for Healthcare Research and Quality; 2013.
2. Inpatient care for septicemia or sepsis: a challenge for patients and hospitals. NCHS Data Brief. 2011;(62):1–8.
3. The epidemiology of sepsis in the United States from 1979 through 2000. N Engl J Med. 2003;348(16):1546–1554.
4. Surviving sepsis campaign: international guidelines for management of severe sepsis and septic shock: 2012. Crit Care Med. 2013;41(2):580–637.
5. Early goal‐directed therapy in the treatment of severe sepsis and septic shock. N Engl J Med. 2001;345(19):1368–1377.
6. A randomized trial of protocol‐based care for early septic shock. N Engl J Med. 2014;370(18):1683–1693.
7. Implementation of early goal‐directed therapy for septic patients in the emergency department: a review of the literature. J Emerg Nurs. 2013;39(1):13–19.
8. Factors influencing variability in compliance rates and clinical outcomes among three different severe sepsis bundles. Ann Pharmacother. 2007;41(6):929–936.
9. Improvement in process of care and outcome after a multicenter severe sepsis educational program in Spain. JAMA. 2008;299(19):2294–2303.
10. Randomized trial of automated, electronic monitoring to facilitate early detection of sepsis in the intensive care unit. Crit Care Med. 2012;40(7):2096–2101.
11. QUADAS‐2: a revised tool for the quality assessment of diagnostic accuracy studies. Ann Intern Med. 2011;155(8):529–536.
12. AHRQ series paper 5: grading the strength of a body of evidence when comparing medical interventions. Agency for Healthcare Research and Quality and the Effective Health‐Care Program. J Clin Epidemiol. 2010;63(5):513–523.
13. Real‐time identification of serious infection in geriatric patients using clinical information system surveillance. J Am Geriatr Soc. 2009;57(1):40–45.
14. Prospective trial of real‐time electronic surveillance to expedite early care of severe sepsis. Ann Emerg Med. 2011;57(5):500–504.
15. Automated electronic medical record sepsis detection in the emergency department. PeerJ. 2014;2:e343.
16. Early prediction of septic shock in hospitalized patients. J Hosp Med. 2010;5(1):19–25.
17. A computerized alert screening for severe sepsis in emergency department patients increases lactate testing but does not improve inpatient mortality. Appl Clin Inform. 2010;1(4):394–407.
18. The impact of an electronic medical record surveillance program on outcomes for patients with sepsis. Heart Lung. 2014;43(6):546–549.
19. Implementation of a real‐time computerized sepsis alert in nonintensive care unit patients. Crit Care Med. 2011;39(3):469–473.
20. The epidemiology of the systemic inflammatory response. Intensive Care Med. 2000;26(suppl 1):S64–S74.
21. Overrides of medication‐related clinical decision support alerts in outpatients. J Am Med Inform Assoc. 2014;21(3):487–491.
22. Intensive care unit alarms: how many do we need? Crit Care Med. 2010;38(2):451–456.
23. How can we best use electronic data to find and treat the critically ill? Crit Care Med. 2012;40(7):2242–2243.
24. Identifying best practices for clinical decision support and knowledge management in the field. Stud Health Technol Inform. 2010;160(pt 2):806–810.
25. Best practices in clinical decision support: the case of preventive care reminders. Appl Clin Inform. 2010;1(3):331–345.
- , . National inpatient hospital costs: the most expensive conditions by payer, 2011: statistical brief #160. Healthcare Cost and Utilization Project (HCUP) Statistical Briefs. Rockville, MD: Agency for Healthcare Research and Quality; 2013.
- , , , . Inpatient care for septicemia or sepsis: a challenge for patients and hospitals. NCHS Data Brief. 2011;(62):1–8.
- , , , . The epidemiology of sepsis in the United States from 1979 through 2000. N Engl J Med. 2003;348(16):1546–1554.
- , , , et al. Surviving sepsis campaign: international guidelines for management of severe sepsis and septic shock: 2012. Crit Care Med. 2013;41(2):580–637.
- , , , et al. Early goal‐directed therapy in the treatment of severe sepsis and septic shock. N Engl J Med. 2001;345(19):1368–1377.
- , , , et al. A randomized trial of protocol‐based care for early septic shock. N Engl J Med. 2014;370(18):1683–1693.
- , . Implementation of early goal‐directed therapy for septic patients in the emergency department: a review of the literature. J Emerg Nurs. 2013;39(1):13–19.
- , , , , , . Factors influencing variability in compliance rates and clinical outcomes among three different severe sepsis bundles. Ann Pharmacother. 2007;41(6):929–936.
- , , , et al. Improvement in process of care and outcome after a multicenter severe sepsis educational program in Spain. JAMA. 2008;299(19):2294–2303.
- , , , et al. Randomized trial of automated, electronic monitoring to facilitate early detection of sepsis in the intensive care unit*. Crit Care Med. 2012;40(7):2096–2101.
- , , , et al. QUADAS‐2: a revised tool for the quality assessment of diagnostic accuracy studies. Ann Intern Med. 2011;155(8):529–536.
- , , , et al. AHRQ series paper 5: grading the strength of a body of evidence when comparing medical interventions—agency for healthcare research and quality and the effective health‐care program. J Clin Epidemiol. 2010;63(5):513–523.
Navigating Venous Access
Reliable venous access is fundamental to the safe and effective care of hospitalized patients. Venous access devices (VADs) are the conduits for this purpose, enabling delivery of intravenous medications, accurate measurement of central venous pressure, and administration of life‐saving blood products. Despite this important role, VADs are also often the source of hospital‐acquired complications. Although inpatient providers must balance the relative risks of VADs against their benefits, the evidence supporting such decisions is often limited. Advances in technology, scattered research, and the growing availability of novel devices have only further fragmented provider knowledge in the field of vascular access.[1]
It is not surprising, then, that survey‐based studies of hospitalists reveal important knowledge gaps with regard to practices associated with VADs.[2] In this narrative review, we seek to bridge this gap by providing a concise and pragmatic overview of the fundamentals of venous access. We focus specifically on parameters that influence decisions regarding VAD placement in hospitalized patients, providing key takeaways for practicing hospitalists.
METHODS
To compile this review, we systematically searched Medline (via Ovid) using several keywords, including: peripheral intravenous catheters, ultrasound‐guided peripheral catheter, intraosseous, midline, peripherally inserted central catheter, central venous catheters, and vascular access device complications. We concentrated on full‐length articles in English; no date restrictions were placed on the search. We reviewed guidelines and consensus statements (eg, from the Centers for Disease Control and Prevention [CDC] or Choosing Wisely) as appropriate. Additional studies of interest were identified through content experts (M.P., C.M.R.) and the bibliographies of included studies.
SCIENTIFIC PRINCIPLES UNDERPINNING VENOUS ACCESS
It is useful to begin by reviewing VAD‐related nomenclature and physiology. In the simplest sense, a VAD consists of a hub (providing access to various connectors), a hollow tube divided into 1 or more channels (lumens), and a tip that may terminate within a central or peripheral blood vessel. VADs are classified as central venous catheters (eg, centrally inserted central catheters [CICCs] or peripherally inserted central catheters [PICCs]) or peripheral intravenous catheters (eg, midlines or peripheral intravenous catheters) based on site of entry and location of the catheter tip. Thus, VADs entering via proximal or distal veins of the arm are often referred to as peripheral lines, as their site of entry and tip both reside within peripheral veins. Conversely, the term central line is used when VADs enter or terminate in a central vein (eg, subclavian vein insertion with the catheter tip in the lower superior vena cava).
Attention to a host of clinical and theoretical parameters is important when choosing a device for venous access. Some such parameters are summarized in Table 1.
| Parameter | Major Considerations |
|---|---|
| Desired flow rate | Smaller diameter veins susceptible to damage with high flow rates. |
| | Short, large‐bore catheters facilitate rapid infusion. |
| Nature of infusion | pH, viscosity, and temperature may damage vessels. |
| | Vesicants and irritants should always be administered into larger, central veins. |
| Desired duration of vascular access (dwell time) | Risk of vessel thrombosis or phlebitis increases over time with catheter in place. |
| | Intermittent infusions increase complications in central catheters; tunneled catheters are often recommended. |
| Urgency of placement | Access to large‐caliber vessels is often needed in emergencies. |
| | Critically ill or hemodynamically unstable patients may require urgent access for invasive monitoring or rapid infusions. |
| | Patients with trauma often require large volumes of blood products and reliable access to central veins. |
| Number of device lumens | VADs may have single or multiple lumens. |
| | Multiple lumens allow for multiple functions (eg, infusion of multiple agents, measurement of central venous pressures, blood draws). |
| Device gauge | In general, use of a smaller‐gauge catheter is preferred to prevent complications. |
| | However, larger catheter diameter may be needed for specific clinical needs (eg, blood transfusion). |
| Device coating | VADs may have antithrombotic or anti‐infective coatings. |
| | These devices may be of value in patients at high risk of complications. |
| | Such devices, however, may be more costly than their counterparts. |
| Self‐care compatibility | VADs that can be cared for by patients are ideal for outpatient care. |
| | Conversely, VADs such as peripheral catheters are highly prone to dislodgement and should be reserved for supervised settings only. |
VENOUS ACCESS DEVICES
We will organize our discussion of VADs based on whether they terminate in peripheral or central vessels. These anatomical considerations are relevant, as they determine the physical characteristics, compatibility with particular infusates, dwell time, and risk of complications for each VAD; common complications are summarized in Table 2.
| Complication | Major Considerations |
|---|---|
| Infection | VADs breach the integrity of the skin and permit skin pathogens to enter the bloodstream (extraluminal infection). |
| | Inadequate antisepsis of the VAD hub, including poor hand hygiene, failure to "scrub the hub," and multiple manipulations, may also increase the risk of VAD‐related infection (endoluminal infection). |
| | Infections may be local (eg, exit‐site infections) or may spread hematogenously (eg, CLABSI). |
| | Type of VAD, duration of therapy, and host characteristics interact to influence infection risk. |
| | VADs with antiseptic coatings (eg, chlorhexidine) or antibiotic coatings (eg, minocycline) may reduce risk of infection in high‐risk patients. |
| | Antiseptic‐impregnated dressings may reduce risk of extraluminal infection. |
| Venous thrombosis | VADs predispose to venous stasis and thrombosis. |
| | Duration of VAD use, type and care of the VAD, and patient characteristics affect risk of thromboembolism. |
| | VAD tip position is a key determinant of venous thrombosis; central VADs that do not terminate at the cavoatrial junction should be repositioned to reduce the risk of thrombosis. |
| | Antithrombotic‐coated or ‐eluting devices may reduce risk of thrombosis, though definitive data are lacking. |
| Phlebitis | Inflammation caused by damage to the tunica media.[18] |
| | Three types of phlebitis: chemical (irritation of the media by the infusate), mechanical (the VAD physically damages the vessel), and infective (bacteria invade the vein and inflame the vessel wall). |
| | Phlebitis may be limited by close attention to infusate compatibility with peripheral veins, appropriate dilution, and prompt removal of catheters that show signs of inflammation. |
| | Phlebitis may be prevented in PICCs by ensuring at least a 2:1 vein:catheter ratio. |
| Extravasation | Extravasation (also called infiltration) is defined as leakage of infusate from the intravascular to the extravascular space. |
| | Extravasation of vesicants/irritants is particularly worrisome and may result in severe tissue injury, blistering, and tissue necrosis.[11] |
| | VADs should be checked frequently for adequate flushing and position prior to each infusion to minimize risk. |
| | Any VAD with redness, swelling, or tenderness at the entry site, or problems with flushing, should not be used without further examination and review of position. |
Peripheral Venous Access
Short Peripheral Intravenous Catheter
Approximately 200 million peripheral intravenous catheters (PIVs) are placed annually in the United States, making them the most common intravenous catheter.[3] PIVs are short devices, 3 to 6 cm in length, that enter and terminate in peripheral veins (Figure 1A). Placement is recommended in forearm veins rather than those of the hand, wrist, or upper arm, as forearm sites are less prone to occlusion, accidental removal, and phlebitis.[4] Additionally, placement in hand veins impedes activities of daily living (eg, hand washing) and is not preferred by patients.[5] PIV size ranges from 24 gauge (smallest) to 14 gauge (largest); larger catheters are often reserved for fluid resuscitation or blood transfusion as they accommodate greater flow and limit hemolysis. To decrease risk of phlebitis and thrombosis, the shortest catheter and smallest diameter should be used. However, unless adequately secured, smaller diameter catheters are also associated with greater rates of accidental removal.[4, 5]
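The preference for short, large‐bore catheters when rapid infusion is needed follows from the Hagen–Poiseuille relation, in which flow scales with the fourth power of the catheter's inner radius and inversely with its length. A minimal sketch of this scaling (the radii below are rough illustrative values, not device specifications):

```python
# Illustrative only: relative flow through a catheter per the Hagen-Poiseuille
# relation, Q proportional to r^4 / L at fixed pressure and viscosity.

def relative_flow(radius_mm: float, length_cm: float) -> float:
    """Flow index proportional to r^4 / L (constant pressure and viscosity)."""
    return radius_mm ** 4 / length_cm

# Approximate inner radii in mm (assumptions for illustration, not specs).
flows = {
    "14G x 4.5 cm": relative_flow(0.80, 4.5),
    "18G x 4.5 cm": relative_flow(0.42, 4.5),
    "20G x 4.5 cm": relative_flow(0.30, 4.5),
}

# Radius enters at the fourth power, so bore dominates: a 14G catheter far
# outpaces a 20G catheter of identical length.
assert flows["14G x 4.5 cm"] > flows["18G x 4.5 cm"] > flows["20G x 4.5 cm"]
print(round(flows["14G x 4.5 cm"] / flows["20G x 4.5 cm"], 1))  # → 50.6
```

This fourth‐power dependence is why gauge, more than length, governs whether a catheter is suitable for fluid resuscitation or blood transfusion.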

By definition, PIVs are short‐term devices. The CDC currently recommends removal and replacement of these devices no more frequently than every 72 to 96 hours in adults. However, a recent randomized controlled trial found that replacing PIVs when clinically indicated (eg, device failure, phlebitis) rather than on a routine schedule added 30 hours to their lifespan without an increase in complications.[6] A systematic review by the Cochrane Collaboration echoes these findings.[3] These data have thus been incorporated into recommendations from the Infusion Nurses Society (INS) and the National Health Service in the United Kingdom.[5, 7] In hospitalized patients, this approach is relevant, as it preserves venous access sites, maximizes device dwell, and limits additional PIV insertions. In turn, these differences may reduce the need for invasive VADs such as PICCs. Furthermore, the projected 5‐year savings from implementation of clinically indicated PIV removal policies is US$300 million and 1 million health‐worker hours in the United States alone.[4]
PIVs offer many advantages. First, they are minimally invasive and require little training to insert. Second, they can be used for diverse indications in patients requiring short‐term (<1 week) venous access. Third, PIVs do not require imaging to ensure correct placement; palpation of superficial veins is sufficient. Fourth, PIVs exhibit a risk of bloodstream infection that is about 40‐fold lower than that of more invasive, longer‐dwelling VADs (0.06 episodes of bacteremia per 1000 catheter‐days).[8]
Despite these advantages, PIVs also have important drawbacks. First, a quarter of all PIVs fail through occlusion or accidental dislodgement.[4] Infiltration, extravasation, and hematoma formation are important adverse events that may occur in such cases. Second, thrombophlebitis (pain and redness at the insertion site) is frequent and may require device removal, especially in patients with catheters ≥20 gauge.[9] Third, despite their relative safety, PIVs can cause localized or hematogenous infection. Septic thrombophlebitis (superficial thrombosis with bloodstream infection) and catheter‐related bloodstream infection, though rare, have been reported with PIVs and may lead to serious complications.[8, 10] In fact, some suggest that the overall burden of bloodstream infection posed by PIVs may be similar to that of CICCs, given the substantially greater number of devices used and greater number of device‐days.[8]
PIVs and other peripheral VADs are not suitable for infusion of vesicants or irritants, which require larger, central veins for delivery. Vesicants (drugs that cause blistering on infusion) include chemotherapeutic agents (eg, dactinomycin, paclitaxel) and commonly used nonchemotherapeutic agents (eg, diazepam, piperacillin, vancomycin, esmolol, and total parenteral nutrition [TPN]).[11] Irritants (phlebitogenic drugs) cause short‐term inflammation and pain and thus should not be infused peripherally for prolonged durations. Common irritants in the hospital setting include acyclovir, dobutamine, penicillin, and potassium chloride.
Of note, about one‐quarter of PIV insertions fail owing to difficult intravenous access.[12] Ultrasound‐guided peripheral intravenous (USGPIV) catheter placement is emerging as a technique to provide peripheral access for such patients to avoid placement of central venous access devices. Novel, longer devices (>8 cm) with built‐in guide wires have been developed to increase placement success of USGPIVs. These new designs provide easier access into deeper arm veins (brachial or basilic) not otherwise accessible by short PIVs. Although studies comparing the efficacy of USGPIV devices to other VADs are limited, a recent systematic review showed that time to successful cannulation was shorter, and fewer attempts were required to place USGPIVs compared to PIVs.[13] A recent study in France found that USGPIVs met the infusion needs of patients with difficult veins with minimal increase in complications.[14] Despite these encouraging data, future studies are needed to better evaluate this technology.
Midline Catheter
A midline is a VAD between 7.5 and 25 cm in length that is typically inserted into veins above the antecubital fossa. The catheter tip resides in a peripheral upper arm vein, often the basilic or cephalic vein, terminating just short of the subclavian vein (Figure 1B). Midline‐like devices were first developed in the 1950s and were initially used as an alternative to PIVs because they were thought to allow longer dwell times.[15] However, because they were originally constructed of a fairly rigid material, infiltration, mechanical phlebitis, and inflammation were common and tempered enthusiasm for their use.[15, 16] Newer midline devices obviate many of these problems and are inserted under ultrasound guidance using a modified Seldinger technique.[17] Despite these advances, data regarding comparative efficacy remain limited.
Midlines offer longer dwell times than standard PIVs owing to termination in the larger‐diameter basilic and brachial veins of the arm. Additionally, owing to their length, midlines are less prone to dislodgement. Because they are inserted with greater antisepsis than PIVs and better secured to the skin, they are also more durable.[5, 9, 18] Current INS standards recommend use of midlines for 1 to 4 weeks.[5] Because they terminate in a peripheral vein, medications and infusions compatible with midlines are identical to those that may be infused through a PIV. Thus, TPN, vesicants, irritants, and drugs with a pH <5 or >9 or an osmolarity >500 mOsm/L should not be infused through a midline.[15] New evidence suggests that diluted solutions of vancomycin (usually pH <5) may be safe to infuse for short durations (<6 days) through a midline, and that concentration rather than pH may be more important in this regard.[19] Although midline use may eventually extend to agents not typically deemed compatible with peripheral access, limited evidence exists to support such a strategy at this time.
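The peripheral‐compatibility cutoffs above can be summarized in a short helper. This is a hypothetical illustration of the cited thresholds (pH 5 to 9, osmolarity ≤500 mOsm/L, no vesicants or irritants), not a clinical decision tool, and the function name is an assumption:

```python
# Hypothetical sketch encoding the peripheral-compatibility thresholds cited
# in the text; boundary handling is an assumption, and this is not a
# validated clinical tool.

def peripherally_compatible(ph: float, osmolarity_mosm: float,
                            is_vesicant_or_irritant: bool = False) -> bool:
    """Return True if an infusate meets the cited peripheral-vein criteria."""
    if is_vesicant_or_irritant:
        # Vesicants and irritants require central delivery regardless of pH.
        return False
    return 5 <= ph <= 9 and osmolarity_mosm <= 500

print(peripherally_compatible(ph=7.0, osmolarity_mosm=300))  # → True
print(peripherally_compatible(ph=4.0, osmolarity_mosm=300))  # → False (too acidic)
```

The same check applies to PIVs, since midlines and PIVs share identical infusate restrictions.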
Midlines offer several advantages. First, because blood flow is greater in the more proximal veins of the arm, midlines can accommodate infusions at rates of 100 to 150 mL/min, compared with 20 to 40 mL/min in smaller peripheral veins. Higher flow rates offer greater hemodilution (dilution of the infusate with blood), decreasing the likelihood of phlebitis and infiltration.[20] Second, midlines do not require x‐ray verification of tip placement; thus, their use is often favored in resource‐limited settings such as skilled nursing facilities. Third, midlines offer longer dwell times than peripheral intravenous catheters and can thus serve as bridge devices for short‐term intravenous antibiotics or peripheral‐compatible infusions in an outpatient setting. Available evidence suggests that midlines are associated with low rates of bloodstream infection (0.3 to 0.8 per 1000 catheter‐days).[17] The most frequent complications include phlebitis (4.2%) and occlusion (3.3%).[20] Given these favorable statistics, midlines may offer a good alternative to PIVs in select patients who require peripheral infusions of intermediate duration.
Intraosseous Vascular Access
Intraosseous (IO) devices access the vascular system by piercing cortical bone. These devices provide access to the intramedullary cavity and venous plexuses of long bones such as the tibia, femur, or humerus. Several insertion devices are now commercially available and have enhanced the ease and safety of IO placement. Using these newer devices, IO access may be obtained in 1 to 2 minutes with minimal training. By comparison, a central venous catheter often requires 10 to 15 minutes to insert, with substantial training required of providers.[21, 22, 23]
IO devices thus offer several advantages. First, given the rapidity with which they can be inserted, they are often preferred in emergency settings (eg, trauma). Second, these devices are versatile and can accommodate both central and peripheral infusates.[24] Third, a recent meta‐analysis found that IOs have a low complication rate of 0.8%, with extravasation of infusate through the cortical entry site being the most common adverse event.[21] Of note, this study also reported zero local or distal infectious complications, a finding that may relate to the shorter dwell of these devices.[21] Some animal studies suggest that fat embolism from bone may occur at high rates with IO VADs.[25] However, death or significant morbidity from fat emboli in humans following IO access has not been described. Whether such emboli occur or are clinically significant in the context of IO devices remains unclear at this time.[21]
Central Venous Access Devices
Central venous access devices (CVADs) share tip termination at the cavoatrial junction, either in the lower portion of the superior vena cava or in the upper portion of the right atrium. CVADs can be safely used for irritant or vesicant medications as well as for blood withdrawal, blood exchange procedures (eg, dialysis), and hemodynamic monitoring. Traditionally, these devices are 15 to 25 cm in length and are directly inserted into the deep veins of the supra‐ or infraclavicular area, including the internal jugular, brachiocephalic, subclavian, or axillary veins. PICCs are unique CVADs in that they enter through peripheral veins but terminate in the proximity of the cavoatrial junction. Regarding nomenclature, CICC will be used to denote devices that enter directly into veins of the neck or chest, whereas PICC will be used for devices that are inserted peripherally but terminate centrally.
Peripherally Inserted Central Catheter
PICCs are inserted into peripheral veins of the upper arm (eg, the brachial, basilic, or cephalic vein) and advanced such that the tip resides at the cavoatrial junction (Figure 1C). PICCs offer prolonged dwell times and are thus indicated when patients require venous access for weeks or months.[26] Additionally, they can accommodate a variety of infusates and are safer to insert than CICCs, given placement in peripheral veins of the arm rather than central veins of the chest or neck. Thus, insertion complications such as pneumothorax, hemothorax, or significant bleeding are rare with PICCs. In fact, a recent study reported that PICC insertion by hospitalists was associated with low rates of insertion and infectious complications.[27]
However, like CICCs, PICCs are associated with central line–associated bloodstream infection (CLABSI), a serious complication known to prolong length of hospital stay, increase costs, and carry a 12% to 25% associated mortality.[28, 29] In the United States alone, over 250,000 CLABSI cases occur per year, drawing considerable attention from the CDC and The Joint Commission, which now mandate reporting and nonpayment for hospital‐acquired CLABSI.[30, 31, 32] A recent systematic review and meta‐analysis found that PICCs are associated with a substantial risk of CLABSI in hospitalized patients.[33] Importantly, no difference in CLABSI rates between PICCs and CICCs in hospitalized patients was evident in this meta‐analysis. Therefore, current guidelines specifically recommend against use of PICCs over CICCs as a strategy to reduce CLABSI.[34] Additionally, PICCs are associated with a 2.5‐fold greater risk of deep vein thrombosis (DVT) compared to CICCs; thus, they should be used with caution in patients with cancer or those with underlying hypercoagulable states.
Of particular import to hospitalists is the fact that PICC placement is contraindicated in patients with stage IIIb or greater chronic kidney disease (CKD). In such patients, sequelae of PICC use, such as phlebitis or central vein stenosis, can be devastating.[35] In a recent study, prior PICC placement was the strongest predictor of subsequent arteriovenous graft failure.[36] For this reason, Choosing Wisely recommendations call for avoidance of PICCs in such patients.[37]
Centrally Inserted Central Catheter
CICCs are CVADs placed by puncture and cannulation of the internal jugular, subclavian, brachiocephalic, or femoral veins (Figure 1D) and compose the vast majority of VADs placed in ICU settings.[38, 39] Central termination of CICCs allows for a variety of infusions, including irritants, vesicants, and vasopressors, as well as blood withdrawal and hemodynamic monitoring. CICCs are typically used for 7 to 14 days but may remain for longer durations if they stay complication free and clinically necessary.[40] A key advantage of CICCs is that they can be placed in emergent settings to facilitate quick access for rapid infusion or hemodynamic monitoring. In particular, CICCs inserted in the femoral vein may be useful in emergency settings; however, owing to risk of infection and inability to monitor central pressures, femoral devices should be replaced with a conventional CICC or a PICC when possible. Importantly, although CICCs are almost exclusively used in intensive or emergency care, PICCs may also be considered in such settings.[41, 42] CICCs usually have multiple lumens and often serve several simultaneous functions, such as infusion and hemodynamic monitoring.
Despite their benefits, CICCs have several disadvantages. First, insertion requires an experienced clinician and has historically been a task limited to physicians. However, this is changing rapidly (especially in Europe and Australia), where specially trained nurses are assuming responsibility for CICC placement.[43] Second, these devices have historically been more likely to be associated with CLABSI, with estimates of infection rates varying between 2 and 5 infections per 1000 catheter‐days.[44] Third, CICCs pose a significant DVT risk, with rates around 22 DVTs per 1000 catheter‐days.[45] However, compared to PICCs, the DVT risk appears lower, and CICC use may be preferable in patients at high risk of DVT, such as critically ill or cancer populations.[46] Importantly, use of ultrasound guidance during CICC insertion helps prevent insertion complications and has been associated with decreased accidental arterial puncture and hematoma formation. The role of ultrasound guidance with PICCs, as well as its implications for thrombotic and infectious events, remains less well characterized at this point.[47]
Tunneled Central Venous Access Devices
Tunneled devices (either CICCs or PICCs) are characterized by the fact that the insertion site on the skin and the site of ultimate venipuncture are physically separated (Figure 1E). Tunneling limits bacterial entry from the extraluminal aspect of the CVAD into the bloodstream. For example, the internal jugular vein is often an ideal site of venipuncture but a poor exit site, as providing care to this area is challenging and may increase risk of infection.[34] Tunneling to the infraclavicular area provides a better option, as it creates an exit site that can be adequately cared for. Importantly, any CVAD (PICC or CICC) can be tunneled. Additionally, tunneled CICCs may be used in patients with chronic or impending renal failure, in whom PICCs are contraindicated because entry into dialysis‐relevant vessels is to be avoided.[48] Such devices also allow regular blood sampling in patients who require frequent testing but have limited peripheral access, such as those with hematological malignancies. Additionally, tunneled catheters are more comfortable for patients and viewed as more socially acceptable than nontunneled devices. However, the more invasive and permanent nature of these devices often requires deliberation prior to insertion.
Of note, tunneled devices and ports may be used as long‐term (>3 months to years) VADs. As our focus in this review is short‐term devices, we will not expand the discussion of these devices as they are almost always used for prolonged durations.[7]
OPERATIONALIZING THE DATA: AN ALGORITHMIC APPROACH TO VENOUS ACCESS
Hospitalists should consider approaching venous access using an algorithm based on a number of parameters. For example, a critically ill patient who requires vasopressor support and hemodynamic monitoring will need a CICC or a PICC. Given the potentially greater risk of thrombosis from PICCs, a CICC is preferable for critically ill patients provided an experienced inserter is available. Conversely, patients who require short‐term (<7–10 days) venous access for infusion of nonirritant or nonvesicant therapy often only require a PIV. In patients with poor or difficult venous access, USGPIVs or midlines may be ideal and preferred over short PIVs. Finally, patients who require longer‐term or home‐based treatment may benefit from early placement of a midline or a PICC, depending again on the nature of the infusion, duration of treatment, and available venous access sites.
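The selection logic above can be sketched as a simple decision function. This is an illustrative simplification only; the function name, categories, and duration thresholds are assumptions, not a validated clinical algorithm:

```python
# Toy decision function mirroring the narrative logic above. Thresholds and
# return labels are illustrative assumptions, not clinical guidance.

def suggest_vad(critically_ill: bool,
                duration_days: int,
                peripheral_compatible_infusate: bool,
                difficult_access: bool = False) -> str:
    """Suggest a venous access device category per the narrative logic."""
    if critically_ill:
        # CICC preferred over PICC (lower DVT risk) when an experienced
        # inserter is available.
        return "CICC (or PICC if no experienced inserter)"
    if duration_days <= 10 and peripheral_compatible_infusate:
        # Short-term, peripheral-compatible therapy.
        return "USGPIV or midline" if difficult_access else "PIV"
    if peripheral_compatible_infusate and duration_days <= 28:
        # Intermediate duration, still peripheral-compatible.
        return "Midline"
    # Longer courses or infusates requiring central termination.
    return "PICC"

print(suggest_vad(critically_ill=False, duration_days=3,
                  peripheral_compatible_infusate=True))  # → PIV
```

In practice, the choice also depends on available venous sites, operator expertise, and contraindications (eg, CKD stage IIIb or greater), which a one‐dimensional function cannot capture.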
An algorithmic approach considering these parameters is suggested in Figure 2, and a brief overview of the devices and their considerations is shown in Table 3.
| Vascular Access Device | Central/Peripheral | Anatomical Location of Placement | Desired Duration of Placement | Common Uses | BSI Risk (Per 1,000 Catheter‐Days) | Thrombosis Risk | Important Considerations |
|---|---|---|---|---|---|---|---|
| Short peripheral IV | Peripheral | Peripheral veins, usually forearm | 7–10 days | Fluid resuscitation, most medications, blood products | 0.06[8] | Virtually no risk | Consider necessity of PIV daily and remove unnecessary devices |
| Midline | Peripheral | Inserted around the antecubital fossa; resides within basilic or cephalic vein of the arm | 2–4 weeks | Long‐term medications excluding TPN, vesicants, corrosives | 0.3–0.8[17] | Insufficient data | Can be used as a bridge device for patients to complete short‐term antibiotics/infusions as an outpatient |
| Peripherally inserted central catheter | Central | Inserted into a peripheral arm vein and advanced through progressively larger veins (eg, axillary, subclavian) to the CAJ | >1 week, <3 months | Large variety of infusates, including TPN, vesicants, corrosives | 2.4[44] | 6.3% | Contraindicated in patients with CKD stage IIIb or higher |
| Centrally inserted central catheter | Central | Inserted above (internal jugular vein, brachiocephalic vein, subclavian vein) or below the clavicle (axillary vein) | >1 week, <3 months | Same infusate variety as PICC; measurement of central venous pressures; common in trauma/emergent settings | 2.3[44] | 1.3% | Given lower rates of DVT than PICC, preferred in ICU settings and hypercoagulable patients |
| Tunneled CICC | Central | Placed percutaneously in any large vein in the arm, chest, neck, or groin | >3 months to years | Central infusates, as with any CVAD; used for patients with CKD stage IIIb or greater in whom a PICC is contraindicated | Insufficient data | Insufficient data | Preferred when the ideal venipuncture site is a poor exit site or would increase risk of infection |

CONCLUSIONS
With strides in technology and progress in medicine, hospitalists have access to an array of options for venous access. However, every VAD has limitations that can be easily overlooked in a perfunctory decision‐making process. The data presented in this review thus provide a first step to improving safety in this evolving science. Studies that further determine appropriateness of VADs in hospitalized settings are necessary. Only through such progressive scientific enquiry will complication‐free venous access be realized.
Disclosure
Nothing to report.
- , . The need for comparative data in vascular access: the rationale and design of the PICC registry. J Vasc Access. 2013; 18(4): 219–224.
- , , , et al. Hospitalist experiences, practice, opinions, and knowledge regarding peripherally inserted central catheters: a Michigan survey. J Hosp Med. 2013; 8(6): 309–314.
- , , , . Clinically‐indicated replacement versus routine replacement of peripheral venous catheters. Cochrane Database Syst Rev. 2013; 4: CD007798.
- , , , et al. Cost‐effectiveness analysis of clinically indicated versus routine replacement of peripheral intravenous catheters. Appl Health Econ Health Policy. 2014; 12(1): 51–58.
- Infusion Nurses Society. Infusion Nursing Standards of Practice. Norwood, MA; Infusion Nurses Society; 2011.
- , , , et al. Routine versus clinically indicated replacement of peripheral intravenous catheters: a randomised controlled equivalence trial. Lancet. 2012; 380(9847): 1066–1074.
- , , , et al. epic3: national evidence‐based guidelines for preventing healthcare‐associated infections in NHS hospitals in England. J Hosp Infect. 2014; 86(suppl 1): S1–S70.
- . Short peripheral intravenous catheters and infections. J Infus Nurs. 2012; 35(4): 230–240.
- , , , et al. Phlebitis risk varies by peripheral venous catheter site and increases after 96 hours: a large multi‐centre prospective study. J Adv Nurs. 2014; 70(11): 2539–2549.
- , . Intravenous catheter complications in the hand and forearm. J Trauma. 2004; 56(1): 123–127.
- , , , . Vesicant extravasation part I: Mechanisms, pathogenesis, and nursing care to reduce risk. Oncol Nurs Forum. 2006; 33(6): 1134–1141.
- , . Variables influencing intravenous catheter insertion difficulty and failure: an analysis of 339 intravenous catheter insertions. Heart Lung. 2005; 34(5): 345–359.
- , , . Ultrasound‐guided peripheral venous access: a systematic review of randomized‐controlled trials. Eur J Emerg Med. 2014; 21(1): 18–23.
- , , , et al. Difficult peripheral venous access: clinical evaluation of a catheter inserted with the Seldinger method under ultrasound guidance. J Crit Care. 2014; 29(5): 823–827.
- . Choosing the right intravenous catheter. Home Healthc Nurse. 2007; 25(8): 523–531; quiz 532–523.
- Adverse reactions associated with midline catheters—United States, 1992–1995. MMWR Morb Mortal Wkly Rep. 1996; 45(5): 101–103.
- , , . The risk of midline catheterization in hospitalized patients. A prospective study. Ann Intern Med. 1995; 123(11): 841–844.
- . Midline catheters: indications, complications and maintenance. Nurs Stand. 2007; 22(11): 48–57; quiz 58.
- , . Safe administration of vancomycin through a novel midline catheter: a randomized, prospective clinical trial. J Vasc Access. 2014; 15(4): 251–256.
- . Midline catheters: the middle ground of intravenous therapy administration. J Infus Nurs. 2004; 27(5): 313–321.
- . Vascular access in resuscitation: is there a role for the intraosseous route? Anesthesiology. 2014; 120(4): 1015–1031.
- , , , et al. A new system for sternal intraosseous infusion in adults. Prehosp Emerg Care. 2000; 4(2): 173–177.
- , , , , , . Comparison of two intraosseous access devices in adult patients under resuscitation in the emergency department: a prospective, randomized study. Resuscitation. 2010; 81(8): 994–999.
- , , , , . Comparison study of intraosseous, central intravenous, and peripheral intravenous infusions of emergency drugs. Am J Dis Child. 1990; 144(1): 112–117.
- , , , , . The safety of intraosseous infusions: risks of fat and bone marrow emboli to the lungs. Ann Emerg Med. 1989; 18(10): 1062–1067.
- , , , , . ESPEN guidelines on parenteral nutrition: central venous catheters (access, care, diagnosis and therapy of complications). Clin Nutr. 2009; 28(4): 365–377.
- , . Peripherally inserted central catheter use in the hospitalized patient: is there a role for the hospitalist? J Hosp Med. 2009; 4(6): E1–E4.
- Vital signs: central line‐associated blood stream infections—United States, 2001, 2008, and 2009. MMWR Morb Mortal Wkly Rep. 2011; 60(8): 243–248.
- , , , . Hospital costs of central line‐associated bloodstream infections and cost‐effectiveness of closed vs. open infusion containers. The case of Intensive Care Units in Italy. Cost Eff Resour Alloc. 2010; 8: 8.
- , , . The risk of bloodstream infection in adults with different intravascular devices: a systematic review of 200 published prospective studies. Mayo Clin Proc. 2006; 81(9): 1159–1171.
- The Joint Commission. Preventing Central Line‐Associated Bloodstream Infections: A Global Challenge, a Global Perspective. Oak Brook, IL: Joint Commission Resources; 2012.
- , , , et al. Guidelines for the prevention of intravascular catheter‐related infections. Am J Infect Control. 2011; 39(4 suppl 1): S1–S34.
- , , , , . The risk of bloodstream infection associated with peripherally inserted central catheters compared with central venous catheters in adults: a systematic review and meta‐analysis. Infect Control Hosp Epidemiol. 2013; 34(9): 908–918.
- Society for Healthcare Epidemiology of America, Infectious Diseases Society of America, American Hospital Association, Association for Professionals in Infection Control and Epidemiology, The Joint Commission. Compendium of Strategies to Prevent Healthcare‐Associated Infections in Acute Care Hospitals: 2014 Updates. Available at: http://www.shea-online.org. Accessed August 1, 2014.
- , , , , . Guidelines for venous access in patients with chronic kidney disease. A Position Statement from the American Society of Diagnostic and Interventional Nephrology, Clinical Practice Committee and the Association for Vascular Access. Semin Dial. 2008; 21(2): 186–191.
- , , , et al. Association between prior peripherally inserted central catheters and lack of functioning arteriovenous fistulas: a case‐control study in hemodialysis patients. Am J Kidney Dis. 2012; 60(4): 601–608.
- , , , et al. Critical and honest conversations: the evidence behind the “Choosing Wisely” campaign recommendations by the American Society of Nephrology. Clin J Am Soc Nephrol. 2012; 7(10): 1664–1672.
- , , , et al. Do physicians know which of their patients have central venous catheters? A multi‐center observational study. Ann Intern Med. 2014; 161(8): 562–567.
- , , , et al. Hospital‐wide survey of the use of central venous catheters. J Hosp Infect. 2011; 77(4): 304–308.
- , , , , , . Peripherally inserted central catheter‐related deep vein thrombosis: contemporary patterns and predictors. J Thromb Haemost. 2014; 12(6): 847–854.
- , , , . An in vitro study comparing a peripherally inserted central catheter to a conventional central venous catheter: no difference in static and dynamic pressure transmission. BMC Anesthesiol. 2010; 10: 18.
- , , , et al. Clinical experience with power‐injectable PICCs in intensive care patients. Crit Care. 2012; 16(1): R21.
- , , , et al. Nurse‐led central venous catheter insertion‐procedural characteristics and outcomes of three intensive care based catheter placement services. Int J Nurs Stud. 2012; 49(2): 162–168.
- , , , et al. Peripherally inserted central venous catheters in the acute care setting: a safe alternative to high‐risk short‐term central venous catheters. Am J Infect Control. 2010; 38(2): 149–153.
- , , , et al. Which central venous catheters have the highest rate of catheter‐associated deep venous thrombosis: a prospective analysis of 2,128 catheter days in the surgical intensive care unit. J Trauma Acute Care Surg. 2013; 74(2): 454–460; discussion 461–452.
- , , , et al. Risk of venous thromboembolism associated with peripherally inserted central catheters: a systematic review and meta‐analysis. Lancet. 2013; 382(9889): 311–325.
- , , , , . Ultrasound guidance versus anatomical landmarks for subclavian or femoral vein catheterization. Cochrane Database Syst Rev. 2015; 1: CD011447.
- , , , et al. Tunneled jugular small‐bore central catheters as an alternative to peripherally inserted central catheters for intermediate‐term venous access in patients with hemodialysis and chronic renal insufficiency. Radiology. 1999; 213(1): 303–306.
Reliable venous access is fundamental to the safe and effective care of hospitalized patients. Venous access devices (VADs) are conduits for this purpose, providing delivery of intravenous medications, accurate measurement of central venous pressure, and administration of life‐saving blood products. Despite this important role, VADs are also often the source of hospital‐acquired complications. Although inpatient providers must balance the relative risks of VADs against their benefits, the evidence supporting such decisions is often limited. Advances in technology, scattered research, and the growing availability of novel devices have only further fragmented provider knowledge in the field of vascular access.[1]
It is not surprising, then, that survey‐based studies of hospitalists reveal important knowledge gaps with regard to practices associated with VADs.[2] In this narrative review, we seek to bridge this gap by providing a concise and pragmatic overview of the fundamentals of venous access. We focus specifically on parameters that influence decisions regarding VAD placement in hospitalized patients, providing key takeaways for practicing hospitalists.
METHODS
To compile this review, we systematically searched Medline (via Ovid) for several keywords, including: peripheral intravenous catheters, ultrasound‐guided peripheral catheter, intraosseous, midline, peripherally inserted central catheter, central venous catheters, and vascular access device complications. We concentrated on full‐length articles in English only; no date restrictions were placed on the search. We reviewed guidelines and consensus statements (eg, from the Centers for Disease Control and Prevention [CDC] or Choosing Wisely criteria) as appropriate. Additional studies of interest were identified through content experts (M.P., C.M.R.) and bibliographies of included studies.
SCIENTIFIC PRINCIPLES UNDERPINNING VENOUS ACCESS
It is useful to begin by reviewing VAD‐related nomenclature and physiology. In the simplest sense, a VAD consists of a hub (providing access to various connectors), a hollow tube divided into 1 or more channels (lumens), and a tip that terminates within either a central or a peripheral blood vessel. VADs are classified as central venous catheters (eg, centrally inserted central catheters [CICCs] or peripherally inserted central catheters [PICCs]) or peripheral intravenous catheters (eg, midlines or peripheral intravenous catheters) based on site of entry and location of the catheter tip. Thus, VADs whose entry site and tip both reside within peripheral veins (eg, proximal or distal veins of the arm) are often referred to as peripheral lines. Conversely, the term central line is often used when a VAD enters or terminates in a central vein (eg, subclavian vein insertion with the catheter tip in the lower superior vena cava).
Attention to a host of clinical and theoretical parameters is important when choosing a device for venous access. Some such parameters are summarized in Table 1.
| Parameter | Major Considerations |
|---|---|
| Desired flow rate | Smaller diameter veins susceptible to damage with high flow rates. |
| Short, large‐bore catheters facilitate rapid infusion. | |
| Nature of infusion | pH, viscosity, and temperature may damage vessels. |
| Vesicants and irritants should always be administered into larger, central veins. | |
| Desired duration of vascular access, or dwell time | Vessel thrombosis or phlebitis increase over time with catheter in place. |
| Intermittent infusions increase complications in central catheters; often tunneled catheters are recommended. | |
| Urgency of placement | Access to large caliber vessels is often needed in emergencies. |
| Critically ill or hemodynamically unstable patients may require urgent access for invasive monitoring or rapid infusions. | |
| Patients with trauma often require large volumes of blood products and reliable access to central veins. | |
| Number of device lumens | VADs may have single or multiple lumens. |
| Multilumen allows for multiple functions (eg, infusion of multiple agents, measurement of central venous pressures, blood draws). | |
| Device gauge | In general, use of a smaller‐gauge catheter is preferred to prevent complications. |
| However, larger catheter diameter may be needed for specific clinical needs (eg, blood transfusion). | |
| Device coating | VADs may have antithrombotic or anti‐infective coatings. |
| These devices may be of value in patients at high risk of complications. | |
| Such devices, however, may be more costly than their counterparts. | |
| Self‐care compatibility | VADs that can be cared for by patients are ideal for outpatient care. |
| Conversely, VADs such as peripheral catheters are highly prone to dislodgement and should be reserved for supervised settings only. | |
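The flow-rate considerations in Table 1 (damage to small veins at high rates; short, large-bore catheters for rapid infusion) follow from laminar-flow physics: in the Hagen–Poiseuille relation, flow scales with the fourth power of the tube radius and inversely with its length. A minimal sketch, using arbitrary illustrative dimensions rather than real catheter specifications:

```python
import math

def poiseuille_flow(radius_m, length_m, delta_p_pa, viscosity_pa_s):
    """Volumetric flow rate Q for laminar flow through a tube (Hagen-Poiseuille):
    Q = pi * dP * r^4 / (8 * mu * L)."""
    return math.pi * delta_p_pa * radius_m ** 4 / (8 * viscosity_pa_s * length_m)

# Illustrative, made-up dimensions: a catheter with twice the radius and half
# the length of another carries roughly 2**4 * 2 = 32 times the flow.
q_small = poiseuille_flow(radius_m=0.0005, length_m=0.06, delta_p_pa=1.0, viscosity_pa_s=1.0)
q_large = poiseuille_flow(radius_m=0.0010, length_m=0.03, delta_p_pa=1.0, viscosity_pa_s=1.0)
print(round(q_large / q_small))  # 32
```

This is why a short 14-gauge PIV outperforms a long, narrow central catheter for rapid volume resuscitation, as noted later in the review.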
VENOUS ACCESS DEVICES
We will organize our discussion of VADs based on whether they terminate in peripheral or central vessels. These anatomical considerations are relevant because they determine the physical characteristics, compatibility with particular infusates, dwell time, and risk of complications associated with each VAD; these complications are discussed in Table 2.
| Complications | Major Considerations |
|---|---|
| Infection | VADs breach the integrity of skin and permit skin pathogens to enter the blood stream (extraluminal infection). |
| Inadequate antisepsis of the VAD hub, including poor hand hygiene, failure to "scrub the hub," and multiple manipulations may also increase the risk of VAD‐related infection (endoluminal infection). | |
| Infections may be local (eg, exit‐site infections) or may spread hematogenously (eg, CLABSI). | |
| Type of VAD, duration of therapy, and host characteristics interact to influence infection risk. | |
| VADs with antiseptic coatings (eg, chlorhexidine) or antibiotic coatings (eg, minocycline) may reduce risk of infection in high‐risk patients. | |
| Antiseptic‐impregnated dressings may reduce risk of extraluminal infection. | |
| Venous thrombosis | VADs predispose to venous stasis and thrombosis. |
| Duration of VAD use, type and care of the VAD, and patient characteristics affect risk of thromboembolism. | |
| VAD tip position is a key determinant of venous thrombosis; central VADs that do not terminate at the cavo‐atrial junction should be repositioned to reduce the risk of thrombosis. | |
| Antithrombotic coated or eluting devices may reduce risk of thrombosis, though definitive data are lacking. | |
| Phlebitis | Inflammation caused by damage to the tunica media.[18] |
| 3 types of phlebitis: | |
| Chemical: due to irritation of media from the infusate. | |
| Mechanical: VAD physically damages the vessel. | |
| Infective: bacteria invade vein and inflame vessel wall. | |
| Phlebitis may be limited by close attention to infusate compatibility with peripheral veins, appropriate dilution, and prompt removal of catheters that show signs of inflammation. | |
| Phlebitis may be prevented in PICCs by ensuring at least a 2:1 vein:catheter ratio. | |
| Extravasation | Extravasation (also called infiltration) is defined as leakage of infusate from intravascular to extravascular space. |
| Extravasation of vesicants/irritants is particularly worrisome. | |
| May result in severe tissue injury, blistering, and tissue necrosis.[11] | |
| VADs should be checked frequently for adequate flushing and position prior to each infusion to minimize risk. | |
| Any VAD with redness, swelling, and tenderness at the entry site or problems with flushing should not be used without further examination and review of position. | |
Peripheral Venous Access
Short Peripheral Intravenous Catheter
Approximately 200 million peripheral intravenous catheters (PIVs) are placed annually in the United States, making them the most common intravenous catheter.[3] PIVs are short devices, 3 to 6 cm in length, that enter and terminate in peripheral veins (Figure 1A). Placement is recommended in forearm veins rather than those of the hand, wrist, or upper arm, as forearm sites are less prone to occlusion, accidental removal, and phlebitis.[4] Additionally, placement in hand veins impedes activities of daily living (eg, hand washing) and is not preferred by patients.[5] PIV size ranges from 24 gauge (smallest) to 14 gauge (largest); larger catheters are often reserved for fluid resuscitation or blood transfusion as they accommodate greater flow and limit hemolysis. To decrease risk of phlebitis and thrombosis, the shortest catheter and smallest diameter should be used. However, unless adequately secured, smaller diameter catheters are also associated with greater rates of accidental removal.[4, 5]

By definition, PIVs are short‐term devices. The CDC currently recommends removal and replacement of these devices no more frequently than every 72 to 96 hours in adults. However, a recent randomized controlled trial found that replacing PIVs when clinically indicated (eg, for device failure or phlebitis) rather than on a routine schedule added 30 hours to their lifespan without an increase in complications.[6] A systematic review by the Cochrane Collaboration echoes these findings.[3] These data have thus been incorporated into recommendations from the Infusion Nurses Society (INS) and the National Health Service in the United Kingdom.[5, 7] In hospitalized patients, this approach is relevant because it preserves venous access sites, maximizes device dwell, and limits additional PIV insertions. In turn, these differences may reduce the need for invasive VADs such as PICCs. Furthermore, the projected 5‐year savings from implementation of clinically indicated PIV removal policies are US$300 million and 1 million health‐worker hours in the United States alone.[4]
PIVs offer many advantages. First, they are minimally invasive and require little training to insert. Second, they can be used for diverse indications in patients requiring short‐term (<1 week) venous access. Third, PIVs do not require imaging to ensure correct placement; palpation of superficial veins is sufficient. Fourth, PIVs exhibit a risk of bloodstream infection that is about 40‐fold lower than that of more invasive, longer‐dwelling VADs[8] (0.06 episodes of bacteremia per 1000 catheter‐days).
Despite these advantages, PIVs also have important drawbacks. First, a quarter of all PIVs fail through occlusion or accidental dislodgement.[4] Infiltration, extravasation, and hematoma formation are important adverse events that may occur in such cases. Second, thrombophlebitis (pain and redness at the insertion site) is frequent and may require device removal, especially in patients with catheters ≥20 gauge.[9] Third, despite their relative safety, PIVs can cause localized or hematogenous infection. Septic thrombophlebitis (superficial thrombosis and bloodstream infection) and catheter‐related bloodstream infection, though rare, have been reported with PIVs and may lead to serious complications.[8, 10] In fact, some suggest that the overall burden of bloodstream infection risk posed by PIVs may be similar to that of CICCs, given the substantially greater number of devices used and greater number of device days.[8]
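The burden argument above can be made concrete with simple arithmetic. The per-catheter-day rates come from this review; the device-day volumes below are loose assumptions chosen purely for illustration:

```python
# Back-of-envelope illustration of why a very low per-catheter-day infection
# rate can still produce a large absolute burden at national scale.
piv_rate_per_day = 0.06 / 1000   # PIV bloodstream infections per catheter-day (cited above)
cicc_rate_per_day = 2.0 / 1000   # low end of the 2-5 range cited for CICCs

# Assumed, illustrative utilization: PIV device-days vastly outnumber CICC days
# (eg, ~200M PIVs placed annually x an assumed ~3-day average dwell).
piv_catheter_days = 600_000_000
cicc_catheter_days = 15_000_000

piv_infections = piv_rate_per_day * piv_catheter_days     # ~36,000
cicc_infections = cicc_rate_per_day * cicc_catheter_days  # ~30,000
print(round(piv_infections), round(cicc_infections))
```

Under these assumed volumes, the two device classes contribute infections of the same order of magnitude, despite a roughly 30-fold difference in per-day risk.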
PIVs and other peripheral VADs are not suitable for infusion of vesicants or irritants, which require larger, central veins for delivery. Vesicants (drugs that cause blistering on infusion) include chemotherapeutic agents (eg, dactinomycin, paclitaxel) and commonly used nonchemotherapeutic agents (eg, diazepam, piperacillin, vancomycin, esmolol, or total parenteral nutrition [TPN]).[11] Irritants (phlebitogenic drugs) cause short‐term inflammation and pain and thus should not be infused peripherally for prolonged durations. Common irritants in the hospital setting include acyclovir, dobutamine, penicillin, and potassium chloride.
Of note, about one‐quarter of PIV insertions fail owing to difficult intravenous access.[12] Ultrasound‐guided peripheral intravenous (USGPIV) catheter placement is emerging as a technique to provide peripheral access for such patients to avoid placement of central venous access devices. Novel, longer devices (>8 cm) with built‐in guide wires have been developed to increase placement success of USGPIVs. These new designs provide easier access into deeper arm veins (brachial or basilic) not otherwise accessible by short PIVs. Although studies comparing the efficacy of USGPIV devices to other VADs are limited, a recent systematic review showed that time to successful cannulation was shorter, and fewer attempts were required to place USGPIVs compared to PIVs.[13] A recent study in France found that USGPIVs met the infusion needs of patients with difficult veins with minimal increase in complications.[14] Despite these encouraging data, future studies are needed to better evaluate this technology.
Midline Catheter
A midline is a VAD 7.5 to 25 cm in length that is typically inserted into veins above the antecubital fossa. The catheter tip resides in a peripheral upper arm vein, often the basilic or cephalic vein, terminating just short of the subclavian vein (Figure 1B). Midline‐like devices were first developed in the 1950s and were initially used as an alternative to PIVs because they were thought to allow longer dwell times.[15] However, because they were originally constructed of fairly rigid material, infiltration, mechanical phlebitis, and inflammation were common and tempered enthusiasm for their use.[15, 16] Newer midline devices obviate many of these problems and are inserted under ultrasound guidance using a modified Seldinger technique.[17] Despite these advances, data regarding comparative efficacy remain limited.
Midlines offer longer dwell times than standard PIVs owing to termination in the larger diameter basilic and brachial veins of the arm. Additionally, owing to their length, midlines are less prone to dislodgement. Because they are inserted with greater antisepsis than PIVs and are better secured to the skin, they are also more durable.[5, 9, 18] Current INS standards recommend use of midlines for 1 to 4 weeks.[5] Because they terminate in a peripheral vein, the medications and infusions compatible with midlines are identical to those infused through a PIV. Thus, TPN, vesicants, irritants, and drugs with a pH <5 or >9 or an osmolarity >500 mOsm/L should not be infused through a midline.[15] New evidence suggests that diluted solutions of vancomycin (usually pH <5) may be safe to infuse for short durations (<6 days) through a midline, and that concentration rather than pH may be more important in this regard.[19] Although the use of midlines may eventually extend to agents not typically deemed compatible with peripheral access, limited evidence exists to support such a strategy at this time.
Midlines offer several advantages. First, because blood flow is greater in the more proximal veins of the arm, midlines can accommodate infusions at rates of 100 to 150 mL/min, compared to 20 to 40 mL/min in smaller peripheral veins. Higher flow rates offer greater hemodilution (dilution of the infusate with blood), decreasing the likelihood of phlebitis and infiltration.[20] Second, midlines do not require x‐ray verification of tip placement; thus, their use is often favored in resource‐limited settings such as skilled nursing facilities. Third, midlines offer longer dwell times than peripheral intravenous catheters and can thus serve as bridge devices for short‐term intravenous antibiotics or peripheral‐compatible infusions in an outpatient setting. Available evidence suggests that midlines are associated with low rates of bloodstream infection (0.3 to 0.8 per 1000 catheter‐days).[17] The most frequent complications include phlebitis (4.2%) and occlusion (3.3%).[20] Given these favorable statistics, midlines may offer a good alternative to PIVs in select patients who require peripheral infusions of intermediate duration.
Intraosseous Vascular Access
Intraosseous (IO) devices access the vascular system by piercing cortical bone. These devices provide access to the intramedullary cavity and venous plexuses of long bones such as the tibia, femur, or humerus. Several insertion devices are now commercially available and have enhanced the ease and safety of IO placement. Using these newer devices, IO access may be obtained in 1 to 2 minutes with minimal training. By comparison, a central venous catheter often requires 10 to 15 minutes to insert and substantial training for providers.[21, 22, 23]
IO devices thus offer several advantages. First, given the rapidity with which they can be inserted, they are often preferred in emergency settings (eg, trauma). Second, these devices are versatile and can accommodate both central and peripheral infusates.[24] Third, a recent meta‐analysis found that IOs have a low complication rate of 0.8%, with extravasation of infusate through the cortical entry site being the most common adverse event.[21] Of note, this study also reported zero local or distal infectious complications, a finding that may relate to the shorter dwell of these devices.[21] Some animal studies suggest that fat embolism from bone may occur at high rates with IO VADs.[25] However, death or significant morbidity from fat emboli in humans following IO access has not been described. Whether such emboli occur or are clinically significant in the context of IO devices remains unclear at this time.[21]
Central Venous Access Devices
Central venous access devices (CVADs) share a defining feature: tip termination at the cavoatrial junction, either in the lower portion of the superior vena cava or the upper portion of the right atrium. CVADs can be safely used for irritant or vesicant medications as well as for blood withdrawal, blood exchange procedures (eg, dialysis), and hemodynamic monitoring. Traditionally, these devices are 15 to 25 cm in length and are inserted directly into the deep veins of the supra‐ or infraclavicular area, including the internal jugular, brachiocephalic, subclavian, or axillary veins. PICCs are unique CVADs in that they enter through peripheral veins but terminate near the cavoatrial junction. Regarding nomenclature, CICC will be used to denote devices that enter directly into veins of the neck or chest, whereas PICC will be used for devices that are inserted peripherally but terminate centrally.
Peripherally Inserted Central Catheter
PICCs are inserted into peripheral veins of the upper arm (eg, the brachial, basilic, or cephalic vein) and advanced such that the tip resides at the cavoatrial junction (Figure 1C). PICCs offer prolonged dwell times and are thus indicated when patients require venous access for weeks or months.[26] Additionally, they can accommodate a variety of infusates and are safer to insert than CICCs, given placement in peripheral veins of the arm rather than central veins of the chest or neck. Thus, insertion complications such as pneumothorax, hemothorax, or significant bleeding are rare with PICCs. In fact, a recent study reported that PICC insertion by hospitalists was associated with low rates of insertion and infectious complications.[27]
However, like CICCs, PICCs are associated with central line–associated bloodstream infection (CLABSI), a serious complication known to prolong length of hospital stay, increase costs, and carry a 12% to 25% associated mortality.[28, 29] In the United States alone, over 250,000 CLABSI cases occur per year, drawing considerable attention from the CDC and The Joint Commission, which now mandate reporting and nonpayment for hospital‐acquired CLABSI.[30, 31, 32] A recent systematic review and meta‐analysis found that PICCs are associated with a substantial risk of CLABSI in hospitalized patients.[33] Importantly, no difference in CLABSI rates between PICCs and CICCs in hospitalized patients was evident in this meta‐analysis. Therefore, current guidelines specifically recommend against use of PICCs over CICCs as a strategy to reduce CLABSI.[34] Additionally, PICCs are associated with a 2.5‐fold greater risk of deep vein thrombosis (DVT) compared to CICCs; thus, they should be used with caution in patients with cancer or those with underlying hypercoagulable states.
Of particular import to hospitalists is the fact that PICC placement is contraindicated in patients with stage IIIb or greater chronic kidney disease (CKD). In such patients, sequelae of PICC use, such as phlebitis or central vein stenosis, can be devastating.[35] In a recent study, prior PICC placement was the strongest predictor of subsequent arteriovenous fistula failure.[36] For this reason, Choosing Wisely recommendations call for avoidance of PICCs in such patients.[37]
Centrally Inserted Central Catheter
CICCs are CVADs placed by puncture and cannulation of the internal jugular, subclavian, brachiocephalic, or femoral veins (Figure 1D) and make up the vast majority of VADs placed in ICU settings.[38, 39] Central termination of CICCs allows a variety of infusions, including irritants, vesicants, and vasopressors, as well as blood withdrawal and hemodynamic monitoring. CICCs are typically used for 7 to 14 days, but may stay in place longer if they remain complication free and clinically necessary.[40] A key advantage of CICCs is that they can be placed in emergent settings to facilitate quick access for rapid infusion or hemodynamic monitoring. In particular, CICCs inserted in the femoral vein may be useful in emergency settings. However, owing to the risk of infection and the inability to monitor central pressures, femoral devices should be replaced with a properly positioned CICC or a PICC when possible. Importantly, although CICCs are almost exclusively used in intensive or emergency care, PICCs may also be considered in such settings.[41, 42] CICCs usually have multiple lumens and often serve several simultaneous functions, such as infusion and hemodynamic monitoring.
Despite their benefits, CICCs have several disadvantages. First, insertion requires an experienced clinician and has historically been a task limited to physicians, although this is changing rapidly, especially in Europe and Australia, where specially trained nurses are assuming responsibility for CICC placement.[43] Second, these devices have historically been more likely to be associated with CLABSI, with estimates of infection rates varying between 2 and 5 infections per 1000 catheter‐days.[44] Third, CICCs pose a significant DVT risk, with rates around 22 DVTs per 1000 catheter‐days.[45] However, compared to PICCs, the DVT risk appears lower, and CICC use may be preferable in patients at high risk of DVT, such as critically ill or cancer populations.[46] Importantly, ultrasound guidance during CICC insertion has been associated with fewer accidental arterial punctures and less hematoma formation. The role of ultrasound guidance for PICCs, as well as its implications for thrombotic and infectious events, remains less well characterized.[47]
Tunneled Central Venous Access Devices
Tunneled devices (either CICCs or PICCs) are characterized by the fact that the insertion site on the skin and the site of ultimate venipuncture are physically separated (Figure 1E). Tunneling limits bacterial entry from the extraluminal aspect of the CVAD into the bloodstream. For example, the internal jugular vein is often an ideal site of puncture but an inappropriate exit site for a catheter, as providing care to this area is challenging and may increase the risk of infection.[34] Tunneling to the infraclavicular area provides a better option, as it creates an exit site that can be adequately cared for. Importantly, any CVAD (PICC or CICC) can be tunneled. Additionally, tunneled CICCs may be used in patients with chronic or impending renal failure, in whom PICCs are contraindicated because entry into dialysis‐relevant vessels must be avoided.[48] Such devices also allow regular blood sampling in patients who require frequent testing but have limited peripheral access, such as those with hematological malignancies. Tunneled catheters are also more comfortable for patients and viewed as more socially acceptable than nontunneled devices. However, their more invasive and permanent nature often requires deliberation prior to insertion.
Of note, tunneled devices and ports may be used as long‐term (>3 months to years) VADs. As our focus in this review is short‐term devices, we will not expand the discussion of these devices as they are almost always used for prolonged durations.[7]
OPERATIONALIZING THE DATA: AN ALGORITHMIC APPROACH TO VENOUS ACCESS
Hospitalists should consider approaching venous access using an algorithm based on a number of parameters. For example, a critically ill patient who requires vasopressor support and hemodynamic monitoring will need a CICC or a PICC. Given the potential greater risk of thromboses from PICCs, a CICC is preferable for critically ill patients provided an experienced inserter is available. Conversely, patients who require short‐term (<710 days) venous access for infusion of nonirritant or nonvesicant therapy often only require a PIV. In patients with poor or difficult venous access, USGPIVs or midlines may be ideal and preferred over short PIVs. Finally, patients who require longer‐term or home‐based treatment may benefit from early placement of a midline or a PICC, depending again on the nature of the infusion, duration of treatment, and available venous access sites.
An algorithmic approach considering these parameters is suggested in Figure 2, and a brief overview of the devices and their considerations is shown in Table 3.
| Vascular Access Device | Central/Peripheral | Anatomical Location of Placement | Desired Duration of Placement | Common Uses | BSI Risk (Per 1,000 Catheter‐Days) | Thrombosis Risk | Important Considerations |
|---|---|---|---|---|---|---|---|
| |||||||
| Small peripheral IV | Peripheral | Peripheral veins, usually forearm | 710 days | Fluid resuscitation, most medications, blood products | 0.06[8] | Virtually no risk | Consider necessity of PIV daily and remove unnecessary devices |
| Midline | Peripheral | Inserted around antecubital fossa, reside within basilic or cephalic vein of the arm | 24 weeks | Long‐term medications excluding TPN, vesicants, corrosives | 0.30.8[17] | Insufficient data | Can be used as bridge devices for patients to complete short‐term antibiotics/emnfusions as an outpatient |
| Peripherally inserted central catheter | Central | Inserted into peripheral arm vein and advanced to larger veins (eg, internal jugular or subclavian) to the CAJ | >1 week, <3 months | Large variety of infusates, including TPN, vesicants, corrosives | 2.4[44] | 6.30% | Contraindicated in patients with CKD stage IIIb or higher |
| Centrally inserted central catheters | Central | Inserted above (internal jugular vein, brachiocephalic vein, subclavian vein), or below the clavicle (axillary vein) | >1 week, <3 months | Same infusate variety as PICC, measurement of central venous pressures, common in trauma/emergent settings | 2.3[44] | 1.30% | Given lower rates of DVT than PICC, preferred in ICU and hypercoagulable environments |
| Tunneled CICCs | Central | Placed percutaneously in any large vein in the arm, chest, neck or groin | >3 months to years | Central infusates, as in any CVAD; used for patients with CKD stage IIIb or greater when a PICC is indicated | Insufficient data | Insufficient data | May be superior when insertion site and puncture site are not congruent and may increase risk of infection |

CONCLUSIONS
With strides in technology and progress in medicine, hospitalists have access to an array of options for venous access. However, every VAD has limitations that can be easily overlooked in a perfunctory decision‐making process. The data presented in this review thus provide a first step to improving safety in this evolving science. Studies that further determine appropriateness of VADs in hospitalized settings are necessary. Only through such progressive scientific enquiry will complication‐free venous access be realized.
Disclosure
Nothing to report.
Reliable venous access is fundamental for the safe and effective care of hospitalized patients. Venous access devices (VADs) are conduits for this purpose, providing delivery of intravenous medications, accurate measurement of central venous pressure, and administration of life‐saving blood products. Despite this important role, VADs are also often the source of hospital‐acquired complications. Although inpatient providers must balance the relative risks of VADs against their benefits, the evidence supporting such decisions is often limited. Advances in technology, scattered research, and the growing availability of novel devices have only further fragmented provider knowledge in the field of vascular access.[1]
It is not surprising, then, that survey‐based studies of hospitalists reveal important knowledge gaps with regard to practices associated with VADs.[2] In this narrative review, we seek to bridge this gap by providing a concise and pragmatic overview of the fundamentals of venous access. We focus specifically on parameters that influence decisions regarding VAD placement in hospitalized patients, providing key takeaways for practicing hospitalists.
METHODS
To compile this review, we systematically searched Medline (via Ovid) for several keywords, including: peripheral intravenous catheters, ultrasound‐guided peripheral catheter, intraosseous, midline, peripherally inserted central catheter, central venous catheters, and vascular access device complications. We concentrated on full‐length articles in English only; no date restrictions were placed on the search. We reviewed guidelines and consensus statements (eg, from the Centers for Disease Control and Prevention [CDC] or Choosing Wisely criteria) as appropriate. Additional studies of interest were identified through content experts (M.P., C.M.R.) and bibliographies of included studies.
SCIENTIFIC PRINCIPLES UNDERPINNING VENOUS ACCESS
It is useful to begin by reviewing VAD‐related nomenclature and physiology. In the simplest sense, a VAD consists of a hub (providing access to various connectors), a hollow tube divided into one or more channels (lumens), and a tip that may terminate within a central or peripheral blood vessel. VADs are classified as central venous catheters (eg, centrally inserted central catheters [CICCs] or peripherally inserted central catheters [PICCs]) or peripheral intravenous catheters (eg, midlines or peripheral intravenous catheters) based on site of entry and location of the catheter tip. Thus, VADs entering via proximal or distal veins of the arm are often referred to as peripheral lines, as their site of entry and tip both reside within peripheral veins. Conversely, the term central line is often used when VADs enter or terminate in a central vein (eg, subclavian vein insertion with the catheter tip in the lower superior vena cava).
Attention to a host of clinical and theoretical parameters is important when choosing a device for venous access. Some such parameters are summarized in Table 1.
| Parameter | Major Considerations |
|---|---|
| Desired flow rate | Smaller‐diameter veins are susceptible to damage with high flow rates. Short, large‐bore catheters facilitate rapid infusion. |
| Nature of infusion | pH, viscosity, and temperature may damage vessels. Vesicants and irritants should always be administered into larger, central veins. |
| Desired duration of vascular access (dwell time) | Risk of vessel thrombosis and phlebitis increases the longer a catheter is in place. Intermittent infusions increase complications in central catheters; tunneled catheters are often recommended. |
| Urgency of placement | Access to large‐caliber vessels is often needed in emergencies. Critically ill or hemodynamically unstable patients may require urgent access for invasive monitoring or rapid infusions. Patients with trauma often require large volumes of blood products and reliable access to central veins. |
| Number of device lumens | VADs may have single or multiple lumens. Multiple lumens allow for multiple simultaneous functions (eg, infusion of multiple agents, measurement of central venous pressures, blood draws). |
| Device gauge | In general, use of a smaller‐gauge catheter is preferred to prevent complications. However, a larger catheter diameter may be needed for specific clinical needs (eg, blood transfusion). |
| Device coating | VADs may have antithrombotic or anti‐infective coatings. These devices may be of value in patients at high risk of complications but may be more costly than their counterparts. |
| Self‐care compatibility | VADs that can be cared for by patients are ideal for outpatient care. Conversely, VADs such as peripheral catheters are highly prone to dislodgement and should be reserved for supervised settings only. |
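The flow‐rate considerations in the first row of the table reflect the Hagen–Poiseuille relation for laminar flow, in which flow scales with the fourth power of the catheter's inner radius and inversely with its length. The sketch below is illustrative only; the gauge‐to‐radius conversions are approximate assumptions, not values drawn from this review.

```python
# Hagen–Poiseuille: for a fixed pressure gradient and viscosity, laminar flow
# through a catheter scales as (inner radius)^4 / length. This is why short,
# large-bore catheters infuse fastest.

def relative_flow(radius_mm: float, length_cm: float) -> float:
    """Flow relative to an arbitrary reference catheter, in consistent units."""
    return radius_mm ** 4 / length_cm

# Approximate dimensions (assumed for illustration): a 14-gauge, 5 cm catheter
# (~0.8 mm inner radius) vs a 20-gauge, 3 cm catheter (~0.3 mm inner radius).
ratio = relative_flow(0.8, 5) / relative_flow(0.3, 3)
print(f"~{ratio:.0f}x greater flow")  # ~30x
```

The same fourth‐power scaling explains why vein caliber matters clinically: a large catheter in a small vein restricts flow around the device and reduces hemodilution, increasing the risk of phlebitis.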
VENOUS ACCESS DEVICES
We will organize our discussion of VADs based on whether they terminate in peripheral or central vessels. These anatomical considerations are relevant as they determine physical characteristics, compatibility with particular infusates, dwell time, and risk of complications associated with each VAD discussed in Table 2.
| Complications | Major Considerations |
|---|---|
| Infection | VADs breach the integrity of the skin and permit skin pathogens to enter the bloodstream (extraluminal infection). Inadequate antisepsis of the VAD hub, including poor hand hygiene, failure to "scrub the hub," and multiple manipulations, may also increase the risk of VAD‐related infection (endoluminal infection). Infections may be local (eg, exit‐site infections) or may spread hematogenously (eg, CLABSI). Type of VAD, duration of therapy, and host characteristics interact to influence infection risk. VADs with antiseptic coatings (eg, chlorhexidine) or antibiotic coatings (eg, minocycline) may reduce risk of infection in high‐risk patients, and antiseptic‐impregnated dressings may reduce risk of extraluminal infection. |
| Venous thrombosis | VADs predispose to venous stasis and thrombosis. Duration of VAD use, type and care of the VAD, and patient characteristics affect risk of thromboembolism. VAD tip position is a key determinant of venous thrombosis; central VADs that do not terminate at the cavoatrial junction should be repositioned to reduce the risk of thrombosis. Antithrombotic‐coated or ‐eluting devices may reduce risk of thrombosis, though definitive data are lacking. |
| Phlebitis | Inflammation caused by damage to the tunica media.[18] Three types: chemical (irritation of the media by the infusate), mechanical (the VAD physically damages the vessel), and infective (bacteria invade the vein and inflame the vessel wall). Phlebitis may be limited by close attention to infusate compatibility with peripheral veins, appropriate dilution, and prompt removal of catheters that show signs of inflammation. Phlebitis may be prevented in PICCs by ensuring at least a 2:1 vein:catheter ratio. |
| Extravasation | Extravasation (also called infiltration) is defined as leakage of infusate from the intravascular to the extravascular space. Extravasation of vesicants/irritants is particularly worrisome and may result in severe tissue injury, blistering, and tissue necrosis.[11] VADs should be checked frequently for adequate flushing and position prior to each infusion to minimize risk. Any VAD with redness, swelling, or tenderness at the entry site, or problems with flushing, should not be used without further examination and review of position. |
Peripheral Venous Access
Short Peripheral Intravenous Catheter
Approximately 200 million peripheral intravenous catheters (PIVs) are placed annually in the United States, making them the most common intravenous catheter.[3] PIVs are short devices, 3 to 6 cm in length, that enter and terminate in peripheral veins (Figure 1A). Placement is recommended in forearm veins rather than those of the hand, wrist, or upper arm, as forearm sites are less prone to occlusion, accidental removal, and phlebitis.[4] Additionally, placement in hand veins impedes activities of daily living (eg, hand washing) and is not preferred by patients.[5] PIV size ranges from 24 gauge (smallest) to 14 gauge (largest); larger catheters are often reserved for fluid resuscitation or blood transfusion as they accommodate greater flow and limit hemolysis. To decrease risk of phlebitis and thrombosis, the shortest catheter and smallest diameter should be used. However, unless adequately secured, smaller diameter catheters are also associated with greater rates of accidental removal.[4, 5]

By definition, PIVs are short‐term devices. The CDC currently recommends removal and replacement of these devices no more frequently than every 72 to 96 hours in adults. However, a recent randomized controlled trial found that replacing PIVs when clinically indicated (eg, device failure, phlebitis) rather than on a routine schedule added 30 hours to their lifespan without an increase in complications.[6] A systematic review by the Cochrane Collaboration echoes these findings.[3] These data have thus been incorporated into recommendations from the Infusion Nurses Society (INS) and the National Health Service in the United Kingdom.[5, 7] In hospitalized patients, this approach is relevant, as it preserves venous access sites, maximizes device dwell, and limits additional PIV insertions. In turn, these differences may reduce the need for invasive VADs such as PICCs. Furthermore, the projected 5‐year savings from implementation of clinically indicated PIV removal policies is US$300 million and 1 million health‐worker hours in the United States alone.[4]
PIVs offer many advantages. First, they are minimally invasive and require little training to insert. Second, they can be used for diverse indications in patients requiring short‐term (<1 week) venous access. Third, PIVs do not require imaging to ensure correct placement; palpation of superficial veins is sufficient. Fourth, PIVs exhibit a risk of bloodstream infection (0.06 bacteremias per 1000 catheter‐days) that is about 40‐fold lower than that of more invasive, longer‐dwelling VADs.[8]
Despite these advantages, PIVs also have important drawbacks. First, a quarter of all PIVs fail through occlusion or accidental dislodgement.[4] Infiltration, extravasation, and hematoma formation are important adverse events that may occur in such cases. Second, thrombophlebitis (pain and redness at the insertion site) is frequent and may require device removal, especially in patients with catheters ≥20 gauge.[9] Third, despite their relative safety, PIVs can cause localized or hematogenous infection. Septic thrombophlebitis (superficial thrombosis and bloodstream infection) and catheter‐related bloodstream infection, though rare, have been reported with PIVs and may lead to serious complications.[8, 10] In fact, some suggest that the overall burden of bloodstream infection posed by PIVs may be similar to that of CICCs, given the substantially greater number of devices used and greater number of device‐days.[8]
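The burden argument in the final point is simple arithmetic: expected infections scale as rate times exposure. A minimal sketch, using the per‐1000‐catheter‐day rates cited in this review but purely hypothetical catheter‐day totals:

```python
# Expected bloodstream infections = (rate per 1,000 catheter-days) * catheter-days / 1,000.
# The rates (~0.06 for PIVs, ~2.3 for CICCs per 1,000 catheter-days) appear in
# this review; the catheter-day totals below are hypothetical, for illustration.

def expected_infections(rate_per_1000_days: float, catheter_days: float) -> float:
    return rate_per_1000_days * catheter_days / 1000

piv = expected_infections(0.06, 40_000_000)  # many devices, short dwells
cicc = expected_infections(2.3, 1_000_000)   # fewer devices, longer dwells
print(round(piv), round(cicc))  # comparable totals: 2400 2300
```

A ~40‐fold lower per‐day rate can thus still yield a similar aggregate infection burden when exposure is ~40‐fold greater.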
PIVs and other peripheral VADs are not suitable for infusion of vesicants or irritants, which require larger, central veins for delivery. Vesicants (drugs that cause blistering on infusion) include chemotherapeutic agents (eg, dactinomycin, paclitaxel) and commonly used nonchemotherapeutical agents (eg, diazepam, piperacillin, vancomycin, esmolol, or total parenteral nutrition [TPN]).[11] Irritants (phlebitogenic drugs) cause short‐term inflammation and pain, and thus should not be peripherally infused for prolonged durations. Common irritants in the hospital setting include acyclovir, dobutamine, penicillin, and potassium chloride.
Of note, about one‐quarter of PIV insertions fail owing to difficult intravenous access.[12] Ultrasound‐guided peripheral intravenous (USGPIV) catheter placement is emerging as a technique to provide peripheral access for such patients to avoid placement of central venous access devices. Novel, longer devices (>8 cm) with built‐in guide wires have been developed to increase placement success of USGPIVs. These new designs provide easier access into deeper arm veins (brachial or basilic) not otherwise accessible by short PIVs. Although studies comparing the efficacy of USGPIV devices to other VADs are limited, a recent systematic review showed that time to successful cannulation was shorter, and fewer attempts were required to place USGPIVs compared to PIVs.[13] A recent study in France found that USGPIVs met the infusion needs of patients with difficult veins with minimal increase in complications.[14] Despite these encouraging data, future studies are needed to better evaluate this technology.
Midline Catheter
A midline is a VAD that is between 7.5 to 25 cm in length and is typically inserted into veins above the antecubital fossa. The catheter tip resides in a peripheral upper arm vein, often the basilic or cephalic vein, terminating just short of the subclavian vein (Figure 1B). Midline‐like devices were first developed in the 1950s and were initially used as an alternative to PIVs because they were thought to allow longer dwell times.[15] However, because they were originally constructed with a fairly rigid material, infiltration, mechanical phlebitis, and inflammation were common and tempered enthusiasm for their use.[15, 16] Newer midline devices obviate many of these problems and are inserted by ultrasound guidance and modified Seldinger technique.[17] Despite these advances, data regarding comparative efficacy are limited.
Midlines offer longer dwell times than standard PIVs owing to termination in the larger diameter basilic and brachial veins of the arm. Additionally, owing to their length, midlines are less prone to dislodgement. As they are inserted with greater antisepsis than PIVs and better secured to the skin, they are more durable than PIVs.[5, 9, 18] Current INS standards recommend use of midlines for 1 to 4 weeks.[5] Because they terminate in a peripheral vein, medications and infusions compatible with midlines are identical to those that are infused through a PIV. Thus, TPN, vesicants or irritants, or drugs that feature a pH <5 or pH >9, or >500 mOsm should not be infused through a midline.[15] New evidence suggests that diluted solutions of vancomycin (usually pH <5) may be safe to infuse for short durations (<6 days) through a midline, and that concentration rather than pH may be more important in this regard.[19] Although it is possible that the use of midlines may extend to agents typically not deemed peripheral access compatible, limited evidence exists to support such a strategy at this time.
Midlines offer several advantages. First, because blood flow is greater in the more proximal veins of the arm, midlines can accommodate infusions at rates of 100 to 150 mL/min, compared to 20 to 40 mL/min in smaller peripheral veins. Higher flow rates provide greater hemodilution (dilution of the infusate with blood), decreasing the likelihood of phlebitis and infiltration.[20] Second, midlines do not require x‐ray verification of tip placement; thus, their use is often favored in resource‐limited settings such as skilled nursing facilities. Third, midlines offer longer dwell times than peripheral intravenous catheters and can thus serve as bridge devices for short‐term intravenous antibiotics or peripheral‐compatible infusions in the outpatient setting. Available evidence suggests that midlines are associated with low rates of bloodstream infection (0.3–0.8 per 1000 catheter‐days).[17] The most frequent complications include phlebitis (4.2%) and occlusion (3.3%).[20] Given these favorable statistics, midlines may offer a good alternative to PIVs in select patients who require peripheral infusions of intermediate duration.
Intraosseous Vascular Access
Intraosseous (IO) devices access the vascular system by piercing cortical bone. These devices provide access to the intramedullary cavity and venous plexuses of long bones such as the tibia, femur, or humerus. Several insertion devices are now commercially available and have enhanced the ease and safety of IO placement. Using these newer devices, IO access may be obtained in 1 to 2 minutes with minimal training. By comparison, a central venous catheter often requires 10 to 15 minutes to insert, with substantial training required of providers.[21, 22, 23]
IO devices thus offer several advantages. First, given the rapidity with which they can be inserted, they are often preferred in emergency settings (eg, trauma). Second, these devices are versatile and can accommodate both central and peripheral infusates.[24] Third, a recent meta‐analysis found that IOs have a low complication rate of 0.8%, with extravasation of infusate through the cortical entry site being the most common adverse event.[21] Of note, this study also reported zero local or distal infectious complications, a finding that may relate to the shorter dwell of these devices.[21] Some animal studies suggest that fat embolism from bone may occur at high rates with IO VADs.[25] However, death or significant morbidity from fat emboli in humans following IO access has not been described. Whether such emboli occur or are clinically significant in the context of IO devices remains unclear at this time.[21]
Central Venous Access Devices
Central venous access devices (CVADs) share a defining feature: tip termination at the cavoatrial junction, either in the lower portion of the superior vena cava or in the upper portion of the right atrium. CVADs can be safely used for irritant or vesicant medications as well as for blood withdrawal, blood exchange procedures (eg, dialysis), and hemodynamic monitoring. Traditionally, these devices are 15 to 25 cm in length and are directly inserted into the deep veins of the supra‐ or infraclavicular area, including the internal jugular, brachiocephalic, subclavian, or axillary veins. PICCs are unique among CVADs in that they enter through peripheral veins but terminate near the cavoatrial junction. Regarding nomenclature, CICC will be used to denote devices that enter directly into veins of the neck or chest, whereas PICC will be used for devices that are inserted peripherally but terminate centrally.
Peripherally Inserted Central Catheter
PICCs are inserted into peripheral veins of the upper arm (eg, the brachial, basilic, or cephalic vein) and advanced such that the tip resides at the cavoatrial junction (Figure 1C). PICCs offer prolonged dwell times and are thus indicated when patients require venous access for weeks or months.[26] Additionally, they can accommodate a variety of infusates and are safer to insert than CICCs, given placement in peripheral veins of the arm rather than central veins of the chest or neck. Thus, insertion complications such as pneumothorax, hemothorax, or significant bleeding are rare with PICCs. In fact, a recent study reported that PICC insertion by hospitalists was associated with low rates of insertion or infectious complications.[27]
However, like CICCs, PICCs are associated with central line–associated bloodstream infection (CLABSI), a serious complication known to prolong length of hospital stay, increase costs, and carry a 12% to 25% associated mortality.[28, 29] In the United States alone, over 250,000 CLABSI cases occur per year, drawing considerable attention from the CDC and The Joint Commission, which now mandate reporting of, and nonpayment for, hospital‐acquired CLABSI.[30, 31, 32] A recent systematic review and meta‐analysis found that PICCs are associated with a substantial risk of CLABSI in hospitalized patients.[33] Importantly, no difference in CLABSI rates between PICCs and CICCs in hospitalized patients was evident in this meta‐analysis. Therefore, current guidelines specifically recommend against use of PICCs over CICCs as a strategy to reduce CLABSI.[34] Additionally, PICCs are associated with a 2.5‐fold greater risk of deep vein thrombosis (DVT) compared to CICCs; they should therefore be used with caution in patients with cancer or underlying hypercoagulable states.
Of particular import to hospitalists is the fact that PICC placement is contraindicated in patients with stage IIIb or greater chronic kidney disease (CKD). In such patients, sequelae of PICC use, such as phlebitis or central vein stenosis, can be devastating.[35] In a recent study, prior PICC placement was the strongest predictor of subsequent arteriovenous graft failure.[36] For this reason, Choosing Wisely recommendations call for avoidance of PICCs in such patients.[37]
Centrally Inserted Central Catheter
CICCs are CVADs placed by puncture and cannulation of the internal jugular, subclavian, brachiocephalic, or femoral veins (Figure 1D) and compose the vast majority of VADs placed in ICU settings.[38, 39] Central termination of CICCs allows for a variety of infusions, including irritants, vesicants, and vasopressors, as well as blood withdrawal and hemodynamic monitoring. CICCs are typically used for 7 to 14 days but may remain for longer durations if they stay complication free and clinically necessary.[40] A key advantage of CICCs is that they can be placed in emergent settings to facilitate quick access for rapid infusion or hemodynamic monitoring. In particular, CICCs inserted in the femoral vein may be useful in emergency settings. However, owing to risk of infection and inability to monitor central pressures, these femoral devices should be replaced with a properly positioned CICC or a PICC when possible. Importantly, although CICCs are almost exclusively used in intensive or emergency care, PICCs may also be considered in such settings.[41, 42] CICCs usually have multiple lumens and often serve several simultaneous functions, such as infusion and hemodynamic monitoring.
Despite their benefits, CICCs have several disadvantages. First, insertion requires an experienced clinician and has historically been a task limited to physicians. However, this is changing rapidly (especially in Europe and Australia) where specially trained nurses are assuming responsibility for CICC placement.[43] Second, these devices are historically more likely to be associated with CLABSI, with estimates of infection rates varying between 2 and 5 infections per 1000 catheter‐days.[44] Third, CICCs pose a significant DVT risk, with rates around 22 DVTs per 1000 catheter‐days.[45] However, compared to PICCs, the DVT risk appears lower, and CICC use may be preferable in patients at high risk of DVT, such as critically ill or cancer populations.[46] An important note to prevent CICC insertion complications relates to use of ultrasound, a practice that has been associated with decreased accidental arterial puncture and hematoma formation. The role of ultrasound guidance with PICCs as well as implications for thrombotic and infectious events remains less characterized at this point.[47]
Tunneled Central Venous Access Devices
Tunneled devices (either CICCs or PICCs) are characterized by the fact that the insertion site on the skin and site of ultimate venipuncture are physically separated (Figure 1E). Tunneling limits bacterial entry from the extraluminal aspect of the CVAD to the bloodstream. For example, internal jugular veins are often ideal sites of puncture but inappropriate sites for catheter placement, as providing care to this area is challenging and may increase risk of infection.[34] Tunneling to the infraclavicular area provides a better option, as it provides an exit site that can be adequately cared for. Importantly, any CVAD (PICCs or CICCs) can be tunneled. Additionally, tunneled CICCs may be used in patients with chronic or impending renal failure where PICCs are contraindicated because entry into dialysis‐relevant vessels is to be avoided.[48] Such devices also allow regular blood sampling in patients who require frequent testing but have limited peripheral access, such as those with hematological malignancies. Additionally, tunneled catheters are more comfortable for patients and viewed as being more socially acceptable than nontunneled devices. However, the more invasive and permanent nature of these devices often requires deliberation prior to insertion.
Of note, tunneled devices and ports may be used as long‐term (>3 months to years) VADs. As our focus in this review is short‐term devices, we will not expand the discussion of these devices as they are almost always used for prolonged durations.[7]
OPERATIONALIZING THE DATA: AN ALGORITHMIC APPROACH TO VENOUS ACCESS
Hospitalists should consider approaching venous access using an algorithm based on a number of parameters. For example, a critically ill patient who requires vasopressor support and hemodynamic monitoring will need a CICC or a PICC. Given the potentially greater risk of thrombosis from PICCs, a CICC is preferable for critically ill patients provided an experienced inserter is available. Conversely, patients who require short‐term (<7–10 days) venous access for infusion of nonirritant, nonvesicant therapy often only require a PIV. In patients with poor or difficult venous access, USGPIVs or midlines may be ideal and preferred over short PIVs. Finally, patients who require longer‐term or home‐based treatment may benefit from early placement of a midline or a PICC, depending again on the nature of the infusion, duration of treatment, and available venous access sites.
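As a thought exercise, the decision logic described above can be sketched in code. This is an illustrative sketch only, not a clinical decision tool; the function name and the day thresholds (drawn loosely from the dwell times discussed in this review) are assumptions.

```python
# Illustrative sketch of the device-selection logic described above. Not a
# clinical tool; thresholds are assumptions loosely based on the dwell times
# discussed in this review (PIV <7-10 days, midline 2-4 weeks).

def suggest_vad(critically_ill: bool,
                needs_vesicant_or_irritant: bool,
                expected_days: int,
                difficult_access: bool) -> str:
    if critically_ill:
        # Vasopressors/hemodynamic monitoring: CICC preferred over PICC
        # given its lower DVT risk, if an experienced inserter is available.
        return "CICC (or PICC if no experienced inserter)"
    if needs_vesicant_or_irritant or expected_days > 28:
        # Vesicants/irritants and long dwells require central termination.
        return "PICC"
    if expected_days <= 10 and not difficult_access:
        return "PIV"
    if difficult_access:
        return "USGPIV or midline"
    return "midline"  # peripheral-compatible infusions of intermediate duration

print(suggest_vad(critically_ill=False, needs_vesicant_or_irritant=False,
                  expected_days=5, difficult_access=False))  # PIV
```

Encoding the branches this way makes the ordering of the considerations explicit: acuity first, then infusate compatibility and duration, then access quality.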
An algorithmic approach considering these parameters is suggested in Figure 2, and a brief overview of the devices and their considerations is shown in Table 3.
| Vascular Access Device | Central/Peripheral | Anatomical Location of Placement | Desired Duration of Placement | Common Uses | BSI Risk (Per 1,000 Catheter‐Days) | Thrombosis Risk | Important Considerations |
|---|---|---|---|---|---|---|---|
| Small peripheral IV | Peripheral | Peripheral veins, usually forearm | 7–10 days | Fluid resuscitation, most medications, blood products | 0.06[8] | Virtually no risk | Consider necessity of PIV daily and remove unnecessary devices |
| Midline | Peripheral | Inserted around antecubital fossa; resides within basilic or cephalic vein of the arm | 2–4 weeks | Long‐term medications excluding TPN, vesicants, corrosives | 0.3–0.8[17] | Insufficient data | Can be used as bridge device for patients completing short‐term antibiotics/infusions as an outpatient |
| Peripherally inserted central catheter | Central | Inserted into peripheral arm vein and advanced through larger veins (eg, subclavian, brachiocephalic) to the CAJ | >1 week, <3 months | Large variety of infusates, including TPN, vesicants, corrosives | 2.4[44] | 6.3% | Contraindicated in patients with CKD stage IIIb or higher |
| Centrally inserted central catheter | Central | Inserted above (internal jugular vein, brachiocephalic vein, subclavian vein) or below the clavicle (axillary vein) | >1 week, <3 months | Same infusate variety as PICC; measurement of central venous pressures; common in trauma/emergent settings | 2.3[44] | 1.3% | Given lower rates of DVT than PICC, preferred in ICU and hypercoagulable patients |
| Tunneled CICC | Central | Placed percutaneously in any large vein in the arm, chest, neck, or groin | >3 months to years | Central infusates, as with any CVAD; used for patients with CKD stage IIIb or greater in whom a PICC is contraindicated | Insufficient data | Insufficient data | Skin exit site and venipuncture site are physically separated, which may reduce risk of infection |

CONCLUSIONS
With strides in technology and progress in medicine, hospitalists have access to an array of options for venous access. However, every VAD has limitations that can be easily overlooked in a perfunctory decision‐making process. The data presented in this review thus provide a first step to improving safety in this evolving science. Studies that further determine appropriateness of VADs in hospitalized settings are necessary. Only through such progressive scientific enquiry will complication‐free venous access be realized.
Disclosure
Nothing to report.
- The need for comparative data in vascular access: the rationale and design of the PICC registry. J Vasc Access. 2013; 18(4): 219–224.
- Hospitalist experiences, practice, opinions, and knowledge regarding peripherally inserted central catheters: a Michigan survey. J Hosp Med. 2013; 8(6): 309–314.
- Clinically-indicated replacement versus routine replacement of peripheral venous catheters. Cochrane Database Syst Rev. 2013; 4: CD007798.
- Cost-effectiveness analysis of clinically indicated versus routine replacement of peripheral intravenous catheters. Appl Health Econ Health Policy. 2014; 12(1): 51–58.
- Infusion Nurses Society. Infusion Nursing Standards of Practice. Norwood, MA: Infusion Nurses Society; 2011.
- Routine versus clinically indicated replacement of peripheral intravenous catheters: a randomised controlled equivalence trial. Lancet. 2012; 380(9847): 1066–1074.
- epic3: national evidence-based guidelines for preventing healthcare-associated infections in NHS hospitals in England. J Hosp Infect. 2014; 86(suppl 1): S1–S70.
- Short peripheral intravenous catheters and infections. J Infus Nurs. 2012; 35(4): 230–240.
- Phlebitis risk varies by peripheral venous catheter site and increases after 96 hours: a large multi-centre prospective study. J Adv Nurs. 2014; 70(11): 2539–2549.
- Intravenous catheter complications in the hand and forearm. J Trauma. 2004; 56(1): 123–127.
- Vesicant extravasation part I: mechanisms, pathogenesis, and nursing care to reduce risk. Oncol Nurs Forum. 2006; 33(6): 1134–1141.
- Variables influencing intravenous catheter insertion difficulty and failure: an analysis of 339 intravenous catheter insertions. Heart Lung. 2005; 34(5): 345–359.
- Ultrasound-guided peripheral venous access: a systematic review of randomized-controlled trials. Eur J Emerg Med. 2014; 21(1): 18–23.
- Difficult peripheral venous access: clinical evaluation of a catheter inserted with the Seldinger method under ultrasound guidance. J Crit Care. 2014; 29(5): 823–827.
- Choosing the right intravenous catheter. Home Healthc Nurse. 2007; 25(8): 523–531; quiz 532–533.
- Adverse reactions associated with midline catheters—United States, 1992–1995. MMWR Morb Mortal Wkly Rep. 1996; 45(5): 101–103.
- The risk of midline catheterization in hospitalized patients: a prospective study. Ann Intern Med. 1995; 123(11): 841–844.
- Midline catheters: indications, complications and maintenance. Nurs Stand. 2007; 22(11): 48–57; quiz 58.
- Safe administration of vancomycin through a novel midline catheter: a randomized, prospective clinical trial. J Vasc Access. 2014; 15(4): 251–256.
- Midline catheters: the middle ground of intravenous therapy administration. J Infus Nurs. 2004; 27(5): 313–321.
- Vascular access in resuscitation: is there a role for the intraosseous route? Anesthesiology. 2014; 120(4): 1015–1031.
- A new system for sternal intraosseous infusion in adults. Prehosp Emerg Care. 2000; 4(2): 173–177.
- Comparison of two intraosseous access devices in adult patients under resuscitation in the emergency department: a prospective, randomized study. Resuscitation. 2010; 81(8): 994–999.
- Comparison study of intraosseous, central intravenous, and peripheral intravenous infusions of emergency drugs. Am J Dis Child. 1990; 144(1): 112–117.
- The safety of intraosseous infusions: risks of fat and bone marrow emboli to the lungs. Ann Emerg Med. 1989; 18(10): 1062–1067.
- ESPEN guidelines on parenteral nutrition: central venous catheters (access, care, diagnosis and therapy of complications). Clin Nutr. 2009; 28(4): 365–377.
- Peripherally inserted central catheter use in the hospitalized patient: is there a role for the hospitalist? J Hosp Med. 2009; 4(6): E1–E4.
- Vital signs: central line-associated blood stream infections—United States, 2001, 2008, and 2009. MMWR Morb Mortal Wkly Rep. 2011; 60(8): 243–248.
- Hospital costs of central line-associated bloodstream infections and cost-effectiveness of closed vs. open infusion containers: the case of intensive care units in Italy. Cost Eff Resour Alloc. 2010; 8: 8.
- The risk of bloodstream infection in adults with different intravascular devices: a systematic review of 200 published prospective studies. Mayo Clin Proc. 2006; 81(9): 1159–1171.
- The Joint Commission. Preventing Central Line-Associated Bloodstream Infections: A Global Challenge, a Global Perspective. Oak Brook, IL: Joint Commission Resources; 2012.
- Guidelines for the prevention of intravascular catheter-related infections. Am J Infect Control. 2011; 39(4 suppl 1): S1–S34.
- The risk of bloodstream infection associated with peripherally inserted central catheters compared with central venous catheters in adults: a systematic review and meta-analysis. Infect Control Hosp Epidemiol. 2013; 34(9): 908–918.
- Society for Healthcare Epidemiology of America, Infectious Diseases Society of America, American Hospital Association, Association for Professionals in Infection Control and Epidemiology, The Joint Commission. Compendium of Strategies to Prevent Healthcare-Associated Infections in Acute Care Hospitals: 2014 Updates. Available at: http://www.shea-online.org. Accessed August 1, 2014.
- Guidelines for venous access in patients with chronic kidney disease: a position statement from the American Society of Diagnostic and Interventional Nephrology Clinical Practice Committee and the Association for Vascular Access. Semin Dial. 2008; 21(2): 186–191.
- Association between prior peripherally inserted central catheters and lack of functioning arteriovenous fistulas: a case-control study in hemodialysis patients. Am J Kidney Dis. 2012; 60(4): 601–608.
- Critical and honest conversations: the evidence behind the "Choosing Wisely" campaign recommendations by the American Society of Nephrology. Clin J Am Soc Nephrol. 2012; 7(10): 1664–1672.
- Do physicians know which of their patients have central venous catheters? A multi-center observational study. Ann Intern Med. 2014; 161(8): 562–567.
- Hospital-wide survey of the use of central venous catheters. J Hosp Infect. 2011; 77(4): 304–308.
- Peripherally inserted central catheter-related deep vein thrombosis: contemporary patterns and predictors. J Thromb Haemost. 2014; 12(6): 847–854.
- An in vitro study comparing a peripherally inserted central catheter to a conventional central venous catheter: no difference in static and dynamic pressure transmission. BMC Anesthesiol. 2010; 10: 18.
- Clinical experience with power-injectable PICCs in intensive care patients. Crit Care. 2012; 16(1): R21.
- Nurse-led central venous catheter insertion: procedural characteristics and outcomes of three intensive care based catheter placement services. Int J Nurs Stud. 2012; 49(2): 162–168.
- Peripherally inserted central venous catheters in the acute care setting: a safe alternative to high-risk short-term central venous catheters. Am J Infect Control. 2010; 38(2): 149–153.
- Which central venous catheters have the highest rate of catheter-associated deep venous thrombosis: a prospective analysis of 2,128 catheter days in the surgical intensive care unit. J Trauma Acute Care Surg. 2013; 74(2): 454–460; discussion 461–462.
- Risk of venous thromboembolism associated with peripherally inserted central catheters: a systematic review and meta-analysis. Lancet. 2013; 382(9889): 311–325.
- Ultrasound guidance versus anatomical landmarks for subclavian or femoral vein catheterization. Cochrane Database Syst Rev. 2015; 1: CD011447.
- Tunneled jugular small-bore central catheters as an alternative to peripherally inserted central catheters for intermediate-term venous access in patients with hemodialysis and chronic renal insufficiency. Radiology. 1999; 213(1): 303–306.
Epithelial Ovarian Cancer: Evaluation, Staging, Surgery, and Stage I and II Disease Management
Edited by: Arthur T. Skarin, MD, FACP, FCCP
Ovarian cancer is the second most common gynecologic cancer among women in the United States. It is also the fifth leading cause of cancer mortality in women and the leading cause of death among women with gynecologic malignancies. The American Cancer Society statistics released in 2015 estimate that 21,290 new cases of ovarian cancer will occur during the year, with approximately 14,180 deaths. Globally, there were 238,719 new cases of ovarian cancer diagnosed in 2012, representing 3.6% of all cancers in women, and nearly 151,905 deaths. The highest incidence of ovarian cancer occurs in northern, central, and eastern Europe, followed by western Europe and North America, with the lowest incidence in parts of Africa and Asia. The majority of women presenting with ovarian cancer will present at an advanced stage, and the 5-year survival in this group is less than 30%.
Cancer-Related Anemia
Anemia occurs in more than half of patients with cancer and is associated with worse performance status, quality of life, and survival. Anemia is often attributed to the effects of chemotherapy; however, a 2004 European Cancer Anemia Survey reported that 39% of patients with cancer were anemic prior to starting chemotherapy and the incidence of anemia may be as high as 90% in patients on chemotherapy. The pathogenesis of cancer-related anemia is multifactorial; it can be a direct result of cancer invading the bone marrow, or result from the effects of radiation, chemotherapy-induced anemia, chronic renal disease, and cancer-related inflammation leading to functional iron deficiency anemia.






