When practice-changing results don’t change practice
The highly favorable results of the CheckMate 816 trial of neoadjuvant chemotherapy plus nivolumab for resectable stage IB-IIIA non–small cell lung cancer (NSCLC) were impressive enough to prompt a Food and Drug Administration approval of this combination in March 2022.
For many, this led to a marked shift in how we approached these patients. But in my conversations with many care teams, they have expressed ambivalence about using the chemoimmunotherapy regimen. Some have conveyed to me that the lack of statistically significant improvement in overall survival is a sticking point. Others have expressed uncertainty about the true benefit of neoadjuvant chemotherapy alongside nivolumab for patients with earlier-stage disease, given that 64% of patients in the trial had stage IIIA disease. The benefit of the neoadjuvant combination in patients with low or negative tumor programmed death–ligand 1 (PD-L1) expression also remains a question mark, though the trial found no significant differences in outcomes by PD-L1 subset.
But among many of my colleagues who favor adjuvant over neoadjuvant therapy, it isn’t necessarily the fine points of the data that present the real barrier: it’s the sentiment that “we just don’t favor a neoadjuvant approach at my place.”
If the worry is that a subset of patients who are eligible for up-front surgery may be derailed from the operating room if they experience significant disease progression or a complication during preoperative therapy, or that surgery will be more difficult after chemoimmunotherapy, those concerns are not supported by evidence. In fact, data on surgical outcomes from CheckMate 816 assessing these issues found that surgery after chemoimmunotherapy was approximately 30 minutes faster than it was after chemotherapy alone. In addition, the combination neoadjuvant chemoimmunotherapy approach was associated with less extensive surgeries, particularly for patients with stage IIIA NSCLC, and patients reported measurably less pain and dyspnea as well.
Though postoperative systemic therapy has been our general approach for resectable NSCLC for nearly 2 decades, there are several reasons to focus on neoadjuvant therapy.
First, immunotherapy may work more effectively when the tumor antigens, as well as the lymph nodes and lymphatic system, are still present in situ.
Second, patients may be eager to complete their treatment within a 3-month period of just three cycles of systemic therapy followed by surgery rather than receiving their treatment over a prolonged chapter of their lives, starting with surgery followed by four cycles of chemotherapy and 1 year of immunotherapy.
Finally, we can’t ignore the fact that most neoadjuvant therapy is delivered exactly as intended, whereas planned adjuvant therapy is often never started and rarely completed as designed. At most, only about half of appropriate candidates for adjuvant chemotherapy even start it, and far fewer complete a full four cycles or go on to complete prolonged adjuvant immunotherapy.
We also can’t underestimate the value of imaging and pathology findings after patients have completed neoadjuvant therapy. In CheckMate 816, achieving a pathologic complete response was predictive of improved event-free survival over time.
And that isn’t just a binary variable of achieving a pathologic complete response or not. The degree of residual viable tumor after surgery is a continuous variable associated with event-free survival across its spectrum. Our colleagues who treat breast cancer have been able to customize postoperative therapy to improve outcomes on the basis of the results achieved with neoadjuvant therapy. Multidisciplinary gastrointestinal oncology teams have revolutionized outcomes in rectal cancer by transitioning to total neoadjuvant therapy, which makes it possible to deliver treatment more reliably and pursue organ-sparing approaches while achieving better survival.
Putting all of this together, I appreciate arguments against the generalizability or the maturity of the data supporting neoadjuvant chemoimmunotherapy for resectable NSCLC. However, sidestepping our most promising advances will harm our patients. Plus, what’s the point of generating practice-changing results if we don’t accept and implement them?
We owe it to our patients to follow the evolving evidence and not just stick to what we’ve always done.
Dr. West is an associate professor at City of Hope Comprehensive Cancer Center in Duarte, Calif., and vice president of network strategy at AccessHope in Los Angeles. Dr. West serves as web editor for JAMA Oncology, edits and writes several sections on lung cancer for UpToDate, and leads a wide range of continuing medical education and other educational programs.
A version of this article first appeared on Medscape.com.
Heart rate, cardiac phase influence perception of time
People’s perception of time is subjective and based not only on their emotional state but also on heartbeat and heart rate (HR), two new studies suggest.
Researchers studied young adults with an electrocardiogram (ECG), measuring electrical activity at millisecond resolution while participants listened to tones that varied in duration. Participants were asked to report whether certain tones were longer or shorter, in relation to others.
The researchers found that the momentary perception of time was not continuous but rather expanded or contracted with each heartbeat. When the heartbeat preceding a tone was shorter, participants regarded the tone as longer in duration; but when the preceding heartbeat was longer, the participants experienced the tone as shorter.
“Our findings suggest that there is a unique role that cardiac dynamics play in the momentary experience of time,” lead author Saeedah Sadeghi, MSc, a doctoral candidate in the department of psychology at Cornell University, Ithaca, N.Y., said in an interview.
The study was published online in Psychophysiology.
In a second study, published in the journal Current Biology, a separate team of researchers asked participants to judge whether a brief event – the presentation of a tone or an image – was shorter or longer than a reference duration. ECG was used to track systole and diastole when participants were presented with these events.
The researchers found that the durations were underestimated during systole and overestimated during diastole, suggesting that time seemed to “speed up” or “slow down,” based on cardiac contraction and relaxation. When participants rated the events as more arousing, their perceived durations contracted, even during diastole.
“In our new paper, we show that our heart shapes the perceived duration of events, so time passes quicker when the heart contracts but slower when the heart relaxes,” lead author Irena Arslanova, PhD, postdoctoral researcher in cognitive neuroscience, Royal Holloway University of London, told this news organization.
Temporal ‘wrinkles’
“Subjective time is malleable,” observed Ms. Sadeghi and colleagues in their report. “Rather than being a uniform dimension, perceived duration has ‘wrinkles,’ with certain intervals appearing to dilate or contract relative to objective time” – a phenomenon sometimes referred to as “distortion.”
“We have known that people aren’t always consistent in how they perceive time, and objective duration doesn’t always explain subjective perception of time,” Ms. Sadeghi said.
Although the potential role of the heart in the experience of time has been hypothesized, research into the heart-time connection has been limited, with previous studies focusing primarily on estimating the average cardiac measures on longer time scales over seconds to minutes.
The current study sought to investigate “the beat-by-beat fluctuations of the heart period on the experience of brief moments in time” because, compared with longer time scales, subsecond temporal perception “has different underlying mechanisms” and a subsecond stimulus can be a “small fraction of a heartbeat.”
To home in on this small fraction, the researchers studied 45 participants (aged 18-21), who listened to 210 tones ranging in duration from 80 ms (short) to 188 ms (long). The tones were linearly spaced at 18-ms increments (80, 98, 116, 134, 152, 170, 188).
Participants were asked to categorize each tone as “short” or “long.” All tones were randomly assigned to be synchronized either with the systolic or diastolic phase of the cardiac cycle (50% each). The tones were triggered by participants’ heartbeats.
In addition, participants engaged in a heartbeat-counting activity, in which they were asked not to touch their pulse but to count their heartbeats by tuning in to their bodily sensations at intervals of 25, 35, and 45 seconds.
‘Classical’ response
“Participants exhibited an increased heart period after tone onset, which returned to baseline following an average canonical bell shape,” the authors reported.
The researchers performed regression analyses to determine how, on average, the heart rate before the tone was related to perceived duration or how the amount of change after the tone was related to perceived duration.
They found that when the heart rate was higher before the tone, participants tended to be more accurate in their time perception. When the heartbeat preceding a tone was shorter, participants experienced the tone as longer; conversely, when the heartbeat was longer, they experienced the duration of the identical sound as shorter.
When participants focused their attention on the sounds, their heart rate was affected such that their orienting responses actually changed their heart rate and, in turn, their temporal perception.
“The orienting response is classical,” Ms. Sadeghi said. “When you attend to something unpredictable or novel, the act of orienting attention decreases the HR.”
She explained that the heartbeats are “noise to the brain.” When people need to perceive external events, “a decrease in HR facilitates the intake of things from outside and facilitates sensory intake.”
A lower HR “makes it easier for the person to take in the tone and perceive it, so it feels as though they perceive more of the tone and the duration seems longer – similarly, when the HR decreases.”
It is unknown whether this is a causal relationship, she cautioned, “but it seems as though the decrease in HR somehow makes it easier to ‘get’ more of the tone, which then appears to have longer duration.”
Bidirectional relationship
“We know that experienced time can be distorted,” said Dr. Arslanova. “Time flies by when we’re busy or having fun but drags on when we’re bored or waiting for something, yet we still don’t know how the brain gives rise to such elastic experience of time.”
The brain controls the heart in response to the information the heart provides about the state of the body, she noted, “but we have begun to see more research showing that the heart–brain relationship is bidirectional.”
This means that the heart plays a role in shaping “how we process information and experience emotions.” In this analysis, Dr. Arslanova and colleagues “wanted to study whether the heart also shapes the experience of time.”
To do so, they conducted two experiments.
In the first, participants (n = 28) were presented with brief events during systole or during diastole. The events took the form of an emotionally neutral visual shape or auditory tone, shown for durations of 200 to 400 ms.
Participants were asked whether these events were of longer or shorter duration, compared with a reference duration.
The researchers found a significant main effect of cardiac phase (F(1,27) = 8.1, P = .01), with stimuli presented at diastole judged, on average, as 7 ms longer than those presented at systole.
They also found a significant main effect of modality (F(1,27) = 5.7, P = .02), with tones judged, on average, as 13 ms longer than visual stimuli.
“This means that time ‘sped up’ during the heart’s contraction and ‘slowed down’ during the heart’s relaxation,” Dr. Arslanova said.
The effect of cardiac phase on duration perception was independent of changes in HR, the authors noted.
In the second experiment, participants performed a similar task, but this time it involved images of faces with emotional expressions. The researchers again observed a similar pattern of time appearing to speed up during systole and slow down during diastole, with stimuli presented at diastole judged as an average of 9 ms longer than those presented at systole.
These opposing effects of systole and diastole on time perception were present only for low and average arousal ratings (b = 14.4 [SE 3.2], P < .001 and b = 9.2 [SE 2.3], P < .001, respectively). However, this effect disappeared when arousal ratings increased (b = 4.1 [SE 3.2], P = .21).
“Interestingly, when participants rated the events as more arousing, their perceived durations contracted, even during the heart’s relaxation,” Dr. Arslanova observed. “This means that in a nonaroused state, the two cardiac phases pull the experienced duration in opposite directions – time contracts, then expands.”
The findings “also predict that increasing HR would speed up passing time, making events seem shorter, because there will be a stronger influence from the heart’s contractions,” she said.
She described the relationship between time perception and emotion as complex, noting that the findings are important because they show “that the way we experience time cannot be examined in isolation from our body,” she said.
Converging evidence
Martin Wiener, PhD, assistant professor, George Mason University, Fairfax, Va., said both papers “provide converging evidence on the role of the heart in our perception of time.”
Together, “the results share that our sense of time – that is, our incoming sensory perception of the present ‘moment’ – is adjusted or ‘gated’ by both our HR and cardiac phase,” said Dr. Wiener, executive director of the Timing Research Forum.
The studies “provide a link between the body and the brain, in terms of our perception, and that we cannot study one without the context of the other,” said Dr. Wiener, who was not involved with the current study.
“All of this opens up a new avenue of research, and so it is very exciting to see,” Dr. Wiener stated.
No source of funding was listed for the study by Ms. Sadeghi and coauthors. They declared no relevant financial relationships.
Dr. Arslanova and coauthors declared no competing interests. Senior author Manos Tsakiris, PhD, receives funding from the European Research Council Consolidator Grant. Dr. Wiener declared no relevant financial relationships.
A version of this article first appeared on Medscape.com.
A version of this article first appeared on Medscape.com.
People’s perception of time is subjective and based not only on their emotional state but also on heartbeat and heart rate (HR), two new studies suggest.
Researchers studied young adults with an electrocardiogram (ECG), measuring electrical activity at millisecond resolution while participants listened to tones that varied in duration. Participants were asked to report whether certain tones were longer or shorter, in relation to others.
The researchers found that the momentary perception of time was not continuous but rather expanded or contracted with each heartbeat. When the heartbeat preceding a tone was shorter, participants regarded the tone as longer in duration; but when the preceding heartbeat was longer, the participants experienced the tone as shorter.
“Our findings suggest that there is a unique role that cardiac dynamics play in the momentary experience of time,” lead author Saeedah Sadeghi, MSc, a doctoral candidate in the department of psychology at Cornell University, Ithaca, N.Y., said in an interview.
The study was published online in Psychophysiology.
In a second study, published in the journal Current Biology, a separate team of researchers asked participants to judge whether a brief event – the presentation of a tone or an image – was shorter or longer than a reference duration. ECG was used to track systole and diastole when participants were presented with these events.
The researchers found that the durations were underestimated during systole and overestimated during diastole, suggesting that time seemed to “speed up” or “slow down,” based on cardiac contraction and relaxation. When participants rated the events as more arousing, their perceived durations contracted, even during diastole.
“In our new paper, we show that our heart shapes the perceived duration of events, so time passes quicker when the heart contracts but slower when the heart relaxes,” lead author Irena Arslanova, PhD, postdoctoral researcher in cognitive neuroscience, Royal Holloway University of London, told this news organization.
Temporal ‘wrinkles’
“Subjective time is malleable,” observed Ms. Sadeghi and colleagues in their report. “Rather than being a uniform dimension, perceived duration has ‘wrinkles,’ with certain intervals appearing to dilate or contract relative to objective time” – a phenomenon sometimes referred to as “distortion.”
“We have known that people aren’t always consistent in how they perceive time, and objective duration doesn’t always explain subjective perception of time,” Ms. Sadeghi said.
Although the potential role of the heart in the experience of time has been hypothesized, research into the heart-time connection has been limited, with previous studies focusing primarily on estimating the average cardiac measures on longer time scales over seconds to minutes.
The current study sought to investigate “the beat-by-beat fluctuations of the heart period on the experience of brief moments in time” because, compared with longer time scales, subsecond temporal perception “has different underlying mechanisms” and a subsecond stimulus can be a “small fraction of a heartbeat.”
To home in on this small fraction, the researchers studied 45 participants (aged 18-21), who listened to 210 tones ranging in duration from 80 ms (short) to 188 ms (long). The tones were linearly spaced at 18-ms increments (80, 98, 116, 134, 152, 170, 188).
Participants were asked to categorize each tone as “short” or “long.” All tones were randomly assigned to be synchronized either with the systolic or diastolic phase of the cardiac cycle (50% each). The tones were triggered by participants’ heartbeats.
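The stimulus design described above can be sketched in a few lines of Python. This is purely illustrative (not the authors' code): 210 tones spanning seven durations, with exactly half synchronized to each cardiac phase.

```python
import random

# Illustrative sketch of the trial design described in the article:
# 210 tones from 80 to 188 ms in 18-ms steps, each randomly assigned
# to the systolic or diastolic cardiac phase (50% each).
DURATIONS_MS = list(range(80, 189, 18))  # [80, 98, 116, 134, 152, 170, 188]

def build_trials(seed=0):
    rng = random.Random(seed)
    durations = DURATIONS_MS * 30           # 7 durations x 30 repeats = 210 tones
    phases = ["systole", "diastole"] * 105  # exactly half in each phase
    rng.shuffle(durations)
    rng.shuffle(phases)
    return list(zip(durations, phases))

trials = build_trials()
```

In the actual experiment each tone's onset was triggered by the participant's own heartbeat, so the phase assignment controlled timing relative to the ECG rather than a fixed schedule.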
In addition, participants engaged in a heartbeat-counting activity, in which they were asked not to touch their pulse but to count their heartbeats by tuning in to their bodily sensations at intervals of 25, 35, and 45 seconds.
‘Classical’ response
“Participants exhibited an increased heart period after tone onset, which returned to baseline following an average canonical bell shape,” the authors reported.
The researchers performed regression analyses to determine how, on average, the heart rate before the tone was related to perceived duration or how the amount of change after the tone was related to perceived duration.
They found that when the heart rate was higher before the tone, participants tended to be more accurate in their time perception. When the heartbeat preceding a tone was shorter, participants experienced the tone as longer; conversely, when the heartbeat was longer, they experienced the duration of the identical sound as shorter.
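The direction of that effect can be illustrated with a toy simulation. The parameters below are invented for illustration and are not the study's data or analysis: a tone is judged "long" when its duration plus perceptual noise exceeds a subjective boundary, and a longer preceding interbeat interval nudges that boundary upward.

```python
import random

# Toy simulation (hypothetical parameters, not the study's data) of the
# reported pattern: a shorter heartbeat preceding the tone biases the
# listener toward judging the tone as "long".
def judge_long(tone_ms, preceding_ibi_ms, rng, boundary_ms=134.0):
    # Shift the subjective boundary by a small fraction of how far the
    # preceding interbeat interval deviates from a nominal 800 ms.
    bias = 0.02 * (preceding_ibi_ms - 800.0)  # longer IBI -> higher boundary
    return tone_ms + rng.gauss(0, 10) > boundary_ms + bias

rng = random.Random(1)
short_ibi = sum(judge_long(134, 700, rng) for _ in range(10_000)) / 10_000
long_ibi = sum(judge_long(134, 900, rng) for _ in range(10_000)) / 10_000
# With a shorter preceding heartbeat, a larger share of identical tones
# is judged "long" -- the qualitative pattern the study reports.
```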
When participants focused their attention on the sounds, their heart rate was affected such that their orienting responses actually changed their heart rate and, in turn, their temporal perception.
“The orienting response is classical,” Ms. Sadeghi said. “When you attend to something unpredictable or novel, the act of orienting attention decreases the HR.”
She explained that the heartbeats are “noise to the brain.” When people need to perceive external events, “a decrease in HR facilitates the intake of things from outside and facilitates sensory intake.”
A lower HR “makes it easier for the person to take in the tone and perceive it, so it feels as though they perceive more of the tone and the duration seems longer.”
It is unknown whether this is a causal relationship, she cautioned, “but it seems as though the decrease in HR somehow makes it easier to ‘get’ more of the tone, which then appears to have longer duration.”
Bidirectional relationship
“We know that experienced time can be distorted,” said Dr. Arslanova. “Time flies by when we’re busy or having fun but drags on when we’re bored or waiting for something, yet we still don’t know how the brain gives rise to such elastic experience of time.”
The brain controls the heart in response to the information the heart provides about the state of the body, she noted, “but we have begun to see more research showing that the heart–brain relationship is bidirectional.”
This means that the heart plays a role in shaping “how we process information and experience emotions.” In this analysis, Dr. Arslanova and colleagues “wanted to study whether the heart also shapes the experience of time.”
To do so, they conducted two experiments.
In the first, participants (n = 28) were presented with brief events during systole or during diastole. The events took the form of an emotionally neutral visual shape or auditory tone, presented for durations of 200 to 400 ms.
Participants were asked whether these events were of longer or shorter duration, compared with a reference duration.
The researchers found a significant main effect of cardiac phase (F(1,27) = 8.1, P = .01), with stimuli presented at diastole regarded, on average, as 7 ms longer than those presented at systole.
They also found a significant main effect of modality (F(1,27) = 5.7, P = .02), with tones judged, on average, as 13 ms longer than visual stimuli.
“This means that time ‘sped up’ during the heart’s contraction and ‘slowed down’ during the heart’s relaxation,” Dr. Arslanova said.
The effect of cardiac phase on duration perception was independent of changes in HR, the authors noted.
In the second experiment, participants performed a similar task, but this time the events were images of faces with emotional expressions. The researchers again observed time appearing to speed up during systole and slow down during diastole, with stimuli presented at diastole regarded as an average of 9 ms longer than those presented at systole.
These opposing effects of systole and diastole on time perception were present only for low and average arousal ratings (b = 14.4 [SE 3.2], P < .001 and b = 9.2 [SE 2.3], P < .001, respectively). However, this effect disappeared when arousal ratings increased (b = 4.1 [SE 3.2], P = .21).
“Interestingly, when participants rated the events as more arousing, their perceived durations contracted, even during the heart’s relaxation,” Dr. Arslanova observed. “This means that in a nonaroused state, the two cardiac phases pull the experienced duration in opposite directions – time contracts, then expands.”
The findings “also predict that increasing HR would speed up passing time, making events seem shorter, because there will be a stronger influence from the heart’s contractions,” she said.
She described the relationship between time perception and emotion as complex, noting that the findings are important because they show “that the way we experience time cannot be examined in isolation from our body.”
Converging evidence
Martin Wiener, PhD, assistant professor, George Mason University, Fairfax, Va., said both papers “provide converging evidence on the role of the heart in our perception of time.”
Together, “the results share that our sense of time – that is, our incoming sensory perception of the present ‘moment’ – is adjusted or ‘gated’ by both our HR and cardiac phase,” said Dr. Wiener, executive director of the Timing Research Forum.
The studies “provide a link between the body and the brain, in terms of our perception, and that we cannot study one without the context of the other,” said Dr. Wiener, who was not involved with the current study.
“All of this opens up a new avenue of research, and so it is very exciting to see,” Dr. Wiener stated.
No source of funding was listed for the study by Ms. Sadeghi and coauthors. They declared no relevant financial relationships.
Dr. Arslanova and coauthors declared no competing interests. Senior author Manos Tsakiris, PhD, receives funding from the European Research Council Consolidator Grant. Dr. Wiener declared no relevant financial relationships.
A version of this article first appeared on Medscape.com.
FROM PSYCHOPHYSIOLOGY
Nasal COVID treatment shows early promise against multiple variants
A nasal antiviral spray may block SARS-CoV-2 if used within 4 hours after infection inside the nose, new research reveals.
Known as TriSb92 (brand name Covidin, from drugmaker Pandemblock Oy in Finland), the viral inhibitor also appears effective against all coronavirus variants of concern, neutralizing even the Omicron variants BA.5, XBB, and BQ.1.1 in laboratory and mice studies.
Unlike a COVID vaccine that boosts a person’s immune system as protection, the antiviral nasal spray works more directly by blocking the virus, acting as a “biological mask in the nasal cavity,” according to the biotechnology company set up to develop the treatment.
The product targets a stable site on the spike protein of the virus that is not known to mutate. This same site is shared among many variants of the COVID virus, so it could be effective against future variants as well, researchers note.
“In animal models, by directly inactivating the virus, TriSb92 offers immediate and robust protection” against coronavirus infection and severe COVID, said Anna R. Mäkelä, PhD, lead author of the study and a senior scientist in the department of virology at the University of Helsinki.
The study was published online in Nature Communications.
A potential first line of defense
Even in cases where the antiviral does not prevent coronavirus infection, the treatment could slow infection. This could happen by limiting how much virus could replicate early in the skin inside the nose and nasopharynx (the upper part of the throat), said Dr. Mäkelä, who is also CEO of Pandemblock Oy, the company set up to develop the product.
“TriSb92 could effectively tip the balance in favor of [the person] and thereby help to reduce the risk of severe COVID-19 disease,” she said.
The antiviral also could offer an alternative to people who cannot or do not respond to a vaccine.
“Many elderly people as well as individuals who are immunodeficient for various reasons do not respond to vaccines and are in need of other protective measures,” said Kalle Saksela, MD, PhD, senior author of the study and a virologist at the University of Helsinki.
Multiple doses needed?
TriSb92 is “one of multiple nasal spray approaches but unlikely to be as durable as effective nasal vaccines,” said Eric Topol, MD, a professor of molecular medicine and executive vice president of Scripps Research in La Jolla, Calif. Dr. Topol is also editor-in-chief of Medscape, WebMD’s sister site for medical professionals.
“The sprays generally require multiple doses per day, whereas a single dose of a nasal vaccine may protect for months,” he said.
“Both have the allure of being variant-proof,” Dr. Topol added.
Thinking small
Many laboratories are shifting from treatments using monoclonal antibodies to treatments using smaller antibody fragments called “nanobodies” because they are more cost-effective and are able to last longer in storage, Dr. Mäkelä and colleagues noted.
Several of these nanobodies have shown promise against viruses in cell culture or animal models, including as an intranasal preventive treatment for SARS-CoV-2.
One of these smaller antibodies, for example, is being developed from llamas; another comes from experiments with yeast to develop synthetic nanobodies; and in a third case, researchers isolated nanobodies from llamas and mice and showed they could neutralize the SARS-CoV-2 virus.
These nanobodies and TriSb92 target a specific part of the coronavirus spike protein called the receptor-binding domain (RBD). The RBD is where the coronavirus attaches to cells in the body. These agents essentially trick the virus by changing the structure of the outside of cells, so they look like a virus has already fused to them. This way, the virus moves on.
Key findings
The researchers compared mice treated with TriSb92 before and after exposure to SARS-CoV-2. When given in advance, none of the treated mice had SARS-CoV-2 RNA in their lungs, while untreated mice in the comparison group had “abundant” levels.
Other evidence of viral infection showed similar differences between treated and untreated mice in the protective lining of cells called the epithelium inside the nose, nasal mucosa, and airways.
Similarly, when given 2 or 4 hours after SARS-CoV-2 had already infected the epithelium, TriSb92 was linked to a complete lack of the virus’s RNA in the lungs.
It was more effective against the virus, though, when given before infection rather than after, “perhaps due to the initial establishment of the infection,” the researchers note.
The company led by Dr. Mäkelä is now working to secure funding for clinical trials of TriSb92 in humans.
A version of this article first appeared on WebMD.com.
FROM NATURE COMMUNICATIONS
Cluster, migraine headache strongly linked to circadian rhythm
A meta-analysis of 16 studies showed a circadian pattern in 71% of cluster headache attacks (3,490 of 4,953), with a clear circadian peak between 9:00 p.m. and 3:00 a.m.
Migraine was also associated with a circadian pattern in 50% of cases (2,698 of 5,385) across eight studies, with a clear circadian trough between 11:00 p.m. and 7:00 a.m.
Seasonal peaks were also evident for cluster headache (spring and autumn) and migraine (April to October).
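The headline percentages follow directly from the counts reported above. A quick arithmetic check (the rounding here is ours, not the paper's):

```python
# Sanity check of the proportions quoted above, from the raw counts.
# Note that 3,490/4,953 is about 70.5%, which the article reports as 71%.
cluster = 3490 / 4953    # fraction of cluster headache attacks with circadian timing
migraine = 2698 / 5385   # fraction of migraine cases with circadian timing
```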
“In the short term, these findings help us explain the timing to patients – for example, it is possible that a headache at 8 a.m. is due to their internal body clock instead of their pillow, or breakfast food, or morning medications,” lead investigator Mark Burish, MD, PhD, associate professor, department of neurosurgery, at University of Texas Health Houston, told this news organization.
“In the long term, these findings do suggest that medications that target the circadian system could be effective in migraine and headache patients,” Dr. Burish added.
The study was published online in Neurology.
Treatment implications?
Across studies, chronotype was “highly variable” for both cluster headache and migraine, the investigators report.
Cluster headache was associated with lower melatonin and higher cortisol levels, compared with non–cluster headache controls.
On a genetic level, cluster headache was associated with two core circadian genes (CLOCK and REV-ERB–alpha), and five of the nine genes that increase the likelihood of having cluster headache are genes with a circadian pattern of expression.
Migraine headache was associated with lower urinary melatonin levels and with the core circadian genes, CK1-delta and ROR-alpha, and 110 of the 168 genes associated with migraine were clock-controlled genes.
“The data suggest that both of these headache disorders are highly circadian at multiple levels, especially cluster headache,” Dr. Burish said in a release.
“This reinforces the importance of the hypothalamus – the area of the brain that houses the primary biological clock – and its role in cluster headache and migraine. It also raises the question of the genetics of triggers such as sleep changes that are known triggers for migraine and are cues for the body’s circadian rhythm,” Dr. Burish said.
“We hope that future research will look into circadian medications as a new treatment option for migraine and cluster headache patients,” Dr. Burish told this news organization.
Importance of sleep regulation
The authors of an accompanying editorial note that even though the study doesn’t have immediate clinical implications, it offers a better understanding of the way chronobiologic factors may influence treatment.
“At a minimum, interventions known to regulate and improve sleep (e.g., melatonin, cognitive behavioral therapy), and which are safe and straightforward to introduce, may be useful in some individuals susceptible to circadian misalignment or sleep disorders,” write Heidi Sutherland, PhD, and Lyn Griffiths, PhD, with Queensland University of Technology, Brisbane, Australia.
“Treatment of comorbidities (e.g., insomnia) that result in sleep disturbances may also help headache management. Furthermore, chronobiological aspects of any pharmacological interventions should be considered, as some frequently used headache and migraine drugs can modulate circadian cycles and influence the expression of circadian genes (e.g., verapamil), or have sleep-related side effects,” they add.
A limitation of the study was the lack of information on factors that could influence the circadian cycle, such as medications; other disorders, such as bipolar disorder; or circadian rhythm issues, such as night-shift work.
The study was supported by grants from the Japan Society for the Promotion of Science, the National Institutes of Health, The Welch Foundation, and The Will Erwin Headache Research Foundation. Dr. Burish is an unpaid member of the medical advisory board of Clusterbusters, and a site investigator for a cluster headache clinical trial funded by Lundbeck. Dr. Sutherland has received grant funding from the U.S. Migraine Research Foundation, and received institute support from Queensland University of Technology for genetics research. Dr. Griffiths has received grant funding from the Australian NHMRC, U.S. Department of Defense, and the U.S. Migraine Research Foundation, and consultancy funding from TEVA.
A version of this article first appeared on Medscape.com.
The study was supported by grants from the Japan Society for the Promotion of Science, the National Institutes of Health, The Welch Foundation, and The Will Erwin Headache Research Foundation. Dr. Burish is an unpaid member of the medical advisory board of Clusterbusters, and a site investigator for a cluster headache clinical trial funded by Lundbeck. Dr. Sutherland has received grant funding from the U.S. Migraine Research Foundation, and received institute support from Queensland University of Technology for genetics research. Dr. Griffiths has received grant funding from the Australian NHMRC, U.S. Department of Defense, and the U.S. Migraine Research Foundation, and consultancy funding from TEVA.
A version of this article first appeared on Medscape.com.
FROM NEUROLOGY
Kickback Scheme Nets Prison Time for Philadelphia VAMC Service Chief
A former manager at the Philadelphia Veterans Affairs Medical Center (VAMC) has been sentenced to 6 months in federal prison for his part in a bribery scheme.
Ralph Johnson was convicted of accepting $30,000 in kickbacks and bribes for steering contracts to Earron and Carlicha Starks, who ran Ekno Medical Supply and Collondale Medical Supply from 2009 to 2019. Johnson served as chief of environmental services at the medical center. He admitted to receiving cash in binders and packages mailed to his home between 2018 and 2019.
The Starkses pleaded guilty first to paying kickbacks on $7 million worth of contracts to Florida VA facilities, then participated in a sting that implicated Johnson.
The VA Office of Inspector General began investigating Johnson in 2018 after the Starkses, who were indicted for bribing staff at US Department of Veterans Affairs (VA) hospitals in Miami and West Palm Beach, Florida, said they also paid officials in VA facilities on the East Coast.
According to the Philadelphia Inquirer, the judge credited Johnson’s past military service and his “extensive cooperation” with federal authorities investigating fraud within the VA. Johnson apologized to his former employers: “Throughout these 2 and a half years [since the arrest] there’s not a day I don’t think about the wrongness that I did.”
In addition to the prison sentence, Johnson has been ordered to pay back, at $50 a month, the $440,000-plus he cost the Philadelphia VAMC in fraudulent and bloated contracts.
Johnson is at least the third Philadelphia VAMC employee indicted or sentenced for fraud since 2020.
Song stuck in your head? What earworms reveal about health
If Miley Cyrus has planted “Flowers” in your head, rest assured you’re not alone.
An earworm – a bit of music you can’t shake from your brain – happens to almost everyone.
The culprit is typically a song you’ve heard repeatedly with a strong rhythm and melody (like Miley’s No. 1 hit this year).
It pops into your head and stays there, unbidden and often unwanted. As you fish for something new on Spotify, there’s always a chance that a catchy hook holds an earworm.
“A catchy tune or melody is the part of a song most likely to get stuck in a person’s head, often a bit from the chorus,” said Elizabeth H. Margulis, PhD, a professor at Princeton (N.J.) University and director of its music cognition lab. The phenomenon, which has been studied since 1885 (way before earbuds), goes by such names as stuck song syndrome, sticky music, musical imagery repetition, intrusive musical imagery, or the semi-official term, involuntary musical imagery, or INMI.
Research confirms how common it is. A 2020 study of American college students found that 97% had experienced an earworm in the past month, similar to findings from a larger Finnish survey done more than 10 years ago.
One in five people had experienced an earworm more than once a day, the study found. The typical length was 10-30 minutes, though 8.5% said theirs lasted more than 3 hours. Levels of “distress and interference” that earworms caused were mostly “mild to moderate.”
Some 86% said they tried to stop it – most frequently by distraction, like talking to a friend or listening to another song.
If music is important to you, your earworms are more likely to last longer and be harder to control, earlier research found. And women are thought to be more likely to have them.
“Very musical people may have more earworms because it’s easy for them to conjure up a certain tune,” says David Silbersweig, MD, chairman of the department of psychiatry and codirector of the Institute for the Neurosciences at Brigham and Women’s Hospital in Boston.
Moreover, people who lack “psychological flexibility” may find earworms more bothersome. The more they try to avoid or control intrusive thoughts (or songs), the more persistent those thoughts become.
“This is consistent with OCD (obsessive-compulsive disorder) research on the paradoxical effect of thought suppression,” the authors of the 2020 study wrote. In fact, people who report very annoying or stressful earworms are more likely to have obsessive-compulsive symptoms.
That makes them worth a closer look.
Digging for the source of earworms
Scientists trace earworms to the auditory cortex in the temporal lobe of the brain, which controls how you perceive music, as well as deep temporal lobe areas that are responsible for retrieving memories. Your amygdala and ventral striatum, parts of your brain that involve emotion, also tie into the making of an earworm.
MRI experiments found that “INMI is a common internal experience recruiting brain networks involved in perception, emotions, memory and spontaneous thoughts,” a 2015 paper in Consciousness and Cognition reported.
These brain networks work in tandem if you connect a song to an emotional memory – that’s when you’re more likely to experience it as an earworm. The “loop” of music you’ll hear in your head is usually a 20-second snippet.
Think of it as a “cognitive itch,” as researchers from the Netherlands put it. An earworm can be triggered by associating a song with a specific situation or emotion. Trying to suppress it just reminds you it’s there, “scratching” the itch and making it worse. “The more one tries to suppress the songs, the more their impetus increases, a mental process known as ironic process theory,” they wrote.
“It’s also worth pointing out that earworms don’t always occur right after a song ends,” said Michael K. Scullin, PhD, an associate professor of psychology and neuroscience at Baylor University in Waco, Tex. “Sometimes they only occur many hours later, and sometimes the earworm isn’t the song you were most recently listening to.”
These processes aren’t fully understood, he said, “but they likely represent memory consolidation mechanisms; that is, the brain trying to reactivate and stabilize musical memories.” Kind of like switching “radio stations” in your head.
When to worry
Earworms are most often harmless. “They’re part of a healthy brain,” said Dr. Silbersweig. But in rare cases, they indicate certain medical conditions. People with OCD, for example, have been shown to have earworms during times of stress. If this is the case, cognitive-behavioral therapy as well as some antidepressants may help.
Take an earworm seriously if it’s linked to other symptoms, said Elaine Jones, MD, a neurologist in Hilton Head, S.C., and a fellow of the American Academy of Neurology. Those symptoms could include “loss of consciousness or confusion, visual loss or changes, speech arrest, tremors of arms or legs,” she said.
“Most worrisome would be a seizure, but other causes could include a migraine aura. In a younger person, less than 20 years old, this kind of earworm could indicate a psychiatric condition like schizophrenia.” Drug toxicity or brain damage can also present with earworms.
Her bottom line: “If an earworm is persistent for more than 24 hours, or if it is associated with the other symptoms mentioned above, it would be important to reach out to your primary care doctor to ensure that nothing more serious is going on,” said Dr. Jones. With no other symptoms, “it is more likely to be just an earworm.”
Japanese research also links earworms lasting several hours a day to depression. When such earworms accompany symptoms like low mood, insomnia, and loss of appetite, treatment may help.
There’s another category called “musical hallucinations” – where the person thinks they are actually hearing music, which could be a symptom of depression, although scientists don’t know for sure. The drug vortioxetine, which may help boost serotonin in the brain, has shown some promise in reducing earworms.
Some research has shown that diseases that damage the auditory pathway in the brain have a link to musical hallucinations.
How to stop a simple earworm
Here are six easy ways to make it stop:
- Mix up your playlist. “Listening to songs repeatedly does increase the likelihood that they’ll get stuck,” said Dr. Margulis.
- Take breaks from your tunes throughout the day. “Longer listening durations are more likely to lead to earworms,” Dr. Scullin said.
- Use your feet. Walk at a tempo different from the beat of your earworm. This will interrupt your memory of the tempo and can help chase away the earworm.
- Stick with that song. “Listen to a song all the way through,” said Dr. Silbersweig. If you only listen to snippets of a song, the Zeigarnik effect can take hold. That’s the brain’s tendency to remember interrupted things more easily than completed things.
- Distract yourself. Lose yourself in a book, a movie, your work, or a hobby that requires concentration. “Redirecting attention to an absorbing task can be an effective way to dislodge an earworm,” said Dr. Margulis.
- Chew gum. Research shows that the action of doing so interferes with repetitive memories and stops your mind from “scanning” a song. Then enjoy the sound of silence!
A version of this article first appeared on WebMD.com.
If Miley Cyrus has planted “Flowers” in your head, rest assured you’re not alone.
An earworm – a bit of music you can’t shake from your brain – happens to almost everyone.
The culprit is typically a song you’ve heard repeatedly with a strong rhythm and melody (like Miley’s No. 1 hit this year).
It pops into your head and stays there, unbidden and often unwanted. As you fish for something new on Spotify, there’s always a chance that a catchy hook holds an earworm.
“A catchy tune or melody is the part of a song most likely to get stuck in a person’s head, often a bit from the chorus,” said Elizabeth H. Margulis, PhD, a professor at Princeton (N.J.) University and director of its music cognition lab. The phenomenon, which has been studied since 1885 (way before earbuds), goes by such names as stuck song syndrome, sticky music, musical imagery repetition, intrusive musical imagery, or the semi-official term, involuntary musical imagery, or INMI.
Research confirms how common it is. A 2020 study of American college students found that 97% had experienced an earworm in the past month, similar to findings from a larger Finnish survey done more than 10 years ago.
One in five people had experienced an earworm more than once a day, the study found. The typical length was 10-30 minutes, though 8.5% said theirs lasted more than 3 hours. Levels of “distress and interference” that earworms caused was mostly “mild to moderate.”
Some 86% said they tried to stop it – most frequently by distraction, like talking to a friend or listening to another song.
If music is important to you, your earworms are more likely to last longer and be harder to control, earlier research found. And women are thought to be more likely to have them.
“Very musical people may have more earworms because it’s easy for them to conjure up a certain tune,” says David Silbersweig, MD, chairman of the department of psychiatry and codirector of the Institute for the Neurosciences at Brigham and Women’s Hospital in Boston.
Moreover, people who lack “psychological flexibility” may find earworms more bothersome. The more they try to avoid or control intrusive thoughts (or songs), the more persistent those thoughts become.
“This is consistent with OCD (obsessive-compulsive disorder) research on the paradoxical effect of thought suppression,” the authors of the 2020 study wrote. In fact, people who report very annoying or stressful earworms are more likely to have obsessive-compulsive symptoms.
That makes them worth a closer look.
Digging for the source of earworms
Scientists trace earworms to the auditory cortex in the temporal lobe of the brain, which controls how you perceive music, as well as deep temporal lobe areas that are responsible for retrieving memories. Your amygdala and ventral striatum, parts of your brain that involve emotion, also tie into the making of an earworm.
MRI experiments found that “INMI is a common internal experience recruiting brain networks involved in perception, emotions, memory and spontaneous thoughts,” a 2015 paper in Consciousness and Cognition reported.
These brain networks work in tandem if you connect a song to an emotional memory – that’s when you’re more likely to experience it as an earworm. The “loop” of music you’ll hear in your head is usually a 20-second snippet.
Think of it as a “cognitive itch,” as researchers from the Netherlands put it. An earworm can be triggered by associating a song with a specific situation or emotion. Trying to suppress it just reminds you it’s there, “scratching” the itch and making it worse. “The more one tries to suppress the songs, the more their impetus increases, a mental process known as ironic process theory,” they wrote.
“It’s also worth pointing out that earworms don’t always occur right after a song ends,” said Michael K. Scullin, PhD, an associate professor of psychology and neuroscience at Baylor University in Waco, Tex. “Sometimes they only occur many hours later, and sometimes the earworm isn’t the song you were most recently listening to.”
These processes aren’t fully understood, he said, “but they likely represent memory consolidation mechanisms; that is, the brain trying to reactivate and stabilize musical memories.” Kind of like switching “radio stations” in your head.
When to worry
Earworms are most often harmless. “They’re part of a healthy brain,” said Dr. Silbersweig. But in rare cases, they indicate certain medical conditions. People with OCD, for example, have been shown to have earworms during times of stress. If this is the case, cognitive-behavioral therapy as well as some antidepressants may help.
Take an earworm seriously if it’s linked to other symptoms, said Elaine Jones, MD, a neurologist in Hilton Head, S.C., and a fellow of the American Academy of Neurology. Those symptoms could include “loss of consciousness or confusion, visual loss or changes, speech arrest, tremors of arms or legs,” she said.
“Most worrisome would be a seizure, but other causes could include a migraine aura. In a younger person, less than 20 years old, this kind of earworm could indicate a psychiatric condition like schizophrenia.” Drug toxicity or brain damage can also present with earworms.
Her bottom line: “If an earworm is persistent for more than 24 hours, or if it is associated with the other symptoms mentioned above, it would be important to reach out to your primary care doctor to ensure that nothing more serious is going on,” said Dr. Jones. With no other symptoms, “it is more likely to be just an earworm.”
Japanese research also indicates that an earworm that lasts for several hours in a day can be linked to depression. If a person has symptoms such as low mood, insomnia, and loss of appetite, along with earworms that last several hours a day, treatment may help.
There’s another category called “musical hallucinations” – where the person thinks they are actually hearing music, which could be a symptom of depression, although scientists don’t know for sure. The drug vortioxetine, which may help boost serotonin in the brain, has shown some promise in reducing earworms.
Some research has shown that diseases that damage the auditory pathway in the brain have a link to musical hallucinations.
How to stop a simple earworm
Here are six easy ways to make it stop:
- Mix up your playlist. “Listening to songs repeatedly does increase the likelihood that they’ll get stuck,” said Dr. Margulis.
- Take breaks from your tunes throughout the day. “Longer listening durations are more likely to lead to earworms,” Dr. Scullin said.
- Use your feet. Try walking at a pace faster or slower than the beat of your earworm. This will interrupt your memory of the tempo and can help chase away the earworm.
- Stick with that song. “Listen to a song all the way through,” said Dr. Silbersweig. If you only listen to snippets of a song, the Zeigarnik effect can take hold. That’s the brain’s tendency to remember interrupted things more easily than completed ones.
- Distract yourself. Lose yourself in a book, a movie, your work, or a hobby that requires concentration. “Redirecting attention to an absorbing task can be an effective way to dislodge an earworm,” said Dr. Margulis.
- Chew gum. Research shows that the action of doing so interferes with repetitive memories and stops your mind from “scanning” a song. Then enjoy the sound of silence!
A version of this article first appeared on WebMD.com.
Analysis identifies gaps in CV risk screening of patients with psoriasis
Just 16% of psoriasis-related dermatology visits included screening for at least one cardiovascular (CV) risk factor, according to an analysis of 10 years of national survey data.
From 2007 to 2016, national screening rates for four CV risk factors at 14.8 million psoriasis-related visits to dermatology providers were 11% (body-mass index), 7.4% (blood pressure), 2.9% (cholesterol), and 1.7% (glucose). Data from the National Ambulatory Medical Care Survey showed that at least one of the four factors was screened at 16% of dermatology visits, said William B. Song, BS, of the department of dermatology, University of Pennsylvania, Philadelphia, and associates.
The main focus of their study, however, was regional differences. “CV risk factor screening by dermatology providers for patients with psoriasis is low across all regions of the United States and lowest in the South, the region that experiences the highest CVD burden in the United States,” they wrote in a letter to the editor.
Compared with the South, the adjusted odds of any CV screening were 0.98 in the West, 1.25 in the Northeast, and 1.92 in the Midwest. Blood pressure screening was significantly higher in all three regions, compared with the South, while BMI screening was actually lower in the West (0.74), the investigators reported. Odds ratios were not available for cholesterol and glucose screening because of sample size limitations.
The regional variation in screening rates “is not explained by patient demographics or disease severity,” they noted, adding that 2.8 million visits with BP screening would have been added over the 10-year study period “if providers in the South screened patients with psoriasis for high blood pressure at the same rate as providers in the Northeast.”
Guidelines published in 2019 by the American Academy of Dermatology and the National Psoriasis Foundation – which were cowritten by Joel M. Gelfand, MD, senior author of the current study – noted that dermatologists “play an important role in evidence-based screening of CV risk factors in patients with psoriasis,” the investigators wrote. But the regional variations suggest “that some regions experience barriers to appropriate screening or challenges in adhering to guidelines for managing psoriasis and CV risk.”
While the lack of data from after 2016 is one of the study limitations, they added, “continued efforts to develop effective interventions to improve CV screening and care for people with psoriasis in all regions of the U.S. are needed to more effectively address the burden of CV disease experienced by people with psoriasis.”
The study was partly funded by the National Psoriasis Foundation. Three of the seven investigators disclosed earnings from private companies in the form of consultant fees, research support, and honoraria. Dr. Gelfand is a deputy editor for the Journal of Investigative Dermatology.
FROM THE JOURNAL OF INVESTIGATIVE DERMATOLOGY
Over-the-scope clips in routine nonvariceal bleed still uncertain
Over-the-scope clips (OTSCs) reduced further bleeding compared with standard hemostatic treatment when used as primary treatment in patients with high-risk nonvariceal upper gastrointestinal lesions, a randomized controlled trial (RCT) shows.
However, both the investigators, writing in Annals of Internal Medicine, and the physicians who wrote an accompanying editorial noted that reservations remain about first-line use of OTSCs, relating mostly to method, technique, and cost.
“The absolute difference in the rate of further bleeding was 11.4 percentage points. We should however be cautious in our recommendation of using OTSC as first-line treatment,” wrote researchers who were led by James Y.W. Lau, MD, from Prince of Wales Hospital, Chinese University of Hong Kong.
“The primary use of OTSCs may find a role in the treatment of ulcers predicted to fail standard endoscopic treatment,” the authors wrote. However, they emphasized that, “We are not advocating routine primary use of OTSCs. These clips are costly, and a formal cost analysis is not available in the literature. The use of OTSCs involves scope withdrawal, mounting of the OTSCs, and scope reinsertion, which increase the procedure time. Endoscopists also require training before using OTSCs.”
Alan N. Barkun, MD, gastroenterologist and professor of medicine with McGill University, Montreal, who cowrote the editorial accompanying the research paper, said the study investigators were highly experienced surgeon-scientists, pointing out that, overall, first-line use of OTSC in this patient group improved patient outcomes.
“The main message here is that if you can position the clip properly, then it is likely to stay in place, better than standard approaches,” he said, adding that, “I support it fully for second-line use but there currently still exists uncertainty for routine first-line adoption in nonvariceal bleeding. Clinicians fail to position the clip properly in around 5% of patients which is higher than standard endoscopic approaches, and nobody has yet clearly defined the lesions that are difficult to clip with the OTSC.
“If you’re going to tell people to use it, then you need to tell them with which particular lesions OTSC works best as first-line approach,” he added.
Lesions of concern include those encountered upon leaving the stomach and entering the duodenum, and in passing from the first to the second part of the duodenum. “These are tight areas, and these larger full-thickness bite OTSC may create pseudo-polyps, even possibly causing obstruction. Perforation is also a risk.” One of each of these complications was noted in this study.
The study included 190 adult patients with active bleeding or a nonbleeding visible vessel from a nonvariceal cause on upper gastrointestinal endoscopy. Of these, 97 patients received standard hemostatic treatment and 93 received OTSC. The primary endpoint, 30-day probability of further bleeding, was 14.6% in the standard treatment group and 3.2% in the OTSC group (risk difference, 11.4 percentage points; 95% confidence interval, 3.3-20.0 percentage points; P = .006). Failure to control bleeding after assigned endoscopic treatment occurred in 6 patients in the standard treatment group versus 1 in the OTSC group, and 30-day recurrent bleeding occurred in 8 versus 2 patients, respectively. Eight patients in the standard treatment group needed further intervention, compared with two in the OTSC group. Thirty-day mortality was four versus two, respectively.
“First-line OTSC has a role to play but whether it is the best approach is hard to say due to methodological limitations that were seen in this and earlier studies, however if you can position the clip properly it likely does well,” Dr. Barkun said.
Dr. Lau declares that he received honorarium for a lecture from OVESCO. Dr. Li has no disclosures. Dr. Barkun has no relevant disclosures.
FROM ANNALS OF INTERNAL MEDICINE
Take time to relax and enjoy the ride
This past weekend was one of my least-favorite parts of the annual cycle: I shut off and drained my hot tub.
I’ve always loved sitting in hot tubs, as far back as I can remember. Growing up on family vacations I preferred them to the pool. So when I was grown up and could afford one, I got it for my house.
I spend my winter weekend afternoons relaxing in it with a can of beer, some bottles of iced tea, and a pile of journals or a book. I put instrumental jazz on my phone and spend a few pleasant hours there, catching up on my reading.
But, as the Phoenix weather swings back to summer temps, it’s time to turn it off until next November.
It’s interesting the ways we mark the passage of time in our lives. The traditional standards are New Year’s, major holidays, and birthdays. Some may mark it by their favorite sports seasons starting.
In medicine we may mark it by patient ages, or a drug that we thought just came to market now going generic, or realizing our state or DEA license is up for renewal.
It doesn’t really matter how you mark the time – it’s going to happen whether you do or don’t. The person you see in the mirror is the same one there since you were tall enough to see over the bathroom countertop. Isn’t it just the ones around us who change?
As Phoenix moves back to a summer footing, and as someone who’s been through 56 of them, it’s hard not to think about it. Summer vacations growing up, summer classes in college, summer elective rotations in medical school. Now I work year-round and watch the same cycle play out with my kids in college.
You often hear the phrase “a hundred years from now it won’t make a difference.” Probably true. In 2123 the time I spent relaxing in my hot tub won’t mean anything, or be remembered by anyone.
But I’m not sitting in it to think about that. I’m in it because I have what I have now, and none of us will ever have that again. And part of that, to me, is enjoying some time in the hot tub.
That may not matter in one hundred years, but it matters to me today. And that’s what’s really important.
To all of us.
Dr. Block has a solo neurology practice in Scottsdale, Ariz.
Targeted Agents Plus Endocrine Therapy in HR+/HER2- Advanced Breast Cancer
The past decade has brought tremendous progress in the treatment of HR+/HER2- advanced breast cancer. The use of newer targeted agents, including PI3K, mTOR, and CDK4/6 inhibitors, in combination with endocrine therapy (ET) may prove a valuable strategy to overcome ET resistance and further extend patient survival.
In this ReCAP, Dr Richard Finn, of the Geffen School of Medicine at UCLA, discusses the growing body of research into CDK4/6 inhibition, PI3K inhibition, and mTOR inhibition, with and without ET. He touches on findings from the EMERALD, MAINTAIN, and monarchE studies to highlight evidence supporting that these targeted agents, in combination with ET, may improve outcomes for patients with advanced disease.
He comments on approaches to sequencing ET and novel agents for patients with recurrence or disease progression, taking into consideration their unique tumor burden, pace of disease, and possible gene mutations. Dr Finn concludes by advising that clinical trial enrollment can provide high-risk patients access to the newest treatments.
--
Professor, Department of Medicine, Division of Hematology/Oncology, Geffen School of Medicine at UCLA, Los Angeles, California
Richard S. Finn, MD, has disclosed the following relevant financial relationships:
Serve(d) as a director, officer, partner, employee, advisor, consultant, or trustee for: AstraZeneca; Bayer; CStone; Eisai; Exelixis; Eli Lilly; Pfizer; Merck; Roche; Genentech; Jiangsu Hengrui
Serve(d) as a speaker or a member of a speakers bureau for: Genentech Institution received research grant from: Bayer; Eli Lilly; Eisai; Pfizer; Roche; Genentech
Received income in an amount equal to or greater than $250 from: AstraZeneca; Bayer; CStone; Eisai; Exelixis; Eli Lilly; Pfizer; Merck; Roche; Genentech; Jiangsu Hengrui
