Snot happens
If you do your recreational reading during your lunch break, you might want to skip this column and save it for later in the day because I’m going to be talking about snot ... gooey, slimy, green, yellow, and even clear, runny snot. Children and snot go together like ham and eggs.
Although snot is as much a part of the normal maturation process as cutting and losing primary teeth, it suffers from a serious image problem. When we refer to a “snotty-nosed kid,” we aren’t just describing a child with an unappealing visage – we are suggesting that he has a personality problem and an unpleasant demeanor. But, this characterization is both unfair and inaccurate because during the winter months, it seems that 90% of the children under the age of 3 years have runny, snotty noses. Most of them continue to be cute and have endearing personalities despite the river of mucus cascading down over their upper lips.
Adults may complain of having a “runny nose,” but they would never admit to having a “snotty nose.” Does something magically happen at puberty so that the human body no longer manufactures snot? No, it is all about appearances. Adults have learned strategies for keeping snot off their faces. They carry handkerchiefs in their pockets or wads of facial tissue stuffed up a sleeve.
But little children don’t care how they look. Like tears, snot emerges on their faces at body temperature. It doesn’t feel uncomfortable and when it drips from their upper lips and lands on their tongue, it doesn’t taste unpleasant. Little children don’t have important paperwork that might be spattered with dripping snot nor do they have computer keyboards or touch screens to be besmirched (or at least they shouldn’t have).
The fact is that adults, especially parents, don’t like the look of snot dripping from anyone’s nose. They don’t even like the sound of it gurgling around, a sound that might signal the appearance of a disgusting rivulet. I recently learned from an article in the Wall Street Journal (“Clear Baby’s Stuffy Nose,” by Laura Johannes, Feb 2, 2015) about two new products that promise parents a new tool to deal with this natural substance that they find so repulsive. Both gadgets incorporate a mouthpiece and tube with which the parent or caregiver sucks the snot out of the child’s nose into a reservoir. Although each system is fitted with a filter of sorts, I doubt you will find many nonparental caregivers logging on to watch the instructional videos.
A small study reported by one of the manufacturers claims that parents felt that their children were less congested and slept better when they used these “snotsuckers.” Of course, there was no control group. I suspect that neither apparatus would be any more effective than the old blue, green, or brown suction bulb that seems to magically emerge from the womb immediately after the baby and before the placenta. At least I’ve always assumed that’s where they came from because one always appeared on the warming/resuscitation table wrapped in the same towel as the baby.
These little one-piece wonders with no moving parts to malfunction or detach are all one needs to remove snot ... that is, if it needs to be removed. The problem with rubber bulbs is that if used too often they can cause irritation of the nares. However, if you can convince parents to use a bulb only when the child is experiencing some trouble breathing, this complication is usually avoided. The challenge is getting parents to ignore their natural revulsion to seeing or even hearing snot. Trying to make their child’s face a snot-free zone will make the child’s nose a bloody painful mess.
The same challenge will confront you if you suggest to parents that their child will be more comfortable if they wipe his runny nose as infrequently as possible. It seems too counterintuitive that the best option for the child is to let the snot dry on and soak it off with warm water at lunch, bedtime, and of course, just before Skyping with Grandma.
Dr. Wilkoff practiced primary care pediatrics in Brunswick, Maine, for nearly 40 years. He has authored several books on behavioral pediatrics, including “Coping with a Picky Eater.” E-mail him at [email protected].
Mother knew better
Like most pediatricians, I often think about parenting. It’s hard to ignore when your days and some nights are surrounded by it in a variety of forms ... the good, the bad, and the ugly. Recently, I was trying to recall my own parents’ (mostly my Mom’s) style of parenting and discovered that I was having trouble remembering many of the specifics of how my sister and I were raised.
The haziness of that recollection could simply reflect my aging memory, but I prefer to interpret it as a sign that our parents consciously avoided being heavy handed in their approach. Granted, my sister and I have grown up to be reasonably agreeable adults and were relatively unadventurous children. But, we were far from angelic.
My mother was a quiet person. In fact, I have trouble recalling much about the sound of her voice. She was not one to debate or argue. Although my parents had their disagreements, my mother would refuse to argue, which infuriated my father. Instead she would state her position once and wait for the issue to play out. The result usually vindicated her softly stated position.
But, my mother wasn’t perfect, and it is those few parenting missteps that I remember ... most of them fondly. For example, when she first heard me swear (I don’t even remember the word), she said that I was to have my mouth washed out with soap. I assume she had heard this advice from my grandmother. However, she wasn’t quite clear on the technique. The result was a lot of fumbling around with a large bar of soap and a very small mouth. It certainly wasn’t a deterrent, in large part because I sensed she was giggling during the ordeal.
I don’t recall being spanked, but my mother was not averse to physical deterrents. After years of reminding me to sit up straight at the dinner table, she took to nonchalantly – and without a word – poking me between the shoulder blades with a fork as she passed behind me while serving supper. No more idle threats, just a sharp reminder. Of course, it was no more effective than the soap, and I am still a sloucher.
My mother occasionally made attempts at anticipatory guidance with mixed results. One cold December morning as I was heading off to school, she took me aside and cautioned, “Now Willy, don’t ever stick your tongue on a cold pipe.” I’m not sure what prompted this warning because it was decades before this foolishness made it to the silver screen in Jean Shepherd’s “A Christmas Story.” Up to that point, putting my tongue on a cold pipe was an activity that had never crossed my little mind. But now she had planted the seed and for many winters I couldn’t pass a parking meter without the little devil on my shoulder whispering in my ear, “Go try it.”
The only parenting misstep for which I still hold a grudge is one in which my mother ignored her ample storehouse of good sense and floated with the mainstream of bad parental advice. Those of you born prior to 1965 probably share my painful memories of having to sit on the edge of the pond or lake until it had been exactly an hour since you had last eaten – eaten anything! A thoughtlessly nibbled cookie could result in a stomach cramp that could send you to the bottom, never to take another breath of air. It is unclear which medical genius came up with this idea, but I hope he is rotting in hell, doubled over with unremitting abdominal cramps. A rough calculation reveals that between the ages of 7 and 14 years, I wasted nearly 1,000 child-hours sitting on the edge of the town pool, impatiently waiting for peanut butter sandwiches to digest. I know my mother knew that the whole stomach cramp thing was bogus. But, she wasn’t strong enough to swim against the terrible tide of old wives’ tales.
Despite these trivial errors when I was a young child, my mother did her best parenting when I reached adolescence. Pleasantville, N.Y., was a small town of 5,000, and my mother seemed to know two-thirds of them by their first name. Or, at least I thought she did. Her web of contacts covered the town like a blanket. I was convinced that, like Santa Claus, she knew when I was sleeping, she knew when I was awake, she knew if I’d been bad or good. You get the picture. It was only many years later that I realized I had been buffaloed. Her omniscience had been a masterful act ... but it had worked.
Dr. Wilkoff practiced primary care pediatrics in Brunswick, Maine, for nearly 40 years. He has authored several books on behavioral pediatrics, including “Coping with a Picky Eater.” E-mail him at [email protected].
Solidarity II
In my last column, I wondered if the job satisfaction among American physicians had dipped so low that unionizing might have become a reasonable option. Much to my surprise, when I opened the op-ed section of the Jan. 14, 2015, New York Times, I discovered an article (“Want to Be Happy? Join a Union”) that added a bit of kindling to the spark I had hoped to ignite in my column.
Columnist John Guida interviewed two political scientists who had recently completed a study on labor union membership and life satisfaction in the United States (it appears to be unpublished at this point – but there is a link in the Times article to an October 2014 draft). Using data from a multiyear World Values Survey, these researchers discovered that union members are more satisfied than workers who were not in a union. This positive boost to life satisfaction was demonstrable across a broad selection of demographic groups: rich/poor, male/female, old/young, and disparate levels of education.
These political scientists found that being a union member generated a bigger boost of life satisfaction than that achieved by an increase in income. In an interview for the New York Times column, the authors postulated that the effect that they were observing could be occurring along four channels. One was a greater satisfaction in work experiences. The second was a feeling of greater job security. Do you think either of those benefits might sound appealing to some dissatisfied physicians? The other two were an increase in the number of opportunities for social intervention, and a positive feeling that can accompany participation in what they called democratic citizenship.
Although this study casts a warm glow over joining a union, unionization has an image problem here in the United States. Membership is down, and a study referred to in the Times article suggests that Americans have less confidence in unions than they do in banks.
As I suggested in my previous column, I sense that most physicians are not primarily troubled by their income. What troubles them most is frustrating work environments and the lack of control, the absence of what these authors called “democratic citizenship.” From a purely public relations standpoint, unionizing and going on strike for more money has the potential of creating a negative impression of the physicians who have organized. However, a work action with the aim of improving work conditions has a much more savory sound to it. And, as these political scientists have demonstrated, it is life satisfaction and not an increase in income that is the true benefit of unionization.
I have moved out of the workforce and am just sitting here on the sidelines watching with interest. But it seems to me that more of you who are still working should be looking outside the box for ways in which to improve your job (and life) satisfaction. If you are 50 years old and trying to calculate how many years it will be until you can retire, you have a problem. Unionization may be an answer. As the political scientists noted at the end of that column, their study “can give new meaning to the adage, ‘don’t mourn, organize.’ ”
Dr. Wilkoff practiced primary care pediatrics in Brunswick, Maine, for nearly 40 years. He has authored several books on behavioral pediatrics, including “Coping with a Picky Eater.” E-mail him at [email protected].
When those genes no longer fit
If you ask most folks why some of us are obese, they will answer that it’s because overweight people have eaten too much of the wrong foods and not exercised enough. If prompted, they might expand their response by saying that people who come from families with overweight members will usually have more trouble maintaining a healthy weight. It seems to me that just about covers our state-of-the-art understanding of obesity.
You can argue that there is always new information coming out from experiments on genetically altered mice. And, the recently appreciated relationship between sleep deprivation and being overweight sounds interesting. But it still boils down to the simple equation of too much energy in and too little burned.
In 2007, it was discovered that a variant of a gene known as FTO was closely linked to excess weight gain in humans. Individuals with one copy of the gene were on average 3.5 pounds heavier than were those without the gene. Those people with a double copy of the variant gene were 7 pounds heavier and 50% more likely to be obese than was the general population.
It looked like the FTO gene might be one of the answers beyond the simplicity of too much consumption and too little expenditure. But why would the gene suddenly become more prevalent over the last 5 or 6 decades during which obesity has become epidemic in America? It seemed unlikely that this shift could occur in such a short time frame.
A recently published study in Proceedings of the National Academy of Sciences (PNAS 2014 [doi: 10.1073/pnas.1411893111]) suggests another more plausible explanation. Using data from the venerable and ongoing Framingham Heart Study, the researchers found that the FTO variant became a risk factor only after World War II. In other words, people with the FTO variant born prior to 1942 weren’t any more likely to be overweight than the rest of the population.
What has changed since the 1940s? Our diet has shifted toward more processed and fried foods. And, our lives and our jobs have become more sedentary. Television crept into our living rooms in the 1950s and into our bedrooms in the 1970s.
The FTO gene variant may have been advantageous to humans in lean times when the heavier of us were more likely to survive long periods of starvation. But now here in the land of fries and soft drinks, the gene has become hazardous to our health. We now look (and are) fat wearing the same genes that seemed to fit us so well a century ago.
We must have sympathy for those of us who have a gene that makes us more vulnerable when food is plentiful and technology has made it easier to survive with very little energy expenditure. It is tempting to hope that someday scientists will find a way to alter the offending genes to help those cursed to carry them. But this kind of manipulation must be considered cautiously because a natural or man-made catastrophe on a global scale could once again make this gene advantageous.
We must face the fact that it is the environment in which we live – an environment that we have altered and can continue to alter – that is the primary driver of the obesity epidemic. One wonders whether we are experiencing other epidemics analogous to the FTO/obesity story.
Attention-deficit/hyperactivity disorder (ADHD) comes to mind. Some observers feel that a short attention span and impulsivity may have been advantageous when we were hunter-gatherers. The disadvantages of those traits were just a nuisance when a formal education was merely optional for success. However, we have now trapped those who carry these traits in a one-size-fits-all educational system and sleep-deprived them with a combination of electric lights and electronic distractions, to name just a few of the environmental changes that we have imposed.
Maybe it’s not the genes, but the environment that is the issue. The problem is that we haven’t found the genetic variant(s) that might allow us to answer these kinds of questions about ADHD.
Dr. Wilkoff practiced primary care pediatrics in Brunswick, Maine, for nearly 40 years. He has authored several books on behavioral pediatrics, including “Coping with a Picky Eater.” E-mail him at [email protected].
Listening
It is becoming increasingly obvious that we physicians are doing a pretty shabby job of listening to our patients. In a recent op-ed piece in the New York Times (“Doc, Shut Up and Listen,” by Nirmal Joshi, Jan. 4, 2015), I read that a study found that on average doctors waited only 18 seconds before interrupting the patient. It is not unusual for me to hear complaints from friends about physicians they have visited who didn’t seem to be interested in what they had to say. In fact, it has happened to me.
The problem of physicians not listening isn’t just about patient dissatisfaction. The failure to hear what the patient said, or could have said if given the chance, can result in delayed or missed diagnoses and the ordering of costly and unnecessary diagnostic studies.
So, if physicians aren’t listening, what are we doing during encounters with our patients? Many of us, and soon most of us, have our noses in computer screens looking through bloated and poorly organized electronic medical records or mouse clicking through templates to create the illusion of meaningful use. But, for the moment let’s stop beating that tired and dysfunctional horse of EHRs and look deeper into what else could be interfering with listening.
The knee-jerk response that is most often offered is that we just don’t have enough time to listen. How often is that really the case? I wonder, if we physicians had 40 minutes for an office visit instead of 20, how many of us would do a significantly better job of functional listening? I have always suspected that the notion that longer visits are automatically more effective at getting to the heart of the patient’s problem and moving toward a solution is a myth.
Listening is a skill. If you hand me a Rubik’s Cube and ask me to solve it, you could give me 15 minutes or an hour and it won’t make any difference because I have no experience with Rubik’s Cubes. Learning how to ask questions that have a high likelihood of getting at what is really troubling the patient and then listening to their responses is a skill. A few master physicians are born with that ability and some doctors will never get it. However, it is a skill that most of us could be taught if medical schools and house officer training programs knew how to teach it.
In the Times op-ed piece, Nirmal Joshi, the chief medical officer of Pinnacle Health Systems, Harrisburg, Pa., describes a physician training program in Harrisburg in which the doctors participated in mock patient interviews and the patient-actors provided feedback. The physicians also were provided with physician-coaches in real-life clinical encounters. The result was a 40% increase in patient satisfaction. Other studies have shown that increased satisfaction correlates with improved outcomes.
You could argue that incorporating these listening skills is going to gobble up more time. It probably will, more so on the steep slope of the learning curve. There will always be patients who ramble on and are hard to redirect even by the most skillful history taker. However, with practice I think physicians will find that listening with care will often not take as much time as they fear. It will certainly make the encounters more satisfying.
But, let’s look again at how we are spending our time. How often are office visits driven by the physician’s agenda and not by the patient’s? How much time do we spend lecturing and badgering patients in an attempt to get them to follow advice that we think is important but that they obviously haven’t followed? That wasted time could have been better invested in listening for the answer to why they haven’t complied in the past.
Finally, there is the issue of caring. Unfortunately, this may mean a significant shift in attitude for some of us. If we genuinely care about what the patient thinks is important, finding the time to listen won’t be that difficult.
Dr. Wilkoff practiced primary care pediatrics in Brunswick, Maine, for nearly 40 years. He has authored several books on behavioral pediatrics, including “Coping with a Picky Eater.” E-mail him at [email protected].
It is becoming increasingly obvious that we physicians are doing a pretty shabby job of listening to our patients. In a recent op-ed piece in the New York Times I read that a recent study (Doc, Shut Up and Listen by Nirmal Joshi, Jan. 4, 2015) found that on average doctors waited only 18 seconds before interrupting the patient. It is not unusual for me to hear complaints from friends about physicians they have visited who didn’t seem to be interested in what they had to say. In fact, it has happened to me.
The problem of physicians not listening isn’t just about patient dissatisfaction. The failure to hear what the patient said, or could have said if given the chance, can result in delayed or missed diagnoses and the ordering of costly and unnecessary diagnostic studies.
So, if physicians aren’t listening what are we doing during encounters with our patients? Many of us, and soon most of us, have our noses in computer screens looking through bloated and poorly organized electronic medical records or mouse clicking through templates to create the illusion of meaningful use. But, for the moment let’s stop beating that tired and dysfunctional horse of EHR’s and look deeper into what else could be interfering with listening.
The knee-jerk response that is most often offered is that we just don’t have enough time to listen. How often is that really the case? I wonder if we physicians had 40 minutes for an office visit instead of 20 minutes, how many of us would do a significantly better job of functional listening? I have always suspected that the notion that longer visits are automatically more effective at getting to the heart of the patient’s problem and moving toward a solution is a myth.
Listening is a skill. If you hand me a Rubik’s Cube and ask me to solve it, you could give me 15 minutes or give me an hour it won’t make any difference because I have no experience with Rubik’s Cubes. Learning how to ask questions that have a high likelihood of getting at what is really troubling the patient and then listening to their responses is a skill. A few master physicians are born with that ability and some doctors will never get it. However, it is a skill that most of us can be taught if medical schools and house officer training program knew how to teach it.
In the Times op-ed piece, Nirmal Joshi, the chief medical officer of Pinnacle Health Systems, Harrisburg, Penn., describes a physician training program in Harrisburg, in which the doctors participated in mock patient interviews in which the patient-actors provided feedback. The physicians also were provided with physician-coaches in real life clinical encounters. The result was a 40% increase in patient satisfaction. Other studies have shown that increased satisfaction correlates with improved outcomes.
You could argue that incorporating these listening skills are going gobble up more time. It probably would, more so on the steep slope of the learning curve. There will always be patients who ramble on and are hard to redirect even by the most skillful history taker. However, with practice I think physicians will find that listening with care will often not take as much time than they fear. It will certainly make the encounters more satisfying.
But, let’s look at that issue of how we are spending our time again. How often are office visits driven by the physician’s agenda and not by the patient’s? How much time do we spend lecturing and badgering patients in an attempt to follow advice that we think is important but they obviously haven’t? That wasted time could have been better invested in listening for the answer of why they haven’t complied in the past.
Finally, is the issue of caring. Unfortunately, this may mean a significant shift in attitude for some of us. If we genuinely care what the patient thinks is important, finding the time to listen won’t be that difficult.
Dr. Wilkoff practiced primary care pediatrics in Brunswick, Maine, for nearly 40 years. He has authored several books on behavioral pediatrics, including “Coping with a Picky Eater.” E-mail him at [email protected].
Solidarity
Does it seem a little dark around here to you? Could it be the cloud of discontent and disillusionment that is hovering over many of America’s physicians? There is a lot for doctors to fret about ... the uncertainty associated with the Affordable Care Act, time-gobbling and attention-diverting electronic medical records, and the ever-present threat of a malpractice suit – to name just a few.
Among the complaints that I hear most often is “Medicine is becoming a business.” Well, folks, let’s rethink this. Practicing medicine has always been a business. Of course, medicine is a bit of an odd duck – 30% science and 70% art. And while we may like to believe that our goal of alleviating suffering is nobler than those of other professions, medicine is still a business. Very few of us have the luxury of practicing without hope of financial return.
However, what has changed over the last quarter-century is that many of us have sold the business. For a variety of reasons, many of them falling under the umbrella of “quality of life issues,” we have changed roles from owner to employee. Not surprisingly, most of us are chafing in the traces of that new role. Individuals who aspire to be physicians are generally not the kind of people who will happily give up control of anything. But becoming an employee means giving up control of a big chunk of one’s professional life. As health care delivery entities continue to grow in size, encouraged by the Affordable Care Act, that increase in size will shrink what little power the employee has even further.
A few physicians are trying to buck the trend by remaining owner/operators of either “slow medicine” or “boutique” practices. However, the massive burden of medical school debt will continue to crush the entrepreneurial spirit of even the most idealistic young graduates, and I don’t foresee a time when the majority of physicians will again own their practices.
Even if there is a revolutionary change in how we fund medical education, it’s time for physicians to accept the fact that they are employees. But instead of quietly grumbling about the situation, maybe it’s time for physicians to join together and become activist employees.
I can hear you gasp, “Is he talking about forming unions and going out on strike?” Well, kind of. I know that sounds so ugly and is beneath you as a professional, something the French might do, but not us here in “the Land of the Free.”
Organizing and taking action is not totally foreign to American physicians. You may feel you were underpaid as a house officer. But your compensation would have been far less robust had it not been for a group of 450 residents at the Boston City Hospital who in 1967 organized a work action that resulted in a raise in the base pay for interns from $3,600 to $6,600. Instead of a strike, the house officers initiated a “heal-in” in which they were more liberal in admitting patients and raised the intensity of the care for inpatients. The resulting congestion in the hospital forced the administrators to yield to their demands for a reasonable salary.
You may not be sufficiently dissatisfied to feel like joining other physicians in a work action, but I sense there are some pockets of physician unrest in this country such that forming a union may begin appearing on their list of options.
While you may tend to see strikes as being mostly about the money, employees are often more concerned about their working conditions. If the company you work for has just “upgraded” your computer system so that it now takes you an extra hour each day to see just 20 patients, you might legitimately complain that your working conditions have become so intolerable that you are ready to join up and take action.
Remember, it doesn’t have to be a strike. It could be a “slowdown” or a “speedup” designed to create enough chaos for your employer to get its attention. Could it negatively affect some patients? The honest answer is yes. I doubt that there has ever been a successful work action that hasn’t resulted in some collateral damage.
But is it worth the risks? That’s for you to decide. I’m simply observing that the shift in the landscape has given physicians who want more of a say in their work environments few options. Maybe it’s time for you to think beyond the familiar boundaries of the profession and add a little bite to your growl.
Dr. Wilkoff practiced primary care pediatrics in Brunswick, Maine, for nearly 40 years. He has authored several books on behavioral pediatrics, including “How to Say No to Your Toddler.” E-mail him at [email protected].
Square pegs and round holes
How many times have you been asked by the parent of a child with attention-deficit/hyperactivity disorder when he will outgrow it? Or even, if he will ever outgrow it? My answer has always been, “I suspect that your son will always have whatever the brain structure or chemistry is contributing to the behaviors you are seeing now. But, we can hope that as an adult he will have found a job and an environment that better suits his talents and vulnerabilities.”
It turns out that, like many of my responses to parents, my answer was only half right. In a long essay in the New York Times (“A Natural Fix for ADHD,” Nov. 2, 2014), Dr. Richard A. Friedman, professor of clinical psychiatry at Cornell University, New York, writes that there is some evidence that some adults who were diagnosed with ADHD as children do outgrow the condition, and that their MRIs no longer demonstrate the asynchrony that was present when they had symptoms. However, the adults whose ADHD symptoms and behaviors have persisted continue to have abnormal scans.
This sounds like a typical chicken-and-egg situation. Did the brains of the lucky children who stumbled onto a path that better suited their strengths and vulnerabilities “normalize” in response to the more compatible environment? Or, did some maturational process occur in their neural connections that now allows them to thrive in an environment that they would have found so challenging as children?
Dr. Friedman doesn’t offer us an answer, but his conclusion echoes the advice that I had been peddling. He recommends that “we should be doing everything we can to help young people with ADHD select situations – whether school now or professions later – that are a better fit for their novelty-seeking behavior.” Behaviors that may have helped us survive as we wandered the environment as nomads now get those of us prone to distraction into trouble within the confines of our modern “civilized” societies.
Education should not just involve teaching students about the world they inhabit. It also must strive to help them learn more about themselves, both their strengths and their weaknesses. With this information, the well-educated student will be more likely to find a path on which he feels successful.
Unfortunately, our one-size-doesn’t-fit-all educational system is failing when it comes to helping students find careers in which they can thrive and be rewarded. Although industries across the country are crying out for skilled workers, the students who choose the “vocational” path continue to face the stigma of not having a 4-year college education. Unreasonable concerns about workplace safety and memories of the horrors of child labor make it difficult for young people to experience the variety of work environments and role models that could open the door to a career in which they would thrive.
There is always the risk of channeling young people into an educational path based on their apparent aptitudes. However, as it stands today, we are guilty of not offering students a chance to experience a broad variety of options. At present, we are trying to fit square pegs into round holes. Although education has focused on rounding off some of those sharp edges, it also must help students find niches into which they can more comfortably fit.
Dr. Wilkoff practiced primary care pediatrics in Brunswick, Maine, for nearly 40 years. He has authored several books on behavioral pediatrics, including “How to Say No to Your Toddler.” E-mail him at [email protected].
Your son and football?
Imagine that you have finished dinner and have just sat down to watch the last half of the nightly news. Your 9-year-old son, whom you have watched play soccer since he was 5 years old, hands you a crumpled sheet of paper extracted from his backpack and asks, “Dad, can you sign this permission slip so I can play football?” Will you respond, “Sure, when is the first practice?”
Or will this be the jumping-off point for a dissertation on why you think football is a bad idea? Will you tell him that you are concerned that he will sustain a concussion, or two or three? Will you ask him why he would want to play a sport whose top-level players are steroid-pumped, inarticulate wife beaters? Or, will you tell him that the football culture tolerates the evils of hazing and fosters aggressive behavior?
Before we go any further, I must offer the disclaimer that I played high school football wearing a leather helmet. And that I played college football for 2 years until the handwriting on the locker room wall said, “Your skill level makes it very unlikely that you will ever get off the bench; maybe you should focus on lacrosse.” Which I did.
Although I had a few “stingers,” I never sustained any serious injuries other than a torn hamstring that still plagues me. My two concussions were unrelated to contact sports. As a team doctor for the local high school, I’m sure I sent several concussed players back onto the field. But in retrospect, I and most other physicians back then were working with a definition of concussion that was far too narrow. The most serious injuries I encountered as a game physician occurred during soccer matches.
I read the same headlines you do about what appear to be late effects in professional athletes of repeated blows to the head. I am repulsed by the off-field behavior of both collegiate and professional football players, and I continue to search unsuccessfully for admirable role models in the ranks of high-profile athletes.
Despite all the unseemly publicity, television revenues from professional football continue to surge unabated. However, I hear an undercurrent of discomfort with football from parents and some pediatricians: “Why would I allow my child to play a dangerous sport with despicable role models?” That’s a good question, and it is the same one I asked you in the first line of this letter. I wouldn’t be surprised if, sometime in the not-too-distant future, the level of discomfort reaches a point at which groups such as the American Academy of Pediatrics strongly discourage parents from allowing their children to play football.
I hope that this point is never reached because, from my personal and professional experience, football can offer enough positives to make its risks acceptable – risks that are on a par with those of most activities that involve getting off the couch and physically interacting with peers and the environment. Football helped me to learn initiative (some might confuse this with aggression). It allowed me to enjoy the benefits of succeeding and failing as a member of a team. It exposed me to the value of careful preparation and meticulous attention to detail. One could argue that I could have acquired those insights and skills by participating in other activities, athletic or not. But for me it happened to be football. Were there downsides? Yes. Because football was the only fall sport at my high school, it had the feel of an exclusive fraternity, a feeling that I have grown to dislike.
Would I sign my son’s permission slip to play football? Yes. Would I worry about him getting hurt? No more than I would when he played soccer and hockey. Because despite his dreams, we live in a town that isn’t football obsessed, and he isn’t going to have a 10-year career in professional sports. The risks of cumulative traumatic brain injury are too small to consider.
The bigger risk is that he might encounter a coach with a win-at-any-cost attitude and the moral character of a doorknob. But that can happen in any sport. Together he and I will continue to search for good role models in other avenues of life.
Dr. Wilkoff practiced primary care pediatrics in Brunswick, Maine, for nearly 40 years. He has authored several books on behavioral pediatrics including “How to Say No to Your Toddler.” To comment, e-mail him at [email protected].
Sunrise calls
When your pager vibrates at 7 o’clock in the morning, it is unlikely to be alerting you to good news. It may be a parent who assumes that because his child’s medical home has evening office hours, there will be a receptionist sitting there at sunup to help him make an appointment for a nonurgent complaint. Or, it may be a call from a parent who knows that she doesn’t have a medical emergency on her hands, but who would like your advice about whether she should take a day off from work or send her child to day care.
But, an uncomfortable number of daybreak calls come from parents with what eventually turns out to be a desperately ill child. I have witnessed those scenarios often enough that even though I am retired, I break into a cold sweat when the home phone rings anytime between 6 and 7 in the morning.
A study from Wake Forest University, Winston-Salem, N.C., published in the November 2014 issue of Pediatrics, supports my long-standing discomfort with patients who present in the early morning (McCrory et al., “Off-Hours Admission to Pediatric Intensive Care and Mortality,” Pediatrics 2014;134:e1345-e1353 [doi: 10.1542/peds.2014-1071]). In a retrospective study of nearly a quarter of a million admissions to 99 pediatric ICUs over a 3-year period, the investigators discovered that admission during off-hours and on weekends “does not independently increase the odds of mortality.”
However, they found that admission from 6 to 11 in the morning is “associated with an increased risk of death.” This may not have been the result the investigators were expecting, and their analyses don’t suggest a cause. In the discussion portion of the paper, they offer some explanations that are in sync with my observations. First, ICUs are generally fully staffed 24-7-365. While the lights may be dimmed slightly, there is seldom a diurnal variation in the attentiveness and quality of the caregivers in an ICU. Contrast this with an ordinary medical/surgical floor on which staffing levels drop precipitously when the sun goes down. The skeleton staff is usually working in the dark, resorting to flashlights and ankle-level lighting to make their observations. And ... things are missed. Things like skin-color changes and the quality of respirations that become obvious when the morning shift arrives and the lights go on. “Holy s**t! This patient needs to be in the PICU!” And, the PICU now receives a patient at 7:30 a.m. who has greater odds of mortality because the illness has percolated in the dark overnight.
The same phenomenon occurs in the outpatient setting. At night, sleep deprivation may cloud a parent’s observational skills. The lights in the bedroom may have been left off in hopes of keeping the child more comfortable. The parent may have called the doctor and been shunted to a triage nurse who is unfamiliar with the family and whose algorithm fails at a critical branch point. Or, the call may have been fielded by an answering service that is more interested in protecting its client’s sleep than serving the needs of the caller.
Or the parent may have spoken to the doctor early in the evening, but was hesitant to call again and wake her when the child’s condition changed. House officers can fall into the same trap when their misplaced concern about the sleep needs of the physician to whom they report prevents them from making a critical call for help.
Again, the result is that when the sun comes up, a child whose illness might have been more easily managed in the PICU at 2:30 a.m. doesn’t arrive in the unit until those deadly hours between 6 a.m. and 11 a.m. While there may be diurnal variations in the inherent mortality of some pathological conditions, this study from North Carolina suggests that when the lights go out, critical observations go unmade and so do wise decisions.
Dr. Wilkoff practiced primary care pediatrics in Brunswick, Maine, for nearly 40 years. He has authored several books on behavioral pediatrics, including “How to Say No to Your Toddler.” E-mail him at [email protected].
Paleo-Parenting
Two years ago, I saw a young man in my office who had decided not to return to college after his freshman year. He had been a very good high school hockey player, and I asked him what he was doing now to stay fit. He replied that he had joined a fitness facility, part of a national franchise system, and “I’m going on the Paleo Diet.” As I was among the clueless at that time, I quizzed him about his diet.
He told me that it was an attempt to duplicate the diet of our ancestors prior to the development of agriculture (thought to be about 10,000 years ago). This meant no processed food, no dairy products, no grains or legumes, no refined sugars. Lean meat, nuts, fruits, and low-starch vegetables were okay.
When I saw him for a follow-up visit 2 months later, I asked how he was doing with what I called his “caveman diet.” He said, “I lasted about 6 weeks, but I’m still working out four times a week.” It turns out that while my patient had drifted away from his paleolithic diet, enough other people have climbed on the bandwagon that there are now a couple of magazines devoted to what has broadened beyond diet into what could be called a paleo lifestyle.
Devotees of the live-like-our-ancestors movement hope to avoid the “diseases of civilization” by exercising frequently, particularly doing things that mimic our ancestors’ activities, such as running, jumping, climbing, and throwing. A committed paleo person should wear a minimum of clothing and try to go barefoot as often as possible. He should have frequent contact with nature and get plenty of sun exposure as his source of vitamin D. His sleep patterns should be in sync with the sun cycle, and he should avoid stress by simplifying and downsizing his life.
This sounds like the lifestyle most toddlers strive for every day. They prefer to run around nude and shoeless, climb just for fun, and throw anything within reach. It got me wondering what paleo parenting might look like. Certainly, it would begin with breastfeeding. But, for how long? I don’t think we know the answer to that. It may not have been as long as some breastfeeding advocates believe.
I suspect young children are smart enough to find shade in the middle of the day to take a nap if we allow them. My obsession with the hazards of sleep deprivation makes the paleo movement’s attempt to link sleep to the sun cycle particularly appealing. It would benefit parents as well as children for them all to go to bed when the sun goes down. Paleo parenting would mean no TV. What a concept!
Of course, there are several flies in this ancestral ointment. First, I suspect that our prehistoric ancestors seldom lived into their fourth decade. How many of the “diseases of civilization” are simply the effect of aging on bodies that were not genetically engineered for longevity? How much do we really know about the diet and lifestyle of our paleo ancestors? Carbon isotope studies and microscopic analysis of ancient stool samples are pretty scanty evidence.
And why choose the era before the development of agriculture as the target to emulate? Some grain and a few root vegetables aren’t going to send our children on the road to obesity if they are active and getting adequate amounts of sleep.
There can be many advantages to adopting an “ancestral lifestyle,” but we don’t have to peel the onion all the way back to prehistory to reap the benefits. Heck, I bet if we rolled back to pretelevision, we would be a much healthier society.
Dr. Wilkoff practiced primary care pediatrics in Brunswick, Maine, for nearly 40 years. He has authored several books on behavioral pediatrics, including “How to Say No to Your Toddler.” E-mail him at [email protected].