Monday, April 27, 2009

My Evening with the Future

Steven Lewis

Throw away the crystal ball and spend an hour with Google Health: the future will be right in front of you. Providers who prefer the pedestal to parity and see themselves as traffic cops on the health information highway are in for the shock of their lives.

I intended to compare the on-line personal health records of both Google and Microsoft, but I couldn’t convince Microsoft that I was Phyllis Diller from Scottsdale AZ (you have to be a US resident to sign up). Google let me in, and in a few minutes I had my own record. Well, not quite my own. (Health record privacy purists, skip the next paragraph.)

Touch wood, I’m a healthy guy. I don’t have any chronic conditions (unless you count seasonal hay fever, a cat allergy and a severe reaction to sopranos), have never had surgery, take no prescription drugs, and, having read too much quality and outcomes research, am a bit of a fatalist. There will be no colorectal cancer screening bazooka shoved up my behind unless I get to watch a video of my doctor smiling happily through the procedure. With so little data to enter, I made stuff up – gave myself type 2 diabetes, angina, arthritis, and added 25 pounds – to test the ingenuity of the architecture.

Hello Frank Gehry. The software leads you through a consumer’s garden of neurotic delights – and I mean that in a good sense. You can input your own remarks and notes, set permissions for access, amend or delete entries. You can second-guess the diagnosis or advice you got from your doctor in the seven minutes he spent with you. You can get drug price and therapeutic equivalence comparisons – your own reference-based pricing program is at your fingertips. You can catch contraindicated drug combinations. You can learn your odds of falling prey to various health breakdowns by linking your profile to web-based risk calculators. You are handed the key to a cornucopia of safe information injection sites tailored to your profile.
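To make that last pair of capabilities concrete, here is a minimal sketch, in Python, of the kind of contraindication check such a tool might run against a user’s medication list. It is purely illustrative: Google Health’s actual logic is not public, and the function name and the tiny interaction table below are my own stand-ins, not clinical guidance.

```python
# Illustrative only: a toy interaction table, not a clinical reference.
CONTRAINDICATED_PAIRS = {
    frozenset({"warfarin", "aspirin"}),          # bleeding risk (example)
    frozenset({"sildenafil", "nitroglycerin"}),  # hypotension risk (example)
}

def flag_contraindications(medications):
    """Return each flagged pair found in a patient's medication list."""
    meds = {m.strip().lower() for m in medications}
    return [tuple(sorted(pair)) for pair in CONTRAINDICATED_PAIRS if pair <= meds]

profile = ["Aspirin", "Metformin", "Warfarin"]  # a made-up medication list
for drug_a, drug_b in flag_contraindications(profile):
    print(f"Heads up: {drug_a} + {drug_b} is a flagged combination - ask your pharmacist.")
```

A real personal health record would of course draw on a vetted drug-interaction database rather than a hard-coded list, but the shape of the check is the same.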

But, dammit, this is America, and in America nothing good happens unless money changes hands. Some of the featured partner sites offer free advice, but they’ll also sell you drugs or other products. Others are more brazenly mercenary: at the Cleveland Clinic site it’s cash for counsel. Want a second medical opinion from MyConsult? That’ll be $565, and it’s not insured. The web site supplies a sample report so you can see what you’re likely to get: mainly a summary of what the patient sent in, and a three-paragraph opinion. I figure a good senior resident could knock one off in half an hour, an hour tops. Bargain hunters can score a nutritional consult about their gout (honest) for a mere $95, or two sessions on high blood pressure for $165.

Perhaps you’re more inclined to Blueprint for Wellness, which offers a package of 29 lab tests, a wellness questionnaire, and a personal wellness report for $134. TrialX.org will match your condition to a database of 25,000 clinical trials and help you find clinical investigators expert in your condition – all free. It’s essentially a dating service for researchers and a potentially vast pool of trial recruits.

To those of us of a certain age for whom a long distance call was a special and costly luxury, this is all a bit disorienting, even creepy. Google has an incentive to choose its partners carefully – it doesn’t want to jeopardize its carefully cultivated we’re-not-like-the-other-megacorps image. No filter is foolproof, and no interaction is risk-free. Nonetheless, the methodologically innocent are far better off with Google as the quality control overseer than heading off solo into a cyberspace full of charlatans.

Still, there are reasons to pause and reflect. Will the model undermine the doctor-patient relationship? Will it fuel an even greater obsession with tests and assessments, luring millions more to the already-vast army of the worried well? What happens if providers act on inaccurate or misleading patient-created information? What jurisprudence will arise from the inevitable litigation when something goes wrong?

Yeah, whatever. Quibble all you want; it’s here, it’s growing, there will be no pausing and precious little reflecting. It’s a predictable workaround, an evening up of the odds against Fortress Healthcare, which keeps patients in the dark and puts a firewall between them and their health records. Some enlightened health care organizations, such as Group Health Cooperative in Seattle, put patients at the centre of their e-health strategies. How strangely foreign to the Canadian Way.

Canada’s e-health leaders can either ignore it or embrace it. Ignoring it risks dooming their provider-oriented, paternalistic plans to rapid obsolescence. They should take what’s good about it – joint production and ownership of health information, the potential to create networks of trustworthy information sites, the linkage to self-management tools tailored to individual profiles, the fantastic communications capacity – and revamp their e-health plans, fast.

The democratization of knowledge and the desire to be treated like an adult in health care transactions are irresistible forces. It’s a near-miracle that Canadians for the most part continue to put up with the arrogance, inconvenience, secrecy, and error built into the existing system. We could have seen this coming a decade ago, but we are too often a nation of deer in a world of headlights. Google and Microsoft have set off the alarm. Pushing the sleep button won’t cut it much longer.

Monday, April 20, 2009

Pay for Performance: The Wrong Time, the Wrong Place?

Steven Lewis

It sounds like such a good idea: don’t pay people to show up and scurry about, pay them for proven performance. It’s the new Big Thing in health care financing. As usual, the Brits have pursued it most vigorously. Some Canadian health care executives get bonuses for achieving certain targets. The US Medicare plan has quit reimbursing hospitals for the costs of dealing with avoidable mishaps such as falls and bed sores. Health care cheques should come with performance strings attached. About time, right?

Well, yes, if you overlook the P4P track record. Renowned British health economist Alan Maynard found lots to be cautious about in his review of experiences to date. The Hay Group believes that even 5% to 10% of income at risk is insufficient to produce a significant effect, let alone the 1% to 2% typically on the table in such arrangements. In the UK, the vaunted GP bonus schemes – which can add tens of thousands of pounds to physician incomes – have turned into base pay. The average GP practice scores 95% of the bonus-triggering points available and virtually all get 90% or more. But the number of complaints per practice – one reasonable measure of satisfaction – varies considerably.

On examination, the very essence of P4P is troubling. It is a profoundly pessimistic concept of what makes people tick in health care: we can’t rely on organizational culture, professionalism, devotion to public service, or commitment to excellence to get the desired results, so let’s just concede that it’s all about the money. Managers and practitioners are hardened cynics for whom pecunia vincit omnia – cash conquers all. So let’s tell them what to accomplish, ring the economic bell and watch the Pavlovian throng stampede to improvement via the cash-stuffed trough.

Dishearteningly, P4P writ large becomes a self-fulfilling prophecy. Adopt its assumptions and fund or pay accordingly and you will indeed turn civilized people into econocentric shadows of themselves. Set up the game and people will learn the rules and play accordingly. Moreover, the game will inevitably lack sophistication, because to dole out the rewards, the goals must be clear and simple; the results easily measurable and immediate; and the reach modest (no one will play if it’s too hard to win). All nuance and complexity are obliterated by the basic algebra of the payout. So it’s hardly any wonder that British GPs are walking away with the dough. Ask not for whom the bell tolls – it tolls for fee.

But what if we’re just learning, and eventually get it right, particularly if we learn from our masters in the private sector? You’re doubtless as inspired as I am by the corporate CEOs with incomes almost entirely driven by the value of their (occasionally back-dated) stock options and the quarterly earnings statements. They sure knew how to tally up the performance points. You get what you pay for, and the denizens of Wall Street decided to pay for scams so absurd that they make the Nigerian please-be-my-agent-for-millions howler look like Protestant-ethic capitalism at its sober best. IKEA CEO Anders Dahlvig refuses to take his company public precisely to avoid the tyranny of get-rich-quickism that makes a virtue of impatience and myopia and rewards Ponzi schemes over substance. But he never claimed to be as smart as the guys who ran Lehman Brothers.

For the hundredth time in a seemingly infinite series, the world is learning two key lessons: you don’t get something for nothing, and appealing to baser instincts will improve neither humans nor their achievements. Health care is a uniquely fraught enterprise that deals with uncertainty, vulnerability, tragedy, hope, and trust. Of course it involves great amounts of money, to which neither individuals nor organizations can be indifferent. Health care takes place in a messy world, not a monastery. But money is a resource for achieving other ends, and if it defines us or crowds out nobler preoccupations, the means become the end, the aperture narrows, and the golden calf beckons.

Doesn’t it seem odd that we would have to coin the notion of “pay for performance” in the first place? What the hell else are we paying for? When did “doing one’s job” uncouple from “doing one’s job well”? Suggesting that ordinary performance – not spectacular, but merely satisfactory, like being nice to your patients or doing Pap tests at the recommended interval – deserves a bonus debases the entire enterprise. It creates a cultural norm in which lousy performance is the natural state and the passable is redefined as extraordinary. It dumbs performance down and leaves out the hard parts.

Show me a P4P system that rewards first class care of the frail elderly, life-enhancing management of multiple chronic conditions, reduced need for surgery fifteen years from now, or ending one’s career with a sunny disposition and compassion intact, and I’m all ears. But in my preferred world, the first dollar and the last pay for excellence across the board, an ethos of care, devotion to the public good, and the perpetual search for knowledge. Pay individuals well and fund organizations fairly. Settle the money issues swiftly so all can focus on what the money is supposed to achieve. Do this well and we’ll have pay for performance – not as cause-and-effect, but as a harmonious feature of a thriving culture.

Providers who practice to chase income targets and dangled bonuses are different from providers who want a reasonable income to pursue their callings out of love for what they do and a drive to serve people better. For those who crave the buzz of the financial transaction, there is a vast world beyond health care to explore. Health care that takes its cues from the rantings of the Chicago School and the MBA culture imperils its values and its practitioners. If those twin intellectual frauds can take down an economy, they can easily corrupt health care. Health care culture needs more than a behaviourist tweak and tuck. The worst imaginable outcome would be that P4P as currently conceived actually worked as intended, for that would prove just how far we have fallen.

Bottom-Up vs. Top-Down Innovation – and Hot Air.

Neil Seeman

“Hot Air” by Marjorie Priceman is a delightful children’s book about the first hot-air balloon – invented by brothers Joseph-Michel and Jacques-Étienne Montgolfier. In September 1783, their Aerostat Réveillon reached 1,500 feet, its passengers being a sheep, a duck and a rooster. The Montgolfiers thought that smoke propelled the balloon. Years later it was discovered that hot air rises because it weighs less than cool air.

More than two centuries later, the debate over the source of innovation rages on: Does change come from daring researchers (the Montgolfiers)? From the throngs of people who cheer on the invention and send forth news of the sensation (in this case, to Paris)? Or from the monarch (it was King Louis XVI in 1783) who champions the invention?

Health policy wonks – especially those interested in the transformative power of health information technology (HIT) – are fixated on whether innovation comes from the “bottom up” (the patient/consumer) or the “top down” (government or the hospital).

Author Vijay Vaitheeswaran describes the bottom-up/top-down debate in the Economist’s current special report on health care and technology. Denmark is perhaps the most-cited showpiece for top-down HIT innovation: almost all Danes have regular access to an electronic health tool to manage their appointments, track their medications, and guard against taking the wrong drug, too high a dose, or drugs that should not be used simultaneously.

So here’s one top-down formula for the successful adoption of HIT: Mandate common security standards, data-sharing protocols, and consistent interpretations of privacy law.

On the other side of the debate, many “health 2.0” or “peer-to-peer” enthusiasts tend to believe in bottom-up innovation: Give patients the tools (e.g., their complete online medical records), and the doctors and hospital CEOs and government leaders will step into line. I used to believe this. I now understand things are a bit more complex.

Too much focus on the bottom-up/top-down debate misses the real goal: making sick patients healthier faster, or managing and preventing illness altogether. This may happen bottom-up, top-down, or, more often than not in my opinion, by combination or accident. In many cases (as with the hot air balloon) we don’t really know why for years to come.

The management and health policy communities tend to ignore the reality of the happy accident. We are trapped in a cognitive bias: we think that if quality or outcomes improve within any organization, this must be by dint of “process improvement” or because of a charismatic leader who “just got things done.” From the hot air balloon to Twitter to Viagra, history abounds with accident as the seat of innovation. Perhaps the most we can do is make the ground fertile for more accidents to happen: hire lots of smart, diverse people who are willing, every so often, to bonk senior management on the head for permission to experiment.

I don’t pretend to know the answer to why innovation and adoption of HIT takes off more quickly in some jurisdictions than in others. I am skeptical of those who think they know the answer to this very difficult question, given the deep socio-cultural differences among neighborhoods in the very same city, much less among countries or continents.

But I do say this: all the energy that the academy, consulting firms, large companies and governments spend on debating this question could be channeled into something more productive – curiosity-seeking, idea-generation and free-form debate among patients, providers and others working in the system. Consider setting aside 15 percent of your organization’s time for tinkering with how to improve healthcare for everyone.

Sometimes a competitor will steal your idea. (Some claim that the hot air balloon was invented some 74 years earlier by the Portuguese priest Bartolomeu de Gusmão.) Sometimes a monarch or government official will take all the credit. Never mind. If just once we soar high, then it will all have been worth it.

Neil Seeman is Director and Primary Investigator of the Health Strategy Innovation Cell, based at Massey College at the University of Toronto. neil.seeman@utoronto.ca

Wednesday, April 15, 2009

The Obesity Epidemic and the Rise and Fall of Public Health

"They don't understand how this could happen. I tell them that they have crushed their knees under their own weight."
-----------------------------------------------------
I'm at the annual meeting of the West Virginia Medical Association, and the conversation has turned to obesity. My colleague, an orthopedic surgeon with a local practice, explains how his clientele has grown younger with each passing year. Whereas he used to operate on people in their 70s for hip and knee replacements, he now sees patients as young as 40.

How can a 40-year-old ruin his knees? The doctor describes patients with body mass index (BMI) values of 45 - the equivalent of 152 kilograms (335 pounds) for a man with a 1.8-metre (six-foot) frame. No one at my dinner table is shocked. In West Virginia, such stories are too common. The state is one of the fattest in America, ranking second overall for obesity. Obesity rates across America are high, but West Virginia is ground zero: 30% of the population has a BMI exceeding 30. And, as in the rest of North America, this is a new phenomenon: in 1991, no state's obesity rate exceeded 20%.
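(A quick back-of-the-envelope check of that conversion - my arithmetic, not the surgeon's. BMI is simply weight in kilograms divided by the square of height in metres; the snippet below is Python only for convenience.)

```python
def bmi(weight_kg, height_m):
    """Body mass index: weight in kilograms divided by height in metres, squared."""
    return weight_kg / height_m ** 2

print(round(bmi(152, 1.83), 1))  # 45.4 - a six-foot (1.83 m) man at 152 kg is right around BMI 45
print(round(bmi(152, 1.80), 1))  # 46.9 - reading "1.8 metres" strictly pushes the figure a bit higher
```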

More troubling still is the fact that, on paper, West Virginia seems to have done everything right. For a generation, schoolchildren have learned about good nutrition - it is part of the curriculum. Public officials have gone as far as to study the use of video games such as Dance Dance Revolution in physical education classes. And politicians have implemented a variety of policies favoured by Seeman and Hobbs in their papers: bringing together stakeholders and nudging people in the right direction with everything from taxing junk food to regulating school lunch options. Yet it doesn't seem to matter. West Virginia's problems have been getting worse in recent years.

Is West Virginia the future of North America? It could be. How are we to avoid this fate? Start by looking at the rise and fall of public health.

A Brief History of Public Health

Dr. Sara Josephine Baker was a pioneer, helping to save the lives of tens of thousands. She did this without magnetic resonance imaging or even a computed tomography scanner.

Like many of the leaders of public health in the early 20th century, she focused her work on the poor. Among her initiatives was setting up a milk station in Hell's Kitchen, thereby enabling poor children to get clean, pasteurized milk. Dr. Baker didn't invent pasteurization, nor did she perfect it. Her work simply aimed at making basic food products safe and available.

For much of the 20th century, public health focused on a handful of goals to improve the environment people live and work in: sanitation, clean water and safe food. Across North America, Dr. Baker and her colleagues made it possible to grow up and grow old. Coupled with a long-standing commitment to immunization, public health officials can largely take credit for the incredible leap in life expectancy over the first half of the 20th century. Their methods may have been straightforward, but the results were extraordinary: the expansion of life expectancy during the age of Dr. Baker exceeded the expansion seen during the medical revolution of the latter half of the century.

But public health did relatively little to change people's behaviour. That is, until 1964, when a government committee issued a report and saved millions of North American lives.

In a balanced review, the report of the Surgeon General's Advisory Committee on Smoking and Health acknowledged the benefits of smoking (including its relaxing qualities); it also concluded definitively that tobacco is linked to cancer. The report had a profound effect. Today, people widely accept the connection between tobacco and cancer. In the 1950s, people were less certain - in 1958, only 44% of Americans saw the link.

Why? It wasn't for lack of evidence: by 1950, the Journal of the American Medical Association (JAMA) had shown in a large sample that 96.5% of lung cancer patients were smokers. In 1958, JAMA published another landmark study showing that cancer and smoking go hand in hand. But the tobacco industry had been clever, buying advertising and physicians to contradict the evidence.

But the report of 1964 was a game changer. The committee members spent 14 months reviewing the world scientific literature and concluded that "cigarette smoking is a health hazard of sufficient import in the United States to warrant appropriate remedial action" (page 33). By the late 1960s, 71% of Americans believed that smoking causes cancer. And so began a decades-long fall in smoking rates. When the report was issued, roughly half of adult Americans smoked; today, the rate is down to one in five.

If the surgeon general's report changed America, it also had a profound effect on public health. Gone were the days when it focused on environmental factors in health. Inspired by the success in the war on smoking, public health now focuses on bettering people and the choices they make. Today, public health is as much concerned with safe sex as it is with safe food.

But there is a problem with this approach: it applies a 1960s solution to 21st-century problems. The surgeon general's report was issued at a time profoundly different from today - before the Internet and mass health literacy. Indeed, the war on smoking itself has fallen on hard times, with smoking rates remaining relatively stable over the past decade despite a record amount spent by governments on education.

How Are We to Address the Obesity Epidemic?

In many ways, the solutions outlined in the papers by Seeman and Hobbs simply wish to continue on as public health has for the past four decades, seeking to inform and push people toward better choices. Professor Hobbs, as an example, writes, "As a business owner, for example, I may prefer not to reveal the high level of artery-clogging saturated fat or trans fat in the cookies I market. However, government regulations - the rules - may require me to list on the package label the nutritional content of my product." Later, she argues for some type of government leadership in this field, and I assume that she applauds the efforts of cities like New York to make caloric counts mandatory on some restaurant menus. It's hard to argue against such transparency measures. It's also hard to feel that this is particularly useful. In this day and age, does anyone really consider cookies - heavy with trans fats or otherwise - to be healthy? Speaking of New York, do people really walk into a McDonald's with a milkshake on their mind but not understand that this is a high-calorie snack? Hobbs bemoans the "underfunded educational campaigns" of the Bush Administration, but has it ever been easier for people to get informed about good eating habits?

Seeman, too, finds much comfort in education and transparency. He envisions teenagers mentoring children on how to be physically active. Again, it's hard to argue against such a measure. But is childhood obesity really stemming from children who don't know how to run around? Will such programs teach children how to play tag or monkey-in-the-middle?

Hobbs and Seeman do of course advocate many other ideas. That said, these ideas seem antiquated. Both Hobbs and Seeman think that it's the government's role to make exercise more available; they have fashioned themselves as modern-day Dr. Bakers, except that they want to regulate sidewalks and parks (Hobbs) or grant tax-subsidized gym memberships (Seeman) instead of building milk stations.

The approaches of Hobbs and Seeman differ - between the stick and the carrot - but the goal is the same: to get America off its couch and out of the house. But what to make of the West Virginia experience? The state is one large rolling park. Surely the problem there is not a dearth of green space but people's lack of motivation to use it. In fact, Americans have never had more disposable income or leisure time, making it easier than ever to buy a skipping rope from the local Wal-Mart and then use it.

How are we to deal with the obesity crisis? First, we can all agree that government policy shouldn't directly foster bad habits. Hobbs is right when she points out that some cheap food, particularly corn syrup-based food, is a consequence of agricultural subsidies. FDR's New Deal may or may not have lifted America out of the Great Depression, but the ongoing subsidies of corn aren't helpful.

Second, we can consider the various indirect subsidies of poor health decisions. Many Americans receive their health coverage from Medicaid (the poor) or Medicare (the elderly). Should taxpayers foot the bill for morbidly obese Americans without any restrictions? Non-government healthcare, whose underpinning is the US Tax Code, also indirectly subsidizes the unhealthy with the health dollars of the healthy - the three-pack-a-day smoker two cubicles down from the fit gentleman pays basically the same monthly premiums. Is that right?

Ultimately, though, I wonder about the limits of government policy. FDR's corn subsidies existed for decades before America grew fat. And for all the indirect subsidization of healthcare, the prospect of becoming medically ill is surely a stronger deterrent to bad habits than a good deal on an insurance premium is an incentive to change them.

The recent obesity trend seems to be more about cultural acceptance than government policy. Lawsuits argue that the obese are discriminated against; overweight actors win prized roles and then proclaim their wins as victories for the overweight; prominent citizens discuss their inability to lose weight. Even our language has changed - we talk about the "obese" and not the "fat." What then should our politicians do? Legislation will take us only so far. In other major cultural shifts - from the decline in divorce to the drop in drinking and driving - change has largely come from people speaking up and leading by example.

Maybe our politicians can toughen up their language and speak more like this: "We talk about people being 'at risk of obesity' instead of talking about people who eat too much and take too little exercise. We talk about people being at risk of poverty, or social exclusion: it's as if these things - obesity, alcohol abuse, drug addiction - are purely external events like a plague or bad weather. Of course, circumstances - where you are born, your neighbourhood, your school and the choices your parents make - have a huge impact. But social problems are often the consequence of the choices people make." The speaker isn't from West Virginia but from Britain: David Cameron, the leader of the opposition. But his fundamental idea, that one's life is one's own responsibility, is distinctly American.


About the Author
David Gratzer, MD
Senior Fellow, Manhattan Institute

One of a series of related papers in the journal Healthcare Papers.

References
Cameron, D. 2008. Speech in Glasgow. Retrieved July 7, 2008. <http://www.telegraph.co.uk/news/newstopics/politics/conservative/2263705/David-Cameron-attacks-UK-moral-neutrality---full-text.html>.

Surgeon General's Advisory Committee on Smoking and Health. 1964. Smoking and Health. Report of the Advisory Committee to the Surgeon General of the Public Health Service. Washington, DC: US Department of Health, Education, and Welfare.

Wynder, E.L. and E.A. Graham. 1950. "Tobacco Smoking as a Possible Etiologic Factor in Bronchiogenic Carcinoma; A Study of 684 Proved Cases." JAMA: Journal of the American Medical Association 143(4): 329-36.

Hammond, E.C. and D. Horn. 1958. "Smoking and Death Rates - Report on Forty-Four Months of Follow-up of 187,783 Men." JAMA: Journal of the American Medical Association 166(11): 1294-1308.

Tuesday, April 7, 2009

Workplace Rudeness: A New Pandemic?

Smartphone addiction (Blackberry or iPhone) during meetings, showing up late for meetings, and a lack of "Thank-yous" are infecting the workplace. Healthcare is no exception.

What can be done to reverse the trend? Equally important, is workplace rudeness a public health issue?

There are few legitimate excuses for everyday rudeness. Let's forget about our intensely busy selves, the tough economy, or demanding clients and colleagues. We all know that the busiest people, under the most stressful of circumstances, can be the most polite and responsive in the simplest of ways.

Low-tech approaches could help turn the tide. First, my brother, a globe-trotting entrepreneur, has a tag line on his e-mails saying: "Apologies for the curtness of this e-mail; I'm typing with my thumbs." Second, some industry associations now instruct members to respond to correspondence within 24 hours. This is eminently feasible: check e-mail at least once a day (a policy I recommend for efficiency) and, if busy, say you'll reply at a later date. Last, be tactful if using an out-of-office message. I've seen someone whose out-of-office subject line blared, simply, "VACATION" - nothing more.

I asked a former hospital CEO how to get around the problem of email jail and general time crunch, and he advised, "Hire an EA." "What's that?" I asked (seriously). (It's an Executive Assistant). But there may be a simpler, less expensive and more personal solution.

I think the social remedy to workplace rudeness is, paradoxically, to be more blunt.

  1. Be blunt about being late for a meeting

    Tai Huynh, a colleague of mine, tells me: "I personally dislike people arriving late to meetings. I think it's rude, disrespectful to colleagues (especially if the late person is the organizer) and eats into valuable meeting time. For me, the rudeness clock starts ticking at about the 5 minute mark. At about 10 minutes, the disrespect factor kicks in and by about the 15 minute mark, I wonder why the person bothers showing up."

    I figure that toting up the cost to the health system of people being late for meetings - that is, pricing the wasted minutes against the annual salaries of the attendees who agreed to come and were kept waiting - and being transparent about those costs could save the system significant sums (a rough sketch of the arithmetic appears after this list).

  2. Be blunt about meeting unnecessarily

    According to Glenn Parker and Robert Hoffman, authors of Meeting Excellence, knowing the expense of meetings may be an impetus to make meetings more productive. I believe this could be especially powerful in the culture of health policy, where, to quote one globally renowned physician-researcher, the running ethos is "to meet to plan to partner." Better to decide whether to partner - or not - at the first meeting, and to start the partnership project (i.e., writing things down) right then. If it takes more than two cancellation notices before that first meeting even happens, you know it's not worth it.

    Messrs. Parker and Hoffman point to a survey conducted at the Milwaukee Area Technical College that recorded the time that members of the college's 130-person management council spent in meetings. The evaluators used salary as the basis to calculate how much this time was worth. Meetings reportedly cost the college more than $3 million US per year.

    I know managers who charge people money when they are late for meetings. I personally don't like this approach. I have young children: I know just enough about psychology to suspect that negative incentives generally don't work in this context. People just end up grousing at the boss. Besides, often the reason for being late is that you've had back-to-back meetings all day and the first one started late. And I've been told by some very senior people in healthcare that, after all the meetings conclude, the day's real work happens "off line" - whatever that means.

  3. Be blunt about the use of smart-phones while meeting with someone

    Personal digital assistants are another curse. So-called "intraviduals" (a term invented by author Dalton Conley in Elsewhere, U.S.A.) are neither here nor there when tied to their Blackberry. A friend of mine, economist Patrick Luciani, calls this the "excuse-me-your-Highness-I-have-to-take-this-call" syndrome; even when speaking to the Queen, the other call is always more important.

    I had the pleasure of speaking recently with a wise man who told me that, not long ago, it was the height of rudeness to take another call while speaking to another. Nowadays, it's the new normal.

    "Of all of the standard irritants, uncontrolled BB use is the largest," writes Borys Chabursky. He tells me of an incident where two individuals came in to interview him for a project. "They asked a question and I would start answering and as I did, they would start checking their emails on their BBs. When I suddenly stopped speaking, they, without looking up said, 'it's ok, we're listening, just keep going.' I couldn't believe it and just stopped the interview."

    We need leaders in healthcare to rebel against the culture of passive-aggressive behaviour, Blackberry addiction and meeting creep. Several years ago, University of Michigan-Ann Arbor psychologist Lilia Cortina found that 71% of workers had been mocked, taunted, ignored, or otherwise treated uncivilly by their coworkers and bosses. Last summer, researchers at West Chester University in Pennsylvania found that 75 percent of workers are treated rudely by bosses or colleagues at least once a year. I could not find any comparable Canadian data. If you know of comparable data in healthcare, please share.

  4. Give incentives for good communication manners

    When we talk about "a healthy workplace" in hospitals or care facilities, we often refer to policies and protocols that enforce existing health and safety legislation. Innovative initiatives like the Healthy Healthcare Leadership Charter, created by the Quality Worklife-Quality Healthcare Collaborative (QWQHC), are making important strides forward to cultivate healthier healthcare workplaces.

    Imagine if basic civility were the touchstone for a healthy workplace. That might go a long way to saving money, stopping burn-out, and promoting happiness at work - and at home.

  5. Say "Thank You" more often

    I invite you to express your gratitude: send a "thank you" note to someone who is making a difference, and help put a stop to the incivility disorder in the workplace. Thank you for reading this.
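As promised in the note on lateness above, here is a rough sketch of that arithmetic - illustrative Python with invented salaries, hours and meeting counts, not data from any health system and not Parker and Hoffman's method.

```python
# Illustrative only: the salaries, headcount and meeting volume below are made up.
WORK_HOURS_PER_YEAR = 1950  # a rough full-time figure

def lateness_cost(attendee_salaries, minutes_lost):
    """Price the minutes everyone spends waiting at each attendee's hourly rate."""
    return sum((salary / WORK_HOURS_PER_YEAR) * (minutes_lost / 60)
               for salary in attendee_salaries)

salaries = [180_000, 120_000, 95_000, 95_000, 80_000]  # five hypothetical attendees
per_meeting = lateness_cost(salaries, minutes_lost=10)
print(f"Ten late minutes cost roughly ${per_meeting:,.0f} per meeting")
print(f"Across 300 such meetings a year: ${per_meeting * 300:,.0f}")
```

Crude as it is, the exercise makes the point: the waiting time of well-paid people adds up quickly, whether or not anyone ever sees the bill.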


About the Author
Neil Seeman, a Longwoods essayist, is Director and Primary Investigator of the Health Strategy Innovation Cell, based at Massey College at the University of Toronto.