A New Jersey law firm pursuing a workers’ compensation case has stumbled onto what might be the ultimate social determinants of health question:
Should work compensate us for negatively affecting our health?
The law firm asserts that employment which causes or exacerbates depression is valid grounds for a workers’ compensation claim. Apparently, this can vary from state to state, as laws regarding mental health and medical care aren’t always in alignment. Depression caused by a physical workplace injury is one thing; depression caused by the workplace itself, quite another. Even if the precedent is still somewhat shaky, the whole notion raises some fascinating questions, and it brings our current system for health and wellness into sharp focus.
Social determinants of health: Workplace effects
We know that lifestyle has an important impact on health. But the reality is that lifestyle is determined less by individual choices and more by the inertia and pressures of work. The health of Americans is directly and indirectly affected by our work and our culture around jobs. There is very little doubt that much of this impact is negative.
- Desks – Too much sitting has been linked to chronic back pain, decreased energy, and premature death, leading some researchers to label sitting “the new smoking” in terms of its ubiquity and harmfulness.
- Vacation Days – Americans are notorious for failing to use their vacation days. The cultural and economic factors that lead them to leave time off on the table also lead to increased stress, depression, and anxiety, as well as the physiological manifestations of all these mental and emotional disorders.
- Office Snacks and Diet – The same culture that drives Americans to work through vacations and time off also shows up in their dietary choices. Office parties and snack options all trend toward the calorie-rich, nutrition-poor variety. Even cutting-edge startups and “hip” employers use free food as a bargaining chip, and nothing is cheaper (or more tantalizing to tired, stressed, hungry workers) than junk food.
- Commuting – Americans spend nearly an hour a day on average just driving to and from work. All these hours add up, and researchers have associated this ritual with outcomes ranging from worse blood glucose levels to higher rates of depression. The very act of going to and coming home from work makes people sick.
- Fast Food – The kissing cousin of the American commute is the drive-thru. In the same way that cheap, quick, convenient food permeates the office, it also dominates the dietary choices of average workers in their off hours. After a long, hard day at work, feeling tired and unmotivated, it is all too tempting to swing through the drive-thru lane of some neon-lit happiness merchant with a dollar menu and a to-go bag. Throw marriage and kids into the mix, and fast food goes from an indulgent choice to a default behavior.
- Lack of Sleep – We now know that sleep may be the best medicine of all, even more critical to good health than diet or exercise alone. Yet Americans routinely fail to get enough sleep, and the trend can often be attributed to work-related stress and the inability to sufficiently unwind or “switch off” between shifts.
- Less Exercise – All the above factors combine to diminish the time and energy people have to get appropriate exercise. Nurses and wait staff alike can attest that even jobs that keep you on your feet every day don’t fully substitute for a balanced, consistent exercise routine; they do, however, make it harder to squeeze in meaningful physical activity before or after the workday.
There is almost nothing that has gained attention as an “epidemic” threatening public health in this country that isn’t also directly correlated with features of modern employment and work culture. But these aren’t diseases easily treated with more medication or even more education.
Looking upstream to solve America’s health crises
Even though this train of thought started with a lawsuit, the purpose isn’t to assign blame. Blame is categorically unhelpful for improving health outcomes. You can blame an individual patient for the poor choices that led to obesity and heart problems, but that doesn’t do much to change behavior or prevent others from making the same choices. You can blame tobacco companies for encouraging people to smoke, but it doesn’t change the efficacy of their message or the availability of addictive products.
You don’t dam a river by building a wall around the ocean. You don’t solve a problem simply by identifying the consequences. To better manage health challenges, moving past blame means going upstream to understand the real causes and identify meaningful opportunities to treat those root causes and improve preventative efforts.
The standard epidemiological process chases outbreaks of viral and bacterial diseases—diseases that can be traced to a patient zero, that can be isolated, and that typically have discrete causes. The goal is to stop or prevent epidemics by isolating their sources, understanding how they spread, and breaking the cycle. That is hard to do with non-communicable conditions of the sort that plague Americans.
If the many types of cancer, cardiovascular disease, obesity, dementia, depression, and suicide all had discrete and easily isolated causes, they wouldn’t comprise the leading causes of death in the United States. It is apparent that these all emerge from a collection of social, cultural, lifestyle, and yes, economic factors. Even chronic respiratory diseases not directly triggered by smoking can often be attributed to environmental factors associated with working indoors.
Let us not forget that doctors and nurses are far from immune to some of the worst workplace health hazards: abuse, stress, long hours, and poor diet are all hallmarks of the caregiving professions, which also top the charts for burnout, depression, and suicide risk. Even our healers can’t heal themselves from the ills of work. Working, and all the lifestyle changes that accompany it, has accumulated into an unstoppable force driving us to make compromises and excuses that ultimately lower our quality and length of life.
Work: The plague rat you can’t contain
All this raises a complicated question: If institutional causes (work, employment) underlie so much of society’s risk of illness and injury, can anything be done to change the system? Individual responsibility may not be a misattribution, but rather a focus on the only area where influence is still possible. After all, we can’t collectively stop working.
An example of a more pragmatic balance of shared responsibility and mutual accountability can be seen, of all places, in the realm of cybersecurity.
In the wake of the recent news that Equifax, one of America’s three main credit reporting bureaus, was hacked and exposed the data of nearly half the country’s population, a sense of helplessness took hold. We can’t get rid of the credit reporting system, and we can’t simply opt out—it is as universal, it would seem, as employment. Even experts began to express the sentiment that, in today’s digital, highly connected age, no security solution is permanent, and you can generally assume that someone, somewhere, already has your data. Statistically, you just can’t afford to think you haven’t been exposed at least once, or that you won’t be vulnerable sometime soon.
Cybersecurity and data breaches parallel jobs and negative health outcomes quite well:
- exposure is universal (everyone is affected)
- the probability of a negative event is almost absolute (it is a question of when, not if)
- the alternative is impossible (either go back in time or avoid participating in modern society)
- our institutions are centrally responsible
However, in the case of cybersecurity, institutions are still held accountable, structurally and financially, for negative events, even in the face of inevitable breaches and failure.
Organizations are still prioritizing investment in prevention, on the logic that the costs of doing nothing outweigh the costs of even imperfect preventative measures. Individuals, in the meantime, still assume some responsibility for safeguarding their own data, monitoring their own behavior, and making informed decisions about when and how to take risks with respect to security.
Even though no system is perfect, we are holding institutions accountable for taking their role seriously in minimizing negative events. And, even though we look to our institutions for investment and systemic preventative efforts, individuals still take responsibility for personal education and incremental preventative behaviors.
Culpability and cost sharing: Treat health like security
The upshot of this mutual accountability for cybersecurity is that costs are increasingly being distributed: Prices will likely continue to rise for everything, including healthcare (still one of the most vulnerable targets for hackers and thieves), as the losses and expenses associated with security and data breaches get built in at every level. But there is a collective awareness that it is in everyone’s interest, and that risk can’t be magically eliminated.
That’s harder to do with health, but far from impossible. America has been locked in a debate over who should be responsible for paying and captivated by opportunities to blame people and institutions alike for their health and economic challenges. But in recognizing how profoundly linked our health is with our culture, our economy, with the very work we do and the value we try to add to the world, it would seem our current debates are insufficient. It may be a necessity—even more important than employer-sponsored insurance or government single-payer healthcare—to openly recognize the relationship between work, modern living, and illness.
Like identity theft and data breaches, working is an inevitability for most people. Even those who can afford not to work often busy themselves with some occupation or other; it turns out, not working is also associated with many negative health outcomes. For many people, in spite of the harm it does, employment and work are central to identity, and leaving that behind—whether due to retirement or winning the lottery—can be detrimental to mental and emotional health. We don’t have a way to stop working, even though work is often killing us.
So, in the absence of genuinely viable alternatives to working, and taking whatever work is available, can we charge employers and the systems that create work with paying for our healthcare? Should we all be suing employers to get compensation commensurate with the real inputs of our jobs—time and energy, yes, but also health and wellness? Should employers of all sizes become more closely associated with health centers, to better link treatment to the causes of disease and illness?
Our current system disproportionately blames individuals for their own health and bills them to pay for their own care. A different balance is possible, but it won’t likely come from simply inverting the system to put all costs and accountability on the shoulders of employers.
We know from experience that going upstream with research brings us closer to understanding the causes, and possible treatments, of disease. We do it to contain epidemics, we do it to isolate the genes responsible for heritable conditions, and we desperately need to do it to understand the social determinants of health that are making Americans sick. Work seems a likely culprit, but meaningful change and effective prevention will likely take a rebalancing that partly relieves individuals, and better engages institutions.
Edgar Wilson is an Oregon native writing on trends in health, education, and global affairs. He studied conflict resolution and international relations and has worked in industries ranging from international marketing to broadcast journalism. He is currently working as an independent analytical consultant. He can be reached via email ([email protected]) or on Twitter @EdgarTwilson, and more of his work viewed through Contently.