Kidney disease in African Americans is one of the most dramatic racial disparities in disease occurrence, and it results in significant suffering and death. Kidney disease is generally the result of diabetes and high blood pressure, and given the increased prevalence of both conditions in African Americans, there is a six to twelve-fold increased occurrence of kidney disease compared to whites. Additionally, there is a 17-fold greater rate of high blood pressure as a cause of kidney failure in African Americans. If you have high blood pressure or diabetes, or both, your risk for kidney failure resulting in needing dialysis is MUCH higher if you are African American.
Having diabetes and high blood pressure that are controlled on medications almost erases this increased risk. This is why it is critical that, if you have high blood pressure, you take medication to bring it down. If you have diabetes, you should make sure your blood sugars are controlled, because if you don’t, your risk for needing dialysis is very high.
Risk for Requiring Dialysis is High
While African Americans are 13 percent of the general population, we make up 35 percent of all patients on chronic dialysis. Diabetes is the leading cause of kidney failure, and high blood pressure is the second most common cause.
Not having medical insurance or access to medical facilities, together with the higher prevalence of high blood pressure, contributes greatly to kidney disease in African Americans. Having high blood pressure but being on the wrong medications can contribute as well.
Well-designed studies have failed to fully account for the excess proportion of kidney disease in Blacks. Despite equivalent age, blood pressure, and other factors, African Americans tend to have reduced kidney blood flow. And despite similar dietary salt intake, the kidney’s processing of bodily fluids is somewhat different in African Americans compared to whites. Reducing salt in your diet can greatly improve health.
A Possible Genetic Cause?
Some of the increased risk for kidney disease in African Americans is attributed to a genetic variant (APOL1) found in more than 30% of African Americans and largely absent in white Americans. It is thought that this gene variant offered protection from African Sleeping Sickness (a frequently deadly disease known in medical circles as African trypanosomiasis) carried by the tsetse fly. In short, having this gene variant was beneficial in the African regions where the tsetse fly lived.
Scientists believe that the increased risk for kidney disease seen in African Americans tracks with the increased occurrence of this same gene variant that once offered protection from the deadly African Sleeping Sickness.
With all of the kidney disease in the African American community, there is one last bit of curious news. African Americans have a better survival rate on dialysis than white Americans. This paradox of improved survival in African Americans after initiation of dialysis has puzzled researchers. Researchers at the Wake Forest School of Medicine suggest that the improved survival may also be due to the very gene that causes the problem . . . the APOL1 gene. In this case the APOL1 gene gives protection against hardening of the arteries while on dialysis.
Here’s What You Need To Do . . .
Kidney disease in African Americans can be a confusing topic to understand and there is a lot to consider. The most important points are:
If you have high blood pressure, take your medicine and watch your salt intake so that your pressure stays normal. That will allow your kidneys to stay normal.
If you have diabetes, take your medicine and watch your diet so that your blood sugars stay normal.
Watch your weight because the bigger you are, the higher your chance for kidney disease.
Three out of four African Americans are lactose intolerant. Lactose intolerance means that if you drink milk, eat yogurt, have cheese, or any other dairy-based product in large amounts, your digestive system will have difficulty digesting it. Most people report feeling bloated and later have loose gassy stool (sorry . . . but these are the facts).
If you are not near a toilet (of your choice), this can be an embarrassing problem. The digestive system’s reaction to not being able to digest lactose (a sugar in dairy products) is to simply flush it through. For most African Americans with lactose intolerance, the result is simply avoidance of milk and milk-related products.
If only one serving of dairy causes stomach upset and loose stool . . . what will three servings cause? That question is what many African Americans ask themselves, and the answer has been very clear. African Americans drink significantly less milk and eat substantially less cheese and yogurt when compared to the rest of the American population.
The decreased dairy consumption leads to decreased intake of essential nutrients that are found in milk and cheeses. Studies show that African Americans’ intake of the required nutrients calcium, vitamin D, and potassium is lower than that of white and Hispanic Americans. And it has been well known in medical circles that African Americans have significantly lower vitamin D levels in their blood.
A Genetic Link for Lactose Intolerance??
The avoidance of milk and related products by African Americans is not simply a matter of preference. Lactose intolerance in African Americans may have a genetic basis. Research has shown that the proportion of people who are lactose intolerant can be tied to their region of genetic origin. Put simply, regions where dairy herds could be raised safely and efficiently produced people who could digest lactose. Harsher climates in Africa and Asia restricted the availability of milk and produced populations with much more lactose intolerance, a study at Cornell University found. Researchers found a wide range of lactose intolerance rates, from as low as 2 percent among people of Danish descent to nearly 100 percent among people of Zambian origin.
Their survey “found that lactose intolerance decreases with increasing latitude and increases with rising temperature”.
Newer information suggests that there may not be as many completely lactose intolerant African Americans as previously thought. Nutritionists have advised that adding milk to a larger meal helps with successful digestion. Some find that having smaller amounts of dairy over time improves digestion and decreases symptoms.
Lactose Intolerance Solutions
Others advise simply taking a lactase enzyme supplement (Lactaid, for example), which solves the problem: milk, yogurt, or cheese is then broken down normally and naturally . . . while the dairy products again provide improved nutrient supplementation.
Replacing the missing nutrients that result from low dairy consumption has become fairly easy thanks to multiple milk equivalents, including soy, almond, coconut, and other ‘milks’ that can be used as part of a healthy breakfast. All have been ‘fortified’ with calcium and vitamin D as needed. Oatmeal and/or whole grain cereals with milk equivalents can make a fast and nutritionally efficient meal.
A ‘new’ problem is that African Americans consistently eat breakfast less often, and therefore the “opportunity” to have milk, yogurt, cheese, or milk equivalents has substantially decreased. Look at my article on “Diet Differences in African Americans” for more details.
Making a point of having the required serving of calcium and vitamin D in the form of a dairy (or dairy-like) product is the next nutritional priority for Black Americans seeking a longer and healthier life.
When a stroke occurs, African Americans tend to have it earlier in life and to present with more severe and disabling deficits. In a study looking at over 200,000 patients, the “Cardiovascular Quality and Outcomes” group concluded that “compared with other race/ethnicity groups, (African American) patients were less likely to receive IV tissue-type plasminogen activator <3 hours, early antithrombotics, antithrombotics at discharge, and lipid-lowering medication prescribed at discharge.”
Not surprisingly, with these prescribing deficiencies in play, data analysis also showed a persistently increased re-hospitalization rate in African Americans at both 30 days and one year for all causes. African Americans also have a 2.4 times higher rate of recurrent strokes than white Americans, and the highest death rate of any racial group.
Stroke patients overseen by neurologists were almost 4 times more likely to receive IV clot-dissolving medicine than those seen by non-neurologists across all races and ethnicities (a study from the Baylor College of Medicine), but unfortunately African Americans were half as likely as whites to be seen by a neurologist when presenting with a stroke.
Aspirin to reduce Strokes in African Americans
Aspirin use is lower among African Americans than among whites, even though the indications for aspirin use are actually greater in African Americans. More African Americans should be taking aspirin because it reduces the risk of stroke, heart disease, and colon cancer, a benefit shown even at the low dose of 81 mg. The risk of a gastrointestinal bleed is much lower than the risk of stroke, heart attack, and related events.
African Americans over age 40 should be taking aspirin to help with the increased incidence of colon cancer, heart disease, and strokes.
Overall, prevention experts (USPSTF) recommend referring adults who have stroke risk factors and are obese to intensive behavioral counseling to promote a healthy diet and more physical activity. That means going to your doctor and having a detailed conversation about what you do . . . and what you eat. For example, decreasing your intake of salt and fried foods, lowering your blood pressure, and getting proper exercise can greatly decrease strokes in African Americans.
Take a look at this video that explains why you need to start your medicine, keep taking it, and come in to make sure it is doing what it’s supposed to be doing. Take care.
Multiple studies over an extended period of time confirm what most doctors and providers already knew: African Americans are more likely to distrust doctors and other healthcare providers than patients of other racial or ethnic groups.
What many of us did not know was why. As providers, we spent many years training to help others. Medicine is a service profession. Why would anyone suspect our intentions, question our motives, or label us collectively as untrustworthy? The answer lies in the historical experience African Americans had with America’s doctors, hospitals, and researchers.
A History of Abuse
While the Tuskegee Syphilis Study is a ‘classic example’ of abuse based purely on race, unfortunately the American experience has many more examples of why African Americans mistrust the medical community.
From African Americans’ earliest days in this country, abuse based on race was commonplace. Slaves were frequently used as subjects for dissection, surgical experimentation, and medical testing. J. Marion Sims, MD, the so-called father of modern gynecology, perfected many of his surgical techniques on slave girls without anesthesia. Stories of doctors kidnapping and killing southern blacks for experimentation consistently appear in literature throughout American history.
As Vanessa Northington Gamble, MD, PhD, put it in her article “Under the Shadow of Tuskegee: African Americans and Health Care,” tales of ‘medical student’ grave robbers recount the exploitation of southern blacks, whose deceased family members would be stolen and sent to northern medical schools for anatomy dissection. Dr. Gamble writes:
“These historical examples clearly demonstrate that African Americans’ distrust of the medical profession has a longer history than the public revelations of the Tuskegee Syphilis Study. There is a collective memory among African Americans about their exploitation by the medical establishment.”
Racial Differences in Trust
Chanita Hughes Halbert published a study in JAMA in 2006 looking at racial differences in trust in healthcare providers. Her study of almost one thousand white American and African American patients found that “compared with whites, African Americans were most likely to report low trust in health care providers.”
“Trust has been described as an expectation that medical care providers (physicians, nurses, and others) will act in ways that demonstrate that the patient’s interests are a priority. Trust is a multidimensional construct that includes perceptions of the health care provider’s technical ability, interpersonal skills, and the extent to which the patient perceives that his or her welfare is placed above other considerations. Trust is an important determinant of adherence to treatment and screening recommendations and the length and quality of relationships with health care providers.”
Fortunately, the level of trust a patient has for any specific provider is not static; it can be earned. Increased exposure to providers in general, and to the same provider in particular, has been shown to improve trust.
In the “Medscape Internist Lifestyle Report 2017,” Carol Peckham looked at internists’ admitted explicit biases “toward specific types or groups of patients” and found wide differences between racial groups in bias for a number of influences. The report further examined whether physician bias actually affected care delivery, and almost one in five providers (18%) admitted that their bias did impact the quality of their care.
Generally these biases are positive toward white American patients and negative toward African American patients, as a study by Oliver et al at the University of Virginia demonstrated. They found providers explicitly preferred white Americans to African Americans, with “significantly higher feelings of warmth toward white people,” and also found that providers regarded white American patients as “more medically cooperative than African Americans.” This particular study found no significant difference in the quality of care between the racial groups.
Biases that affect medical care can be consciously counteracted, and admitting that biases exist is the critical first step in canceling their effect on medical care. A doctor who professes to treat “everyone the same” will undoubtedly provide inferior care to patients who are different.
A study done at Johns Hopkins by Lisa Cooper and colleagues found that primary care physicians who hold unconscious racial biases tend to dominate conversations with African-American patients during routine visits, paying less attention to patients’ social and emotional needs, and making these patients feel less involved in decision making related to their health. These patients also reported reduced trust in their doctors, less respectful treatment, and a lower likelihood of recommending the physician to a friend.
Because there are a limited number of physicians to provide care to African Americans, many patients simply “put up” with biases and unequal treatment . . . while others avoid healthcare altogether until they arrive in Emergency Departments with very advanced disease.
Patient Centered Care Improves Quality
Patient-centered care can measurably improve care, specifically for African Americans. Although this seems obvious, spending time with patients is an easy way to establish trust. Fiscella and colleagues measured patient trust against the time spent with a patient and found a direct correlation: more time spent led to more perceived trust on the part of the patient. Making suggestions about diet changes requires a trusting relationship that involves a non-judgmental regard for the current diet.
Many delays in diagnosis and treatment are simply an outgrowth of this lack of trust. You will not accept someone’s advice if you don’t trust them.
When it comes to the treatment of high blood pressure in African Americans, there are a number of important differences. For reasons that are not entirely clear, many African American patients respond differently from white patients depending on which hypertension medication is used.
Evidence from studies suggests that African Americans do very well with thiazide diuretics (a “water pill”), and these should be used often for the treatment of high blood pressure (hypertension). In the “Antihypertensive and Lipid-Lowering Treatment to Prevent Heart Attack” (ALLHAT) trial, a thiazide-type diuretic (chlorthalidone) was better at reducing blood pressure and preventing cardiovascular events like a heart attack or stroke in African Americans than an ACE inhibitor (lisinopril) or an alpha-adrenergic blocker (doxazosin).
Best Stroke Prevention in African Americans
For ideal blood pressure control, the thiazide-type diuretic dose should be equivalent to chlorthalidone 12.5 to 25 mg/day or hydrochlorothiazide 25 to 50 mg/day, because lower doses have not been found to be as effective. Overall, calcium channel blockers (amlodipine) have also shown great effect in African Americans as an initial choice, and are more effective in decreasing strokes than water pills. Thus an African American male would be best served by amlodipine first line given the stroke prophylaxis, and an African American female better served with a thiazide diuretic initially to get to goal more efficiently.
ACE Inhibitors are not preferred
Angiotensin-converting enzyme (ACE) inhibitors and angiotensin II receptor blocker (ARB) medications are less effective in African Americans for blood pressure control and are associated with worse outcomes. A large study of over 400,000 patients done at the New York University School of Medicine compared outcomes in African Americans and European Americans across three distinct treatment groups.
Their study showed that ACE inhibitors were associated with a significant increase in stroke, heart failure, and combined cardiovascular disease when compared with calcium channel blockers or thiazide diuretics in African Americans. The worse outcomes with angiotensin-converting enzyme (ACE) inhibitors were similar to those seen with beta-blockers in this population.
Because ACE inhibitors are commonly listed as “first-line” medications for hypertension control in national and international guidelines and recommendations, it should be noted that this is principally based on their response in white populations. Based on these large African American-inclusive studies and a number of considerations (including cost, co-morbid conditions, and disease propensities), the National Institute for Health and Clinical Excellence clinical practice guideline suggests calcium channel blocker therapy initially in African Americans, substituting a thiazide-like diuretic in the event of edema or intolerance “or if there is evidence of heart failure, or a high risk of heart failure.”
Putting all of these risks aside (imagine that??), the blood pressure response to ACE inhibitors in African Americans is usually less than the response to calcium channel blockers, thiazide diuretics, or even beta-blockers. Researchers suspect that the low blood pressure response is related to “high sodium intake in salt-sensitive” patients, but others have suggested that hypertension in African Americans may just be different.
More Side Effects in African Americans
African Americans have a greater risk of ACE-related cough, and a higher rate of stopping the medication due to cough, compared to other racial groups. African Americans are also more prone to developing full-blown ACE-related allergic reactions.
When considering all of these issues with ACE inhibitors and ARBs in African Americans, it should be noted that they are essential for preventing kidney disease in people with diabetes and certain other kidney-related problems. So if you don’t know why you’re on an ACE inhibitor or ARB, call your physician and ask. The renal-sparing benefits of ACE and ARB medications are still very valid when used to slow the decline of renal function (particularly in hypertensive renal disease), and they should still be used for kidney protection in African American patients with diabetes and similar conditions.
So don’t just stop your medications based on this article, please check with your provider. Use this article as a starting point for your discussion. Some providers are aware of these differences, and others may not be fully aware.
The Tuskegee Syphilis Study (originally called “Tuskegee Study of Untreated Syphilis in the Negro Male”) was designed to record the natural history of syphilis with the hope of justifying the funding of public treatment programs for African Americans. The study, which began in 1932, included 600 African American men, 399 with syphilis and 201 without. While the study was originally slated to last 6 months, it was extended for over 40 years. Central to the study was the patients’ lack of informed consent. None of the patients were told they had syphilis; instead they were told they had “bad blood” that required monitoring. In exchange for taking part in the study, the men received free medical exams, free meals, and burial insurance. Many physicians, including African Americans, and national physician societies fully supported the study.
Betrayed Trust & Conspiracy
During the study, researchers not only allowed the disease to progress but actively blocked the men from receiving curative treatment, not just from the study physicians but also from other community physicians. The researchers implemented a coordinated effort . . . a verified conspiracy, with area physicians and hospitals to actively block treatment if the men presented elsewhere for care. Needless to say, the study required the widespread communication of personal health information across an entire region, involving hundreds of people. The men’s names and a stigmatizing diagnosis were circulated widely, in a way the patients would never know. The fact that nearly 400 African American men were denied effective treatment for syphilis without their knowledge or consent, so that researchers could document the natural history of the disease, stands as a singular event that largely validates the mistrust African Americans hold toward the medical establishment.
40 Years Later . . .
It wasn’t until 1972, when a news article reported the study, that a government review panel finally halted it. The Tuskegee Health Benefit Program was established as a settlement for the class action suit brought against the United States. The US agreed to pay all medical and burial expenses for the subjects involved, with added support for their families. During the course of the study, 40 wives contracted the disease and 19 children were born with congenital syphilis. Many credit the Tuskegee Syphilis Study as the main reason informed consent regulations exist today. For many African Americans, the study is the perfect example for why to not trust public health, medical research, or healthcare.
In 1997, a formal apology was issued by the US government and the survivors were invited to the Oval Office by President Clinton.
Some argue that with time the Tuskegee Syphilis Study has become merely a distant historical event for most African Americans. A study done at Johns Hopkins looked at awareness of the Tuskegee Syphilis Study and found that an overwhelming number of African Americans (81%) were aware of the study and its outcomes, while only 28 percent of European Americans had knowledge of it. Given such widespread knowledge of this government-sanctioned and funded study within the African American community, mentioning the study as a way to stimulate discussion, and build trust, is preferable to ignoring its existence.
Almost half of people between 18 and 35 have tattoos, and almost one in four regrets it, according to a 2016 Harris Poll. Based on an estimate of about 60 million people in that age group, that works out to roughly 30 million people with tattoos and about 7.5 million people with tattoo regret.
As a primary care physician, I’ve noticed anecdotally that many of my younger patients have regrets about their tattoos. When I ask about them, many say that they got them when they were young, and at the time put little or no research into the decision.
With no source (reliable or otherwise) of tattoo information to suggest to my patients, I began to investigate the topic myself. My goal was to write a quick reference for teens that reviewed the health and social issues they might encounter after getting a tattoo.
What I found were myriad unexpected and sometimes shocking concerns that everyone should know about. To my surprise, there were a host of reports of ink complications, infections, toxin effects, scarring, burns, chronic irritations and much more.
The ink goes more than skin deep
Among the concerns are the long-term effects tattoo inks can have on the immune system, pathology specimen interpretation and other unforeseen health complications.
The European Society of Tattoo and Pigment Research was established in 2013 with a mission of educating the public about the “fundamental facts about tattooing” which many in the younger generations ignore. That group found barium, copper, mercury and other unsafe components in tattoo inks. Their research also found a disheartening mismatch between the listed ink container contents and its actual chemical composition found on testing.
More recently, the Food and Drug Administration has become more involved with tattoo inks, stating “Many pigments used in tattoo inks are industrial-grade colors suitable for printers’ ink or automobile paint.” Like the studies started overseas, the agency is now examining the chemical composition of inks and pigments and how they break down in the body, as well as their short- and long-term safety.
Tattoos have led to errors in medical treatment, testing
Metal-based ink tattoos can react with magnetic resonance imaging studies. For instance, two case studies detail patients who suffered MRI-induced burns in their tattoos that were attributed to iron compounds in tattoo pigments. Radiologists say this magnet-based reaction is rare, but some have suggested simply avoiding iron-based tattoo inks.
Pathologists, meanwhile, are reporting tattoo ink in surgical biopsy specimens of lymph nodes. For instance, a 2015 report in the journal Obstetrics and Gynecology detailed the case of a young woman with cervical cancer which doctors believed had spread to her lymph nodes. After surgery to remove the nodes, they discovered that what appeared to be malignant cells in a scan was actually tattoo ink. A similar misdiagnosis occurred in another patient with melanoma.
Three percent of tattoos get infected, and almost four percent of people who get tattoos recount pain lasting more than a month, a 2015 study from Tulane University School of Medicine found. About 22 percent of participants with new tattoos reported persistent itching that lasted more than a month.
A spate of mycobacterial skin infections in 22 people across four states in 2011 and 2012 was tied to a few specific brands of ink. The Centers for Disease Control and Prevention, in conjunction with local departments of public health, were able to contain these infections through intense tracking and investigation.
A study reported in Hepatology found that “tattoo exposure is associated with HCV (hepatitis C virus) infection, even among those without traditional risk factors. All patients who have tattoos should be considered at higher risk for HCV infection and should be offered HCV counseling and testing.”
Hepatitis, which is 10 times more infectious than HIV, can be transmitted through needles used by tattoo artists. It is the reason the American Red Cross restricts blood donations from individuals with newer tattoos done outside of regulated tattoo facilities.
A study from Tulane University added credence to these blood donation restrictions by showing that 17 percent of all participants had at least one tattoo done somewhere other than a tattoo parlor, and 21 percent admitted to being intoxicated while receiving at least one of their tattoos.
A youthful decision with adult implications
The primary reason Harris Poll respondents reported tattoo regret was they “were too young when they had it done.” The second most common reason, which coincides with the first, is the tattoo “didn’t fit their present lifestyle.”
Whether a tattoo depicts a name, a person, a place or a thing, its meaning and perception are in constant flux. Eric Madfis and Tammi Arford, writing about the dilemma of symbols and tattoo regret, note that “Symbols are dynamic in that they are time-specific, ever-changing, and always in a state of gradual transition.”
Tattoos have a different meaning depending on the interpreter, their relative history and knowledge, and they are dynamic because they can take on different meanings through time and experience. The first person to get a barbed wire tattoo on an upper arm could be seen as clever, inventive, unique and trail-blazing. The one-hundredth person to get the same tattoo was none of these things, and with time, if either was seen in public, both would receive the same reaction.
The “emotional response in the beholder” of any given tattoo can be based on “social stratification” and is not consistently predictable, according to Andrew Timmings at the University of St Andrews in the United Kingdom. Their interviews of hiring managers showed that tattoos can actually hurt job prospects.
Researchers at the Harris Poll found that older respondents are less tolerant of visible tattoos as the prestige of the job position rises. While a vast majority of people age 51 and above are comfortable with professional athletes having tattoos, the acceptance decreases significantly when doctors, primary school teachers and presidential candidates are included.
Understandably, people who have many friends and family with tattoos are generally less stigmatized regarding their tattoo, and tend to suffer less tattoo regret, a study in The Social Science Journal reported in 2014. But the study also found that when tattooed respondents were exposed to individuals without tattoos, like in the workplace or institutions of higher learning, more stigma victimization occurred, and those impacted were more likely to suffer regret and ponder removal.
Getting a tattoo, which is akin to a life-changing (and body-changing) decision, when young is really no different from getting married young (32 percent regret rate) or choosing a college major (37 percent change rate). For many, making a major decision when young is rife with regret. The difference with tattoos is having to face that regret on a daily basis.
Current lasers still have limitations in the colors they can erase, with added difficulty stemming from more vibrant tattoo colors. People with darker skin pigmentation tend to have less success with certain lasers and require more sessions to avoid skin damage.
Because the laser shatters the pigment particles under the skin for removal by the body, the issues with infections, scarring and the ink spreading become a concern again. Tattoos covering extensive areas of the body are simply too large to tackle in one session, and could take years to remove.
Laser complications include pain, blistering, scarring and, in some cases, a darkening of the tattoo ink can occur, according to dermatologists.
As technology and the demand for tattoo removal advances, some of the limitations of current lasers will shrink. Newer, easy-to-remove inks are being patented, which may represent a healthier approach due to biodegradable ingredients, and a more predictable laser response. Picosecond lasers are also dramatically decreasing the number of sessions needed in select populations.
Education is the key
With such a large number considering tattoos at a young age, informing young people of the health and social risks could help them avoid tattoos they may come to regret. Adding permanent body art education to health classes could mitigate some of these mistakes and decrease later regret.
With the startling death of Prince at the age of 57, many began to reflect on the seemingly ‘premature’ deaths of ground-breaking artists like Michael Jackson, Elvis Presley, and Hank Williams. Even as a physician, I began to ask myself a number of questions. Do great music artists die young? Are there certain conditions that are more likely to cause a star’s demise? And finally, is there some lesson to be learned that might help our remaining beloved music artists?
I began by tabulating the vital statistics on the 252 members of Rolling Stone Magazine’s “100 Greatest Artists” from the music industry. The list ranged from the #1 group, The Beatles, with two members who met a premature death (John Lennon at age 40 and George Harrison at age 58), to the #100 group, The Talking Heads, without a death among them. In between were stars like Jimi Hendrix, who died at age 27 from a drug overdose, Bob Marley, who passed at 36 from skin cancer, and Marvin Gaye, who was shot and killed by his father at age 44. In all, 82 of the 252 members of this elite group had died.
Homicides and Accidental Deaths
There were six homicides for various reasons, ranging from the psychiatric obsession that led to the shooting of John Lennon to the planned ‘hits’ on rappers Tupac Shakur and Jam Master Jay. There is still a good deal of controversy surrounding the shooting of Sam Cooke by a female hotel manager who was likely protecting a prostitute who had robbed him. Al Jackson Jr., the renowned drummer with Booker T & the MGs, was shot 5 times in the back by a burglar in his home under mysterious circumstances that still baffle authorities.
An accident can happen to anyone, but the “100 Greatest” have more than their share. There were numerous accidental overdoses, including Sid Vicious of the Sex Pistols at age 21, David Ruffin of the Temptations at age 50, Rudy Lewis of The Drifters at age 27, and country great Gram Parsons, who was found dead at age 26.
While your odds of dying in a plane crash are about 1 in 5 million, if you are one of the “100 Greatest” those odds jump to 1 in 84 (three of the 252 died this way). Buddy Holly, Otis Redding, and Ronnie Van Zant of the Lynyrd Skynyrd Band all died in airplane accidents while on tour.
Increased Liver Disease
While liver-related diseases account for only 1.4% of deaths in the general population, they accounted for over three times that proportion among the “100 Greatest Artists” deaths. The increased occurrence of these diseases is probably related to the elevated alcohol and drug use in this group. Liver bile duct cancers, normally extremely rare in the general population, ran suspiciously high in our small but esteemed group, with Ray Manzarek of The Doors and Tommy Ramone of the Ramones both dying prematurely from a condition that normally affects less than one in a thousand people.
Tobacco Use Effects
The vast majority of the “Great 100” were born in the 1940s and reached maturity during the 1960s, when tobacco smoking peaked. As a result, an increased number of artists died from lung cancer, including George Harrison of the Beatles at age 58, Carl Wilson of the Beach Boys at age 51, Richard Wright of Pink Floyd at age 65, Eddie Kendricks of the Temptations at age 52, and Obie Benson of the Four Tops at age 69. Throat cancer, also linked with smoking, caused the deaths of country great Carl Perkins at 65 and Levon Helm of The Band at 71.
A good number of the “100 Greatest” died of heart attacks or heart failure, including Ian Stewart of the Rolling Stones at age 47, blues greats Muddy Waters at age 70 and Howlin’ Wolf at age 65, Roy Orbison at age 52, and Jackie Wilson at age 49.
We recently saw Glenn Frey succumb to pneumonia, but so did Jackie Wilson at age 49, nine years after having a massive heart attack. James Brown complained of a persistent cough and declining health before he passed at age 73, with the cause listed as congestive heart failure as a result of pneumonia.
Among those dead, the average age was 49.
One of the two shocking outcomes deals with life expectancy. While the average American male has a life expectancy of about 75 years, the males in the “100 Greatest Artists” who have died had an average age of just over 49 years, and they make up almost one third of the entire group. Factoring in their birth years and a life expectancy of 75 years, only 44 should have died by now, instead of the 82. Incidentally, of the 44 who should have died based on life expectancy, 19 are still alive.
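For readers curious how that comparison works, here is a minimal sketch of the kind of tabulation described above. It is illustrative only: the handful of artists, their birth years, and the reference year are assumptions for demonstration, not the actual dataset behind these numbers.

```python
# Illustrative sketch only: a simple tabulation like the one described above.
# The artists listed, their birth years, and the reference year are assumptions
# for demonstration, not the actual dataset used for the article's figures.

REFERENCE_YEAR = 2016   # assumed year the tally was made
LIFE_EXPECTANCY = 75    # average US male life expectancy cited in the article

# (name, birth_year, age_at_death) -- age_at_death would be None for living artists
artists = [
    ("John Lennon", 1940, 40),
    ("George Harrison", 1943, 58),
    ("Jimi Hendrix", 1942, 27),
    ("Bob Marley", 1945, 36),
    ("Marvin Gaye", 1939, 44),
]

# Observed deaths and the average age at death among them
ages_at_death = [age for _, _, age in artists if age is not None]
observed_deaths = len(ages_at_death)
average_age_at_death = sum(ages_at_death) / observed_deaths

# "Expected" deaths: artists whose birth year plus 75 falls at or before the reference year
expected_deaths = sum(
    1 for _, birth_year, _ in artists if birth_year + LIFE_EXPECTANCY <= REFERENCE_YEAR
)

print(f"Observed deaths: {observed_deaths}")
print(f"Average age at death: {average_age_at_death:.1f}")
print(f"Deaths expected from a {LIFE_EXPECTANCY}-year life expectancy: {expected_deaths}")
```

Applied to the full list of 252 artists, this is the same arithmetic that yields 82 observed deaths, an average age at death of about 49, and only 44 deaths expected from a 75-year life expectancy.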
Alcohol and drug abuse
The second shocking outcome was the sobering and disproportionate occurrence of alcohol- and drug-related deaths, ranging from Kurt Cobain’s gunshot suicide while intoxicated to Duane Allman of the Allman Brothers’ accidental death on a motorcycle while impaired. Members of legendary bands like The Who (John Entwistle at age 57 and Keith Moon at age 32), The Doors (Jim Morrison at age 27), The Byrds (Gene Clark at age 46 and Michael Clarke at age 47), The Band (Rick Danko at age 55 and Richard Manuel at age 42), and others all succumbed to alcohol- or drug-induced deaths.
There were many, including The Grateful Dead’s Jerry Garcia and country star Hank Williams, who declined more slowly over the years from substance abuse as their organs deteriorated; the official cause of death was heart-related, but in reality the cause may have been more directly tied to substance abuse.
Alcohol and drugs accounted for at least one in ten deaths of these great artists, while nationally substance abuse as a cause of death affects about one in 33. The threefold difference points to the much greater access to and use of drugs and alcohol among these ultra-talented artists.
Too Much Opioid Use & Abuse
Currently, the US is in the midst of an opioid abuse epidemic, with heroin and prescription drug overdoses setting records across the country. Elvis Presley, Jimi Hendrix, Janis Joplin, Sid Vicious, Gram Parsons, Whitney Houston (who didn’t make this 100 Greatest list), Michael Jackson, and now possibly Prince all died from accidental drug overdoses. While it is still unclear what the cause of death will be in Prince’s case, early evidence points toward opioids.
Controlling the effects of oxycodone, fentanyl, heroin, or morphine, and thereby reducing accidental deaths, is difficult, and for these stars and countless others across the world it proved, in the end . . . impossible. Put another way, without the inappropriate use of opioids or the addiction they cause, all of these stars could still be alive.
What music could those who died young have created if they were given the chance to live and flourish? And more importantly for us, who’s next?