Francesca Tomasi received her B.A. from the University of Chicago and currently does microbiology research.
Almost 90 years ago, in 1928, Alexander Fleming stumbled upon the chemical compound penicillin and sparked a medical revolution. It was a serendipitous occasion – Fleming had been growing plates of pathogenic bacteria in a lab when one day he noticed some mold growing on one of them. Just some classic contamination, he probably thought, ready to discard the plate as useless for his research. But upon looking more closely at that fateful petri dish, he noticed that wherever the mold grew, bacteria did not. He had inadvertently found a natural antimicrobial compound. From the subsequent isolation, characterization, and commercialization of penicillin, modern medicine was born. As more antibiotics were discovered and put on the market, people stopped dying of conditions like strep throat, ear infections, or pneumonia. Soldiers injured on the battlefield no longer succumbed to infections. Infant life expectancy rose dramatically, and people spoke euphorically of eradicating infectious diseases forever. Of course, we know this final prediction would not prove true. For starters, infectious diseases are also caused by viruses and other microbes against which antibiotics serve no purpose. Secondly, infectious diseases mold with us (no pun intended) – they are inextricably intertwined with our behaviors, geography, and interconnectedness as a species. Thirdly, bacteria evolve under selective pressures (such as the introduction of antibiotics to their environment) and easily transmit genetic information to each other, leading to new strains of good and bad microbes all the time. The category of “bad microbes” now includes the drivers of one of the 21st century’s biggest ongoing medical crises: antibiotic resistance.
Antibiotics are medicines that treat or prevent bacterial infections. These compounds work either by killing bacteria or by inhibiting their growth and allowing the immune system to clear them. And while we take them as factory-made, packaged pills or – in extreme cases – intravenous injections, antibiotics actually have a long natural history. In fact, the concept of an antibiotic significantly preceded the discovery of penicillin. In 1877, the French bacteriologist Jean Paul Vuillemin coined the term “antibiosis,” or “against life,” to describe something even the ancient Egyptians and Greeks had taken advantage of. They noticed that certain molds and plant materials could help people with infections. Believing the plants themselves were the remedy, ancient physicians often covered wounds with them. Something in those dressings was inhibiting the growth of pathogenic bacteria – bacteria whose existence the ancients knew nothing about; to them, infections were a disharmony of the body’s fluids or a god-sent curse. About 2,000 years later, thanks to the advent of microbiology and the observation of bacteria via microscopes and culture plates, scientists observed that certain bacterial populations negatively influenced others. This notion, that one species could somehow hinder the growth of another, gave rise to Vuillemin’s term “antibiosis.” Louis Pasteur then observed that “if we could intervene in the antagonism observed between some bacteria, it would offer perhaps the greatest hopes for therapeutics.” Simply put, bacteria have been co-evolving for millions and millions of years. A basic facet of life is the competition between living things for resources and survival. Naturally, then, single-celled organisms had to evolve ways to thwart other species from encroaching on their habitats. One such mechanism was to secrete compounds that killed competing bacteria while leaving the producers unharmed.
Today, antibiotics come in many forms – natural (extracted from the fruits of antibiosis), semisynthetic (modifications of natural compounds), and synthetic (designed with a specific target in mind). They are classified based on their mode of action (their specific targets on bacterial cells), chemical structure, and activity spectrum (which types of bacteria they are effective against). All classes of antibiotics used today were discovered before the 1980s. Novel compounds have been discovered or synthesized since then, but the classes they fall under have so far remained virtually unchanged. The overarching theme of antibiotic activity is the inhibition of essential functions involved in bacterial cell growth: cell wall synthesis (the barrier between a bacterial cell and its environment), essential enzymes (such as those used to break down or build up metabolites), and proteins that carry out other indispensable life functions. The quintessential antibiotic targets bacteria without harming human cells. As such, drug discovery focuses predominantly on properties of bacteria that are unique to the microbial world, which is why cell wall synthesis is often a favorite target: the cell wall is essential to bacterial survival and pathogenesis, and human cells have no such structure. The spectrum of an antibiotic is the consequence of the ubiquity or specificity of its target: these drugs can be narrow spectrum – targeting specific types of bacteria and not others – or broad spectrum, capable of wiping out a wide range of bacteria.
Antibiotics revolutionized healthcare and quality of life in the twentieth century. Often with the help of vaccination programs, they have led to the near elimination of several infections in the developed world, such as tuberculosis, diphtheria, pertussis (whooping cough), pneumococcal diseases, and tetanus. But in the words of Voltaire, “Use, do not abuse; neither abstinence nor excess ever renders man happy.” The effectiveness and ease of access of early antibiotics led to their overuse. Antibiotics, an integral part of the medical system, were soon prescribed for all sorts of conditions – regardless of whether they would actually help. This lack of discrimination continues today, albeit at (hopefully) decreasing rates. To make matters worse, antibiotics became an accepted part of farming a few decades ago: livestock were pumped with drugs both to prevent illness and to promote weight gain. Society forced antibiotics into overstaying their welcome, and this has led to a slew of worldwide issues. In 2014, the WHO published its first global report on antibiotic resistance. In it was the statement that antibiotic resistance is “a serious threat [that] is no longer a prediction for the future, it is happening right now in every region of the world and has the potential to affect anyone, of any age, in any country.” Essentially, we have been promoting the evolution of drug-resistant mutants much faster than we can come up with novel antibiotics; we don’t need to draw upon basic laws of nature to realize this is a precarious and unsustainable situation. The more bacteria are exposed to the same drug, the more chances they have of developing defenses against it.
In 2013, the Infectious Diseases Society of America made exactly this point: a struggling antibiotic pipeline was no match for bacteria’s growing spectrum of resistance. Since 2009, a mere two new antibiotics had been approved in the United States. Meanwhile, several antibiotics to treat Gram-negative bacteria like pathogenic E. coli were undergoing clinical trials, but these drugs lacked novelty. In truth, they were mostly combinations of existing treatments that did not even fully address the drug-resistance capabilities of the bugs they were intended to treat. In the words of Dr. Dennis Maki of the Infectious Diseases Society about a decade and a half earlier (1998), “[t]he development of new antibiotics without having mechanisms to ensure their appropriate use is much like supplying your alcoholic patients with a finer brandy.”
So what’s stopping success? The same two words that hinder many an effort: time and money. The effects of antibiotic misuse and overuse first reared their heads in the 1970s as research labs around the world plowed on in search of novel ways to stave off bacterial infections. As an academic understanding of bacterial physiology and resistance mechanisms emerged, pharmaceutical efforts in the war against bugs started to dwindle. The problem is this: every drug that enters the market costs billions of dollars in R&D and testing. And since the vast majority of candidate drugs never make it to the shelves, pharmaceutical companies rely heavily on products that will rake in billions of dollars. What sounds more profitable: antibiotics that will probably become obsolete after a few years, or antidepressants, anti-inflammatories, and erectile dysfunction drugs that will never lose a spot on the market? Some of the biggest names in pharma – Pfizer, Sanofi (formerly Sanofi-Aventis), and Bristol-Myers to name a few – dropped out of antibiotic research one after another, all for financial reasons.
In a decade of last-resort drugs (we have been turning now more than ever to stronger, more toxic antibiotics to cure increasingly difficult-to-treat infections), we need to regenerate incentive, and moral motivation is not enough in the business of making billions. In 2012, Congress passed GAIN, the Generating Antibiotic Incentives Now Act. The law, signed by President Obama in July of that year, created the Antibacterial Drug Development Task Force and laid down financial incentives in exchange for turning industry back onto antibacterial research: lengthened drug-patent exclusivity and faster FDA approval processes. This has sparked a rise in antibacterial efforts that will hopefully continue for the foreseeable future, as large companies partner with smaller ones to tackle one of the world’s most pressing emerging public health concerns.
Nonetheless, a new discovery here and there, while a powerful stride in the right direction, is not enough to keep up with the minute-scale evolutionary pace of bacteria. In tandem with pharmaceutical incentives to develop new drugs, CDC-designed antibiotic stewardship programs are being enacted across the country to ensure doctors prescribe antibiotics only when absolutely necessary. In the words of Dr. Tom Frieden, director of the CDC, “Improving antibiotic prescribing can save today’s patients from deadly infections and protect lifesaving antibiotics for tomorrow’s patients.” Antibiotics have been on the market for less than a century, and antibiotic stewardship programs are the only true large-scale effort to date in controlling antibiotic use. It does not take sophisticated big data analysis to see that there is a direct relationship between the use of antibiotics and rates of drug resistance in bacteria, yet policy is severely lacking. And antibiotic use in humans is only the tip of the iceberg: as mentioned, large-scale factory farms are rife with antibiotics used to fatten up livestock and prevent illness. In fact, approximately 80% of America’s antibiotics are not even given to humans. And yet, when the FDA issued guidance for antibiotic use in animals, the operative word was “voluntary.” Fortunately, several companies have already complied and are withdrawing antibiotics from their livestock’s diets. Nonetheless, the booming market for veterinary antibiotics is worth almost $20 billion, and this is because Congress waived the strict regulation of antibiotics deemed “safe” back during World War II. Antibiotics have since (recently) been banned for the sole purpose of fattening up animals, but the umbrella of infection prevention in large farms is maintaining a steady flow of antibiotics into buckets and troughs.
A CDC graph of antibiotic development and market approvals over the decades shows just how weak the pipeline has become since modern medicine’s biggest feat. Despite the pessimism, there are strong glimmers of hope. In January, a paradigm shift in the fight against antibiotic resistance was declared with the discovery of teixobactin, a brand-new antibiotic (the first new class in nearly 30 years). It will not be on the market for at least another five years (assuming it passes all safety and regulatory criteria), but the drug’s discovery is groundbreaking for multiple reasons. For one, it has been shown in models to treat a wide variety of conditions, from C. difficile, a severe intestinal infection, to tuberculosis. Furthermore, the methods leading to its discovery are refreshingly innovative. Remember Vuillemin’s theory of antibiosis? What natural environment on Earth teems with more microbes – and thus more potential antibiotic compounds – than the earth itself? Scientists set off years ago to culture soil bacteria and screen them for potential therapeutic compounds. There was a slight problem, though – about 99% of microbes cannot grow in known laboratory conditions, creating a barrier between nature and the bench. Things changed in teixobactin’s story: researchers at Northeastern University in Boston used a culturing device, the iChip, to grow microbes in their native soil and extract their antimicrobial compounds. The methods and molecular findings alone will be the subject of a future Infective Perspective article, but the overarching themes make for an optimistic conclusion to this one. Teixobactin has a new mechanism of action: multiple targets. For this reason, if the drug makes it onto the market, the development of resistance against it may still be decades away. And even if teixobactin itself does not make it onto the shelves, this renaissance of antibiotic discovery will likely push us in the right direction as we turn to the greatest provider of antibiotics out there: Mother Nature.
Francesca Tomasi received her B.A. from the University of Chicago and currently does microbiology research.
“[T]here is as yet no evidence of contagion.”
– July 3, 1981, New York Times
HIV crept into the human race decades before anyone would find it. One viral variant likely spilled over from sooty mangabey monkeys in West Africa in the 1960s; the other appeared in Central Africa in the 1930s. Both lay low as a result of geographic isolation and a significantly less globalized world than the one we know today. In 1959, two men died: one was from Congo and died there. The other was a Jamaican-American shipping clerk who died in New York. In 1966, a man in Haiti became infected in what is one of the first known domestically acquired infections in the Americas. In 1975, reports of wasting and other symptoms now associated with AIDS began coming in from Africa. It was not until a pattern of similar, rare illnesses appeared in a nation with a robust public health surveillance system that the budding epidemic would be picked up on any radar.
In 1981, the CDC published an issue of its Morbidity and Mortality Weekly Report discussing a cluster of Pneumocystis carinii pneumonia, a rare lung infection, in five previously healthy, young gay men in Los Angeles. The patients were also sick with other rare infections that the immune system should usually be able to hold at bay – red flags that something was wrong with their immune systems. What at first seemed like a small series of unfortunate coincidences would turn out to be the beginning of one of the biggest – and worst – pandemics in human history. Within a day of the CDC’s report, doctors in other parts of the country, especially in New York, started describing similar cases. Opportunistic infections (infections occurring in immunocompromised individuals) were visibly on the rise, and only in homosexual men. Furthermore, an aggressive cancer known as Kaposi’s sarcoma suddenly re-emerged in homosexual men. This cancer typically had an incidence rate of less than 0.16 per 100,000 people in the US, so its emergence in several hundred individuals within a single region was alarming. As a result, the CDC joined forces with other public health groups to establish a Kaposi’s Sarcoma and Opportunistic Infections (KSOI) surveillance network. Their goal was to identify potential risk factors for these conditions and reverse whatever was causing them. On July 3, 1981, just shy of a month after the initial case reports were published, the New York Times published the first of many thousands of articles that would describe the nascent contagion: “Rare Cancer Seen in 41 Homosexuals.”
The New York Times’ seminal coverage of Kaposi’s sarcoma contained the very information that would eventually form the basis for the cloud of anger, hate, and fear cast over HIV/AIDS victims to this day. “…[T]here [is] no apparent danger to non-homosexuals,” one physician advised in the article, “…no cases have been reported to date outside the homosexual community or in women.” Because this “contagion” was first recognized in homosexual men, from day one it was identified as a “Gay Disease,” a devastating blow to a global population already dealing with harsh social stigma. By 1982, half a year after Pneumocystis, Kaposi’s sarcoma, and other rare pathologies first noticeably emerged in the States, almost 300 cases of severe immunodeficiency had been reported in the US. All were homosexual men, most in their thirties and forties. By mid-April 1982, the CDC estimated that tens of thousands of individuals were affected by the disease. The term AIDS, for Acquired Immunodeficiency Syndrome, was officially coined by the CDC in September 1982 to describe a “disease at least moderately predictive of a defect in cell-mediated immunity, occurring in a person with no known cause for diminished resistance to that disease.”
Suddenly, however, AIDS was no longer only infecting homosexuals. In December of 1982, the first case of AIDS in an infant was reported. The child had been the recipient of a blood transfusion. Soon after, almost two dozen cases of “unexplained immunodeficiency and opportunistic infections in infants” were reported in another CDC Morbidity and Mortality Weekly Report. By April of the following year, the CDC had widened the AIDS-susceptible population to “homosexual men with multiple sex partners, injection drug users, Haitians, and hemophiliacs” (individuals with a rare bleeding disorder in which the blood doesn’t clot normally). That same year, the deadly AIDS-causing virus HIV was discovered, albeit under the name HTLV-III/LAV (human T-cell lymphotropic virus type III/lymphadenopathy-associated virus). The CDC quickly published a report outlining the routes of viral transmission, which included sex, blood transfusions, and intravenous drug use. Casual contact, food, water, air, and contaminated surfaces were officially ruled out as routes of transmission. Of course, much of the world stopped listening at “deadly virus.”
“Anger and intolerance are the enemies of correct understanding.”
– Mahatma Gandhi
In October, protests and riots that had slowly been building gained national attention: tenants in a New York home had tried to evict a physician because he was treating AIDS patients. The coverage noted that the tenants were “frightened of the AIDS patient,” and that the state blocked the eviction. Nonetheless, the frightening uncertainty of a new, fatal illness had started to fuel public bedlam, despite the fact that HIV had already been definitively shown to be a sexually transmitted disease – mere bystanders would never be at risk of infection.
HIV spread like wildfire. By 1985, it had been reported in every region of the world. At the same time, an epidemic of discrimination was unfolding alongside the viral one. The United States military began testing all recruits for HIV and banned anyone who tested positive. Children with HIV were barred from schools. Individuals were shunned by their families, peers, and communities. Others were refused treatment. Still more lost their jobs and could not find new employment. In China, for instance, a man was refused a job as an elementary school teacher because of his positive HIV status; the nation is now pushing to ban health tests as a prerequisite for employment. The FDA only recently (December 2015) lifted its ban on blood donation by homosexual men, even those who are HIV-negative. Travel and residence bans are only just beginning to be lifted; for years, individuals were forced to disclose their HIV status before moving somewhere, and their travel was restricted in many parts of the world. Australia criminalized non-disclosure of HIV status. Countries have had to pass legislation to protect people from HIV shaming and bias, making it one of the few infectious diseases for which there are laws protecting individuals from discrimination (other protected diseases include TB and various forms of hepatitis). Lastly, of course, there is the awful psychological distress that comes with being diagnosed with a chronic, potentially fatal (if left untreated) disease, and facing shame in lieu of acceptance and care.
Public misconceptions of HIV fueled the epidemic of fear, alongside the social stigmas associated with the sexual nature of the disease. HIV was quickly associated exclusively with controversial behaviors – homosexuality, drug use, sex work, sex in general (which is taboo in some cultures), or infidelity. Infection with HIV was cast as a consequence of individual irresponsibility and immoral character, and AIDS as the punishment for this disgraceful behavior. For some perspective, in 2015, 75 countries around the world still treated homosexuality as a crime. Because of these stigmas, many people infected with HIV did not even seek treatment for fear of receiving a scarlet letter. And, of course, inaccurate information spread more quickly than HIV itself. People developed irrational misperceptions of their own personal risk despite widespread PSAs about the actual routes of transmission.
Predominant modes of transmission also dictate the type and degree of stigma in different regions. For example, HIV transmission in sub-Saharan Africa is mainly driven by heterosexual sex. As a result, stigma there is not driven by attitudes toward injection drug use or homosexuality; rather, it is fueled by disdain for infidelity and sex work. When high-risk groups are marginalized, the effects of infection are only amplified: rather than being able to focus on prevention and treatment, groups become isolated and continue to propagate illness.
“When 'I' is replaced by 'We', illness becomes wellness.” – Shannon Alder
Stigma limits access to disease testing, treatment, and education. In fact, the World Health Organization has stated that fear of stigma and discrimination is the number one reason individuals refuse to get tested, disclose their HIV status, or take the very effective antiretroviral drugs that exist today.
Had the individuals first infected with HIV been heterosexual, would the HIV/AIDS stigma be the same? The delicate, controversial nature of the behaviors most frequently associated with HIV infection made it all but impossible for HIV to escape stigma. Negative societal perceptions of “unconventional” behaviors such as intravenous drug use, infidelity, and sex work amplified HIV’s stigma to scales much larger than those of infections that are air-, insect-, or animal-borne. Furthermore, HIV and homosexuality fueled each other’s stigma: HIV was initially named a “Gay Disease,” and gay men were collectively assumed to have AIDS.
As we have discussed many times on Infective Perspective, infectious diseases have played a role in shaping society since the beginning of human history. Many other infectious diseases carry detrimental stigmas, including tuberculosis, which is now the leading cause of death in HIV-infected individuals. What the world has not realized, however, is that infectious diseases do not have to shape society exclusively in negative ways. In 1984, Secretary Margaret Heckler of the Department of Health and Human Services announced that a vaccine for HIV would hopefully be available within two years. Thirty-two years later, we do not have a vaccine, but HIV is no longer a death sentence. While there is still a long way to go (read: an HIV vaccine is the holy grail of anti-HIV efforts and would ultimately help eradicate the disease), millions of people from many different fields (medicine, research, public health, government, social activism, and so on) have come together to fight this viral beast. Mother-to-child transmission is preventable, and HIV-negative individuals can take prophylactic antiretroviral drugs to prevent infection should they have sex with HIV-positive partners. HIV-positive men and women can live full lives on antiretroviral therapy that maintains a low viral count. They can get married and have children without passing the infection on to loved ones. Despite its massive social stigma, HIV has brought out the power of the human mind and of global cooperation. The journey is not without its challenges and pitfalls, or its conflicts and barriers, but the fact that it has seen such staggering progress in just three decades should be a light of hope and a spark to ignite a global resolve to conquer all devastating illnesses.
“History doesn’t repeat itself, but it does rhyme.” – Mark Twain
Through travel and technology, the world will only continue to become increasingly interconnected, creating more opportunities for major outbreaks. While we may not always be able to predict the next pandemic, we can foil stigma from the start. Public information transparency, clear communication, and an immediate focus on treatment will steer infectious diseases onto the optimist’s path. In this scenario, the world joins forces to defeat a common enemy, pouring resources into vaccines, drugs, and care for those affected by the disease. People who are not themselves infected will feel empathy for those who are. They will understand that even though they are not behaviorally at risk of contracting this particular disease, they may just as well fall into the path of the next one. On the other hand, secretive investigations and the “othering” of diseases (linking them to specific social, cultural, or geographical communities) steer pathogens down the pessimist’s path. They fuel paranoia and cynicism, hindering progress and giving pathogens exactly what they want: the chance to spread uninterrupted. In the words of infectious disease physician William Schaffner, “It's always an uncertainty. We're always at the infectious disease roulette table.” Stigma is never productive. Care and communication are.
Sana Sohail is a third-year undergraduate at the University of Chicago studying biological sciences and art.
Unpredictability is terrifying. There is something about uncertainty and a loss of control that never fails to instill fear. Ebola and Zika have both recently forced us to confront an anxiety around infectious diseases not found in outbreaks of strep throat or the seasonal flu. Infectious diseases like SARS and cholera appear to erupt without warning, rapidly spreading and decimating populations at seemingly uncontrollable rates. Most of the time, an outbreak’s causes or mechanisms of transmission are not well understood by the greater public. Furthermore, media hype more often than not becomes the epicenter of a second epidemic: one of fear and uncertainty.
The culture of fear around infectious diseases is understandable, but a social barrier can become a physical barrier against diagnosis and treatment through stigmatization of a disease and its victims. Survivors of Ebola report being shunned and abandoned by their families and communities; those suffering from leprosy in India are quarantined in so-called “leper colonies” and targeted for the visible deformities caused by the disease; Asian populations all over the world faced discrimination in the wake of the SARS epidemic of 2003. Not coincidentally, much of this stigma falls disproportionately upon impoverished and minority populations, who are blamed during pandemics and epidemics. Labels like the “Asian flu” or, in the case of H1N1, the “Mexican disease” serve to isolate these populations and countries, “othering” them in ways that propagate stigma and fuel xenophobia. The stigma surrounding the AIDS epidemic since the 1980s has been unparalleled in recent history. This past winter, however, another heavily stigmatized disease came to surpass HIV/AIDS as a leading infectious cause of death worldwide: tuberculosis (TB).
There is both a good and a bad side to this change. According to the World Health Organization (WHO), the re-emergence of TB is due both to decreasing rates of mortality from HIV/AIDS and to improved data collection for tuberculosis. Tuberculosis killed roughly 1.5 million people in 2014, and researchers believe that over one-third of people with active tuberculosis are “either undiagnosed or not reported.” What is more surprising is that despite being one of the top infectious diseases in the world, the majority of tuberculosis cases are curable through a regimen of 4 antimicrobial drugs taken over 6 months. If this is the case, how can we understand the persistence of hundreds of thousands of undiagnosed cases? How do we address the necessary months-long adherence to a lengthy treatment program? How can we understand the social stigma and public perceptions of the disease?
A Disease and its Symptoms
For many of us, tuberculosis seems like a disease of the past, associated with the Victorian Era and its romanticized depictions of “consumption”. Caused by a bacterium called Mycobacterium tuberculosis, the disease has variable symptoms, which makes it more difficult to diagnose and to distinguish from other infections. Tuberculosis can affect different parts of the body, such as the lymph nodes or the bones, but it is most commonly associated with the lungs. Pulmonary tuberculosis causes chest pain, fatigue, loss of appetite and weight, fevers, and the coughing up of blood.
There are two forms of tuberculosis: latent and active. A staggering one-third of the world’s population is estimated to have latent tuberculosis. In this case, people have been infected by the bacteria, but their immune system keeps the infection in check and staves off symptoms. Those with latent tuberculosis cannot transmit the disease to others, and only a small percentage (usually around 10%) ever become sick. Latent tuberculosis can become active tuberculosis if the immune system becomes suppressed, or active tuberculosis can develop shortly after infection with the bacteria.
Several myths and misconceptions surround the transmission of tuberculosis (which is not uncommon for contagious diseases). Tuberculosis is spread through the air, by inhaling infected droplets that contain the bacteria. This can happen when someone who is infected coughs, sneezes, or talks. However, infection usually requires prolonged, close exposure to someone who is infected, which explains why the disease is often found among the families and friends of those who are infected. According to tbfacts.org, tuberculosis is not spread through skin contact, shared food or water, shared toothbrushes, or kissing.
Notably, there is a high rate of co-infection with tuberculosis and HIV, which makes sense considering that tuberculosis becomes active when the immune system is weakened. Data from WHO reveal that people who have HIV are 20-30 times more likely to develop active tuberculosis than those who do not, and one-third of HIV deaths in 2014 were due to tuberculosis infection. Roughly 9 million people a year develop tuberculosis; countries in Africa, the Middle East, and parts of Asia bear most of the burden of cases. That is not to say, however, that other nations are TB-free. The United States, for instance, sees about 10,000 cases per year, and infection rates are currently leveling off into a concerning plateau instead of continuing the steady decrease expected in a country with robust public health infrastructure.
Pulling Out Weeds by Their Roots: The Basis of the Stigma Around TB
For countries with high rates of tuberculosis, its prevalence can be explained by three main factors: (1) poor public health infrastructure; (2) limited health education or awareness of TB and its transmission; and (3) the concentration of the disease in impoverished communities without access to medical care, adequate nutrition, or sanitation. TB’s prevalence can also be explained by anthropological factors, as these systemic problems compound societal perceptions of illness. Tuberculosis spreads rapidly in poor, urban communities, where people are densely packed together and sanitation levels are low. These trends led to tuberculosis’s association with poverty (a characteristic that frequently overlaps with disadvantaged and disenfranchised minorities).
Infectious diseases have a long history of being inextricably linked to qualities like socioeconomic status, ethnicity, genetics, and morals. The internalization of societal norms results in sufferers of tuberculosis understanding their illness as a reflection of undesirable qualities like a low caste, poverty, and their heritage. HIV’s close relationship with tuberculosis carries its own widespread stigma of immoral behavior, further exacerbating the poor perception of tuberculosis in some societies. Low education levels and societal beliefs also propagate stigma: misconceptions that tuberculosis is caused by a curse or smoking, or transmitted by sharing food or utensils, make it difficult to determine and understand the actual cause and spread of the disease.
It is crucial to keep in mind that the basis of this stigma is fear of infection. To deal with this fear, infected individuals are isolated and ostracized from their communities and daily life in order to create a comforting sense of distance. In Ghana, those with active tuberculosis “cannot work in public spaces or attend community events”, while in some parts of India, a diagnosis of tuberculosis damages marriage prospects and can lead to the individual being abandoned by their family.
A lengthy case study conducted last year in Zambia included interview excerpts describing the stigma patients experience:
“The nephew of my neighbour got the diagnosis TB at the clinic, this means they will do a household screening, but the family refused. The aunt said: “no one can have TB, because I believe in God”, even though the nephew is smear-positive. Instead of testing, they do nothing. The nephew now has to sleep alone, eat alone and no one talks to him. He is taking treatment on his own (TB patient during FGD).”
The Sequels to Stigma
In the excerpt above, the nephew was not only ostracized for his illness; he also did not get the necessary testing and screening due to social stigma. For many people living in communities like these, the benefits of revealing a diagnosis or going for treatment do not outweigh the costs: losing their jobs, their families, access to services, and their social standing. The damaging results of stigma are decreased adherence to preventive measures, low detection rates as people refuse testing, and reduced treatment compliance – all of which culminates not only in a threat to the life of the sick individual as the disease progresses and worsens, but also in an increased risk of transmission to the rest of the community.
In order to avoid isolation and abandonment, community and family members may hide their diagnosis and attribute their symptoms to other causes. The effect of this stigma is so strong that some families do not disclose a member’s death from tuberculosis, fearing judgment and social repercussions even though such information is essential for data collection, infection surveillance, and targeted tuberculosis screening. Patients may refuse treatment because they do not want to be seen at the hospital or treatment facility, which they would have to visit repeatedly over an extensive treatment regimen. Even patients who do receive treatment and recover may return home to a community that views them with fear over their (albeit no longer existent) contagiousness, leaving a constant weight of stigma on the survivor.
Women and children are particularly vulnerable to this stigma. WHO has acknowledged that childhood tuberculosis is under-researched and is only just beginning to be monitored more closely, while women in many of these poor communities are in already-precarious economic positions with reduced access to medical care and education.
An Ethical Problem in Global Health
The stigma around tuberculosis could be a strong factor in its high mortality rates and prevalence in certain countries. Fear over being seen as someone with TB makes it difficult for people to disclose their condition or seek and continue treatment. This causes tuberculosis cases to go undetected and untreated, resulting in much higher chances that the microbe will spread throughout the community. The question remains, however: how can stigma be appropriately and compassionately dealt with?
The use of quarantine and isolation as a tactic for preventing the spread of infectious diseases is an age-old precaution, but it sometimes comes under fire for its negative effects on individuals and targeted populations. Even for those of us living in the States who think tuberculosis is behind us, the discussion of stigma and the ethics of isolation for this disease is still relevant. Over the past couple of decades, drug-resistant forms of tuberculosis have appeared, including strains that are proving incredibly difficult – if not impossible – to cure with even the most powerful anti-TB drugs. MDR-TB (multidrug-resistant tuberculosis) prevalence has increased globally as a result of incomplete or absent adherence to standard tuberculosis treatment. An even more resistant form of tuberculosis, XDR-TB (extensively drug-resistant tuberculosis), has also begun to emerge.
Clearly, tuberculosis is not a thing of the past. As different forms of the disease continue to evolve and spread, the myths and misconceptions surrounding its cause and transmission need to be dispelled in order to create a safer, encouraging environment for patients. We need to remain conscious of the way we socially approach and consider sickness, particularly when it does not seem to affect us. With concerns over the refugee crisis dominating the news, medical and political conversations over the risk of infectious diseases and their management are sure to take center stage, continuing this important discussion of disease, stigma, and treatment.
To learn more about a current campaign working to combat the stigma around TB, visit http://www.unmaskstigma.org/