Francesca Tomasi received her B.A. from the University of Chicago and currently does microbiology research.
Almost 90 years ago, in 1928, Alexander Fleming stumbled upon the chemical compound penicillin and sparked a medical revolution. It was a serendipitous occasion – Fleming had been growing plates of pathogenic bacteria in a lab when one day he noticed some mold growing on one of them. Just some classic contamination, he probably thought, ready to discard the plate as useless for his research purposes. But upon looking more closely at that fateful petri dish, he noticed that wherever the mold grew, bacteria did not. He had inadvertently found a natural antimicrobial compound. From the subsequent isolation, characterization, and commercialization of penicillin, modern medicine was born. As more antibiotics were discovered and put on the market, people stopped dying of conditions like strep throat, ear infections, and pneumonia. Soldiers injured on the battlefield no longer succumbed to infections. Infant life expectancy rose dramatically, and euphoric talk of eradicating infectious diseases forever soon followed. Of course, we know that last hope would not prove true. For starters, infectious diseases are also caused by viruses and other microbes against which antibiotics serve no purpose. Secondly, infectious diseases mold with us (no pun intended) – they are inextricably intertwined with our behaviors, geography, and interconnectedness as a species. Thirdly, bacteria evolve under selective pressures (such as the introduction of antibiotics to their environment) and easily transmit genetic information to each other, giving rise to new strains of good and bad microbes all the time. Among the “bad microbes” are the drivers of one of the 21st century’s biggest ongoing medical crises: antibiotic resistance.
Antibiotics are medicines that treat or prevent bacterial infections. These compounds work either by killing bacteria or by inhibiting their growth and allowing the immune system to clear them. And while we take them as factory-made, packaged pills or – in extreme cases – intravenous injections, antibiotics actually have a long natural history. In fact, the concept of an antibiotic significantly preceded the discovery of penicillin. In 1889, the French biologist Jean Paul Vuillemin coined the term “antibiosis,” or “against life,” to describe something even the ancient Egyptians and Greeks had taken advantage of. They noticed that certain molds and plant materials could help people with infections. Assuming the plants themselves were the healing agents, ancient physicians often covered wounds with them. Something in those dressings was inhibiting the growth of pathogenic bacteria – organisms whose existence was entirely unknown at the time; to ancient physicians, infections were a disharmony in the body’s fluids, or a god-sent curse. About 2000 years later, thanks to the advent of microbiology and the observation of bacteria via microscopes and culture plates, scientists saw that certain bacterial populations negatively influenced others. This notion, that one species could somehow hinder the growth of another, gave rise to Vuillemin’s term “antibiosis.” Louis Pasteur observed that “if we could intervene in the antagonism observed between some bacteria, it would offer perhaps the greatest hopes for therapeutics.” Simply put, bacteria have been co-evolving for billions of years. A basic facet of life is the competition between living things for resources and survival. Naturally, then, single-celled organisms had to evolve ways to keep other species from encroaching on their habitats. One such mechanism was to secrete compounds that killed competing bacteria while sparing the producers themselves.
Today, antibiotics come in many forms – natural (extracted from the fruits of antibiosis), semisynthetic (modifications of natural compounds), and synthetic (designed with a specific target in mind). They are classified based on their mode of action (the specific target on the bacterial cell), chemical structure, and activity spectrum (which types of bacteria they are effective against). All classes of antibiotics in use today were discovered before the 1980s. Novel compounds have been discovered or synthesized since then, but the classes they fall under have so far remained virtually unchanged. The overarching theme of antibiotic activity is the inhibition of functions essential to bacterial cell growth: cell wall synthesis (the barrier between a bacterial cell and its environment), essential enzymes (such as those used to break down or build up metabolites), and proteins that carry out other indispensable life functions. The quintessential antibiotic targets bacteria without harming human cells. As such, drug discovery focuses predominantly on features unique to the microbial world, which is why cell wall synthesis is often a favorite target: the cell wall is essential to bacterial survival and pathogenesis, and human cells have no such structure. The spectrum of an antibiotic is a consequence of the ubiquity or specificity of its target: these drugs can be narrow spectrum – targeting specific types of bacteria and not others – or broad spectrum, capable of wiping out a wide range of bacteria.
Antibiotics revolutionized healthcare and quality of life in the twentieth century. Often with the help of vaccination programs, they have led to the near elimination of several infections in the developed world, such as tuberculosis, diphtheria, pertussis (whooping cough), pneumococcal diseases, and tetanus. But in the words of Voltaire, “Use, do not abuse; neither abstinence nor excess ever renders man happy.” The effectiveness and easy availability of early antibiotics led to their overuse. Antibiotics, an integral part of the medical system, were soon prescribed for all sorts of conditions, regardless of whether they would actually help. This lack of discrimination continues today, albeit at (hopefully) decreasing rates. To make matters worse, antibiotics became an accepted part of farming a few decades ago: livestock were pumped with drugs both to prevent illness and to promote weight gain. Society forced antibiotics into overstaying their welcome, and this has led to a slew of worldwide issues. In 2014, the WHO published its first global report on antibiotic resistance. In it was the statement that antibiotic resistance is “a serious threat [that] is no longer a prediction for the future, it is happening right now in every region of the world and has the potential to affect anyone, of any age, in any country.” Essentially, we have been promoting the evolution of drug-resistant mutants much faster than we can come up with novel antibiotics; we don’t need to draw upon basic laws of nature to realize this is a precarious and unsustainable situation. The more bacteria are exposed to the same drug, the more chances they have of developing defenses against it.
In 2013, the Infectious Disease Society of America made exactly this point: a struggling antibiotic pipeline was no match for bacteria’s growing spectrum of resistance. Since 2009, a mere two new antibiotics had been approved in the United States. Meanwhile, several antibiotics to treat Gram-negative bacteria like pathogenic E. coli were undergoing clinical trials, but these drugs lacked novelty. In truth, they were mostly combinations of existing treatments that did not even fully address the drug resistant capabilities of the bugs they were intended to treat. In the words of Dr. Dennis Maki of the Infectious Disease Society about a decade and a half earlier (1998), “[t]he development of new antibiotics without having mechanisms to ensure their appropriate use is much like supplying your alcoholic patients with a finer brandy.”
So what’s stopping success? The same two words that hinder many an effort: time and money. The effects of antibiotic misuse and overuse first reared their heads in the 1970s as research labs around the world plowed on in search of novel ways to stave off bacterial infections. As an academic understanding of bacterial physiology and resistance mechanisms emerged, pharmaceutical efforts in the war against bugs started to dwindle. The problem is this: every drug that enters the market costs several billion dollars in R&D and testing. And since the vast majority of candidate drugs never make it to the shelves, pharmaceutical companies rely heavily on products that will rake in billions of dollars. What sounds more profitable: antibiotics that will probably become obsolete after a few years, or antidepressants, anti-inflammatories, and erectile dysfunction drugs that will never lose a spot on the market? Some of the biggest names in pharma – Pfizer, Sanofi (formerly Sanofi-Aventis), and Bristol-Myers to name a few – dropped out of antibiotic research one after another, all for financial reasons.
In a decade of last-resort drugs – we have been turning now more than ever to stronger, more toxic antibiotics to cure increasingly difficult-to-treat infections – we need to regenerate incentive, and moral motivation is not enough in the business of making billions. In 2012, Congress passed GAIN, the Generating Antibiotic Incentives Now Act. The law, signed by President Obama in July of that year, created the Antibacterial Drug Development Task Force and laid down financial incentives to turn industry back toward antibacterial research: lengthened drug-patent exclusivity and faster FDA approval processes. This has sparked a rise in antibacterial efforts that will hopefully continue for the foreseeable future, as large companies partner with smaller ones to tackle one of the world’s most pressing emerging public health concerns.
Nonetheless, a new discovery here and there, while a powerful stride in the right direction, is not enough to keep up with the minute-scale evolutionary pace of bacteria. In tandem with pharmaceutical incentives to develop new drugs, CDC-designed antibiotic stewardship programs are being enacted across the country to ensure doctors prescribe antibiotics only when absolutely necessary. In the words of Dr. Tom Frieden, director of the CDC, “Improving antibiotic prescribing can save today’s patients from deadly infections and protect lifesaving antibiotics for tomorrow’s patients.” Antibiotics have been on the market for less than a century, and antibiotic stewardship programs are the only true large-scale effort to date to control antibiotic use. It does not take sophisticated big data analysis to see that there is a direct relationship between the use of antibiotics and rates of drug resistance in bacteria, yet policy is severely lacking. And antibiotic use in humans is only the tip of the iceberg: as mentioned, large-scale factory farms are rife with antibiotics used to fatten up livestock and prevent illness. In fact, approximately 80% of America’s antibiotics are not even given to humans. And yet, when the FDA issued guidance for antibiotic use in animals, the operative word was “voluntary.” Fortunately, several companies have already complied and are withdrawing antibiotics from their livestock’s diets. Nonetheless, the veterinary antibiotics market is worth almost $20 billion, booming ever since Congress waived strict regulation of antibiotics deemed “safe” back during World War II. The use of antibiotics for the sole purpose of fattening up animals has recently been banned, but the umbrella of infection prevention in large farms maintains a steady flow of antibiotics into buckets and troughs.
This graph from the CDC shows the weak trajectory of antibiotic development and market approval since the advent of modern medicine’s biggest feat. Despite the pessimism, there are strong glimmers of hope. In January 2015, a paradigm shift in the fight against antibiotic resistance was declared with the discovery of Teixobactin, a brand-new antibiotic (the first in nearly 30 years). It will not be on the market for at least another five years (assuming it passes all safety and regulatory hurdles), but the drug’s discovery is groundbreaking for multiple reasons. For one, it has been shown in animal models to treat a wide variety of conditions, from C. difficile, a severe intestinal infection, to tuberculosis. Furthermore, the methods leading to its discovery are refreshingly innovative. Remember Vuillemin’s theory of antibiosis? What natural environment on Earth teems with more microbes – and thus more potential products of antibiosis – than the earth itself? Scientists set off years ago to culture soil bacteria and screen them for potential therapeutic compounds. There was a slight problem, though – about 99% of microbes cannot be grown under known laboratory conditions, creating a barrier between nature and the bench. Things changed in Teixobactin’s story: researchers at Northeastern University in Boston used a device called the iChip to culture soil microbes in their natural environment and screen them for antimicrobial compounds. The methods and molecular findings alone will be the subject of a future Infective Perspective article, but the overarching themes make for an optimistic conclusion to this one. Teixobactin has a new mechanism of action: multiple targets. And for this reason, if the drug makes it onto the market, the development of resistance against it may be decades away.
And even if Teixobactin itself does not make it onto the shelves, this renaissance of antibiotic discovery will likely push us in the right direction as we turn to the greatest provider of antibiotics out there: Mother Nature.