Betting the Farm: Global Policy Action Against the Emergence of Antimicrobial Resistance in Food Animals and Perspectives on the Precautionary Principle
Luke Versten received his B.S. in Biological Sciences from The University of Chicago and is currently pursuing an MPH in Epidemiology of Microbial Disease at Yale University.
Last week, in a newly published set of recommendations aimed at preserving medically important antibiotics, the World Health Organization called on governments and food animal producers to end the use of such antimicrobials for promoting livestock growth, and further recommended completely restricting their prophylactic use in animals that have not been clinically diagnosed with disease. These recommendations demonstrate that global leaders have recognized that the looming threat of antibiotic resistance will not be resolved solely by addressing antibiotic use in humans: about 70% of antimicrobial drugs deemed medically important for treating human infection are currently sold for use in food animal production.
The WHO’s recent guidances were preceded by similar directives implemented by the FDA last year, as well as by laws that have existed in the European Union for over a decade. In the United States, reductions in antibiotic use in food production animals have so far been driven not only by market forces, such as the consumer-driven shift in demand toward food raised without antibiotics, but also by FDA Veterinary Feed Directives limiting the use of medically important antibiotics for growth promotion and requiring a prescription prior to any antibiotic use. It should be noted that these guidances are not without loopholes: unlike the recent WHO recommendations, they still permit prophylactic use of many of the drugs previously used for growth promotion. Regardless, the WHO recommendations and FDA guidances are recent developments, and time will tell whether they result in a reduced incidence of resistance emergence in animals as well as humans. In the meantime, the relevant question becomes how we identify the practices that contribute significantly to antibiotic resistance, and how we might accurately predict whether adjusting or eliminating those practices would yield any net benefit to human health.
While there is universal agreement that antibiotic use will invariably lead to resistance, it has nonetheless been difficult to estimate the magnitude of the reservoir of antibiotic resistance genes from non-human sources, and whether this reservoir poses an indirect risk to public health by enlarging the gene pool from which bacteria pathogenic to humans can acquire resistance. Indeed, individual links of the causal chain, such as the emergence of resistance following antibiotic administration in animals, transmission to humans, and infection with resistant pathogens of zoonotic origin, have been demonstrated for some common pathogens, such as Campylobacter jejuni and MRSA in food production animals. However, some argue that there is a paucity of evidence providing a holistic, quantitative description of the magnitude of zoonotic resistance reservoirs, and, citing the lack of definitive proof that such transmission poses a legitimate threat to human health, contend that deeper investigation is warranted before any policy decisions are made.
Robust quantitative evidence is lacking to either support or contradict the proposition that further restrictions on antibiotic use would leave animal health unharmed while producing a measurable public health benefit. This gap highlights the need to examine the balance between demanding scientific evidence of a phenomenon that is fundamentally difficult to measure and accepting the potential costs of inaction. Of course, speculative fear of future harm does not constitute injury, but given the evidence of the potential for resistance to emerge from zoonotic sources, is it reasonable to consider current rates of antibiotic use innocent until proven guilty? Those opposed to the precautionary principle argue that policy informed by such a code of belief inherently relies on potentially foregone conclusions. Stated differently, at the confluence of basic science research and public health policy, does a precautionary course of action merit foregoing more rigorous scientific examination? Or does the relative scarcity of incontrovertible data justify inaction on a globally recognized threat?
More alarming still would be to engage in preemptive measures in food production to control bacteria resistant to medically important antibiotics only once a ‘sufficient’ number of people had been harmed. The biological complexities of the prevention and transmission of antimicrobial resistance have already complicated the treatment of infection in humans. Ought we to wait for incontrovertible evidence of devastating clinical effects in human patients first? This dilemma demands that any decision balance the available evidence against its intrinsic uncertainties, which makes decisive precautionary action appealing in this case. However, we must acknowledge that policy informed by scientific consensus ought to reflect a balance between risk management and the demonstrable benefit of shifting the status quo of antimicrobial use.
In order to fully understand the impact of such measures, it will be imperative to implement systems that monitor antibiotic use in these settings, along with enhanced surveillance mechanisms to detect the emergence of resistant pathogens, and then to integrate these data with antibiotic sales figures, livestock production numbers, and resistance trends in humans and food. While securing adequate resources and funding for such outcomes assessments will be subject to more nuanced deliberation, the recent FDA directives and WHO guidelines are a step in the right direction toward reducing irresponsible antimicrobial use in animals. To continue, in the face of intrinsic difficulties of measurement, under the dangerous assumption that plausible threats to human health are insignificant until demonstrated otherwise is no less unscientific than proceeding with caution.