Betting the Farm: Global Policy Action Against the Emergence of Antimicrobial Resistance in Food Animals and Perspectives on the Precautionary Principle
Luke Versten received his B.S. in Biological Sciences from The University of Chicago and is currently pursuing an MPH in Epidemiology of Microbial Disease at Yale University.
Last week, in a newly published set of recommendations aimed at preserving medically important antibiotics, the World Health Organization called on governments and food animal producers to adhere to rules discouraging the use of such antimicrobials to promote livestock growth, and recommended a complete restriction of these antimicrobials for prophylactic use in animals that have not been clinically diagnosed with disease. The recommendations demonstrate that global leaders have recognized that the looming threat of antibiotic resistance will not be resolved solely by addressing antibiotic use in humans: about 70% of antimicrobial drugs deemed medically important for treating human infection are currently sold for use in food animal production.
The WHO’s recent guidances were preceded by similar directives implemented by the FDA last year, as well as laws that have existed in the European Union for over a decade. In the United States, reductions in antibiotic use in food production animals have so far been driven not only by market forces, such as the consumer-driven shift in demand toward food raised without antibiotics, but also by FDA Veterinary Feed Directives limiting the use of medically important antibiotics for growth promotion and requiring veterinary authorization prior to any antibiotic use. It should be noted that these guidances are not without loopholes; unlike the recent WHO recommendations, they leave many of the drugs previously used for growth promotion approved for prophylactic use. Regardless, the WHO recommendations and FDA guidances are recent developments, and time will tell whether they reduce the incidence of emerging resistance in animals as well as humans. In the meantime, the relevant question becomes how we identify the practices that contribute significantly to antibiotic resistance, and how we might accurately predict whether any net benefit to human health would be realized if these practices were adjusted or eliminated.
While there is universal agreement that antibiotic use will invariably lead to resistance, it has nonetheless been difficult to estimate the magnitude of the reservoir of antibiotic resistance genes from non-human sources, and whether this reservoir poses an indirect risk to public health by enlarging the gene pool from which bacteria pathogenic to humans can acquire resistance. Indeed, individual links of the causal chain, such as the emergence of resistance following antibiotic administration in animals, transmission to humans, and infection with resistant pathogens of zoonotic origin, have been demonstrated for some common pathogens, including Campylobacter jejuni and MRSA in food production animals. However, some argue that there is a paucity of evidence offering a holistic quantitative description of the magnitude of zoonotic resistance reservoirs, and that the lack of definitive proof that such transmission poses a legitimate threat to human health warrants deeper investigation before any policy decisions are made.
There is no robust quantitative evidence to either support or contradict the proposition that further restrictions on antibiotic use would leave animal health unharmed while yielding a measurable public health benefit. This gap brings to light the need to weigh the demand for scientific evidence of a phenomenon that is fundamentally difficult to measure against the potential costs of inaction. Of course, speculative fear of future harm does not constitute injury, but given the evidence of the potential for resistance to emerge from zoonotic sources, is it reasonable to consider current rates of antibiotic use innocent until proven guilty? Those opposed to the precautionary principle argue that policy informed by such a doctrine inherently relies on conclusions that may be foregone. Stated differently, at the confluence of basic science research and public health policy, does a precautionary course of action merit forgoing more rigorous scientific examination? Or does the relative scarcity of incontrovertible data justify inaction on a globally recognized threat?
More alarming would be to consider implementing preemptive measures in food production to control bacteria resistant to medically important antibiotics only once a ‘sufficient’ number of people had experienced harm. The biological complexities of the prevention and transmission of antimicrobial resistance have already complicated the treatment of infection in humans. Ought we to wait for incontrovertible evidence of devastating clinical effects in human patients first? This dilemma demands that any decision balance the available evidence against the intrinsic uncertainties, and it makes decisive precautionary action appealing in this case. However, we must acknowledge that policy informed by scientific consensus ought to reflect a balance between risk management and demonstrable benefit from shifting the status quo of antimicrobial use.
In order to fully understand the impact of such measures, it will be imperative to implement systems that monitor antibiotic use in these settings, as well as enhanced surveillance mechanisms to detect the emergence of resistant pathogens, and then to integrate these data with antibiotic sales data, livestock production numbers, and resistance trends in humans and food. While ensuring adequate resources and funding for such outcomes assessments will be subject to more nuanced deliberation, the recent FDA directives and WHO guidelines are a step in the right direction toward reducing irresponsible antimicrobial use in animals. To continue, in the face of these intrinsic difficulties of measurement, under the dangerous assumption that plausible threats to human health are insignificant until demonstrated otherwise is no less unscientific than proceeding with caution.
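As a purely illustrative sketch of the data integration such an outcomes assessment implies, the short Python example below joins antibiotic sales, livestock production, and resistance surveillance tables into a single trend view. Every dataset, column name, and figure here is hypothetical, and the use-intensity metric assumed (mg of active ingredient per kg of animal biomass) is only one plausible normalization, similar in spirit to those used in European surveillance reporting.

```python
import pandas as pd

# Hypothetical example data: national antibiotic sales (kg of active
# ingredient) and livestock production (kg of animal biomass), by year.
sales = pd.DataFrame({
    "year": [2015, 2016, 2017],
    "drug_class": ["tetracyclines"] * 3,
    "kg_active_ingredient": [5_900_000, 5_500_000, 5_100_000],
})
production = pd.DataFrame({
    "year": [2015, 2016, 2017],
    "kg_biomass": [41_000_000_000, 41_500_000_000, 42_000_000_000],
})
resistance = pd.DataFrame({
    "year": [2015, 2016, 2017],
    # Hypothetical share of sampled isolates resistant to the drug class.
    "pct_resistant_isolates": [44.0, 42.5, 41.0],
})

# Normalize sales by biomass to get a use-intensity metric (mg per kg of
# biomass), then align it with resistance trends for a combined view.
merged = sales.merge(production, on="year").merge(resistance, on="year")
merged["mg_per_kg_biomass"] = (
    merged["kg_active_ingredient"] * 1_000_000 / merged["kg_biomass"]
)
print(merged[["year", "mg_per_kg_biomass", "pct_resistant_isolates"]])
```

In practice, such an analysis would draw on established surveillance programs rather than toy tables, but even this simple join illustrates why sales, production, and resistance data must share consistent units and reporting periods before trends can be meaningfully compared.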
Kayla Knilans received her Ph.D. from the University of North Carolina and currently studies inflammatory mediators and epithelial repair in inflammatory bowel disease.
Malaria is a mosquito-borne infectious disease caused by parasites of the Plasmodium genus. According to CDC estimates, in 2015 there were around 212 million malaria infections, which caused 429,000 deaths. Globally, the number of malaria infections has been in decline, an accomplishment largely attributed to the Roll Back Malaria initiative, which in 1998 established a partnership between the World Health Organization (WHO), the World Bank, UNICEF, and the United Nations Development Programme. However, WHO has reported that as of 2015, only 44% of countries endemic for malaria were on track to meet 2020 infection reduction goals. Furthermore, there is concern that increasing global temperatures could cause an increase in the number of malaria infections in several regions due to a longer infection season and an expanding territory for Anopheles mosquitoes, the known vectors for malaria, though the magnitude of the effect is still heavily debated. Sub-Saharan Africa is one geographical region that is expected to be impacted by these changes.
Currently, Sub-Saharan Africa accounts for 90% of global malaria infections and 92% of malaria-associated deaths. It is therefore a heavy focus of anti-malaria policy efforts, which have been successful in reducing total malaria cases. One study reported that as of 2015, ~660 million clinical cases of malaria had been averted since 2000. The WHO reports that overall, more children and pregnant women in Sub-Saharan Africa have improved access to diagnostic testing and preventive treatment. In 2015, 51% of children seeking care for a fever in 22 African countries received a diagnostic test for malaria, compared to 29% in 2010. However, in some regions health systems are still under-resourced and/or poorly accessible; the WHO reports that 35% of children with a fever were not taken to a health care facility in a survey of 23 African countries.
One anti-malaria measure that has been successfully implemented across regions and populations in Sub-Saharan Africa is the use of insecticide-treated nets. These nets have been shown to be effective even in regions where mosquitoes have demonstrated resistance to pyrethroids, the only class of insecticide currently used in insecticidal nets. One study attributed 68% of the malaria case reductions between 2000 and 2015 to the use of insecticide-treated nets, noting that their success is at least partially due to their cost effectiveness and ease of distribution.
Going forward, there are several reported barriers to continuing successful anti-malaria policy implementation in Sub-Saharan Africa. The major barrier is funding available for anti-malaria efforts. While funding for malaria research and policy implementation rose sharply between 2000 and 2012, and was a critical factor in the global reduction of malaria cases during those years, it has since flatlined. Global funding for malaria as of 2015 was $2.9 billion, well short of the $6.4 billion goal for 2020 that WHO has estimated will be required for further malaria case reduction efforts.
A funding shortage will limit the amount of malaria research, including basic science, clinical, and epidemiological research. It will also slow progress in strengthening health systems, including training healthcare workers, making diagnostic technologies available, improving health record systems, increasing the availability of anti-malarial drugs, and improving the accessibility of health clinics for rural and poor populations. Other reported barriers include a lack of education about the causes and treatment of malaria, inconsistent household adherence to malaria control efforts, cultural norms that conflict with malaria control measures, environmental and health concerns about the use of pesticides, and rising resistance to treatments in both the malaria parasite and its mosquito vectors.
A recent survey of policymakers in Uganda, Tanzania, and Kenya revealed additional complications in making and implementing anti-malaria policy in their respective countries. While funding was identified as a limiting factor in implementing policy, policymakers added that funding cycles often do not line up with seasonal malaria cycles, so funds are not available when they are most needed. Policymakers also expressed a general frustration with politicians, whose priorities and political pressures could undermine policy implementation. Policymakers in Uganda reported that government leaders had previously enacted policy decisions without fully involving the policymaking community.
Interestingly, the survey participants also noted a disconnect between researchers and policymakers. Participants in Kenya noted that the priorities of research organizations were not always aligned with national health research needs. They also voiced a need for a venue that brings researchers and policymakers together, and for researchers to make their work more accessible and meaningful to policymakers. This highlights a critical point that is often overlooked in the current debate within the scientific community over publication paywalls. Paywalls restrict access to scientific publications to paying customers, usually an academic institution’s library. While paywalls can slow research progress at resource-limited institutions, they can also directly impair policymakers’ ability to enact successful public health policies. Without access to all the relevant information, drafting evidence-supported policies that will be effective and supported by local lawmakers becomes more burdensome.
The implementation of successful anti-malaria policy will require significant funding commitments and the cooperation of researchers, policymakers, and politicians in order to meet infection reduction goals. The scientific community should take an interest in increasing communication with policymakers to help facilitate these efforts. Enhanced communication could lead to funding announcements that align with the needs of policymakers and ensure that research is accessible to them.