We all assume that medical decisions are based on solid, indisputable scientific evidence. Yet this is not always the case. Some medical dogmas are taken as given truths, yet when one searches for empirical evidence to back them up, none is to be found. This is the case for resistance management. The belief is that resistance can be prevented by killing all pathogens within an infection as rapidly as possible; hence physicians typically urge their patients to finish drug courses even after they no longer feel sick. This concept traces back to Alexander Fleming, the discoverer of penicillin, who warned the world against drug resistance in his 1945 Nobel Prize lecture with the words "if you use penicillin, use enough".
However, this view rests on the notion that we need only prevent new resistant mutants from arising. When drug-resistant mutants are already present at the time of treatment, such an aggressive approach is likely to select for them most rapidly, by swiftly killing off all of their drug-sensitive competitors. I demonstrated this phenomenon in a rodent malaria model: selection for resistance was indeed more intense following aggressive treatment than following lower-dose treatment, with no benefit to host health or infectivity. Of course, mice are not men, but such experiments provide proof of principle that current approaches may not manage resistance well under all circumstances, and that other options may exist and should be explored.
Our society is in the midst of an emerging drug-resistance crisis, and current approaches have not served us well so far. I believe we need to start thinking outside the box and adopt an evidence-based approach to managing resistance evolution across a wide range of infectious diseases.