


One of the most injurious trends of modern times is the politicization of areas of life previously immune to politics. Recent years have seen corporations, nonprofits, and the military, among other institutions, move toward becoming explicitly political, and always in one direction: to the left.
Add health care to the list. Increasingly, various corners of the field, from medical schools to professional organizations to continuing-education programs for practicing doctors, have taken on a left-wing tilt. In the cover story for the latest issue of National Review, I detail how bad things have gotten, drawing on medical-school curricula and conversations with medical professionals who object to what is happening.
For an explanation of how we got here, as well as a superior alternative to what the medical field is becoming, check out the full piece.