[Image: Daniel Schwen, GFDL / CC-BY-SA-3.0 / CC-BY-SA-2.5, via Wikimedia Commons]
The thing is, I cannot imagine anything more fundamentally anti-American than this use of the phrase. The very foundation of what it is to be American, I would think, is to be (little "d") democratic. Our society is built on the principle that government serves by the will of the people. The people, the society, not you and your personal opinions. Of course, "the people" is an abstract concept - the will of "the people" really consists of the will of many of the people, and your personal viewpoint is one that must be taken into account in determining that will. That's why we vote.
In the end, though, sometimes the vote goes against you. Sometimes a president gets voted in on a mildly liberal platform, promises he'll reform healthcare, and then actually does it. Well, you may not like it (I'm certainly not sure what I think of "Obamacare"), but it is, to some degree or another, the will of the people.
So it's not your will. Well, the good thing is we are a democracy, and you can fight Obamacare, or anything else about liberalism you don't like, and try to get things changed. Of course the system isn't perfect; many people are disenfranchised in one way or another by our society, because all human societies have at least some oppression built into them.1 But you can try, and you won't be shot for it, and that's a good thing. More importantly, it's the fundamentally American thing: the ability to disagree, to debate, and then to build a society off of the results of that debate, in our case determined by vote.
The flip side of that, however, is that if the debate doesn't go your way, you don't get to call the outcome anti-American (assuming the result isn't one that disenfranchises someone). To call viewpoints you disagree with anti-American is to subvert debate, to say that democracy is fine and all as long as it goes my way, and so, in short, to be a tyrant. It is to be anti-American.
________________________________________________
1. I don't mean to be flippant about the oppression that exists in our society. I'm a Christian; I believe Christ came to set all captives free, and that all societies that exist for the sake of the small elite and at the expense of the weak and downtrodden (read: all societies ever until Christ returns) are to some degree or another demonic. We should fight the demonic, and always strive to make society more equitable, even if we never reach our goal.