We live in a society now where progressive politics, no matter where one resides, seems to have replaced, in some cases, what had been known as societal norms. These have to do with personal responsibility and family. It’s no longer your fault that you lost your job, even if you were a terrible employee: the government will make sure you’re provided for. Your children are not yours; they belong to the state, and if you teach them anything we don’t approve of, we’ll take them from you. Deny that trans women are actual biological women? Be cancelled; in fact, be erased. Try to exercise your rights of free speech and religion? We can’t have that.
During patriotic holidays, as July 4th is here in the U.S., I often hear people, politicians mostly, jabbering about how our rights come from some God, usually, of course, the Judeo-Christian version of the same. Even when I was a believer, I found that an odd thing to say, simply because any grade schooler knows exactly where our rights, at least in the U.S., are derived. I realize that politicians, and yes, even some pundits on television, pander to the general public, but is it really necessary to give credit for basic human rights to an invisible deity? The question, for me at least, has always been: why does the United States get these special rights that no other country has?
I think we have gone too far in what many believe are basic human rights without considering what those rights actually are. It’s troubling to me that people say healthcare, housing, education, yes, even food, is a basic human right, but how do we implement these “basic human rights” for all when only a few pay for them?