I don’t care what religion (or non-religion) you practice, but in America… Christianity (for example)… is tied eerily closely to the right!
For that reason, it has become increasingly difficult for me to understand why! Just, why!
Why do I believe?
Why are displays of “Christianity” becoming ridiculous beyond measure?
It ultimately makes me question the process of finding meaning (through religion) in the world today!
I am educated on many religions, and I understand the hold that the right has on Christianity.
I also understand how mixing politics with religion is not only dirty, but also the only way the Republicans can exist today!
At least it was before the current president…
Everything has now become a free-for-all!
The whole thing cheapens what religion stands for when it’s so closely combined with politics.
All I am left with is that mainstream Christianity’s following is all the same.
Fine for you maybe… but, far from fine for me!
For me personally… the only way I can operate is to believe in something that’s not really supported!
And, what I mean by this is… I believe in Jesus, and what Jesus stood for!
But, my interpretation of the Bible may not be the same as yours!
Whether any god is real or true or has ever existed means less to me than the movements that attempt (and do an effective job of) turning me away from my belief in the Christian god.
It’s all just so discouraging for those who want to follow an institution of value.
Do you have any thoughts on what’s happening in the U.S. with regard to Christianity (or any religion, for that matter)? And how are politics and religion intertwined where you live?