I’m watching the Apocalypse in the Tropics documentary on Netflix about evangelicals and politics in Brazil and it’s mind-boggling. Why do religious people just blindly do whatever their pastors tell them?
Evangelical in the sense of Protestant Christians, or in the sense of that crazy cult that’s going on over in the Americas? Maybe it’s just the news giving me the wrong idea, but I really don’t recognise my religion just one ocean away.
I am a sceptic, but (or rather because) I grew up with a progressive church that allows and encourages critical thinking. Very tame stances overall, no overly aggressive rhetoric, laughing and colouring your hair very much allowed. Then you cross the pond and hear fuming people talk about filthy infidels and holy wars like wth…
I think these people are not necessarily easy to manipulate, but rather indoctrinated to hell and back.