Why isn't Christianity considered evil? After reading the Bible, I noticed that homosexuality is called an 'abomination', that anyone who works on the Sabbath should be 'put to death', that slavery is permitted, that animal sacrifice is permitted, and that the mentally ill are described as possessed by demons. Why, then, do we not actively suppress Christianity? And how can a Christian legitimately believe that homosexuality, for example, is fine and still call themselves a Christian, despite what the Bible says? It seems to me that it is an evil moral theory to subscribe to.