What you believe matters, but branding and marketing experts tell us that it all starts with "perception." No matter how great a product may be, I'll never try it until I have a favorable perception of it. Once that happens, I'll consider giving it a shot. In a similar way, for centuries Christians have tried to share their faith with the culture, but today they are losing enormous ground when it comes to perception. Society is experiencing one of the most dramatic shifts in history on issues such as abortion, gay marriage, stem cell research, religious freedom, personal privacy, sexuality, and more. Many of those issues bump up against historic positions of the Church, which, in a media-driven world, generates enormous (and usually negative) publicity. Because of those negative perceptions, many people turn away without ever considering the positive aspects of faith.
That's why I'm wondering if it's time to re-think how the Christian community engages the world. So here's the question:
Without the Christian community altering its historic principles, or tossing them out altogether, what do you think are the five most important things Christians could do today that would start to change your perception? In other words, what changes or actions could Christians take publicly that would be so compelling they would cause you to give another look at what it means to follow the teachings of Jesus?
Seriously, what would you list?