Tuesday, April 7, 2009

Post-Christian United States


It is obvious that the Christian religion is on the decline in the United States. On April 4th of this year, Newsweek magazine published an article by Jon Meacham entitled “The End of Christian America.” President Obama, visiting Turkey this week, said, "We do not consider ourselves a Christian nation…." The influence of Evangelicals in politics is waning, and church attendance continues to decline. We are on the verge of entering – if we have not already entered – a Post-Christian era in the United States. For many, this is more than a little depressing.

The teachings of Jesus were never meant to be the foundation for a religion. In fact, Jesus is one of the most anti-religious voices in history. From the beginning, his words were meant to be lived, not simply studied and memorized. The “church” was intended to be a fellowship of strugglers seeking to walk as Jesus did – not a place for club members to gather weekly. Religion is humanity’s attempt to redeem itself. Jesus reveals a Heavenly Father who reaches out to us.

The call of Jesus in 2009 is the same as it was when he first proclaimed it: "'Love the Lord your God with all your heart and with all your soul and with all your mind.' This is the first and greatest commandment. And the second is like it: 'Love your neighbor as yourself.' All the Law and the Prophets hang on these two commandments." The decline of a religion does not change our responsibility to that commitment.

