A Pew survey released last week showed that the ranks of Christians are declining in America while the unaffiliated and other faiths are growing. Of course, Christianity will remain a super-majority in the U.S. for decades to come, but the trend offers some hope that, in the distant future, the religious right might lose its grip on national politics. Maybe.