Settled by pioneers seeking religious liberty and founded through a war against a despotic king who presided over the national church, the United States of America has long had ties to Christianity. But, according to a recent survey, many Americans believe that relationship has changed since the nation's creation. The results of the Public Religion Research Institute study indicate that a vast majority of Americans believe the United States was founded as a Christian nation, though only about a third believe it still is one today. President Obama is among those who believe America is not, at least today, a Christian nation.
While only 14% of Americans polled believe the United States has never been a Christian nation, nearly half (45%) believe that it once was but no longer is. Only 35% believe that the U.S. has always been and remains a Christian nation, a figure that has steadily declined since a 2010 survey, in which 42% held that view.