As long as it does not interfere with their office, one will have to accept that Christianity is part of the fabric of US society, and hopefully people can be tolerant of one another's beliefs.
But I don't think it really is. People like to go on and on about how we were founded as a "Christian" nation, but we weren't. The "Founding Fathers" were largely deists or atheists, and they used the word "god" in a very different sense than modern conservative Christians do. I think it's incredibly disrespectful to the vast array of people, beliefs, and opinions in this country to keep assuming that everyone is okay with having a "Christian" nation here, particularly when the most religious politicians tend to favor restrictive laws that deprive others of basic rights (e.g. gay marriage and adoption rights). The impulse of evangelical Christianity is to convert others, make them believe as you do, and reshape the laws to reflect those beliefs; in my opinion, that isn't really reconcilable with holding office.