I keep hearing about how America is turning away from God, that America is no longer the "utopia" it used to be. Things that once hid in dark corners to prevent mankind from seeing the sinfulness of others are out in the open. Things adults once whispered to one another so their children wouldn't hear are being proclaimed in kindergarten classrooms. Things that once repulsed now regale. In one sense America has changed radically. But in another sense it has not. The exposure of evil in men's hearts does not change America from what it was but lays bare what it always has been. It's all clutter that's been pushed under the bed of euphemisms and hushed tones, and now that bed is being replaced with Satan's lies that these things are good.
Because society has embraced these lies, Christians can no longer blend into the culture. America was never made up completely of Christians, and it still isn't now. The lines are now just more sharply drawn. People believe evil is good and good is evil. Sin corrupts entirely, and entirely it corrupts. So we must always remember to "be diligent to present [ourselves] approved to God as a workman who does not need to be ashamed, accurately handling the word of truth" (2 Tim 2:15).
Remember, God is where we find our hope, not the state of America. We must not run away from the culture, nor must we embrace it and blend in. We must exist within it yet be separate from it. One of the many paradoxes of Christianity.
"He said to them, "Go into all the world and preach the gospel to all creation." (Mark 16:15)