Both Christians and non-Christians throughout the ages have recognized that the Christian faith provides the only foundation needed for nations to be free and to retain their God-given liberty. Christians have been given a commission: to make and keep America a Christian nation! Christ the Lord said, "All authority has been given to Me in heaven and on earth. Go therefore and make disciples of all nations, baptizing them in the name of the Father and the Son and the Holy Spirit, teaching them to observe ALL that I have commanded you; and lo, I am with you always, even to the end of the age" (Matt. 28:18-20). Simply stated, Christians are commanded to do their best to make America a Christian nation. What does this mean, to make America a Christian nation? A Christian nation is one built upon the Word of God and sustained by His presence.