by Jodie Walton III, Office Manager
Personally, I think we now live in a time where everyone has an opinion or an absolute about everything. But what I see as a problem is how our culture has changed and moved away from God. God was the one we listened to for our opinions and absolutes. We’ve drifted so far from where this country was founded (even though we stole it from Native Americans, but that’s another blog for another day). We know this from the statement in our Declaration of Independence that all men are CREATED equal. Now we have taken separation of church and state to a whole new level. God isn’t accepted anywhere but church and Christian TV and radio stations.

The culture of America has completely engulfed and shaped our version of Christianity. We (Americans) have allowed the “American Dream” to shape the look of what the Christian life is supposed to be, and we have also allowed the progression of “being cool” to surpass what holiness is supposed to be. Wearing plugs, skinny jeans, low-cut shirts, and the like have all become norms, and if anyone says that we need to get rid of them, or not participate, people call them “traditional.” Since when is culture supposed to shape our relationship with Christ, or what Christianity is? Have we gotten to a point where we have allowed sinful things to shape our religion? Or am I just thinking too “traditional” and making things out to be too serious? ‘Cause I remember a time when just about anything would send you to hell, lol. Women wearing pants, men wearing earrings, wearing jeans to church, wearing make-up, and so much more. Granted, Christian culture does change with the times (I’m not expecting people to walk around in robes and sandals), but to what extent? Personally, I think we as American Christians need to stop making things convenient for ourselves.
Things like prosperity preaching, “accepting” homosexuality in the church, and “secularized Christianity” (where people live how they want but are still “saved”) have gained a significant foothold in America.
Throughout history, it was religion that shaped cultures, not the other way around. I am not saying that we need to be religious (that leads to empty, mindless repetition and no relationship with Christ), but we as Christians need to rise up, take back this country, and remove all things not Christ-like. And it needs to start inside the body. No culture, no matter where it is from, is evil in and of itself. There are portions of every culture that are still acceptable when lined up with the Word. But the real question remains: will we, the body of Christ, adhere to this first, or will we keep making excuses to have what we want? I hate to say it, but I think it will be the latter. I think we as believers need to get back to prayer, and to relying on God and His Word for making decisions in EVERYTHING! I truly believe Jesus is the answer to all of America’s problems: everything from homelessness, to our subpar education system in certain areas, to welfare, all the way to our corrupt and godless government. The “American Dream” has blinded so many of us; let’s take the blindfold off and start looking at our world through God’s eyes and heart.
Posted at 3:03 PM