So the Republican party thinks it can "win back" control of Congress from the Democrats in this year's election.
How does that matter? Really... how does that matter?
I have been observing politics for most of my life. I'll admit that once upon a time I did believe that there were fundamental differences between the Democrats and Republicans and that those were "the only" parties that seriously existed. So I was in the same mindset as the vast majority of Americans.
Then I grew out of it. Woke up. Came to my senses. Saw things for how they really are...
Saw too much of what's running this country as one big damned fabrication. Not a government of enlightened individuals but a glorified puppet show entertaining the masses with smoke and mirrors.
And now, now... it doesn't bother me one whit which party is "in control" in Washington.
Because, let's get real, folks: do things ever honestly change for the better depending on whether it's the Democrats or the Republicans that are in power?
This country endured sixteen consecutive years of the worst Presidents in its 200-plus-year history. One was a Democrat and the other was a Republican. Neither left this nation in a better state than how they found it (the Republican one was hands-down the most destructive "President" yet).
But still, too many people in this country are entranced by the projected allure of these mere mortals. They look for the quick fix of "someone else" and ignore the wisdom that God has not only given us, but expects us to use on our own.
I don't see how this country will prosper for much longer when most of us refuse to think for ourselves and instead let the Republicans, or the Democrats, or Barack Obama, or Sarah Palin, or Glenn Beck, or anyone else but God carry our hopes for something better.