What Do Democrats Have to Offer?
They've done nothing at all to help improve America. That won't change in 2021.
Since January 20, 2017, Inauguration Day for President Donald Trump, you’d be hard-pressed to come up with anything the Democrats have accomplished. We do know one thing for certain — they HATE President Trump.
In fact, that’s been the only thing we’ve heard. Trump colluded with Russia. Trump lied. Trump committed treason. Trump did this, that, and something else. One accusation after another has been thrown at this president, more than at any other political leader I have witnessed in my lifetime. It must really fray their nerves to know that, no matter what outrageous claim they make against him, nothing sticks. And it’s not as if he doesn’t have actual flaws they could accuse him of. But they go for the most outlandish charges they can think of, and they bounce right off him.
I know there will be an “October surprise.” Maybe that’s when Adam Schiff FINALLY produces the evidence of Trump’s collusion with Russia that he has claimed for years to possess. Maybe it’s just me, but does anyone else wonder if he really has anything? Moreover, why would any self-respecting reporter even ask him a question expecting an honest answer? Perhaps it’s because reporters with any self-respect are in short supply these days.
What have Nancy Pelosi, Chuck Schumer, and all of the Democrats in Washington actually done for us? They spent $30 million trying to impeach Trump. They’re holding up relief for small businesses during this pandemic until they get money to bail out Democrat cities and states that are going bankrupt. Nothing says “pandemic aid” like bailing out California, Illinois, New York City, Chicago, and other Democrat-controlled failed states and cities.
The Democrat Convention had nothing good to say about America. No hope for the future. No plans to help restore America to what she has been in the past. No stories of people who have achieved the American Dream or overcome hardship to find success. Just gloom and doom: Trump is evil, and Biden needs to win.
If Biden wins, will the Democrats go back to work for us? Will they finally take up legislation that helps Americans get on with their lives? How will eliminating Trump’s tax cuts help us? How will the costly Green New Deal help create jobs? Will they rein in Black Lives Matter and antifa rioters, or are they still just “peaceful protestors”?
What exactly will they do then that they can’t be doing now to show they really want to make America better? If they’ve done NOTHING for four years, they’ll suddenly start doing something if we elect them? We’re to believe them based on what? Especially when half of the country, in their eyes, is made up of evil, racist enemies of America because they supported Trump! Color me skeptical!
Something to think about?