Has America Gotten Worse Since Biden Came Into Office?


America has gotten worse.


America has gotten better.

A number of conservative media outlets are openly admitting that America has gotten worse since President Biden came into office. Others, however, claim that Biden has made America better. What do you think?

