Is America Weaker Under Biden’s Leadership?


YES

America is weaker.


NO

America is not weaker.

A number of Republican lawmakers, and even some Democrats, have claimed that President Biden has made America weaker. Biden, for his part, has confidently said that he has made America much stronger. What do you think?

