[Image: The White House and the United States Senate. Public domain, via Wikimedia Commons]
YES
America is weaker.
NO
America is not weaker.
A number of Republican lawmakers have said, and even some Democrats have conceded, that President Biden has made America weaker. Biden, for his part, has confidently insisted that he has made America much stronger. What do you think?