When did the USA actually win? Was it in Iraq? Or Afghanistan? Or Vietnam? Or Korea? All spectacular failures... even WW2 was won by Russia; all the USA did was nuke fishermen to make the one guy ruling a crippled Japan surrender.
Is there a war the US Army fought where the USA can say: things are now better?
No?
Honestly speaking, I have no idea why you guys are sending your army anywhere.



