No, Karine Jean-Pierre didn’t say the US and Ukraine won World War II
During World War II, the U.S. was allied with the Soviet Union against Nazi Germany. The U.S. did not fight alongside an independent Ukraine; at the time, Ukraine was a Soviet republic, occupied by German forces for most of the war and returned to Soviet control in 1944. We rate the claim that Jean-Pierre said the United States and Ukraine won World War II Pants on Fire!