The United States soccer team wins the first Women's World Cup.
In 1991, the US Women's National Soccer Team won the first-ever FIFA Women's World Cup, defeating Norway 2-1 in the final.