• ☭CommieWolf☆@lemmygrad.ml
    7 months ago

    I wonder if football culture in the US is any better. From what I know, football (soccer) there is more often considered a sport for women, particularly young girls. Is football culture there less right-wing and toxic?