Wikifunctions is a new site that has been added to the list of sites operated by the Wikimedia Foundation (WMF). I definitely see uses for it in automating updates on Wikipedia and in bots (and also as a reference for programmers), but its longer-term goal is to make Wikipedia articles available in more languages by writing them as code enriched with linguistic information. I have mixed feelings about this: I don’t like existing programs that automatically generate articles (see the Cebuano and Dutch Wikipedias), and I worry that the system will be too complicated for average people.

  • Lvxferre · 10 months ago
    > This is an encyclopedia, so there are no pronouns like “I”, which simplifies the issue. The remaining pronouns are in the third person, and if we link them to data about the person being referred to, that would solve it.

    The pronoun is an example. You are confusing the example with the issue.

    The issue is that, if some language out there marks a distinction, whoever writes the abstract version of the text will need to mark it, as that information won’t “magically” pop out of nowhere. The issue won’t appear just in pronouns, but everywhere.
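    To make that concrete, here’s a minimal sketch in Python. To be clear, this is not Wikifunctions’ actual notation (the real abstract-content format is still being designed); every name and structure below is made up for illustration. The point is only that the abstract text must carry any feature some target language marks, even when other target languages ignore it.

    ```python
    # Hypothetical abstract representation of "She is an engineer."
    # (All field names and identifiers here are invented for this example.)
    abstract = {
        "predicate": "be_profession",
        "subject": {
            "ref": "person_Q123",  # made-up item identifier
            "person": 3,
            "number": "sg",
            # English needs this feature to choose "she" vs. "he".
            # Turkish never uses it, but the writer must still supply it,
            # or the English rendering has nothing to go on.
            "gender": "female",
        },
        "profession": "engineer",
    }

    EN_PRONOUNS = {("female", 3): "She", ("male", 3): "He"}
    TR_LEXICON = {"engineer": "mühendis"}

    def render_english(a):
        subj = a["subject"]
        pron = EN_PRONOUNS[(subj["gender"], subj["person"])]
        # "an" is hardcoded; a real renderer would pick the article too.
        return f"{pron} is an {a['profession']}."

    def render_turkish(a):
        # Turkish third-person "o" is gender-neutral, so the "gender"
        # feature is simply ignored by this renderer.
        return f"O bir {TR_LEXICON[a['profession']]}."
    ```

    Drop the `gender` field and the Turkish output is unaffected, but the English renderer has no way to choose a pronoun. Now multiply that by every distinction marked by any of the hundreds of target languages.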

    A linguist doesn’t necessarily need to know a language in order to analyze its grammar, and a lot of the work needed in Wikifunctions is like this.

    Usually, when you aren’t proficient in a language but are still describing it, you focus on a single aspect of its grammar (for example, unergative verbs) and on either a single variety or a group of related ones.

    What the abstract version of the text would require is nowhere close to that. It’s more like demanding that the linguist produce a full, usable grammar of every language found on Wikipedia, just to write a text about some asteroid, using a notation that is cross-linguistically consistent and comprehensible.

    Also note that descriptions from linguists who aren’t proficient in the variety in question tend to be poorer.