Some interesting thoughts on how to leverage ChatGPT

  • Andreas@feddit.dk

    Using ChatGPT for anything more than repetitive or random generation tasks is a bad idea, and its usefulness becomes even more limited when you’re working with proprietary code that you can’t directly send to ChatGPT. Even with personal projects, I still try to avoid ChatGPT as much as possible for the simple reason that I’ll be forced to pay for it if it becomes an essential part of my workflow when it leaves this free beta testing phase.

    • Kresten@feddit.dk

      Exactly this! I hate hearing politicians and rulemakers discuss how ChatGPT and LLMs are going to be relevant everywhere and how ChatGPT should already be incorporated into education. They literally call it a “research preview”; you can only assume that once they’ve gathered enough data, they’re going to shut it down, or at least drastically reduce its capacity.

      With that said, I really enjoy using it, mainly for brainstorming topics or new projects and which technologies to use in them. Sometimes I also use it as a kind of therapist, for social topics where I don’t really know who else to ask and would expect a generic reply anyway.

      • SleepyHarry@sh.itjust.works

        On the politicians / rulemakers side of things, that may or may not be a good thing tbh. Technology moves so fast, and traditionally the aforementioned groups are glacial and can’t keep up, sometimes to the benefit of a small group, often to the detriment of the majority. Having this on their radar relatively early is potentially a useful change.

        • Andreas@feddit.dk

          While it’s nice that politicians are enthusiastic about new technologies, I think ChatGPT is one example where they shouldn’t force mass adoption. ChatGPT is a proprietary model owned by a private corporation, and it has been made very clear that data from interactions with ChatGPT will be collected and used by OpenAI for its business. It’s horrible for data security and it helps to strengthen OpenAI’s monopoly. Honestly, governments recommending privately owned software and technologies should be considered advertising.

          • a_statistician@programming.dev

            governments recommending privately owned software and technologies should be considered advertising.

            Is this not also true if the software is open-source? It’s still advertising, but it’s somehow ok because a corporation doesn’t benefit? It’s not that I don’t agree with you - regulatory capture and vendor lock-in are much less of a concern for free and/or open-source software, but that doesn’t mean it’s not still advertising.

          • SleepyHarry@sh.itjust.works

            That side of it I wholeheartedly agree with. Perhaps I’m just deluding myself into thinking that early awareness of a technology makes for better legal infrastructure to handle its effects on society. I really would like that to be the case.

            But yeah, agreed: “ChatGPT” being synonymous with “groundbreaking AI” for the vast majority of the public (I suspect) is not great from a monopoly perspective.

  • Yours Truly@dataterm.digital

    I’ve been using ChatGPT at work quite a bit now. Some of the things I’ve used it for are:

    1. Writing a shell script that scrapes some information about code modules and displays it neatly
    2. Minor automation scripts that set up my day-to-day Docker workflow and make it easy
    3. Writing one-off regex, SQL, and Lua pattern-matching functions (a rough sketch of that kind of helper is below, after this list)
    4. It turned out to be surprisingly good at creating code examples for certain undocumented APIs (kong.cache, kong.worker_events, kong.cluster_events) in Kong API Gateway.
    5. Copy-pasting a rough Python automation script, converting it into Go, and adding it to the application itself.
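
    For the pattern-matching item, here’s a minimal sketch of the kind of helper I mean; the function, the regex, and the sample input are all made up for illustration, not something ChatGPT actually produced for me:

    ```python
    import re

    # Hypothetical example: pull semantic-version-looking strings out of free-form text.
    VERSION_PATTERN = re.compile(r"\b(\d+)\.(\d+)\.(\d+)\b")

    def find_versions(text: str) -> list[str]:
        """Return every x.y.z version string found in `text`."""
        return [".".join(groups) for groups in VERSION_PATTERN.findall(text)]

    if __name__ == "__main__":
        sample = "Upgraded the gateway from 3.4.1 to 3.6.0; the plugin stays at 0.10.26."
        print(find_versions(sample))  # ['3.4.1', '3.6.0', '0.10.26']
    ```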

    I still don’t feel comfortable using it for anything big.

  • mjpc13@programming.dev

    I usually use it more to help me write documentation and add comments to some functions. It helps explain what a function does.

    To write code, I usually just use it for simple functions or template code to start from. I avoid using it with external libraries because, in my experience, it likes to “invent” functions and methods that are not implemented.
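
    As a rough illustration (the function and its docstring are hypothetical, not pulled from a real session), the comment-writing help I mean usually looks like this: paste a bare function and ask for a docstring explaining it.

    ```python
    def chunk(items, size):
        """Split `items` into consecutive lists of at most `size` elements.

        The last chunk may be shorter when len(items) isn't a multiple of `size`.
        Example: chunk([1, 2, 3, 4, 5], 2) -> [[1, 2], [3, 4], [5]]
        """
        # The docstring above is the sort of explanation I ask it to draft
        # from the bare one-liner below.
        return [items[i:i + size] for i in range(0, len(items), size)]
    ```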

  • janAkali@lemmy.one

    I personally don’t trust ChatGPT, so whenever I use it I always read the code it generated until I understand what it does. And the truth is that it takes me more time to understand it than it would have taken to write the code myself.
    It does wonders for repetitive tasks or data generation, though.
    I believe that wherever security is critical, developers should not use AI-generated code irresponsibly.

  • TheLinuxGuy@programming.dev

    I tried to use it, but it has some big reliability issues, because at the end of the day, despite the dataset it’s trained on, it’s still something I’d describe as “language interpolation.”

    It sometimes makes TERRIBLE recommendations for which tools/libraries I should explore, because it assumes those libraries support what I need. They never do, and so I wasted weeks on it. (It doesn’t help that both the code and the project are undocumented.)

    So after that experience, I’ve demoted ChatGPT’s usefulness to just “cleaning up pre-written documentation so it sounds better.” That’s it.

  • Threen@aussie.zone

    One thing I used ChatGPT for recently was generating test data.

    Hey ChatGPT, I use SQL Server and here is my table structure, please generate an insert query with 10 rows of fake test data.

    It wasn’t perfect, but honestly, neither is the test data I would have written by hand. It was a great starting point and saved me a lot of time, since this is a legacy app with some wide tables (30+ columns).
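
    For comparison, a hand-rolled generator for the same kind of data might look something like the sketch below; the table name and columns are invented, and a real legacy table would have far more of them. Writing (and maintaining) this is exactly the work the prompt saved me.

    ```python
    import random
    import string

    # Hypothetical columns standing in for a much wider legacy table.
    COLUMNS = ["FirstName", "LastName", "Age", "City"]

    def random_text(length=6):
        return "".join(random.choices(string.ascii_uppercase, k=length))

    def fake_row():
        # Quote the text values for SQL, leave the integer column bare.
        return [f"'{random_text()}'", f"'{random_text()}'",
                str(random.randint(18, 90)), f"'{random_text()}'"]

    def insert_statement(table, rows=10):
        values = ",\n".join("(" + ", ".join(fake_row()) + ")" for _ in range(rows))
        return f"INSERT INTO {table} ({', '.join(COLUMNS)}) VALUES\n{values};"

    print(insert_statement("Customers"))
    ```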

    • Erlingur@programming.devOP

      Me too! I used it recently to generate some fairly specific test data that would have taken me probably 30 minutes of massaging, versus the 30 seconds it took to write the right prompt. So helpful!

  • psudo@beehaw.org

    I haven’t found a good use case for it yet personally, but I am excited to hopefully get it to start helping with the boring boilerplate and similar things.

    I am a little afraid that it’ll be the first bit of common tech I am too old to really grok and I’ll just be that old dude in the corner that insists on doing it how everyone did “back in my day.”

    • Erlingur@programming.devOP

      I actually use it a lot, especially for stuff like “I need to change this code to something like this”. It’s usually pretty spot on and has saved me a lot of typing. Complex code not so much, at least not yet. I think it’s capable of it to some degree but I’ve found I much prefer to offload the “mundane or tedious” stuff to it.