It’s clear that companies are currently unable to make chatbots like ChatGPT comply with EU law when processing data about individuals. If a system cannot produce accurate and transparent results, it cannot be used to generate data about individuals. The technology has to follow the legal requirements, not the other way around.

  • gedaliyah@lemmy.world (OP, mod) · 7 months ago

    Or, on the other hand, maybe we have to admit that these technologies were released before they were finished, and that was a dangerous decision. It’s now been well documented that ChatGPT and similar technologies were rushed to the public against the advice of some of their developers.

    The developers will need to devise ways for the LLMs to understand their own training data.

    • db0@lemmy.dbzer0.com · 7 months ago

      LLM tech is not rushed. The models are not built for accurate information, and trying to use them that way is out of their scope. What’s rushed is corpos trying to use them for search.