2 Matching Annotations
  1. Jul 2023
1. they are most common when asking for quotes, sources, citations, or other detailed information

When hallucinations are most likely to appear in LLMs.


  2. Jun 2023
    1. Now that the open-source community is remixing LLMs, it’s no longer possible to regulate the technology by dictating what research and development can be done;

There is an analogy with open-source security: how to ensure that open-source code, which often becomes an integral part of commercial products, is secure from the outset. It may not be possible to hold every contributor in the open-source community accountable for vulnerabilities, but there are later moments, when that code is picked up and commercialised by others, that open a window of accountability. The same applies to LLMs: accountability arises at the point where the code or model is taken up by others, often for monetisation or some other benefit.
