2 Matching Annotations
- Jul 2023
-
deliverypdf.ssrn.com
-
they are most common when asking for quotes, sources, citations, or other detailed information
When hallucination is most likely to appear in LLMs.
-
- Jun 2023
-
-
Now that the open-source community is remixing LLMs, it’s no longer possible to regulate the technology by dictating what research and development can be done;
There is an analogy with open-source security: how to ensure that open-source code, which ends up being an integral part of commercial products, is secure from the outset. It may not be possible to hold every code-writer in the open-source community accountable for vulnerabilities, but there are moments later on when that code is picked up and commercialised by others, and these open a window of accountability. It is similar with LLMs: accountability becomes possible when a model is picked up by others, often for monetisation or some other benefit.
-