One just needs to seed enough poisoned content across the internet to get a malicious URL suggested by LLMs. What a time to be alive!
Honestly, I may be an accelerationist when it comes to poisoning the LLM well, if it gets us to an industry-wide consensus sooner that LLM output is a significant security risk.