logoalt Hacker News

holodukeyesterday at 9:32 PM0 repliesview on HN

With LLM tool use potentially every cat action could be a prompt injection