> I think you are confusing the spec as "this is how it must be built", as opposed to, "this is what the software must do and must not do to be acceptable".
You can't enforce a "do not do this" constraint on an LLM. Merely putting "don't do this" in the context makes it more likely that the model will eventually do exactly that.
Yes, I agree. If you tell humans "do not think of pink elephants", they become more likely to think of pink elephants.
Therefore, you must not use humans for any important work.