logoalt Hacker News

dalemhurleytoday at 7:21 AM4 repliesview on HN

Theoretically yes. It is entirely possible to poison the training data for a supply chain attack against vibe coders. The trick would be to make it extremely specific for a high value target so it is not picked up by a wide range of people. You could also target a specific open source project that is used by another widely used product.

However there is so many factors involved beyond your control that it would not be a viable option compared to other possible security attacks.


Replies

2ndorderthoughttoday at 10:47 AM

I believe this is possible but unlikely. I don't think a Chinese company trying to break down the US's stronghold in this field would do this short term. I think it is in their best interest to be cheaper, better, easier, and more trust worthy until competition looks silly.

It's like suggesting BYD has a high likelihood of making their cars into weapons or something. It's not in the company or their countries interest to do that.

Sure it could happen but I bet it would only happen in a targeted way. Why risk all credibility right now and engage in cyber warfare?

show 1 reply
mazurnificationtoday at 8:08 AM

But propaganda or non ethical marketing - why not? (That is bias toward pointing to certain provider(s)).

wallst07today at 10:52 AM

or more obvious like TikTok.

Meaning Tiktok in the us is complete garbage for kids, almost like a virus. Whereas in China it's more educational.

_blktoday at 8:56 AM

Would be interesting to hook up a much simpler LLM as fact checker to see when errors are introduced.

If I had to place a hidden target it'd probably be around RNGs or publicly exposed services..