I sometimes wonder if there are any security risks with using Chinese LLMs. Is there?
Theoretically, yes. It is entirely possible to poison the training data as a supply-chain attack against vibe coders. The trick would be to make it extremely specific to a high-value target so it is not picked up by a wide range of people. You could also target a specific open source project that is used by another widely used product.
However, there are so many factors involved beyond your control that it would not be a viable option compared to other possible security attacks.
If there is, couldn't they exist in any model?
I don't mean that flippantly. These things are dumped in the wild, used on common (largely) open source execution chains. If you find a software exploit, it's going to affect your population too.
Wetware exploits are a bit harder to track. I'd assume there are plenty of biases based on training material, but who knows whether these models have an MKUltra-style training programme integrated into them?
Backdooring software at scale.
Spearphishing.
Building reliance and exploiting it, through state subsidies, dumping, and market manipulation.
Handicapping provision to the west for competitive advantage.
What about LLMs from other origins? What makes them less risky?
I sometimes wonder if there are any security risks with using LLMs from the US.
From my experience, it's kinda the opposite? It's like Chinese software is... harder to weaponize or hurt yourself on. Deepseek is definitely censored, but I've never caught it being dishonest in a sneaky way.
There must be. The executives at my company wouldn't have banned them all for no reason after all.
Is this a serious comment? It honestly reads like famous last words.
Of course there are risks.
All China (or anyone) has to do is deliver a close-to-equal product at a much cheaper price and make it scalable and usable... which is what they're doing. It doesn't have to be malicious at all. Just a good product at a good price. The US is basically in a recession that's hiding behind insane AI investments.