The fact is, currently, AI can’t write good code. I’m sure it will at some point in the future - but we’re not there yet, and probably won’t be for some years still.
Imagine a point in the future where an AI can program any piece of software you want, and do it well. At that point, the value of the code itself will be minimal. If you keep your code proprietary, I’ll just have the AI re-implement the functionality from scratch and publish it.
Therefore, all code will effectively be permissive open source. There would be no point in keeping anything proprietary, and no point in applying copyleft either: the copyleft “hack” exists to keep derivative works open, and if anyone can have equivalent code regenerated on demand, that protection is no longer needed. Permissive open source would be just as good.
Until then, my not using AI doesn’t in any way prevent others from training AI on my code. So I just don’t see training on my code as a valid reason to avoid it. I don’t use AI currently - but that’s for entirely pragmatic reasons: I’m not yet happy with the code it generates.
That’s a retaliatory tariff. Meta broke the law, and the EU retaliated.