
ArtTimeInvestor, today at 10:30 AM

It is a step in the right direction.

Over time, more and more work is going to be done by AI though. At some point, it will be unthinkably slow and expensive to let humans work on anything.

To do *that* locally, you need GPUs and LLMs.

How will Europe solve these two?


Replies

Joeri, today at 10:44 AM

The EU chips act is subsidizing new fab construction in Europe.

Meanwhile, the French company Mistral is partnering with Nvidia to build an AI data center near Paris on which its LLMs will run.

But I agree this is not enough to make the EU a contender in the race with the US and China. The EU still has not seriously considered decoupling from American big tech.

arter45, today at 2:01 PM

Not all AI uses LLMs, and for some common LLM applications, like summarization and translation, you can already use CPU-only models. The government, or even your average employer, is not going to need much AI video generation or other really GPU-intensive work. Prompt processing is currently more GPU-oriented, but I don't see it as an insurmountable challenge over, say, 10-15 years.

Also, CPU-only doesn't necessarily mean "on your own computer". You can easily have 100 TB of RAM in a couple of racks.
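To make that figure concrete, here is a rough back-of-envelope sketch of how many quantized model instances 100 TB of RAM could host. The byte-per-parameter and overhead figures are my assumptions, not from the thread:

```python
# Back-of-envelope: RAM needed to host quantized LLMs on CPU-only racks.
# Assumptions (not from the thread): 8-bit quantization ~ 1 byte per
# parameter, plus ~20% overhead for KV cache and activations.

def model_ram_gb(params_billion: float, bytes_per_param: float = 1.0,
                 overhead: float = 0.2) -> float:
    """Rough RAM footprint in GB for one model instance."""
    return params_billion * bytes_per_param * (1 + overhead)

rack_ram_tb = 100  # the commenter's "couple of racks" figure
# A hypothetical 70B-parameter model at 8-bit needs ~84 GB per instance,
# so ~1190 instances fit in 100 TB under these assumptions.
instances = int(rack_ram_tb * 1000 / model_ram_gb(70))
```

Throughput on CPU would of course be far lower than on GPUs, but for batch workloads like summarization the capacity math is not the bottleneck.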

tonyedgecombe, today at 1:05 PM

Do you people have to squeeze a comment about AI into every post?

m_mueller, today at 10:35 AM

I think it depends on how far compression advances, i.e. how much can be done locally in the future. I'd be interested in others' experiences here with Gemma4, which is currently at the forefront of "intelligence per gigabyte" (according to benchmarks).

ErroneousBosh, today at 10:32 AM

No-one needs LLMs.

AI has no value.
