Hacker News

pbgcp2026 · yesterday at 8:45 AM · 9 replies

I'm sorry to spoil it for you, but a Perl script was able to do all of that like... 10 years ago? Out of the box, Shotwell manages photos quite well without any intelligence. The problem, as people mentioned above, is SOTA models' cognitive and tooling abilities. Also, have you noticed how top-end Mac Studios got downgraded recently? They don't want you to have access to frontier models. And you will not have it. See Mythos as Exhibit A.


Replies

jclardy · yesterday at 10:37 AM

The Mac Studio's disappearance is related to the fact that people now want them for running local models. Supply and demand. Plus, Apple doesn't shift prices on released products, so it essentially became underpriced when large RAM quantities exploded in price. For the price of 512GB of RAM alone you could get an M3 Ultra with 512GB of unified memory in a nice, quiet, power-efficient package. With the bare RAM you'd still need to spend a few thousand more on a CPU/GPU, power supply, storage, and case.

Also, they know an M5 version is coming and that it will likely sell out on day one (I expect we'll see a price correction from Apple for higher-end configs of M5 Studios; the base price will probably stay the same), so they need to build up stock reserves.

IMTDb · yesterday at 12:28 PM

> They don't want you to have access to frontier models. And you will not have it. See Mythos as Exhibit A.

"They" know full well that their current frontier models are maybe 6 months ahead of what people will have access to outside their control. See DeepSeek as Exhibit B.

The reason you can't run these locally has more to do with the fact that those Mythos-sized models require extreme amounts of memory and processing power to run at acceptable speeds. And neither you nor I can afford the resources to run those models locally. A big reason is that "running locally" means running on your own hardware, and for almost everyone that means hardware that will spend a big portion of its time just sleeping. Because data centers and providers have higher utilization rates, they can easily outpace you. That, and the fact that when they place an order, it's usually for hundreds of thousands of units.
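The utilization argument can be sketched with some back-of-the-envelope math. All the numbers below (hardware cost, lifetime, utilization rates) are made up for illustration, not real figures:

```python
# Back-of-the-envelope utilization math with made-up illustrative numbers.
# A hobbyist and a provider buy the same hypothetical $50,000 inference box.
CAPEX = 50_000              # assumed hardware cost in dollars
LIFETIME_H = 3 * 365 * 24   # amortize over an assumed 3-year lifetime

home_util = 0.05  # assumption: a personal box is busy ~5% of the time
dc_util = 0.70    # assumption: a provider keeps the same box ~70% busy

# Hardware cost per hour of *useful* work, ignoring power and cooling.
home_cost = CAPEX / (LIFETIME_H * home_util)
dc_cost = CAPEX / (LIFETIME_H * dc_util)

print(f"home:     ${home_cost:.2f} per busy hour")
print(f"provider: ${dc_cost:.2f} per busy hour")
```

At these assumed rates the provider's hardware cost per useful hour comes out roughly 14x lower, which is the "sleeping hardware" point in a nutshell.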

zigzag312 · yesterday at 11:33 AM

> The out-of-the-box Shotwell manages photos quite well without any intelligence.

This piqued my interest in how it does it, and after briefly checking the project, it seems it has only two features for automatic photo categorization: 1) it can group photos by date, and 2) it has face detection and recognition that uses trained weights (so ML "intelligence").
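The date-grouping part genuinely needs no intelligence. A minimal sketch of the idea (my own illustration, not Shotwell's actual code; it uses file modification time as a stand-in for the EXIF capture date a real photo manager would read):

```python
# Group photo files into "events" by calendar date.
# Illustrative sketch only; real photo managers parse EXIF capture dates.
import os
from collections import defaultdict
from datetime import date

def group_photos_by_date(paths):
    """Map 'YYYY-MM-DD' -> list of file paths, keyed on mtime."""
    groups = defaultdict(list)
    for path in paths:
        day = date.fromtimestamp(os.path.getmtime(path)).isoformat()
        groups[day].append(path)
    return dict(groups)
```

Face recognition is the only piece that actually needs trained weights.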

woctordho · today at 5:07 AM

APIs for Mythos and GPT Cyber are circulating on the market (that's also why we can use Claude and GPT in China). The open-source community has been advancing subscription engineering for a long time, and I don't think Anthropic or OpenAI have any technical advantage in this field.

tjoff · yesterday at 10:36 AM

Do we even have decent OCR nowadays? Any free solutions?

JonGretarB · yesterday at 10:18 AM

Huh? Why would Apple not want you to be able to run local models? They have very deliberately stayed the hell away from this space.

Hamuko · yesterday at 9:29 AM

> Also, have you noticed how top-end Mac Studios got downgraded recently? They don't want you to have access to frontier models. And you will not have it.

Isn't that a function of RAM supply not being available right now?

ubercore · yesterday at 9:31 AM

The conspiracy angle here isn't really relevant. RAM is expensive and they're gearing up for M5 Studios. It's not the Illuminati keeping better LLMs out of your hands.

raincole · yesterday at 10:48 AM

You think Apple doesn't want you to use local models?

That's an interesting way to view the world. Utterly stupid, I mean, but interesting.

But the previous sentence is even stupider (a Perl script 10 years ago could write code like Qwen does now?), so I guess at least it's consistent.