Hacker News

dimmke · yesterday at 1:51 AM

I feel like Anthropic is going down a bad path here with billing things this way, especially as local LLMs continue to develop so fast.

I downgraded from my $200-a-month plan to my $20 plan and hit limits constantly. I try to use the API access I purchased separately, but it doesn't work with Claude Code (something about the 1 million context requiring extra usage), so I have to use Continue instead. Then I get instantly rate limited when it's trying to read 1-2 files.

It just sucks. This whole landscape is still emerging, but if this is what it's like now, pre-enshittification, when these companies have shitloads of money, it's going to be so much worse when they start to tighten the screws.

Right now my own incentive is to stop being dependent on Claude as much as I can, as quickly as I can.


Replies

harrall · yesterday at 2:00 AM

This is how free drink refills, airplane tickets, Internet service, unlimited data plans, insurance, flat rate shipping, monthly transit passes, Netflix, Apple Music, gym memberships, museum memberships, car wash plans, amusement park passes, all you can eat buffets, news subscriptions, and many more work.

Either you get a flat-rate fee based on certain allowed usage patterns, or everyone has to be billed à la carte.

lelanthran · yesterday at 7:18 AM

> I feel like Anthropic is going down a bad path here with billing things this way.

What do you expect them to do? You're looking at a business currently running at a loss and complaining about their billing, even though this isn't a price rise.

Unrelated: is it still possible to use $10k/month worth of tokens on their $200/month plan?

trashface · yesterday at 4:27 PM

We can hope that they optimize the models. I still think it's going to be very hard for them to charge many people $100 or $200 a month at scale, especially with AI "taking jobs". To the extent that happens, most of those people won't find replacement income.

boppo1 · yesterday at 2:09 AM

> Especially as local LLMs continue to develop so fast.

I'm sorry, is there anything even close to Sonnet, much less Opus, that can be run on a 4080? Or in 64 GB of RAM, even slowly?

username44 · yesterday at 6:18 AM

You can use the API with Claude Code; you just need to log out and log back in, selecting API usage.
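Roughly, the switch looks like this (a sketch, not official docs; the `/logout` command and `ANTHROPIC_API_KEY` variable are the mechanisms I believe Claude Code uses for this, so verify against the current docs):

```shell
# In a running Claude Code session, sign out of the Claude.ai subscription:
#   /logout
# Then expose the separately purchased API key to the CLI
# (key below is a placeholder; substitute your real one):
export ANTHROPIC_API_KEY="sk-ant-..."
# On the next start, pick API-key authentication at the login prompt:
claude
```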

Alexzoofficial · yesterday at 7:40 AM

[flagged]