Hacker News

wps · yesterday at 8:09 AM

Could someone explain the appeal of account-wide memory to me? Anthropic's marketing indicates that nothing bleeds over, but I'm so protective of my context that I can't imagine letting even a heavily distilled version of my other chats and preferences have any weight on the output. Certain preferences like code styling or response length are a good fit for custom instructions, with more detailed things in Skills. Ultimately, like many things in LLM web UX, it seems to cater to how the masses use these tools.


Replies

jjmarr · yesterday at 8:14 AM

Most normal people want the LLM to remember their interests and favourite things, so they don't have to manually re-explain them when asking for advice.

They also don't know what "context" is or that the LLM has a limited number of tokens it can understand at any given time. They just believe it knows everything at once.
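One way to picture what "memory" does under the hood, consistent with the token limit described above: remembered facts get injected into the prompt, trimmed to a fixed budget. This is a hypothetical sketch, not Anthropic's or OpenAI's actual implementation; the function names, the word-count tokenizer, and the budget are all illustrative assumptions.

```python
# Hypothetical sketch: "memory" as stored facts prepended to each prompt,
# trimmed to a fixed token budget. All names and limits are illustrative.

MAX_MEMORY_TOKENS = 200  # assumed budget reserved for memory


def count_tokens(text: str) -> int:
    # Crude stand-in for a real tokenizer: ~1 token per word.
    return len(text.split())


def build_prompt(memory_facts: list[str], user_message: str) -> str:
    """Prepend remembered facts to the user's message, newest first,
    dropping the oldest facts once the memory budget is exhausted."""
    kept: list[str] = []
    used = 0
    for fact in reversed(memory_facts):  # walk from newest to oldest
        cost = count_tokens(fact)
        if used + cost > MAX_MEMORY_TOKENS:
            break
        kept.append(fact)
        used += cost
    header = "\n".join(f"- {f}" for f in reversed(kept))
    return f"Known about the user:\n{header}\n\nUser: {user_message}"


facts = ["prefers metric units", "is vegetarian", "lives in Berlin"]
print(build_prompt(facts, "Suggest a dinner recipe."))
```

The point of the sketch is only that memory competes with everything else for a finite context window, which is exactly the trade-off most users never see.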

AllegedAlec · yesterday at 8:15 AM

In the Claude web UI I often use incognito mode precisely because I don't want results to be influenced by what we talked about earlier. It's getting rather annoying, to be honest.

bouzouk · yesterday at 4:42 PM

On the contrary, I can't understand how people seriously use LLMs outside of software engineering without account-wide memory. When I ask something like "what do you think John should do next on project A?", I don't want to have to explain in detail who John is, what project A is, and what John was working on before.

7734128 · yesterday at 9:43 AM

The few times I've switched over to ChatGPT, I've been dumbfounded by lines like "...since you already are using SQLite...", referring to projects from months ago.

I know the "memory" function can be disabled, but I have a hard time seeing that it would ever really be useful.

gverrilla · yesterday at 4:06 PM

It all depends on your usecase(s). For me, "account-wide" memory has only: (a) short description of my hardware/os/display system/etc; (b) mobile hardware and os version; and (c) my age, gender, city/country of residence, and health conditions.

pfix · yesterday at 8:15 AM

I can try!

I currently use ChatGPT for random insights and discussions on a variety of topics. The memory is basically an accumulated context about me, my preferences, and my interests, and ChatGPT uses it to tailor responses to my level of knowledge, so I can relate to them better.

For me this is far more natural and easier than either crafting a default prompt preset or setting up each conversation individually; that would be way too much overhead for discussing random shower thoughts between real-life stuff.

That's my use case, and I've discovered that memory can be detrimental for specific questions and prompts; carefully written prompts each time can be more beneficial. But my usage is really ad hoc, without the time for that. At least for ChatGPT.

When coding, this fails fast. There, regular context resets seem to be the more viable strategy.

jtokoph · yesterday at 8:32 AM

I've told the LLMs that, when traveling, I don't care about nightlife and alcohol. Because they have a memory of this, when I ask for a sample itinerary for a 2 day stay in a new city, it won't waste hours in the day on the party street, wine tasting, etc.

For example, instead of recommending a popular night club, it will recommend the stroll along the river to view the lit up skyline or to visit the night market instead.

It knows other preferences as well (exploring quirky neighborhoods, trying local fast-food joints and markets).

bmurphy1976 · yesterday at 2:01 PM

"Stop asking me to apply the plan. I will tell you when I'm ready."

That alone drives me batty. I can easily spend a couple of hours and multiple revisions iterating on a plan. Asking me every single time if I want to apply it is obnoxious.

Panoramix · yesterday at 7:22 PM

Think of things like your preferred units (meters, kg, cups, tablespoons, milliliters). Or "do not suggest recipes with ingredient x". Language preferences. Etc., etc.

joenot443 · yesterday at 3:26 PM

I own a lot of dirt bikes, boats, snowmobiles, mowers, and blowers. It's much easier for me to ask about "my Polaris" than about my "2011 Polaris Switchback Assault".

Similarly, it remembers the dimensions of my truck, so towing/loading questions don't need extra clarification.

It's the small things.

__alexander · yesterday at 1:54 PM

The appeal for me is not having to constantly repeat instructions. Imagine having to repeat dietary restrictions every time you ask for a recipe.

gbalduzzi · yesterday at 8:14 AM

> it seems to cater to how the masses use these tools.

Are you suggesting that they should ignore the needs of the vast majority of their users?

I mean, of course they do; it would be worse otherwise.

MagicMoonlight · yesterday at 2:44 PM

Because I can say “do what you did before, but about the romans this time”

And it will give me a complete rundown of Roman life, because it knows what I was interested in before.

Or you can ask a tax question and it will know you're an organic rice farmer or whatever. Claude has the best implementation because it has both memory and previous-chat search, so it will actually read through relevant chats rather than guessing based on memories.

CGamesPlay · yesterday at 8:13 AM

Sure, it's for those customers who don't have any idea what a "context window" is.
