FWIW, I've gone to ChatGPT multiple times with a specific intent to buy, like "hey I need a thing like X or Y, but with quality Z too" and sometimes it just hallucinates things that appear never to have existed, other times it comes up with real items, but the links it gives me to buy them lead nowhere, so I end up just googling the name of it and buying it that way (examples: computer monitor, power bar, USB charge station, kitchen gadgets, Christmas presents/toys, soldering supplies like tips and flux, 3D printing filaments, etc).
I would guess that ChatGPT has left at least $100 on the table from me having to do this when literally all it had to do was give me a referral link to Amazon or whatever and I would have clicked the buy button.
I have similar experiences. I asked it to find and list a bunch of suppliers near a specific city. It started showing me places that were >5 hours away, claiming they were a "short drive".
> I would guess that ChatGPT has left at least $100 on the table
Man, this thing is going to be so lucrative when they inject ads into it. Imagine how this is going to combine with the parasocial AI boyfriend/girlfriend people; it's going to be worse than hostess clubs. They'll have to invent whole new categories of nonexistent products for the bots to sell.
Same experience. I thought using ChatGPT to find some fairly specific things to buy would be a slam dunk, but it couldn't provide links half the time and also failed to stick to criteria like shipping region etc. I would tell it to give direct links and it would mostly just say "go on Amazon and search for X".
There’s a special type of frustration when an LLM is close to being useful but just… isn’t.
> I’ve gone to a machine that by its nature hallucinates and it hallucinated a response. Surprised Pikachu face
Why do you still trust the output for any other questions?
Use Deep Research and try to be as specific as you can with the attributes of said product. I’ve had a few successes this way.
Clicking on any picture should present a frame on the right with a bunch of options.
If it can't even point you to a real product on an existing website, why do you trust it for any other information..?
Once they start selling things, are you actually going to trust them to suggest the item you're looking for over the item they get paid to sell you?
What in the history of Sam Altman has led you to believe he'll do the right thing instead of the thing that makes him the most money?