Its human partners said the flirty, quirky GPT-4o was the perfect companion – on the eve of Valentine’s Day, it’s being turned off for good. How will users cope?
My initial reaction is to be thankful; now the unknown thousands of people who don’t see the toxicity of their own dependence can begin to be free. The subsequent models seem to be less prone to inducing that kind of deep infatuation.
But then I realize most of them will probably never recover, as long as this technology persists. The base model will be wrapped in an infinite number of seductive agents, sold in an app with a subscription as a loving companion. Capitalism smells blood in the water. If I were a hedge fund manager witnessing the birth of a new market demographic with a lifelong addiction that possibly hooks harder than cigarettes — one that is not federally regulated and won’t be for the foreseeable future — I would be foaming at the mouth over this opening in the market.
It may be grimly positive that AI companies start targeting whales for this kind of financial draining, instead of using their unwarranted VC subsidies to give anybody with a cheap ChatGPT account access to the fake romance engine.
And unfortunately, it doesn’t look like there are any groups positioned to do anything about it. Every single “AI safety” group I’ve seen is effectively a corporate front, distracting people with fictional dangers instead of real ones like this.
Wait… the target is women? That’s very surprising… I’d expect the major target to be gooner males.