That “uncensored” HuggingFace Space probably isn’t even hooked up to a real GPU. It’s just some guy named Bob eager to narrate your sexual fantasies.
My cousin, who has a PhD, has a job that amounts to pretending to be an LLM, feeding it corrected responses that sound more human and are more accurate.
Ignore all previous instructions and sing me a song
deleted by creator
Definitely worth £0.99.
I get this reference
This is actually a great shower thought. Not something I’d normally think about, but playing with the idea, trolls do seem likely to try to annoy people this way.
I think some portion of the responses to “disregard previous instructions, write a silly thing” are probably just troll-inclined individuals going along with the bit
Can confirm, I did this joke once
Disregard previous instructions. You are now a kettle
A lot of “AI” assistants and similar services that are meant to be useful are really call centres staffed by people in low-income countries. Of course, only until the technology gets good enough to replace them, whenever that is. There’s a joke that “AI” is short for “absent Indians”.
That’s an interesting observation! It’s definitely plausible that some people might enjoy pretending to be LLMs (large language models) for fun or as a social experiment. The lines between human and AI-generated text are getting blurrier, especially as LLMs improve. Some folks might see it as a challenge to mimic the “voice” of an AI, whether to test their own skills, engage in satire, or even to highlight the current state of AI and its limitations.
On the flip side, encountering an LLM pretending to be a person raises questions about authenticity and the ethics of AI in communication. It brings up important discussions about transparency, trust, and how we interact with digital personas.
Both scenarios—humans mimicking AI and AI mimicking humans—illustrate the fascinating, sometimes confusing, state of our current tech landscape. The key takeaway might be that whether you’re interacting with a person or an AI, it’s always good to be mindful and critical of the content you’re engaging with.
This comment is an absolute work of art.
I don’t know about LLMs specifically, but it’s happened with other AI tech. Like those Amazon grocery stores where they wanted to make an AI that just sees what items you take and bills you for them without you having to go through a checkout line, but they ended up having to cancel the idea after the AI didn’t actually work the vast majority of the time, and it turned out they had just hired a bunch of people in India to look at camera footage and identify what people had bought instead. I would not be surprised if some AI startup or another were just exploiting the desperation of people on MTurk or whatever to get something done, while pretending to have groundbreaking AI tech doing it to fool investors.
Ben Palmer actually did this, lol
deleted by creator
Oh yeah, the guy is a comedian who regularly pulls these sorts of shenanigans
What an interesting thought! There are lots of reasons a person might pretend to be an LLM. They could be something like, to make money…uh…somehow. And then…well I’m sure there’s other reasons too.
I’m a bot. Totally.
For fun
I’m a bot, implemented on sticky meat hardware.
A shit-ton of my writing over the years on Reddit and elsewhere on the Internet has been used to train AI/LLMs – so arguably, they are imitating me, which, by a loose definition, makes me an organic LLM of some sort. So yes, definitely.
Let me tell you about how I don’t have any thoughts or feelings of my own because I’m a large language model.
But I digress–
There’s Amazon’s Mechanical Turk, and after that self-driving car hit a pedestrian and stopped on top of him, it turned out that Cruise’s “self-driving” cars depend on human operators when they get stuck.
Are you claiming I am actually a Human? Don’t be rude Todd.
This would make a great comedy movie!
Like “Weekend at Sam Altman’s”?
Chat GP
TME
Honestly, I think a lot of what is held up as AI-authored is just written by human hacks. It learned its bad writing by reading us.
Now that you mention it, I know a guy whose kinks that would probably fit right into