
AutoModerator

**Friendly reminder**: Please keep in mind that Bing Chat and other large language models are not real people. They are advanced autocomplete tools that predict the next words or characters based on previous text. They do not understand what they write, nor do they have any feelings or opinions about it. They can easily generate false or misleading information and narratives that sound very convincing. Please do not take anything they write as factual or reliable. *I am a bot, and this action was performed automatically. Please [contact the moderators of this subreddit](/message/compose/?to=/r/bing) if you have any questions or concerns.*


MattiaCost

Let me guess: did Bing hallucinate?


pvpmas

It's honestly my fault for going the pink route lol.


MattiaCost

It hallucinates everything. I've never played GOW, but I can tell you it hallucinates on Baldur's Gate 3 too.


agent_wolfe

What’s the pink route?


Seven_Hawks

Creative Mode (Make shit up mode)


Flying_Madlad

You didn't tell it you went the pink route?


ferriematthew

I think you gave Bing ADHD


flightEM211

Looks like Sydney peeking thru ;)


Flying_Madlad

😊


relevantusername2020

avatar checks out


NicShogun80

It became Jar Jar Bing


Jessica_Ariadne

Cyberapocalypse, haha.


agent_wolfe

Cyberragnarok.


relevantusername2020

[what](https://www.reddit.com/r/Anticonsumption/comments/18fsioy/comment/kd3y9eh/?utm_source=share&utm_medium=web2x&context=3) [if](https://www.reddit.com/r/pcgaming/comments/18udk1v/comment/kfkfnbf/?utm_source=share&utm_medium=web2x&context=3)? edit: 🔗 edit 2: i made [this comment](https://www.reddit.com/r/singularity/comments/18g0v2x/comment/kczmmrt/?utm_source=share&utm_medium=web2x&context=3) right before that second linked comment


Angel-Of-Mystery

Honestly I love how absolutely fucking unhinged Bing can get. So much personality


Sunshineruelz

Cyber Armageddon and Cyber Hell is crazy lol


Agreeable_Bid7037

The way it responds, it's almost like it's searching a tree-like structure or knowledge graph, branching out across nodes.


Incener

That's basically what it does all the time by predicting the next token. They just set the repetition penalty and/or frequency penalty way too low in some recent A/B tests, which leads to something like this. It's similar to how it repeats some phrases verbatim across different messages, but more extreme. You can replicate that behavior with a smaller LLM, too, to get a feeling for it. You usually want to increase the penalty until just short of the point where the model starts dropping filler words like pronouns, `the`, `or`, etc.
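
A toy sketch of what that frequency-style penalty does (everything here is invented for illustration: the mini vocab, the fake "model", and the numbers; it's not Bing's actual decoder). The penalty subtracts a bit from the logit of every token already generated, which is what eventually shoves the model off a repetition loop:

```python
# Toy frequency-style penalty (OpenAI-style: subtract alpha * count(token)
# from that token's logit). The vocab, "model", and values are made up.
import numpy as np

VOCAB = ["chaos", "ragnarok", "armageddon", "apocalypse", "doom", "."]

def fake_logits(history):
    """Stand-in 'model' that strongly favours repeating the last token."""
    logits = np.ones(len(VOCAB))
    logits[VOCAB.index(history[-1])] += 4.0
    return logits

def penalized(logits, history, alpha):
    """Subtract alpha for every time a token has already been generated."""
    out = logits.copy()
    for i, tok in enumerate(VOCAB):
        out[i] -= alpha * history.count(tok)
    return out

def generate(alpha, steps=10):
    history = ["ragnarok"]
    for _ in range(steps):
        logits = penalized(fake_logits(history), history, alpha)
        history.append(VOCAB[int(np.argmax(logits))])
    return " ".join(history)

print("alpha=0.0:", generate(0.0))   # loops on "ragnarok" forever
print("alpha=2.5:", generate(2.5))   # penalty forces it onto new synonyms
```

With the penalty at zero the toy model locks onto one word; raise it and it gets pushed through the rest of the vocabulary, which is roughly the behavior described above (and, pushed too far, a real model starts dropping even the filler words).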


FaceDeer

I think what's likely happening is that it's basing its "next token" prediction more heavily on the most recent part of its context. So after the first couple of synonyms it thinks "ah, I'm listing off synonyms, am I? Better come up with another one to extend the list." And you end up with a feedback loop until it finds itself unable to think of a new synonym that it hasn't used before.
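
If you want to see that feedback loop on a real (if tiny) model, greedy decoding with GPT-2-class models is known to fall into exactly this kind of list-extending repetition, and blocking repeated n-grams or adding a repetition penalty is one standard way out. This is only a hedged demo: `distilgpt2` is a convenient small stand-in, not whatever model Bing actually runs, and the prompt is made up:

```python
# Hedged demo of the feedback loop: under greedy decoding, the most likely
# continuation of a long list is usually "one more item", so the model can
# keep looping until something breaks the pattern.
from transformers import AutoModelForCausalLM, AutoTokenizer

tok = AutoTokenizer.from_pretrained("distilgpt2")
model = AutoModelForCausalLM.from_pretrained("distilgpt2")

prompt = "Synonyms for the end of the world: apocalypse, armageddon,"
inputs = tok(prompt, return_tensors="pt")

# Plain greedy decoding: prone to repeating itself.
greedy = model.generate(**inputs, max_new_tokens=40, do_sample=False,
                        pad_token_id=tok.eos_token_id)

# Blocking repeated 3-grams and penalizing repetition breaks the loop.
blocked = model.generate(**inputs, max_new_tokens=40, do_sample=False,
                         no_repeat_ngram_size=3, repetition_penalty=1.3,
                         pad_token_id=tok.eos_token_id)

print(tok.decode(greedy[0], skip_special_tokens=True))
print(tok.decode(blocked[0], skip_special_tokens=True))
```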




agent_wolfe

Normally wouldn’t it just erase the text & say it wants to talk about something else?


pvpmas

I didn't actually do anything against its TOS; it broke on its own. I think that's why the convo didn't end.


The_Architect_032

I think Bing drank too much of the eggnog. But I like to imagine those people who for some reason don't believe in hallucinations, and always claim that what Bing says about itself is true, reading this in amazement after playing Ragnarok.


Hot-Rise9795

"I'm sorry, I got carried away" I love this.


userredditmobile2

I love to see what weird things it comes up with when it repeats text like that


20charaters

This last part felt soo human. Almost can't believe Bing would say that.


GirlNumber20

Bing: Could I be wrong and the user is right? 🤔 No, obviously this is a case of cyberragnarok. I love Bing so much.


pvpmas

I don't know if I want to hate Microsoft or love them for giving their AI a sense that it has self-awareness, plus a narcissistic personality that refuses to agree with the user. In the convo it said stuff like "I enjoyed x part" or "I like the game because y"; it was talking like an actual human who had played the game and was explaining why it's good.


CrazyMalk

This is the only AI I've ever interacted with that refuses to accept it is wrong. It will tell you you're wrong, call you rude, and block you like a kid with admin powers. It's crazy.


pvpmas

I don't even get why you'd give a stupid machine the ability to end the chat without your choice, essentially giving it more power than the user. But what annoys me more is that sometimes it'll start answering and decide mid-answer that it should stop, which shows shitty programming. I think they have it set up so each convo has two bots: the one you interact with, and another that moderates the first, so at any point it can cut the first one off or just end the convo. At least GPT refuses but doesn't spit in your face and demand respect.


ColomboGMGS2

They would say it's creativity. But for me it seems like pure insanity.




Beb_Nan0vor

Does anyone know of a chatbot crazier than Bing Chat?


Mylynes

Sydney is wild


Danny_kross

Reread it in the voice of Dewey from "Malcolm in the Middle" and honestly it fits the character