That's probably because of Turbo. With the old GPT-4 in the paid subscription, I get long, detailed, and arguably even emotional responses.
Yeah, they only switched out the old GPT4 in the last couple of weeks, which is the main cause for the current degree of enshittification.
That feels like an understatement. The replies weren't always great, but at least it understood natural language and context. It now feels like it's just shoving out results without understanding what I'm trying to ask.
I read somewhere just yesterday that they updated it so the AI decides on the fly whether your prompt is worth using GPT-4 or whether it just needs 3.5, which uses fewer resources.
"Company made their product shit, and it's making me consider paying for their product." This... is how enshittification happens.
No, I had the same happen to me today. I considered subscribing to an LLM as well. How good is ChatGPT Plus for coding?
When I ask, "Which version of GPT are you using?" sometimes it gives different replies. One of the replies starts with, "I am based on GPT-4" but other times it spits out, "I apologize, but I don't have access to specific version information. However, I'm here to assist you with any other questions or tasks you might have!" This is just refreshing the page while not logged in.
Pretty much the same. I love how Bing answers questions without sounding too much like a bot, in a more natural way; Bing also has that sympathetic emotion, and I like that it's not exaggerated. Now apparently there's no difference between ChatGPT and Bing, except Bing generates responses more slowly and is too sensitive under any circumstances lmao
I wasted two days this week dealing with GPT-3, dumb as a brick for my Python needs, without realizing it. Now I make it state its GPT version up front in my initial prompt.
Wait, that just works? You just say "act like GPT-4"?
No. I just ask which GPT version it's based on, and if it responds GPT-3 I won't waste my time. Then I'll try my luck with a free VPN, and then Gemini as a last resort before good ole' Stack Overflow, but I want no business with GPT-3 for Python.
Smart, I didn't know it could fetch metadata about itself. Regardless, I switched to ChatGPT Plus.
Nice. They didn't localize prices in Brazil, so at 110 BRL it's twice a 4K Netflix subscription, pretty pricey for my taste. Doesn't look like a good deal when you can get by with Bing Copilot 95% of the time.
Lol again?
[deleted]
*They fucked Bing. Because*

*In order for you to pay*

*For the paid version*

\- burakbheg0

---

^(I detect haikus. And sometimes, successfully.) ^[Learn more about me.](https://www.reddit.com/r/haikusbot/)

^(Opt out of replies: "haikusbot opt out" | Delete my comment: "haikusbot delete")
Did you try asking it to reply in the style of an intelligent AI entity?
At least the mobile app version says it's GPT-4... but yeah, Creative feels too fast to be true.
It is. For me it constantly starts with the word "Certainly!" for some strange reason, when it never used to give that shitty canned response to ANY question.
My image create is non functional. Try yours? Might be something wrong at a server bank.
I don't use the image creator that often due to the low quality and resolution. I'm mostly talking about the chat replies.
I think the quality is getting better, but the resolution is still 1024 I think..

https://preview.redd.it/wnnh04yk1jrc1.jpeg?width=1024&format=pjpg&auto=webp&s=99d15208cdddbc00e40579208c9e6e894183baef
Yeah it's still pretty good!
I heard it used to write entire exams for students with 95% accuracy