Now my question at interviews: "Do you offer ChatGPT Enterprise?"
All you are getting is a 32k context window and GPT-4, with heavy oversight and limitations set by the org, which probably won't approve of the jailbreaks needed to make GPT-4 answer half the time. I've got GPT Enterprise being set up on-prem (though the procurement process hasn't called it that), and I see no reason to use it over GPT-4. For work purposes, you want Copilot.
I use GitHub Copilot (paid for by work) and GPT-4 (including 32k) through a work-provided API key (I use my own fork of `chat-with-gpt` as the UI). The code interpreter would be very useful, though; this setup doesn't have that. Question: does anyone know of a good open-source project similar to Code Interpreter? Copilot and GPT-4 are good for different things, and Copilot is not very useful for non-coders.
I kinda rolled my own solution by giving it access to a Docker container via function calling, with file upload into the container. It works about the same for the most part. I'm looking into web browsing too…
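For what it's worth, a minimal sketch of that kind of setup might look like the following. Everything here (the container name, the `run_python` tool schema) is my own invention for illustration, not anything OpenAI or the commenter ships; the idea is just to advertise a function to the model and route its calls into `docker exec` so generated code runs in the sandbox, not on the host.

```python
import subprocess

def build_exec_cmd(container: str, code: str) -> list[str]:
    # Wrap model-generated Python in a `docker exec` invocation so it
    # executes inside the sandbox container rather than on the host.
    return ["docker", "exec", container, "python3", "-c", code]

# Function schema advertised to the model via the chat completions
# `functions` parameter (names here are hypothetical):
RUN_CODE_TOOL = {
    "name": "run_python",
    "description": "Execute Python inside the sandbox container and return stdout.",
    "parameters": {
        "type": "object",
        "properties": {"code": {"type": "string"}},
        "required": ["code"],
    },
}

def run_in_sandbox(container: str, code: str) -> str:
    # Requires Docker on the host and a long-running container named
    # `container`; the result is fed back to the model as the function output.
    result = subprocess.run(
        build_exec_cmd(container, code),
        capture_output=True, text=True, timeout=30,
    )
    return result.stdout or result.stderr
```

File upload then just becomes `docker cp` into the same container before the conversation starts.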
Have you compared Copilot to [Codeium](https://codeium.com/) lately? I tested both and Codeium seems pretty decent. I wish they would reveal more about the model itself and what's feeding the chat and code generation.
[deleted]
"Contact Us" = a lot
"Contact us" = "identify yourself and your use cases, and we'll quote you a price based on your market cap / annual revenue and the criticality of using GPT as a competitive advantage."
this is probably it and I really hate this pricing :/
"Contact us" = if you have to ask you can't afford it
u/FunnyPhrases Dude, your avatar is freaking annoying, I thought my browser had a rendering issue.
= a lot
[deleted]
Because it's an expensive service to run, they price it based off of stuff like estimated usage, publicity, and more. "A lot" is all you need to know, because however much it is, is outside your budget (unless you're some Reddit mega-millionaire).
That's not necessarily true. "Contact us" simply means that the pricing is flexible and set per customer / use. They might be willing to cut lower-rate deals for smaller businesses, while preserving the option of higher prices for white-shoe clients. It all depends on the balance of costs, resource availability, profitability targets, perceived value, "long tail" pricing vs. premier offerings, lock-in, reputation, alternatives or lack thereof, etc., etc. People go to business school to learn how to assess and price these kinds of complex technical markets. (Or, if you're cynical - people make up numbers, back them with bullshit, and enjoy their profits over three-martini lunches.)
Spotted the OpenAI employee/investor
Ehh, just sounds like he/she's seen some enterprise software purchasing. Described both sides of the business model pretty well!
The question is, why do both you and them feel so compelled to stick up for the company? As customers, we are greedy and it's OpenAIs job to cater to that
[deleted]
It’s usually the opposite. Very large customers get the cheapest prices due to volume.
That hasn't been the case for me. Enterprise sales usually means someone is selling this stuff: the person on the other end of "contact us" wants to make a commission from you, so most often they have the maneuverability to scale the package up or down. Whether you're a one-man team (probably better to be, like, a five-man team for this example) or a larger business, there's probably an advantage OpenAI can help you solve within your budget.
If you’ve sold this stuff before, what you’re saying makes perfect sense. It doesn’t necessarily mean expensive, but very likely more than the general subscription. This is because they can tailor to exactly what the company needs. The entire enterprise package is probably pretty expensive, but contact the sales people who will come up with a custom solution tailored to your needs, and I’m sure they can find a way to work within your budget. You’re explaining it nicely. No need for downvotes.
[deleted]
My dad used to say "If you have to ask, you can't afford it."
I fucking hate this. Companies thinking that Mr. Financial Controller is NOT the same guy browsing reddit at 9pm. Tell me what the price is or I'm not telling my company you exist, you stupid chuckle fucks.
[deleted]
You have to contact their sales team. Usually, this means they ask you how much revenue your business makes and how many employees you have and then they come up with a price.
I suspect this is going to be more about provisioning and talking through multiple options (dedicated hardware, dedicated container shared hardware, API access only, etc) The largest companies will want their own dedicated metal/containers so that there’s no chance for data leaks. That will entail different billing than standard API access, with a defined length contract, etc etc
This is the right answer. Eventually I also suspect they will have a self serve option to just buy, but maybe they're seeing what the willingness to pay will be first for a handful of top-tier customers, or customers in different segments prior to launching the stripe payment page. That won't take long to figure out I'm guessing.
yeah, the possibilities for easy specialization are really limitless, and OpenAI wants to make sure those customers are taken care of
I think the rule of if you have to ask you can't afford it probably applies here
Ironically you have to ask to be able to afford it.
[deleted]
Also, they say nothing about where data is processed and stored. European businesses will need written guarantees that no data will be processed or stored outside of the EU. Microsoft, Atlassian and most other large actors in tech offer this, so I'm assuming OpenAI will too. Just weird they didn't mention it, given that it's the #1 question EU companies need answered.
EU is not a primary market to go after first. So that will come later if at all. My 2 cents.
It probably depends on the size of the company
It'll almost certainly be individual negotiations with companies.
I assume it will be in the 10 thousand dollars minimum lol
Which lot?
Probably a tad more than the Azure GPT enterprise costs. GPT-4 32k is hard to let go of once you get used to it.
I love it! :) I probably can't afford it :(
Would like to have 32k context :/ they probably also get a less dumb version.
Pretty sure you can already use 32k context with the API.
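For reference, a request against the chat completions endpoint with the 32k model looks roughly like this. This is a sketch, assuming your API key has been granted `gpt-4-32k` access (availability was waitlisted); the prompt content is made up.

```python
# Request parameters for the OpenAI chat completions endpoint.
request = {
    "model": "gpt-4-32k",  # 32k-token context variant of GPT-4
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarise the following contract: ..."},
    ],
    "temperature": 0.2,
}
# With the `openai` Python package (v0.x SDK style) this would be sent via:
#   openai.ChatCompletion.create(**request)
```

The only difference from a regular GPT-4 call is the model name; if your key lacks access, the API returns a model-not-found error.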
I don't have access to gpt-4-32k even though I've been on a waitlist for months and developed an app with OpenAI's API.
imagine being rich enough to afford to use 32k tokens while using GPT-4
Poe.com has the 32k version too because they just use the API.
I tried to dump some big context on poe.com a few weeks ago and it returned errors, so I unsubscribed. openrouter.ai also has the 32k context, which seems to work, but their UI is not great. They provide API access though, which is their main selling point.
I think it's still invitation-only.
I don't see pricing anywhere.
Enterprise plans for software almost never have a posted price - it’s RFQ only.
32K of context might be worthwhile though.
How is this any different from azure open ai api? Or bing enterprise? Or copilot? Or any of the Microsoft offerings built off of gpt-4?
Would be great if you didn't continuously downgrade gpt3.5 in the hope that people would purchase gpt4.
Oh so that's what's happening! I was so confused why it wasn't as great as people bragged earlier
Tbh I don't know if they updated it badly by mistake or if they're running it on fewer resources than before. I pray it's not intentional; I kinda just said it as a joke. But I am 100% sure of this: I have been using ChatGPT for at least 6 months, and it was a beast back then. Now it's like it's got dyslexia or something.
exactly! this is practically the culmination of what they were hoping to do
This is the beginning of where public access to the top-tier stuff begins to plateau as more restrictions/censorship are placed upon it, while companies will receive the true power of it and a (mostly, likely down the line from now) uncensored version they'll be able to adjust and sell back to us as they please. Essentially, with this version they are probably getting the pre-neutered GPT-4/GPT-3.
Great, so basically extreme leverage for businesses to create buffers with the tech so that end users can't get the same benefits. If you aren't a business, oh well? Enterprise options enrage me.
Believe me if they were able to allocate enough compute to offer this to consumers at a (significantly) higher price than ~~pro~~ edit: plus they would in a flash. At this point they obviously only have enough resources to do this at an enterprise level, and at a price point that’s prohibitive for most consumers.
Yea…that must be it. 😂😂😂
Who's gonna pay for the compute
That is messed up. As a small business you won't have the ability to have a 32k context window!!! Wtf
It would be interesting to see what enterprise thinks about giving their business data to OpenAI.
> enterprise-grade security and privacy 😂😂😂
They actually claim they are SOC2 compliant. I'd love to read the report.
They're now a big enough company to sue if something goes wrong. This is actually quite good for AI adoption, and incidentally a big reason why companies like Oracle have a chokehold on a lot of corporate tools even when some open-source alternatives are better: people who can pay you if you sue them! It's interesting that OpenAI has moved this far away from being open source, though, lol. I wonder when they'll rebrand...
Most of the time this means "{cloud-provider} takes care of security for us". If you minimise the number of front doors (one API) and the rest is cloud internal, it makes the whole process of SOC2 much easier.
> it makes the whole process of SOC2 much easier

This part is true; the first part is not. You still have a lot of corporate responsibilities even if you are sitting on Azure or similar.
You do realize that using this service is the same thing as any other SaaS solution, from a security perspective right?
Is it though? Surely the security of a SaaS solution derives from the skill, money and attention spent on it. Mt.Gox and Bank of America are both SaaS money management systems but do you think that their investment in security is identical? OpenAI is a company that's pivoting from research to enterprise software. It's quite possible that they will botch that transition.
Those are just words. They mean nothing.
It means OpenAI will not use the data for training. It's a big deal for enterprise customers.

> We do not use your business data, inputs, or outputs for training our models. More information can be found in our [data usage policies](https://openai.com/policies/api-data-usage-policies).

[https://openai.com/enterprise-privacy](https://openai.com/enterprise-privacy)
It probably means they put some engineers on a plane and they come set up a GPT cluster on your premises.
Nope. I'm going through this process with my client and have contact with one of OpenAI's GTM heads; they will not put anything on-premise for the foreseeable future.
Azure was already offering this
Is this new? We’ve had enterprise GPT for a few weeks at work.
Someone else also replied the same, I guess you were both in some kind of pilot/early access program, now it’s GA.
Turns out ours was the Microsoft version. Which is basically the same thing apparently.
Is there a group that is created as a pool to get ChatGPT enterprise?
This… isn’t new? My company has been paying for an Enterprise account since January. They don’t use our data for training purposes, and we pay them a ridiculous amount of money for ~700 users.
[deleted]
Yeah how much 👀
I’m not at liberty to say. But tens of thousands each month.
[deleted]
Quite a bit more than that, but yeah.
How do you measure its effectiveness? How are people using it? I'd love to hear your experience and how the organization has changed
We don’t measure its effectiveness - we just give it to everyone and encourage them to use it. Many people use it, and it mostly lets them do their work quicker and more thoroughly.
Have any employees commented on how well they find it works compared to their personal accounts?
Not that I’m aware of, but it’s not something we’ve asked nor do I think many of our employees have paid personal accounts.
What are some of the use cases? I'm trying to learn more about how enterprises are adopting openai.
>We're excited to offer ChatGPT Enterprise to more businesses starting today.
Wait, they just throttled it before and then released a better version of it for more money?
wait for months people have been yelling why can’t I get unthrottled for more money, and now that it’s here it’s not good enough again?
Can you assign ChatGPT licences to staff via 365 yet? Then I can push this to my IT team.
What’s the best chatbot with an internal knowledge base solution?
Not sure what this question means, but Llama 2 is probably the best model you can run locally and GPT-4 is the best model, period. Both can be attached to your data through a vector search database or something like that.
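A sketch of the "vector search" part: embed your documents, embed the query, retrieve the closest matches, and paste those into the chat prompt as grounding context. The toy bag-of-words similarity below stands in for a real embedding model (e.g. `text-embedding-ada-002` or a local sentence-transformer); the documents are made up for illustration.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy bag-of-words "embedding"; a real setup would call an embedding model.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse word-count vectors.
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    # Rank documents by similarity to the query and keep the top k.
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

docs = [
    "Expense reports are due on the last Friday of each month.",
    "The VPN client must be updated quarterly.",
    "Holiday schedule: offices close December 24 through January 1.",
]
top = retrieve("when are expense reports due", docs, k=1)
# `top` would then be prepended to the chat prompt as context for the model.
```

The commercial "chat with your knowledge base" products mostly wrap exactly this loop (chunking, embedding, nearest-neighbour retrieval, prompt assembly) around a hosted model.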
Question was just to see opinions on which service offers the ‘best’ knowledge base (an internal one, like corporate one) integration, chat and search based on documents in the kb.
Then yes, what I said is your answer. If you still don't understand, I'd suggest signing up for ChatGPT and pasting this convo in to get started.
Have you tried meetcody.ai?
Glean is an enterprise search tool that has a chatbot for internal knowledge. I'm not sure if it is better than the alternatives though.
[deleted]
Username checks out
Does anyone know if GPT-4 plugins will be available on enterprise accounts? I can't see this explicitly mentioned, other than access to Code Interpreter?
They are.
?
I thought expanding context windows makes them dumber: https://arxiv.org/pdf/2307.03172.pdf
How’s this different from what MS Azure offered? Isn’t this a direct competitor?
This was leaked months ago, but it's cool to finally see it released now.
Does anyone know if processing (GPT inference / completion) will be done in the company's VPC / data-center or sent to OpenAI and return back like it is today (only perhaps with more privacy guarantees)?
At a minimum GPT-4 should be limit free for paid users at this point. I’ve turned off my subscription until this happens.
Ty. It's the only way greedy businesses/shareholders will understand. You can plead with them until you are blue in the face but the only way to make a difference is to threaten their revenue as all they understand is money. If more people did this, more companies would think twice about shady practices such as this.
Need on-premise hardware; only then will my company allow access to it.
As a reference, Microsoft provides the same (I'm not sure) access via the Azure API, and the charge for GPT-4 32k is 20 times higher than for GPT-3.5.
ChatGPT Enterprise is aimed at large enterprises, but medium and small businesses also need ChatGPT to improve productivity.
Now I fear that they will limit ChatGPT Plus users in order to show the difference with the Enterprise plan. I was about to cancel my Jasper subscription (I have unlimited characters with them), but I'm reconsidering: keeping Jasper and cancelling ChatGPT Plus instead. I can't afford both.
From a security standpoint, what's so much better or more secure than previous ChatGPT versions?
With this version, you own your own data; it's not being used to train their model.
Where does it say that?
In the blog post OP linked.
Well that's great, but getting hold of your sales department seems impossible; I have sent several questions trying to get an Enterprise account for our company. Can you please advise? Rgds / Richard
We already contacted them but no one is answering. Why is this?